Getting a scheduling error when forwarding an S3 object to a Flask response

Issue

I’m using the following code to pull a data file from an S3-compliant server:

    import io

    import botocore.exceptions

    # Body of read_file(project_id, path, defaultValue=None); `client` is a
    # boto3 S3 client and S3Error is a custom exception, both defined elsewhere.
    try:
        buf = io.BytesIO()
        client.download_fileobj(project_id, path, buf)
        body = buf.getvalue().decode("utf-8")

    except botocore.exceptions.ClientError as e:
        if defaultValue is not None:
            return defaultValue
        else:
            raise S3Error(project_id, path, e) from e
    else:
        return body

The code generates this error:

RuntimeError: cannot schedule new futures after interpreter shutdown

In short, I’m simply trying to read a file from an S3-compliant store into the body of a response object. The caller of the above snippet is as follows:

    from flask import Response

    # The caller builds the CSV download response from the file contents.
    data = read_file(project_id, f"{PATH}/data.csv")
    response = Response(
        data,
        mimetype="text/csv",
        headers=[
            ("Content-Type", "application/octet-stream; charset=utf-8"),
            ("Content-Disposition", "attachment; filename=data.csv")
        ],
        direct_passthrough=True
    )

Playing with the code, when I don’t get the runtime error, the request hangs instead and no response is ever returned.

Thank you to anyone with guidance.

Solution

I’m not sure how general this answer is; however, using boto against DigitalOcean’s S3-compatible implementation does not "strictly" permit an object key that starts with /. Once I removed the offending leading slash, the files downloaded as expected.
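For illustration, here is a minimal sketch of that fix applied to the read_file helper from the question; the key-normalization line is my addition, not part of the original code, and `client` / `S3Error` are assumed to exist as in the question:

    import io

    import botocore.exceptions

    def read_file(project_id, path, defaultValue=None):
        # Drop any leading "/" from the object key; Spaces would not accept it.
        key = path.lstrip("/")
        try:
            buf = io.BytesIO()
            client.download_fileobj(project_id, key, buf)
            body = buf.getvalue().decode("utf-8")
        except botocore.exceptions.ClientError as e:
            if defaultValue is not None:
                return defaultValue
            raise S3Error(project_id, key, e) from e
        else:
            return body

Equivalently, the caller can build the key without a leading slash in the first place, for example by making sure PATH does not start with /.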

I base the boto-specific framing on the fact that I was able to read the same files using Haskell’s amazonka library.

Answered By – Edmund's Echo

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
