Question
While implementing ActiveStorage at work, we found that when uploading a big file (12 GB) the operation hangs for about 10 minutes and then I get the error Google::Execution::Expired, or sometimes HTTPClient::SendTimeoutError: execution expired.
I am running most uploads with a line like this:
backup.file.attach(io: File.open("/my/file/path.ext"), filename: "myfilename")
Is there a way to make the request last longer, or a way to circumvent these timeouts?
This strategy has worked fine so far for uploads of up to 4 GB; it's only when I go well beyond that file size that this occurs. Time is not a problem on our side, since this is a nightly task run from a cron job.
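For context, a minimal sketch of how such a nightly task might be wired up as a rake task invoked from cron; the Backup model name and the file path are placeholders, not details from the original post:
namespace :backups do
  desc "Attach the nightly dump to the latest Backup record (hypothetical model)"
  task upload: :environment do
    backup = Backup.last
    # Open the file in binary mode; Active Storage streams the IO to the storage service.
    File.open("/my/file/path.ext", "rb") do |io|
      backup.file.attach(io: io, filename: "myfilename")
    end
  end
end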
Answer 1:
The Google Cloud Storage client's send timeout defaults to 1 minute or so. (You see a delay of about 10 minutes because the client retries the upload several times after encountering a timeout.) You can specify a different timeout, in seconds, in config/storage.yml:
production:
  service: GCS
  credentials: { ... }
  project: my-project
  bucket: my-bucket
  timeout: 120 # 2 minutes
Use timeout: 0 to disable the send timeout.
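For reference, a minimal sketch (an assumption-laden illustration, not part of the answer) of passing the same timeout directly to the underlying google-cloud-storage gem, which Active Storage's GCS service configures from these storage.yml options:
require "google/cloud/storage"

# `timeout:` is the request timeout in seconds for the underlying HTTP client;
# 120 here mirrors the storage.yml example above. The project, credentials path,
# and bucket name are placeholders.
storage = Google::Cloud::Storage.new(
  project_id:  "my-project",
  credentials: "path/to/keyfile.json",
  timeout:     120
)

bucket = storage.bucket "my-bucket"
# Uploads the large local file under the given object name.
bucket.create_file "/my/file/path.ext", "myfilename"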
Source: https://stackoverflow.com/questions/50972352/activestorage-big-file-uploads-triggers-googleexecutionexpired