ActiveStorage big file uploads triggers Google::Execution::Expired

Submitted 2020-01-06 06:05:51

Question


While implementing ActiveStorage at work, we found that when uploading a big file (12 GB), the operation hangs for about 10 minutes and then I get the error Google::Execution::Expired, or sometimes HTTPClient::SendTimeoutError: execution expired.

I am running most uploads with a line like this:

backup.file.attach(io: File.open("/my/file/path.ext"), filename: "myfilename")

Is there a way to make the request last longer, or a way to circumvent these timeouts?

This strategy has worked fine so far for uploads of up to 4 GB; it's only when I go overboard with the file size that this occurs. Time is not a problem on our side, since this runs as a nightly cron job.


Answer 1:


The Google Cloud Storage client’s send timeout defaults to 1 minute or so. (You see a delay of 10 minutes because the client tries several times to resume the upload after encountering a timeout.) You can specify a different timeout in seconds in config/storage.yml:

production:
  service: GCS
  credentials: { ... }
  project: my-project
  bucket: my-bucket
  timeout: 120  # 2 minutes

Use timeout: 0 to disable the send timeout.
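When picking a finite timeout instead, a rough back-of-the-envelope estimate from file size and uplink bandwidth can help. The sketch below uses the 12 GB figure from the question; the 50 MB/s uplink speed is purely an assumed number for illustration:

```ruby
# Rough estimate of the seconds needed to upload a file at a given
# uplink speed, as a lower bound when choosing a timeout value.
def upload_seconds(bytes, bytes_per_second)
  (bytes.to_f / bytes_per_second).ceil
end

twelve_gb = 12 * 1024**3          # file size from the question
uplink    = 50 * 1024**2          # assumed 50 MB/s uplink
puts upload_seconds(twelve_gb, uplink)  # => 246
```

At that assumed speed, even the 2-minute timeout above would be far too short for a 12 GB upload, which is why `timeout: 0` (or a generously padded value) makes sense for a nightly batch job.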



Source: https://stackoverflow.com/questions/50972352/activestorage-big-file-uploads-triggers-googleexecutionexpired
