Load big file from Google Cloud Storage into Google Cloud Functions?

Submitted by 风流意气都作罢 on 2020-01-04 05:45:10

Question


Is there a way to load big files (>100MB) from Google Cloud Storage into Google Cloud Functions? I read in their quotas that the "Max event size for background functions" is limited to 10MB. Can I read it chunk-wise or something like that?

Many thanks.


Answer 1:


Cloud Functions for Storage are triggered with the metadata for the file, which is relatively small and won't hit the max-event-size limit.

To access the actual contents of the file, you'll use the Node.js client library for Cloud Storage (@google-cloud/storage), which is not affected by the 10MB limit.
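
For example, here is a minimal sketch of a Storage-triggered background function that streams the object's contents with the @google-cloud/storage client. The function name processLargeFile and the byte-counting logic are illustrative, not part of the original answer:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Background function triggered by a Cloud Storage event.
// `data` carries only the object's metadata (bucket, name, size, ...),
// so the 10MB event-size limit never applies to the file itself.
exports.processLargeFile = (data, context) => {
  const file = storage.bucket(data.bucket).file(data.name);

  return new Promise((resolve, reject) => {
    let bytesRead = 0;
    file.createReadStream()            // streams the object chunk by chunk
      .on('data', (chunk) => {
        bytesRead += chunk.length;     // process each chunk here instead of
                                       // buffering the whole file in memory
      })
      .on('error', reject)
      .on('end', () => {
        console.log(`Read ${bytesRead} bytes from gs://${data.bucket}/${data.name}`);
        resolve();
      });
  });
};

Because the contents are consumed as a stream, the function's memory footprint stays small even for files well over 100MB.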




Answer 2:


Unfortunately to my knowledge this isn't possible.

It is, however, possible to upload larger files from Google Cloud Functions to Cloud Storage by setting resumable=true. The way this works is that the function uploads 10MB of the file to your bucket, the request eventually times out and retries, and the retry re-downloads, re-processes and re-uploads the file, resuming where it left off with the next 10MB, and so on.

Obviously this requires all processing to be repeated and the request to time out on each attempt, which makes the entire process extremely inefficient, so it is not recommended.
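
For reference, a resumable upload with the @google-cloud/storage client looks like the rough sketch below; the bucket name 'my-bucket' and the local path '/tmp/large-file.bin' are placeholders, not values from the original answer:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Resumable upload: the client sends the file in chunks and can resume
// after an interrupted or retried request, as described above.
async function uploadLargeFile() {
  await storage.bucket('my-bucket').upload('/tmp/large-file.bin', {
    resumable: true,   // enable the resumable upload protocol
  });
}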



Source: https://stackoverflow.com/questions/47210842/load-big-file-from-google-cloud-storage-into-google-cloud-functions
