Question
Is there a way to load big files (>100MB) from Google Cloud Storage into Google Cloud Functions? I read in their quotas that the "Max event size for background functions" is limited to 10MB. Can I read it chunk-wise or something like that?
Many thanks.
Answer 1:
Cloud Functions for Storage are triggered with the metadata for the file, which is relatively small and won't hit the max-event-size limit.
To access the actual contents of the file, you'll use the Node.js client library for Cloud Storage (@google-cloud/storage), which is not affected by the 10MB limit.
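For illustration, here is a minimal sketch (not part of the original answer) of a Storage-triggered background function that streams the object's contents chunk-wise via @google-cloud/storage; it assumes the Node.js 8+ event signature, and the per-chunk processing is only a placeholder.

```js
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Background function triggered by a Storage event. Only the file's
// metadata (bucket, name, size, ...) arrives in the event payload; the
// contents are streamed from the bucket, so the 10MB event-size limit
// does not apply to the file itself.
exports.processLargeFile = (event, context) => {
  const file = storage.bucket(event.bucket).file(event.name);

  return new Promise((resolve, reject) => {
    let bytesRead = 0;
    file.createReadStream()            // reads the object chunk-wise
      .on('data', (chunk) => {
        bytesRead += chunk.length;     // process each chunk here
      })
      .on('error', reject)
      .on('end', () => {
        console.log(`Read ${bytesRead} bytes from gs://${event.bucket}/${event.name}`);
        resolve();
      });
  });
};
```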
Answer 2:
Unfortunately, to my knowledge this isn't possible.
It is, however, possible to upload larger files from Google Cloud Functions to Cloud Storage by setting resumable=true. The way this works is that the function uploads 10MB of the file to your bucket, the request eventually times out and is retried, and the retry then re-downloads, re-processes and re-uploads the file, resuming from where it left off with the next 10MB, and so on.
Obviously this requires all the processing to be done repeatedly and relies on the request timing out, making the entire process extremely inefficient and not recommended.
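For reference, a minimal sketch of the resumable upload option mentioned above; the output bucket name and the local path under /tmp are hypothetical placeholders, not from the original answer.

```js
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

exports.uploadLargeFile = async (event, context) => {
  const bucket = storage.bucket('my-output-bucket'); // hypothetical bucket

  // resumable: true lets an interrupted upload continue from where it
  // left off instead of restarting from the beginning.
  await bucket.upload('/tmp/large-output-file', {
    destination: 'processed/large-output-file',
    resumable: true,
  });
};
```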
Source: https://stackoverflow.com/questions/47210842/load-big-file-from-google-cloud-storage-into-google-cloud-functions