Send large files to GAE Blobstore via API (from S3)

淺唱寂寞╮ submitted on 2019-12-11 09:09:27

Question


I'm trying to send large files (50MB-2GB) that I have stored in S3 via filepicker.io to the Google App Engine Blobstore (in Python). I would like to do this without routing the upload through a form in the client's browser, since that conflicts with the project requirements (and often hangs with very large files).

I tried multiple solutions, including:

  • loading the file into GAE with urlfetch (GAE enforces a 32MB limit on requests and responses)
  • constructing a multipart form in Python and sending it to blobstore.create_upload_url() (the file can't be transferred by URL reference alone, and it can't be loaded into the script because of the same 32MB limit)
  • using boto to read the file straight into the Blobstore (sketched below): the connection times out, and the logs show an HTTPException from boto, which triggers CancelledError: The API call logservice.Flush() was explicitly cancelled. from GAE and crashes the process
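
For reference, the boto attempt was along these lines: stream the S3 key in chunks and append the bytes to a blob through the (since-deprecated) Blobstore Files API. The credentials, bucket name, key name, and chunk size below are all placeholders, and this is a reconstruction of the failing approach rather than a fix:

    import boto
    from google.appengine.api import files

    # Placeholder credentials and names -- substitute your own.
    conn = boto.connect_s3('AWS_ACCESS_KEY', 'AWS_SECRET_KEY')
    bucket = conn.get_bucket('my-bucket')
    key = bucket.get_key('uploads/big-file.bin')

    # Create a writable blob via the (deprecated) Blobstore Files API.
    file_name = files.blobstore.create(mime_type='application/octet-stream')

    CHUNK = 1024 * 1024  # 1MB per read; an arbitrary choice
    with files.open(file_name, 'a') as f:
        while True:
            data = key.read(CHUNK)  # boto Key objects are file-like
            if not data:
                break
            f.write(data)           # long transfers tend to time out here

    files.finalize(file_name)
    blob_key = files.blobstore.get_blob_key(file_name)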

I am struggling to find a working solution. Any hints on how I could perform this transfer, or how to pass the file from S3 as a form attachment without first loading it in Python (i.e. just by specifying its URL), would be very much appreciated.


Answer 1:


The BlobstoreUploadHandler isn't constrained by a 32MB limit: https://developers.google.com/appengine/docs/python/tools/webapp/blobstorehandlers. However, I'm not sure how this might fit into your app. If you can post the file to an endpoint handled by a BlobstoreUploadHandler, then you should be good to go.
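
For concreteness, here is a minimal sketch of that setup with webapp2. The /upload path and the 'file' form-field name are illustrative choices, not part of the original answer:

    import webapp2
    from google.appengine.ext import blobstore
    from google.appengine.ext.webapp import blobstore_handlers

    class GetUploadUrl(webapp2.RequestHandler):
        def get(self):
            # A fresh one-shot upload URL must be minted for every upload.
            self.response.write(blobstore.create_upload_url('/upload'))

    class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
        def post(self):
            # get_uploads() returns BlobInfo records for the files that the
            # Blobstore already stored while parsing the multipart request.
            blob_info = self.get_uploads('file')[0]
            self.response.write('stored as %s' % blob_info.key())

    app = webapp2.WSGIApplication([
        ('/get_upload_url', GetUploadUrl),
        ('/upload', UploadHandler),
    ])

Whatever client performs the transfer first fetches the upload URL, then POSTs the multipart body to it. The Blobstore intercepts that request, stores the payload, and forwards a rewritten (small) request to UploadHandler, which is why the 32MB limit doesn't apply to the file itself.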



Source: https://stackoverflow.com/questions/15970514/send-large-files-to-gae-blobstore-via-api-from-s3
