API Gateway GET / PUT large files into S3

Submitted by 戏子无情 on 2020-08-04 05:19:20

Question


Following this AWS documentation, I was able to create a new endpoint on my API Gateway that can manipulate files in an S3 bucket. The problem I'm having is file size (API Gateway has a 10 MB payload limit).

I was wondering, without using a Lambda work-around (this link would help with that), would it be possible to upload and get files bigger than 10 MB (even as binary if needed), seeing as this uses an S3 service as a proxy, or does the limit apply regardless?

I've tried PUTting and GETting files bigger than 10 MB, and each response is the typical "message": "Timeout waiting for endpoint response".

Looks like Lambda is the only way, just wondering if anyone else got around this, using S3 as a proxy.

Thanks


Answer 1:


You can create a Lambda proxy function that returns a redirect to an S3 pre-signed URL.

Example JavaScript code (AWS SDK for JavaScript v2) that generates a pre-signed S3 URL:

var s3Params = {
    Bucket: 'test-bucket',
    Key: file_name,
    ContentType: 'application/octet-stream',
    Expires: 10000 // URL validity in seconds
};

s3.getSignedUrl('putObject', s3Params, function(err, url){
   ...
});

Then your Lambda function returns a redirect response to your client, like:

{
    "statusCode": 302,
    "headers": { "Location": "url" }
}

You might be able to find more information you need from this documentation.
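Putting the two snippets together, a full proxy handler might look like the sketch below. The S3 client is injected as a parameter so the redirect logic can be exercised without AWS credentials; the bucket name, the `key` query parameter, and the `handleGetUploadUrl` name are assumptions for illustration, not part of the original answer.

```javascript
// Sketch only: combines the pre-signed-URL and redirect snippets above.

// Build the API Gateway proxy-integration redirect response.
function redirectResponse(url) {
    return {
        statusCode: 302,
        headers: { Location: url }
    };
}

// The S3 client (e.g. `new AWS.S3()` from aws-sdk v2) is passed in so the
// handler can be tested with a stub.
async function handleGetUploadUrl(event, s3) {
    const url = await s3.getSignedUrlPromise('putObject', {
        Bucket: process.env.BUCKET_NAME || 'test-bucket', // assumed bucket
        Key: event.queryStringParameters.key,             // assumed query param
        ContentType: 'application/octet-stream',
        Expires: 300 // URL validity in seconds
    });
    return redirectResponse(url);
}
```

The client then follows the 302 and performs the actual PUT against S3, so the large payload never passes through API Gateway.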




Answer 2:


If you have large files, consider uploading them to S3 directly from your client. You can create an API endpoint that returns a signed URL for the client to use for the upload, which also lets you control access to your private content.
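That client-side flow might look like the sketch below. The `/signed-url` endpoint name and its JSON response shape (`{ url }`) are assumptions about how you expose the signed URL from your API.

```javascript
// Sketch only: fetch a pre-signed URL from your API, then PUT the file
// straight to S3 so the payload never passes through API Gateway.
async function uploadViaSignedUrl(apiBase, fileName, body) {
    // 1. Ask the (assumed) /signed-url endpoint for a pre-signed PUT URL.
    const res = await fetch(`${apiBase}/signed-url?key=${encodeURIComponent(fileName)}`);
    const { url } = await res.json();

    // 2. Upload directly to S3 using the signed URL.
    const put = await fetch(url, {
        method: 'PUT',
        headers: { 'Content-Type': 'application/octet-stream' },
        body
    });
    return put.ok;
}
```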

You can also consider using multipart uploads for even larger files to speed up the uploading.
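With a multipart upload, the file is split into parts of at least 5 MiB (S3's minimum for every part except the last), each uploaded with an `UploadPart` call and stitched together with `CompleteMultipartUpload`. A minimal sketch of the part-splitting step, with an assumed helper name:

```javascript
// Sketch: compute the byte ranges for an S3 multipart upload.
// Every part except the last must be at least 5 MiB.
const MIN_PART_SIZE = 5 * 1024 * 1024;

function partRanges(totalSize, partSize = MIN_PART_SIZE) {
    const ranges = [];
    for (let start = 0; start < totalSize; start += partSize) {
        ranges.push({
            partNumber: ranges.length + 1,              // S3 part numbers start at 1
            start,                                      // inclusive byte offset
            end: Math.min(start + partSize, totalSize)  // exclusive byte offset
        });
    }
    return ranges;
}
```

Each range then becomes one part upload (e.g. `s3.uploadPart` in the v2 SDK), and the ETags collected from those calls are passed to `CompleteMultipartUpload`. Parts can be sent in parallel, which is where the speed-up comes from.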



Source: https://stackoverflow.com/questions/42967547/api-gateway-get-put-large-files-into-s3
