AWS - want to upload multiple files to S3 and only when all are uploaded trigger a Lambda function


I am seeking advice on what's the best way to design this -

Use Case

I want to put multiple files into S3. Once all files are successfully uploaded, I want a Lambda function to be triggered. The client already creates a DynamoDB record with the list of files in the batch, and the current approach is an S3-triggered Lambda that lists the uploaded objects and compares them against that list on every invocation.

1 Answer
  •  耶瑟儿~
    2021-01-02 14:08

    If you don't want the client program to be responsible for invoking the Lambda function directly, would it be OK if it did something a bit more generic?

    Option 1: (SNS) What if it simply notified an SNS topic that it had completed a batch of S3 uploads? You could subscribe your Lambda function to that SNS topic.
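    A minimal client-side sketch of Option 1, assuming Python with boto3; the bucket name, topic ARN, key layout, and message shape below are placeholders, not anything from the question:

        import json
        import os
        import boto3

        s3 = boto3.client("s3")
        sns = boto3.client("sns")

        BUCKET = "my-upload-bucket"                                          # assumed bucket
        TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:uploads-complete"    # assumed topic

        def upload_batch(batch_id, local_paths):
            keys = []
            for path in local_paths:
                key = f"{batch_id}/{os.path.basename(path)}"   # assumed key layout: <batchId>/<filename>
                s3.upload_file(path, BUCKET, key)              # raises on failure, so we never publish early
                keys.append(key)

            # Every file made it to S3; announce the completed batch exactly once.
            sns.publish(
                TopicArn=TOPIC_ARN,
                Message=json.dumps({"batchId": batch_id, "keys": keys}),
            )

    Your Lambda function would then subscribe to the topic and receive one event per completed batch instead of one per file.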

    Option 2: (DynamoDB Streams) What if it simply updated the DynamoDB record with something like an attribute record.allFilesUploaded = true. You could have your Lambda function trigger off the DynamoDB stream. Since you are already creating a DynamoDB record via the client, this seems like a very simple way to mark the batch of uploads as complete without having to code in knowledge about what needs to happen next. The Lambda function could then check the "allFilesUploaded" attribute instead of having to go to S3 for a file listing every time it is called.

    Alternatively, don't insert the DynamoDB record until all files have finished uploading, then your Lambda function could just trigger off new records being created.
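    Here is a rough sketch of Option 2 in Python/boto3, showing the flag-based version; the table name, key schema, and attribute names are assumptions, and the table's stream is assumed to use the NEW_IMAGE or NEW_AND_OLD_IMAGES view type. For the "insert the record only when done" variant, the handler would simply react to INSERT events without needing the flag.

        import boto3

        dynamodb = boto3.client("dynamodb")
        TABLE = "UploadBatches"   # assumed table, partition key "batchId"

        def mark_batch_complete(batch_id):
            """Client side: set the flag once every S3 upload has succeeded."""
            dynamodb.update_item(
                TableName=TABLE,
                Key={"batchId": {"S": batch_id}},
                UpdateExpression="SET allFilesUploaded = :t",
                ExpressionAttributeValues={":t": {"BOOL": True}},
            )

        def handler(event, context):
            """Lambda side: attached to the table's DynamoDB stream."""
            for record in event["Records"]:
                if record["eventName"] not in ("INSERT", "MODIFY"):
                    continue
                new_image = record["dynamodb"].get("NewImage", {})
                if new_image.get("allFilesUploaded", {}).get("BOOL"):
                    batch_id = new_image["batchId"]["S"]
                    # ... start the downstream processing for this batch ...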

    Option 3: (continuing to use S3 triggers) If the client program can't be changed from how it works today, then instead of listing all the S3 files and comparing them to the list in DynamoDB each time a new file appears, simply update the DynamoDB record via an atomic counter and compare the result against the size of the file list. Once the values match, you know all the files have been uploaded. The downside is that you need to provision enough capacity on your DynamoDB table to handle all the updates, which will increase your costs.
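    A sketch of Option 3, assuming the Lambda function is subscribed to the bucket's s3:ObjectCreated:* events and that each batch has one DynamoDB record holding an expectedCount written by the client; the table, key schema, key layout, and attribute names are all placeholders:

        import boto3

        dynamodb = boto3.client("dynamodb")
        TABLE = "UploadBatches"   # assumed; one record per batch, partition key "batchId"

        def handler(event, context):
            for record in event["Records"]:
                key = record["s3"]["object"]["key"]
                batch_id = key.split("/")[0]   # assumes keys look like "<batchId>/<filename>"

                # Atomically bump the counter and read the updated item back.
                resp = dynamodb.update_item(
                    TableName=TABLE,
                    Key={"batchId": {"S": batch_id}},
                    UpdateExpression="ADD uploadedCount :one",
                    ExpressionAttributeValues={":one": {"N": "1"}},
                    ReturnValues="ALL_NEW",
                )
                item = resp["Attributes"]
                if int(item["uploadedCount"]["N"]) == int(item["expectedCount"]["N"]):
                    # Every file in the batch has arrived; kick off the real work here.
                    pass

    Because the counter update is atomic, concurrent invocations for different files each see a distinct result, so exactly one invocation observes uploadedCount equal to expectedCount.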

    Also, I agree with you that SWF is overkill for this task.
