aws-lambda

Can SQS distinguish between duplicate and failed/retry messages?

Submitted by 天大地大妈咪最大 on 2019-12-13 03:48:43
Question: I am writing an application in Lambda that is invoked by SQS messages. I would like to be able to tell the difference between an invocation resulting from a "duplicate" message and one resulting from a previous failure/retry (both SQS and Lambda will retry in case of failure). Is the messageId the same for duplicate messages, or just the body? If they are different, I might be able to track a messageId against a key from the body to identify duplicates. TIA.

Answer 1: Ideally, you would want to…
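The answer is cut off above, so here is one workable approach under my own assumptions: a redelivery of the same SQS message keeps its messageId, while a duplicate send by the producer gets a fresh messageId with the same body. Comparing both a messageId store and a body-derived key therefore separates the two cases. The in-memory sets below stand in for a persistent store such as a DynamoDB table with a conditional put (hypothetical schema):

```python
import hashlib

def dedupe_key(body: str) -> str:
    """Derive a stable idempotency key from the message body.

    SQS assigns a new messageId to each SendMessage call, so a
    producer-side duplicate arrives with a different messageId but
    the same body, while a delivery retry keeps the same messageId.
    Hashing the body gives a key that matches in both cases.
    """
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

def classify(record, seen_ids, seen_bodies):
    """Return 'new', 'retry', or 'duplicate' for an SQS record.

    seen_ids / seen_bodies are stand-ins for a durable store.
    """
    mid = record["messageId"]
    key = dedupe_key(record["body"])
    if mid in seen_ids:
        return "retry"      # same messageId: redelivery of the same send
    if key in seen_bodies:
        return "duplicate"  # same body, new messageId: duplicate send
    seen_ids.add(mid)
    seen_bodies.add(key)
    return "new"
```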

KeyConditions and KeyConditionExpression fail in Python

Submitted by 痞子三分冷 on 2019-12-13 03:45:32
Question: I am trying to modify the lambda-microservice example to submit a GET query to DynamoDB from AWS API Gateway. I have gotten the example to work, but it does a table scan, not a query. The query fails when setting the KeyConditionExpression, with the following message: "Lambda execution failed with status 200 due to customer function error: name 'Key' is not defined." The relevant Python is here: operations = { 'GET': lambda dynamo, x: dynamo.query(**x), } dynamodb = boto3.resource('dynamodb…

Unable to integrate API gateway with aws lambda

Submitted by 落爺英雄遲暮 on 2019-12-13 03:43:24
Question: I am trying to integrate AWS API Gateway with an AWS Lambda function. The integration works flawlessly until I use 'Lambda Proxy integration' in my Integration Request. When I check 'Use Lambda Proxy integration', I start getting: "Execution failed due to configuration error: Malformed Lambda proxy response". I googled around a bit and realized that I need to send back the response in a certain format: { "isBase64Encoded": true|false, "statusCode": httpStatusCode,…
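The required shape is cut off above; a minimal Python handler that satisfies the proxy-integration contract might look like this (note that body must be a JSON-serialized string, not a dict, which is a common cause of the "Malformed Lambda proxy response" error):

```python
import json

def handler(event, context):
    """Return the response shape API Gateway's Lambda proxy
    integration requires; any other shape is rejected as malformed."""
    return {
        "isBase64Encoded": False,
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "ok"}),  # must be a string
    }
```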

Invalid permission from Lambda to MongoDB in EC2

Submitted by こ雲淡風輕ζ on 2019-12-13 03:39:00
Question: I have created a Lambda function which is intended to connect to MongoDB running on EC2. I have followed some tutorials and verified that: Lambda and EC2 run in the same VPC; Lambda is configured with EC2's subnet; Lambda has its own security group, my-lambda-sg; Lambda's security group is allowed in the EC2 security group's inbound rules for MongoDB's port as a "Custom TCP Rule"; and Lambda's role has the AWSLambdaVPCAccessExecutionRole policy attached. However, I am still unable to connect from the Lambda to…
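Two things worth checking beyond the list above are my own assumptions, not from the question: mongod may be bound only to 127.0.0.1 (check bindIp in /etc/mongod.conf), and the Lambda should connect to the instance's private IP, not its public one, from inside the VPC. A sketch that also fails fast instead of hanging until the Lambda times out (IP and pymongo layer hypothetical):

```python
def mongo_uri(host, port=27017, db="admin"):
    """Build a MongoDB URI for the EC2 instance's private IP.

    Connecting to the public IP from inside the VPC, or a mongod
    bound only to localhost, both produce this exact symptom.
    """
    return f"mongodb://{host}:{port}/{db}"

# Inside the Lambda (pymongo assumed to be packaged with the function):
# from pymongo import MongoClient
# client = MongoClient(mongo_uri("10.0.1.23"), serverSelectionTimeoutMS=3000)
# client.admin.command("ping")  # raises ServerSelectionTimeoutError quickly
```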

Call a SageMaker endpoint using a Lambda function

Submitted by 為{幸葍}努か on 2019-12-13 03:34:02
Question: I have some data in S3, and I want to create a Lambda function that predicts the output with my deployed AWS SageMaker endpoint and then puts the results back into S3. Is it necessary in this case to create an API Gateway as described in this link? And what do I have to put in the Lambda function? I expect to specify where to find the data, how to invoke the endpoint, and where to put the results. Thanks.

Answer 1: You definitely don't have to create an API in API Gateway. You can invoke the endpoint directly using…
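The truncated answer points at invoking the endpoint directly; a hedged sketch of the S3-in, predict, S3-out flow via the sagemaker-runtime client (endpoint name, bucket, keys, and content type are all placeholders to adapt to how the model was deployed):

```python
def build_invocation(endpoint_name, payload):
    """Assemble arguments for sagemaker-runtime's invoke_endpoint.

    ContentType must match what the deployed model expects;
    text/csv is only an example.
    """
    return {
        "EndpointName": endpoint_name,
        "ContentType": "text/csv",
        "Body": payload,
    }

# In the Lambda (no API Gateway involved; names hypothetical):
# import boto3
# s3 = boto3.client("s3")
# runtime = boto3.client("sagemaker-runtime")
# data = s3.get_object(Bucket="my-bucket", Key="input.csv")["Body"].read()
# resp = runtime.invoke_endpoint(**build_invocation("my-endpoint", data))
# s3.put_object(Bucket="my-bucket", Key="output.json", Body=resp["Body"].read())
```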

Find which resource triggered CodePipeline when multiple resources from CodeCommit

Submitted by 江枫思渺然 on 2019-12-13 03:28:02
Question: I'm using AWS and have created a CodePipeline with multiple source resources (CodeCommit). I am receiving events like this: {'CodePipeline.job': {'id': '… In this event I can find the latest commit for each CodeCommit resource, but I could not find a way to determine which specific resource triggered the CodePipeline execution. Is this doable? Thanks for your help.

Answer 1: It is not supported yet, but will be in the future. Right now, you could perhaps work around it by tracking the eventName of…
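Since the job event does carry a revision per input artifact, one possible workaround (my assumption, not from the truncated answer) is to persist the revision map between executions, in DynamoDB or SSM for instance, and diff it on each run:

```python
def revisions_from_job(job):
    """Map artifact name -> revision from a CodePipeline.job event.

    Assumes the event shape shown in the question, where each entry
    of data.inputArtifacts carries a 'revision' (the commit ID).
    """
    arts = job["data"]["inputArtifacts"]
    return {a["name"]: a.get("revision") for a in arts}

def changed_sources(current, previous):
    """Return names of artifacts whose commit revision changed since
    the previous execution; those are the likely triggers."""
    return [name for name, rev in current.items() if previous.get(name) != rev]
```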

Nothing inside the S3 API's getObject callback is running in Lambda function

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-13 03:23:48
Question: I'm having a problem where I can't read my file from S3, or even get inside the S3 callback. I'm using Node 8.10 for my Lambda, and I've verified everything runs until I try to get inside of getObject: the console.log below won't even run. Does anything look askew here? I've granted full access to Lambda and S3, so I don't think that's the issue. const AWS = require('aws-sdk') exports.handler = async (event, context, callback) => { const s3options = { accessKeyId: process.env.AWS…
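The usual explanation for this Node 8.10 symptom is that an async handler resolves and returns before the getObject callback ever fires; with aws-sdk v2 the common fix is to await s3.getObject(params).promise() instead of passing a callback. For contrast, the same read in a Python Lambda has no callback to miss, because boto3 is synchronous (a sketch, with hypothetical bucket and key):

```python
def read_s3_text(s3_client, bucket, key):
    """Read an S3 object body as text; get_object blocks until the
    body is available, so the handler cannot return too early."""
    resp = s3_client.get_object(Bucket=bucket, Key=key)
    return resp["Body"].read().decode("utf-8")

# Handler sketch:
# import boto3
# def handler(event, context):
#     body = read_s3_text(boto3.client("s3"), "my-bucket", "data.json")
#     return {"statusCode": 200, "body": body}
```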

Access denied in Lambda when trying to access a file uploaded to an S3 bucket

Submitted by 大城市里の小女人 on 2019-12-13 03:10:48
Question: Following on from the great help I received on my original post, "Uploading a file to an s3 bucket, triggering a lambda, which sends an email containing info on the file uploaded to s3 bucket": I have tested sending the email previously, so I know that works, but when I try to include the data of the upload it throws this error: Could not fetch object data: { AccessDenied: Access Denied at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/services/s3.js:577:35) at Request.callListeners (/var…
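Two things are worth checking here, both my own assumptions rather than facts from the truncated post: the execution role needs s3:GetObject on arn:aws:s3:::your-bucket/* (bucket name hypothetical), and the object key arrives URL-encoded in the S3 event notification, so fetching the raw key of a file whose name contains spaces or special characters can surface as AccessDenied rather than NoSuchKey when the role lacks s3:ListBucket. A decoding sketch:

```python
from urllib.parse import unquote_plus

def object_key_from_event(event):
    """Extract bucket and decoded object key from an S3 put event.

    Keys are URL-encoded in the notification (spaces arrive as '+'),
    which makes a wrong-key fetch easy to misdiagnose as a pure IAM
    problem when it reports AccessDenied.
    """
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = unquote_plus(record["s3"]["object"]["key"])
    return bucket, key
```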

Boto3/CLI: How to pass variable content in USERDATA to a PHP script?

Submitted by 眉间皱痕 on 2019-12-13 02:59:47
Question: I need to substitute the server name in the user data with the RDS endpoint. I can get the RDS endpoint, but I'm not sure how to substitute it into the PHP file properly. Here is how I get the RDS endpoint: instances = source.describe_db_instances(DBInstanceIdentifier=db_instance) rds_host = instances.get('DBInstances')[0].get('Endpoint').get('Address') Another way is via the CLI: RDS=$(aws rds --region ca-central-1 describe-db-instances --query "DBInstances[*].Endpoint.Address") Next I need to pass the RDS…
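One simple way to get the endpoint into the user data, assuming you can put a literal placeholder in the PHP file (the {{RDS_ENDPOINT}} token is my invention, not something from the question):

```python
def render_user_data(template: str, rds_host: str) -> str:
    """Substitute the RDS endpoint into a user-data script.

    A plain str.replace is used instead of str.format so the curly
    braces that PHP itself uses are left untouched.
    """
    return template.replace("{{RDS_ENDPOINT}}", rds_host)

# Usage with the endpoint fetched as in the question:
# instances = source.describe_db_instances(DBInstanceIdentifier=db_instance)
# rds_host = instances["DBInstances"][0]["Endpoint"]["Address"]
# user_data = render_user_data(open("bootstrap.sh").read(), rds_host)
# ec2.run_instances(..., UserData=user_data)
```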

upload failed: { Error: unable to verify the first certificate

Submitted by 微笑、不失礼 on 2019-12-13 02:53:18
Question: I wrote a small piece of code in AWS Lambda (Node.js) to send a file to an API. I am able to run the code, but I am getting an upload error. Function logs: START RequestId: 08ad7fab-3658-11e8-8483-a7fbad976cb7 Version: $LATEST 2018-04-02T09:27:17.787Z 08ad7fab-3658-11e8-8483-a7fbad976cb7 upload failed: { Error: unable to verify the first certificate at Error (native) at TLSSocket.<anonymous> (_tls_wrap.js:1092:38) at emitNone (events.js:86:13) at TLSSocket.emit (events.js:185:7) at TLSSocket…
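"unable to verify the first certificate" generally means the server presents an incomplete certificate chain, so the fix is to supply the missing intermediate certificate rather than to disable verification; in Node this is the ca option on the HTTPS request/agent, or the NODE_EXTRA_CA_CERTS environment variable. A small Python analog of the same idea (certificate path hypothetical):

```python
def tls_verify_kwargs(ca_path=None):
    """Build TLS verification kwargs for an HTTPS upload.

    Passing a CA bundle path keeps verification on while trusting
    the intermediate certificate the server failed to send.
    """
    return {"verify": ca_path if ca_path else True}

# Usage with requests, if it is packaged with the function:
# import requests
# requests.post("https://api.example.com/upload",        # hypothetical URL
#               files={"file": open("report.pdf", "rb")},
#               **tls_verify_kwargs("/opt/ca/intermediate.pem"))
```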