aws-lambda

AWS Lambda Container destroy event

Submitted by 早过忘川 on 2019-12-22 01:39:28
Question: When should connections be released and resources cleaned up in a Lambda function? In a normal Node.js application we would use the hook process.on('exit', (code) => { console.log(`About to exit with code: ${code}`); });. However, this doesn't work on AWS Lambda, leaving the MySQL connection in sleep mode, and we don't have enough resources for that many active connections. None of the AWS documentation specifies a way to achieve this. How can we receive the stop event of an AWS Lambda container? Answer 1: EDIT: The short answer is that there is…
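Since Lambda exposes no container-shutdown hook, the usual workaround is the opposite of per-invocation cleanup: open the connection once per container and reuse it across warm invocations, letting the server-side idle timeout reclaim it when the container is frozen or destroyed. A minimal sketch of that pattern (the `connect` factory is a stand-in for any real client constructor, e.g. a MySQL driver; it is not from the original post):

```python
# Sketch: reuse one connection per container instead of opening/closing
# per invocation, since Lambda never signals container shutdown.

_connection = None  # module-level: survives across warm invocations


def get_connection(connect):
    """Lazily create the connection once per container, then reuse it."""
    global _connection
    if _connection is None:
        _connection = connect()
    return _connection


def handler(event, context, connect=lambda: object()):
    conn = get_connection(connect)
    # ... use conn here; do NOT close it, so the next warm invocation
    # can reuse it instead of leaking a fresh sleeping connection ...
    return {"reused": conn is get_connection(connect)}
```

With a short idle timeout configured on the MySQL side, abandoned connections from destroyed containers are reaped by the server rather than by Lambda.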

Custom Slot Type with AWS Lambda hook for Amazon Lex

Submitted by 允我心安 on 2019-12-22 01:32:18
Question: The Amazon Lex chatbot framework offers custom slot types; however, the mechanism is to provide an array of values to validate against. I want a custom validator that, for example, checks whether the input is in a database. Ideally, I want to develop an AWS Lambda hook that receives the input parameter and then executes some program that returns either a well-formatted slot value or an error if the input was not valid. Any ideas? Answer 1: AWS exposes an API to dynamically create slot…
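One way to get arbitrary validation is a Lambda initialization/validation code hook on the intent: it receives the raw slot values and can either hand control back to Lex (`Delegate`) or re-ask for the slot (`ElicitSlot`). A sketch of such a hook using the Lex V1 event/response shapes; the slot name `mySlot` and the in-memory value set (standing in for a database query) are assumptions, not from the original question:

```python
# Sketch of a Lex (V1) validation code hook: validate the slot value
# ourselves instead of relying on the slot type's enumerated values.

KNOWN_VALUES = {"alpha", "beta", "gamma"}  # placeholder for a DB lookup


def validate(value):
    return value is not None and value.lower() in KNOWN_VALUES


def handler(event, context):
    intent = event["currentIntent"]
    slots = intent["slots"]
    value = slots.get("mySlot")  # hypothetical slot name

    if not validate(value):
        # Invalid: clear the slot and ask the user for it again.
        return {
            "dialogAction": {
                "type": "ElicitSlot",
                "intentName": intent["name"],
                "slots": dict(slots, mySlot=None),
                "slotToElicit": "mySlot",
                "message": {
                    "contentType": "PlainText",
                    "content": f"'{value}' is not valid, please try again.",
                },
            }
        }
    # Valid: let Lex continue with its normal dialog flow.
    return {"dialogAction": {"type": "Delegate", "slots": slots}}
```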

Create AWS Transcoder job from Lambda

Submitted by 不打扰是莪最后的温柔 on 2019-12-22 01:04:41
Question: I've created a Lambda function that is invoked on every new S3 object creation. I'm trying to retrieve the object and then create an Elastic Transcoder job that alters the video quality, but the transcoder job is never created: "creating job...." shows up, but "job created" never appears in my logs. I'm going off of this tutorial. My Lambda function:

var aws = require('aws-sdk');
var elastictranscoder = new aws.ElasticTranscoder();
exports.handler = function(event, context) {
    console.log('Got Video:', JSON…
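For orientation, this is roughly what such a handler has to assemble from the S3 event record before calling `createJob`. The sketch below is in Python/boto3 rather than the question's Node.js; the pipeline ID is a placeholder and the preset ID is assumed to be one of the generic system presets, so treat both as illustrative only. (In the Node version, a common cause of "job created" never appearing is the handler returning before the asynchronous `createJob` callback has fired.)

```python
# Sketch: build an Elastic Transcoder job request from an S3 event record.
import urllib.parse

PIPELINE_ID = "0000000000000-abcdef"  # placeholder pipeline ID
PRESET_ID = "1351620000001-000010"    # assumed generic 720p system preset


def build_job(event):
    record = event["Records"][0]
    # S3 event keys are URL-encoded (spaces arrive as '+').
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return {
        "PipelineId": PIPELINE_ID,
        "Input": {"Key": key},
        "Outputs": [{"Key": key + "-720p.mp4", "PresetId": PRESET_ID}],
    }


def handler(event, context):
    job = build_job(event)
    # Real call would be:
    #   boto3.client("elastictranscoder").create_job(**job)
    return job
```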

Use Firebase SDK with Netlify Lambda Functions

Submitted by 馋奶兔 on 2019-12-22 00:35:36
Question: I'm building a project that uses React + Firebase + Lambda functions. I have Firebase code on the front end, and I needed a bit of back end to handle some events (preventing users from modifying data in Firebase directly, while still allowing the data to be updated by the application). As I use Netlify to deploy my app, I have access to AWS Lambda functions via netlify-lambda (https://www.netlify.com/docs/functions/). Usually everything just works (the Mailchimp API, the Snipcart API, etc.), but I cannot get Firebase…

Allow lambda to access particular s3 bucket in serverless config

Submitted by 好久不见. on 2019-12-21 20:42:35
Question: How can I allow a specific Lambda function to access a particular S3 bucket in serverless.yml? For example, I am porting file-upload functionality to Lambda using Serverless. To upload a file to a particular S3 bucket, I need to allow the Lambda function to access that bucket. How can I do this in serverless.yml? Answer 1: From Serverless Framework - AWS Lambda Guide - IAM: To add specific rights to this service-wide role, define statements in provider.iamRoleStatements, which will be merged into the…
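A minimal sketch of what that `provider.iamRoleStatements` section looks like (the bucket name and runtime are placeholders):

```yaml
# serverless.yml sketch: grant the service's functions read/write access
# to one bucket via provider.iamRoleStatements.
provider:
  name: aws
  runtime: nodejs10.x
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
        - s3:GetObject
      Resource: "arn:aws:s3:::my-upload-bucket/*"
```

Note that this role is shared by every function in the service; truly per-function roles require defining custom IAM roles (or a plugin) rather than the merged service-wide statements.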

Can I pass path parameters using lambda invoke to another lambda function?

Submitted by 血红的双手。 on 2019-12-21 20:06:22
Question: I'm trying to call and get the response from another Lambda function using lambda invoke. The problem is that the other Lambda function expects the id to be sent as a path parameter (or as a query string), but I don't see an option for this in lambda invoke. If I pass the id in the payload, the other function receives it in the event body and not as a path parameter. Is there an existing solution for this? Here is a function inside a Lambda function that calls another Lambda function which receives the…
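`Invoke` has no notion of paths or query strings; those only exist because API Gateway builds a proxy event for the target function. So the usual workaround is to build that same event shape yourself and send it as the payload. A sketch (Python/boto3 shown for illustration; the question's code is Node.js, and the function name is a placeholder):

```python
# Sketch: mimic the API Gateway proxy event so the target Lambda finds
# the id under event["pathParameters"] exactly as it expects.
import json


def make_proxy_event(path_params=None, query_params=None, body=None):
    return {
        "pathParameters": path_params or {},
        "queryStringParameters": query_params or {},
        "body": json.dumps(body) if body is not None else None,
    }


def invoke_with_id(lambda_client, function_name, user_id):
    payload = make_proxy_event(path_params={"id": user_id})
    resp = lambda_client.invoke(
        FunctionName=function_name,
        InvocationType="RequestResponse",
        Payload=json.dumps(payload),
    )
    return json.loads(resp["Payload"].read())
```

This only works when the target function reads `pathParameters` itself; it cannot retroactively make API Gateway routing apply.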

AWS SNS Publish to specific User via Cognito Identity ID

Submitted by 落花浮王杯 on 2019-12-21 20:02:01
Question: What I'm trying to do here is send a notification via SNS and APNS when a specific user is part of a newly added DynamoDB item. I want to send it to the user's Cognito identity ID, not to a device token. So Lambda should be triggered when the item is added and then go through a list of Cognito identity IDs, which is also part of the item. Lambda is then supposed to publish the push notifications to each Cognito identity ID. All the devices are registered as endpoints within SNS. I also keep…
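SNS cannot publish to a Cognito identity ID directly; it publishes to endpoint ARNs (or topics). So the missing piece is a mapping from identity ID to the endpoint ARN recorded at device registration, which the Lambda looks up before publishing. A sketch under that assumption (the mapping is passed in as a plain dict here, standing in for a DynamoDB lookup; all names are illustrative):

```python
# Sketch: resolve Cognito identity IDs to SNS platform endpoint ARNs,
# then publish an APNS-formatted message to each endpoint.
import json


def build_apns_message(text):
    """SNS message body for MessageStructure='json' carrying an APNS payload."""
    apns = json.dumps({"aps": {"alert": text, "sound": "default"}})
    return json.dumps({"default": text, "APNS": apns, "APNS_SANDBOX": apns})


def notify_identities(sns_client, arn_by_identity, identity_ids, text):
    message = build_apns_message(text)
    sent = []
    for identity_id in identity_ids:
        arn = arn_by_identity.get(identity_id)
        if arn is None:
            continue  # this user has no registered endpoint
        sns_client.publish(TargetArn=arn, Message=message,
                           MessageStructure="json")
        sent.append(identity_id)
    return sent
```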

Import libraries in lambda layers

Submitted by 爷，独闯天下 on 2019-12-21 17:52:27
Question: I wanted to import the jsonschema library in my AWS Lambda function in order to perform request validation. Instead of bundling the dependency with my app, I am looking to do this via Lambda layers. I zipped all the dependencies under venv/lib/python3.6/site-packages/, uploaded this as a Lambda layer, and added it to my Lambda function using the publish-layer-version and aws lambda update-function-configuration commands respectively. The zip file is named "lambda-dep.zip" and all the files are under it.
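The likely culprit is the layer's directory layout: for Python, Lambda mounts layers at /opt and only /opt/python (or /opt/python/lib/python3.6/site-packages) is added to sys.path, so packages zipped at the root of the archive are never importable. A packaging sketch under that assumption:

```shell
# Sketch: Python layer contents must sit under a top-level "python/"
# directory inside the zip, not at the zip root.
mkdir -p python
pip install jsonschema -t python/
zip -r lambda-dep.zip python
```

Unzipping the archive should then show python/jsonschema/... rather than jsonschema/... at the top level.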

Package Python Pipenv project for AWS Lambda

Submitted by 女生的网名这么多〃 on 2019-12-21 17:23:23
Question: I have a Python project and I am using pipenv to handle dependencies. I need to create a zip file that includes the source code and all the dependency code as well; I need this zip file for uploading to AWS Lambda. When working with pipenv, it downloads the dependency libraries somewhere on the computer, but for packaging/distribution of the project I need all the necessary code contained in one place (a zip file). Is there a way to run pipenv and set it to install dependencies at a…
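pipenv itself has no "install into this directory" mode, but it can export its lockfile as a requirements.txt, which plain pip can then install into a build directory for zipping. A sketch of that flow (directory names are placeholders; on newer pipenv versions the export command is `pipenv requirements` instead of `pipenv lock -r`):

```shell
# Sketch: export locked deps, install them next to the source, zip for Lambda.
pipenv lock -r > requirements.txt
pip install -r requirements.txt -t build/
cp -r src/* build/          # handler code; "src/" is a placeholder path
cd build && zip -r ../lambda.zip .
```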

AWS Lambda function is misinterpreting the event dictionary in Python?

Submitted by 心不动则不痛 on 2019-12-21 06:33:28
Question: I am trying to deploy a Google Calendar API integration to AWS Lambda. Since I was facing a problem extracting a value from the event dictionary (created by Lambda from the JSON payload of a POST request), I created a toy function to test:

def handler(event, context):
    a = event.get("type")
    if a == 'create':
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "text/plain"},
            # "body": "Event_id" + str(event_identifier) + " Event Link: " + str(links)
            "body": str(a)
        }
    else:
        return {
            "statusCode": 200,…
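The usual cause of this symptom is API Gateway's Lambda proxy integration: the POST JSON arrives as a string under event["body"], not as top-level keys, so event.get("type") returns None even though the payload contains "type". A sketch of a handler that parses the body first and still works for direct invocations (a guess at the fix, since the excerpt cuts off before the actual error):

```python
# Sketch: with proxy integration the JSON payload is a string in
# event["body"]; parse it, falling back to direct-invoke events where
# the keys really are top-level.
import json


def handler(event, context):
    if isinstance(event.get("body"), str):   # API Gateway proxy event
        payload = json.loads(event["body"])
    else:                                    # direct invocation / test event
        payload = event
    a = payload.get("type")
    body = str(a) if a == "create" else "unknown type: " + str(a)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/plain"},
        "body": body,
    }
```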