aws-lambda

libmysqlclient.so.18: cannot open shared object file: No such file or directory

Submitted by 五迷三道 on 2020-07-10 01:47:12
Question: I'm currently trying to deploy a Ruby application to AWS Lambda. I was having issues installing mysql2: when I ran `bundle install` I got the following error:

```
An error occurred while installing mysql2 (0.5.2), and Bundler cannot continue.
Make sure that `gem install mysql2 -v '0.5.2' --source 'https://rubygems.org/'` succeeds before bundling
```

I came across a Stack Overflow post, Cannot load file mysql2 on AWS Lambda, about creating a Docker container to build the dependencies. I followed …

Python Boto3 - how to check if s3 file is completely written before process start copying to another bucket

Submitted by ぐ巨炮叔叔 on 2020-07-09 15:07:49
Question: How do I make sure that Process A has completely written a large file (5+ GB) to AWS S3 Bucket A before Process B starts copying the file to AWS S3 Bucket B using boto3?

Answer 1: If a new object is being created in Amazon S3, it will only appear after the upload is complete. Other processes will not be able to view it until it has finished uploading. Objects cannot be updated in S3; rather, they are replaced with a new object. So, if an object is in the process of being updated, it will still appear as …
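
In other words, S3 uploads are atomic, so Process B only has to wait until the key becomes visible. Below is a minimal boto3 sketch of that pattern (the bucket and key names are hypothetical), using the built-in object_exists waiter and the managed copy() method:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and key names, for illustration only.
SOURCE_BUCKET = "bucket-a"
DEST_BUCKET = "bucket-b"
KEY = "large-file.bin"

# Block until the object is visible, i.e. the upload has completed.
# By default the waiter polls HeadObject every 5 seconds, up to 20 times.
waiter = s3.get_waiter("object_exists")
waiter.wait(Bucket=SOURCE_BUCKET, Key=KEY)

# The managed copy() switches to multipart copy automatically, which is
# required for objects larger than 5 GB.
s3.copy(
    CopySource={"Bucket": SOURCE_BUCKET, "Key": KEY},
    Bucket=DEST_BUCKET,
    Key=KEY,
)
```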

AWS API Gateway request body as Java POJO for function

Submitted by 本小妞迷上赌 on 2020-07-09 14:53:37
Question: I'm having a really basic problem using AWS Lambda, API Gateway and the Serverless Framework. I just want to hand the body of a POST request over as a Java POJO. Here's the setup:

POJO:

```java
public class Person {
    private String lastName;
    private String firstName;
    // setters and getters omitted
}
```

Handler:

```java
public class PersonHandler implements RequestHandler<Person, ApiGatewayResponse> {
    @Override
    public ApiGatewayResponse handleRequest(Person person, Context context) {
        // …
```

AWS lambda with python asyncio. Event loop closed problem?

Submitted by 这一生的挚爱 on 2020-07-09 12:52:45
Question: Does closing the event loop in AWS Lambda affect future Lambda runs? I have some asyncio Python code running within an AWS Lambda function. The logic of the code is as follows:

```python
def lambda_handler(event, context):
    loop = asyncio.get_event_loop()
    # perform all operations with the loop
    loop.close()
    return results
```

If I run this once, it appears to work fine. However, if I rerun it immediately afterwards, I get an error saying "Event loop closed". Why is this happening? Shouldn't each Lambda run be …
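
The likely cause is that a warm Lambda container reuses the same Python process, so a second invocation gets back the very loop the first one closed. A minimal sketch of one common fix, with a placeholder do_work coroutine standing in for the real operations: let asyncio.run() create and dispose of a fresh loop on every invocation.

```python
import asyncio

async def do_work(event):
    # Placeholder for the real async operations.
    await asyncio.sleep(0)
    return {"status": "ok"}

def lambda_handler(event, context):
    # asyncio.run() creates a new event loop, runs the coroutine, and
    # closes that loop itself, so no closed loop is left behind for the
    # next warm invocation to stumble over.
    return asyncio.run(do_work(event))
```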

AWS Lex Lambda return multiple lines with Python

Submitted by 夙愿已清 on 2020-07-09 05:25:32
Question: I've been reading the AWS Lex / Lambda docs and looking at the examples, but I don't see a way to return multiple lines. I want to create an intent so that when a user types 'Help', it gives an output like the one below:

Options:
Deploy new instance.
Undeploy instance.
List instances.

I've tried this:

```python
def lambda_handler(event, context):
    logger.debug('event.bot.name={}'.format(event['bot']['name']))
    a = {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {
                "contentType": …
```
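
One approach that matches the response shape above is to join the options with newline characters in the message content; whether the line breaks actually render depends on the client displaying the Lex output. A sketch along those lines:

```python
def lambda_handler(event, context):
    lines = [
        "Options:",
        "Deploy new instance.",
        "Undeploy instance.",
        "List instances.",
    ]
    # PlainText content with embedded newlines; the channel (console,
    # Slack, web UI, ...) decides how those breaks are displayed.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {
                "contentType": "PlainText",
                "content": "\n".join(lines),
            },
        }
    }
```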

How to copy from one bucket to another bucket in s3 of certain suffix

Submitted by 我的未来我决定 on 2020-07-08 06:24:10
Question: I have 3 buckets: 1. commonfolder, 2. jsonfolder, 3. csvfolder. commonfolder will have both JSON and CSV files; I need to copy all the CSV files to csvfolder and all the JSON files to jsonfolder. The code below gets all the files from commonfolder; how do I copy them after that?

```python
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # List all the bucket names
    response = s3.list_buckets()
    for bucket in response['Buckets']:
        print(bucket)
        print(f'{bucket["Name"]}')
    # Get the files of …
```
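
One way to finish the job is to list the source bucket with a paginator and route each key by its suffix. A minimal sketch, assuming the three bucket names from the question:

```python
import boto3

s3 = boto3.client('s3')

SOURCE = 'commonfolder'
DESTINATIONS = {'.csv': 'csvfolder', '.json': 'jsonfolder'}

def lambda_handler(event, context):
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=SOURCE):
        for obj in page.get('Contents', []):
            key = obj['Key']
            for suffix, dest_bucket in DESTINATIONS.items():
                if key.endswith(suffix):
                    # copy_object handles objects up to 5 GB; use the
                    # managed s3.copy() for anything larger.
                    s3.copy_object(
                        CopySource={'Bucket': SOURCE, 'Key': key},
                        Bucket=dest_bucket,
                        Key=key,
                    )
```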

AWS Lambda NodeJS Connect to RDS Postgres Database

Submitted by 柔情痞子 on 2020-07-05 07:07:30
Question: I'm trying to test connectivity between my Lambda and an RDS instance. I have them both on the same private subnets, with all ports open in the security group. When I trigger the Lambda I do see a connection opened on the RDS instance; however, the Lambda times out after 4 minutes 40 seconds. The PG environment variables are set in the Lambda configuration.

```javascript
const { Client } = require('pg');
const client = new Client();

var hello = [{ name: 'test', description: 'testerface' }];

exports…
```

ValidationException: The provided key element does not match the schema

Submitted by 守給你的承諾、 on 2020-07-05 01:26:29
Question: I created a table 'user_info' in DynamoDB with one primary hash key 'user_id' (String) and no range key. Then I created 2 AWS Lambda functions to insert and query the items. I can insert items into the table, but when I query it, it returns: ValidationException: The provided key element does not match the schema. My query function:

```javascript
var params = {
    Key: {
        user_id: { S: "usr1@s.com" }
    },
    TableName: 'user_info',
    ProjectionExpression: 'password'
};
dynamodb.getItem(params, function(err, data) { …
```
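
For comparison, here is the same lookup sketched in Python with boto3, assuming the table described above. The low-level client expects typed attribute values such as {'S': ...}, while the higher-level resource expects plain values; mixing the two conventions is a classic way to trigger exactly this ValidationException.

```python
import boto3

# Low-level client: the key uses typed attribute values.
client = boto3.client('dynamodb')
client.get_item(
    TableName='user_info',
    Key={'user_id': {'S': 'usr1@s.com'}},
    ProjectionExpression='password',
)

# Higher-level resource: the key uses plain Python values. Passing the
# typed {'S': ...} form here would make the key a map rather than a
# string, which no longer matches the table's key schema.
table = boto3.resource('dynamodb').Table('user_info')
table.get_item(
    Key={'user_id': 'usr1@s.com'},
    ProjectionExpression='password',
)
```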
