aws-lambda

AWS Lambda - NAT Gateway internet access results in timeout

淺唱寂寞╮ submitted on 2020-08-24 03:45:10
Question: I have an AWS Lambda function which checks a Redis ElastiCache instance and, if the item is not found in the cache, goes to the Google Places API service. The Redis instance is in a private subnet; so, to reach it, I added to the Lambda the VPC and the subnet in which the instance resides. I also specified a security group which allows all outbound traffic. The Network ACL is the default one, which is supposed to allow all inbound and outbound traffic. When adding the VPC to the Lambda function like that via the
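The usual cause is that a Lambda attached only to a private subnet has no route to the internet: the private subnet's route table needs a 0.0.0.0/0 route pointing at a NAT Gateway that itself sits in a public subnet (one routed to an Internet Gateway). A minimal probe like the sketch below, with an assumed handler name and URL that are not from the question, fails fast on a short timeout, which makes it easy to tell a missing NAT route from an application error.

import json
import urllib.request

def lambda_handler(event, context):
    # Hypothetical connectivity probe: a short timeout makes a missing
    # NAT route fail in seconds instead of hanging until the Lambda timeout.
    try:
        with urllib.request.urlopen("https://maps.googleapis.com/", timeout=3) as resp:
            outbound = f"reachable, HTTP {resp.status}"
    except Exception as exc:
        outbound = f"unreachable: {exc}"
    return {"statusCode": 200, "body": json.dumps({"outbound": outbound})}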

async / await breaks my Webpack build for AWS Lambda; how can I migrate to Node 8.10?

£可爱£侵袭症+ submitted on 2020-08-23 07:52:29
Question: Note: this is a Q&A on migrating AWS Lambda Webpack builds from Node 6.10 to 8.10 — no help is needed, but even better answers are of course always encouraged! For a while now I have been a disciple of using Webpack to build my back-end Lambdas after reading James Long's excellent series entitled "Backend Apps with Webpack" (part 1, part 2, and part 3). Up until recently, the only version of Node.js that Amazon Web Services offered was 6.10; you had to write your Lambda fn in the "callback" style.

AWS: how to fix S3 event replacing space with '+' sign in object key names in json

ぐ巨炮叔叔 submitted on 2020-08-22 03:33:22
Question: I have a Lambda function to copy objects from bucket 'A' to bucket 'B', and everything was working fine until an object named 'New Text Document.txt' was created in bucket 'A': in the JSON that gets built in the S3 event, the key appears as "key": "New+Text+Document.txt", i.e. the spaces got replaced with '+'. I know it is a known issue from searching the web, but I am not sure how to fix this, and a '+' can actually be in the name of the file itself, like 'New+Text Document.txt'. So I
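For reference, S3 event notifications URL-encode the object key: a literal space arrives as '+' and a literal '+' arrives as '%2B', so decoding with urllib.parse.unquote_plus recovers the original name unambiguously. A minimal sketch, assuming a copy to a placeholder bucket 'bucket-b':

import urllib.parse
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    record = event["Records"][0]
    # 'New Text Document.txt' arrives as 'New+Text+Document.txt',
    # while a real '+' arrives as '%2B', so unquote_plus is safe for both.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    bucket = record["s3"]["bucket"]["name"]
    s3.copy_object(Bucket="bucket-b", Key=key,
                   CopySource={"Bucket": bucket, "Key": key})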

Access-control-allow-origin is not allowed by Access-Control-Allow-Headers in preflight response

空扰寡人 submitted on 2020-08-20 11:32:30
Question: I have prepared a Lambda function using Express (Node.js) and enabled authorization with IAM as well. The API is working in Postman as per the link below: https://www.youtube.com/watch?v=KXyATZctkmQ&t=35s As I'm fairly new to CORS policy and API concepts, I'm trying to access the sample using an Ajax call. So far I have prepared the Authorization header as per the documentation and a few references. Git repo link: https://github.com/mudass1r/aws-iam-authorization.git Reference link for
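What the browser error usually means is that the preflight (OPTIONS) response does not list every header the Ajax call is about to send in Access-Control-Allow-Headers. A rough illustration only, as a plain Lambda proxy handler rather than the asker's Express setup, with placeholder header names:

def lambda_handler(event, context):
    # Preflight response: every header the browser intends to send
    # must appear in Access-Control-Allow-Headers.
    if event.get("httpMethod") == "OPTIONS":
        return {
            "statusCode": 200,
            "headers": {
                "Access-Control-Allow-Origin": "*",
                "Access-Control-Allow-Methods": "GET,POST,OPTIONS",
                "Access-Control-Allow-Headers": "Content-Type,Authorization,X-Amz-Date,X-Amz-Security-Token",
            },
            "body": "",
        }
    return {"statusCode": 200,
            "headers": {"Access-Control-Allow-Origin": "*"},
            "body": "ok"}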

How to create a dict with folders as keys and files as values

流过昼夜 submitted on 2020-08-20 06:27:06
Question: I have a bucket named testfolder. Inside testfolder there are test1, test2, test3, and each folder contains CSV files. I need to create a key-value pair of folders and files. Expected output:

output1 { 'test1':['csv1.csv'], 'test2':['csv2'], 'test3':['csv3']}
output2 { 'test1':'csv1.csv', 'test2':'csv2', 'test3':'csv3'}

# list all the objects
import boto3
s3 = boto3.client("s3")
final_data = {}
all_objects = s3.list_objects(Bucket='testfolder')
# List the objects in the subfolders
# create a dictionary

Answer 1:
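One way to get the output1 shape is to page through the bucket and group keys by their first path segment. A sketch assuming the bucket really is named 'testfolder' and the keys look like 'test1/csv1.csv':

import boto3

def list_by_folder(bucket="testfolder"):
    s3 = boto3.client("s3")
    final_data = {}
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if "/" not in key:
                continue  # skip objects sitting at the bucket root
            folder, _, filename = key.partition("/")
            if filename:  # skip the bare "folder/" placeholder objects
                final_data.setdefault(folder, []).append(filename)
    return final_data
    # e.g. {'test1': ['csv1.csv'], 'test2': ['csv2.csv'], 'test3': ['csv3.csv']}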

How to extract the elements from csv to json in S3

六眼飞鱼酱① submitted on 2020-08-19 17:39:07
Question: I need to find the CSV files in the folder, list all the files inside the folder, convert the files to JSON, and save them in the same bucket. A CSV file looks like below; there are many such CSV files:

emp_id,Name,Company
10,Aka,TCS
11,VeI,TCS

Code is below:

import boto3
import pandas as pd

def lambda_handler(event, context):
    s3 = boto3.resource('s3')
    my_bucket = s3.Bucket('testfolder')
    for file in my_bucket.objects.all():
        print(file.key)
    for csv_f in file.key:
        with open(f'{csv_f.replace(".csv", ".json")}', "w")
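A sketch of the loop the question is reaching for: read each .csv with pandas and write a .json object back to the same bucket. The bucket name is taken from the question; everything else is an assumption.

import boto3
import pandas as pd
from io import StringIO

def lambda_handler(event, context):
    s3 = boto3.client("s3")
    bucket = "testfolder"
    for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
        key = obj["Key"]
        if not key.endswith(".csv"):
            continue
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        df = pd.read_csv(StringIO(body))
        # 'records' orientation turns each CSV row into one JSON object
        json_body = df.to_json(orient="records")
        s3.put_object(Bucket=bucket,
                      Key=key.replace(".csv", ".json"),
                      Body=json_body)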

How to push the data from DynamoDB through a stream

喜夏-厌秋 submitted on 2020-08-17 11:06:12
Question: Below is the JSON file:

[
  { "year": 2013, "title": "Rush", "actors": ["Daniel Bruhl", "Chris Hemsworth", "Olivia Wilde"] },
  { "year": 2013, "title": "Prisoners", "actors": ["Hugh Jackman", "Jake Gyllenhaal", "Viola Davis"] }
]

Below is the code to push to DynamoDB. I have created a bucket named testjsonbucket, moviedataten.json is the filename, and I saved the above JSON there. I created a DynamoDB table with primary partition key year (Number) and primary sort key title (String).

import json
from decimal
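A sketch of the load step, using the bucket and file name from the question and a placeholder table name 'Movies'; parse_float=Decimal is what the decimal import is for, since DynamoDB does not accept Python floats.

import json
from decimal import Decimal
import boto3

def lambda_handler(event, context):
    s3 = boto3.resource("s3")
    obj = s3.Object("testjsonbucket", "moviedataten.json")
    # parse_float=Decimal converts any numeric literals into Decimal,
    # which is what the DynamoDB SDK expects instead of float
    movies = json.loads(obj.get()["Body"].read(), parse_float=Decimal)

    table = boto3.resource("dynamodb").Table("Movies")  # placeholder table name
    with table.batch_writer() as batch:
        for movie in movies:
            # partition key 'year' and sort key 'title' match the table definition
            batch.put_item(Item=movie)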

How to invoke the same Lambda function from two API Gateways

时光怂恿深爱的人放手 submitted on 2020-08-15 13:02:34
Question: In AWS I created 2 API Gateways. First one: https://xx.xx-api.us-east-1.amazonaws.com/v1/uploadapi/?search=all then my Lambda function will invoke the below:

searchone = es.search(index="my-index", body={"query": {"match_all": {}}})
return searchone

Second one: https://xx.xx-api.us-east-1.amazonaws.com/v1/uploadapi/?search=matchphrase=name_computer

searchtwo = es.search(index="my-index", body={"query": {"match": {"name":"computer"}}})
return searchtwo

Basically I need to create a single Lambda function if
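With a single Lambda behind both routes, the handler can branch on the query-string parameters that API Gateway passes in the event. A sketch assuming a Lambda proxy integration; the Elasticsearch endpoint and the extra parameter names are placeholders, not from the question.

import json
from elasticsearch import Elasticsearch

es = Elasticsearch("https://my-es-endpoint:443")  # placeholder endpoint, auth omitted

def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    search = params.get("search", "all")

    if search == "all":
        result = es.search(index="my-index",
                           body={"query": {"match_all": {}}})
    else:
        # e.g. ?search=matchphrase&field=name&value=computer (assumed parameter names)
        field = params.get("field", "name")
        value = params.get("value", "computer")
        result = es.search(index="my-index",
                           body={"query": {"match": {field: value}}})
    return {"statusCode": 200, "body": json.dumps(result)}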

How to sync DynamoDB to Elasticsearch with Insert/Modify/Remove

╄→гoц情女王★ submitted on 2020-08-14 04:33:28
Question: How do I pass this entire document into Elasticsearch using Python? Is this the right way to put it into Elasticsearch? In DynamoDB, id is the primary key. How to insert into DynamoDB; below is the code:

import boto3
from boto3.dynamodb.conditions import Key, And, Attr

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('newtable')
    with table.batch_writer(overwrite_by_pkeys=['id']) as batch:
        batch.put_item(
            Item={
                'id': '1',
                'last_name': 'V',
                'age':
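For the stream side of the design, a handler typically switches on eventName and reuses the DynamoDB id as the Elasticsearch document id, so INSERT and MODIFY become index calls and REMOVE becomes a delete. A rough sketch; the endpoint, index name, and the naive flattening of DynamoDB JSON are all assumptions.

from elasticsearch import Elasticsearch

es = Elasticsearch("https://my-es-endpoint:443")  # placeholder endpoint, auth omitted

def lambda_handler(event, context):
    for record in event["Records"]:
        # Stream images use DynamoDB JSON, e.g. {'id': {'S': '1'}, 'age': {'N': '25'}}
        doc_id = record["dynamodb"]["Keys"]["id"]["S"]

        if record["eventName"] in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"]["NewImage"]
            # naive flattening of DynamoDB JSON into a plain dict
            doc = {k: list(v.values())[0] for k, v in new_image.items()}
            es.index(index="newtable", id=doc_id, body=doc)
        elif record["eventName"] == "REMOVE":
            es.delete(index="newtable", id=doc_id, ignore=[404])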