aws-lambda

S3 Static website hosting: Is it possible to add multiple query string parameters to the target URL in a redirect rule?

試著忘記壹切 submitted on 2019-12-11 09:46:51

Question: I am currently hosting a static website on S3 that uses redirection rules to re-route requests to a Lambda function. The problem is that I have two buckets, both acting as static websites (sandbox vs. production), and I need to point them at the same Lambda function while still being able to tell them apart when the function runs. A header, a GET parameter, string manipulation: anything I can use to determine which bucket the request came from. Does anyone know how
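One common approach (a sketch, not from the original post) is to have each bucket's redirect rule append its own marker to the target URL, so the shared Lambda can read it back out of the event. The parameter name `env` and its values below are assumptions for illustration:

```python
# Minimal sketch: assumes the sandbox bucket's redirect rule appends
# ?env=sandbox and the production bucket's rule appends ?env=production
# to the target URL. The shared Lambda then inspects the query string.

def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    env = params.get("env", "unknown")  # which bucket redirected here
    return {
        "statusCode": 200,
        "body": f"request came from the {env} bucket",
    }
```

Whether S3 routing rules can emit more than one query-string parameter in `ReplaceKeyWith` is exactly the question being asked; a single disambiguating parameter, as sketched here, sidesteps that limit.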

Netlify NodeJS Function always returns 'Response to preflight request doesn't pass'

偶尔善良 submitted on 2019-12-11 08:57:51

Question: I'm trying to create an API endpoint using a Netlify Lambda Function. The code works perfectly locally, but the browser always reports: Access to XMLHttpRequest at 'https://<my-netlify-project>.netlify.com/.netlify/functions/submit' from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. I'm trying to handle OPTIONS and POST in my code, but it doesn

Run AWS Athena queries from a Lambda function

删除回忆录丶 submitted on 2019-12-11 08:06:43

Question: I created a table on AWS Athena on which I can run any query without error: select * from mytestdb.test. The table has three columns: customer_Id, product_Id, price. I tried to create a Lambda function that runs the same query for me using boto3:

import time
import boto3

DATABASE = 'mytestdb'
TABLE = 'test'
output = 's3://mybucketons3/'
COLUMN = 'Customer_Id'

def lambda_handler(event, context):
    keyword = 'xyz12345'
    query = "SELECT * FROM %s.%s where %s = '%s';" % (DATABASE, TABLE, COLUMN,
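A sketch of the full flow, reusing the database, table, and bucket names from the question. The key detail is that Athena is asynchronous: `start_query_execution` returns a `QueryExecutionId` immediately, and the Lambda must poll until the query reaches a terminal state:

```python
import time

DATABASE = "mytestdb"
TABLE = "test"
OUTPUT = "s3://mybucketons3/"
COLUMN = "Customer_Id"

def build_query(keyword):
    # NB: interpolation is fine for a fixed demo value, but user-supplied
    # input should be validated before being spliced into SQL.
    return f"SELECT * FROM {DATABASE}.{TABLE} WHERE {COLUMN} = '{keyword}';"

def run_query(keyword):
    import boto3  # deferred so build_query stays importable without AWS
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString=build_query(keyword),
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state, qid
        time.sleep(1)  # keep Lambda timeout in mind when polling like this
```

A Lambda that fires the query and returns without polling will appear to "do nothing", since the results only land in the S3 output location once the execution succeeds.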

Why does sam package publish the artifacts to an S3 bucket?

不羁岁月 submitted on 2019-12-11 07:49:14

Question: As part of packaging the SAM application, the application is published to an S3 bucket, as shown below: sam package --template-file sam.yaml --s3-bucket mybucket --output-template-file output.yaml. Why does sam package provide the --s3-bucket option? Is it mandatory? What is the purpose of publishing artifacts to an S3 bucket? Answer 1: The --s3-bucket option of the sam package command is mandatory. The command takes your local code, uploads it to S3, and returns a transformed template where
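The transformation the answer is describing looks roughly like the following (an illustrative fragment; the artifact key is a made-up placeholder, in practice it is a hash of the uploaded zip):

```yaml
# Before packaging, CodeUri points at a local path:
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: ./src

# After `sam package`, output.yaml references the uploaded artifact,
# which is what CloudFormation needs at deploy time:
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: s3://mybucket/0f1e2d3c4b5a-example-hash
```

CloudFormation runs in AWS and cannot read files on your machine, which is why the local code must first be staged in S3.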

AWS API Gateway - Manage multiple model versions for different stages

旧巷老猫 submitted on 2019-12-11 07:23:11

Question: We use multiple AWS API Gateways for many of our backend services. This is the standard structure of each Gateway (in relation to my question): API Gateway with 2 stages (test and prod), and the AWS CLI tool for automated changes and deployments. The problem I've had since I started using AWS API Gateway: the Gateway manages versioning of the API perfectly fine, but not model versioning. During a quarter (we work in 3-month sprints), we would make changes to multiple models. This stage
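Since API Gateway does not version models itself, one workaround is to snapshot every model's schema alongside each deployment and version the snapshots externally. The sketch below is not the poster's tooling; the naming scheme is hypothetical:

```python
import json

def model_snapshot_key(api_id, stage, model_name, version):
    # Hypothetical naming scheme for archived model schemas.
    return f"{api_id}/{stage}/v{version}/{model_name}.json"

def export_models(api_id, stage, version):
    import boto3  # deferred so the helper above stays importable without AWS
    apigw = boto3.client("apigateway")
    snapshot = {}
    for model in apigw.get_models(restApiId=api_id)["items"]:
        schema = apigw.get_model(
            restApiId=api_id, modelName=model["name"], flatten=True
        )["schema"]
        snapshot[model_snapshot_key(api_id, stage, model["name"], version)] = (
            json.loads(schema)
        )
    return snapshot
```

The returned dict can then be written to S3 or committed to source control, giving each sprint's model set a retrievable version even though the Gateway only keeps the latest.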

AWS Elastic Transcoder, HLS content protection with No Store: how to get the data key

匆匆过客 submitted on 2019-12-11 06:49:44

Question: I'm using AWS Lambda to create an Elastic Transcoder job with HLS content protection, following the doc here: http://docs.aws.amazon.com/elastictranscoder/latest/developerguide/content-protection.html. At the end it says: "Note: If you choose No Store, Elastic Transcoder returns your data key as part of the job object, but does not store it. You are responsible for storing the data key." But I don't see a way to get the data key once the job is finished. In my AWS Lambda source code I have
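Per the quoted note, the key should come back on the job object itself (from `create_job` or a later `read_job`). The exact field path below (Playlists → HlsContentProtection → Key) is an assumption drawn from the CreateJob request shape, not verified against a live response, so treat this as a sketch:

```python
def extract_hls_keys(job):
    """Collect any HLS content-protection data keys present on a job dict.

    Field names are assumptions based on the CreateJob request structure.
    """
    keys = []
    for playlist in job.get("Playlists", []):
        protection = playlist.get("HlsContentProtection") or {}
        if "Key" in protection:
            keys.append(protection["Key"])
    return keys
```

With No Store, this extraction has to happen while you still hold the job response, since Elastic Transcoder will not return the key again later; persist it yourself (e.g. encrypted in DynamoDB or S3).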

Step Functions with Lambdas using Lambda Proxy Integration

ⅰ亾dé卋堺 submitted on 2019-12-11 06:37:25

Question: I have written a bunch of Lambda functions that are exposed as REST endpoints through API Gateway. I chose the "Lambda Proxy Integration" since it seemed like a straightforward way to get started. Now I want to chain two of these functions together via AWS Step Functions. The general integration and configuration work fine; the open question is how to create the proper input for each task. Using the console I can start an execution and give the following JSON: { "headers": { "Authorization": "Bearer
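Proxy-integration handlers expect an API-Gateway-shaped event, so one option is to wrap each task's plain payload into that shape before invoking the next function (either in a small shim, as sketched here, or via a `Parameters` block in the state machine definition). The field choices below mirror the proxy event format but are a pared-down assumption:

```python
import json

def to_proxy_event(payload, headers=None, method="POST", path="/"):
    """Wrap a plain dict into a minimal Lambda-proxy-style event."""
    return {
        "httpMethod": method,
        "path": path,
        "headers": headers or {},
        "queryStringParameters": None,
        # Proxy handlers expect the body as a JSON *string*, not a dict.
        "body": json.dumps(payload),
    }
```

The inverse problem also exists: a proxy handler returns `{"statusCode": ..., "body": "<json string>"}`, so the next state must unwrap and parse `body` before it can use the data.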

List RDS snapshots created today using Boto 3

笑着哭i submitted on 2019-12-11 06:35:11

Question: I am writing a Python Lambda function to list the RDS snapshots created today. The challenge is how to convert datetime.datetime.today() into a format the RDS client understands. UPDATE: I have implemented some of the suggested changes, adding a string variable to convert the date expression into a format which the Boto3 RDS client understands. 'SnapshotCreateTime': datetime(2015, 1, 1), today = (datetime.today()).date() rds_client = boto3.client('rds') snapshots = rds_client.describe_db
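As far as I know, `describe_db_snapshots` has no server-side "created today" filter, so a common pattern is to fetch the snapshots and compare each `SnapshotCreateTime` (a timezone-aware datetime in the response) against today's date client-side. A sketch, with the filtering split out so it can be tested without AWS:

```python
from datetime import datetime, timezone

def created_today(snapshot, today=None):
    """True if the snapshot's SnapshotCreateTime falls on `today` (UTC)."""
    today = today or datetime.now(timezone.utc).date()
    return snapshot["SnapshotCreateTime"].date() == today

def todays_snapshots(snapshots, today=None):
    return [
        s["DBSnapshotIdentifier"]
        for s in snapshots
        if created_today(s, today)
    ]
```

In the Lambda itself you would feed it `rds_client.describe_db_snapshots()["DBSnapshots"]`; note the comparison is date-to-date, which avoids the formatting problem the question ran into.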

Invoke an Alexa device from a Lambda function

淺唱寂寞╮ submitted on 2019-12-11 06:29:48

Question: I am new to Alexa development. I have successfully created an Alexa skill with an AWS Lambda function and Node.js code. It works fine with my Alexa Echo Plus device, e.g.: Alexa, open "mySampleApp". Now I need to make the Alexa device speak, triggered from another Lambda function. Is that possible? E.g.: I execute my Lambda function, and I need speech output on my Alexa Echo Plus device. Answer 1: Sounds like you want to trigger a notification (make speech output) from outside of your
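One mechanism for this is the Alexa Proactive Events API, which lets a skill push a notification that the device announces. The request-body sketch below follows my reading of that API's schemas and should be verified against the docs; actually sending it additionally requires an access token from Login with Amazon, which is omitted here:

```python
from datetime import datetime, timedelta, timezone
import uuid

def build_proactive_event(user_id, creator_name):
    """Assemble an (assumed) Proactive Events request body for a
    MessageAlert notification aimed at a single user."""
    now = datetime.now(timezone.utc)
    return {
        "timestamp": now.isoformat(),
        "referenceId": str(uuid.uuid4()),
        "expiryTime": (now + timedelta(hours=1)).isoformat(),
        "event": {
            "name": "AMAZON.MessageAlert.Activated",
            "payload": {
                "state": {"status": "UNREAD", "freshness": "NEW"},
                "messageGroup": {"creator": {"name": creator_name}, "count": 1},
            },
        },
        "relevantAudience": {"type": "Unicast", "payload": {"user": user_id}},
    }
```

Note that proactive events only work for schema-defined notification types (message alerts, reminders, etc.); arbitrary free-form speech pushed from outside a session is not something the platform exposes, as the answer goes on to explain.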

Node.js async issue while decrypting AWS KMS keys

有些话、适合烂在心里 submitted on 2019-12-11 06:22:54

Question: I have a Lambda function on Node 6 which has 5 environment variables, all encrypted with AWS KMS. I have the following method, which takes an encrypted key and should return the decrypted key:

function decryptKMS(encryptedKey) {
  console.log('inside decryptkms');
  const kms = new AWS.KMS();
  kms.decrypt({ CiphertextBlob: new Buffer(encryptedKey, 'base64') }, (err, data) => {
    if (err) {
      console.log('Decrypt error:', err);
      return callback(err);
    }
    var result = data.Plaintext.toString('ascii');
    return result;
  });
}

And
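The usual fix (a sketch, not the poster's final code): `kms.decrypt` is asynchronous, so the plaintext cannot be `return`ed from inside its callback, and `callback` is not defined in that scope at all. Wrapping the call in a Promise lets callers `await` the result; injecting the client also keeps the helper testable:

```javascript
// Promise wrapper around the callback-style KMS decrypt call.
// In AWS SDK v2 you could equivalently use kms.decrypt(params).promise().
function decryptKMS(encryptedKey, kms) {
  return new Promise((resolve, reject) => {
    kms.decrypt(
      { CiphertextBlob: Buffer.from(encryptedKey, "base64") },
      (err, data) => {
        if (err) return reject(err); // propagate the failure to the caller
        resolve(data.Plaintext.toString("ascii"));
      }
    );
  });
}
```

Usage would then look like `const plain = await decryptKMS(process.env.MY_KEY, new AWS.KMS());`, with all five environment variables decryptable via `Promise.all`. (`Buffer.from` also replaces the deprecated `new Buffer` constructor.)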