aws-lambda

Save voice message in AWS S3 bucket using Amazon Connect

≯℡__Kan透↙ submitted 2019-12-11 12:27:42

Question: How do I save a customer's voice message and store it in an S3 bucket using Amazon Connect? I made a contact workflow, but I do not understand how to save the voice message to an S3 bucket.

Answer 1: Recording to S3 only starts once an agent takes the call. Currently there is no direct voicemail feature in Amazon Connect; you can forward the call to a service that supports it, such as Twilio.

Answer 2: As soon as you enable voice recording, all recordings are placed automatically in the bucket.
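Once recording is enabled, the recordings land in the bucket configured on the Connect instance. A minimal sketch for finding them, assuming the default key layout `connect/<instance-alias>/CallRecordings/YYYY/MM/DD/` (verify the prefix in your instance's data-storage settings):

```python
from datetime import date


def recording_prefix(instance_alias: str, day: date) -> str:
    """Build the S3 key prefix Amazon Connect uses for call recordings
    (assumed default layout: connect/<alias>/CallRecordings/YYYY/MM/DD/)."""
    return f"connect/{instance_alias}/CallRecordings/{day:%Y/%m/%d}/"


def list_recordings(bucket: str, instance_alias: str, day: date):
    """List one day's recording objects; requires AWS credentials."""
    import boto3  # imported here so the pure helper above needs no AWS setup
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(
        Bucket=bucket, Prefix=recording_prefix(instance_alias, day))
    return [obj["Key"] for obj in resp.get("Contents", [])]
```

The bucket name and instance alias here are placeholders; both come from your Connect instance configuration.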

Publish to AWS… Missing from Visual Studio 2017

老子叫甜甜 submitted 2019-12-11 12:18:02

Question: I am trying to publish a Lambda function to AWS, but the right-click menu of my project in Visual Studio 2017's Solution Explorer no longer has the "Publish to AWS..." option. It was there just a day ago. How can "Publish to AWS..." be added back to the Visual Studio 2017 right-click menu?

Answer 1: You need to add the Amazon.Lambda.Tools package to your project using the NuGet package manager. If adding the package through NuGet fails, add these lines to your .csproj file inside the <project> tags

How High Is the s3 File Size Download Limit If I Move My Google App Engine Service to an Amazon Lambda Function?

本小妞迷上赌 submitted 2019-12-11 12:15:09

Question: Broadly, I seek to answer the question, "Should I redevelop my app on the Amazon Web Services (AWS) platform?" As described in another post, I have a Google App Engine (GAE) php55 service (which I am now contemplating migrating to AWS) that periodically checks a public web server and downloads a file directly to a Google Cloud Storage bucket. This file is typically small (<1 MB) but occasionally exceeds GAE's 32 MB individual response size limit. My simple app is based on the following: <
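For comparison, S3's documented limits are far above 32 MB: a single PUT accepts up to 5 GB, and multipart upload (parts of 5 MB-5 GB, at most 10,000 parts) takes objects up to 5 TB. The practical Lambda constraints are elsewhere: default 512 MB of `/tmp` scratch space and a roughly 6 MB synchronous response payload, so large files should be streamed to S3 rather than returned. A small sketch of the arithmetic:

```python
import math

# Documented S3 limits: single PUT up to 5 GB; multipart parts 5 MB-5 GB.
MAX_SINGLE_PUT = 5 * 1024**3
MIN_PART_SIZE = 5 * 1024**2


def plan_upload(size_bytes: int, part_size: int = 8 * 1024**2) -> int:
    """Return the number of multipart parts needed, or 1 if a single
    PUT suffices. part_size must meet the 5 MB S3 minimum."""
    if size_bytes <= MAX_SINGLE_PUT:
        return 1
    if part_size < MIN_PART_SIZE:
        raise ValueError("S3 multipart parts must be at least 5 MB")
    return math.ceil(size_bytes / part_size)
```

A 32 MB file, the one that breaks GAE, fits comfortably in a single PUT.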

Recipient rule set to put emails into dynamic s3 bucket Amazon

烂漫一生 submitted 2019-12-11 12:02:42

Question: I am trying to set up an Amazon SES recipient rule set that puts incoming emails into an S3 bucket. I have created the bucket, and I want the mails sorted into folders according to the recipient address: for example, mail to 1@mydomain.com should go into mytestbucket/1, and mail to 2@mydomain.com should go into mytestbucket/2. AWSCredentials awsCredentials = new BasicAWSCredentials(accessKey, secretKey); AmazonSimpleEmailServiceClient sesClient = new
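An SES receipt rule's S3 action writes to a single fixed prefix, keyed by message ID, so a per-recipient prefix cannot be configured in the rule itself. A common workaround is to have the rule also invoke a Lambda that re-files the object. A sketch, assuming the rule first stores messages under an `incoming/` prefix (that prefix and the bucket name are illustrative):

```python
def target_key(recipient: str, message_id: str) -> str:
    """Map 1@mydomain.com -> '1/<message_id>': the local part of the
    recipient address becomes the folder name."""
    local_part = recipient.split("@", 1)[0]
    return f"{local_part}/{message_id}"


def resort_email(event, bucket="mytestbucket"):
    """Hypothetical Lambda wired to the SES receipt rule: copy the stored
    message under its per-recipient prefix."""
    import boto3
    s3 = boto3.client("s3")
    mail = event["Records"][0]["ses"]["mail"]
    key = target_key(mail["destination"][0], mail["messageId"])
    s3.copy_object(
        Bucket=bucket, Key=key,
        CopySource={"Bucket": bucket, "Key": f"incoming/{mail['messageId']}"})
```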

AWS Lambda connect via PG.js to RDS Postgres database (connection made, no timeout, but no database?)

血红的双手。 submitted 2019-12-11 11:55:46

Question: My Lambda function is trying to connect to an RDS PostgreSQL DB. Since I use https://serverless.com/ to deploy the function (it sets up my CloudFormation stack), it puts the Lambda function in a separate VPC from the RDS DB. Not a big issue. If you read https://docs.aws.amazon.com/lambda/latest/dg/services-rds-tutorial.html you see you can set up the serverless.yml file (as below) with the subnet and security group IDs, and then give the Lambda function a role that has AWSLambdaVPCAccessExecutionRole (I gave
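The serverless.yml fragment the question alludes to looks roughly like the following (IDs are placeholders; the function's security group must be allowed inbound on port 5432 by the RDS instance's security group, which is the piece most often missed):

```yaml
provider:
  name: aws
  runtime: nodejs8.10

functions:
  query:
    handler: handler.query
    vpc:
      securityGroupIds:
        - sg-0a1b2c3d          # must be permitted by the RDS security group
      subnetIds:
        - subnet-aaaa1111      # private subnets in the same VPC as RDS
        - subnet-bbbb2222
```

The `vpc` block can also be set at the `provider` level to apply to every function in the service.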

AWS API Gateway with Lambda proxy integration fails

我是研究僧i submitted 2019-12-11 10:45:17

Question: My problem is pretty simple, really. I wrote a Scala handler class that returns a String containing the following JSON: { "isBase64Encoded":false, "statusCode":404, "headers":{ "Content-Type":"text/html" }, "body":"<p>The requested resource could not be found.</p>" } My class has the following signature: object Handler extends RequestHandler[APIGatewayProxyRequestEvent, String] { override def handleRequest(input: APIGatewayProxyRequestEvent, context: Context): String = ??? } I am using the
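With proxy integration, the handler's return value must *be* the response object, not a pre-serialized string of it: Lambda serializes a `String` return as a quoted JSON string, so API Gateway sees a string instead of an object and reports a malformed proxy response. On the JVM the usual fix is returning `APIGatewayProxyResponseEvent` from `aws-lambda-java-events` rather than `String`. The same shape in Python, where the runtime serializes the returned dict for you:

```python
def handler(event, context):
    # Return the response structure itself; do NOT json.dumps() the
    # whole thing, or API Gateway receives a doubly-encoded string.
    return {
        "isBase64Encoded": False,
        "statusCode": 404,
        "headers": {"Content-Type": "text/html"},
        "body": "<p>The requested resource could not be found.</p>",
    }
```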

Unable to import submodules of scipy in AWS Lambda

耗尽温柔 submitted 2019-12-11 10:42:35

Question: I have imported several packages into my AWS Lambda Python code. scipy is installed and seems to import correctly with import scipy. However, when I try to import submodules, like from scipy.signal import lfilter, the import fails inside the Lambda function. I have no clue why the package import works locally but not on Lambda. Here is my worker.py: import logging def lambda_handler(event, context): bucket = event['Records'][0]['s3']['bucket']['name']
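This symptom usually means the deployment package is missing scipy's compiled shared libraries (e.g. it was built on Windows or macOS instead of an Amazon Linux-compatible environment): the pure-Python `scipy` top level imports, but submodules with C extensions like `scipy.signal` do not. A sketch that surfaces the real error and keeps the event parsing testable:

```python
import logging

logger = logging.getLogger(__name__)


def s3_object_from_event(event):
    """Pull bucket and key out of the S3 trigger event (pure, easy to test)."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]


def lambda_handler(event, context):
    # Importing inside the handler puts the full ImportError traceback in
    # the CloudWatch logs instead of a generic initialization failure.
    try:
        from scipy.signal import lfilter  # noqa: F401
    except ImportError:
        logger.exception("scipy.signal unavailable - was the package built "
                         "with its compiled .so files on Amazon Linux?")
        raise
    bucket, key = s3_object_from_event(event)
    return {"bucket": bucket, "key": key}
```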

Send Rekognition response to DynamoDB table using Lambda (Python)

╄→гoц情女王★ submitted 2019-12-11 10:02:30

Question: I am using Lambda to detect faces and would like to send the response to a DynamoDB table. This is the code I am using: rekognition = boto3.client('rekognition', region_name='us-east-1') dynamodb = boto3.client('dynamodb', region_name='us-east-1') # --------------- Helper Functions to call Rekognition APIs ------------------ def detect_faces(bucket, key): response = rekognition.detect_faces(Image={"S3Object": {"Bucket": bucket, "Name": key}}, Attributes=['ALL']) TableName = 'table_test' for face
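The loop is cut off, but the remaining work is converting each entry of `response['FaceDetails']` into the low-level DynamoDB attribute-value format that `put_item` expects. A sketch, where the table layout (image key plus face index as primary key, plus a few attributes) is an assumption for illustration:

```python
def face_item(key: str, index: int, face: dict) -> dict:
    """Convert one Rekognition FaceDetail into a low-level DynamoDB item.
    Numbers must be passed as strings under the "N" type tag."""
    return {
        "ImageKey": {"S": key},
        "FaceIndex": {"N": str(index)},
        "Confidence": {"N": str(face["Confidence"])},
        "AgeLow": {"N": str(face["AgeRange"]["Low"])},
        "AgeHigh": {"N": str(face["AgeRange"]["High"])},
    }


def store_faces(bucket: str, key: str, table_name: str = "table_test"):
    """Detect faces and write one item per face (needs AWS credentials)."""
    import boto3
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    dynamodb = boto3.client("dynamodb", region_name="us-east-1")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        Attributes=["ALL"])
    for i, face in enumerate(response["FaceDetails"]):
        dynamodb.put_item(TableName=table_name, Item=face_item(key, i, face))
```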

Upload Image into AWS S3 bucket using Aws Lambda

大兔子大兔子 submitted 2019-12-11 09:58:22

Question: I would like some suggestions on uploading an image file to an S3 bucket using a Lambda function. I am able to create a bucket from Lambda but unable to upload a file to S3 from it. Is that possible? Can we upload local files (image, text, etc.) to an S3 bucket using Lambda? When I try to upload C:\users\images.jpg to S3 from my Lambda function, it shows the error: Error: ENOENT, no such file or directory 'C:\Users\Images'. Please suggest. Thanks.

Answer 1: You have to imagine
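The ENOENT error is expected: the Lambda runtime executes on AWS's servers and cannot see your local disk, so a path like C:\users\images.jpg does not exist there. The file's content has to travel inside the invocation (or already sit in S3 or `/tmp`). A sketch, assuming the caller sends the image base64-encoded in the event along with a target bucket and key:

```python
import base64


def decode_image(event) -> bytes:
    """The image must arrive inside the event (here as base64), because
    Lambda cannot read local paths from the caller's machine."""
    return base64.b64decode(event["image_base64"])


def lambda_handler(event, context):
    """Hypothetical handler: write the event's image into an S3 bucket."""
    import boto3
    body = decode_image(event)
    boto3.client("s3").put_object(
        Bucket=event["bucket"], Key=event["key"], Body=body)
    return {"bytes_written": len(body)}
```

For large files, a presigned URL generated by Lambda and used by the client directly is the more common pattern.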

Could not get the syntax of the policy definition in a SAM template resource (serverless function)

会有一股神秘感。 submitted 2019-12-11 09:47:49

Question: The policy definition of the AWS managed policy AWSLambdaExecute is: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:*" ], "Resource": "arn:aws:logs:*:*:*" }, { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:PutObject" ], "Resource": "arn:aws:s3:::*" } ] } But the AWS documentation gives a sample serverless function using the same policy name AWSLambdaExecute, as shown below: Type: AWS::Serverless::Function Properties: Handler: index.js Runtime: nodejs8.10
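The two are not in conflict: SAM's `Policies` property accepts the *name* of an AWS managed policy and attaches it to the execution role it generates, so the JSON document never appears in the template. A minimal sketch of the resource:

```yaml
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.js
      Runtime: nodejs8.10
      # SAM resolves the managed-policy name to its ARN and attaches it
      # to the auto-created execution role; no inline JSON is needed.
      Policies:
        - AWSLambdaExecute
```

`Policies` also accepts inline policy documents and SAM policy templates, which is when the JSON form would appear in the template.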