aws-lambda

Lambda S3 Put function not triggering for larger files

Submitted by 这一生的挚爱 on 2021-01-18 10:09:41

Question: I am currently exploring storing the attachments of an email separately from the .eml file itself. I have an SES rule set that delivers an inbound email to a bucket. When the bucket receives the email, an S3 Put Lambda function parses the raw email (MIME format), base64-decodes the attachment buffers, and does a putObject for each attachment and the original .eml file to a new bucket. My problem is that this Lambda function does not trigger for emails with attachments exceeding ~3-4 MB. The …
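
A common cause of this symptom, offered here as a hypothesis rather than a confirmed diagnosis: larger objects often land in S3 as multipart uploads, which emit s3:ObjectCreated:CompleteMultipartUpload events rather than s3:ObjectCreated:Put, so a notification configured only for the Put event never fires for big files. A minimal sketch of broadening the trigger with boto3 (bucket name and function ARN are hypothetical):

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="inbound-email-bucket",  # hypothetical bucket name
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                # Hypothetical function ARN; substitute your own.
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:parse-email",
                # "s3:ObjectCreated:*" covers Put, Post, Copy, and
                # CompleteMultipartUpload, unlike "s3:ObjectCreated:Put" alone.
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```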

Run Crawler using custom resource Lambda

Submitted by 久未见 on 2021-01-07 16:31:51

Question: I am trying to create and invoke an AWS Glue crawler using CloudFormation. The crawler (with DynamoDB as its target) is created inside a Lambda function. How can I achieve all of this using CloudFormation? That is: create the Lambda function from code present in S3; once the Lambda function is created, trigger it to create the crawler; then invoke the crawler to create the targeted tables. I want all of this in CloudFormation. Link for reference: Is it possible to trigger a lambda on …
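
For reference, a minimal sketch of what the custom-resource handler could look like, assuming hypothetical crawler, role, database, and table names; cfnresponse is the helper module CloudFormation provides to inline (ZipFile) custom-resource Lambdas:

```python
import boto3
import cfnresponse  # provided by CloudFormation for inline (ZipFile) code

glue = boto3.client("glue")

def lambda_handler(event, context):
    try:
        if event["RequestType"] == "Create":
            # Hypothetical names; substitute your own role, database, and table.
            glue.create_crawler(
                Name="ddb-crawler",
                Role="GlueCrawlerRole",
                DatabaseName="ddb_catalog",
                Targets={"DynamoDBTargets": [{"Path": "my-table"}]},
            )
            glue.start_crawler(Name="ddb-crawler")
        elif event["RequestType"] == "Delete":
            glue.delete_crawler(Name="ddb-crawler")
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception as exc:
        cfnresponse.send(event, context, cfnresponse.FAILED, {"Error": str(exc)})
```

The template would then declare a Custom:: resource whose ServiceToken is this function's ARN, so creating the stack runs the handler and in turn creates and starts the crawler.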

Added custom lambda layer caused SSL authorization error when calling AWS S3

Submitted by ε祈祈猫儿з on 2021-01-07 07:00:19

Question: I created a Lambda layer for AWS Lambda in Python 3.8; however, it causes an SSL authorization error when calling S3 from my Lambda function (even though none of the packages in the layer are imported in the main Lambda function). The following code succeeds when no layer is attached, but fails once my custom layer is added:

```python
import boto3

def lambda_handler(event, context):
    c1 = boto3.client("s3")
    lst = c1.list_buckets()
    print(lst)
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda'
    }
```
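
One way to narrow this down, offered as a diagnostic sketch rather than a fix: a layer is mounted at /opt and its python/ directory is placed on sys.path, so a layer that bundles its own botocore, urllib3, or certifi silently shadows the runtime's copies even when nothing from the layer is imported explicitly. Logging which copies actually resolve makes any shadowing visible:

```python
import ssl
import botocore
import certifi

def lambda_handler(event, context):
    # If any of these paths point under /opt/python, the layer is
    # shadowing the runtime's copy of that package.
    print("botocore:", botocore.__version__, botocore.__file__)
    print("certifi CA bundle:", certifi.where())
    print("OpenSSL:", ssl.OPENSSL_VERSION)
    return {"statusCode": 200}
```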

How to use zappa in gitlab CI/CD to deploy app to AWS Lambda?

Submitted by 北城以北 on 2021-01-07 06:42:27

Question: I am trying to deploy a Flask application to AWS Lambda via Zappa through GitLab CI. Since inline editing isn't possible via GitLab CI, I generated the zappa_settings.json file on my remote computer and I am trying to use it to run zappa deploy dev. My zappa_settings.json file:

```json
{
  "dev": {
    "app_function": "main.app",
    "aws_region": "eu-central-1",
    "profile_name": "default",
    "project_name": "prices-service-",
    "runtime": "python3.7",
    "s3_bucket": -MY_BUCKET_NAME-
  }
}
```

My .gitlab-ci.yml file: …
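
The asker's .gitlab-ci.yml is cut off above. Purely as an illustrative sketch, not the asker's file, a Zappa deploy job often looks something like the following; the image, stage name, and credential variables are all assumptions:

```yaml
deploy:
  image: python:3.7
  stage: deploy
  script:
    - pip install zappa flask
    # Zappa reads "profile_name" from zappa_settings.json, so a
    # credentials file must exist inside the CI job; these variables
    # are assumed to be defined as protected GitLab CI/CD variables.
    - mkdir -p ~/.aws
    - echo "[default]" > ~/.aws/credentials
    - echo "aws_access_key_id = $AWS_ACCESS_KEY_ID" >> ~/.aws/credentials
    - echo "aws_secret_access_key = $AWS_SECRET_ACCESS_KEY" >> ~/.aws/credentials
    # "update" works for an existing deployment; fall back to "deploy"
    # on the first run.
    - zappa update dev || zappa deploy dev
```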

How to locally debug dependencies in a lambda layer?

Submitted by 限于喜欢 on 2021-01-07 03:10:59

Question: I'm creating a Lambda layer from a Dockerfile that installs Python packages to a directory and zips the result:

```dockerfile
FROM amazonlinux
WORKDIR /
RUN yum update -y

# Install Python 3.7
RUN yum install python3 zip -y
RUN pip3.7 install --upgrade pip

# Install Python packages
RUN mkdir /packages
RUN echo "opencv-python" >> /packages/requirements.txt
RUN mkdir -p /packages/opencv-python-3.7/python/lib/python3.7/site-packages
RUN pip3.7 install -r /packages/requirements.txt -t /packages/opencv-python-3.7/python/lib/python3.7/site-packages
```
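
To address the local-debugging part of the title: inside Lambda, a layer is extracted under /opt and Python packages are resolved from the layer's python/ directory. One sketch, assuming the zip built above has been extracted to a local ./layer directory (a hypothetical path), is to prepend that directory to sys.path so a local debugger steps into the exact copies the layer ships:

```python
import sys

# Hypothetical local path where the layer zip was extracted;
# mirrors the layout built by the Dockerfile above.
LAYER_SITE_PACKAGES = "./layer/python/lib/python3.7/site-packages"

# Prepend so the layer's packages win over any locally installed
# versions, just as /opt/python does inside the Lambda runtime.
sys.path.insert(0, LAYER_SITE_PACKAGES)

import cv2  # now resolved from the layer's directory, not the local env
print(cv2.__file__, cv2.__version__)
```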

Serverless issue with AWS Lambda and Open CV in Python

Submitted by 落爺英雄遲暮 on 2021-01-07 03:02:40

Question: I am developing a microservice to analyze an image uploaded to an AWS S3 bucket. I am using the Serverless framework. I am using virtualenv to install the dependencies with pip, and the serverless-python-requirements plugin to deploy these dependencies to the Lambda function. However, I get an error when I deploy the microservice because of a missing .so file. The error I get is: Unable to import module 'handlers.image': libgthread-2.0.so.0: cannot open shared object file: No such file. My …
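
The question is cut off above, but the error itself is telling: libgthread-2.0.so.0 is a GLib shared library that the GUI-capable opencv-python wheel links against and that the Lambda runtime does not ship. A common workaround, stated here as an assumption rather than the asker's eventual fix, is to depend on opencv-python-headless, which drops the GUI code and its native library requirements:

```python
# requirements.txt: replace "opencv-python" with "opencv-python-headless";
# the headless wheel has no libgthread/GTK dependency and imports on Lambda.

# handlers/image.py -- a minimal sketch using the headless wheel
import cv2

def handler(event, context):
    # Decoding and image processing behave as in opencv-python; only GUI
    # functions such as cv2.imshow are unavailable in the headless build.
    return {"statusCode": 200, "body": f"Loaded OpenCV {cv2.__version__}"}
```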
