aws-lambda

How can a Cognito user be assigned to a group in an iOS app

Submitted by ↘锁芯ラ on 2020-01-05 05:32:26
Question: I have created a Cognito user group in the AWS console. I am able to create a Cognito user from my iOS app, and I can see the user record in the AWS console. Now I need to add this user to a group. Is it possible to do this from the app without using the AWS console, i.e. when a new user is created, the user should be added to a group? This should be handled in the iOS app.

Answer 1: You can create and manage groups in a user pool from the AWS Management Console, the APIs, and the CLI. As a developer …
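The answer above is cut off, but the usual pattern is to keep the admin call out of the iOS app entirely: attach a post-confirmation Lambda trigger to the user pool and let it call `admin_add_user_to_group`. A minimal sketch with boto3, assuming a pre-existing group whose name ("standard-users") is hypothetical:

```python
# Sketch: a Cognito post-confirmation Lambda trigger that adds every newly
# confirmed user to a group, so the iOS app never needs admin credentials.
# The group name "standard-users" is a placeholder, not from the question.

def group_assignment(event, group_name="standard-users"):
    """Build the admin_add_user_to_group parameters from a
    post-confirmation trigger event (pure helper, testable offline)."""
    return {
        "UserPoolId": event["userPoolId"],
        "Username": event["userName"],
        "GroupName": group_name,
    }

def lambda_handler(event, context):
    import boto3  # imported lazily so the helper above runs without AWS
    cognito = boto3.client("cognito-idp")
    cognito.admin_add_user_to_group(**group_assignment(event))
    # Cognito triggers must return the event object unchanged.
    return event
```

Because the trigger runs with the Lambda execution role, the iOS client needs no extra permissions; it just signs the user up as before.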

Node.js AWS Lambda inconsistent s3.putObject upload of large data object

Submitted by *爱你&永不变心* on 2020-01-05 05:31:16
Question: Here is the Lambda code I am using to read a table and then upload the results to S3:

'use strict';
const pg = require('pg');
const aws = require('aws-sdk');
const awsParamStore = require('aws-param-store');

exports.handler = async function (context) {
    function putObjectToS3(bucket, key, data) {
        var s3 = new aws.S3();
        var params = { Bucket: bucket, Key: key, Body: data };
        s3.putObject(params, function (err, data) {
            if (err) console.log(err, err.stack); // an error occurred
            else console.log …
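The excerpt is cut off, but the visible shape already suggests the likely cause of the inconsistency: in an async Node handler, `s3.putObject(params, callback)` is fired and never awaited, so the handler can return (and the sandbox can freeze) before the upload finishes. As a hedged Python sketch of the same upload step (names hypothetical) — boto3's `put_object` is a blocking call, so this failure mode cannot occur:

```python
import json

def build_put_params(bucket, key, rows):
    """Serialize the query results and build put_object arguments
    (pure helper, testable without AWS)."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": json.dumps(rows),
        "ContentType": "application/json",
    }

def upload_rows(bucket, key, rows):
    import boto3  # lazy import keeps the helper above runnable offline
    s3 = boto3.client("s3")
    # put_object blocks until S3 responds, so the Lambda handler cannot
    # finish early -- the Node equivalent is `await s3.putObject(p).promise()`.
    return s3.put_object(**build_put_params(bucket, key, rows))
```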

Copy files from S3 to EMR local using Lambda

Submitted by 被刻印的时光 ゝ on 2020-01-05 04:57:28
Question: I need to move files from S3 to EMR's local dir /home/hadoop programmatically using Lambda. S3DistCp copies over to HDFS; I then log in to EMR and run an hdfs CopyToLocal command on the command line to get the files to /home/hadoop. Is there a programmatic way, using boto3 in Lambda, to copy from S3 to EMR's local dir?

Answer 1: I wrote a test Lambda function that submits a job step to EMR that copies files from S3 to EMR's local dir. This worked.

emrclient = boto3.client('emr', region_name='us-west-2') …
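The answer's code is cut off after the client is created. A sketch of what such a step submission typically looks like, assuming `command-runner.jar` runs an `aws s3 cp --recursive` on the master node (step name and paths are placeholders):

```python
def build_copy_step(src, dest="/home/hadoop/"):
    """EMR step that copies an S3 prefix to the master node's local
    filesystem via command-runner.jar (pure helper)."""
    return {
        "Name": "copy-s3-to-local",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["aws", "s3", "cp", src, dest, "--recursive"],
        },
    }

def submit_copy(cluster_id, src):
    import boto3  # lazy import so build_copy_step stays testable offline
    emr = boto3.client("emr", region_name="us-west-2")
    return emr.add_job_flow_steps(JobFlowId=cluster_id,
                                  Steps=[build_copy_step(src)])
```

Note that a step runs only on the master node; if every node needs the files, a bootstrap action or per-node copy would be required instead.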

AWS Lambda Function not joining VPC

Submitted by 試著忘記壹切 on 2020-01-05 04:54:10
Question: I am trying to connect to my AWS Aurora DB. After following the documentation guide three times over, I received the same timeout error on the MySQL connection. After digging in, it seems that my Lambda function is simply not joining the VPC. I will list some outputs (with unnecessary lines removed) to show how I came to this conclusion. If anyone can point out where I went wrong in my configuration, please let me know. Before anyone mentions it, yes, I have checked the db program variables many …
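The question is cut off before the configuration dumps, so no definitive fix can be given, but attaching a function to a VPC boils down to two things: a `VpcConfig` with subnets and security groups, and an execution role allowed to create ENIs (e.g. the managed `AWSLambdaVPCAccessExecutionRole` policy). A hedged boto3 sketch, with all names hypothetical:

```python
def build_vpc_config(subnet_ids, security_group_ids):
    """VpcConfig payload for update_function_configuration (pure helper)."""
    return {
        "SubnetIds": list(subnet_ids),
        "SecurityGroupIds": list(security_group_ids),
    }

def attach_to_vpc(function_name, subnet_ids, security_group_ids):
    import boto3  # lazy import keeps the helper above testable offline
    lam = boto3.client("lambda")
    # If the execution role lacks ec2:CreateNetworkInterface permissions,
    # the function never gets an ENI in the VPC and DB connections time out,
    # which matches the symptom described above.
    return lam.update_function_configuration(
        FunctionName=function_name,
        VpcConfig=build_vpc_config(subnet_ids, security_group_ids),
    )
```

The security group on the Aurora cluster must also allow inbound MySQL (3306) from the security group given to the function, or the connection still times out even once the function is inside the VPC.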

AWS Lambda Python: lots of “could not create '/var/task/__pycache__/FILENAME.pyc'” messages

Submitted by 梦想的初衷 on 2020-01-05 04:51:19
Question: In the configuration for my Python 3.6 AWS Lambda function, I set the environment variable PYTHONVERBOSE with a setting of 1. Then the CloudWatch logs for my function show lots of messages similar to:

could not create '/var/task/__pycache__/auth.cpython-36.pyc': OSError(30, 'Read-only file system')

Is this important? Do I need to fix it?

Answer 1: I don't think you can write in the /var/task/ folder. If you want to write something to disk inside of the Lambda runtime, try the /tmp folder.
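The messages are harmless: /var/task (the deployed package) is mounted read-only, so the interpreter simply fails to cache bytecode and moves on. A small sketch of the two usual mitigations — suppressing bytecode writes entirely, and routing any real writes to /tmp, the only writable path in a Lambda sandbox:

```python
import os
import sys

# Setting PYTHONDONTWRITEBYTECODE=1 in the function's environment (or the
# equivalent flag below) stops the interpreter from attempting to create
# __pycache__ at all, which silences the "could not create" messages.
sys.dont_write_bytecode = True

def writable_path(filename, base="/tmp"):
    """Map a relative filename into Lambda's writable scratch space."""
    return os.path.join(base, filename)
```

Anything written under /tmp is scratch space only: it survives warm invocations of the same sandbox but not new ones, so it should be treated as a cache, not storage.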

Amazon Lambda Java function not inserting data into DynamoDB

Submitted by 血红的双手。 on 2020-01-04 22:24:52
Question: Currently I am working with Amazon Web Services. I created a Java Lambda function using the Eclipse IDE. After creating the function, I tested it in Eclipse using a JUnit test: the Lambda function executed successfully and inserted data into DynamoDB. But here is the problem: when I run the function on Lambda from Eclipse using the "Run Function on AWS Lambda" feature, data is not inserted and it throws an exception. I also tested the method through API Gateway and got the same exception. Exception: …
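The exception text is cut off above, so the exact cause is unknowable from this excerpt; the usual difference between a working local JUnit run and a failing Lambda run is the credentials in play. Locally the code uses the developer's profile; on Lambda it uses the execution role, which must allow `dynamodb:PutItem` on the table, and the client must target the table's region. A hedged boto3 sketch of the equivalent put (table and attribute names hypothetical):

```python
def build_item(user_id, name):
    """DynamoDB item in the low-level attribute-value format (pure helper)."""
    return {"id": {"S": user_id}, "name": {"S": name}}

def put_user(table_name, user_id, name):
    import boto3  # lazy import keeps build_item testable offline
    # Pin the region explicitly: a client defaulting to the wrong region
    # fails on Lambda even though the same code works from a local test.
    ddb = boto3.client("dynamodb", region_name="us-east-1")
    return ddb.put_item(TableName=table_name,
                        Item=build_item(user_id, name))
```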

Disable AWS Lambda Environment Variables

Submitted by 試著忘記壹切 on 2020-01-04 09:18:09
Question: I'm currently using AWS Lambda to run code that I don't have control over. As such, I want to make sure that the Lambda environment is sandboxed and does not have access to sensitive data. The default environment variables passed to a Lambda function are outlined here. The ones that I'd be worried about a user getting access to are:

AWS_ACCESS_KEY
AWS_ACCESS_KEY_ID
AWS_SECRET_KEY
AWS_SECRET_ACCESS_KEY
AWS_SESSION_TOKEN
AWS_SECURITY_TOKEN

Is it possible to disable these environment variables?
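Lambda always injects the execution role's temporary credentials, so they cannot be switched off in the console; what a handler can do is delete them from its own process environment before any untrusted code runs. A sketch of that scrub step (the variable list is the one from the question):

```python
import os

# Credential variables Lambda injects for the execution role.
SENSITIVE = (
    "AWS_ACCESS_KEY", "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_KEY", "AWS_SECRET_ACCESS_KEY",
    "AWS_SESSION_TOKEN", "AWS_SECURITY_TOKEN",
)

def scrub_credentials(environ=os.environ):
    """Delete the injected credential variables in place; returns the
    names that were actually present and removed."""
    return [name for name in SENSITIVE if environ.pop(name, None) is not None]
```

This only helps if the untrusted code runs in the same process after the scrub and nothing captured the values earlier; the stronger sandbox is simply giving the function an execution role with no permissions the untrusted code could abuse.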

How can I use a Lambda function to call a Glue function (ETL) when a text file is loaded to an S3 bucket

Submitted by 牧云@^-^@ on 2020-01-04 05:55:27
Question: I am trying to set up a Lambda function that activates a Glue function when a .txt file is uploaded to an S3 bucket. I am using Python 3.7. So far I have this:

from __future__ import print_function
import json
import boto3
import urllib

print('Loading function')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # handler
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.quote_plus(event['Records'][0]['s3']['object']['key'].encode('utf8'))
    try:
        # what to …
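The handler above is cut off inside the try block; the missing piece is typically a `glue.start_job_run` call. A sketch of how it could be completed, where the job name "my-etl-job" and the `--source_key` argument are placeholders, not from the question:

```python
def should_trigger(key):
    """Only .txt uploads should start the ETL job (pure helper)."""
    return key.lower().endswith(".txt")

def lambda_handler(event, context):
    import boto3  # lazy import keeps should_trigger testable offline
    key = event["Records"][0]["s3"]["object"]["key"]
    if not should_trigger(key):
        return {"started": False}
    glue = boto3.client("glue")
    # "my-etl-job" is a placeholder: use the Glue job name from your console.
    run = glue.start_job_run(JobName="my-etl-job",
                             Arguments={"--source_key": key})
    return {"started": True, "JobRunId": run["JobRunId"]}
```

Alternatively, the S3 event notification itself can be filtered on the `.txt` suffix so the function is only invoked for matching uploads in the first place.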

Lambda Node.js 4.3 not finishing/executing success callback

Submitted by 风流意气都作罢 on 2020-01-04 05:09:54
Question: Despite running the log statement immediately above it, my call to callback(null) isn't working. I even tried wrapping it in a try/catch block but got nothing. For reference, here's the full function:

var Firebase = require('firebase');
var request = require('request');

//noinspection AnonymousFunctionJS
/**
 * @param event - from Lambda
 * @param context - from Lambda
 * @param callback - from Lambda
 */
exports.handler = function (event, context, callback) {
    var AUTOPILOT_API_KEY = getKey(event …