aws-cli

Invoking aws lambda without output file

Submitted by 穿精又带淫゛_ on 2019-12-07 03:01:17
Question: I'm trying to invoke a Lambda on AWS using the CLI: aws lambda invoke --function-name GetErrorLambda --payload '{"body":"{\"Id\":[\"321\",\"123\"]}"}' \output. I would like to know if there's a way to print the output on the CLI instead of creating a file. Thanks in advance. Answer 1: It's not possible to output directly to the terminal after invoking a Lambda function. This is likely by design, as the output could easily be greater than the buffer size for a window. A simple workaround would be to
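One way to see the payload in the terminal without managing an output file is to invoke the function from Python rather than the CLI. A minimal boto3 sketch, reusing the GetErrorLambda name and payload from the question:

```python
import json
import boto3

# Invoke the function from Python and print the response payload to the
# terminal instead of writing it to a file. Function name and payload are
# taken from the CLI example in the question.
client = boto3.client("lambda")

response = client.invoke(
    FunctionName="GetErrorLambda",
    Payload=json.dumps({"body": json.dumps({"Id": ["321", "123"]})}),
)

# The payload comes back as a streaming body; read and decode it first.
print(response["Payload"].read().decode("utf-8"))
```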

AWS Cloudwatch log stream name not recognised

Submitted by  ̄綄美尐妖づ on 2019-12-06 17:45:34
Question: So, I'm using the automated logging from AWS Lambda. It generates log streams with names that look like this: 2016/05/18/[$LATEST]99577d10a8cb420cb124a90c20d5653a. I can query the available log streams using 'aws logs describe-log-streams' and get some JSON containing these names alongside other metadata. However, if I then try to do this: aws logs get-log-events --log-group-name /aws/lambda/categorise --log-stream-name "2016/05/18/[$LATEST]99577d10a8cb420cb124a90c20d5653a" I get an error A
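The $LATEST token is a likely suspect here: inside double quotes most shells expand it as an (empty) variable, mangling the stream name. Single-quoting the argument, or calling the API from boto3 so no shell is involved, avoids the expansion. A small sketch reusing the log group and stream name from the question:

```python
import boto3

# Calling the API via boto3 means no shell ever sees the "$LATEST" token.
# Log group and stream name are copied from the question.
logs = boto3.client("logs")

response = logs.get_log_events(
    logGroupName="/aws/lambda/categorise",
    logStreamName="2016/05/18/[$LATEST]99577d10a8cb420cb124a90c20d5653a",
)

for event in response["events"]:
    print(event["timestamp"], event["message"])
```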

PyCharm intellisense for boto3

Submitted by 十年热恋 on 2019-12-06 17:17:34
Question: Having problems seeing full IntelliSense (code completion) options in PyCharm, working with Python 3.4 on Windows. The suggestions are partially working: import boto3 s = boto3.Session() (boto3. will bring up a list of methods/params of the boto3 object) ec2 = s.resource('ec2') (resource is a suggested method!) ec2. <<<< this brings up nothing. For some reason PyCharm can't detect what the ec2 object would have. While I can work off documentation alone, IntelliSense is just such a nice feature to have! I've
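boto3 builds its clients and resources dynamically from JSON service definitions at runtime, which leaves static analysis with nothing to inspect. A commonly used workaround, not mentioned in the question, is to install type stubs such as boto3-stubs / mypy-boto3-ec2 and annotate the variable so PyCharm has a concrete type to complete against. A sketch assuming that package is installed and exports EC2ServiceResource (on Python older than 3.6 a `# type:` comment would serve the same purpose as the annotation):

```python
import boto3

# Requires: pip install "boto3-stubs[ec2]"  (installs mypy-boto3-ec2).
# The explicit annotation gives PyCharm a concrete type to complete against.
from mypy_boto3_ec2 import EC2ServiceResource

session = boto3.Session()
ec2: EC2ServiceResource = session.resource("ec2")

# With the stub installed, "ec2." should now suggest instances, images, etc.
for instance in ec2.instances.all():
    print(instance.id)
```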

Shell script - Sorting 'AWS cloudwatch metrics' json array based on the “Timestamp” property value which comes in ISO 8601 UTC format

Submitted by 点点圈 on 2019-12-06 14:09:24
I have Amazon CloudWatch ELB Latency metrics like the following: { "Datapoints": [ { "Timestamp": "2016-10-18T12:11:00Z", "Average": 0.25880099632013942, "Minimum": 0.00071811676025390625, "Maximum": 3.2039437294006352, "Unit": "Seconds" }, { "Timestamp": "2016-10-18T12:10:00Z", "Average": 0.25197337517680762, "Minimum": 0.00063610076904296875, "Maximum": 2.839790821075439, "Unit": "Seconds" }, { "Timestamp": "2016-10-18T12:19:00Z", "Average": 0.2287127116954388, "Minimum": 0.00061678886413574219, "Maximum": 1.416410446166992, "Unit": "Seconds" } ] } I'm running 'awscli' inside a shell script for
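Because the Timestamp values are fixed-width ISO 8601 strings in UTC, they sort chronologically as plain strings. A small Python sketch (rather than the shell the question uses), assuming the CLI output above was saved to a file named datapoints.json:

```python
import json

# Load the `aws cloudwatch get-metric-statistics` output (assumed saved to
# datapoints.json) and sort by the ISO 8601 Timestamp field; fixed-width UTC
# timestamps sort correctly as plain strings.
with open("datapoints.json") as f:
    data = json.load(f)

datapoints = sorted(data["Datapoints"], key=lambda d: d["Timestamp"])

for point in datapoints:
    print(point["Timestamp"], point["Average"])
```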

Amazon EKS: generate/update kubeconfig via python script

Submitted by ε祈祈猫儿з on 2019-12-06 11:54:14
Question: When using Amazon's Kubernetes offering, the EKS service, at some point you need to connect the Kubernetes API and configuration to the infrastructure established within AWS. In particular, we need a kubeconfig with proper credentials and URLs to connect to the k8s control plane provided by EKS. The Amazon command-line tool aws provides a routine for this task: aws eks update-kubeconfig --kubeconfig /path/to/kubecfg.yaml --name <EKS-cluster-name> Question: how to do the same through Python/boto3? When looking at
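boto3 itself has no update-kubeconfig helper, but describe_cluster returns the endpoint and cluster CA that a kubeconfig needs, and the file can then be written out by hand. A rough sketch under those assumptions, with the cluster name and output path as placeholders and authentication delegated to the aws eks get-token exec plugin:

```python
import boto3
import yaml  # pip install pyyaml

CLUSTER_NAME = "my-eks-cluster"          # placeholder
KUBECONFIG_PATH = "/path/to/kubecfg.yaml"  # placeholder

eks = boto3.client("eks")
cluster = eks.describe_cluster(name=CLUSTER_NAME)["cluster"]

# Minimal kubeconfig: endpoint + CA from describe_cluster, plus the
# `aws eks get-token` exec plugin for credentials.
kubeconfig = {
    "apiVersion": "v1",
    "kind": "Config",
    "clusters": [{
        "name": cluster["arn"],
        "cluster": {
            "server": cluster["endpoint"],
            "certificate-authority-data": cluster["certificateAuthority"]["data"],
        },
    }],
    "users": [{
        "name": cluster["arn"],
        "user": {
            "exec": {
                "apiVersion": "client.authentication.k8s.io/v1beta1",
                "command": "aws",
                "args": ["eks", "get-token", "--cluster-name", CLUSTER_NAME],
            },
        },
    }],
    "contexts": [{
        "name": cluster["arn"],
        "context": {"cluster": cluster["arn"], "user": cluster["arn"]},
    }],
    "current-context": cluster["arn"],
}

with open(KUBECONFIG_PATH, "w") as f:
    yaml.safe_dump(kubeconfig, f, default_flow_style=False)
```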

Stream logs to elastic using aws cli

Submitted by ☆樱花仙子☆ on 2019-12-06 09:59:31
I would like to enable the Stream to Amazon Elasticsearch Service option from CloudWatch to Elasticsearch. I'm familiar with how to do that manually; I'm looking for a way to achieve it by running aws cli commands. Assuming Elasticsearch is already configured, is there any way to automate the process? Behind the scenes, Stream to Amazon Elasticsearch Service creates a new Lambda function, and the logs are pushed to that Lambda and then on to ELK. destination arn: The Amazon Resource Name (ARN) of the Kinesis stream, Kinesis Data Firehose stream, or Lambda function you want to use as the destination of the subscription feed.
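The console wizard essentially creates a LogsToElasticsearch Lambda and then attaches a subscription filter to the chosen log group, so once that Lambda exists the same wiring can be scripted. A hedged boto3 sketch; the log group, region, account id, and Lambda ARN below are placeholders, not values from the question:

```python
import boto3

# Placeholders: a log group to stream and the ARN of the
# "LogsToElasticsearch_..." Lambda that the console wizard normally creates.
LOG_GROUP = "/aws/lambda/my-app"
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:LogsToElasticsearch_mydomain"

logs = boto3.client("logs")
lam = boto3.client("lambda")

# Allow CloudWatch Logs to invoke the destination Lambda.
lam.add_permission(
    FunctionName=LAMBDA_ARN,
    StatementId="cloudwatch-logs-to-es",
    Action="lambda:InvokeFunction",
    Principal="logs.amazonaws.com",
    SourceArn=f"arn:aws:logs:us-east-1:123456789012:log-group:{LOG_GROUP}:*",
)

# Subscribe the log group to the Lambda; an empty filter pattern streams everything.
logs.put_subscription_filter(
    logGroupName=LOG_GROUP,
    filterName="stream-to-elasticsearch",
    filterPattern="",
    destinationArn=LAMBDA_ARN,
)
```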

AWS CLI in Windows won't upload file to s3 bucket

Submitted by 末鹿安然 on 2019-12-06 08:08:56
Question: Windows Server 2012 R2 with Python 2.7.10 and the aws cli tool installed. The following works: aws s3 cp c:\a\a.txt s3://path/ and I can upload that file without a problem. What I want to do is upload a file from a mapped drive to an S3 bucket, so I tried this: aws s3 cp s:\path\file s3://path/ and it works. Now what I want to do, and cannot figure out, is how to not specify a single file but let it grab all file(s), so I can schedule this to upload the contents of a directory to my S3 bucket. I tried this: aws s3 cp
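The usual CLI answer is to copy the whole directory with aws s3 cp --recursive or aws s3 sync. If the scheduled job is easier to run from Python, a boto3 sketch along the same lines (the local directory, bucket, and key prefix are placeholders):

```python
import os
import boto3

# Placeholders: the mapped-drive directory to upload and the target bucket/prefix.
LOCAL_DIR = r"S:\path"
BUCKET = "my-bucket"
PREFIX = "path/"

s3 = boto3.client("s3")

# Walk the directory tree and upload every file, preserving relative paths
# (roughly what `aws s3 cp <dir> s3://bucket/prefix/ --recursive` does).
for root, _dirs, files in os.walk(LOCAL_DIR):
    for name in files:
        local_path = os.path.join(root, name)
        rel_path = os.path.relpath(local_path, LOCAL_DIR).replace(os.sep, "/")
        s3.upload_file(local_path, BUCKET, PREFIX + rel_path)
        print("uploaded", local_path)
```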

AWS S3 sync between buckets overwriting newer destination files

Submitted by 情到浓时终转凉″ on 2019-12-06 03:53:45
We have two s3 buckets, and we have a sync cron job that should copy bucket1 changes to bucket2. aws s3 sync s3://bucket1/images/ s3://bucket2/images/ When a new image is added to bucket1, it correctly gets copied over to bucket2. However, if we upload a new version of that image to bucket2, when the sync job next runs it actually copies the older version from bucket1 over to bucket2, replacing the newer version we just put there. This is part of a migration process, and in time the only place images will be uploaded to will be bucket2, but for the time being sometimes they may be uploaded to
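One way to keep the cron job from clobbering newer uploads in bucket2 is to compare LastModified per object and copy only when bucket1's copy is strictly newer, instead of relying on aws s3 sync's size/timestamp heuristics. A boto3 sketch under that assumption, reusing the bucket names and images/ prefix from the question:

```python
import boto3
from botocore.exceptions import ClientError

SRC_BUCKET = "bucket1"
DST_BUCKET = "bucket2"
PREFIX = "images/"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Copy an object from bucket1 to bucket2 only when bucket2 lacks it or
# bucket1's copy is strictly newer, so newer uploads to bucket2 survive.
for page in paginator.paginate(Bucket=SRC_BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        try:
            dst = s3.head_object(Bucket=DST_BUCKET, Key=key)
            if dst["LastModified"] >= obj["LastModified"]:
                continue  # destination copy is as new or newer; keep it
        except ClientError:
            pass  # object not present in bucket2 yet
        s3.copy_object(
            Bucket=DST_BUCKET,
            Key=key,
            CopySource={"Bucket": SRC_BUCKET, "Key": key},
        )
        print("copied", key)
```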

Setting up the path so AWS cli works properly

Submitted by 烈酒焚心 on 2019-12-06 02:21:48
Question: I installed the AWS CLI by using: pip install --upgrade --user awscli Now if I type aws configure in cmd I get: 'aws' is not recognized as an internal or external command... I'm pretty sure the path needs to be set correctly. I know how to go into the environment variables to set the path, but I don't know WHAT to set the path to, because I don't see where awscli is installed. By the way, I already have boto3 installed and I'm able to import it just fine. I should also mention I'm setting
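With pip install --user on Windows, the aws.cmd entry point lands in the per-user Scripts directory rather than anywhere already on PATH, and Python can report where that directory is. A small sketch; the exact location varies with the Python version and install, so treat the printed path as the thing to verify and then add to PATH:

```python
import os
import site
import sysconfig

# The aws executable installed with `pip install --user awscli` lives in the
# per-user "scripts" directory; print it so it can be added to PATH.
print("user site-packages:", site.getusersitepackages())
scripts = sysconfig.get_path("scripts", scheme="nt_user")
print("user scripts dir:  ", scripts)

# Typically something like C:\Users\<name>\AppData\Roaming\Python\Scripts
# (or ...\Python\PythonXY\Scripts); check that it actually contains aws.cmd.
print("aws.cmd present:", os.path.exists(os.path.join(scripts, "aws.cmd")))
```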

aws configure delete access key profile

Submitted by 这一生的挚爱 on 2019-12-06 00:29:23
I seem to be having difficulty deleting the access key profile I created for a test user using aws configure --profile testuser. I have tried deleting the entries in my ~/.aws directory; however, when I run aws configure, I am getting the following error: botocore.exceptions.ProfileNotFound: The config profile (testuser) could not be found. A workaround is adding [profile testuser] back to my ~/.aws/config file, but I don't want to do that. I want to remove all traces of this testuser profile from my machine. The Configuring the AWS Command Line Interface documentation page lists various places where
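A profile created with aws configure --profile testuser is written to both ~/.aws/credentials (as [testuser]) and ~/.aws/config (as [profile testuser]), and the CLI also honours the AWS_PROFILE and AWS_DEFAULT_PROFILE environment variables; a stale value there is a common reason for ProfileNotFound after the files look clean. A sketch that removes the profile from both files and flags any leftover environment variable:

```python
import os
from configparser import ConfigParser  # Python 3; the module is ConfigParser on 2.x

PROFILE = "testuser"

# aws configure --profile writes to both of these files; the section name
# differs between them ([testuser] vs [profile testuser]).
targets = {
    os.path.expanduser("~/.aws/credentials"): PROFILE,
    os.path.expanduser("~/.aws/config"): "profile " + PROFILE,
}

for path, section in targets.items():
    if not os.path.exists(path):
        continue
    parser = ConfigParser()
    parser.read(path)
    if parser.remove_section(section):
        with open(path, "w") as f:
            parser.write(f)
        print("removed [%s] from %s" % (section, path))

# Also check for environment variables still pointing at the deleted profile.
for var in ("AWS_PROFILE", "AWS_DEFAULT_PROFILE"):
    if os.environ.get(var) == PROFILE:
        print("%s is still set to %s; unset it" % (var, PROFILE))
```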