aws-cli

Is there a way to export an AWS CLI Profile to Environment Variables?

 ̄綄美尐妖づ submitted on 2019-11-29 17:18:30

Question: When working with certain third-party tools like Terraform, it's not easy to specify an AWS CLI profile, and I prefer working with environment variables over profiles. Is there a way for the AWS CLI to simply export the current profile as the AWS_ACCESS_KEY_ID and AWS_SECRET_KEY environment variables for my session?

Answer 1: You can use the following commands to set your environment variables: aws configure get default.aws_access_key_id aws configure get default.aws
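The truncated answer above can be completed as a sketch: aws configure get reads a single stored value back out of a profile, so the keys can be exported in one step (the profile name "default" is an example; any named profile works):

```shell
# Sketch: export the stored "default" profile as environment variables.
# Replace "default" with any profile name from ~/.aws/credentials.
export AWS_ACCESS_KEY_ID=$(aws configure get default.aws_access_key_id)
export AWS_SECRET_ACCESS_KEY=$(aws configure get default.aws_secret_access_key)
```

Note that the standard variable name is AWS_SECRET_ACCESS_KEY, not AWS_SECRET_KEY as written in the question, although some older tools accept both.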

How to use awscli inside python script?

被刻印的时光 ゝ submitted on 2019-11-29 16:23:03

Question: I'm using the AWS EC2 service with awscli. Now I want to put all the commands I type in the console into a Python script. I see that if I write import awscli inside a Python script it works fine, but I don't understand how to use it inside the script. For instance, how do I execute the command aws ec2 run-instances <arguments> inside the Python script after import awscli? Just to make it clear, I'm not looking for a solution like os.system('aws ec2 run-instances <arguments>'); I'm looking for

Difference between s3cmd, boto and AWS CLI

风流意气都作罢 submitted on 2019-11-29 15:53:51

Question: I am thinking about redeploying my static website to Amazon S3. I need to automate the deployment, so I was looking for an API for such tasks. I'm a bit confused by the different options. Question: What is the difference between s3cmd, the Python library boto, and the AWS CLI?

Answer 1: s3cmd and the AWS CLI are both command-line tools. They're well suited if you want to script your deployment through shell scripting (e.g. bash). The AWS CLI gives you simple file-copying abilities through the "s3" command,
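For the static-site use case in the question, the AWS CLI's high-level "s3" command is usually enough on its own; a minimal deployment sketch (the local directory and bucket name below are placeholders):

```shell
# Sketch: mirror a local build directory to an S3 bucket.
# "./public" and "my-static-site-bucket" are placeholder names.
# "sync" uploads only files that are new or changed since the last run;
# "--delete" also removes remote objects that no longer exist locally.
aws s3 sync ./public s3://my-static-site-bucket --delete
```

boto, by contrast, is a Python library: you would call it from a Python script rather than from a shell script.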

Elastic Beanstalk “git aws.push” only committed difference?

核能气质少年 submitted on 2019-11-29 11:22:43

We are storing our PHP project on GitHub. For fast deployment we use a .bat file to git-push changes to the AWS Elastic Beanstalk cloud: "C:\Program Files (x86)\Git\bin\sh.exe" --login -i -c "git aws.push --environment envname" We make a commit every time before pushing, and it works perfectly, as expected. Unfortunately, for some reason, sometimes the push is really quick (pushing just the difference in PHP code changes), but sometimes it sends the whole 300 MB project (with all media). Is there any way to git-push only the changed difference? Maybe there are additional parameters on

AWS create role - Has prohibited field

自古美人都是妖i submitted on 2019-11-29 11:07:08

Question: I am trying out a simple example suggested by the AWS documentation to create a role using a policy JSON file: http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html I get the error: A client error (MalformedPolicyDocument) occurred when calling the CreateRole operation: Has prohibited field Resource. Here's the command: aws iam create-role --role-name test-service-role --assume-role-policy-document file:///home/ec2-user/policy.json A client error
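The "Has prohibited field Resource" error usually means the file passed to --assume-role-policy-document is a permissions policy rather than a trust policy: the assume-role document may only say who is allowed to assume the role, so a Resource element is rejected. A sketch of a valid trust policy (the EC2 service principal is chosen here as an example):

```shell
# Sketch: write a trust policy (no Resource field allowed) and create the role.
cat > /tmp/trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

aws iam create-role --role-name test-service-role \
    --assume-role-policy-document file:///tmp/trust-policy.json
# Permissions (the statements WITH Resource fields) are attached afterwards,
# e.g. with "aws iam put-role-policy" or "aws iam attach-role-policy".
```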

Downloading the latest file in an S3 bucket using AWS CLI?

浪尽此生 submitted on 2019-11-29 06:20:39

Question: I have an S3 bucket that contains database backups. I am creating a script that should download the latest backup (and eventually restore it somewhere else), but I'm not sure how to grab only the most recent file from the bucket. Is it possible to copy only the most recent file from an S3 bucket to a local directory using the AWS CLI tools?

Answer 1: This is an approach you can take. You can list all the objects in the bucket with aws s3 ls $BUCKET --recursive: $ aws s3 ls $BUCKET -
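The truncated answer can be sketched out: because aws s3 ls --recursive prints each line starting with the timestamp, a plain lexical sort orders the listing chronologically, and the last line is the newest object (the bucket name is a placeholder):

```shell
# Sketch: copy the newest object in a bucket to the current directory.
BUCKET=my-backup-bucket   # placeholder bucket name

# Each listing line looks like: "2019-11-29 06:20:39   1024 path/to/file".
# Sorting lexically sorts by date+time; tail keeps the newest entry and awk
# extracts the key (keys containing spaces would need more careful parsing).
KEY=$(aws s3 ls "s3://$BUCKET" --recursive | sort | tail -n 1 | awk '{print $4}')

aws s3 cp "s3://$BUCKET/$KEY" .
```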

Filter S3 list-objects results to find a key matching a pattern

不打扰是莪最后的温柔 submitted on 2019-11-29 06:17:25

Question: I would like to use the AWS CLI to query the contents of a bucket and see if a particular file exists, but the bucket contains thousands of files. How can I filter the results to show only key names that match a pattern? For example: aws s3api list-objects --bucket myBucketName --query "Contents[?Key==*mySearchPattern*]"

Answer 1: The --query argument uses JMESPath expressions. JMESPath has a built-in function, contains, that allows you to search for a string pattern. This should give the desired
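Completing the cut-off answer: JMESPath's built-in contains() function does the substring match, so the query from the question can be rewritten like this (bucket name and pattern carried over as placeholders):

```shell
# Sketch: list only the keys whose name contains the search pattern.
aws s3api list-objects --bucket myBucketName \
    --query "Contents[?contains(Key, 'mySearchPattern')].Key"
```

Keep in mind the filtering happens client-side: the CLI still lists every object over the API and applies the JMESPath expression to the result.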

passing access and secret key aws cli

别来无恙 submitted on 2019-11-29 05:01:26

Question: I am trying to pass the access and secret key along with the aws cli, e.g. aws ec2 describe-instances --aws-access-key <access_key> --aws-secret-key <secret_key> I also tried the -o and -w options for the access and secret key respectively. It says: Unknown option aws-access-key and aws-secret-key

Answer 1: You can provide keys on the command line via environment variables: AWS_ACCESS_KEY_ID=ABCD AWS_SECRET_ACCESS_KEY=EF1234 aws ec2 describe-instances See http://docs.aws.amazon.com/cli/latest/topic/config-vars.html
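Expanding the answer into a sketch: besides prefixing a single command with the environment variables, the keys can be stored under a named profile non-interactively with aws configure set and then selected with --profile (the key values below are fakes):

```shell
# Sketch, option 1: one-off env vars for a single command (fake keys).
AWS_ACCESS_KEY_ID=AKIAEXAMPLE AWS_SECRET_ACCESS_KEY=wJalrEXAMPLEKEY \
    aws ec2 describe-instances

# Sketch, option 2: persist the keys under a named profile, then select it.
aws configure set aws_access_key_id AKIAEXAMPLE --profile scratch
aws configure set aws_secret_access_key wJalrEXAMPLEKEY --profile scratch
aws ec2 describe-instances --profile scratch
```

There is no --aws-access-key flag on the CLI; env vars, profiles, and the credentials file are the supported mechanisms.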

How to use AWS S3 CLI to dump files to stdout in BASH?

橙三吉。 submitted on 2019-11-28 21:48:14

Question: I'm writing a bash script that will take a path in S3 (as specified to the ls command) and dump the contents of all of the file objects to stdout. Essentially I'd like to replicate cat /path/to/files/* except for S3, e.g. s3cat '/bucket/path/to/files/*'. Looking at the options, my first inclination is to use the cp command to copy to a temporary file and then cat that. Has anyone tried this or something similar, or is there already a command I'm not finding which does it?

Answer 1: dump the contents of all of the
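The truncated answer can be completed as a sketch: aws s3 cp treats "-" as stdout, so no temporary file is needed; list the keys under the prefix and stream each one (bucket and prefix are placeholders):

```shell
# Sketch: concatenate every object under an S3 prefix to stdout,
# like "cat /path/to/files/*" but for S3.
BUCKET=my-bucket         # placeholder bucket name
PREFIX=path/to/files/    # placeholder prefix

# "aws s3 ls" prints "date time size name"; awk keeps the object name.
aws s3 ls "s3://$BUCKET/$PREFIX" | awk '{print $4}' | while read -r key; do
    aws s3 cp "s3://$BUCKET/$PREFIX$key" -   # "-" streams the object to stdout
done
```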

The AWS Access Key Id does not exist in our records

走远了吗. submitted on 2019-11-28 21:05:37

I created a new access key and configured it in the AWS CLI with aws configure. It created the .ini file in ~/.aws/config. When I run aws s3 ls it gives: A client error (InvalidAccessKeyId) occurred when calling the ListBuckets operation: The AWS Access Key Id you provided does not exist in our records. The AmazonS3FullAccess policy is also attached to the user. How do I fix this? It might be that you have old keys exported via environment variables (bash_profile), and since environment variables take precedence over the credentials file, you get the error "the access key id does not
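A sketch of the diagnosis and fix: because environment variables win over ~/.aws/credentials, stale exported keys keep being sent even after running aws configure; check for them, then unset them before retrying:

```shell
# Sketch: look for stale credentials in the environment, then clear them so
# the AWS CLI falls back to the files written by "aws configure".
env | grep AWS_ || true
unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN
aws s3 ls
```

Also remove the stale export lines from ~/.bash_profile (or wherever they were set), or they will come back in the next shell session.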