aws-cli

New IAM admin user sees “You are not authorized to perform this operation”

Submitted by 五迷三道 on 2019-11-30 15:01:43
I am trying to get started with the AWS CLI on OS X. I installed the CLI via pip. I created a new user in IAM and attached the pre-built AdministratorAccess AWS-managed policy. Next, I copied the generated Access Key ID and Secret Access Key. The user I created is not in any groups. Their policy looks like this: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "*", "Resource": "*" } ] } Next, I ran aws configure from the command line and entered the access key and secret key that I copied, plus a region code of eu-west-1 (which seems unlikely to
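A quick sanity check before digging into permissions (a minimal sketch against the default profile configured above): confirm which identity and region the CLI actually resolved, since a mis-pasted key or a stale key pair produces exactly this error.

# Show the account and user ARN the CLI is authenticating as; it should be the new IAM user
aws sts get-caller-identity

# Show where each configuration value (key, region, output format) was picked up from
aws configure list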

AWS CLI create RDS with elasticbeanstalk create-environment

Submitted by 守給你的承諾、 on 2019-11-30 13:54:20
How can I create an RDS instance with the create-environment or another subcommand of aws elasticbeanstalk? I've tried several combinations of parameters to no avail. Below is an example.
APP_NAME="randall-railsapp"
aws s3api create-bucket --bucket "$APP_NAME"
APP_VERSION="$(git describe --always)"
APP_FILE="deploy-$APP_NAME-$APP_VERSION.zip"
git archive -o "$APP_FILE" HEAD
aws s3 cp "$APP_FILE" "s3://$APP_NAME/$APP_FILE"
aws --region us-east-1 elasticbeanstalk create-application-version \
    --auto-create-application \
    --application-name "$APP_NAME" \
    --version-label "$APP_VERSION" \
    --source
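A sketch of one way to do this (not from the original post): create-environment can carry RDS settings through the aws:rds:dbinstance option namespace passed via --option-settings. The environment name, solution stack, engine, instance class, and credentials below are placeholder values.

# Placeholder stack name; list real ones with: aws elasticbeanstalk list-available-solution-stacks
aws --region us-east-1 elasticbeanstalk create-environment \
    --application-name "$APP_NAME" \
    --environment-name "$APP_NAME-env" \
    --version-label "$APP_VERSION" \
    --solution-stack-name "64bit Amazon Linux 2 v3.x.x running Ruby 3.0" \
    --option-settings \
        Namespace=aws:rds:dbinstance,OptionName=DBEngine,Value=postgres \
        Namespace=aws:rds:dbinstance,OptionName=DBInstanceClass,Value=db.t3.micro \
        Namespace=aws:rds:dbinstance,OptionName=DBAllocatedStorage,Value=5 \
        Namespace=aws:rds:dbinstance,OptionName=DBUser,Value=deploy \
        Namespace=aws:rds:dbinstance,OptionName=DBPassword,Value=change-me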

Difference between s3cmd, boto and AWS CLI

Submitted by 无人久伴 on 2019-11-30 10:45:35
I am thinking about redeploying my static website to Amazon S3. I need to automate the deployment, so I was looking for an API for such tasks. I'm a bit confused over the different options. Question: What is the difference between s3cmd, the Python library boto, and the AWS CLI? Answer (mfisherca): s3cmd and the AWS CLI are both command-line tools. They're well suited if you want to script your deployment through shell scripting (e.g. bash). The AWS CLI gives you simple file-copying abilities through the "s3" command, which should be enough to deploy a static website to an S3 bucket. It also has some small
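For the static-site case in the question, the usual AWS CLI approach is aws s3 sync, which uploads only changed files. A minimal sketch (the bucket name and local directory are placeholders):

# Mirror the local build directory into the bucket, deleting remote files that no longer exist locally
aws s3 sync ./public s3://my-static-site-bucket --delete

# Optionally enable static website hosting on the bucket
aws s3 website s3://my-static-site-bucket --index-document index.html --error-document error.html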

Downloading the latest file in an S3 bucket using AWS CLI?

Submitted by ⅰ亾dé卋堺 on 2019-11-30 08:04:01
I have an S3 bucket that contains database backups. I am creating a script that I would like to use to download the latest backup (and eventually restore it somewhere else), but I'm not sure how to go about grabbing only the most recent file from a bucket. Is it possible to copy only the most recent file from an S3 bucket to a local directory using the AWS CLI tools? This is an approach you can take. You can list all the objects in the bucket with aws s3 ls $BUCKET --recursive:
$ aws s3 ls $BUCKET --recursive
2015-05-05 15:36:17          4 an_object.txt
2015-06-08 14:14:44   16322599 some/other/object
2015-04-29 12
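Building on that listing, one sketch of the "grab the newest key" step (it assumes the ISO-style timestamps sort chronologically, which they do, and that keys contain no newlines):

# Sort by the leading date/time columns, keep the last (newest) line, and strip everything but the key
latest_key=$(aws s3 ls "s3://$BUCKET" --recursive | sort | tail -n 1 | awk '{for (i = 4; i <= NF; i++) printf "%s%s", $i, (i < NF ? " " : "")}')

# Copy that single object to the current directory
aws s3 cp "s3://$BUCKET/$latest_key" .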

AWS create role - Has prohibited field

Submitted by a 夏天 on 2019-11-30 07:51:18
I am trying out a simple example suggested by the AWS documentation to create a role using a policy JSON file (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html), and I get the error "A client error (MalformedPolicyDocument) occurred when calling the CreateRole operation: Has prohibited field Resource". Here's the command:
>> aws iam create-role --role-name test-service-role --assume-role-policy-document file:///home/ec2-user/policy.json
A client error (MalformedPolicyDocument) occurred when calling the CreateRole operation: Has prohibited field Resource
The policy is the
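The usual cause is passing a permissions policy (which contains a Resource field) where a trust policy is expected; --assume-role-policy-document only accepts a trust policy, and the permissions are attached in a second step. A sketch, using EC2 as an example service principal and a file name (trust-policy.json) of my own choosing:

cat > /home/ec2-user/trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role with the trust policy (no Resource field here)
aws iam create-role --role-name test-service-role \
    --assume-role-policy-document file:///home/ec2-user/trust-policy.json

# Attach the permissions policy (the document that does contain Resource) separately
aws iam put-role-policy --role-name test-service-role \
    --policy-name test-service-permissions \
    --policy-document file:///home/ec2-user/policy.json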

How to simplify aws DynamoDB query JSON output from the command line?

Submitted by 落花浮王杯 on 2019-11-30 07:14:13
I'm working with the AWS Command Line Interface for DynamoDB. When we query an item, we get a very detailed JSON output. You get something like this (built from the get-item example in the AWS CLI help so as to be almost exhaustive; the NULL type has been omitted): { "Count": 1, "Items": [ { "Id": { "S": "app1" }, "Parameters": { "M": { "nfs": { "M": { "IP" : { "S" : "172.16.0.178" }, "defaultPath": { "S": "/mnt/ebs/" }, "key": { "B": "dGhpcyB0ZXh0IGlzIGJhc2U2NC1lbmNvZGVk" }, "activated": { "BOOL": true } } }, "ws" : { "M" : { "number" : { "N" : "5" }, "values" : { "L" : [ { "S" :
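One way to cut the attribute-type wrappers down is the CLI's own --query option (client-side JMESPath, not the DynamoDB query API) combined with --output text. A sketch against a hypothetical table named MyTable whose item matches the excerpt:

# Returns just "172.16.0.178" instead of the nested {"M": {... {"S": ...}}} structure
aws dynamodb get-item \
    --table-name MyTable \
    --key '{"Id": {"S": "app1"}}' \
    --query 'Item.Parameters.M.nfs.M.IP.S' \
    --output text

For wholesale reshaping of the output, piping the JSON through a tool such as jq is the other common route.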

Filter S3 list-objects results to find a key matching a pattern

Submitted by 流过昼夜 on 2019-11-30 06:43:21
I would like to use the AWS CLI to query the contents of a bucket and see if a particular file exists, but the bucket contains thousands of files. How can I filter the results to show only key names that match a pattern? For example: aws s3api list-objects --bucket myBucketName --query "Contents[?Key==*mySearchPattern*]" The --query argument uses JMESPath expressions. JMESPath has a built-in function, contains, that allows you to search for a string pattern. This should give the desired results: aws s3api list-objects --bucket myBucketName --query "Contents[?contains(Key, `mySearchPattern`)]"
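As a small follow-on, projecting just the key names and switching to text output makes the result easy to test from a script; single quotes keep the shell from treating the JMESPath backticks as command substitution:

aws s3api list-objects --bucket myBucketName \
    --query 'Contents[?contains(Key, `mySearchPattern`)].Key' --output text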

passing access and secret key aws cli

Submitted by 十年热恋 on 2019-11-30 06:00:49
I am trying to pass the access and secret key directly on the aws CLI command line, e.g. aws ec2 describe-instances --aws-access-key <access_key> --aws-secret-key <secret_key>. I also tried the -o and -w options for the access and secret key respectively. It says: Unknown option aws-access-key and aws-secret-key. You can provide keys on the command line via environment variables: AWS_ACCESS_KEY_ID=ABCD AWS_SECRET_ACCESS_KEY=EF1234 aws ec2 describe-instances See http://docs.aws.amazon.com/cli/latest/topic/config-vars.html#credentials EDIT: @wisbucky noted this could leave secrets in your command history. One way around this in bash at
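One common way to keep the secret out of shell history (a sketch; the key values are the placeholders from above): tell bash to drop commands that start with a space, or store the keys once in a named profile.

# bash skips history entries that begin with a space when HISTCONTROL includes "ignorespace"
export HISTCONTROL=ignoreboth
 AWS_ACCESS_KEY_ID=ABCD AWS_SECRET_ACCESS_KEY=EF1234 aws ec2 describe-instances   # note the leading space

# Alternatively, configure a named profile once (interactive prompt) and reference it per call
aws configure --profile scratch
aws ec2 describe-instances --profile scratch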

How to use AWS S3 CLI to dump files to stdout in BASH?

Submitted by 蓝咒 on 2019-11-30 01:06:57
I'm starting a bash script which will take a path in S3 (as specified to the ls command) and dump the contents of all of the file objects to stdout. Essentially I'd like to replicate cat /path/to/files/* except for S3, e.g. s3cat '/bucket/path/to/files/*'. My first inclination, looking at the options, is to use the cp command to a temporary file and then cat that. Has anyone tried this or similar, or is there already a command I'm not finding which does it? "dump the contents of all of the file objects to stdout." You can accomplish this by passing - as the destination of the aws s3 cp command. For
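A sketch of that idea as a loop, streaming every object under a prefix straight to stdout (the bucket and prefix are placeholders; the awk step assumes key names without spaces):

#!/usr/bin/env bash
bucket="my-bucket"
prefix="path/to/files/"

# List the objects under the prefix, keep only real object lines (4 columns: date, time, size, name),
# and cat each one to stdout by using "-" as the cp destination
aws s3 ls "s3://$bucket/$prefix" | awk 'NF >= 4 {print $4}' | while read -r key; do
    aws s3 cp "s3://$bucket/$prefix$key" -
done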