aws-cli

What is the correct syntax for filtering by tag in describe-vpcs?

Submitted by 梦想与她 on 2019-12-04 11:10:59
Question: I am trying to understand an aws ec2 CLI call. I am looking to describe all VPCs and then filter on a custom tag (vpcname=myvpc), however after trying multiple combinations I keep getting conflicting errors about the format and use of --filters. Using [http://docs.aws.amazon.com/cli/latest/reference/ec2/describe-vpcs.html][1] as a reference, I ran: aws --profile myProfile --region eu-west-1 ec2 describe-vpcs --filters vpcname,myvpc however this returns Error parsing parameter '--filters': should be: Key
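The --filters option expects Name/Values pairs, and tags are addressed through the tag:<key> filter name. A minimal sketch of the corrected call, reusing the profile and region from the question:

# Filter VPCs on the custom tag "vpcname" having the value "myvpc"
aws --profile myProfile --region eu-west-1 ec2 describe-vpcs \
    --filters Name=tag:vpcname,Values=myvpc

# Or match on the presence of the tag key alone, regardless of its value
aws --profile myProfile --region eu-west-1 ec2 describe-vpcs \
    --filters Name=tag-key,Values=vpcname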

Fastest way to sync two Amazon S3 buckets

Submitted by 耗尽温柔 on 2019-12-04 10:16:50
Question: I have an S3 bucket with around 4 million files taking up some 500 GB in total. I need to sync the files to a new bucket (actually, changing the name of the bucket would suffice, but as that is not possible I need to create a new bucket, move the files there, and remove the old one). I'm using AWS CLI's s3 sync command and it does the job, but it takes a lot of time. I would like to reduce the time so that the dependent system's downtime is minimal. I was trying to run the sync both from my local
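One commonly suggested way to cut the sync time, sketched below: raise the CLI's S3 concurrency settings and run the copy from an EC2 instance in the same region as the buckets, so the data never leaves AWS. The numbers and bucket names are illustrative, not values from the question.

# Raise S3 transfer concurrency for the default profile (illustrative values)
aws configure set default.s3.max_concurrent_requests 100
aws configure set default.s3.max_queue_size 10000

# Then run the bucket-to-bucket sync from an instance in the same region
aws s3 sync s3://old-bucket s3://new-bucket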

Setting up the path so AWS cli works properly

Submitted by 馋奶兔 on 2019-12-04 08:36:50
I installed the AWS CLI by using: pip install --upgrade --user awscli Now if I type aws configure in the cmd I get: 'aws' is not recognized as an internal or external command... I'm pretty sure the path needs to be set correctly. I know how to go into the environment variables to set the path, but I don't know WHAT to set the path to, because I don't see where awscli is installed. By the way, I already have boto3 installed and I'm able to import that just fine. I should also mention I'm setting this up on Windows. Hi, I just had the same problem, and I managed to solve this! I'm using Python 3.7.0
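For a --user install on Windows, the aws executable usually ends up in the per-user Python "Scripts" folder, which sits next to the user site-packages directory. A hedged way to locate it and add it to PATH for the current session (the exact path depends on your Python version and user name):

:: Show the per-user site-packages directory; the executables live in the
:: sibling "Scripts" folder, e.g. ...\Python\Python37\Scripts
python -m site --user-site

:: Add that Scripts folder to PATH for this session, then test
set PATH=%PATH%;C:\Users\<you>\AppData\Roaming\Python\Python37\Scripts
aws --version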

AWS CLI upload failed: unknown encoding: idna

Submitted by 允我心安 on 2019-12-04 03:29:46
I am trying to push some files up to S3 with the AWS CLI and I am running into an error: upload failed: ... An HTTP Client raised and unhandled exception: unknown encoding: idna I believe this is a Python-specific problem but I am not sure how to enable this type of encoding for my Python interpreter. I just freshly installed Python 3.6 and have verified that it is being used by PowerShell and cmd. $> python --version Python 3.6.7 If this isn't a Python-specific problem, it might be helpful to know that I also just freshly installed the AWS CLI and have it properly configured. Let me know if
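One way to narrow this down, assuming the CLI is running under the same Python 3.6 interpreter that is on PATH: check whether that interpreter can load the idna codec at all, since the codec ships with the standard library.

# Can the interpreter the CLI uses load the idna codec?
python -c "import encodings.idna; print('idna codec OK')"

# If that import fails, the Python installation itself is missing pieces;
# reinstalling Python (and then the CLI) is the usual remedy
pip install --upgrade --force-reinstall awscli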

How can I create an AWS Lambda function using the AWS CLI?

Submitted by 懵懂的女人 on 2019-12-04 03:22:46
I am trying to create an AWS Lambda function using the command aws lambda create-function \ --function-name foo \ --runtime nodejs \ --role lambda_basic_execution \ --handler asdf --zip-file "fileb:://boom.zip" I have a file called boom.zip available in the directory, but I cannot deploy it using the above command. The failure message I get is: --zip-file must be a file with the fileb:// prefix. Does anyone have a working example of creating a Lambda function using the AWS CLI? You have an extra colon ':' in the file spec. $ aws lambda create-function --function-name foo --runtime nodejs --role lambda
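With the extra colon removed the zip-file argument parses; note that the --role value generally has to be the full IAM role ARN rather than a bare role name. A sketch with a placeholder account ID:

# fileb:// prefix with no extra colon; --role shown as a full ARN (account ID is a placeholder)
aws lambda create-function \
    --function-name foo \
    --runtime nodejs \
    --role arn:aws:iam::123456789012:role/lambda_basic_execution \
    --handler asdf \
    --zip-file fileb://boom.zip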

How can I select all elastic IPs that are not assigned to an EC2 instance?

Submitted by ⅰ亾dé卋堺 on 2019-12-04 03:17:34
I'm trying to get all Elastic IPs that are not currently assigned to instances. It's easy to get all of the Elastic IPs using this: aws ec2 describe-addresses From here, it would be easy to filter out any results that do not have an "AssociationId". However, I'm not sure how to do that using --query. I know that the --query option uses JMESPath to filter results, but I have no idea how to tell it to return all results that do not have an AssociationId. Any help? Thanks. You can check the Addresses collection for null values, but instead of AssociationId a better general solution may be
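Building on that hint, a JMESPath filter that compares against a null literal does the trick; both variants below are sketches of the idea rather than quotes from the answer:

# Addresses with no AssociationId at all (not associated with anything)
aws ec2 describe-addresses --query 'Addresses[?AssociationId==`null`]'

# Or key off the missing InstanceId and return just the public IPs
aws ec2 describe-addresses --query 'Addresses[?InstanceId==`null`].PublicIp'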

Throttling S3 commands with aws cli

Submitted by 偶尔善良 on 2019-12-04 00:14:49
I'm running a backup script using the AWS CLI to perform an S3 sync command every night on my MediaTemple server. This has run without fail for months, but I updated my Plesk installation and now every night, when the backup script runs, MediaTemple disables my server due to excessive usage. The limits I seem to be crossing are as follows:
RESOURCE INFO:
Packets per second limit: 35000
Packets per second detected: 42229.11667000000306870788
Bytes per second limit: 50000000
Bytes per second detected: 61801446.10000000149011611938
They also include a networking snapshot at the time they take the
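A hedged way to keep the nightly sync under those limits is to throttle the CLI itself through its S3 configuration; max_bandwidth and max_concurrent_requests are real settings, but the values below are only illustrative and would need tuning against the limits quoted above:

# Cap transfer rate and parallelism for the default profile
aws configure set default.s3.max_bandwidth 30MB/s
aws configure set default.s3.max_concurrent_requests 4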

AWS ECR GetAuthorizationToken

Submitted by 邮差的信 on 2019-12-03 22:11:06
I've tried to follow the AWS instructions on setting up ECR authorization for my user by attaching the AmazonEC2ContainerRegistryFullAccess policy to my user. However, when I try to run aws ecr get-login on my PC, I get an error that I don't have permission: An error occurred (AccessDeniedException) when calling the GetAuthorizationToken operation: User: arn:aws:iam::ACCOUNT_NUMBER:user/MY_USER is not authorized to perform: ecr:GetAuthorizationToken on resource: * What have I done wrong? You must attach a policy to your IAM role. I attached AmazonEC2ContainerRegistryFullAccess and it worked. I've
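If the permissions are meant to sit on the user itself (as in the error message), the managed policy can be attached directly to that user; this sketch assumes some other principal with IAM rights runs the command, and MY_USER is the placeholder from the error:

# Attach the managed ECR policy to the user named in the AccessDenied error
aws iam attach-user-policy \
    --user-name MY_USER \
    --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryFullAccess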

AWS CLI create RDS with elasticbeanstalk create-environment

Submitted by ☆樱花仙子☆ on 2019-12-03 20:02:30
Question: How can I create an RDS instance with create-environment or another subcommand of aws elasticbeanstalk? I've tried several combinations of parameters to no avail. Below is an example. APP_NAME="randall-railsapp" aws s3api create-bucket --bucket "$APP_NAME" APP_VERSION="$(git describe --always)" APP_FILE="deploy-$APP_NAME-$APP_VERSION.zip" git archive -o "$APP_FILE" HEAD aws s3 cp "$APP_FILE" "s3://$APP_NAME/$APP_FILE" aws --region us-east-1 elasticbeanstalk create-application-version \ -
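One route is to have Elastic Beanstalk provision the RDS instance via option settings in the aws:rds:dbinstance namespace when creating the environment. The sketch below reuses the variables from the question; the solution stack name, DB engine, and credentials are placeholders, not values from the post:

# Ask Elastic Beanstalk to create an RDS instance alongside the environment
aws --region us-east-1 elasticbeanstalk create-environment \
    --application-name "$APP_NAME" \
    --environment-name "$APP_NAME-env" \
    --version-label "$APP_VERSION" \
    --solution-stack-name "<an available solution stack>" \
    --option-settings \
        Namespace=aws:rds:dbinstance,OptionName=DBEngine,Value=postgres \
        Namespace=aws:rds:dbinstance,OptionName=DBUser,Value=myuser \
        Namespace=aws:rds:dbinstance,OptionName=DBPassword,Value=mypassword \
        Namespace=aws:rds:dbinstance,OptionName=DBAllocatedStorage,Value=5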

How to use AWS CLI to only copy files in S3 bucket that match a given string pattern

Submitted by 自古美人都是妖i on 2019-12-03 17:31:35
Question: I'm using the AWS CLI to copy files from an S3 bucket to my R machine using a command like the one below: system( "aws s3 cp s3://my_bucket_location/ ~/my_r_location/ --recursive --exclude '*' --include '*trans*' --region us-east-1" ) This works as expected, i.e. it copies all files in my_bucket_location that have "trans" in the filename at that location. The problem I am facing is that I have other files with similar naming conventions that I don't want to import in this step. As an example, in
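Since later filters take precedence, the usual approach is to keep --exclude '*' first and then make the --include pattern as specific as the unwanted files require; the pattern below is purely illustrative:

# Exclude everything, then include only the narrower pattern actually wanted
aws s3 cp s3://my_bucket_location/ ~/my_r_location/ --recursive \
    --exclude '*' --include 'trans_2019_*.csv' --region us-east-1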