aws-cli

How to add an IAM role to an existing instance in AWS?

只谈情不闲聊 submitted on 2019-12-09 14:50:08
Question: I would like to add an IAM Role to an existing EC2 instance in AWS. I tried using the AWS CLI, but could not find a way to do that. Answer 1: As of AWS CLI v1.11.46, which was released just yesterday (see the CHANGELOG file on GitHub), you can now attach an IAM role to an existing EC2 instance that was originally launched without an IAM role, using the associate-iam-instance-profile command. You can also replace the currently attached IAM role for a running instance using replace-iam-instance
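The truncated answer refers to the associate-iam-instance-profile subcommand. A minimal sketch of how it is typically invoked; the role, profile, and instance identifiers below are placeholders:

```
# Create an instance profile for the role if one does not exist yet,
# then attach it to the running instance (names and IDs are illustrative).
aws iam create-instance-profile --instance-profile-name MyProfile
aws iam add-role-to-instance-profile --instance-profile-name MyProfile --role-name MyRole
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=MyProfile
```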

awscli fails to work: No module named 'awscli'

偶尔善良 submitted on 2019-12-09 14:18:25
Question: I am trying to install awscli using pip3 on Linux Mint 17.2 Rafaela. I am getting the error:
Traceback (most recent call last):
  File "/home/jonathan/.local/bin/aws", line 19, in <module>
    import awscli.clidriver
ImportError: No module named 'awscli'
These are the steps I am taking, following the AWS installation guide: sudo pip install awscli --upgrade --user. Everything seems to install fine. Then I add to my .bashrc: export PATH=~/.local/bin:$PATH, then source ~/.bashrc, and then I try the command aws -
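The traceback shows the aws launcher under ~/.local/bin importing awscli with a Python interpreter that cannot see the package, the usual symptom of installing with one interpreter (pip) and running with another (python3). A minimal sketch of one way to diagnose and reinstall consistently; the paths are illustrative:

```
# Check which interpreter the launcher uses and which pip owns the package
head -n 1 ~/.local/bin/aws            # shebang line of the launcher
pip3 show awscli || pip show awscli   # which interpreter family has awscli installed

# Reinstall with the same interpreter the launcher expects
pip3 install --user --upgrade awscli
export PATH=~/.local/bin:$PATH
aws --version
```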

How to use AWS CLI with Elastic Beanstalk?

扶醉桌前 submitted on 2019-12-09 05:24:57
Question: The documentation states that the EB CLI is replaced by the AWS CLI, but all of the documentation still talks about the EB CLI. I have created an application in the Elastic Beanstalk console and now I'm ready to start developing. I have all the tools installed on Ubuntu and I've already tested it locally. Now I want to deploy it to Elastic Beanstalk. How do I do this with the AWS CLI? Answer 1: You have to create a source bundle from your application, see details here: http://docs.aws.amazon.com
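For reference, a hedged sketch of a source-bundle deployment driven purely by the AWS CLI; the bucket, application, environment, and version names are placeholders:

```
# Package the application, upload the bundle, register it as an application
# version, then point the environment at that version.
zip -r app-v1.zip . -x '*.git*'
aws s3 cp app-v1.zip s3://my-deploy-bucket/app-v1.zip
aws elasticbeanstalk create-application-version \
    --application-name my-app \
    --version-label v1 \
    --source-bundle S3Bucket=my-deploy-bucket,S3Key=app-v1.zip
aws elasticbeanstalk update-environment \
    --environment-name my-app-env \
    --version-label v1
```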

AWS CLI S3API find newest folder in path

久未见 submitted on 2019-12-08 19:23:42
Question: I've got a very large bucket (hundreds of thousands of objects). I've got a path (let's say s3://myBucket/path1/path2). /path2 gets uploads that are also folders. So a sample might look like:
s3://myBucket/path1/path2/v6.1.0
s3://myBucket/path1/path2/v6.1.1
s3://myBucket/path1/path2/v6.1.102
s3://myBucket/path1/path2/v6.1.2
s3://myBucket/path1/path2/v6.1.25
s3://myBucket/path1/path2/v6.1.99
S3 doesn't take version-number sorting into account (which makes sense), but alphabetically the last in
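Since S3 returns keys in lexicographic order, the listing has to be re-sorted client-side. A minimal sketch using version-aware sorting; the bucket and prefix are taken from the question, and the -V flag assumes GNU sort:

```
# List the "folders" under the prefix, version-sort them, keep the newest one
aws s3 ls s3://myBucket/path1/path2/ \
  | awk '{print $2}' \
  | sort -V \
  | tail -n 1
```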

AWS S3: Is there any way to count rows of an uploaded file?

早过忘川 submitted on 2019-12-08 13:13:22
Question: We'd like to check the row count of files uploaded to AWS S3, to confirm that the upload finished correctly. Currently we simply send the exported files with the Windows AWS CLI after exporting them from SQL Server, as follows: aws s3 cp !SEND_FILE! %S3_DIR%/ Kindly let us know if there is any way to check the row count of uploaded files in the S3 bucket. If there is no appropriate way to count them, getting an upload error code as an alternative would also be appreciated. Thanks for your advice in
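S3 does not report a row count at upload time, but for CSV-like files S3 Select can count rows after the fact. A hedged sketch; the bucket and key are placeholders, and the quoting would need adjusting for a Windows batch script:

```
# Count the rows of an uploaded CSV object without downloading it
aws s3api select-object-content \
    --bucket my-bucket \
    --key exports/data.csv \
    --expression "SELECT COUNT(*) FROM s3object" \
    --expression-type SQL \
    --input-serialization '{"CSV": {"FileHeaderInfo": "NONE"}}' \
    --output-serialization '{"CSV": {}}' \
    row_count.txt

# As a simpler integrity check, the CLI exit code reports upload failures:
aws s3 cp export.csv s3://my-bucket/exports/ || echo "upload failed"
```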

AWS errors when installing requirements with Python 2.7 when EB environment is Python 3.6

[亡魂溺海] submitted on 2019-12-08 07:41:12
Question: How do I get the AWS EB instance to use the Python 3 version that is already installed on the instance? I can't get a new environment running with Python 3.6 and Django 2.1+. Local (not in a virtual env):
which python -> /usr/local/bin/python
python -V -> Python 2.7.15
which python3 -> /usr/local/bin/python3
python3 -V -> Python 3.6.5
which pip -> /usr/local/bin/pip
pip -V -> pip 18.0 from /usr/local/lib/python3.6/site-packages/pip (python 3.6)
which pip3 -> /usr/local/bin/pip3
pip3 -V ->
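The environment's Python version is determined by the platform (solution stack) it was created with, not by the interpreters installed locally. A hedged sketch of how to check which Python 3.6 stacks are offered before creating or rebuilding the environment; the JMESPath filter is illustrative:

```
# Show the solution stacks that run Python 3.6 in the configured region
aws elasticbeanstalk list-available-solution-stacks \
    --query "SolutionStacks[?contains(@, 'Python 3.6')]" \
    --output table
```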

AWS CLI query to get the CloudFront "Domain Name" with a specific origin name

天涯浪子 submitted on 2019-12-07 18:09:08
Question: This is my JSON output from the AWS CLI. I want to get xxxxxxxx.cloudfront.net using the Origin DomainName example1.com with an AWS CLI query only (I know how to do this filtering with jq, awk, cut, and grep).
{
  "DistributionList": {
    "Items": [
      {
        "WebACLId": "",
        "Origins": {
          "Items": [
            {
              "OriginPath": "",
              "CustomOriginConfig": {
                "OriginProtocolPolicy": "http-only",
                "HTTPPort": 80,
                "HTTPSPort": 443
              },
              "Id": "DNS for Media Delivery",
              "DomainName": "example1.com"
            }
          ],
          "Quantity": 1
        },
        "DomainName": "xxxxxxxx
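A hedged sketch of a pure --query (JMESPath) filter that selects distributions whose origins include example1.com and prints their CloudFront domain names, with no jq or awk involved:

```
# Print the distribution DomainName for entries whose Origins contain example1.com
aws cloudfront list-distributions \
    --query "DistributionList.Items[?Origins.Items[?DomainName=='example1.com']].DomainName" \
    --output text
```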

HTTPSConnectionPool(host='s3-us-west-1b.amazonaws.com', port=443): Max retries exceeded with url

a 夏天 submitted on 2019-12-07 16:13:35
Question: I am trying to copy a file from my AWS EC2 instance to an S3 bucket folder, but I am getting an error. Here is the command: aws s3 cp /home/abc/icon.jpg s3://mybucket/myfolder This is the error I am getting: upload failed: ./icon.jpg to s3://mybucket/myfolder/icon.jpg HTTPSConnectionPool(host='s3-us-west-1b.amazonaws.com', port=443): Max retries exceeded with url: /mybucket/myfolder/icon.jpg (Caused by : [Errno -2] Name or service not known) I have already configured the config file for aws cli
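The hostname s3-us-west-1b.amazonaws.com suggests the configured region was set to an Availability Zone (us-west-1b) rather than a region (us-west-1), so the S3 endpoint name cannot resolve. A minimal sketch of the usual fix, assuming the bucket really lives in us-west-1:

```
# Point the CLI at the region (not the AZ) that hosts the bucket, then retry
aws configure set region us-west-1
aws s3 cp /home/abc/icon.jpg s3://mybucket/myfolder/
```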

aws configure delete access key profile

让人想犯罪 __ submitted on 2019-12-07 15:46:35
Question: I seem to be having difficulty deleting the access key profile I created for a test user using aws configure --profile testuser. I have tried deleting the entries in my ~/.aws directory; however, when I run aws configure, I am getting the following error: botocore.exceptions.ProfileNotFound: The config profile (testuser) could not be found A workaround is adding [profile testuser] to my ~/.aws/config file, but I don't want to do that. I want to remove all traces of this testuser profile from my
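Deleting a profile cleanly usually means removing its section from both ~/.aws/credentials and ~/.aws/config and making sure no environment variable still selects it, which would explain the ProfileNotFound error persisting. A hedged sketch of the checklist:

```
# A profile can be referenced from three places; clear all of them.
#   1) ~/.aws/credentials  -> delete the [testuser] section
#   2) ~/.aws/config       -> delete the [profile testuser] section
#   3) environment         -> make sure nothing still points at it
env | grep -i aws_
unset AWS_PROFILE AWS_DEFAULT_PROFILE
aws configure list    # should now fall back to the default profile
```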

aws cli s3 bucket remove object with date condition

心已入冬 submitted on 2019-12-07 15:09:50
Question: How do I remove objects from an S3 bucket with a date condition, recursively, using the AWS CLI? I am using this command for listing: aws s3 ls --recursive s3://uat-files-transfer-storage/ | awk '$1 < "2018-02-01 11:13:29" {print $0}' | sort -n It runs perfectly, but when I use this command with rm it deletes all files: aws s3 rm --recursive s3://uat-files-transfer-storage/ | awk '$1 < "2018-02-01 11:13:29" {print $0}' | sort -n Any solution? Answer 1: You're on the right track. To understand what's going on, let's
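The pipe in the rm variant does nothing because aws s3 rm --recursive deletes everything before awk ever sees its output; the filter has to run on the listing, and the delete has to happen per key. A minimal sketch under that assumption, using the bucket and cut-off date from the question and assuming keys contain no spaces:

```
# List everything, keep only keys older than the cut-off, delete them one by one
aws s3 ls --recursive s3://uat-files-transfer-storage/ \
  | awk '$1 < "2018-02-01" {print $4}' \
  | while read -r key; do
      aws s3 rm "s3://uat-files-transfer-storage/$key"
    done
```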