aws-cli

Check if file exists in s3 using ls and wildcard

Submitted by 烈酒焚心 on 2019-12-21 03:52:22

Question: It seems so simple, but I can't get the syntax right. I want to check whether a file exists in my S3 bucket using wildcards, something like: aws s3 ls s3://my-bucket/folder/*myfile* The goal is to see whether a file called 2016_myfile.txt or a file called 2011_myfile.csv exists in this bucket. When I run the command it returns nothing, even though I know the file is there.

Answer 1: (re-drafted from a comment, as it appears to have answered the question) I myself tried, and failed, to use wildcards in…
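A common workaround (an assumption here, since the answer is cut off above) is that `aws s3 ls` does not expand shell-style wildcards, so you list the prefix and filter client-side with `grep`; the bucket and pattern below are taken from the question:

```shell
# "aws s3 ls" has no wildcard support; list the prefix and filter locally.
aws s3 ls s3://my-bucket/folder/ --recursive | grep myfile

# grep exits 0 when it matches and 1 when it does not,
# so the existence check can drive a conditional:
if aws s3 ls s3://my-bucket/folder/ --recursive | grep -q myfile; then
  echo "a matching file exists"
fi
```

Quoting the pattern (e.g. `grep 'myfile'`) avoids the local shell expanding it before `grep` sees it.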

Throttling S3 commands with aws cli

Submitted by 不问归期 on 2019-12-21 03:29:24

Question: I'm running a backup script that uses the AWS CLI to perform an S3 sync command every night on my MediaTemple server. This ran without fail for months, but after I updated my Plesk installation, MediaTemple now disables my server every night when the backup script runs, due to excessive usage. The limits I appear to be exceeding are as follows: RESOURCE INFO: Packets per second limit: 35000; Packets per second detected: 42229.11667000000306870788; Bytes per second limit: 50000000; Bytes per second…
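Although the answer is truncated, the AWS CLI itself exposes throttling knobs for `s3` commands in `~/.aws/config`. The values below are illustrative, not tuned for MediaTemple's specific limits:

```ini
# ~/.aws/config — illustrative throttling values for "aws s3" commands
[default]
s3 =
  max_concurrent_requests = 2
  max_queue_size = 100
  max_bandwidth = 4MB/s
```

Lowering `max_concurrent_requests` and capping `max_bandwidth` reduces both packets and bytes per second at the cost of a slower sync.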

How to abort all incomplete multipart uploads for a bucket

Submitted by 别来无恙 on 2019-12-20 19:31:14

Question: Sometimes multipart uploads hang or fail to complete for some reason. In that case you are stuck with orphaned parts that are tricky to remove. You can list them with: aws s3api list-multipart-uploads --bucket $BUCKETNAME I am looking for a way to abort them all.

Answer 1: Assuming you have awscli set up and it outputs JSON, you can use jq to project the needed keys with: BUCKETNAME=<xxx> aws s3api list-multipart-uploads --bucket $BUCKETNAME \ | jq -r '.Uploads[] | "--key \"\(.Key)\" --upload…
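The answer above is cut off, so here is one complete sketch of the same idea: project each upload's key and upload ID with jq, then feed them to `abort-multipart-upload`. It assumes `awscli` and `jq` are installed and the bucket name is yours:

```shell
# Abort every incomplete multipart upload in a bucket (sketch).
BUCKETNAME=my-bucket

aws s3api list-multipart-uploads --bucket "$BUCKETNAME" \
  | jq -r '.Uploads[]? | "\(.Key)\t\(.UploadId)"' \
  | while IFS="$(printf '\t')" read -r key upload_id; do
      # Each orphaned upload is identified by its key + upload ID.
      aws s3api abort-multipart-upload \
        --bucket "$BUCKETNAME" --key "$key" --upload-id "$upload_id"
    done
```

Using a tab separator and `read` handles keys containing spaces; `.Uploads[]?` makes the pipeline a no-op when the bucket has no pending uploads. A lifecycle rule (`AbortIncompleteMultipartUpload`) is the longer-term fix.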

How to run AWS ECS Task overriding environment variables

Submitted by 北战南征 on 2019-12-20 18:16:06

Question: According to the AWS ECS command-line reference, we can override environment variables via the CLI using --overrides (a structure). How do I pass name/value pairs (structure or JSON) on the command line? [ { "name" : "NAME", "value" : "123" }, { "name" : "DATE", "value" : "1234-12-12" }, { "name" : "SCRIPT", "value" : "123456" } ] I'm looking for a way to override the above environment variables using the AWS ECS CLI, something like: aws ecs run-task --overrides <<just environment vars here>> --task-definition ...
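One way to pass the question's name/value pairs is to wrap them in the `containerOverrides` structure that `--overrides` expects. The task definition `my-task` and container name `my-container` below are placeholders; the container name must match one in the task definition:

```shell
# Sketch: run a task with per-run environment overrides.
aws ecs run-task \
  --task-definition my-task \
  --overrides '{
    "containerOverrides": [{
      "name": "my-container",
      "environment": [
        {"name": "NAME",   "value": "123"},
        {"name": "DATE",   "value": "1234-12-12"},
        {"name": "SCRIPT", "value": "123456"}
      ]
    }]
  }'
```

The environment array from the question is embedded unchanged; only the surrounding `containerOverrides` wrapper is added.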

How to list Amazon S3 bucket contents by modified date?

Submitted by 霸气de小男生 on 2019-12-20 16:23:09

Question: We often load files into a common S3 bucket, which makes it hard to keep track of the data in it. How can I view the objects uploaded on a particular date?

Answer 1: One solution is to use s3api. It works easily if you have fewer than 1000 objects; otherwise you need to work with pagination. s3api can list all objects and exposes the LastModified attribute of keys imported into S3. The output can then be sorted to find files after or before a date, matching a…
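Concretely, the sorting and date filtering can be done server-side of the CLI with a JMESPath `--query`; the bucket name and cutoff date below are illustrative:

```shell
# List all objects sorted by their LastModified timestamp.
aws s3api list-objects-v2 --bucket my-bucket \
  --query 'sort_by(Contents, &LastModified)[].{Key: Key, Date: LastModified}' \
  --output table

# Or list only keys modified on or after a given date; ISO-8601
# timestamps compare correctly as strings.
aws s3api list-objects-v2 --bucket my-bucket \
  --query "Contents[?LastModified>='2019-12-01'].Key" --output text
```

As the answer notes, `list-objects-v2` returns at most 1000 keys per call; beyond that you need `--starting-token`/pagination (or let the CLI auto-paginate without `--max-items`).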

How to export a dynamodb table as a csv through aws-cli ( without using pipeline)

Submitted by 只愿长相守 on 2019-12-20 11:50:58

Question: I am new to the aws-cli and am trying to export my DynamoDB table as a CSV so that I can import it directly into PostgreSQL. Is there a way to do that using the aws-cli? So far I have come across the command aws dynamodb scan --table-name, but this does not offer a CSV export option. Also, this command prints the output to my command prompt and I am not sure how to write it to a file.

Answer 1: If all items have the same attributes, e.g. id and name, both of which are strings, then…
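Under the answer's assumption (every item has the same string attributes, here `id` and `name`), one sketch is to pipe the scan's JSON through jq's `@csv` filter; the table name is a placeholder:

```shell
# Export a DynamoDB table to CSV (assumes uniform string attributes id, name).
aws dynamodb scan --table-name my-table --output json \
  | jq -r '.Items[] | [.id.S, .name.S] | @csv' > items.csv
```

The `.S` accessor unwraps DynamoDB's typed attribute values (`{"S": "..."}` for strings; numbers would be `.N`). Redirecting with `>` answers the write-to-a-file part of the question; the resulting file loads into PostgreSQL with `\copy my_table FROM 'items.csv' CSV`.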

AWS: The config profile (MyName) could not be found

Submitted by ≯℡__Kan透↙ on 2019-12-20 10:15:48

Question: Every time I try to configure something with AWS I get the following error: "The config profile (myname) could not be found", e.g. when running aws configure. I'm using Python 3.4 and I want to use AWS CLI Keyring to encrypt my credentials.

Answer 1: I think something is missing from the AWS documentation at http://docs.aws.amazon.com/lambda/latest/dg/setup-awscli.html: it does not mention that you should edit the file ~/.aws/config to add your username profile. There are two ways to do this: edit ~/.aws…
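For reference, the fix the answer starts to describe is adding the named profile to both files; note that the config file uses a `profile` prefix in the section header while the credentials file does not (the region and key values below are placeholders):

```ini
# ~/.aws/config — named profiles need the "profile " prefix here
[profile myname]
region = us-east-1
output = json
```

```ini
# ~/.aws/credentials — no "profile " prefix in this file
[myname]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>
```

With both sections in place, `aws configure --profile myname` (or `AWS_PROFILE=myname`) resolves the profile.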

Error "You must specify a region" when running aws ecs list-container-instances

Submitted by 一世执手 on 2019-12-20 10:15:08

Question: I am trying to use the AWS container service, following the documentation at http://docs.aws.amazon.com/AmazonECS/latest/developerguide/ECS_GetStarted.html. The following error is thrown when running the command aws ecs list-container-instances --cluster default: "You must specify a region. You can also configure your region by running 'aws configure'." The documentation does not mention anything about specifying a default region. How do we do it from the console?

Answer 1: I think you need to use, for example: aws…
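The truncated answer likely shows the `--region` flag; there are three common ways to supply a region, and the region value below is only an example:

```shell
# 1. Per command, with the --region flag:
aws ecs list-container-instances --cluster default --region us-east-1

# 2. Per shell session, via an environment variable:
export AWS_DEFAULT_REGION=us-east-1

# 3. Persisted in ~/.aws/config for the default profile:
aws configure set region us-east-1
```

Precedence runs top to bottom: the flag beats the environment variable, which beats the config file.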

How can I use wildcards to `cp` a group of files with the AWS CLI

Submitted by 耗尽温柔 on 2019-12-20 09:32:05

Question: I'm having trouble using * in the AWS CLI to select a subset of files from a certain bucket. Adding * to the path like this does not seem to work: aws s3 cp s3://data/2016-08* .

Answer 1: To download multiple files from an AWS bucket to your current directory, you can use the --recursive, --exclude, and --include flags like this: aws s3 cp s3://data/ . --recursive --exclude "*" --include "2016-08*" For more info on how to use these filters: http://docs.aws.amazon.com/cli/latest/reference/s3/#use-of-exclude…
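One point worth making explicit about the answer's command: the filters are applied in the order given, so the broad `--exclude "*"` must come first and the narrower `--include` re-admits only the files you want. The bucket and prefix come from the question; the second pattern is just a further illustration:

```shell
# Exclude everything, then re-include only keys starting with "2016-08".
# Quote the patterns so the local shell does not expand them first.
aws s3 cp s3://data/ . --recursive --exclude "*" --include "2016-08*"

# Same mechanism for any other pattern, e.g. only CSV files:
aws s3 cp s3://data/ . --recursive --exclude "*" --include "*.csv"
```

Reversing the flag order (`--include` before `--exclude "*"`) would exclude everything, since the last matching filter wins.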