aws-cli

Querying multiple values

这一生的挚爱 submitted on 2020-07-30 05:54:05

Question: I'm trying to filter by multiple values, but I can't get an AND clause to work (e.g. filter1 AND filter2, etc.). Show me snapshots where the database name is 'testing':

    aws rds describe-db-snapshots --include-shared --query 'DBSnapshots[?DBInstanceIdentifier==`testing`].{DBNAME:DBInstanceIdentifier,SNAPSHOT:DBSnapshotIdentifier}'

    [
        { "SNAPSHOT": "test1", "DBNAME": "testing" },
        { "SNAPSHOT": "test2", "DBNAME": "testing" },
        { "SNAPSHOT": "test3", "DBNAME": "testing" },
        { "SNAPSHOT…
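A minimal sketch of the usual fix: JMESPath combines conditions inside a single `[? ... ]` filter with `&&` rather than with repeated filters. The second condition on `SnapshotType` below is a hypothetical example field added for illustration; substitute your own. The command is printed as a dry run rather than executed:

```shell
# JMESPath ANDs conditions inside one [? ... ] filter with &&.
# SnapshotType==`manual` is a hypothetical second condition for illustration.
QUERY='DBSnapshots[?DBInstanceIdentifier==`testing` && SnapshotType==`manual`].{DBNAME:DBInstanceIdentifier,SNAPSHOT:DBSnapshotIdentifier}'
printf "aws rds describe-db-snapshots --include-shared --query '%s'\n" "$QUERY"
```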

How can multiple files be specified with “-files” in the CLI of Amazon for EMR?

左心房为你撑大大i submitted on 2020-07-23 04:08:08

Question: I am trying to start an Amazon EMR cluster via the AWS CLI, but I am a little confused about how to specify multiple files. My current call is as follows:

    aws emr create-cluster --steps Type=STREAMING,Name='Intra country development',ActionOnFailure=CONTINUE,Args=[-files,s3://betaestimationtest/mapper.py,-files,s3://betaestimationtest/reducer.py,-mapper,mapper.py,-reducer,reducer.py,-input,s3://betaestimationtest/output_0_inter,-output,s3://betaestimationtest/output_1_intra] --ami…
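Hadoop streaming's `-files` option takes a single comma-separated list rather than being repeated, so one commonly suggested fix is to pass both scripts as one value. Keeping the CLI's shorthand parser from splitting on the embedded comma (escaping it as `\,`) is the fiddly part and is an assumption here, not something confirmed by the question. A dry-run sketch that only prints the candidate command:

```shell
# Sketch: -files takes ONE comma-separated list; the inner comma is escaped as \,
# so the shorthand parser does not split the list (this escaping is an assumption).
FILES='s3://betaestimationtest/mapper.py\,s3://betaestimationtest/reducer.py'
printf "aws emr create-cluster --steps Type=STREAMING,Name='Intra country development',ActionOnFailure=CONTINUE,Args=[-files,%s,-mapper,mapper.py,-reducer,reducer.py,-input,s3://betaestimationtest/output_0_inter,-output,s3://betaestimationtest/output_1_intra]\n" "$FILES"
```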

FFmpeg: Pipe segments to s3

£可爱£侵袭症+ submitted on 2020-07-19 06:03:46

Question: I'd like to pipe ffmpeg segments to S3 without writing them to disk.

    ffmpeg -i t2.mp4 -map 0 -c copy -f segment -segment_time 20 output_%04d.mkv

Is it possible to modify this command so that ffmpeg writes segments to an S3 bucket? Something like this, perhaps?

    ffmpeg -i t2.mp4 -map 0 -c copy -f segment -segment_time 20 pipe:1 \
      | aws s3 cp - s3://bucket/output_%04d.mkv

When I run the command above I receive this error:

    Could not write header for output file #0 (incorrect codec parameters ?):…
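The segment muxer opens one real output file per segment, so it cannot expand `output_%04d.mkv` into a single pipe. One workaround sketch (not taken from the thread): let ffmpeg write segments locally, then upload each finished file. The uploader command is a parameter so the loop can be dry-run with `echo` before pointing it at `aws s3 cp`; the bucket name is a placeholder.

```shell
# Workaround sketch: write segments to a local dir, then upload each one.
# The uploader command is parameterized; "s3://bucket" is a placeholder.
upload_segments() {
    dir=$1; dest=$2; up=${3:-"aws s3 cp"}
    for f in "$dir"/output_*.mkv; do
        [ -e "$f" ] || continue
        $up "$f" "$dest/$(basename "$f")"
    done
}

# Real use, after ffmpeg has written into ./segments:
#   ffmpeg -i t2.mp4 -map 0 -c copy -f segment -segment_time 20 segments/output_%04d.mkv
#   upload_segments segments s3://bucket

# Dry run against two empty stand-in segments:
mkdir -p segments
: > segments/output_0000.mkv
: > segments/output_0001.mkv
upload_segments segments s3://bucket "echo aws s3 cp"
```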

Copy multiple files from s3 bucket

不羁的心 submitted on 2020-07-16 22:09:58

Question: I am having trouble downloading multiple files from an AWS S3 bucket to my local machine. I have the filenames that I want to download and I do not want any others. How can I do that? Is there any kind of loop in aws-cli I can use to iterate? There are a couple hundred files to download, so it does not seem possible to use a single command that takes all the filenames as arguments.

Answer 1: A bash script can read all the filenames from a file filename.txt:

    #!/bin/bash
    set…
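The answer's script is truncated; a sketch of the loop it describes follows. The bucket name, key names, and destination directory are placeholders, and the copy command is a parameter so the loop can be dry-run with `echo` before switching to the real `aws s3 cp`:

```shell
# Sketch: read one S3 key per line from a list file and fetch each.
# Bucket, keys, and dest dir are placeholders; cp_cmd defaults to aws s3 cp.
download_listed() {
    list=$1; bucket=$2; dest=$3; cp_cmd=${4:-"aws s3 cp"}
    mkdir -p "$dest"
    while IFS= read -r key; do
        [ -n "$key" ] || continue
        $cp_cmd "s3://$bucket/$key" "$dest/"
    done < "$list"
}

# Dry run: print the copies that would be made.
printf 'reports/a.csv\nreports/b.csv\n' > filename.txt
download_listed filename.txt my-bucket ./downloads "echo aws s3 cp"
```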

global name 'ssl' is not defined error while executing aws s3 cp command

丶灬走出姿态 submitted on 2020-07-09 06:41:22

Question: I am trying to upload a file to AWS S3 using the AWS CLI. I am on a RedHat 4 system; the Python version is 2.7.9 and the OpenSSL version is 0.9.8v (19 Apr 2012). I have installed the AWS CLI, and when I executed the command below I got an error about ssl:

    [test-user@redhat4 ~]$ aws s3 cp /export/home/test_dir/test_file.txt s3://test-bucket/ --region us-west-2
    upload failed: test_file.txt to s3://test-bucket/test_file.txt global name 'ssl' is not defined

Do I have…
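This failure usually means the Python interpreter running the CLI was built without a working ssl module (the 0.9.8-era OpenSSL on the box is a likely culprit). A quick diagnostic sketch, checked here with whatever `python3` is on PATH (the asker's box has Python 2.7.9, so substitute the interpreter the CLI actually uses):

```shell
# On a healthy install this prints the OpenSSL version Python was linked against;
# on a box like the asker's, the import itself would fail.
python3 -c 'import ssl; print(ssl.OPENSSL_VERSION)'
```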

Error in AWS CLI (Amplify) when executing the “amplify init” command in a Windows environment (for an Android Studio project)

纵饮孤独 submitted on 2020-07-07 11:13:16

Question: I'm new to the AWS SDK and I'm trying to create a new Android Studio project with the AWS SDK for mobile. My ultimate goal is to get the AWS SDK working in my Android Studio project. I tried adding dependencies in Gradle, but that does not add all the packages I need and was not helping me, so I followed the official AWS documentation, which led me to the Amplify CLI. As suggested by the official AWS documentation (this is the link to the documentation page I'm talking about), I was following along…

awscli version 2 on alpine linux

限于喜欢 submitted on 2020-06-12 01:48:12

Question: I was trying to put awscli v2 into an Alpine-based Docker container and it fails with the following error message:

    /aws/install: line 78: /aws/dist/aws: not found

Considering that the file itself is there and can be listed with ls, I would guess that some libraries the executable ./aws/dist/aws relies on are not present on Alpine. Does someone know which libraries those might be?

Answer 1: Actually, with a bit of effort it is possible to run AWS CLI v2 on Alpine:

    FROM alpine:3.11
    ENV…
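The "not found" comes from the dynamic loader: the bundled aws v2 binary is linked against glibc, and Alpine ships musl, so the loader path in the binary's ELF header does not exist. The answer's Dockerfile is truncated; a hedged reconstruction of the usual community approach (the glibc package source, versions, and URLs below are my assumptions, not the answer's own content) looks roughly like:

```dockerfile
# Sketch only: install a glibc compatibility layer (e.g. sgerrand's
# alpine-pkg-glibc; version 2.31-r0 is a placeholder), then the official
# aws v2 installer. Verify exact URLs and versions before use.
FROM alpine:3.11
RUN apk add --no-cache curl unzip \
 && curl -sLo /etc/apk/keys/sgerrand.rsa.pub https://alpine-pkgs.sgerrand.com/sgerrand.rsa.pub \
 && curl -sLo glibc.apk https://github.com/sgerrand/alpine-pkg-glibc/releases/download/2.31-r0/glibc-2.31-r0.apk \
 && apk add glibc.apk \
 && curl -so awscliv2.zip https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip \
 && unzip -q awscliv2.zip \
 && ./aws/install \
 && aws --version
```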