aws-cli

Can't connect to local DynamoDb instance via AWS CLI

Submitted by 痴心易碎 on 2019-12-01 22:00:43
Question (migrated from Server Fault because it can be answered on Stack Overflow; migrated 3 years ago): I've created a local instance of DynamoDB with the following steps: in Visual Studio I installed the AWS Explorer and created a new local instance on localhost:82. I can successfully work with it from my C# code using the AWS library. I created a table and even put some data into it; I can even see that data in the AWS Explorer in Visual Studio. I need to add a lot of data, so I planned to use the AWS CLI tool. But I can't see any tables or data from the console. I run aws dynamodb list-tables --endpoint-url http://localhost:82 and see this response: { "TableNames": [] } But I am pretty sure that the DB on
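
A likely cause (a sketch, not a confirmed diagnosis; the profile name below is hypothetical): DynamoDB Local keeps a separate database file per access key ID and region unless it is started with `-sharedDb`, so the CLI only sees the SDK's tables if it presents the same credentials and region the C# code used — or if the instance is shared.

```shell
# Option 1: query with the same credentials/region the C# SDK uses.
# "local-sdk" is a hypothetical profile configured with that access
# key and region.
aws dynamodb list-tables \
  --endpoint-url http://localhost:82 \
  --profile local-sdk

# Option 2: restart DynamoDB Local with -sharedDb so every caller,
# regardless of credentials, sees the same single database.
java -Djava.library.path=./DynamoDBLocal_lib \
  -jar DynamoDBLocal.jar -sharedDb -port 82
```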

Redirect output of console to a file on AWS S3

Submitted by 扶醉桌前 on 2019-12-01 16:36:35
Say I have a website that returns JSON data when I send a GET request using curl. I want to redirect the output of curl to AWS S3; a new file should be created on S3 for it. Currently I am able to redirect the output to store it locally: curl -s -X GET 'http://website_that_returns_json.com' > folder_to_save/$(date +"%Y-%m-%d_%H-%M.json") I have the AWS CLI and s3cmd installed. How would I redirect the output of curl to create a new file on AWS S3? Assume: the AWS S3 access key and secret key are already set. Location to store the file: mybucket/$(date +"%Y-%m-%d_%H-%M.json") The AWS Command-Line
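
One way to do this (a sketch; the bucket name and URL are the question's own placeholders) is to pipe curl straight into `aws s3 cp`, which accepts `-` as the source argument to mean "read from stdin":

```shell
# Build the timestamped key first so the same name can be logged or reused.
key="$(date +"%Y-%m-%d_%H-%M").json"

# Stream the GET response directly into S3 -- no local file needed.
curl -s -X GET 'http://website_that_returns_json.com' \
  | aws s3 cp - "s3://mybucket/${key}"
```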

List all EC2 instance types in a region or AZ [closed]

Submitted by 谁说我不能喝 on 2019-12-01 06:18:44
While there appear to be a few ways to output and filter some AWS CLI commands into this list, does someone have a nice, easy way to list all EC2 instance types for a specific region? Or perhaps that list is published in a .json file in a bucket someplace, maintained by AWS? I'm simply looking for this sort of output: t1.micro t2.nano t2.micro t2.small ... Well, it seems that at least one programmatic way to do this is to query the AWS Pricing API: #!/bin/bash curl https://pricing.us-east-1.amazonaws.com/offers/v1.0/aws/AmazonEC2/current/index.json | jq -r '.products[].attributes["instanceType
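
A newer alternative to parsing the pricing file (a sketch; it assumes a recent AWS CLI, since `describe-instance-type-offerings` was added after this question was asked) asks EC2 directly which types a region or AZ offers:

```shell
# List every instance type offered in a region, one per line.
aws ec2 describe-instance-type-offerings \
  --location-type region \
  --region us-east-1 \
  --query 'InstanceTypeOfferings[].InstanceType' \
  --output text | tr '\t' '\n' | sort -u

# Same idea narrowed to a single Availability Zone.
aws ec2 describe-instance-type-offerings \
  --location-type availability-zone \
  --filters Name=location,Values=us-east-1a \
  --region us-east-1 \
  --query 'InstanceTypeOfferings[].InstanceType' \
  --output text | tr '\t' '\n' | sort -u
```

The trailing `tr | sort -u` turns the CLI's tab-separated text output into the one-type-per-line list the question asks for.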

How can I change the content-type of an object using aws cli?

Submitted by 岁酱吖の on 2019-12-01 03:17:55
I've got several objects stored in Amazon S3 whose content-type I need to change from text/html to application/rss+xml. I gather that it should be possible to do this with a copy command, specifying the same path for the source and destination. I'm trying to do this using the AWS CLI tools, but I'm getting this error: $ aws s3 cp s3://mybucket/feed/ogg/index.html \ s3://mybucket/feed/ogg/index.html \ --content-type 'application/rss+xml' copy failed: s3://mybucket/feed/ogg/index.html to s3://mybucket/feed/ogg/index.html A client error (InvalidRequest) occurred when calling the CopyObject
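
The usual fix (a sketch using the question's own paths): S3 rejects a copy of an object onto itself unless something changes, and by default the copy preserves the original metadata. Adding `--metadata-directive REPLACE` tells S3 to rewrite the metadata, which both makes the in-place copy valid and applies the new content type:

```shell
# REPLACE makes S3 write new metadata instead of copying the old,
# so the in-place copy is allowed and the content type is updated.
aws s3 cp s3://mybucket/feed/ogg/index.html \
          s3://mybucket/feed/ogg/index.html \
  --content-type 'application/rss+xml' \
  --metadata-directive REPLACE
```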

Is there a way to export an AWS CLI Profile to Environment Variables?

Submitted by 谁说我不能喝 on 2019-11-30 17:40:57
When working with certain third-party tools like Terraform, it's not easy to specify an AWS CLI profile, and I like working with environment variables better than with profiles. Is there a way to have the AWS CLI simply export the current profile as the AWS_ACCESS_KEY_ID and AWS_SECRET_KEY environment variables for my session? You could use the following commands to read the values for your environment variables: aws configure get default.aws_access_key_id aws configure get default.aws_secret_access_key If you have another profile, you can change it; another way to write this is aws configure get aws
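
Building on those `aws configure get` calls, the export can be wrapped in a few lines (a sketch; note that the variable Terraform and the AWS SDKs actually read is AWS_SECRET_ACCESS_KEY, not the AWS_SECRET_KEY named in the question):

```shell
# "default" is the profile to export; swap in any named profile.
profile=default
export AWS_ACCESS_KEY_ID="$(aws configure get "${profile}.aws_access_key_id")"
export AWS_SECRET_ACCESS_KEY="$(aws configure get "${profile}.aws_secret_access_key")"
export AWS_DEFAULT_REGION="$(aws configure get "${profile}.region")"
```

Because `export` only affects the current shell, this belongs in the interactive session (or is sourced), not run as a subprocess script.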

How to run aws configure in a travis deploy script?

Submitted by 喜夏-厌秋 on 2019-11-30 17:25:11
I am trying to get Travis CI to run a custom deploy script that uses the AWS CLI to push a deployment up to my staging server. In my .travis.yml file I have this: before_deploy: - 'curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"' - 'unzip awscli-bundle.zip' - './awscli-bundle/install -b ~/bin/aws' - 'export PATH=~/bin:$PATH' - 'aws configure' And I have set up the following environment variables: AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_DEFAULT_REGION with their correct values in the Travis CI web interface. However, when aws configure runs, it stops and waits
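
The interactive prompt is the problem: `aws configure` always asks for input. Since the AWS CLI reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION straight from the environment, the configure step can simply be dropped (a sketch of the same `before_deploy`, otherwise unchanged):

```yaml
# Sketch: the CLI picks up the three AWS_* variables set in the
# Travis CI web interface, so no `aws configure` step is needed.
before_deploy:
  - 'curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"'
  - 'unzip awscli-bundle.zip'
  - './awscli-bundle/install -b ~/bin/aws'
  - 'export PATH=~/bin:$PATH'
```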

How to display only files from aws s3 ls command?

Submitted by 走远了吗. on 2019-11-30 16:21:17
Question: I am using the AWS CLI to list the files in an S3 bucket using the following command (documentation): aws s3 ls s3://mybucket --recursive --human-readable --summarize This command gives me the following output: 2013-09-02 21:37:53 10 Bytes a.txt 2013-09-02 21:37:53 2.9 MiB foo.zip 2013-09-02 21:32:57 23 Bytes foo/bar/.baz/a 2013-09-02 21:32:58 41 Bytes foo/bar/.baz/b 2013-09-02 21:32:57 281 Bytes foo/bar/.baz/c 2013-09-02 21:32:57 73 Bytes foo/bar/.baz/d 2013-09-02 21:32:57 452 Bytes foo/bar/
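
A common approach (a sketch; it assumes the plain listing, since `--human-readable` splits the size into two fields): without the extra flags each line is four whitespace-separated columns — date, time, size, key — so awk can keep just the last one:

```shell
# Plain listing: date, time, size, key -- print only the key column.
aws s3 ls s3://mybucket --recursive | awk '{print $4}'
```

This breaks on keys containing spaces; for those, `aws s3api list-objects` with a `--query 'Contents[].Key'` projection is a safer choice.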