aws-cli

Install Redis on an AWS micro instance

Submitted by 泄露秘密 on 2019-12-02 18:13:31
I need to install Redis in the Amazon cloud as part of deploying my npm module kue. Can anyone link me to a step-by-step tutorial, or explain how to do it, considering that I'm not very good with Linux and administration?

Rather than spin up an EC2 instance and install/manage Redis there, you could create an ElastiCache instance running Redis and let AWS manage it all for you. If you really do want to run your own Redis server, then you'll want to launch an EC2 instance and manually install Redis onto it. The AWS and Redis documentation that I've linked to both provide
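A minimal sketch of the two routes described above, assuming Amazon Linux 2 on the EC2 side; the cluster id, node type, and service name are placeholders or assumptions, not values from the question:

    # Route 1: let AWS manage Redis for you via ElastiCache
    aws elasticache create-cache-cluster \
        --cache-cluster-id my-kue-redis \
        --engine redis \
        --cache-node-type cache.t3.micro \
        --num-cache-nodes 1

    # Route 2: install Redis yourself on an EC2 instance running Amazon Linux 2
    sudo amazon-linux-extras install redis6
    sudo systemctl enable --now redis    # assumes the package registers the service as "redis"
    redis-cli ping                       # should answer PONG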

How do I delete a versioned bucket in AWS S3 using the CLI?

Submitted by 淺唱寂寞╮ on 2019-12-02 16:36:19
I have tried both s3cmd:

    $ s3cmd -r -f -v del s3://my-versioned-bucket/

And the AWS CLI:

    $ aws s3 rm s3://my-versioned-bucket/ --recursive

But both of these commands simply add DELETE markers to S3. The command for removing a bucket also doesn't work (from the AWS CLI):

    $ aws s3 rb s3://my-versioned-bucket/ --force
    Cleaning up. Please wait...
    Completed 1 part(s) with ... file(s) remaining
    remove_bucket failed: s3://my-versioned-bucket/ A client error (BucketNotEmpty) occurred when calling the DeleteBucket operation: The bucket you tried to delete is not empty. You must delete all versions in
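A common workaround is to delete every object version and delete marker explicitly through the s3api commands and only then remove the bucket. A sketch using the bucket name from the question; it ignores pagination and the 1,000-key limit of delete-objects, so treat it as a starting point rather than a finished script:

    # collect all object versions and delete them
    aws s3api list-object-versions --bucket my-versioned-bucket \
        --query '{Objects: Versions[].{Key: Key, VersionId: VersionId}}' \
        --output json > versions.json
    aws s3api delete-objects --bucket my-versioned-bucket --delete file://versions.json

    # do the same for the delete markers left behind by earlier rm attempts
    aws s3api list-object-versions --bucket my-versioned-bucket \
        --query '{Objects: DeleteMarkers[].{Key: Key, VersionId: VersionId}}' \
        --output json > markers.json
    aws s3api delete-objects --bucket my-versioned-bucket --delete file://markers.json

    # with no versions left, the bucket can be removed
    aws s3 rb s3://my-versioned-bucket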

AWS CLI $PATH Settings

Submitted by 纵饮孤独 on 2019-12-02 15:00:26
I am following the AWS CLI Setup guide. I have managed to successfully install the tool on my Mac OS X terminal with the following output:

    Running cmd: /usr/bin/python virtualenv.py --python /usr/bin/python /Users/fr/.local/lib/aws
    Running cmd: /Users/fr/.local/lib/aws/bin/pip install --no-index --find-links file:///Users/fr/Downloads/awscli-bundle/packages awscli-1.5.3.tar.gz
    You can now run: /Users/fr/.local/lib/aws/bin/aws --version

My issue is that I have to type the full path /Users/fr/.local/lib/aws/bin/aws to execute any aws command. As per the guide's final step, I should be able to
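The usual fix is to either put the bundle's bin directory on PATH or symlink the aws binary into a directory that is already on PATH. A sketch using the paths from the question; the choice of ~/.bash_profile is an assumption (zsh users would edit ~/.zshrc instead):

    # option 1: add the install directory to PATH
    echo 'export PATH=$HOME/.local/lib/aws/bin:$PATH' >> ~/.bash_profile
    source ~/.bash_profile

    # option 2: symlink the binary into a directory already on PATH
    sudo ln -s /Users/fr/.local/lib/aws/bin/aws /usr/local/bin/aws

    aws --version    # should now work without the full path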

AWS CLI Query - describe-keys with parameters

Submitted by ⅰ亾dé卋堺 on 2019-12-02 10:13:49
So this week I started learning the CLI and seeing what can be done with it. I was given the task of grabbing the following information: the key alias, the key ID, and all associated tags. I have tried many methods... and can't seem to get anywhere. I have only been doing this for around 4 days, and I just began documenting key API calls that will come in use for the future. I can't seem to grab this in a --output table. If anyone could give me some guidance on this. Also, does anyone have any tips for someone who is just starting his Cloud Journey, and any
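A sketch of how the KMS listing calls can be combined with --query and --output table to get the alias, key ID, and tags; the key id in the second command is a placeholder:

    # one table row per alias, with the key it points at
    aws kms list-aliases \
        --query 'Aliases[].{Alias: AliasName, KeyId: TargetKeyId}' \
        --output table

    # tags attached to one specific key (replace the placeholder key id)
    aws kms list-resource-tags \
        --key-id 1234abcd-12ab-34cd-56ef-1234567890ab \
        --query 'Tags[].{Key: TagKey, Value: TagValue}' \
        --output table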

Is there any way to get the platform and OS from the instances?

Submitted by 删除回忆录丶 on 2019-12-02 07:32:33
Question: I am trying to get some information from my AWS EC2 instances. I would like to know if there is a way to pull information like:

    | Platform | Version        |
    |----------|----------------|
    | CentOS   | 6.0 or 7.0     |
    | Ubuntu   | 10.04 or 12.04 |
    | Windows  |                |

I would like to know if this is possible using the SDK. I tried with the Python SDK Boto3 but got no results.

Answer 1: It is not possible with the SDK or CLI unless you have stored that information as tags. The AWS SDK and CLI can help you get information that are
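As a rough illustration of what the CLI can surface on its own: the Platform field only distinguishes Windows instances, so OS names and versions have to come from tags you maintain yourself. The "OS" tag below is hypothetical:

    # Platform is "windows" for Windows instances and empty otherwise;
    # the OS column reads a user-defined "OS" tag, which is an assumption here
    aws ec2 describe-instances \
        --query 'Reservations[].Instances[].{Id: InstanceId, Platform: Platform, OS: Tags[?Key==`OS`] | [0].Value}' \
        --output table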

Stopping an RDS instance via CLI

Submitted by 感情迁移 on 2019-12-02 05:20:36
I am trying to use one of AWS's latest features, which allows you to stop an RDS instance. I followed this doc, which explains that I need to run the command:

    aws rds stop-db-instance --db-instance-identifier mydbinstance

However, when I do that I get this:

    usage: aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
    To see help text, you can run:
      aws help
      aws <command> help
      aws <command> <subcommand> help
    aws: error: argument operation: Invalid choice, valid choices are:
    add-role-to-db-cluster | add-source-identifier-to-subscription
    add-tags-to-resource | apply-pending
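The "Invalid choice" error usually means the installed CLI predates the stop-db-instance operation. Assuming the CLI was installed with pip (the question doesn't say how it was installed), upgrading and retrying would look like:

    # bring the CLI up to a version that knows the newer RDS operations
    pip install --upgrade awscli
    aws --version

    # then retry the original command
    aws rds stop-db-instance --db-instance-identifier mydbinstance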

AWS CLI command works on Bash, but not with PHP shell_exec()

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-02 04:10:57
I would like to trigger the following command:

    aws route53 change-resource-record-sets --hosted-zone-id XXX --change-batch '{ "Comment": "2018-06-19-11:31", "Changes": [ { "Action": "CREATE", "ResourceRecordSet": { "Name": "example.com", "Type": "TXT", "TTL": 60, "ResourceRecords": [ { "Value": "\"something\"" } ] } } ] }'

It works when I trigger it on Bash, but not when I run it from PHP:

    $json = trim(shell_exec($cmd_aws_submit));
    // or:
    $json = trim(shell_exec("{$cmd_aws_submit}"));

AWS expects that the value ("\"something\"") for the TXT record is quoted. I tried to quote it like this:
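One way to side-step the nested shell quoting entirely is to let the CLI read the change batch from a file with its file:// syntax, so PHP only has to build a simple command string. A sketch; the filename is a placeholder and writing the JSON file from PHP is left out:

    # change-batch.json holds the JSON previously passed inline,
    # so no shell escaping of nested quotes is needed when calling from PHP
    aws route53 change-resource-record-sets \
        --hosted-zone-id XXX \
        --change-batch file://change-batch.json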

aws iot describe-endpoint: You must specify a region

Submitted by 怎甘沉沦 on 2019-12-02 01:09:37
I am following this tutorial for connecting a Raspberry Pi to AWS IoT using the Node.js SDK. I have done all the other steps, but I am lost at the authentication and certificate step:

    pi@raspberrypi:~ $ aws iot describe-endpoint
    You must specify a region. You can also configure your region by running "aws configure".
    pi@raspberrypi:~ $ aws configure
    AWS Access Key ID [None]:
    AWS Secret Access Key [None]:
    Default region name [None]:
    Default output format [None]:
    pi@raspberrypi:~ $ ls
    2016-02-24-204612_1920x1080_scrot.png  Desktop  get-pip.py  node_modules  python_games  sources  WiringPi  aws  device.cfg
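The error simply means the CLI has no region to work with. Either set a default once or pass one per call; the region value below is only an example and should match the region where the IoT resources were created:

    # set a default region once (value is an example)
    aws configure set region us-east-1

    # or pass the region explicitly on each call
    aws iot describe-endpoint --region us-east-1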

S3 - how to get a fast line count of a file? wc -l is too slow

Submitted by 眉间皱痕 on 2019-12-01 22:58:01
Does anyone have a quick way of getting the line count of a file hosted in S3? Preferably using the CLI or s3api, but I am open to Python/boto as well. Note: the solution must run non-interactively, i.e. in an overnight batch. Right now I am doing this; it works but takes around 10 minutes for a 20 GB file:

    aws s3 cp s3://foo/bar - | wc -l

Here are two methods that might work for you. Amazon S3 has a new feature called S3 Select that allows you to query files stored on S3. You can perform a count of the number of records (lines) in a file, and it can even work on GZIP files. Results may vary depending upon
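A sketch of the S3 Select route from the CLI; the bucket and key come from the question, and the CSV serialization settings are an assumption that the object is plain line-delimited text:

    # count records server-side instead of streaming 20 GB through wc -l
    aws s3api select-object-content \
        --bucket foo \
        --key bar \
        --expression "SELECT COUNT(*) FROM s3object" \
        --expression-type SQL \
        --input-serialization '{"CSV": {}, "CompressionType": "NONE"}' \
        --output-serialization '{"CSV": {}}' \
        line_count.txt
    cat line_count.txt    # a single number: the record count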