amazon-rds

AWS - Aurora replicas

Submitted by 痴心易碎 on 2019-12-11 08:54:46
Question: Scenario: I have two Aurora reader replicas. I make many calls to my system (high load), yet I see only one replica working at 99.30% while the other one is not doing anything at all. Why? Is this second replica ONLY there to cover failures of the first one, or is it possible to make both share the load?

Answer 1: In your RDS console, you should be able to look at each of the 3 instances aurora-databasecluster-xxx.cluster-yyy.us-east-1.rds.amazonaws.com:3306 zz0.yyy.us-east-1.rds.amazonaws
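The answer is truncated above, but the standard way to spread reads across Aurora replicas is to connect through the cluster's reader endpoint rather than a single instance endpoint. A minimal boto3 sketch for looking it up (the cluster identifier is a placeholder):

    import boto3

    rds = boto3.client('rds', region_name='us-east-1')

    # 'aurora-databasecluster-xxx' is hypothetical; use your own identifier.
    cluster = rds.describe_db_clusters(
        DBClusterIdentifier='aurora-databasecluster-xxx'
    )['DBClusters'][0]

    # The reader endpoint round-robins across reader replicas at the DNS
    # level; clients that cache DNS or hold one long-lived connection can
    # still end up pinned to a single replica.
    print(cluster['ReaderEndpoint'])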

AWS RDS DescribeDbInstances - limit result list only to specific instance using IaaC

Submitted by 江枫思渺然 on 2019-12-11 08:46:04
Question: I am trying to provision an architecture with Terraform and ensure that applications have as little knowledge as possible about anything going on around them. My goal is to have an app that uses Amazon's SDK to ask which RDS instances are available to it and then to connect to one of them. This way, no outside information (db-instance-identifier or tag) is required. The application would just describe the RDS instances, and since the EC2 instance on which it runs would have been given permissions by
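The question is cut off above, but the discovery pattern it describes can be sketched with boto3. The engine filter is only a guess at how the list might be narrowed; describe_db_instances supports a handful of server-side filters, while tag-based selection requires list_tags_for_resource per instance:

    import boto3

    rds = boto3.client('rds')

    # Server-side filters include 'engine', 'db-instance-id', and
    # 'db-cluster-id'; 'postgres' here is a hypothetical choice.
    instances = rds.describe_db_instances(
        Filters=[{'Name': 'engine', 'Values': ['postgres']}]
    )['DBInstances']

    available = [i for i in instances if i['DBInstanceStatus'] == 'available']
    endpoint = available[0]['Endpoint']  # {'Address': ..., 'Port': ...}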

AWS CLI search resource by tags

Submitted by 自古美人都是妖i on 2019-12-11 08:03:47
Question: I am trying to use the AWS CLI to search for resources by tags. I prepared this tag.json file:

    {
        "TagFilters": [
            {
                "Value": "postgres-dev",
                "Key": "Name"
            }
        ]
    }

and used this command:

    aws resourcegroupstaggingapi get-resources --tag-filters --cli-input-json file://tag.json

However, instead of returning only the databases which have this tag, it returns every resource in my AWS account (EC2, ELB, etc.). Can anyone show me where I went wrong? Thanks a lot.

Answer 1: Can you try it in plain text syntax in
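One thing worth noting: in this API a TagFilter takes a plural "Values" key holding a list, not a singular "Value", which may be why the filter has no effect here. A boto3 equivalent of the intended query, for comparison:

    import boto3

    tagging = boto3.client('resourcegroupstaggingapi')

    resources = tagging.get_resources(
        # Note the plural 'Values' and the list; the JSON above used 'Value'.
        TagFilters=[{'Key': 'Name', 'Values': ['postgres-dev']}],
        ResourceTypeFilters=['rds:db'],  # optional: restrict to RDS instances
    )['ResourceTagMappingList']

    for resource in resources:
        print(resource['ResourceARN'])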

Importing a Large Zipped JSON File from Amazon S3 into AWS RDS-PostgreSQL Using Python

Submitted by 三世轮回 on 2019-12-11 07:57:27
Question: I'm trying to import a large zipped JSON file from Amazon S3 into AWS RDS-PostgreSQL using Python, but I get this error:

    Traceback (most recent call last):
      File "my_code.py", line 64, in <module>
        file_content = f.read().decode('utf-8').splitlines(True)
      File "/usr/lib64/python3.6/zipfile.py", line 835, in read
        buf += self._read1(self.MAX_N)
      File "/usr/lib64/python3.6/zipfile.py", line 925, in _read1
        data = self._decompressor.decompress(data, n)
    MemoryError

    //my_code.py
    import sys
    import
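The f.read() call decompresses the whole member into memory at once, which is what raises MemoryError. A sketch of a streaming alternative, assuming the archive has already been fetched from S3 (file, member, and handler names are placeholders):

    import io
    import zipfile

    with zipfile.ZipFile('data.zip') as archive:
        with archive.open('data.json') as member:
            # TextIOWrapper decodes incrementally, so only one line at a
            # time is held in memory instead of the full decompressed file.
            for line in io.TextIOWrapper(member, encoding='utf-8'):
                handle_row(line)  # hypothetical per-line insert into RDS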

List RDS snapshots created today using Boto3

Submitted by 笑着哭i on 2019-12-11 06:35:11
Question: I am writing a Python Lambda function to describe the list of RDS snapshots created today. The challenge is how to convert datetime.datetime.today() into a format which the RDS client understands.

UPDATE: I have implemented some of the suggested changes, adding a string variable to convert the date expression into a format which Boto3 RDS understands.

    'SnapshotCreateTime': datetime(2015, 1, 1),
    today = (datetime.today()).date()
    rds_client = boto3.client('rds')
    snapshots = rds_client.describe_db
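describe_db_snapshots offers no server-side filter on creation time, so the usual approach is to compare SnapshotCreateTime (returned by boto3 as a timezone-aware datetime) on the client. A minimal sketch:

    from datetime import datetime, timezone

    import boto3

    rds_client = boto3.client('rds')
    today = datetime.now(timezone.utc).date()

    created_today = []
    # Paginate in case there are more snapshots than fit in one response.
    for page in rds_client.get_paginator('describe_db_snapshots').paginate():
        for snap in page['DBSnapshots']:
            created = snap.get('SnapshotCreateTime')  # absent mid-creation
            if created and created.date() == today:
                created_today.append(snap['DBSnapshotIdentifier'])

    print(created_today)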

Move data between AWS RDS instances

Submitted by 偶尔善良 on 2019-12-11 06:25:41
Question: I need to move millions of rows between identical MySQL databases on two different RDS instances. The approach I thought about is this:

- use Data Pipeline to export data from the first instance to Amazon S3
- use Data Pipeline to import data from Amazon S3 into the second instance

My problem is that I need to delete the data on the first instance at the end. Since we're talking about huge amounts of data, I thought about creating a stored procedure to delete the rows in batches. Is there a way to
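The question cuts off, but the batched-delete idea itself is simple enough to sketch. Rather than a stored procedure, the same loop can be driven from Python; the table, column, and connection details below are hypothetical:

    import pymysql

    conn = pymysql.connect(host='source.rds.amazonaws.com',
                           user='abc', password='password', db='mydb')

    BATCH = 10000
    while True:
        with conn.cursor() as cur:
            # DELETE ... LIMIT keeps each transaction, and the locks and
            # binlog churn that come with it, small.
            deleted = cur.execute(
                'DELETE FROM big_table WHERE exported = 1 LIMIT %s', (BATCH,)
            )
        conn.commit()
        if deleted == 0:
            break

    conn.close()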

Does AWS RDS support MySQL as a document store

Submitted by 二次信任 on 2019-12-11 06:14:41
Question: I am able to connect to a normal AWS RDS MySQL instance (5.7.16). But since I have to use MySQL as a document store, I configured the MySQL instance by installing the mysqlx plugin, which is required for the document store. After this, I am trying to connect to the MySQL document store on port 33060 on the same instance but am unable to connect. I am using a Lambda for the connection, which imports the xdevapi (@mysql/xdevapi) package and tries to connect to the MySQL RDS instance on port 33060. But there is no error which I can see
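Before debugging the network path, it may be worth confirming over the classic protocol that the X Plugin is actually active; RDS restricts plugin installation, so the install step may not have taken effect. A hedged check (connection details are placeholders):

    import pymysql

    conn = pymysql.connect(host='mydb.xxxxxxxx.rds.amazonaws.com',
                           user='abc', password='password')

    with conn.cursor() as cur:
        cur.execute('SHOW PLUGINS')
        plugins = {name: status for name, status, *_ in cur.fetchall()}

    # If 'mysqlx' is missing or not ACTIVE, nothing will answer on 33060.
    print(plugins.get('mysqlx', 'not installed'))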

Django trying to use wrong Database User

Submitted by 笑着哭i on 2019-12-11 06:05:59
Question: SOLVED: I was using USERNAME, not USER, left over from previously attempting the Postgres driver.

I'm trying to connect my Django project to an RDS MySQL database. I can connect fine using my credentials in MySQL Workbench and the mysql command line. I've set the AWS Security Group and VPC Security Group with All traffic | All | All | 0.0.0.0/0. Let's say my credentials are User: abc, Password: password. When I run python manage.py migrate it attempts to log in with Mike, not abc, for some unknown reason (Mike
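Since the fix was the settings key name, a sketch of the working configuration may help; the database name and endpoint are placeholders. When 'USER' is absent (for instance because the key was spelled 'USERNAME', which Django silently ignores), the MySQL client library typically falls back to the operating-system login name, which would explain the mysterious Mike:

    # settings.py (sketch; NAME and HOST are hypothetical)
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'mydb',
            'USER': 'abc',        # the key Django reads is USER, not USERNAME
            'PASSWORD': 'password',
            'HOST': 'xxx.rds.amazonaws.com',
            'PORT': '3306',
        }
    }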

Rails not parsing database URL on production

Submitted by 社会主义新天地 on 2019-12-11 05:32:35
Question: I have the following database.yml file:

    default: &default
      adapter: postgresql
      encoding: unicode
      pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 4 } %>

    development:
      <<: *default
      database: backoffice_authentication_development

    test:
      <<: *default
      database: backoffice_authentication_test

    production:
      <<: *default
      url: <%= ENV['DATABASE_URL'] %>

and I have a DATABASE_URL on production similar to postgresql://user:passwrd@backoffice.xxxxxxxx.xxxxx.rds.amazonaws.com/backoffice_api. When I try to start my app

AWS RDS MySQL connection using IAM Role is not working

Submitted by £可爱£侵袭症+ on 2019-12-11 05:18:46
Question: I would like to mention upfront that I am new to AWS in general. For my project I am trying to use AWS RDS MySQL with IAM authentication (an IAM role) from a Java application deployed on Tomcat on an EC2 instance. Before trying it from Java, I am trying it from the command prompt on the EC2 instance, following this link: https://aws.amazon.com/premiumsupport/knowledge-center/users-connect-rds-iam/ I have done the following so far (please pardon me if I am not using the correct terminologies)
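The linked walkthrough uses the mysql CLI, but since the eventual goal is application code, here is the same token flow sketched in Python with boto3 (endpoint, region, and database user are placeholders; the Java SDK offers an analogous RdsIamAuthTokenGenerator):

    import boto3
    import pymysql

    host = 'mydb.xxxxxxxx.us-east-1.rds.amazonaws.com'  # placeholder endpoint
    user = 'iam_user'  # a DB user created with AWSAuthenticationPlugin

    rds = boto3.client('rds', region_name='us-east-1')
    # The token is a short-lived signed string used in place of a password.
    token = rds.generate_db_auth_token(DBHostname=host, Port=3306,
                                       DBUsername=user)

    # IAM authentication requires SSL; the CA bundle path is a placeholder.
    conn = pymysql.connect(host=host, port=3306, user=user, password=token,
                           ssl={'ca': '/path/to/rds-ca-bundle.pem'})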