google-cloud-sql

Can't access Google Cloud SQL with private IP from peered VPC network

Posted by 本秂侑毒 on 2020-08-10 04:46:41

Question: These are the steps:

1. In "Project A" I have a "network A" with a PostgreSQL private IP in it. I can access PostgreSQL from a VM in the same "network A" through the private IP.
2. Create a new "network B" in the same "Project A".
3. Create a VPC network peering between "network A" and "network B".
4. Fully open the firewall.
5. I can't reach PostgreSQL from "network B", though I can ping a VM on "network A".

Why can't I reach PostgreSQL? Is it because Cloud SQL private IP is in beta, or am I missing something here?

Answer 1: Cloud

Error connecting Google AppEngine Django with SQL 2nd generation instance?

Posted by ∥☆過路亽.° on 2020-08-09 12:24:35

Question: I want to migrate my site from a First to a Second Generation Cloud SQL instance. This is the old config: DATABASES['[DATABASE_NAME]'] = { 'ENGINE': 'google.appengine.ext.django.backends.rdbms', 'INSTANCE': '[PROJECT_ID]:[INSTANCE_ID_1stGEN]', 'NAME': '[DATABASE_NAME]', 'USER': [MY_USER], 'PASSWORD': [MY_PASSWORD], } This works fine; now I'm trying with this code: DATABASES['[DATABASE_NAME]'] = { 'ENGINE': 'django.db.backends.mysql', 'HOST': '/cloudsql/[PROJECT_NAME]:[REGION]:[INSTANCE_ID]',
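For a Second Generation instance, the usual pattern is the `django.db.backends.mysql` engine with the `/cloudsql/` Unix socket on App Engine and a local Cloud SQL Proxy over TCP elsewhere. A minimal sketch, assuming the bracketed names are placeholders for your own project values and that `GAE_APPLICATION` is used to detect the App Engine environment:

```python
# Sketch of a Django DATABASES setting for a 2nd-generation Cloud SQL
# (MySQL) instance. Bracketed values are placeholders, not real names.
import os

if os.getenv("GAE_APPLICATION"):
    # On App Engine, connect through the Unix socket the platform
    # exposes under /cloudsql/.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "HOST": "/cloudsql/[PROJECT_NAME]:[REGION]:[INSTANCE_ID]",
            "NAME": "[DATABASE_NAME]",
            "USER": "[MY_USER]",
            "PASSWORD": "[MY_PASSWORD]",
        }
    }
else:
    # Locally, run the Cloud SQL Proxy and connect to it over TCP.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "HOST": "127.0.0.1",
            "PORT": "3306",
            "NAME": "[DATABASE_NAME]",
            "USER": "[MY_USER]",
            "PASSWORD": "[MY_PASSWORD]",
        }
    }
```

Note that the old `INSTANCE` key and the `rdbms` backend are First-Generation-only; for Second Generation the instance connection name moves into `HOST`.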

Connecting to Cloud SQL from Dataflow Job

Posted by …衆ロ難τιáo~ on 2020-08-06 12:46:53

Question: I'm struggling to use JdbcIO with Apache Beam 2.0 (Java) to connect to a Cloud SQL instance from Dataflow within the same project. I'm getting the following error: java.sql.SQLException: Cannot create PoolableConnectionFactory (Communications link failure: The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.) According to the documentation, the Dataflow service account *@dataflow-service-producer-prod.iam
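A common cause of this failure is that Dataflow workers cannot reach the instance over a plain TCP socket; the usual workaround is to route the JDBC connection through the Cloud SQL socket factory by adding parameters to the JDBC URL passed to JdbcIO (this assumes the `mysql-socket-factory` artifact is on the pipeline's classpath). The URL shape, built here in Python purely for illustration:

```python
# Sketch of the JDBC URL shape commonly used with JdbcIO so Dataflow
# workers connect through the Cloud SQL socket factory rather than a
# plain TCP socket. Bracketed values are placeholders.
def cloud_sql_jdbc_url(database, instance_connection_name):
    """Build a MySQL JDBC URL routed via com.google.cloud.sql.mysql.SocketFactory."""
    return (
        "jdbc:mysql://google/{db}"
        "?cloudSqlInstance={instance}"
        "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
    ).format(db=database, instance=instance_connection_name)

url = cloud_sql_jdbc_url("[DATABASE_NAME]", "[PROJECT_ID]:[REGION]:[INSTANCE_ID]")
```

In the Java pipeline this string would be handed to `JdbcIO`'s data source configuration in place of a host-and-port URL.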

Cloud SQL or VM Instance to host MySQL Database

Posted by 梦想的初衷 on 2020-07-20 07:48:50

Question: I have a website and I am unsure where to host its database. The Google Cloud SQL D1 tier has 0.5 GB RAM and costs $1.46 per day. GCE n1-standard-2 has 7.5 GB RAM and costs $1.68 per day. I am hosting my current database on Cloud SQL, and performance goes down as the number of concurrent active connections goes up. It must be because of Cloud SQL's low RAM. I can set up a MySQL server on a VM instance and give remote access to external servers. Also, Cloud SQL has a limitation on the maximum

MySQLdb Stored Procedure Out Parameter not working

Posted by 北战南征 on 2020-06-13 20:55:10

Question: I have a database hosted on Google Cloud SQL, and a Python script to query it. I am trying to call a stored procedure that has an OUT parameter. The SP is called successfully, but the value of the OUT parameter doesn't seem to be returned to my Python code. For example, here is the example taken from here: Definition of the multiply stored procedure: CREATE PROCEDURE multiply(IN pFac1 INT, IN pFac2 INT, OUT pProd INT) BEGIN SET pProd := pFac1 * pFac2; END If I call the SP from the command
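MySQLdb's `callproc()` does not hand OUT values back directly: it copies each argument into a server-side session variable named `@_<procname>_<index>`, and the OUT value must be read back with an explicit `SELECT`. A sketch, assuming an open MySQLdb connection and the `multiply` procedure defined above (`call_multiply` and `out_param_var` are illustrative names, not library APIs):

```python
# After cursor.callproc(), MySQLdb leaves each argument in a session
# variable named @_<procname>_<index>; OUT parameters must be fetched
# back with a SELECT on that variable.

def out_param_var(proc_name, index):
    """Session-variable name MySQLdb assigns to argument <index> of <proc_name>."""
    return "@_{0}_{1}".format(proc_name, index)

def call_multiply(conn, a, b):
    cur = conn.cursor()
    cur.callproc("multiply", (a, b, 0))  # pProd is argument index 2
    cur.execute("SELECT {0}".format(out_param_var("multiply", 2)))
    (product,) = cur.fetchone()
    cur.close()
    return product
```

So for `multiply`, the value of `pProd` lives in `@_multiply_2` after the call, which is why reading `cursor.fetchone()` immediately after `callproc()` returns nothing useful.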

Control order of container termination in a single pod in Kubernetes

Posted by 限于喜欢 on 2020-05-25 05:13:28

Question: I have two containers inside one pod. One is my application container and the second is a Cloud SQL Proxy container. Basically, my application container depends on this Cloud SQL container. The problem is that when the pod is terminated, the Cloud SQL Proxy container is terminated first, and only some seconds later is my application container terminated. So, before my container is terminated, it keeps sending requests to the Cloud SQL container, resulting in errors: could not connect to server:
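Kubernetes does not guarantee a termination order among containers in a pod; a common workaround is to delay the proxy's shutdown with a `preStop` lifecycle hook so the application can drain first. A sketch of the relevant manifest fragment (container names, image tag, and the 30-second delay are illustrative, and the hook assumes the proxy image ships a `sleep` binary):

```yaml
# Sketch: hold the proxy open during pod termination so the app
# container can finish in-flight requests before losing its database path.
spec:
  terminationGracePeriodSeconds: 60
  containers:
  - name: app
    image: my-app:latest
  - name: cloudsql-proxy
    image: gcr.io/cloudsql-docker/gce-proxy:1.16
    command: ["/cloud_sql_proxy", "-instances=[PROJECT]:[REGION]:[INSTANCE]=tcp:3306"]
    lifecycle:
      preStop:
        exec:
          command: ["sleep", "30"]
```

The `terminationGracePeriodSeconds` must exceed the `preStop` delay, or the kubelet will kill the proxy before the sleep completes.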