google-cloud-sql

Google Cloud SQL instance infinite loading on restart

Submitted by 做~自己de王妃 on 2021-01-21 09:59:15
Question: We have a Google Cloud SQL instance with around 10,000 databases. From time to time the MySQL instance becomes unresponsive, so all we can do is restart it from the Google console. (The reasons why it becomes unresponsive are currently unknown.) Today we had a similar issue and tried to restart it, and now it has been restarting for more than an hour (usually a restart takes around 2 minutes). What can be done?

Answer 1: Google Cloud SQL has a limit of 600 seconds for a query to run; after…
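
The answer is cut off above. For reference, a minimal sketch of triggering the restart programmatically through the Cloud SQL Admin API rather than the Console, assuming the google-api-python-client and google-auth packages; the instance name is a placeholder:

    import google.auth
    from googleapiclient import discovery

    # Authenticate with application default credentials and build the
    # Cloud SQL Admin API client.
    credentials, project = google.auth.default()
    service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)

    # Trigger a restart of the instance; returns a long-running operation.
    operation = service.instances().restart(
        project=project, instance='my-instance').execute()
    print(operation['status'])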

Error while creating GCP Serverless VPC Connection (“code”: 13)

Submitted by 冷暖自知 on 2021-01-07 01:36:40
Question: Trying to create a serverless VPC connector, but always getting the same error: ERROR: (gcloud.compute.networks.vpc-access.connectors.create) { "code": 13, "message": "An internal error occurred: Failed to create a VPC Access connector. Please delete the connector } I have tried the following: the reference link Serverless VPC Access; all steps in the Troubleshooting section at the bottom of that page; various IP ranges like 10.8.0.0, 10.128.0.0, 10.160.0.0, to no avail; created a new…
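
For reference, a hedged example of the command behind the error above. Note that Serverless VPC Access requires an unused /28 CIDR block for --range, so a bare range like 10.8.0.0 would need the /28 suffix; the connector name, network, and region below are placeholders:

    gcloud compute networks vpc-access connectors create my-connector \
        --network default \
        --region us-central1 \
        --range 10.8.0.0/28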

Is it possible to replace Cloud SQL proxy with Istio proxy?

Submitted by 感情迁移 on 2021-01-01 06:53:24
Question: Currently I am using the Cloud SQL proxy as a sidecar to connect to a Postgres Cloud SQL database. When using Istio, however, it introduces its own sidecar, which results in two proxies in the pod. So I thought: can the encrypted connection not also be established using Istio? Basically, it is possible to connect to an external IP using Istio. It should also be possible to configure a DestinationRule which configures TLS. And it should also be possible to create client certificates for…
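
The question is cut off above. As an illustrative sketch only (not a confirmed working replacement for the Cloud SQL proxy), the idea described might be expressed with a ServiceEntry for the instance's external IP plus a DestinationRule that originates mutual TLS using client certificates issued for the instance; the IP, hostname, and certificate paths are placeholders:

    apiVersion: networking.istio.io/v1beta1
    kind: ServiceEntry
    metadata:
      name: cloudsql-postgres
    spec:
      hosts:
      - cloudsql.internal          # hypothetical name for the external service
      addresses:
      - 10.0.0.5/32                # placeholder: the Cloud SQL instance IP
      ports:
      - number: 5432
        name: tcp-postgres
        protocol: TCP
      location: MESH_EXTERNAL
      resolution: STATIC
      endpoints:
      - address: 10.0.0.5          # placeholder
    ---
    apiVersion: networking.istio.io/v1beta1
    kind: DestinationRule
    metadata:
      name: cloudsql-postgres-tls
    spec:
      host: cloudsql.internal
      trafficPolicy:
        tls:
          mode: MUTUAL             # client certs must be mounted in the sidecar
          clientCertificate: /etc/certs/client-cert.pem
          privateKey: /etc/certs/client-key.pem
          caCertificates: /etc/certs/server-ca.pem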

Connect Google Cloud SQL Postgres instance from Beam pipeline

Submitted by 妖精的绣舞 on 2020-12-29 07:52:26
Question: I want to connect to a Google Cloud SQL Postgres instance from an Apache Beam pipeline running on Google Dataflow, using the Python SDK. I am not able to find proper documentation for this; in the Cloud SQL how-to guides I don't see anything for Dataflow: https://cloud.google.com/sql/docs/postgres/ Can someone provide a documentation link or a GitHub example?

Answer 1: You can use the relational_db.Write and relational_db.Read transforms from beam-nuggets as follows. First install beam-nuggets:
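
The answer is cut off at the install step. A sketch of the usage it references, following the beam-nuggets README; the connection values are placeholders (for Cloud SQL you would typically point host at the instance's IP, or at a Cloud SQL proxy):

    # pip install beam-nuggets
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from beam_nuggets.io import relational_db

    # Connection settings for the target Postgres database (placeholders).
    source_config = relational_db.SourceConfiguration(
        drivername='postgresql+pg8000',  # pure-Python driver, Dataflow-friendly
        host='localhost',
        port=5432,
        username='postgres',
        password='password',
        database='mydb',
    )
    table_config = relational_db.TableConfiguration(
        name='months',
        create_if_missing=True,  # create the table if it does not exist
    )

    with beam.Pipeline(options=PipelineOptions()) as p:
        records = p | 'Create' >> beam.Create([{'name': 'Jan', 'num': 1}])
        records | 'Write to Postgres' >> relational_db.Write(
            source_config=source_config,
            table_config=table_config,
        )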

Schedule start/stop on GCP SQL instance

Submitted by 自作多情 on 2020-12-15 07:26:21
Question: I want to schedule start/stop of my GCP Cloud SQL instance. How can I trigger it automatically? I have already scheduled a Compute Engine VM successfully, but I am stuck on scheduling the SQL instance.

Answer 1: In order to achieve this you can use a Cloud Function to make a call to the Cloud SQL Admin API to start and stop your Cloud SQL instance (you will need two Cloud Functions):

    def hello_world(request):
        instance = 'test'  # TODO: Update placeholder value.
        request = service.instances().get(project=project,…
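
The snippet above is cut off mid-call. A minimal sketch of one complete function under the same approach, assuming the google-api-python-client and google-auth packages: it stops the instance by patching its activation policy to NEVER (a second function patching it to ALWAYS starts it again, and both can be invoked on a schedule by Cloud Scheduler):

    import google.auth
    from googleapiclient import discovery

    def stop_instance(request):
        # Build the Cloud SQL Admin API client with default credentials.
        credentials, project = google.auth.default()
        service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)
        instance = 'test'  # placeholder value, as in the answer
        # activationPolicy NEVER stops the instance; ALWAYS starts it.
        body = {'settings': {'activationPolicy': 'NEVER'}}
        service.instances().patch(
            project=project, instance=instance, body=body).execute()
        return 'OK'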

Airflow xcom pull only returns string

Submitted by 爷，独闯天下 on 2020-12-06 12:34:49
Question: I have an Airflow pipeline where I need to get a filename from a Pub/Sub subscription and then import that file into a Cloud SQL instance. I use the CloudSqlInstanceImportOperator to import the CSV file. This operator needs a body, which contains the filename and other parameters. Since I read that filename at runtime, I also have to define the body at runtime. This all works. But when I pull the body from XCom, it returns a string instead of a Python dictionary. So the…
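
The question is cut off above. One common workaround (a hedged sketch, not necessarily the accepted answer) is to rebuild the dictionary from the rendered string before handing it to the operator, for example with ast.literal_eval inside a PythonOperator; the task id 'get_filename' is a placeholder:

    import ast
    from airflow.operators.python_operator import PythonOperator  # Airflow 1.x path

    def build_body(**context):
        # xcom_pull inside Python code returns the pushed object; if an
        # upstream template already rendered it to a string, parse it back.
        body = context['ti'].xcom_pull(task_ids='get_filename')
        if isinstance(body, str):
            body = ast.literal_eval(body)
        return body  # re-pushed to XCom as a real dict for downstream tasks

    build_body_task = PythonOperator(
        task_id='build_body',
        python_callable=build_body,
        provide_context=True,  # Airflow 1.x; attach dag=... as usual
    )

On newer Airflow versions, setting render_template_as_native_obj=True on the DAG makes templated fields render to native Python objects instead of strings.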

ETL approaches to bulk load data in Cloud SQL

Submitted by 南笙酒味 on 2020-12-04 08:47:18
Question: I need to ETL data into my Cloud SQL instance. The data comes from API calls. Currently, I'm running custom Java ETL code in Kubernetes with CronJobs that makes requests to collect this data and loads it into Cloud SQL. The problems are managing the ETL code and monitoring the ETL jobs, and the current solution may not scale well as more ETL processes are incorporated. In this context, I need an ETL tool. My Cloud SQL instance contains two types of tables: common transactional…
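
For context, the current setup the question describes might look roughly like this (a hedged sketch; the schedule, image, and names are placeholders):

    apiVersion: batch/v1beta1
    kind: CronJob
    metadata:
      name: etl-job
    spec:
      schedule: "0 * * * *"        # run the ETL hourly
      jobTemplate:
        spec:
          template:
            spec:
              containers:
              - name: etl
                image: gcr.io/my-project/etl:latest  # placeholder Java ETL image
              restartPolicy: OnFailure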