google-cloud-platform

Cannot connect to Google Cloud SQL from MySQL Workbench

不羁岁月 submitted on 2021-02-11 02:46:59
Question: I have created a MySQL server in Google Cloud SQL and am trying to connect to it from my local machine using MySQL Workbench, but I cannot establish a connection. I get the following error:

Failed to Connect to MySQL at {IP-Address}:3306 with user root (10060)

Answer 1: If you are trying to connect to a GCP Cloud SQL instance from an outside network, you need to add your machine's public IP to the instance's authorized networks:

1. Go to your SQL instance.
2. Go to the Connections tab.
3. Under Connectivity, select
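The answer's whitelisting step can also be done from the command line with gcloud's --authorized-networks flag. This is only a sketch: the instance name and IP address below are hypothetical placeholders, and the flag takes CIDR ranges.

```python
# Sketch: build the gcloud command that authorizes a workstation's public
# IP on a Cloud SQL instance. Instance name and IP are placeholders.
instance = "my-instance"      # hypothetical instance name
public_ip = "203.0.113.7"     # replace with your machine's public IP

cmd = (
    f"gcloud sql instances patch {instance} "
    f"--authorized-networks={public_ip}/32"
)
print(cmd)
```

Once the IP is authorized, Workbench should be able to reach the instance's public address on port 3306.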

Parsing Nested JSON into STRUCT type BQ table

断了今生、忘了曾经 submitted on 2021-02-10 23:17:04
Question: I am trying to load the following data into BigQuery to create a STRUCT-type table. I am uploading the file using the Upload option with "Auto detect schema" in the BigQuery web UI.

{"property": [
  { "NAME": "65874aca2143",
    "VALUE": [
      { "NAME": "time",
        "VALUE": [ { "NAME": "$date", "VALUE": "2020-06-16T09:42:49.449Z" } ] },
      { "NAME": "type", "VALUE": "ACTION" },
      { "NAME": "id", "VALUE": "1234" } ] } ]}

But it gives me the error below:

Error while reading data, error message: Failed to parse JSON: No active field
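No answer is included in this excerpt, but one common cause of this error is feeding BigQuery pretty-printed JSON: the JSON load path expects newline-delimited JSON, one complete object per line. A minimal sketch of the re-serialization step (file handling omitted; the record mirrors the question's data):

```python
import json

# Sketch: BigQuery's JSON loader expects newline-delimited JSON (one
# object per line), so a pretty-printed record must be re-serialized
# onto a single line before upload.
record = {
    "property": [
        {
            "NAME": "65874aca2143",
            "VALUE": [
                {"NAME": "time",
                 "VALUE": [{"NAME": "$date",
                            "VALUE": "2020-06-16T09:42:49.449Z"}]},
                {"NAME": "type", "VALUE": "ACTION"},
                {"NAME": "id", "VALUE": "1234"},
            ],
        }
    ]
}

ndjson_line = json.dumps(record)  # a single line, ready for a BQ load job
assert "\n" not in ndjson_line
```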

Where is the global limit for gcloud build step timeouts set for all builds?

一世执手 submitted on 2021-02-10 23:10:35
Question: Where can I find the global limit for the gcloud build step timeout? This is my gcloud build config:

steps:
- name: 'gcr.io/cloud-builders/yarn'
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build-nginx-config']
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build']
  timeout: 3601s
...
timeout: 7200s

And this is what I get when I try to run this build:

[10:41:45] ERROR: (gcloud.builds.submit) INVALID_ARGUMENT: invalid build: invalid timeout in build step #2: build step timeout "1h0m1s" must be <= build
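No answer is included in this excerpt. The error says a step's timeout exceeded the build timeout in effect, which suggests the build-level timeout was not picked up as the asker intended. A hedged sketch of a config where the build-level timeout (a top-level key, not nested inside steps) covers every per-step timeout; the step names come from the question and the values are illustrative:

```yaml
# Sketch of a cloudbuild.yaml where every step timeout fits inside the
# build-level timeout; values are illustrative, not prescriptive.
steps:
- name: 'gcr.io/cloud-builders/yarn'
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build-nginx-config']
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build']
  timeout: 3601s    # per-step limit
timeout: 7200s      # build-level limit; must be >= any step timeout
```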

How to annotate MULTIPLE images from a single call using Google's vision API? Python

故事扮演 submitted on 2021-02-10 20:31:53
Question: I recently started using Google's Vision API. I am trying to annotate a batch of images, and therefore followed the "batch image annotation offline" guide from their documentation. However, it is not clear to me how I can annotate MULTIPLE images from one API call. So let's say I have stored 10 images in my Google Cloud bucket. How can I annotate all of these images at once and store the results in one JSON file? Right now, I wrote a program that calls their example function, and it works, but to put it
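No answer is included in this excerpt. As a sketch under stated assumptions: the Vision API's images:asyncBatchAnnotate method takes a list of requests, one per image, and outputConfig.batchSize controls how many responses go into each output file, so setting it to the number of images yields a single JSON result file. The bucket and object names below are hypothetical, and only the request body is built here (no API call is made):

```python
# Sketch: one asyncBatchAnnotateImages request body covering several
# images. "requests" holds one entry per image; with batchSize equal to
# the number of images, all annotations land in one output file.
image_uris = [f"gs://my-bucket/image_{i}.jpg" for i in range(10)]

body = {
    "requests": [
        {
            "image": {"source": {"imageUri": uri}},
            "features": [{"type": "LABEL_DETECTION"}],
        }
        for uri in image_uris
    ],
    "outputConfig": {
        "gcsDestination": {"uri": "gs://my-bucket/results/"},
        "batchSize": len(image_uris),
    },
}
print(len(body["requests"]))  # one call, ten images
```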

Google BigQuery exports a table to multiple files in Google Cloud Storage, and sometimes to one single file

泄露秘密 submitted on 2021-02-10 20:25:46
Question: I am using the BigQuery Python libraries to export data from BigQuery tables into GCS in CSV format. I have given a wildcard pattern, assuming some tables can be larger than 1 GB. Sometimes, even though a table is only a few MB, it creates multiple files, and sometimes it creates just one file. Is there a logic behind this? My export workflow is the following:

project = bq_project
dataset_id = bq_dataset_id
table_id = bq_table_id
bucket_name = bq_bucket_name
workflow_name = workflow_nm
csv_file_nm = workflow_nm+"/
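No answer is included in this excerpt. What BigQuery's export documentation does state is that a wildcard ("*") in the destination URI lets the service shard the export into as many files as it chooses (sharding is required for tables over 1 GB), while a non-wildcard URI forces a single output file for tables under that limit. A sketch of the choice, with a hypothetical helper and the question's placeholder names:

```python
# Sketch: picking the GCS destination URI for a BigQuery extract job.
# A "*" wildcard permits sharded (multi-file) output; a plain URI
# forces one file, valid only for tables under ~1 GB.
def destination_uri(bucket: str, prefix: str, single_file: bool) -> str:
    if single_file:
        return f"gs://{bucket}/{prefix}/export.csv"
    return f"gs://{bucket}/{prefix}/export-*.csv"

multi = destination_uri("bq_bucket_name", "workflow_nm", single_file=False)
one = destination_uri("bq_bucket_name", "workflow_nm", single_file=True)
```

With the wildcard URI, the number of shards is up to the service, which matches the question's observation that small tables sometimes still produce multiple files.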

gcloud app deploy with private docker images

懵懂的女人 submitted on 2021-02-10 19:57:28
Question: I have a project that contains a Dockerfile, and that Dockerfile uses a private base image. When I run gcloud app deploy, it returns an error with the message below:

Error response from daemon: pull access denied for dean, repository does not exist or may require 'docker login'

I tried docker login before running gcloud app deploy, but it did not work.

Answer 1: The easiest way to get this rolling is to push the private image up to Google Container Registry. The per-project registry is
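The answer's re-tag-and-push step can be sketched as the commands below. The project ID and image tags are hypothetical placeholders (the question's image lives under the "dean" repository); after pushing, the Dockerfile's FROM line would point at the gcr.io path instead of the private one.

```python
# Sketch: commands to mirror a private base image into a project's
# Container Registry so the App Engine build can pull it.
# Project ID and tags are placeholders.
project = "my-gcp-project"
source = "dean/base:latest"                  # hypothetical private image
target = f"gcr.io/{project}/base:latest"     # per-project registry path

commands = [
    f"docker pull {source}",
    f"docker tag {source} {target}",
    f"docker push {target}",
]
print("\n".join(commands))
```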