google-cloud-platform

Dialogflow Google Assistant Alpha release always failing with the following message: “For en: Your sample pronunciations are structured incorrectly.”

Submitted by 馋奶兔 on 2021-02-20 09:41:11

Question: The Alpha release of my Google Assistant action is not working. It always shows the message: "For en: Your sample pronunciations are structured incorrectly."

Answer 1: This seems to indicate a problem with the "additional invocation phrases" setup in the directory information.

Answer 2: This happens when you rename your action once it has been released. Go to Deploy > Directory Information > Additional invocation phrases and replace the invocation phrase with the updated action name. This has to match your action invocation.

cd .. command does not work with gcloud compute ssh while other basic commands (such as pwd) do work. Why? [duplicate]

Submitted by 喜欢而已 on 2021-02-20 04:41:11

Question: This question already has answers here: Multiple commands on remote machine using shell script (3 answers); Why would SSH commands through Putty work differently to those via PHP's phpseclib? (1 answer); Paramiko: calling "cd" command with exec_command does nothing (3 answers). Closed 2 years ago.

I have a running instance my-instance on Google Cloud. I run the following code on my local machine: gcloud compute ssh my-instance --command 'pwd' gcloud compute ssh my-instance --command
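The behavior in the question comes down to process isolation: each `gcloud compute ssh --command` invocation starts a fresh shell on the remote host, so a `cd` executed in one invocation is forgotten before the next one runs, while stateless commands like `pwd` appear to "work". A minimal sketch reproducing this locally with two independent child shells (the `/tmp/ssh-demo` path is illustrative, not from the question):

```shell
# Each `gcloud compute ssh --command` runs in a new shell on the remote
# machine, so `cd ..` in one invocation cannot affect the next.
# Reproduce the same effect locally with two independent child shells:
mkdir -p /tmp/ssh-demo/inner
cd /tmp/ssh-demo/inner
first=$(sh -c 'cd ..; pwd')    # the cd only lives inside this child shell
second=$(sh -c 'pwd')          # a new shell starts back in the original dir
echo "$first"                  # the parent directory
echo "$second"                 # still the original directory
# Remote fix: chain dependent commands inside a single --command string, e.g.
#   gcloud compute ssh my-instance --command 'cd ..; pwd'
```

The practical takeaway is that state that must survive between commands (working directory, environment variables) has to be established within the same `--command` string.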

How can we visualize the Dataproc job status in Google Cloud Platform?

Submitted by 随声附和 on 2021-02-20 04:17:05

Question: How can we visualize (via dashboards) the Dataproc job status in Google Cloud Platform? We want to check whether jobs are running or not, in addition to their status such as running, delayed, or blocked. On top of that, we want to set up alerting (Stackdriver Alerting) as well.

Answer 1: This page lists all the Dataproc metrics available in Stackdriver: https://cloud.google.com/monitoring/api/metrics_gcp#gcp-dataproc You could use cluster/job/submitted_count , cluster/job/failed_count and cluster/job/running_count to
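To chart or alert on those metrics, the Monitoring API takes a time-series filter string naming the metric type and resource labels. A minimal sketch of building such filters for the three job metrics mentioned in the answer (the cluster name is a placeholder; the actual API call via the `google-cloud-monitoring` client is omitted):

```python
# Sketch: build Cloud Monitoring time-series filter strings for the Dataproc
# job metrics named in the answer. "my-cluster" is a placeholder.
def dataproc_job_filter(metric: str, cluster_name: str) -> str:
    """Return a Monitoring API filter for one Dataproc cluster/job metric."""
    return (
        f'metric.type = "dataproc.googleapis.com/cluster/job/{metric}" '
        f'AND resource.labels.cluster_name = "{cluster_name}"'
    )

for m in ("submitted_count", "failed_count", "running_count"):
    print(dataproc_job_filter(m, "my-cluster"))
```

The same filter strings can be pasted into a Stackdriver dashboard widget or used as the condition filter of an alerting policy.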

Google Cloud Dataflow to BigQuery - UDF - convert unixTimestamp to local time

Submitted by 自闭症网瘾萝莉.ら on 2021-02-20 03:48:05

Question: What is the best way to convert unixTimestamp to local time in the following scenario? I am using the Pub/Sub Subscription to BigQuery template. Dataflow fetches data in JSON format from Pub/Sub, does the transformation, and inserts it into BigQuery. Preferably, I want to use a UDF for the data transformation setup. (For simplicity,) the input data includes only unixTimestamp. Example: {"unixTimestamp": "1612325106000"} The BigQuery table has 3 columns: unix_ts:INTEGER, iso_dt:DATETIME, local_dt:DATETIME where unix_ts
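A template UDF is a JavaScript function that receives each Pub/Sub message as a JSON string and returns the transformed JSON string. A hedged sketch for the three columns described above (the fixed UTC+9 offset is an assumed placeholder for the local timezone, since the template's JavaScript engine may not support timezone-aware `toLocaleString`):

```javascript
// Sketch of a Dataflow template UDF. Assumptions: unixTimestamp is in
// milliseconds, and "local" means a fixed UTC+9 offset (placeholder).
function transform(inJson) {
  var row = JSON.parse(inJson);
  var unixTs = parseInt(row.unixTimestamp, 10);   // e.g. 1612325106000 (ms)
  // ISO datetime in UTC, trimmed to the BigQuery DATETIME literal format
  var isoDt = new Date(unixTs).toISOString().slice(0, 19);
  // Local datetime by shifting the epoch before formatting
  var localDt = new Date(unixTs + 9 * 3600 * 1000).toISOString().slice(0, 19);
  return JSON.stringify({unix_ts: unixTs, iso_dt: isoDt, local_dt: localDt});
}
```

An alternative design is to keep the UDF trivial (pass `unix_ts` through) and derive `iso_dt`/`local_dt` in BigQuery with `TIMESTAMP_MILLIS` and `DATETIME(ts, "timezone")`, which handles daylight-saving rules that a fixed offset does not.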

One or more points were written more frequently than the maximum sampling period configured for the metric

Submitted by 你。 on 2021-02-19 08:53:05

Question (background): I have a website deployed on multiple machines. I want to create a Google custom metric that reports its throughput: how many calls were served. The idea was to create a custom metric that collects information about served requests and updates it once per minute. So, on each machine, this code can run at most once per minute, but the process happens on every machine in my cluster. Running the code locally is working
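The error in the title appears when points for the same time series arrive faster than the metric's sampling period allows, so each writer needs to be throttled. A minimal sketch of a per-machine rate limiter around the write call (the `write_point` callable is a placeholder for the real Monitoring API call; the 60-second interval matches the once-per-minute design above):

```python
import time

# Sketch: throttle custom-metric writes so each machine reports at most once
# per interval, avoiding "points written more frequently than the maximum
# sampling period" errors. write_point is a placeholder for the API call.
class ThrottledReporter:
    def __init__(self, write_point, min_interval_s=60.0, clock=time.monotonic):
        self._write = write_point
        self._interval = min_interval_s
        self._clock = clock
        self._last = None

    def report(self, value):
        """Write the point only if the sampling interval has elapsed."""
        now = self._clock()
        if self._last is not None and now - self._last < self._interval:
            return False      # dropped: too soon after the previous point
        self._write(value)
        self._last = now
        return True
```

Note this only protects a single process; if several machines write to the same time series, they still collide. The usual fix is to add a per-machine label (for example the instance ID) to the metric so each writer owns its own time series.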

Do GCP Deployment Manager templates require a schema file?

Submitted by 流过昼夜 on 2021-02-19 07:41:20

Question: Since templates can store values themselves (in addition to receiving values from the configuration YAML file), is a schema required when we create a template? If not, is the use of the schema just to enforce which values are required, and the types of values that need to be provided to the template?

Edit: I'm referring to template schemas, not resource-properties schemas: https://cloud.google.com/deployment-manager/docs/configuration/templates/using-schemas

Answer 1: Google Deployment Manager
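For context on what such a schema enforces, a minimal sketch of a template schema file is shown below (the file would sit next to the template as `<template-name>.schema`; all titles and property names here are illustrative, not from the question):

```yaml
# Minimal sketch of a Deployment Manager template schema.
info:
  title: VM template
  description: Creates a single Compute Engine instance.

required:
  - zone

properties:
  zone:
    type: string
    description: Zone to deploy the instance in.
  machineType:
    type: string
    default: n1-standard-1
```

With a schema present, Deployment Manager rejects deployments that omit `zone`, pass an unknown property, or pass a value of the wrong type; without one, the template simply receives whatever the configuration supplies.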

Deploying Helm workloads with Terraform on GKE cluster

Submitted by 假装没事ソ on 2021-02-19 06:19:06

Question: I am trying to use the Terraform Helm provider (https://www.terraform.io/docs/providers/helm/index.html) to deploy a workload to a GKE cluster. I am more or less following Google's example (https://github.com/GoogleCloudPlatform/terraform-google-examples/blob/master/example-gke-k8s-helm/helm.tf), but I do want to use RBAC by creating the service account manually. My helm.tf looks like this: variable "helm_version" { default = "v2.13.1" } data "google_client_config" "current" {} provider "helm" {
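For the "create the service account manually" part, a hedged sketch of what the RBAC resources might look like alongside the truncated `helm.tf` above (this assumes the Helm 2-era provider matching the question's `v2.13.1`, which supported Tiller options such as `install_tiller` and `service_account`; all names are illustrative):

```hcl
# Sketch: Tiller service account plus cluster-admin binding, created
# explicitly so the Helm provider can use RBAC.
resource "kubernetes_service_account" "tiller" {
  metadata {
    name      = "tiller"
    namespace = "kube-system"
  }
}

resource "kubernetes_cluster_role_binding" "tiller" {
  metadata {
    name = "tiller"
  }
  role_ref {
    api_group = "rbac.authorization.k8s.io"
    kind      = "ClusterRole"
    name      = "cluster-admin"
  }
  subject {
    kind      = "ServiceAccount"
    name      = "tiller"
    namespace = "kube-system"
  }
}

provider "helm" {
  install_tiller  = true
  service_account = "tiller"
}
```

Ordering matters here: the service account and binding must exist before Tiller is installed, so in practice a `depends_on` from the Helm release (or provider configuration steps) onto the binding is usually needed. On Helm 3, Tiller and these provider arguments disappear entirely.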