gcloud

gcloud - Cloud Run deployment to GKE fails

Submitted by 房东的猫 on 2021-01-29 21:32:07

Question: I am trying to deploy a sample Angular app to GKE. I created a sample cluster, enabling the Cloud Run and Istio services in it:

```shell
gcloud beta container clusters create new-cluster \
  --addons=HorizontalPodAutoscaling,HttpLoadBalancing,Istio,CloudRun \
  --machine-type=n1-standard-2 \
  --cluster-version=latest \
  --zone=us-east1-b \
  --enable-stackdriver-kubernetes --enable-ip-alias \
  --scopes cloud-platform --num-nodes 4 --disk-size "10" --image-type "COS"
```

The following is my cloudbuild.yaml file:

```yaml
steps: #
```
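The cloudbuild.yaml excerpt is cut off after `steps:`. For reference, a minimal sketch of what such a build-and-deploy pipeline typically looks like (the image name, service name, and cluster details are placeholders, not taken from the question):

```yaml
steps:
  # Build and push the container image (placeholder image name).
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/sample-angular', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/sample-angular']
  # Deploy to Cloud Run on the GKE cluster created above.
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['beta', 'run', 'deploy', 'sample-angular',
           '--image', 'gcr.io/$PROJECT_ID/sample-angular',
           '--cluster', 'new-cluster',
           '--cluster-location', 'us-east1-b',
           '--platform', 'gke']
```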

GCloud custom image upload failure due to size or permissions

Submitted by 丶灬走出姿态 on 2021-01-29 15:53:15

Question: I've been trying to upload two custom images for some time now and have failed repeatedly. During the import process, the Google application always responds that the Compute Engine default service account does not have the role roles/compute.storageAdmin. However, I have assigned it both via the CLI and the web interface. Notably, the application throws this error while resizing the disk. The original size of the disk is about 10 GB; however, it tries to convert
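For reference, the role the importer checks for can be granted with a single IAM binding (PROJECT_ID and PROJECT_NUMBER are placeholders; the member must be the Compute Engine default service account):

```shell
# Grant the Compute Engine default service account the role the image
# import tooling checks for. PROJECT_ID / PROJECT_NUMBER are placeholders.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/compute.storageAdmin"
```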

gcloud app deploy fails with flexible environment

Submitted by [亡魂溺海] on 2021-01-29 12:42:08

Question: I've tried the hello world example with Node.js and PHP, and both standard environments work fine. But both examples using the flexible environment give the same error when I run gcloud app deploy:

```
ERROR: gcloud crashed (TypeError): '>' not supported between instances of 'NoneType' and 'int'
```

Works:
https://cloud.google.com/nodejs/getting-started/hello-world
https://cloud.google.com/appengine/docs/standard/php7/quickstart

Fails:
https://cloud.google.com/appengine/docs/flexible/nodejs/quickstart
https:
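The TypeError comes from the SDK itself rather than from the app being deployed. A commonly suggested first step (an assumption here, not a confirmed fix for this specific report) is to update the SDK and retry with verbose logging to see where it crashes:

```shell
gcloud components update                # bring the Cloud SDK up to date
gcloud app deploy --verbosity=debug    # retry with verbose logging around the crash
```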

Gcloud app deploy SQLSTATE[HY000] [2002] No such file or directory

Submitted by 生来就可爱ヽ(ⅴ<●) on 2021-01-29 10:14:07

Question: I already have a working Laravel project with a database on my local host. But when I use the GCP App Engine, I follow the steps for the database connection. I already created a MySQL database in my cloud project and imported all the SQL from the working local database. When I run app deploy, it shows SQLSTATE[HY000] [2002] No such file or directory. Below is my app.yaml file content:

```yaml
runtime: php
env: flex
runtime_config:
  document_root: public
# Ensure we skip ".env", which is only for local development
skip_files:
  - .env
```
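In the flex environment, MySQL's "No such file or directory" typically means Laravel is trying to reach the database over a local socket that does not exist inside the container. A sketch of the additional app.yaml settings that route the app to Cloud SQL over its unix socket (the instance connection name is a placeholder assumption):

```yaml
beta_settings:
  # Mount the Cloud SQL unix socket inside the flex container (placeholder name).
  cloud_sql_instances: "PROJECT_ID:REGION:INSTANCE_NAME"
env_variables:
  # Laravel reads DB_SOCKET for unix-socket MySQL connections.
  DB_SOCKET: "/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
```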

Command to import multiple files from Cloud Storage into BigQuery

Submitted by £可爱£侵袭症+ on 2021-01-29 09:36:54

Question: I've figured out that this command lists the paths to all files:

```shell
gsutil ls "gs://bucket/foldername/*.csv"
```

This command imports a file into BQ and auto-detects the schema:

```shell
bq load --autodetect --source_format=CSV dataset.tableName gs://bucket/foldername/something.csv
```

Now I need to make them work together to import all the files into their respective tables in BQ. If a table exists, it should be replaced. Could you give me a hand?

Answer 1: First, create a file listing all the folders you want to load into BigQuery:
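One way to combine the two commands (a sketch; the one-table-per-file naming is an assumption) is to loop over the listing and derive each table name from its file name, using bq's --replace flag so existing tables are overwritten:

```shell
# For each CSV under the folder, load it into a BigQuery table named after
# the file, replacing the table if it already exists.
for f in $(gsutil ls "gs://bucket/foldername/*.csv"); do
  table=$(basename "$f" .csv)
  bq load --autodetect --replace --source_format=CSV "dataset.${table}" "$f"
done
```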

How to retrieve credentials of a created Google Kubernetes (GKE) cluster in Ansible?

Submitted by 北城以北 on 2021-01-29 09:34:38

Question: I'm creating a cluster and node pool with:

```yaml
- name: "Create Google Kubernetes Engine cluster to be set up with kubectl"
  gcp_container_cluster:
    name: "{{ cluster_name }}"
    project: "{{ project_id }}"
    auth_kind: "serviceaccount"
    location: "{{ cluster_location }}"
    logging_service: "none"
    monitoring_service: "none"
    service_account_contents: "{{ service_account_contents }}"
    initial_node_count: 1
  register: cluster

- name: "Create node pool for system pods"
  gcp_container_node_pool:
    name: "default-pool"
```
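The question is cut off, but one common way to turn the newly created cluster into usable kubectl credentials from Ansible (a sketch using the command module, assuming the gcloud CLI is installed on the host running the play) is:

```yaml
- name: "Fetch a kubeconfig entry for the new cluster (sketch; assumes gcloud is installed)"
  command: >
    gcloud container clusters get-credentials {{ cluster_name }}
    --zone {{ cluster_location }}
    --project {{ project_id }}
```

The registered `cluster` result also exposes fields such as the cluster endpoint, which can be used to build a kubeconfig by hand instead.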

ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1]

Submitted by |▌冷眼眸甩不掉的悲伤 on 2021-01-29 09:26:57

Question: I'm just trying to move a simple text file from the local host to the remote host. I'm using Google's cloud computing and, more specifically, the gcloud command-line tool. Here are the instructions and errors I received:

```shell
Admins-MacBook-Pro-4:downloads kylefoley$ gcloud compute scp lst_calc.txt instance-1:/home/kylefoley76/hey.txt
No zone specified. Using zone [us-central1-a] for instance: [instance-1].
Updating project ssh metadata...⠧Updated [https://www.googleapis.com/compute/v1
```
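The transcript is cut off before the actual failure, but exit code 1 from scp is generic; two things worth ruling out (assumptions, since the error text is missing) are relying on the inferred zone and copying to a remote path the SSH user cannot write to:

```shell
# Pass the zone explicitly, and copy into the remote user's home directory,
# which is always writable; move the file afterwards if needed.
gcloud compute scp lst_calc.txt instance-1:~/hey.txt --zone=us-central1-a
```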

Enabling Google Cloud Shell “boost” mode via gcloud cli

Submitted by 此生再无相见时 on 2021-01-29 07:37:01

Question: I use the method mentioned in this excellent answer, https://stackoverflow.com/a/49515502/10690958, to connect to Google Cloud Shell via ssh from my Ubuntu workstation. Occasionally, I need to enable "boost mode". In that case, I currently have to open the Cloud Shell via Firefox (https://console.cloud.google.com/cloudshell/editor?shellonly=true), then log in and enable boost mode. After that, I can close Firefox and use the gcloud method to access the Cloud Shell VM in boost mode. I would like to
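The question is cut off, but for reference, the alpha release track of gcloud has exposed a boost flag for Cloud Shell sessions; this should be verified against the installed SDK version, since alpha commands change without notice:

```shell
# Connect to Cloud Shell with boost mode requested (alpha command; confirm the
# flag with `gcloud alpha cloud-shell ssh --help` before relying on it).
gcloud alpha cloud-shell ssh --boosted
```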
