gcloud

Google Compute Engine: How to snapshot a VM's disk

為{幸葍}努か, submitted on 2019-12-24 09:23:40
Question: I'm trying to follow these instructions: https://cloud.google.com/compute/docs/disks#creating_snapshots It's not obvious to me whether I'm supposed to run the "gcloud compute disks snapshot DISK" command from my personal machine, or on the VM over SSH. I tried the former and couldn't figure out an argument for DISK that worked. So I SSH'ed into the machine and ran $ gcloud compute disks snapshot / I left that running for several hours, without seeing any indication of progress. Now when I try to …
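For reference, a sketch of how the command is usually run from a local machine with the Cloud SDK, not over SSH. The disk name, zone, and snapshot name below are placeholders; the key point is that DISK is the disk's resource name, not a filesystem path like `/`:

```shell
# List disks to find the resource name and zone; DISK is the disk's
# name from this list, not a mount point such as "/".
gcloud compute disks list

# Snapshot from your local machine, specifying the zone the disk lives in:
gcloud compute disks snapshot my-disk \
    --zone us-central1-a \
    --snapshot-names my-disk-snapshot-1

# Verify the snapshot was created:
gcloud compute snapshots list
```

Passing a path such as `/` instead of a disk name would explain the command appearing to hang without progress.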

Executing a Dataflow job with multiple inputs/outputs using gcloud cli

偶尔善良, submitted on 2019-12-24 08:45:58
Question: I've designed a data transformation in Dataprep and am now attempting to run it by using the template in Dataflow. My flow has several inputs and outputs; the Dataflow template provides them as a JSON object with key/value pairs for each input and its location. They look like this (line breaks added for easy reading): { "location1": "project:bq_dataset.bq_table1", #... "location10": "project:bq_dataset.bq_table10", "location17": "project:bq_dataset.bq_table17" } I have 17 inputs (mostly lookups) …
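A hedged sketch of the relevant gcloud invocation, with job name, bucket, template path, and parameter names all as placeholder assumptions:

```shell
# Run a Dataflow job from a template, passing locations as JSON-string
# parameters. Names and paths here are hypothetical.
gcloud dataflow jobs run my-dataprep-job \
    --gcs-location gs://my-bucket/templates/my-template \
    --parameters inputLocations='{"location1":"project:bq_dataset.bq_table1"}'
```

Note that `--parameters` splits its value on commas, so a JSON value containing multiple key/value pairs will be mis-parsed; gcloud's alternate-delimiter syntax (documented under `gcloud topic escaping`) lets you change the separator when a parameter value itself contains commas.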

gcloud can connect but gsutil cannot

亡梦爱人, submitted on 2019-12-24 07:28:45
Question: Trying to use gcloud and gsutil from a laptop. gcloud can connect but gsutil cannot: mylaptop:~ jamiet$ gcloud projects list | head -2 PROJECT_ID NAME PROJECT_NUMBER dev-99999 dev-99999 999999999999 mylaptop:~ jamiet$ gsutil ls INFO 0305 21:11:10.561232 util.py] Retrying request, attempt #4... INFO 0305 21:11:23.826426 util.py] Retrying request, attempt #5... ^CCaught CTRL-C (signal 2) - exiting Any suggestions as to what I can do to diagnose the problem? Answer 1: Turned out to be human error. Internal …
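When gcloud works but gsutil hangs on retries, a few diagnostic commands usually narrow it down; this is a generic troubleshooting sketch, not the fix from the (truncated) answer:

```shell
# Debug output often shows which endpoint or proxy gsutil is stuck on:
gsutil -D ls 2>&1 | head -50

# Show gsutil's version and the config file path(s) it is actually reading:
gsutil version -l

# A stale proxy entry in the boto config is a common culprit:
grep -i proxy ~/.boto
```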

How to modify Google Cloud Pub/Sub subscription acknowledgement deadline for background Cloud Function

天大地大妈咪最大, submitted on 2019-12-24 07:17:36
Question: When deploying a background Cloud Function for Cloud Pub/Sub via: gcloud functions deploy function_name --runtime python37 --trigger-topic some_topic A subscription gets automatically created with a push endpoint (likely an App Engine standard endpoint, though those are claimed to not require domain verification: https://cloud.google.com/pubsub/docs/push#other-endpoints). For the generated subscription/endpoint there doesn't seem to be a way to register/verify the domain (https://www …
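The acknowledgement deadline of the auto-created subscription can be changed after deployment. The subscription name below is an assumption about the generated naming pattern; confirm the actual name with `list` first:

```shell
# Find the subscription that the function deployment generated
# (its name typically references the function and topic, but verify):
gcloud pubsub subscriptions list

# Update its acknowledgement deadline (allowed range: 10-600 seconds):
gcloud pubsub subscriptions update gcf-function_name-some_topic \
    --ack-deadline 600
```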

Your credentials are invalid. Please run $ gcloud auth login

生来就可爱ヽ(ⅴ<●), submitted on 2019-12-24 05:45:08
Question: gsutil was working as a stand-alone on my system. Then I installed the SDK, including some authentication stuff. Now gsutil says my credentials are invalid. $ gcloud auth login wolfvolpi@gmail.com WARNING: `gcloud auth login` no longer writes application default credentials. If you need to use ADC, see: gcloud auth application-default --help You are now logged in as [redacted]. Your current project is [redacted]. You can change this setting by running: $ gcloud config set project PROJECT_ID $ …
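A common resolution sketch for this situation, given the warning in the question that `gcloud auth login` no longer writes application default credentials:

```shell
# Re-authenticate the account the SDK (and its bundled gsutil) will use:
gcloud auth login

# If tools still report invalid credentials, also refresh Application
# Default Credentials, which are stored separately:
gcloud auth application-default login

# Confirm which account is active:
gcloud auth list
```

A stand-alone gsutil install keeps its own credentials in `~/.boto`, which can conflict with the SDK's copy of gsutil; using only the SDK-bundled gsutil avoids the mismatch.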

Google App Engine Jenkins Integration

本秂侑毒, submitted on 2019-12-23 23:42:23
Question: There used to be a link to a set of documentation on how to set up Jenkins on Google App Engine and configure it for push-to-deploy functionality. The original link no longer works; however, it is still available on the Wayback Machine. Other cloud-based solutions either integrate with Bitbucket, or integrate with GAE, but not both, which has led me to evaluate setting up my own Jenkins instance. Are the instructions provided in the link still recommended? Or have they been taken …

How to do a cartesian product of two PCollections in Dataflow?

自古美人都是妖i, submitted on 2019-12-23 15:58:23
Question: I would like to do a cartesian product of two PCollections. Neither PCollection can fit into memory, so using a side input is not feasible. My goal is this: I have two datasets. One has many elements of small size. The other has a few (~10) elements of very large size. I would like to take the product of these two datasets and then produce key-value objects. Answer 1: I think CoGroupByKey might work in your situation: https://cloud.google.com/dataflow/model/group-by-key#join That's what I did for a similar use …
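To show the shape of the CoGroupByKey approach the answer suggests, here is the same keyed-join logic sketched in plain Python, with no Beam dependency. This is illustrative only: in a real pipeline both sides are PCollections, the grouping is done by Dataflow, and you would shard the keys so no single group has to fit in memory.

```python
from itertools import product


def cartesian_pairs(small, large):
    """Mimic the Beam pattern: key both sides, co-group, emit products.

    With a single shared key this is just the full cross product; a real
    pipeline would spread elements across many keys to bound group size.
    """
    # Step 1: tag every element of both collections with a common key.
    keyed_small = [(0, x) for x in small]
    keyed_large = [(0, y) for y in large]

    # Step 2: co-group by key (the job CoGroupByKey does in Dataflow).
    groups = {}
    for k, x in keyed_small:
        groups.setdefault(k, ([], []))[0].append(x)
    for k, y in keyed_large:
        groups.setdefault(k, ([], []))[1].append(y)

    # Step 3: within each group, emit one (small, large) pair per combination.
    return [pair for xs, ys in groups.values() for pair in product(xs, ys)]
```

Calling `cartesian_pairs([1, 2], ["a", "b"])` yields all four pairs; in Beam, step 3 would be a `FlatMap` over the co-grouped result.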

Python Error on Google Cloud Install. How do I properly set the environment variable?

a 夏天, submitted on 2019-12-23 12:53:26
Question: I am trying to install the Google Cloud SDK on my Windows machine. I have Python 2.7 currently installed on this machine, and it's located in the System Variables Path like this -> C:\Python27\; I am getting this error during installation: ERROR: gcloud failed to load: DLL load failed: %1 is not a valid Win32 application. The error message also prompts me to check the Python executable by saying: If it is not, please set the CLOUDSDK_PYTHON environment variable to point to a working Python 2 …
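A sketch of setting the variable from a Windows command prompt. The "%1 is not a valid Win32 application" DLL error typically indicates a 32-bit/64-bit mismatch between the installed Python and the SDK; the interpreter path below is an assumption:

```shell
rem Windows cmd: point the Cloud SDK at a specific Python 2.7 interpreter.
rem Use a Python build whose bitness (32- vs 64-bit) matches the SDK install.
setx CLOUDSDK_PYTHON "C:\Python27\python.exe"
```

`setx` persists the variable for future sessions; open a new command prompt before retrying the installation so the change takes effect.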

gcloud compute copy-files succeeds but no files appear

孤人, submitted on 2019-12-23 12:44:35
Question: I am copying data from my local machine to a Compute Engine instance: gcloud compute copy-files /Users/me/project/data.csv instance-name:~/project The command runs and completes: data.csv 100% 74KB 73.9KB/s 00:00 However, I cannot find it anywhere on my Compute Engine instance. It is not visible in the ~/project folder. Is it failing silently, or am I looking in the wrong place? Answer 1: Short answer: Most likely, you're looking in the wrong $HOME. Make sure you're looking in the home directory …
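Since `~` expands to the home directory of whichever user the copy ran as, specifying the user explicitly removes the ambiguity. A sketch, with the username as a placeholder:

```shell
# Copy as an explicit remote user so ~ resolves to a known $HOME:
gcloud compute copy-files /Users/me/project/data.csv me@instance-name:~/project

# Verify under that same user:
gcloud compute ssh me@instance-name --command 'ls -l ~/project'
```

In newer SDK releases `gcloud compute copy-files` is deprecated in favor of `gcloud compute scp`, which takes the same source/destination arguments.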

Automating gsutil commands

六月ゝ 毕业季﹏, submitted on 2019-12-23 08:49:26
Question: I'm trying to automate some gsutil commands, but I'm struggling to see where the authentication files are kept and how to re-use them (if that's what happens). I've gone through the gcloud init process in bash... curl https://sdk.cloud.google.com | bash gcloud init All works well when I run 'gsutil ls'. Now I'm trying to automate the process, so this would work on a new server by adding it into a crontab (rather than creating a new config each time). I saw a mention of setting the env variable GOOGLE …
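For unattended use, the usual pattern is a service account key rather than the interactive `gcloud init` flow. A sketch, where the account name, key path, and config directory are all placeholders:

```shell
# One-time, non-interactive auth with a service account key:
gcloud auth activate-service-account \
    automation@my-project.iam.gserviceaccount.com \
    --key-file /etc/keys/automation-key.json

# gcloud stores its credentials and config under ~/.config/gcloud by
# default; cron jobs often run without HOME set, so point the tools at
# the config directory explicitly:
CLOUDSDK_CONFIG=/etc/gcloud-config gsutil ls
```

Because the credentials persist in the config directory, the activation step runs once per machine and every subsequent cron invocation reuses it.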