gcloud

Can I automate Google Cloud SDK gcloud init - interactive command

狂风中的少年 submitted on 2019-11-28 17:04:34
Documentation on the Google Cloud SDK (https://cloud.google.com/sdk/docs/) directs one to run gcloud init after installing it. Is there a way to automate this step, given that gcloud init is an interactive command?

Answer (cherba): One does not need to run gcloud init. The main goal is to make sure credentials are configured, and perhaps that the project property is set. If you have service-account credentials, gcloud can be configured and ready to go via the following: gcloud auth activate-service-account --key-file=credential_key.json gcloud config set project my-project For completeness, gcloud init essentially runs
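The non-interactive setup the answer describes can be scripted end to end; a minimal sketch, where the key-file path and project ID are placeholders to substitute with your own values:

```shell
# Non-interactive gcloud setup: no `gcloud init` prompts involved.
# KEY_FILE and PROJECT are placeholders; substitute your own values.
KEY_FILE="${KEY_FILE:-credential_key.json}"
PROJECT="${PROJECT:-my-project}"

if command -v gcloud >/dev/null 2>&1; then
  gcloud auth activate-service-account --key-file="$KEY_FILE"
  gcloud config set project "$PROJECT"
  gcloud_ran=yes
else
  echo "gcloud not on PATH; commands shown for reference only"
  gcloud_ran=no
fi
```

Because both commands are idempotent, the script is safe to run on every CI job or fresh VM boot.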

Google App Engine: from six.moves import http_client no module named moves

末鹿安然 submitted on 2019-11-28 14:28:34
Okie dokie, I'm trying to get Google's Dialogflow Python API working with Google App Engine, and I seem to be running into issues when I run the application. I have pip-installed dialogflow into a lib folder and added the lib folder through the app.yaml file. I keep running into an error saying that it can't find six.moves. I'm very new to this (App Engine in general), so please tell me if I have something set up wrong. I've read a few other threads with no luck. This won't work locally or deployed. Below is my app.yaml file: runtime: python27 api_version: 1 threadsafe: true service: basic
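For the python27 standard runtime, third-party packages generally have to be vendored into the app and registered in appengine_config.py, not just dropped into a folder; a sketch of the usual steps, assuming the lib folder name from the question (the appengine_config.py contents are written via a heredoc because they belong in the app, not in the shell):

```shell
# Vendor third-party packages into ./lib for the python27 standard runtime.
mkdir -p lib
if command -v pip >/dev/null 2>&1; then
  # -t installs into ./lib instead of site-packages; may fail offline.
  pip install -t lib six dialogflow || echo "pip install failed (offline?)"
else
  echo "pip not on PATH; command shown for reference only"
fi

# appengine_config.py must add ./lib to the import path at startup:
cat > appengine_config.py <<'EOF'
from google.appengine.ext import vendor
vendor.add('lib')
EOF
```

Without the vendor.add('lib') call, imports like six.moves resolve against the sandboxed runtime, which is a common source of exactly this error.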

How can I use bucket storage to serve static files on google flex/app engine environment?

こ雲淡風輕ζ submitted on 2019-11-28 13:49:31
I have a Node.js backend and a React frontend. I am using the gcloud flexible environment (App Engine) and want to serve all the frontend files using a CDN; I do not want those requests to touch my Node.js server. I am unable to configure my project's app.yaml to do this. I suspect that my requests are not being served from a CDN, because if I comment out the line below in my Node.js code, I can no longer access index.html: app.use('/', express.static(path.resolve('./frontend/dist'))); Below is the YAML file: handlers: - url: /(.*\.html) mime_type: text/html static_files: frontend/dist/\1 upload:
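One way to take static assets off the Node server entirely (as the title's bucket-storage idea suggests) is to push the built frontend to a Cloud Storage bucket and serve it from there, optionally behind Cloud CDN. A sketch using gsutil; the bucket name is a placeholder:

```shell
BUCKET="gs://my-frontend-bucket"   # placeholder name; buckets are globally unique
if command -v gsutil >/dev/null 2>&1; then
  gsutil mb "$BUCKET" || true                        # create bucket (ignore if it exists)
  gsutil -m rsync -r ./frontend/dist "$BUCKET"       # upload the built frontend
  gsutil iam ch allUsers:objectViewer "$BUCKET"      # make objects publicly readable
  gsutil web set -m index.html "$BUCKET"             # serve index.html at the bucket root
else
  echo "gsutil not on PATH; commands shown for reference only"
fi
```

With the assets in the bucket, the express.static middleware can be dropped and the Node process only sees API traffic.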

gcloud: how to download the app via cli

廉价感情. submitted on 2019-11-28 13:44:23
I deployed an app with gcloud preview app deploy. Is there a way to download it to another local machine? How can I get the files? I tried it via ssh with no success (can't access the docker dir). UPDATE: I found this: gcloud preview app modules download default --version 1 --output-dir=my_dir but it's not downloading any files. Log: Downloading module [default] to [my_dir/default] Fetching file list from server... |- Downloading [0] files... -|

Answer: Currently, the best way to do this is to pull the files out of Docker. Put the instance into self-managed mode, so that you can ssh into it: $ gcloud preview app
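Following the answer's Docker suggestion: once you can ssh into the (self-managed) instance, the application directory can be copied out of the running container with docker cp. A sketch run on the VM itself; the container name is a placeholder to look up with docker ps:

```shell
# On the VM: copy the application directory out of the running container.
CONTAINER="gaeapp"   # placeholder; find the real name with `docker ps`
if command -v docker >/dev/null 2>&1; then
  docker ps --format '{{.Names}}' || echo "docker daemon unavailable"
  docker cp "$CONTAINER:/app" ./app-download || echo "container not found"
else
  echo "docker not on PATH; commands shown for reference only"
fi
```

The files can then be pulled to the local machine with scp or gcloud compute copy-files.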

Can't install gcloud on Amazon Linux : invalid syntax

江枫思渺然 submitted on 2019-11-28 12:20:55
Question: I'm trying to install gcloud on my EC2 server running Amazon Linux 4.14.47-56.37 (64-bit), in interactive mode, with the following command: curl https://sdk.cloud.google.com | bash The files download correctly, but the install then fails with the following traceback: File "/home/ec2-user/google-cloud-sdk/bin/bootstrapping/install.py", line 12, in <module> import bootstrapping File "/home/ec2-user/google-cloud-sdk/bin/bootstrapping/bootstrapping.py", line 32, in <module> import setup #
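A common cause of a SyntaxError in the bootstrapping scripts is the installer picking up an incompatible system python. The SDK honors the CLOUDSDK_PYTHON environment variable, so pointing it at a supported interpreter before installing is worth trying; the interpreter path below is an assumption to adjust for the machine at hand:

```shell
# Point the SDK installer at a specific interpreter before running it.
CLOUDSDK_PYTHON="${CLOUDSDK_PYTHON:-/usr/bin/python2.7}"   # assumed path; adjust
export CLOUDSDK_PYTHON
if command -v curl >/dev/null 2>&1; then
  echo "would run: curl https://sdk.cloud.google.com | bash"  # shown, not executed here
else
  echo "curl not on PATH"
fi
```

The same variable also controls which interpreter the installed gcloud uses afterwards, so it is worth exporting it in the shell profile too.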

How to stop creating extra instances when using google managed vms?

冷暖自知 submitted on 2019-11-28 09:25:15
Question: Every time I deploy to Google's Managed VM service, the console automatically creates a duplicate instance. I am up to 15 instances running in parallel. I even tried using the command: gcloud preview app deploy "...\app.yaml" --set-default I did some research, and it looks like even deleting these duplicated instances can be a pain. Thoughts on how to stop this duplication?

Answer 1: You can deploy over the same version each time: gcloud preview app deploy "...\app.yaml" --set-default -
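With the current (non-preview) CLI, the same idea is expressed by pinning the version label on every deploy, so redeploys replace the version in place instead of piling up new ones; a sketch, where the label itself is arbitrary:

```shell
VERSION="main"   # arbitrary fixed label; redeploying it replaces in place
if command -v gcloud >/dev/null 2>&1; then
  gcloud app deploy app.yaml --version="$VERSION" --promote --quiet
  gcloud app versions list                    # inspect what is still around
  # gcloud app versions delete OLD_VERSION    # prune stragglers by hand
else
  echo "gcloud not on PATH; commands shown for reference only"
fi
```

Each distinct version label keeps its own set of instances, which is why deploying with an auto-generated label on every push multiplies the instance count.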

Cross project management using service account

大城市里の小女人 submitted on 2019-11-28 06:50:01
I need a service account that can access multiple projects, but I have not been able to find a way to do this at all; it seems that a service account is always bound to a project. Another option is to create a service account on each of the separate projects and then authenticate with gcloud auth activate-service-account --key-file SOME_FILE.json, but the problem there is that it does not seem possible to automate the creation of service accounts. So the question is: is it possible to create a cross-project service account, or to automate the creation of service accounts? Even better would
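Both halves of the question are scriptable: a service account is owned by one project but can be granted IAM roles in any other project, and creation itself is a single CLI call. A sketch with placeholder project IDs and role:

```shell
SA_PROJECT="project-a"      # placeholder: project that owns the service account
OTHER_PROJECT="project-b"   # placeholder: project the SA should also access
SA_NAME="cross-proj-sa"
SA_EMAIL="$SA_NAME@$SA_PROJECT.iam.gserviceaccount.com"

if command -v gcloud >/dev/null 2>&1; then
  # Create the account (automatable, contrary to the question's assumption)
  gcloud iam service-accounts create "$SA_NAME" --project="$SA_PROJECT"
  # Grant it a role in a *different* project
  gcloud projects add-iam-policy-binding "$OTHER_PROJECT" \
    --member="serviceAccount:$SA_EMAIL" --role="roles/viewer"
else
  echo "gcloud not on PATH; commands shown for reference only"
fi
```

One account plus per-project bindings avoids juggling a separate key file per project.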

Appengine remote_api_shell not working with application-default credentials since update

安稳与你 submitted on 2019-11-28 01:13:37
I recently updated my gcloud libraries from 118.0.0 to 132.0.0, and immediately remote_api_shell no longer worked. I went through a number of permutations: re-logging in, setting the application-default credentials through gcloud, and using a service account with an environment variable. All permutations failed with the same error message: Traceback (most recent call last): File "/Users/mbostwick/google-cloud-sdk/bin/remote_api_shell.py", line 133, in <module> run_file(__file__, globals()) File "/Users/mbostwick/google-cloud-sdk/bin/remote_api_shell.py", line 129, in run_file execfile(_PATHS
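When an SDK update breaks credential pickup, the usual first step is rebuilding both credential stores from scratch, since the CLI's own login and the application-default credentials (ADC) used by client libraries are separate; a sketch (the key-file path at the end is a placeholder):

```shell
if command -v gcloud >/dev/null 2>&1; then
  gcloud auth login                       # user credentials for the CLI itself
  gcloud auth application-default login   # ADC used by client libraries
else
  echo "gcloud not on PATH; commands shown for reference only"
fi
# Alternatively, point ADC at a service-account key (placeholder path):
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/sa.json"
```

If the traceback persists across both credential paths, the breakage is likely in the SDK version itself rather than in the credentials.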

Can datastore input in google dataflow pipeline be processed in a batch of N entries at a time?

情到浓时终转凉″ submitted on 2019-11-28 01:11:59
I am trying to execute a Dataflow pipeline job that runs one function on N entries at a time from Datastore. In my case this function sends a batch of 100 entries to some REST service as its payload. This means I want to go through all entries from one Datastore entity and send 100 batched entries at once to an outside REST service. My current solution:
1. Read input from Datastore.
2. Create as many keys as there are workers specified in the pipeline options (1 worker = 1 key).
3. Group by key, so that we get an iterator as output (the iterator input in step no. 4).
4. Programmatically batch users in
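The batching in step 4 is independent of Dataflow itself: greedily pack a stream into groups of at most 100 and fire one call per group. It can be sketched even at the shell, with xargs standing in for the grouping; the file names and the POST are illustrative only:

```shell
# Pack a 250-item stream into groups of <=100, one (mock) REST call per group.
seq 1 250 > /tmp/entity_ids.txt                  # stand-in for the Datastore read
xargs -n 100 < /tmp/entity_ids.txt > /tmp/batches.txt
sizes=""
while read -r batch; do
  set -- $batch                                  # word-split to count the entries
  sizes="$sizes $#"
  echo "would POST a payload of $# entries to the REST service"
done < /tmp/batches.txt
echo "batch sizes:$sizes"                        # -> 100 100 50
```

The last group is smaller than 100, which is the case any batching DoFn must also handle by flushing the remainder when the input is exhausted.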

`gcloud compute copy-files`: permission denied when copying files

帅比萌擦擦* submitted on 2019-11-27 18:48:43
I'm having a hard time copying files over to my Google Compute Engine instance. I am using an Ubuntu server on Google Compute Engine, doing this from my OS X terminal, and I am already authorized with gcloud. local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:/var/www/html --zone us-central1-a Warning: Permanently added '<IP>' (RSA) to the list of known hosts. scp: /var/www/html/index.php: Permission denied ERROR: (gcloud.compute.copy-files) [/usr/bin/scp] exited with return code [1].

Answer: Insert root@ before the instance name: local:$ gcloud compute copy
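If root logins are disabled on the instance, a common alternative to the root@ trick is a two-step copy: upload into the home directory (which your user can write to), then move the file into place with sudo. A sketch using the paths from the question:

```shell
SRC="/Users/Bryan/Documents/Websites/gce/index.php"
INSTANCE="example-instance"
ZONE="us-central1-a"
if command -v gcloud >/dev/null 2>&1; then
  # Step 1: copy into the (user-writable) home directory
  gcloud compute copy-files "$SRC" "$INSTANCE:~/" --zone "$ZONE"
  # Step 2: move into the root-owned web root with sudo
  gcloud compute ssh "$INSTANCE" --zone "$ZONE" \
    --command "sudo mv ~/index.php /var/www/html/"
else
  echo "gcloud not on PATH; commands shown for reference only"
fi
```

This keeps the permission escalation on the remote side, where sudo is available, instead of requiring scp itself to write to /var/www/html.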