gcloud

google cloud sdk: set environment variable (python, linux)

心已入冬 submitted on 2019-12-04 05:58:25
ERROR: Python 3 is not supported by the Google Cloud SDK. Please use a Python 2.x version that is 2.6 or greater. If you have a compatible Python interpreter installed, you can use it by setting the CLOUDSDK_PYTHON environment variable to point to it. I guess the first question we should be asking is: with all the money Google makes off its customers, why can't it hire someone to ensure that its Cloud SDK works with Python 3? How exactly do I overcome this error on Linux? What specific files need to be edited, and where should those files be located? I searched around, a lot, and found
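The excerpt is cut off before any answer. As a minimal sketch of what the error message itself suggests, assuming a Python 2.7 interpreter is installed at /usr/bin/python2.7 (the path is a placeholder), CLOUDSDK_PYTHON can be exported before running gcloud:

```
# Point the Cloud SDK at a Python 2 interpreter (the path is an assumption;
# adjust to wherever python2 lives on your system).
export CLOUDSDK_PYTHON=/usr/bin/python2.7

# Make it persistent for future shells, then verify gcloud starts.
echo 'export CLOUDSDK_PYTHON=/usr/bin/python2.7' >> ~/.bashrc
gcloud version
```

No SDK files need to be edited for this; the gcloud wrapper script reads CLOUDSDK_PYTHON from the environment.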

How to authenticate with a Google Service Account in Jenkins pipeline

一笑奈何 submitted on 2019-12-04 05:32:26
I want to use gcloud in a Jenkins pipeline and therefore have to authenticate first with the Google service account. I'm using the Google OAuth Plugin (https://wiki.jenkins.io/display/JENKINS/Google+OAuth+Plugin), which holds my Private Key Credentials. I'm stuck at loading the credentials into the pipeline: withCredentials([[$class: 'MultiBinding', credentialsId: 'my-creds', variable: 'GCSKEY']]) { sh "gcloud auth activate-service-account --key-file=${GCSKEY}" } I also tried it with a file binding, but without luck: withCredentials([file(credentialsId:'my-creds', variable: 'GCSKEY')]) { The log says: org.jenkinsci
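The excerpt ends before the error is shown. One common pitfall with string-style bindings is that the variable holds the key contents rather than a path on disk, while --key-file expects a path. A hedged sketch of the shell step, assuming $GCSKEY contains the JSON key text (variable and path names are illustrative):

```
# If GCSKEY holds the key *contents* (not a filename), write it out first,
# then activate the service account and check which account is active.
echo "$GCSKEY" > /tmp/gcloud-key.json
gcloud auth activate-service-account --key-file=/tmp/gcloud-key.json
gcloud auth list
```

With the file(...) binding, $GCSKEY should already be a path to a temporary file, so it can be passed to --key-file directly.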

How to authenticate to Google Cloud API without Application Default Credentials or Cloud SDK?

可紊 submitted on 2019-12-04 01:15:24
I'm trying to access the Google Cloud API from an AWS Lambda function, but I don't know how to authenticate. The auth guide in the Google Cloud documentation (https://cloud.google.com/docs/authentication) wants me to download a credentials JSON file and use Application Default Credentials. But as anyone who has used hosted functions already knows, the point is that you don't need to manage a server or runtime environment, so Lambda doesn't give me the ability to store arbitrary files in the environment of the running code. I can use the Cloud SDK locally to get an access token, but it expires
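For context on the approach the excerpt mentions, here is a hedged sketch of obtaining a short-lived token from a locally installed Cloud SDK and passing it as a bearer token; the project name is a placeholder, and such tokens typically expire after about an hour, which is exactly the limitation the question raises:

```
# Ask the local Cloud SDK for a short-lived OAuth 2.0 access token
# and use it to call a Google Cloud REST endpoint directly.
TOKEN="$(gcloud auth print-access-token)"
curl -H "Authorization: Bearer ${TOKEN}" \
  "https://storage.googleapis.com/storage/v1/b?project=my-project"
```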

gcloud component update fails

孤人 submitted on 2019-12-04 01:06:39
I've deployed to VMs running Debian on GCE and have cron scripts that use gcloud commands. I noticed that gcloud components update returns this error: ERROR: (gcloud.components.update) The component manager is disabled for this installation My Mac works fine for updating gcloud and adding new components. The built-in gcloud tools that were in the VM image won't update. I have not found out how to enable the component manager. UPDATE: Now you can use the sudo apt-get install google-cloud-sdk command to install or update the Google Cloud SDK. You may need to add the Cloud SDK repository on your Linux machine. This
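The component manager is typically disabled when the SDK was installed through a package manager, which is the case for GCE Debian images; updates then come through apt rather than gcloud components. A hedged sketch of the apt route as it was documented around the time of these posts (repository names and URLs may have changed since):

```
# Add the Cloud SDK apt repository and its signing key, then install/update.
export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" \
  | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install google-cloud-sdk
```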

Why is this gcloud compute copy-files producing an error message?

♀尐吖头ヾ submitted on 2019-12-03 23:43:11
When I execute the command gcloud compute copy-files "C:\Users\fName lName\Desktop\testtext.txt" instancename:test.txt --zone europe-west1-a I receive the error: "All sources must be local files when the destination is remote." Can anyone help me figure out what is wrong? Thanks for asking this! This appears to be a bug in gcloud that we're now tracking. The issue is that gcloud compute copy-files is interpreting the colon in C:\Users\fNam... as part of a remote path. As suggested by George's answer, the workaround is to avoid local paths containing the colon character. In order to copy the
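The excerpt is cut off, but one way to avoid a colon in the local path is to run the command from the directory that contains the file so a relative path can be used. A hedged sketch using the same names as in the question:

```
# Change into the folder first so the local source has no drive-letter colon,
# then copy the file up to the instance.
cd "C:\Users\fName lName\Desktop"
gcloud compute copy-files testtext.txt instancename:test.txt --zone europe-west1-a
```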

not able to access kubernetes dashboard in gcloud

人走茶凉 submitted on 2019-12-03 20:06:52
I am following the instructions given here. I used the command to get a running cluster; in the gcloud console I typed curl -sS https://get.k8s.io | bash as described in the link. After that I ran kubectl cluster-info, which reported: kubernetes-dashboard is running at https://35.188.109.36/api/v1/proxy/namespaces/kube-system/services/kubernetes-dashboard But when I go to that URL from Firefox, the message that comes back is: User "system:anonymous" cannot proxy services in the namespace "kube-system".: "No policy matched." Expected behaviour: it should ask for an admin name and password to
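The excerpt stops before any answer. The error means the request reached the cluster unauthenticated. One common way around it, as a hedged sketch, is to reach the dashboard through an authenticated local proxy instead of hitting the master IP directly (the exact service path can vary between dashboard versions):

```
# Start an authenticated proxy to the API server on localhost...
kubectl proxy --port=8001
# ...then browse to the dashboard through it, e.g.:
# http://localhost:8001/api/v1/namespaces/kube-system/services/kubernetes-dashboard/proxy/
```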

Gcloud - How to automate installation of gcloud on a server?

和自甴很熟 submitted on 2019-12-03 16:46:57
I want to write a shell script that basically goes through all the installation steps for gcloud, as outlined at https://cloud.google.com/sdk/?hl=en However, when you run install.sh, you are asked to enter an authorization code, your project ID, whether you want to help improve Google Cloud, and so on. Basically, user input is required. But if I want to automate the installation process on a machine where there will not be any user, how can I do this? There are two separate problems here. First, how do you install without prompts: download the Google Cloud SDK tar file. This
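The excerpt breaks off after the first step. A hedged sketch of a fully non-interactive install and authentication, assuming a service-account key file is already available on the machine (download URL, key path, and project ID are placeholders):

```
# Download and unpack the SDK, then install without any prompts.
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/google-cloud-sdk.tar.gz
tar -xzf google-cloud-sdk.tar.gz
./google-cloud-sdk/install.sh --disable-prompts --usage-reporting=false

# Authenticate with a service account instead of an interactive login,
# and set a default project so later commands need no input.
./google-cloud-sdk/bin/gcloud auth activate-service-account --key-file=key.json
./google-cloud-sdk/bin/gcloud config set project my-project
```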

Uploading a buffer to google cloud storage

此生再无相见时 submitted on 2019-12-03 16:29:32
Question: I'm trying to save a Buffer (of a file uploaded from a form) to Google Cloud Storage, but it seems like the Google Node SDK only allows files with a given path to be uploaded (read/write streams). This is what I have used for AWS (S3); is there anything similar in the Google Node SDK?

    var fileContents = new Buffer('buffer');
    var params = {
      Bucket: // bucket name
      Key: // file name
      ContentType: // Set mimetype
      Body: fileContents
    };
    s3.putObject(params, function(err, data) {
      // Do something
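The excerpt cuts off before any answer about the Node SDK itself. As a separate, hedged aside for cases where shelling out is acceptable, gsutil can stream an object from stdin, which likewise avoids needing a file path on disk (bucket and object names are placeholders):

```
# Stream data from stdin straight into a Cloud Storage object;
# "-" tells gsutil cp to read the source from standard input.
printf '%s' "some buffered content" | gsutil cp - gs://my-bucket/uploads/file.txt
```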

Node pool does not reduce its size to zero although autoscaling is enabled

对着背影说爱祢 submitted on 2019-12-03 15:52:21
I have created two node pools: a small one for all the Google system jobs and a bigger one for my tasks. The bigger one should reduce its size to 0 after the job is done. The problem is: even if there are no cron jobs, the node pool does not reduce its size to 0. Creating the cluster: gcloud beta container --project "projectXY" clusters create "cluster" --zone "europe-west3-a" --username "admin" --cluster-version "1.9.6-gke.0" --machine-type "n1-standard-1" --image-type "COS" --disk-size "100" --scopes "https://www.googleapis.com/auth/cloud-platform" --num-nodes "1" --network "default" --enable
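The cluster-creation command is cut off at the point where the autoscaling flags would appear. For scale-to-zero to work, the work pool itself must have autoscaling enabled with a minimum of zero nodes; a hedged sketch of creating such a pool (pool name, machine type, and limits are placeholders):

```
# Create the work pool with autoscaling allowed to shrink it to zero nodes.
gcloud container node-pools create work-pool \
  --cluster=cluster --zone=europe-west3-a \
  --machine-type=n1-standard-4 \
  --enable-autoscaling --min-nodes=0 --max-nodes=5 \
  --num-nodes=1
```

Note that the pool running system pods typically cannot go to zero, since those pods have to run somewhere.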

Is it possible to copy a directory from a Google Compute Engine instance to my local machine?

此生再无相见时 submitted on 2019-12-03 15:41:32
Question: With scp I can add the -r flag to download directories to my local machine via SSH. When using gcloud compute scp -r it says that '-r' is not an available option. Without -r I get an error saying that my source path is a directory (implying I can only download single files). Is there an equivalent to the -r flag for the gcloud compute scp command? Answer 1: Found it! GCE offers an equivalent, and it is --recurse. My final command looks like this: gcloud compute scp --recurse username@instance_name:./*
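The final command in the excerpt is truncated. As a hedged, self-contained sketch of the same idea (instance, directory, and zone names are placeholders):

```
# Recursively copy a directory from the instance down to the local machine.
gcloud compute scp --recurse my-instance:~/remote-dir ./local-dir --zone=europe-west1-b
```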