gcloud

Get service account auth token without gcloud?

泄露秘密 submitted on 2019-11-27 15:54:18
Question: Is it possible to get an authorization bearer token for a Google Cloud service account without using gcloud? That is, I would like to make an HTTP request (presumably signed in some way by my JSON key file) that would give me the equivalent of gcloud auth application-default print-access-token. This request would be made on my own server, where I may not wish to install gcloud and where I do not have access to any internal server metadata that might provide this (e.g., as is the case…
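One way to do this without gcloud, sketched with the google-auth Python library, which signs a JWT with the JSON key and exchanges it for a token over plain HTTPS; the key-file path and scope here are assumptions:

    # Sketch: mint an OAuth2 bearer token from a service-account JSON key,
    # no gcloud required -- google-auth handles the JWT signing and exchange.
    from google.oauth2 import service_account
    import google.auth.transport.requests

    creds = service_account.Credentials.from_service_account_file(
        "key.json",  # assumption: path to the downloaded JSON key file
        scopes=["https://www.googleapis.com/auth/cloud-platform"],
    )
    creds.refresh(google.auth.transport.requests.Request())

    # creds.token can now be sent as "Authorization: Bearer <token>".
    print(creds.token)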

gcloud command not found while installing Google Cloud SDK

馋奶兔 submitted on 2019-11-27 09:42:40
Question: I am on a Mac and am trying to install the Google Cloud SDK (including the gcloud command-line utility) using this command in the terminal: curl https://sdk.cloud.google.com | bash, as shown at https://cloud.google.com/sdk/. The installer ran all the way to the end and finished, but even after I restarted my shell, the gcloud command still isn't found. Why isn't this installation working? Answer 1: So below is my previous fix for this problem, but it turns out it isn't permanent. It works, but every time you…
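The usual cause is that the installer's PATH changes are never picked up by the shell. A sketch of the standard fix, assuming the SDK installed to the default ~/google-cloud-sdk location: add these lines to ~/.bash_profile (or your shell's rc file) and open a new terminal:

    # Assumes the default install location; adjust if the SDK lives elsewhere.
    source "$HOME/google-cloud-sdk/path.bash.inc"        # puts gcloud on PATH
    source "$HOME/google-cloud-sdk/completion.bash.inc"  # optional tab completion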

Google App Engine: 'from six.moves import http_client' fails with 'No module named moves'

主宰稳场 submitted on 2019-11-27 08:27:30
Question: Okie dokie, I'm trying to get Google's Dialogflow Python API working with Google App Engine, and I seem to be running into issues when I run the application. I have pip-installed dialogflow into a lib folder and added the lib folder through the app.yaml file. I keep running into an error saying that it can't find 'six.moves'. I'm very new to this (App Engine in general), so please tell me if I have something set up wrong. I've read a few other threads with no luck. This won't work locally or…
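On the first-generation Python runtime, vendored packages in lib are normally wired up through an appengine_config.py at the project root rather than through app.yaml; a minimal sketch, assuming the packages were pip-installed into ./lib:

    # appengine_config.py -- make everything under lib/ importable
    # (dialogflow, six, and their dependencies).
    from google.appengine.ext import vendor

    vendor.add('lib')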

How can I use bucket storage to serve static files in the Google App Engine flexible environment?

为君一笑 submitted on 2019-11-27 08:02:43
Question: I have a Node.js backend and a React frontend. I am using the gcloud flexible environment (App Engine) and want to serve all the frontend files from a CDN; I do not want those requests to touch my Node.js server at all. I have been unable to configure my project's app.yaml to do this. I suspect that my requests are not being served from a CDN, because if I comment out the line below in my Node.js code, I can no longer access index.html: app.use('/', express.static(path.resolve('./frontend/dist'))); Below is…
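The flexible environment does not honor app.yaml static-file handlers the way the standard environment does, so one common approach is to copy the built frontend into a publicly readable Cloud Storage bucket and point the asset URLs in index.html at it. A sketch using the google-cloud-storage Python client; the bucket name and local dist path are assumptions:

    # Sketch: push a local build directory to a public GCS bucket so the
    # Node.js server never has to serve the static assets itself.
    import os
    from google.cloud import storage

    BUCKET = "my-frontend-assets"  # assumption: an existing bucket
    DIST = "frontend/dist"         # assumption: the built frontend lives here

    client = storage.Client()
    bucket = client.bucket(BUCKET)

    for root, _, files in os.walk(DIST):
        for name in files:
            local_path = os.path.join(root, name)
            blob = bucket.blob(os.path.relpath(local_path, DIST))
            blob.cache_control = "public, max-age=3600"  # cacheable by a CDN
            blob.upload_from_filename(local_path)
            blob.make_public()  # served from storage.googleapis.com/<bucket>/...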

Save a pandas data frame as CSV to a gcloud storage bucket

試著忘記壹切 submitted on 2019-11-27 07:23:07
Question:

    from pyspark import SparkContext, SparkConf
    from pyspark.sql import SparkSession
    import gc
    import pandas as pd
    import datetime
    import numpy as np
    import sys

    APP_NAME = "DataFrameToCSV"

    spark = SparkSession\
        .builder\
        .appName(APP_NAME)\
        .config("spark.sql.crossJoin.enabled", "true")\
        .getOrCreate()

    group_ids = [1,1,1,1,1,1,1,2,2,2,2,2,2,2]
    dates = ["2016-04-01","2016-04-01","2016-04-01","2016-04-20","2016-04-20","2016-04-28","2016-04-28","2016-04-05","2016-04-05","2016-04-05","2016-04-05",…
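The question is cut off above, but the usual way to land a pandas frame in Cloud Storage is to render it to a CSV string and upload that string as a blob, with no local file involved. A sketch with the google-cloud-storage client; the bucket and object names are assumptions:

    # Sketch: write a pandas DataFrame straight to a GCS object.
    import pandas as pd
    from google.cloud import storage

    df = pd.DataFrame({"group_id": [1, 1, 2],
                       "date": ["2016-04-01", "2016-04-20", "2016-04-05"]})

    client = storage.Client()
    bucket = client.bucket("my-output-bucket")  # assumption: existing bucket
    blob = bucket.blob("exports/frame.csv")     # assumption: target object name

    # to_csv() with no path returns the CSV text; upload it as one object.
    blob.upload_from_string(df.to_csv(index=False), content_type="text/csv")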

Upload files to Firebase Storage using Node.js

∥☆過路亽.° submitted on 2019-11-27 03:27:47
I'm trying to understand how to upload files to Firebase Storage using Node.js. My first try was to use the Firebase library:

    "use strict";
    var firebase = require('firebase');

    var config = {
        apiKey: "AIz...kBY",
        authDomain: "em....firebaseapp.com",
        databaseURL: "https://em....firebaseio.com",
        storageBucket: "em....appspot.com",
        messagingSenderId: "95...6"
    };
    firebase.initializeApp(config);

    // Error: firebase.storage is undefined, so not a function
    var storageRef = firebase.storage().ref();
    var uploadTask = storageRef.child('images/octofez.png').put(file);

    // Register three observers:
    // 1. …

Can Datastore input in a Google Dataflow pipeline be processed in batches of N entries at a time?

会有一股神秘感。 submitted on 2019-11-26 21:55:10
Question: I am trying to run a Dataflow pipeline job that would execute one function on N entries at a time from Datastore. In my case this function sends a batch of 100 entries to some REST service as its payload. This means I want to go through all entries from one Datastore entity and send 100 batched entries at a time to an outside REST service. My current solution: read input from Datastore; create as many keys as there are workers specified in the pipeline options (1 worker = 1 key); group by…
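Assuming the Beam/Dataflow Python SDK, the built-in BatchElements transform produces fixed-size batches directly, without fanning out over synthetic keys. A minimal runnable sketch; beam.Create stands in for the Datastore read, and post_batch is a placeholder for the REST call:

    # Sketch: group elements into batches of 100 before calling a REST service.
    import apache_beam as beam

    def post_batch(batch):
        # Placeholder: send this list of up to 100 entries as one payload.
        print(len(batch))

    with beam.Pipeline() as p:
        (p
         | "Source" >> beam.Create(range(250))  # stand-in for the Datastore input
         | "Batch" >> beam.BatchElements(min_batch_size=100, max_batch_size=100)
         | "Send" >> beam.Map(post_batch))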

Appengine remote_api_shell not working with application-default credentials since update

て烟熏妆下的殇ゞ submitted on 2019-11-26 21:52:01
Question: I recently updated my gcloud libraries from 118.0.0 to 132.0.0, and remote_api_shell immediately stopped working. I went through a number of permutations: re-logging in, setting the application-default credentials through gcloud, and using a service account with an environment variable. All permutations failed with the same error message:

    Traceback (most recent call last):
      File "/Users/mbostwick/google-cloud-sdk/bin/remote_api_shell.py", line 133, in <module>
        run_file(__file__, globals())
      File …

`gcloud compute copy-files`: permission denied when copying files

可紊 submitted on 2019-11-26 19:36:41
Question: I'm having a hard time copying files over to my Google Compute Engine instance, which runs Ubuntu. I'm doing this from my OS X terminal, and I am already authorized with gcloud.

    local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:/var/www/html --zone us-central1-a
    Warning: Permanently added '<IP>' (RSA) to the list of known hosts.
    scp: /var/www/html/index.php: Permission denied
    ERROR: (gcloud.compute.copy-files) [/usr/bin…
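The question is truncated above, but the error itself is an scp permission problem: /var/www/html is usually root-owned, and copy-files cannot elevate privileges. A common workaround (a sketch reusing the instance name, zone, and paths from the question) is to copy into the home directory first, then move the file into place over SSH:

    gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php \
        example-instance:~/ --zone us-central1-a
    gcloud compute ssh example-instance --zone us-central1-a \
        --command "sudo mv ~/index.php /var/www/html/"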

What is the relationship between Google's App Engine SDK and Cloud SDK?

一曲冷凌霜 submitted on 2019-11-26 17:55:59
I'm developing a Google App Engine application and I am encountering references to both an App Engine SDK and a Cloud SDK. How do these two SDKs relate to each other? There is definitely some overlap between the two: there is a dev_appserver.py and an appcfg.py in both of them. I can run a development server using dev_appserver.py, and also with gcloud preview app run. Why are there two tools that do the same thing? Is one being deprecated in favor of the other? Is there a roadmap for merging the toolsets, or are they going to be maintained in parallel? Do I need both, or just one? It seems…