gcloud-python

Python: How can I create a GoogleCredentials using a specific user instead of get_application_default()

Submitted by 老子叫甜甜 on 2021-02-18 18:57:12
Question: I'm updating a script to call OAuth2-protected Google Cloud endpoints. The previous version assumed a single user who had already authenticated with gcloud auth login, and so could use the default credentials:

    credentials = GoogleCredentials.get_application_default()
    http = credentials.authorize(http)

However, now I must make some calls as user A and some as user B. I can perform these steps in the shell to generate access tokens, but I would prefer to do it in the program directly:

    gcloud auth login user_A
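A sketch of one possible approach using oauth2client (the library GoogleCredentials comes from): run a separate OAuth flow per user and cache each user's token in its own file. The client_secret.json path, the scope, and the per-user file naming are illustrative assumptions, not part of the original question.

```python
def token_path(user_tag):
    """Build the per-user credential cache filename (naming is an assumption)."""
    return 'credentials-%s.json' % user_tag

def credentials_for(user_tag, scope):
    """Load cached credentials for one user, running the OAuth flow if needed."""
    from oauth2client import client, tools   # third-party, imported lazily
    from oauth2client.file import Storage
    store = Storage(token_path(user_tag))    # one token cache per user
    creds = store.get()
    if creds is None or creds.invalid:
        flow = client.flow_from_clientsecrets('client_secret.json', scope)
        creds = tools.run_flow(flow, store)  # opens a browser for this user
    return creds

# creds_a = credentials_for('user_a', 'https://www.googleapis.com/auth/cloud-platform')
# http_a = creds_a.authorize(httplib2.Http())  # then call endpoints as user A
```

Each user authenticates once; afterwards the script can build an authorized http object per user without touching the application default.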

GCloud Upload httplib2.RedirectMissingLocation: Redirected but the response is missing a Location: header

Submitted by 爷，独闯天下 on 2020-07-20 07:35:09
Question: I am attempting to upload a small file to gcloud using a simple Python program:

    client = storage.Client(project=GCLOUD_PROJECT)
    bucket = client.get_bucket(GCLOUD_BUCKET)
    blob = bucket.blob(GCLOUD_FILE_ON_CLOUD)
    blob.upload_from_filename(GCLOUD_FILE_LOCAL)

It had been working until recently, but something changed. Now, whenever I upload a file larger than 5 MB, I get the error below; files of 5 MB or less go through. The size isn't large enough to require a resumable upload, is it?
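One workaround commonly reported for this class of failure (a sketch, not a confirmed fix for this exact setup): set an explicit chunk_size on the blob so the library performs a chunked resumable upload instead of crossing the multipart-size cutoff, which is where the redirect handling in older httplib2 versions breaks. The 5 MiB chunk value is an assumption; chunk_size must be a multiple of 256 KiB.

```python
CHUNK = 5 * 1024 * 1024  # 5 MiB; must be a multiple of 256 KiB (262144 bytes)

def upload_with_chunks(project, bucket_name, local_path, dest_name):
    """Upload a file, forcing an explicit chunked resumable upload."""
    from google.cloud import storage         # third-party, imported lazily
    client = storage.Client(project=project)
    blob = client.bucket(bucket_name).blob(dest_name)
    blob.chunk_size = CHUNK                  # forces resumable, fixed-size chunks
    blob.upload_from_filename(local_path)
```

Upgrading or pinning httplib2 (the library actually raising RedirectMissingLocation) is the other remedy commonly reported for this error.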

Set metadata in Google Cloud Storage using gcloud-python

Submitted by 北城余情 on 2020-01-12 08:13:29
Question: I am trying to upload a file to Google Cloud Storage using gcloud-python and set some custom metadata properties. To try this I have created a simple script:

    import os
    from gcloud import storage

    client = storage.Client('super secret app id')
    bucket = client.get_bucket('super secret bucket name')
    blob = bucket.get_blob('kirby.png')
    blob.metadata = blob.metadata or {}
    blob.metadata['Color'] = 'Pink'
    with open(os.path.expanduser('~/Pictures/kirby.png'), 'rb') as img_data:
        blob.upload_from_file(img_data)
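The pattern that generally works, sketched with the modern google-cloud-storage API (names are illustrative): assign the metadata dict before uploading so it is sent with the object, or call patch() afterwards to push metadata changes to an existing object.

```python
def merged_metadata(existing, updates):
    """Combine a blob's current custom metadata (possibly None) with new keys."""
    merged = dict(existing or {})
    merged.update(updates)
    return merged

def upload_with_metadata(bucket, blob_name, local_path, custom):
    """Upload a file with custom (x-goog-meta-*) metadata attached in one step."""
    blob = bucket.blob(blob_name)
    blob.metadata = merged_metadata(blob.metadata, custom)  # set BEFORE upload
    with open(local_path, 'rb') as fh:
        blob.upload_from_file(fh)

# For an object already in the bucket, re-assigning blob.metadata and then
# calling blob.patch() pushes the change without re-uploading the data.
```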

Google Cloud Storage + Python : Any way to list obj in certain folder in GCS?

Submitted by 岁酱吖の on 2019-12-30 06:09:40
Question: I'm going to write a Python program to check whether a file is in a certain folder of my Google Cloud Storage. The basic idea is to get the list of all objects in that folder (a file-name list), then check whether the file abc.txt is in that list. The problem is that Google appears to provide only one way to get the object list, uri.get_bucket(); see the code below, from https://developers.google.com/storage/docs/gspythonlibrary#listing-objects:

    uri = boto.storage_uri(DOGS_BUCKET, GOOGLE_STORAGE)
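With the current google-cloud-storage client (a different library than the boto-based one in the question, so treat this as an assumed alternative), "folders" are just name prefixes, and listing with a prefix answers the question directly:

```python
def normalize_prefix(folder):
    """Normalize a folder name to end in exactly one slash."""
    return folder.rstrip('/') + '/'

def names_under(client, bucket_name, folder):
    """Return object names under a 'folder' prefix; GCS folders are emulated
    purely by '/' characters in object names."""
    prefix = normalize_prefix(folder)
    return [blob.name for blob in client.list_blobs(bucket_name, prefix=prefix)]

def contains(client, bucket_name, folder, filename):
    """Check whether folder/filename exists in the bucket."""
    return normalize_prefix(folder) + filename in names_under(client, bucket_name, folder)

# contains(storage.Client(), 'my-bucket', 'some/folder', 'abc.txt')
```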

Cannot get gcloud to work with Python and Pycharm

Submitted by 大兔子大兔子 on 2019-12-24 07:07:28
Question: I am trying to connect to the Google App Engine Datastore from my local machine. I have spent all day digging into this without any luck. I have tried the approach here (as well as a lot of other suggestions from SO, such as "Using gcloud-python in GAE" and "Unable to run dev_appserver.py with gcloud"): How to access a remote datastore when running dev_appserver.py? I first installed gcloud based on this description from Google: https://cloud.google.com/appengine/docs/python/tools/using-libraries
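For what it's worth, with the newer google-cloud-datastore package the remote connection no longer goes through dev_appserver at all; a sketch (the project id and key path are placeholders, not values from the question):

```python
def remote_datastore_client(project_id, key_path=None):
    """Build a Cloud Datastore client for use from a local machine.

    With key_path, authenticate via a service-account JSON key; otherwise
    fall back to application-default credentials, e.g. after running
    `gcloud auth application-default login`.
    """
    from google.cloud import datastore       # third-party, imported lazily
    if key_path:
        return datastore.Client.from_service_account_json(key_path, project=project_id)
    return datastore.Client(project=project_id)

# client = remote_datastore_client('my-gae-project', '/path/to/key.json')
```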

Using Google Cloud Datastore with NDB API?

Submitted by 老子叫甜甜 on 2019-12-01 21:14:07
Question: There is a lot of info on using the NDB API with the Google App Engine Datastore, but I can't find any info on how to use NDB with Google Cloud Datastore. The only module I found is googledatastore, which is a very primitive library. How is the App Engine Datastore different from Cloud Datastore? Is NDB available for Cloud Datastore?

Answer 1: NDB support outside of App Engine (using Google Cloud Datastore) is currently in development. UPDATE: Check out the NDB development discussion on GitHub.

Answer 2: You

Using Google Cloud Datastore with NDB API?

Submitted by 跟風遠走 on 2019-12-01 19:15:51
Question: There is a lot of info on using the NDB API with the Google App Engine Datastore, but I can't find any info on how to use NDB with Google Cloud Datastore. The only module I found is googledatastore, which is a very primitive library. How is the App Engine Datastore different from Cloud Datastore? Is NDB available for Cloud Datastore?

Answer (Alfred Fuller): NDB support outside of App Engine (using Google Cloud Datastore) is currently in development. UPDATE: Check out the NDB development discussion on GitHub.

Answer: You might want to try using gcloud.datastore (pip install gcloud). Docs: http://googlecloudplatform
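A sketch of the kind of code the gcloud.datastore suggestion leads to, written against the modern google-cloud-datastore API (kind and property names are illustrative; this is not NDB, so entities and queries are built by hand rather than through model classes):

```python
def clean_props(**props):
    """Drop None-valued properties before storing (an illustrative convention)."""
    return {k: v for k, v in props.items() if v is not None}

def save_entity(client, kind, **props):
    """Create and store one entity of the given kind; returns the entity."""
    from google.cloud import datastore       # third-party, imported lazily
    entity = datastore.Entity(key=client.key(kind))
    entity.update(clean_props(**props))
    client.put(entity)
    return entity

# save_entity(client, 'Greeting', content='hello', author=None)
# q = client.query(kind='Greeting')         # queries replace NDB's Model.query()
# results = list(q.fetch())
```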