google-cloud-python

How can I create a PubSub subscription for a topic in a different Google Cloud Platform project using the Python API?

徘徊边缘 submitted on 2021-02-19 05:48:06
Question: The API seems to allow me to create a subscription for a topic in a different project, but when I inspect the newly created subscription, it is associated with the project where the topic is located.

```python
from google.cloud import pubsub

pubsub_client_publisher = pubsub.Client("publisher-project")
topic = pubsub_client_publisher.topic("topic1")
pubsub_client_receiver = pubsub.Client("receiver-project")
subscription = pubsub.subscription.Subscription("subscription1", topic)
subscription.create
```
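For reference, a minimal sketch of one way to do this with the newer google-cloud-pubsub (`pubsub_v1`) client, which postdates the gcloud-era API in the question: build the subscription path in the receiver project and point it at a topic path in the publisher project. The project IDs, topic, and subscription names below are illustrative placeholders, not the asker's values.

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

# The subscription is created in the receiver project...
subscription_path = subscriber.subscription_path("receiver-project", "subscription1")
# ...but it attaches to a topic that lives in the publisher project.
topic_path = pubsub_v1.PublisherClient.topic_path("publisher-project", "topic1")

subscriber.create_subscription(name=subscription_path, topic=topic_path)
```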

Python: How can I create a GoogleCredentials using a specific user instead of get_application_default()

老子叫甜甜 submitted on 2021-02-18 18:57:12
Question: I'm updating a script to call OAuth2-protected Google Cloud endpoints. The previous version assumed a single user previously authenticated by gcloud auth login and thus was able to use the default:

```python
credentials = GoogleCredentials.get_application_default()
http = credentials.authorize(http)
```

However, now I must do some calls as user A and some as user B. I can perform these steps in the shell to generate access tokens, but I would prefer to do it in the program directly:

```
gcloud auth login user_A
```
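One hedged way to obtain per-user credentials in code, rather than relying on the application default, is to run a separate OAuth flow for each user with google-auth-oauthlib. This is a sketch under assumptions, not the original script's approach; the client-secret file name and scope are placeholders.

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]

def login(client_secret_file):
    # Opens a browser so the chosen user can sign in; returns that user's credentials.
    flow = InstalledAppFlow.from_client_secrets_file(client_secret_file, SCOPES)
    return flow.run_local_server(port=0)

creds_a = login("client_secret.json")  # sign in as user A
creds_b = login("client_secret.json")  # sign in as user B

# Each session then makes calls as its own user.
session_a = AuthorizedSession(creds_a)
session_b = AuthorizedSession(creds_b)
```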

Using Python to Query GCP Stackdriver logs

让人想犯罪 __ submitted on 2020-06-15 05:59:22
Question: I am using Python 3 to query Stackdriver for GCP logs. Unfortunately, the log entries that have important data are returned to me as "NoneType" instead of as a "dict" or a "str". The resulting entry.payload is None, and entry.payload_pb has the data I want, but it is garbled. Is there a way to get Stackdriver to return this data in a clean format, or is there a way I can parse it? If not, is there a way I should query this data that is better than what I am doing and yields clean
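For context, a minimal sketch of listing entries with the google-cloud-logging client and checking what concrete type each payload comes back as; the filter string is an illustrative placeholder, not taken from the question.

```python
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
log_filter = 'resource.type="gce_instance" AND severity>=WARNING'

for entry in client.list_entries(filter_=log_filter, page_size=50):
    # Depending on how the entry was written, it surfaces as a TextEntry
    # (str payload), StructEntry (dict payload), or ProtobufEntry
    # (protobuf payload); printing the type shows where the data actually lives.
    print(type(entry).__name__, entry.payload)
```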

How to authenticate with gcloud big query using a json credentials file?

浪子不回头ぞ submitted on 2020-01-22 19:30:53
Question: The gcloud documentation for Google BigQuery states that authentication can be determined from from_service_account_json. I've tried the following:

```python
from gcloud import bigquery

client = bigquery.Client.from_service_account_json('/Library/gcloud_api_credentials.json')
```

The JSON file looks like the following (note: the credentials are scrambled, so these are fake):

```json
{"type": "service_account", "project_id": "example_project", "private_key_id": "c7e371776ab6e2dsfafdsaff97edf9377178c8", "private
```
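A minimal sketch of the same idea with the current google-cloud-bigquery package (the gcloud package has since been renamed); the key-file path and project ID are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json(
    "/path/to/service_account.json",
    project="example_project",
)

# Quick sanity check that the credentials work: list datasets in the project.
for dataset in client.list_datasets():
    print(dataset.dataset_id)
```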

Set metadata in Google Cloud Storage using gcloud-python

北城余情 submitted on 2020-01-12 08:13:29
Question: I am trying to upload a file to Google Cloud Storage using gcloud-python and set some custom metadata properties. To try this I have created a simple script:

```python
import os
from gcloud import storage

client = storage.Client('super secret app id')
bucket = client.get_bucket('super secret bucket name')
blob = bucket.get_blob('kirby.png')
blob.metadata = blob.metadata or {}
blob.metadata['Color'] = 'Pink'
with open(os.path.expanduser('~/Pictures/kirby.png'), 'rb') as img_data:
    blob.upload_from_file
```
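A hedged sketch with the current google-cloud-storage package: assigning blob.metadata before uploading sends the custom metadata with the object, and blob.patch() pushes a metadata-only update to an object that already exists. The bucket name, file path, and metadata values are placeholders.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket")
blob = bucket.blob("kirby.png")

# Metadata set before the upload is stored with the new object.
blob.metadata = {"Color": "Pink"}
blob.upload_from_filename("/path/to/kirby.png")

# For an object already in the bucket, patch() sends just the metadata change.
blob.metadata = {"Color": "Blue"}
blob.patch()
```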

Google Cloud Storage: How to Delete a folder (recursively) in Python

非 Y 不嫁゛ submitted on 2020-01-02 03:32:06
Question: I am trying to delete a folder in GCS, along with all of its content (including sub-directories), using its Python library. I also understand that GCS doesn't really have folders (only prefixes), but I am wondering how I can do that. I tested this code:

```python
from google.cloud import storage

def delete_blob(bucket_name, blob_name):
    """Deletes a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.delete()

delete_blob('mybucket',
```
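Since GCS "folders" are only name prefixes, one hedged way to delete a folder recursively is to list every blob under the prefix and delete each in turn. This is a sketch, not the asker's code; the bucket and prefix names are placeholders.

```python
from google.cloud import storage

def delete_folder(bucket_name, prefix):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    # list_blobs with a prefix returns every object "inside" the folder,
    # including those in nested sub-"directories".
    for blob in bucket.list_blobs(prefix=prefix):
        blob.delete()

delete_folder("mybucket", "path/to/folder/")
```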