google-cloud-platform

How to authenticate to GCP from a containerized Dockerfile

自作多情 submitted on 2021-01-29 09:50:43
Question: I am trying to build a new Docker image dynamically using a Cloud Build trigger job, but I don't see how to safely retrieve my credentials to authenticate against GCP with a service account. Here are the steps: a Dockerfile defines the steps to build the image; one of those steps downloads a file from a Google Storage bucket that I need to access as a GCP service account; the Docker image is built by a Cloud Build trigger that fires after each change in the linked …
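A common pattern here (a sketch under assumptions; the bucket path, image name, and build layout are placeholders) is to let Cloud Build's own service account fetch the file in a separate build step, so no service-account key ever needs to be baked into the Dockerfile or the image:

    # cloudbuild.yaml (sketch)
    steps:
    # Download the file with the Cloud Build service account (grant it
    # roles/storage.objectViewer on the bucket); no key file is required.
    - name: 'gcr.io/cloud-builders/gsutil'
      args: ['cp', 'gs://my-bucket/my-file', '.']
    # Build the image; the downloaded file is now part of the build context
    # and can simply be COPY'd in the Dockerfile.
    - name: 'gcr.io/cloud-builders/docker'
      args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-image', '.']
    images: ['gcr.io/$PROJECT_ID/my-image']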

400 Bad Request when uploading an image to Google Cloud Storage using Python 3 Flask

断了今生、忘了曾经 submitted on 2021-01-29 09:48:52
Question: I'm trying to upload an image to Cloud Storage using blob.upload_from_string, but I get a 400 Bad Request error. I'm following this tutorial; the difference is that I want to send a string instead of a file, and the string contains the image. This is the function:

    def upload_image_file(self, file):
        """
        Upload the user-uploaded file to Google Cloud Storage and retrieve its
        publicly-accessible URL.
        """
        if not file:
            return None
        testImageString = "Python is interesting."
        arr = bytes…
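A 400 at this point is often caused by passing something that is not bytes/str to upload_from_string, or by omitting the content type. A minimal sketch with the google-cloud-storage client (the bucket name and object path are placeholders, and the image is assumed to arrive as base64-encoded text):

    import base64
    from google.cloud import storage

    def upload_image_string(b64_string, destination_name):
        # Decode the base64 text back into raw image bytes.
        image_bytes = base64.b64decode(b64_string)

        client = storage.Client()
        bucket = client.bucket("my-bucket")      # placeholder bucket name
        blob = bucket.blob(destination_name)

        # upload_from_string accepts bytes or str; set the real content type
        # so Cloud Storage serves the object as an image.
        blob.upload_from_string(image_bytes, content_type="image/png")
        return blob.public_url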

Does “The OAuth client was deleted” from Google mean the keys are invalid?

人走茶凉 submitted on 2021-01-29 09:40:35
Question: I am trying to track down leaked values for GOOGLE_CLIENT_SECRET and GOOGLE_CLIENT_ID. I run this basic Flask app here in Docker, link it to localhost, edit /etc/hosts to map that to "myserver.local.com", and access the page. When I click Login on the "This app will attempt to authenticate you through Google OAuth 2.0" screen, I get this error: "Authorization Error. Error 401: deleted_client. The OAuth client was deleted." Excluding the possibility of restoring the project within 30 days of …

Command to import multiple files from Cloud Storage into BigQuery

£可爱£侵袭症+ submitted on 2021-01-29 09:36:54
Question: I've figured out that this command lists the paths to all the files:

    gsutil ls "gs://bucket/foldername/*.csv"

and this command imports a single file into BQ with schema autodetection:

    bq load --autodetect --source_format=CSV dataset.tableName gs://bucket/foldername/something.csv

Now I need to make them work together to import all the files into their respective tables in BQ, replacing a table if it already exists. Could you give me a hand?

Answer 1: First, create a file with the list of all the files you want to load into BigQuery: …
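A shell sketch of that approach (the dataset name and the rule that each CSV goes into a table named after the file are assumptions); --replace makes bq overwrite a table that already exists:

    gsutil ls "gs://bucket/foldername/*.csv" > files.txt
    while read -r uri; do
      table="$(basename "$uri" .csv)"     # table named after the CSV file
      bq load --autodetect --replace --source_format=CSV "dataset.${table}" "$uri"
    done < files.txt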

Google Cloud Build rebuilds the Cloud Function but the content is not updated

孤人 submitted on 2021-01-29 09:13:50
Question: I put the files on GitHub and connected the repository with Google Cloud Repository. Below is the .yaml file. When I update my index.js file, Cloud Build rebuilds the Cloud Function, but why doesn't the content get updated? Setting up the Cloud Function manually works.

    steps:
    - name: 'gcr.io/cloud-builders/yarn'
      args: ['install']
      dir: 'functions/autodeploy'
    - name: 'gcr.io/cloud-builders/gcloud'
      args: ['functions', 'deploy', 'function-1', '--trigger-http', '--runtime', 'nodejs10', '--entry-point', 'firstci']
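One thing worth checking (an assumption based on the snippet, not something the excerpt confirms): the deploy step has no dir:, so gcloud functions deploy uploads source from the repository root rather than from functions/autodeploy, which can leave the deployed code unchanged. A sketch with the source directory made explicit:

    steps:
    - name: 'gcr.io/cloud-builders/yarn'
      args: ['install']
      dir: 'functions/autodeploy'
    - name: 'gcr.io/cloud-builders/gcloud'
      args: ['functions', 'deploy', 'function-1', '--trigger-http',
             '--runtime', 'nodejs10', '--entry-point', 'firstci']
      dir: 'functions/autodeploy'   # deploy from the directory that was just built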

Is GCP pub/sub region specific?

风格不统一 submitted on 2021-01-29 09:10:32
Question: Let's say code running in region A publishes a message. Can Cloud Functions in regions B and C subscribe to such events?

Answer 1: By default, yes: Pub/Sub is a global service. If the publisher and the subscriber are in the same region, there is no reason for the message to change region; across regions, the message is forwarded to the subscriber's region and then consumed. You don't see this mechanism; it is automatic and managed by Pub/Sub. However, if you have legal constraints, you can limit the …
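The answer is cut off, but the kind of restriction it refers to is usually expressed with a topic's message storage policy; a sketch (the topic name and region list are placeholders):

    # Only allow Pub/Sub to persist this topic's messages in the listed regions
    gcloud pubsub topics create my-topic \
        --message-storage-policy-allowed-regions=europe-west1,europe-west4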

GPU quota error even though the NVIDIA P100 quota was raised from 0 to 1 in both us-west1 and europe-west4

浪尽此生 submitted on 2021-01-29 09:02:27
Question: I just created an account on Google Cloud Platform and am trying to create a VM instance. I have increased my GPU quota in both us-west1 and europe-west4 from 0 to 1, yet when I try to create a VM instance with an NVIDIA P100 it gives me the error: Quota 'GPUS_ALL_REGIONS' exceeded. Limit: 0.0 globally. Any help would be appreciated, and if that GPU is not usable, can you advise a similarly powerful GPU?

Answer 1: As the error says, you need to increase the ALL_REGIONS quota …
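GPUS_ALL_REGIONS is a project-wide quota, separate from the per-region GPU quotas that were already raised; a quick way to confirm its current limit (the project ID is a placeholder):

    # Global quotas, including GPUS_ALL_REGIONS, live on the project, not the region
    gcloud compute project-info describe --project my-project --format="flattened(quotas)"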

How do I stage a GCP/Apache Beam Dataflow template?

夙愿已清 submitted on 2021-01-29 09:00:27
Question: OK, I have to be missing something here. What do I need to stage a pipeline as a template? When I try to stage my template via these instructions, it runs the module but doesn't stage anything; it appears to run as expected without errors, but I don't see any files actually get added to the bucket location listed in my --template_location. Should my Python code be showing up there? I assume so, right? I have made sure I have all the Beam and Google Cloud SDKs installed, but maybe I'm …
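For comparison, a minimal sketch of how a classic template is usually staged (module, project, and bucket names are placeholders): the template file is only written to --template_location when the pipeline is executed with the Dataflow runner and these options set, so running the module without them produces nothing in the bucket:

    python -m my_pipeline_module \
        --runner DataflowRunner \
        --project my-project \
        --region us-central1 \
        --staging_location gs://my-bucket/staging \
        --temp_location gs://my-bucket/temp \
        --template_location gs://my-bucket/templates/my_template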

How do I set up a GCP App Engine instance with CORS?

一笑奈何 submitted on 2021-01-29 08:31:29
Question: I've set up two different GCP App Engine apps. One is an Express server (let's call it foo) with the following app.yaml:

    runtime: nodejs10
    handlers:
    - url: /tasks
      static_dir: /tasks
      http_headers:
        Access-Control-Allow-Origin: https://bar.appspot.com/
      secure: always

From my bar app, I'm trying to do a fetch call:

    const response = await fetch('https://foo.appspot.com/tasks');

Every time I try this, however, Chrome blocks my request with the 'has been blocked by CORS policy: No 'Access-Control …
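One detail worth checking (an assumption based on the snippet, not a confirmed fix): Access-Control-Allow-Origin must match the requesting Origin header exactly, and browsers send the origin without a trailing slash. A sketch of the handler with the exact origin value (static_dir is also written as a path relative to the app directory, which is how App Engine expects it):

    handlers:
    - url: /tasks
      static_dir: tasks
      http_headers:
        # Must equal the Origin header exactly: no trailing slash
        Access-Control-Allow-Origin: https://bar.appspot.com
      secure: always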

Kubernetes - RBAC issue with ingress controller

纵然是瞬间 submitted on 2021-01-29 08:31:10
Question: I'm following a tutorial by Diego Martínez outlining how to use an ingress controller with SSL on K8s. Everything works fine, with the exception of an RBAC error (quoted verbatim from the controller):

    It seems the cluster it is running with Authorization enabled (like RBAC) and there is no permissions for the ingress controller. Please check the configuration

Does anyone know how I can grant RBAC permissions to this resource? I'm running on Google Cloud, and for reference, below is the ingress deployment spec.

Answer 1: If you are …
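The answer is truncated, but on an RBAC-enabled cluster the controller needs a service account bound to appropriate roles, and the controller's Deployment must reference it via serviceAccountName. A minimal sketch (names and namespace are placeholders; the upstream ingress-nginx project ships complete, properly scoped RBAC manifests that are preferable to this broad binding):

    apiVersion: v1
    kind: ServiceAccount
    metadata:
      name: nginx-ingress-serviceaccount
      namespace: ingress-nginx
    ---
    apiVersion: rbac.authorization.k8s.io/v1
    kind: ClusterRoleBinding
    metadata:
      name: nginx-ingress-clusterrole-binding
    roleRef:
      apiGroup: rbac.authorization.k8s.io
      kind: ClusterRole
      name: cluster-admin      # broad for brevity; use a scoped ClusterRole in practice
    subjects:
    - kind: ServiceAccount
      name: nginx-ingress-serviceaccount
      namespace: ingress-nginx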