google-cloud-platform

Answer with an image + button in Dialogflow chat with fulfillment

Submitted by 安稳与你 on 2021-02-11 17:37:52
Question: I'm busy learning more and more about Dialogflow and the different possibilities it has to offer, but I'm stuck at the moment. What do I have now? Via my Dialogflow agent it is currently possible to request the present travel advice from the Dutch government for a specific country. So when a user asks 'give me the travel advice for Spain', Dialogflow will respond with the current travel advice from the government. The data is imported from a Google Sheet. In this…
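
For reference, a rich response with an image and a button can be returned from the fulfillment webhook as `fulfillmentMessages` containing a `card`, per Dialogflow ES's webhook response schema. Below is a minimal Flask sketch; the `ADVICE` table stands in for the questioner's Google Sheet import, and the URLs are placeholders.

```python
# Minimal Dialogflow ES fulfillment sketch (Flask assumed; ADVICE is a hypothetical
# stand-in for the questioner's Google Sheet data).
from flask import Flask, jsonify, request

app = Flask(__name__)

ADVICE = {"Spain": "Only travel if necessary (orange)."}

@app.route("/webhook", methods=["POST"])
def webhook():
    req = request.get_json(silent=True) or {}
    country = req.get("queryResult", {}).get("parameters", {}).get("geo-country", "Spain")
    return jsonify({
        "fulfillmentMessages": [{
            "card": {
                "title": f"Travel advice: {country}",
                "subtitle": ADVICE.get(country, "No advice found."),
                "imageUri": "https://example.com/advice-map.png",  # placeholder image
                "buttons": [{
                    "text": "Full advice",
                    "postback": "https://example.com/full-advice",  # placeholder link
                }],
            }
        }]
    })
```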

Limit access to metadata on GCE instance

Submitted by 主宰稳场 on 2021-02-11 16:59:26
Question: Is there some way to limit access to the internal metadata IP? Background: https://about.gitlab.com/blog/2020/02/12/plundering-gcp-escalating-privileges-in-google-cloud-platform/ When I fetch all the data with curl I can see the email address of my Google account, among other things. I'd like to limit both the data itself and access to it as much as possible. Metadata is required during setup and boot, as far as I know. Is there some way around this, or at least some way to lock down access…
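
For context, everything behind that IP is served by the metadata server at metadata.google.internal, and every request must carry the `Metadata-Flavor: Google` header. A small sketch of what any process on the instance can read (this is what the linked post exploits), following the documented v1 metadata paths:

```python
# Sketch: reading GCE instance metadata from inside the VM.
import requests

METADATA = "http://metadata.google.internal/computeMetadata/v1/"
HEADERS = {"Metadata-Flavor": "Google"}  # required, or the server refuses the request

def get_metadata(path: str) -> str:
    resp = requests.get(METADATA + path, headers=HEADERS, timeout=5)
    resp.raise_for_status()
    return resp.text

# The service-account email the questioner saw via curl:
print(get_metadata("instance/service-accounts/default/email"))
```

Commonly suggested mitigations are reducing what is exposed in the first place (run the instance with no service account, or with minimal scopes) and a host-level firewall rule that restricts which local users may reach 169.254.169.254.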

cloudtasks.CreateTask fails: `lacks IAM permission “cloudtasks.tasks.create”` even though my account has that permission

Submitted by 让人想犯罪 __ on 2021-02-11 16:42:23
Question: I'm following the Creating HTTP Target tasks guide. When I run the code posted below I get this error: cloudtasks.CreateTask: rpc error: code = PermissionDenied desc = The principal (user or service account) lacks IAM permission "cloudtasks.tasks.create" for the resource "projects/my_project/locations/europe-west1/queues/my_queue" (or the resource may not exist). I have signed in with gcloud auth login my@email.com. my@email.com has the following permissions set by my custom Cloud Tasks role:…
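
For comparison, here is a hedged Python sketch of the same CreateTask call (the guide's sample is Go; the names below simply echo the resource path in the error message). Note that the client libraries authenticate with Application Default Credentials, so gcloud auth login alone may not be the identity the code runs as; gcloud auth application-default login writes the ADC file the library actually reads.

```python
# Sketch: creating an HTTP-target task with google-cloud-tasks (names are placeholders
# taken from the error message, not a verified working config).
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my_project", "europe-west1", "my_queue")

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://example.com/task-handler",  # placeholder target
        "body": b"payload",
    }
}

response = client.create_task(request={"parent": parent, "task": task})
print("Created:", response.name)
```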

Why isn't my Colab notebook using the GPU?

Submitted by 与世无争的帅哥 on 2021-02-11 16:28:09
Question: When I run code in my Colab notebook after selecting the GPU runtime, I get a message saying "You are connected to a GPU runtime, but not utilizing the GPU". I understand similar questions have been asked before, but I still don't understand why. I am running PCA on a dataset over hundreds of iterations, for multiple trials. Without a GPU it takes about as long as it does on my laptop, which can be >12 hours, resulting in a timeout on Colab. Is Colab's GPU restricted to machine learning…
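
Worth noting: that warning usually just means the code never touches the GPU. NumPy and scikit-learn (the usual PCA route) run on the CPU regardless of the selected runtime. A hedged sketch of GPU-resident PCA using PyTorch, which Colab preinstalls; the data shape is a placeholder:

```python
# Sketch: PCA on the GPU with torch.pca_lowrank (random data stands in for the real set).
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
X = torch.randn(10_000, 300, device=device)  # placeholder dataset

# pca_lowrank runs on whichever device X lives on; q = components to keep.
U, S, V = torch.pca_lowrank(X, q=50, center=True)
projected = X @ V  # data projected onto the top 50 components
print(projected.shape, projected.device)
```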

Google Cloud SDK Python Client: How to list files inside of a Cloud Storage bucket?

Submitted by 我是研究僧i on 2021-02-11 15:49:57
Question: I'm trying to use Python to get and iterate through all of the files inside a Cloud Storage bucket I own. I'm using the official library, google-cloud-storage. Using gsutil, I can run commands like gsutil ls gs://my-composer-bucket/dags/composer_utils/. Does the google-cloud-storage library offer an equivalent to gsutil ls? I'd like to use the Python client rather than shell out to gsutil (I don't want to install and authenticate the GCloud SDK inside a Docker image). I've tried a…
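
The closest equivalent to gsutil ls in google-cloud-storage is Client.list_blobs with a prefix (plus a delimiter for directory-style listing). A short sketch against the bucket and path from the question:

```python
# Sketch: listing objects under a "directory" with google-cloud-storage.
from google.cloud import storage

client = storage.Client()
blobs = client.list_blobs(
    "my-composer-bucket", prefix="dags/composer_utils/", delimiter="/"
)
for blob in blobs:
    print(blob.name)

# With a delimiter, "subdirectories" appear as prefixes once the iterator is consumed.
print(list(blobs.prefixes))
```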

Cannot load model weights saved to GCP with keras.save_weights; need to transfer to a new bucket to load weights

Submitted by 余生颓废 on 2021-02-11 15:36:00
Question: I am training on Google Colab, with data and model weights loaded from and saved to GCP. I am using a Keras callback to save the weights to GCP. This is what the callback looks like: callbacks = [tf.keras.callbacks.ModelCheckpoint(filepath='gs://mybucket/' + 'savename' + '_loss_{loss:.2f}', monitor='loss', verbose=1, save_weights_only=True, save_freq='epoch')] The training saves the model weights successfully to my GCP bucket, but when I try to load those weights in a new session, the cell just hangs…
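
One workaround, sketched below on the assumption that reading gs:// paths directly is what hangs: copy the checkpoint shards to local disk with tf.io.gfile first, then load from there. The checkpoint prefix is hypothetical, and the model must be rebuilt with the same architecture before load_weights.

```python
# Sketch: copy checkpoint shards locally before loading (paths are hypothetical).
import os
import tensorflow as tf

remote_prefix = "gs://mybucket/savename_loss_0.42"  # hypothetical checkpoint prefix
local_dir = "/tmp/weights"

tf.io.gfile.makedirs(local_dir)
# save_weights_only=True writes <prefix>.index plus <prefix>.data-* shards.
for path in tf.io.gfile.glob(remote_prefix + "*"):
    tf.io.gfile.copy(path, os.path.join(local_dir, os.path.basename(path)), overwrite=True)

# model = build_model()  # same architecture as at training time
# model.load_weights(os.path.join(local_dir, os.path.basename(remote_prefix)))
```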

GCP KMS: encrypting env vars and passing the encrypted key through cloudbuild.yaml to Google App Engine

Submitted by 我们两清 on 2021-02-11 14:46:23
Question: I'm trying to encrypt env vars for a Cloud SQL database in my RoR app deployed to Google App Engine, following this doc: https://cloud.google.com/cloud-build/docs/securing-builds/use-encrypted-secrets-credentials However, I get an error when running both gcloud builds submit and gcloud app deploy. Both error out with: Failure status: UNKNOWN: Error Response: [4] DEADLINE_EXCEEDED / build step 0 "gcr.io/cloud-builders/gcloud" failed: exit status 1. I then check the gcloud builds…
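
For orientation, a sketch of the legacy KMS-based secrets syntax that the linked doc described, with every name a placeholder. Separately, error code [4] DEADLINE_EXCEEDED often just means the build hit its time limit (600s by default), which the top-level timeout field raises.

```yaml
# Sketch: legacy KMS-encrypted secrets in cloudbuild.yaml (all names are placeholders).
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['app', 'deploy']
  secretEnv: ['DB_PASSWORD']   # decrypted into this step's environment

secrets:
- kmsKeyName: 'projects/my-project/locations/global/keyRings/my-ring/cryptoKeys/my-key'
  secretEnv:
    DB_PASSWORD: '<base64 ciphertext from `gcloud kms encrypt`>'

timeout: '1200s'   # default is 600s; DEADLINE_EXCEEDED often means this was exceeded
```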

Streaming upload to Google Storage API when the final stream size is not known

Submitted by 一曲冷凌霜 on 2021-02-11 14:44:11
Question: Google Storage has this great API for resumable uploads: https://cloud.google.com/storage/docs/json_api/v1/how-tos/resumable-upload which I'd like to use to upload a large object in multiple chunks. However, this happens in a stream-processing pipeline where the total number of bytes in the stream is not known in advance. According to the API documentation, you're supposed to use the Content-Range header to tell the Google Storage API that you're done uploading the file, e.g.: PUT…
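
The protocol does allow this: send each intermediate chunk with Content-Range: bytes start-end/* (the server answers 308 Resume Incomplete), and state the total size only on the final request. A hedged Python sketch; the session URI is assumed to come from the usual initiation POST, and the chunking is simplified (intermediate chunks must be multiples of 256 KiB, and an entirely empty stream is not handled).

```python
# Sketch: resumable upload with unknown total size (session_uri obtained from the
# initiation POST; simplified, non-production chunk handling).
import requests

CHUNK = 256 * 1024  # intermediate chunks must be multiples of 256 KiB

def streaming_upload(session_uri, stream):
    offset = 0
    buf = stream.read(CHUNK)
    while True:
        nxt = stream.read(CHUNK)       # look ahead to detect the final chunk
        last = len(nxt) == 0
        end = offset + len(buf) - 1
        total = str(end + 1) if last else "*"  # size is only declared at the end
        resp = requests.put(
            session_uri,
            data=buf,
            headers={"Content-Range": f"bytes {offset}-{end}/{total}"},
        )
        if last:
            return resp                 # 200/201 once the object is finalized
        assert resp.status_code == 308  # "Resume Incomplete": chunk accepted
        offset, buf = end + 1, nxt
```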
