google-cloud-storage

MLflow artifacts: storing artifacts (Google Cloud Storage) but not displaying them in the MLflow UI

Submitted by 混江龙づ霸主 on 2021-02-11 17:38:54
Question: I am working in a Docker environment (docker-compose) with a Jupyter notebook image and a Postgres image for running ML models, and I am using Google Cloud Storage to store the model artifacts. Storing the models in Cloud Storage works fine, but I can't get them to show up in the MLflow UI. I have seen similar problems, but none of the solutions used Google Cloud Storage as the storage location for artifacts. The error message says the following: Unable to list artifacts stored under
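
A common cause is that the container serving the MLflow UI (not just the notebook that logs runs) is missing the google-cloud-storage package and GCP credentials, so it cannot list objects under the gs:// artifact root. Below is a minimal, hypothetical sketch of the client side; the backend-store URI, parameter, and file names are assumptions, not taken from the question.

```python
# Minimal sketch (assumed names): logging an artifact from the notebook container.
# For the UI to list it, the container running `mlflow server` / `mlflow ui` also needs
# the google-cloud-storage package installed and GOOGLE_APPLICATION_CREDENTIALS pointing
# at a service account that can read the artifact bucket.
import mlflow

mlflow.set_tracking_uri("postgresql://mlflow:mlflow@postgres:5432/mlflow")  # assumed backend store

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_artifact("model.pkl")  # stored under the experiment's gs:// artifact location
```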

Google Cloud SDK Python Client: How to list files inside of a Cloud Storage bucket?

Submitted by 我是研究僧i on 2021-02-11 15:49:57
Question: I'm trying to use Python to get and iterate through all of the files inside a Cloud Storage bucket I own. I'm using the official library, google-cloud-storage. Using gsutil, I can run commands like gsutil ls gs://my-composer-bucket/dags/composer_utils/. Does the google-cloud-storage library offer an equivalent to gsutil ls? I'd like to use the Python client rather than shell out to gsutil (I don't want to install and authenticate the gcloud SDK inside a Docker image). I've tried a
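
For reference, a minimal sketch of the rough equivalent of `gsutil ls` with the Python client; the bucket name and prefix are taken from the question, the rest is assumed.

```python
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials

# list_blobs accepts a prefix; delimiter="/" makes it behave like a one-level `ls`
blobs = client.list_blobs("my-composer-bucket",
                          prefix="dags/composer_utils/",
                          delimiter="/")

for blob in blobs:
    print(blob.name)
```

After the iterator has been consumed, its `prefixes` attribute holds the "subdirectories" found under the delimiter, which is useful for mimicking a directory listing.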

Cannot load model weights saved to GCP with keras.save_weights; need to transfer to a new bucket to load weights

Submitted by 余生颓废 on 2021-02-11 15:36:00
Question: I am training on Google Colab with data and model weights loaded from/saved to GCP. I am using a Keras callback to save the weights to GCP. This is what the callback looks like: callbacks = [tf.keras.callbacks.ModelCheckpoint(filepath='gs://mybucket/' + 'savename' + '_loss_{loss:.2f}', monitor='loss', verbose=1, save_weights_only=True, save_freq='epoch')] Training saves the model weights to my GCP bucket successfully, but when I try to load those weights in a new session, the cell just hangs,
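
One workaround often suggested for gs:// checkpoints that hang on load is to copy the checkpoint files to local disk with tf.io.gfile and load from there. A minimal sketch, with the checkpoint prefix and model builder as assumptions:

```python
import tensorflow as tf

ckpt_prefix = "gs://mybucket/savename_loss_0.12"   # hypothetical checkpoint prefix
local_prefix = "/tmp/savename_loss_0.12"

# A TF checkpoint is several files sharing the prefix (.index, .data-00000-of-00001, ...)
for path in tf.io.gfile.glob(ckpt_prefix + "*"):
    tf.io.gfile.copy(path, local_prefix + path[len(ckpt_prefix):], overwrite=True)

model = build_model()            # hypothetical: rebuild the same architecture used in training
model.load_weights(local_prefix)
```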

Cannot upload object to Google Cloud Storage using signed url and HTTP PUT

Submitted by 末鹿安然 on 2021-02-11 15:31:18
Question: I'm trying to upload an object to Google Cloud Storage using a signed URL. I'm generating the URL using a Java method, as follows (with the help of this link): public String generatePutObjectSignedUrl(String bucketName, String objectName, String contentType) throws GenericAttachmentException { if (!bucketExists(bucketName)) { createBucket(bucketName); } // Define resource BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, objectName)).build(); // Generate signed URL Map
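
The question generates the URL in Java; as a hedged illustration of the same flow, here is a Python sketch with assumed bucket and object names. One detail that commonly breaks the PUT is the Content-Type: the value sent in the upload request must match the content type the URL was signed for.

```python
import datetime
import requests
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("uploads/report.pdf")  # assumed bucket/object names

signed_url = blob.generate_signed_url(
    version="v4",
    expiration=datetime.timedelta(minutes=15),
    method="PUT",
    content_type="application/pdf",
)

with open("report.pdf", "rb") as f:
    resp = requests.put(signed_url, data=f, headers={"Content-Type": "application/pdf"})
resp.raise_for_status()
```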

Streaming upload to Google Storage API when the final stream size is not known

Submitted by 一曲冷凌霜 on 2021-02-11 14:44:11
Question: So Google Storage has this great API for resumable uploads: https://cloud.google.com/storage/docs/json_api/v1/how-tos/resumable-upload which I'd like to use to upload a large object in multiple chunks. However, this is done in a stream-processing pipeline where the total number of bytes in the stream is not known in advance. According to the API documentation, you're supposed to use the Content-Range header to tell the Google Storage API that you're done uploading the file, e.g.: PUT
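
For the unknown-size case, the protocol lets you send intermediate chunks with a total of "*" and only state the total on the final chunk. A minimal sketch with requests; the session URI is a placeholder:

```python
import requests

# Session URI returned by the initiation POST (uploadType=resumable); placeholder value.
session_uri = "https://storage.googleapis.com/upload/storage/v1/b/my-bucket/o?uploadType=resumable&upload_id=..."

def send_chunk(data, start, total=None):
    """Upload one chunk. `total` is passed only for the final chunk; otherwise "*" is sent."""
    end = start + len(data) - 1
    size = str(total) if total is not None else "*"
    headers = {
        "Content-Length": str(len(data)),
        "Content-Range": f"bytes {start}-{end}/{size}",
    }
    # Intermediate chunks return 308 (Resume Incomplete); the final chunk returns 200/201.
    return requests.put(session_uri, data=data, headers=headers)
```

Per the documentation, every chunk except the last should be a multiple of 256 KiB, so a streaming pipeline typically buffers input until it either has a full chunk or the stream ends.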

Authorization and authentication mechanism in GCP

Submitted by 廉价感情. on 2021-02-11 14:21:54
Question: I want to create a Udemy-like video platform where a user can see all videos but can only watch the videos they have purchased. I am making a REST call from an Angular application to get the videos from the storage bucket, using Firebase Authentication. In my GET request to the storage bucket I am passing the access token that I got from Firebase authn. Can this access token be used to determine the scope of the user's access to a video in the bucket? Assume I have given read access for a video
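
A Firebase ID token is not by itself an OAuth access token scoped to individual objects in a bucket. One common pattern is to verify the ID token in a backend (for example a Cloud Function), check the purchase, and return a short-lived signed URL for just that video. A minimal sketch; the bucket name and the purchase lookup are hypothetical:

```python
import datetime
from firebase_admin import auth, initialize_app
from google.cloud import storage

initialize_app()
storage_client = storage.Client()

def get_video_url(id_token, video_path):
    decoded = auth.verify_id_token(id_token)                 # raises if the token is invalid
    if not user_has_purchased(decoded["uid"], video_path):   # hypothetical purchase lookup
        raise PermissionError("video not purchased")
    blob = storage_client.bucket("my-videos-bucket").blob(video_path)
    return blob.generate_signed_url(version="v4",
                                    expiration=datetime.timedelta(minutes=10),
                                    method="GET")
```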

How to resolve class file for com.google.cloud.Service not found

Submitted by 拜拜、爱过 on 2021-02-11 14:13:57
Question: I am trying to upload JSON data to GCS. As I have not used Google Cloud before, I started by uploading a random String to GCS, but I got stuck right at the beginning while creating a Storage service object. Maven dependency: <dependency> <groupId>com.google.cloud</groupId> <artifactId>google-cloud-storage</artifactId> <version>1.70.0</version> </dependency> import com.google.cloud.storage.*; Storage storage = StorageOptions.getDefaultInstance().getService(); BlobId blobId = BlobId.of("bucket
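
The question is in Java, and the missing com.google.cloud.Service class usually points at a Maven classpath/dependency-resolution problem rather than the code itself. For comparison only, a minimal sketch of the same operation, uploading a JSON string as an object, with the Python client; the bucket and object names are assumptions:

```python
import json
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("data/example.json")  # assumed names

payload = json.dumps({"hello": "world"})
blob.upload_from_string(payload, content_type="application/json")
```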

BigQuery displaying wrong results - Duplicating data from Cloud Function?

Submitted by 对着背影说爱祢 on 2021-02-11 12:48:53
Question: I am a junior developer and I was in charge of adding the Facebook API to an existing project. However, the business team figured out that the Google Analytics results displayed in BigQuery are wrong, and they asked me to fix it. This is the architecture: What I have done so far: on BigQuery, I checked how close or far the results are from Google Analytics. I found there is a pattern: the results I get in BigQuery are always 1, 2, or 3 times the original GA value. I checked if there
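
Results that come out as exactly 1x, 2x, or 3x the GA value often indicate the same rows being inserted more than once, for example when the Cloud Function is retried or triggered multiple times for the same payload. If the function streams rows into BigQuery, one mitigation is to pass a deterministic insert ID per row so BigQuery can de-duplicate on a best-effort basis. A minimal sketch with an assumed table and schema:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.analytics.ga_sessions"            # hypothetical table

rows = [{"session_id": "abc123", "pageviews": 7}]        # hypothetical payload
row_ids = [row["session_id"] for row in rows]            # stable id: retried inserts are dropped as dupes

errors = client.insert_rows_json(table_id, rows, row_ids=row_ids)
assert not errors, errors
```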

Download JSON file from Firebase Storage

Submitted by 懵懂的女人 on 2021-02-11 12:29:20
Question: Let me restate my question. I have a Vue.js app that uploads a JSON file to a bucket in my Firebase Storage. I need to download the file to use in my app, but when I try to download a JSON file from Firebase Storage I get an "Uncaught (in promise) undefined" error. What I would like to do is download the file back into an object to use in my app. Here is my code to download and read the file: const storage = firebase.storage() const storageRef = storage.ref() const objectRef = storageRef.child(
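
The question concerns the Firebase JS SDK in the browser, where the usual route is getDownloadURL followed by a fetch of that URL. As a hedged aside, reading the same JSON object into a dictionary server-side with the Python client looks like this; the bucket and object names are assumptions:

```python
import json
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-project.appspot.com").blob("uploads/config.json")  # assumed names

data = json.loads(blob.download_as_bytes())  # the analogue of parsing the file into a JS object
print(data)
```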