google-cloud-platform

How to add a new column with a metadata value to a CSV when loading it into BigQuery

Submitted by 不打扰是莪最后的温柔 on 2021-01-29 11:59:37
Question: A daily CSV file arrives in my bucket on Google Cloud Storage, and I built a function that loads this CSV and appends it to a table in BigQuery when it comes in. However, I want to add a new column to the CSV containing the function execution id (context["id"]) before I load the data into BigQuery. Is that possible? Thanks in advance!

```python
def TimeTableToBigQuery(data, context):
    # Getting metadata about the uploaded file, the storage and datetime of insert
    execution_id = context['event_id']
    bucketname =
```
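One way to approach this, sketched below under the assumption of a Cloud Storage-triggered background function (the bucket handling, column name, and my_dataset.time_table destination are placeholders, not the asker's actual code): read the uploaded CSV into memory, append an execution_id column taken from context.event_id, and load the result into BigQuery with the client library.

```python
import csv
import io

from google.cloud import bigquery, storage


def TimeTableToBigQuery(data, context):
    execution_id = context.event_id   # id of this function execution
    bucket_name = data['bucket']      # bucket that fired the event
    file_name = data['name']          # name of the uploaded CSV

    # Read the uploaded CSV from Cloud Storage.
    blob = storage.Client().bucket(bucket_name).blob(file_name)
    rows = list(csv.reader(io.StringIO(blob.download_as_text())))

    # Append the execution id as a new column: header first, then every data row.
    rows[0].append('execution_id')
    for row in rows[1:]:
        row.append(execution_id)

    # Serialize the widened CSV and load it into BigQuery.
    out = io.StringIO()
    csv.writer(out).writerows(rows)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    bigquery.Client().load_table_from_file(
        io.BytesIO(out.getvalue().encode('utf-8')),  # the client expects a binary file object
        'my_dataset.time_table',                     # placeholder destination table
        job_config=job_config,
    ).result()
```

Rewriting the file in memory keeps the function simple; for very large files it would be better to write the modified CSV back to Cloud Storage and point a load job at the new object instead.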

Updating a deployment - uploaded images get deleted after redeployment to Google Cloud

Submitted by 社会主义新天地 on 2021-01-29 11:42:06
Question: I have a Node.js web app with a folder for storing images uploaded by users from a mobile app. I upload an image to the folder by taking its base64 string and saving it with fs.writeFile, like this:

```javascript
fs.writeFile(
    __dirname + '/../images/complaintImg/complaintcase_' + data.cID + '.jpg',
    Buffer.from(data.complaintImage, 'base64'),
    function (err) {
        if (err) {
            console.log(err);
        } else {
            console.log("success");
        }
    }
);
```

The problem is, whenever the app is redeployed, the previously uploaded images are deleted.

How to secure a connection to a Google database using a public IP (0.0.0.0)?

Submitted by 你。 on 2021-01-29 11:26:06
Question: I have created software that connects to a Google database using its public IP, but because my network changes, my system's public IP keeps changing too. So for convenience I allowed the IP 0.0.0.0, but that is not secure, and being new to this I don't know any other option. Is there a way to secure it?

Answer 1: There are several ways to set up a secure connection to a Cloud SQL instance. Use SSL certs to connect to the instance and enable the "Allow only SSL connections" option, or use Private IP. I understand that
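As a rough illustration of the first suggestion (SSL certificates), here is a minimal Python sketch using mysql-connector-python; the host, credentials, database name, and PEM file names are placeholders, and the three certificate files are the ones downloadable from the instance's Connections page.

```python
# Minimal sketch: connect to a Cloud SQL (MySQL) instance over SSL.
# All values below are placeholders, not taken from the question.
import mysql.connector

conn = mysql.connector.connect(
    host='34.68.0.10',            # the instance's public IP
    user='app_user',
    password='app_password',
    database='app_db',
    ssl_ca='server-ca.pem',       # server CA certificate
    ssl_cert='client-cert.pem',   # client certificate
    ssl_key='client-key.pem',     # client private key
)
cursor = conn.cursor()
cursor.execute('SELECT NOW()')
print(cursor.fetchone())
conn.close()
```

With "Allow only SSL connections" enabled, the instance rejects clients that do not present these certificates, which makes a broad authorized-network range like 0.0.0.0/0 less dangerous; narrowing the range or switching to Private IP is still preferable where possible.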

GCP write-only access to a bucket (GCS)

Submitted by 試著忘記壹切 on 2021-01-29 11:15:34
Question: We are trying to create a different bucket for each source system and give each system access only to dump data into its own bucket. They should not have read access, i.e. they shouldn't be able to see what is inside the bucket. Is that doable, and if yes, how?

Answer 1: You are probably looking for the roles/storage.objectCreator role (take a look at IAM roles for Storage): "Allows users to create objects. Does not give permission to view, delete, or overwrite objects."

Answer 2: You can create a custom role
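For reference, a minimal sketch of granting that role on a single bucket with the Python client library; the bucket name and service-account address are placeholders.

```python
# Minimal sketch: bind roles/storage.objectCreator on one bucket to one
# service account, allowing uploads but not listing, reading, or overwriting.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket('source-system-a-dump')   # placeholder bucket name

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    'role': 'roles/storage.objectCreator',
    'members': {'serviceAccount:source-a@my-project.iam.gserviceaccount.com'},
})
bucket.set_iam_policy(policy)
```

Because the binding is attached to one bucket rather than the project, each source system gets objectCreator permission only on its own bucket.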

Google Cloud Storage bucket.get_blob on a verified file path returns None

Submitted by 狂风中的少年 on 2021-01-29 11:07:49
Question: I can verify that finished_json_path exists in bucket_name, but I get a finished_json_blob value of None when running this code. Any insights really appreciated!

```python
bucket_name = mdata_list[0]
org_repo = mdata_list[3]
pull_number = mdata_list[4]
job_name = mdata_list[5]
build_number = mdata_list[6]
prlogs_pull_dir = bucket_name + "/pr-logs/pull"
prlogs_directory_dir = bucket_name + "/pr_logs/directory"
finished_json_path = prlogs_pull_dir + "/" + org_repo + "/" + pull_number +
```
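One frequent cause of get_blob() returning None, offered here as a guess since the question is cut off: the name passed to get_blob() must be the object path relative to the bucket, so building a path that starts with the bucket name (as finished_json_path does above) makes the lookup fail even when the object exists. A minimal sketch with placeholder names:

```python
# Minimal sketch (names are placeholders): the bucket is addressed separately,
# and get_blob() receives only the object name, without the bucket prefix.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket('my-prow-logs')   # bucket name only

# Object path relative to the bucket (no leading bucket name, no leading slash).
finished_json_name = 'pr-logs/pull/my-org_my-repo/123/my-job/456/finished.json'

blob = bucket.get_blob(finished_json_name)
if blob is None:
    print('object not found')
else:
    print(blob.download_as_text())
```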

How to access a Laravel folder in Google Cloud

Submitted by 与世无争的帅哥 on 2021-01-29 10:28:37
Question: I received an error on my Laravel app:

"/app/storage/logs/laravel-2020-02-20.log" could not be opened: failed to open stream: Permission denied

So I googled the error and discovered it is a permission error. Apparently this fixes it:

```
chmod -R 775 storage
chmod -R 775 bootstrap/cache
```

but anytime I try to run the commands in the Cloud Console, it keeps saying "No directory found". Can someone advise me on how to access the storage directory?

Answer 1: From the tags, I can tell it is GAE flex on GCP.

Set default metadata for all future uploaded objects in a GCP bucket

Submitted by 房东的猫 on 2021-01-29 10:23:50
Question: I would like to know how I can set default metadata for all future uploaded objects. I am trying to set "Cache-Control: public, max-age=3600" as a header on every object in a bucket that hosts a static website. For the existing objects I used the command from the guide to set the metadata, but I can't find a way to set it by default for future uploaded objects. P.S. Developers are using the GCP Console to upload the objects, and I recently realized that when they upload the updated HTML files (which
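If some of the uploads can go through a script instead of the Console, the header can at least be set programmatically at upload time, and patched onto objects that were uploaded without it. A minimal sketch with placeholder names (this is a per-object workaround, not a bucket-wide default):

```python
# Minimal sketch (bucket and file names are placeholders): set Cache-Control
# on an object during upload, and patch it onto an already-uploaded object.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket('my-static-site')

# Set the header as part of the upload.
blob = bucket.blob('index.html')
blob.cache_control = 'public, max-age=3600'
blob.upload_from_filename('index.html', content_type='text/html')

# Or fix it on an object that was already uploaded (e.g. via the Console).
existing = bucket.get_blob('about.html')
existing.cache_control = 'public, max-age=3600'
existing.patch()
```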

Deploying an app on Google App Engine - Error from requirements.txt

Submitted by 三世轮回 on 2021-01-29 10:13:32
Question: I can't deploy an app on Google App Engine. The problem seems to come from my requirements.txt file: when I deploy the example building-an-app-1, it works well, but when I replace the original requirements.txt file with mine, it doesn't work*. Here is my requirements.txt:

```
Flask==1.1.1
flask-wtf==0.14.2
unidecode
numpy
openfoodfacts
os
```

When I remove the last two packages, it works. What is the problem?

* gcloud app deploy returns:

File upload done.
Updating service [default]...failed.
ERROR:
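A likely explanation, offered as a guess since the full error is cut off: os is a Python standard-library module rather than a PyPI package, so pip cannot install it and the dependency-installation step of the deployment fails. Removing that line gives a requirements.txt along these lines:

```
Flask==1.1.1
flask-wtf==0.14.2
unidecode
numpy
openfoodfacts
```

If the deployment still fails after that, openfoodfacts would be the next line to check, but the os entry alone is guaranteed not to resolve.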

How to load my pickled ML model from GCS into Dataflow/Apache Beam

Submitted by 爷,独闯天下 on 2021-01-29 10:07:07
Question: I've developed an Apache Beam pipeline locally where I run predictions on a sample file. Locally, on my computer, I can load the model like this:

```python
with open('gs://newbucket322/my_dumped_classifier.pkl', 'rb') as fid:
    gnb_loaded = cPickle.load(fid)
```

but when running on Google Dataflow that obviously doesn't work. I tried changing the path to GS:// but that also obviously does not work. I also tried this code snippet (from here) that was used to load files:

```python
class ReadGcsBlobs(beam.DoFn):
    def
```
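One portable way to do this, sketched below as an assumption rather than the accepted answer: Beam's FileSystems API can open gs:// paths both locally and on Dataflow (with apache-beam[gcp] installed), and loading the pickle once in DoFn.setup() avoids re-reading it for every element. The GCS path is the one from the question; everything else is illustrative.

```python
# Minimal sketch: load a pickled model from GCS inside a DoFn using Beam's
# FileSystems abstraction, so the same code works locally and on Dataflow.
import pickle

import apache_beam as beam
from apache_beam.io.filesystems import FileSystems


class PredictDoFn(beam.DoFn):
    def __init__(self, model_path):
        self._model_path = model_path
        self._model = None

    def setup(self):
        # setup() runs once per worker instance, so the model is read only once.
        with FileSystems.open(self._model_path) as f:
            self._model = pickle.load(f)

    def process(self, element):
        yield self._model.predict([element])


# Usage inside a pipeline (illustrative):
# predictions = records | beam.ParDo(
#     PredictDoFn('gs://newbucket322/my_dumped_classifier.pkl'))
```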

YouTube Data API Error 403 Insufficient Permission

Submitted by 怎甘沉沦 on 2021-01-29 09:57:50
Question: I'm working on an iOS app in Swift that needs access to the user's YouTube subscriptions. After adding all the scopes I need on Google Cloud Platform and implementing GoogleSignIn through Firebase, I sign in (GIDSignInButton) and then get the error shown below after making this request: https://www.googleapis.com/youtube/v3/subscriptions?part=snippet%2CcontentDetails&mine=true&key=\(apiKey)&access_token=\(Access_Token). The Access_Token is the one I got after calling: func sign(_ signIn: