google-cloud-platform

Impossible to set the Cloud Firebase daily spending limit

怎甘沉沦 Submitted on 2021-02-08 05:55:52
Question: I just purchased a Firebase Blaze (pay-as-you-go) plan, and I tried to set up the daily spending limit as suggested in the documentation: https://firebase.google.com/docs/firestore/quotas#set_spending_limits The problem is that I can't find this option in my own console; there is no "Set budget" section. I really need to set up this daily alert to avoid any bad surprises. Why am I not seeing this option? Thanks.
Answer 1: Well, it seems they removed the possibility... The documentation page (https://firebase
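The excerpt is cut off before the answer finishes, but since the fixed daily spending cap was removed, the usual replacement is a Cloud Billing budget with alert notifications, optionally paired with a Cloud Function that reacts to the budget's Pub/Sub messages. Below is a minimal, hedged sketch of that pattern; the project ID, the handler name and the drastic choice to detach billing entirely are assumptions for illustration, not details from the question.

# Sketch of a Pub/Sub-triggered Cloud Function reacting to Cloud Billing budget
# notifications. Assumes a budget exists with Pub/Sub notifications enabled;
# the project name below is a placeholder.
import base64
import json

from googleapiclient import discovery

PROJECT_NAME = "projects/my-project-id"  # assumption: replace with your project


def handle_budget_alert(event, context):
    """Triggered by a Pub/Sub message published by a Cloud Billing budget."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    cost = payload.get("costAmount", 0)
    budget = payload.get("budgetAmount", 0)

    if cost <= budget:
        print(f"Spend {cost} is within budget {budget}; nothing to do.")
        return

    # Drastic option: detach the billing account so the project stops accruing
    # charges. Only do this if you accept that services will be disabled.
    billing = discovery.build("cloudbilling", "v1", cache_discovery=False)
    billing.projects().updateBillingInfo(
        name=PROJECT_NAME, body={"billingAccountName": ""}
    ).execute()
    print(f"Spend {cost} exceeded budget {budget}; billing disabled.")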

How to establish a private connection between Google App Engine and Compute Engine?

左心房为你撑大大i Submitted on 2021-02-08 05:28:14
Question: I have a web app/API that is currently running on Google App Engine. As the API's calculations are very compute intensive, I have outsourced the computational part to a managed, auto-scaling Google Compute Engine instance group with an HTTP load balancer in front (to maintain a single IP address and to balance load across the instances that are dynamically spawned). Currently, I just make an HTTP call from App Engine to the load balancer's IP address. As the GAE and
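The excerpt stops before the actual question, but the common way to keep this traffic private is to place the instance group behind an internal load balancer and reach it from App Engine through a Serverless VPC Access connector. Assuming that setup, the App Engine side of the call barely changes beyond targeting the internal address; a hedged sketch (the IP, endpoint path and timeout are placeholders, not values from the question):

# Sketch of the App Engine side, assuming a Serverless VPC Access connector is
# configured for the service (e.g. via vpc_access_connector in app.yaml) and
# the Compute Engine group sits behind an INTERNAL load balancer.
import requests

INTERNAL_LB_ADDRESS = "http://10.128.0.42"  # placeholder internal IP, not public


def run_heavy_calculation(payload: dict) -> dict:
    """Forward the compute-intensive work to the backend over the private VPC."""
    response = requests.post(
        f"{INTERNAL_LB_ADDRESS}/calculate",  # hypothetical endpoint on the backend
        json=payload,
        timeout=300,  # generous timeout: the backend does heavy work
    )
    response.raise_for_status()
    return response.json()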

How can I get the billing for a VM Instance in GCP?

好久不见. Submitted on 2021-02-08 05:01:58
Question: I have a project in Google Cloud in which I have multiple instances running, and I have a billing account for the organisation. I want to create a VM instance, say vm-01, run it for a couple of hours and then delete it, and I want to get the exact cost that vm-01 incurred during that period using API calls. Is it possible?
Answer 1: In your GCP Web Console, go to the Billing page, then to the Billing export section. You can export the detailed billing either to BigQuery or to a file, and you can then search the billing item
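Once the billing export to BigQuery is enabled, the per-resource cost can be pulled with a query against the export table. A minimal sketch using the google-cloud-bigquery client, assuming the detailed usage cost export (which includes resource names) and placeholder project, dataset and table identifiers:

# Sketch: query the Cloud Billing BigQuery export for the cost of one VM.
# Assumes the *detailed* usage cost export is enabled; names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project-id")  # placeholder project

QUERY = """
SELECT
  service.description AS service,
  sku.description AS sku,
  SUM(cost) + SUM(IFNULL((SELECT SUM(c.amount)
                          FROM UNNEST(credits) c), 0)) AS total_cost
FROM `my-project-id.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE resource.name LIKE '%vm-01%'
  AND usage_start_time BETWEEN TIMESTAMP('2021-02-01') AND TIMESTAMP('2021-02-08')
GROUP BY service, sku
ORDER BY total_cost DESC
"""

for row in client.query(QUERY).result():
    print(f"{row.service} / {row.sku}: {row.total_cost:.4f}")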

Is there any way to share stateful variables in a Dataflow pipeline?

大兔子大兔子 Submitted on 2021-02-08 04:37:26
Question: I'm building a Dataflow pipeline with Python. I want to share global variables across pipeline transforms and across worker nodes (i.e. across multiple workers). Is there any way to support this? Thanks in advance.
Answer 1: Stateful processing may be of use for sharing state across the workers of a specific step (it would not be able to share state between transforms, though): https://beam.apache.org/blog/2017/02/13/stateful-processing.html
Source: https://stackoverflow.com/questions/44432556/is-there
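To make that suggestion concrete, here is a minimal sketch of Beam's stateful processing in the Python SDK: a DoFn that keeps a per-key running count in managed state. Note this is state scoped to one transform, key and window rather than a true pipeline-wide global variable, and the names below are illustrative only.

# Minimal sketch of stateful processing in the Beam Python SDK: a per-key
# counter kept in Beam-managed state. The state is scoped to this DoFn, this
# key and this window, so it is shared by whichever workers process the key
# for this step, but it is not a global visible to other transforms.
from typing import Tuple

import apache_beam as beam
from apache_beam.coders import VarIntCoder
from apache_beam.transforms.userstate import CombiningValueStateSpec


class CountPerKey(beam.DoFn):
    COUNT_STATE = CombiningValueStateSpec("count", VarIntCoder(), combine_fn=sum)

    def process(self, element, count_state=beam.DoFn.StateParam(COUNT_STATE)):
        key, _value = element          # stateful DoFns require keyed input
        count_state.add(1)
        yield key, count_state.read()  # running count seen so far for this key


with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["a", "a", "b"])
        | "Pair" >> beam.Map(lambda w: (w, 1)).with_output_types(Tuple[str, int])
        | "Count" >> beam.ParDo(CountPerKey())
        | "Print" >> beam.Map(print)
    )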

API gateway for microservices with Google Cloud Functions

亡梦爱人 Submitted on 2021-02-08 03:42:25
Question: Inputs: For example, we have a few services - an account service, a product service and a payment service. Each service is a separate Google Cloud Function, and each service has its own HTTP API. For example, the account service has:
https://REGION-FUNCTIONS_PROJECT_ID.cloudfunctions.net/account/sign-up
https://REGION-FUNCTIONS_PROJECT_ID.cloudfunctions.net/account/sign-in
https://REGION-FUNCTIONS_PROJECT_ID.cloudfunctions.net/account/reset-password
etc. Each service has its own Swagger documentation endpoint
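The excerpt ends before the actual question, but one way to illustrate the gateway idea is a single dispatcher Cloud Function that routes requests to the per-service functions by path prefix. This is only a hedged sketch of that pattern (a managed option such as API Gateway or Cloud Endpoints may fit better); the base URL echoes the question's examples and the service names are placeholders.

# Hedged sketch of a single "gateway" HTTP Cloud Function that fronts the
# per-service functions and routes by path prefix. Routing idea only; if the
# target functions require authentication, an identity token would also need
# to be attached (omitted here).
import requests
from flask import Request, Response

BASE = "https://REGION-FUNCTIONS_PROJECT_ID.cloudfunctions.net"
SERVICES = {"account", "product", "payment"}  # assumed service prefixes


def gateway(request: Request) -> Response:
    """HTTP Cloud Function entry point: forward /<service>/<rest> to that service."""
    parts = request.path.lstrip("/").split("/", 1)
    service = parts[0] if parts else ""
    if service not in SERVICES:
        return Response("Unknown service", status=404)

    upstream = f"{BASE}/{service}/{parts[1] if len(parts) > 1 else ''}"
    resp = requests.request(
        method=request.method,
        url=upstream,
        headers={k: v for k, v in request.headers if k.lower() != "host"},
        data=request.get_data(),
        params=request.args,
        timeout=30,
    )
    return Response(resp.content, status=resp.status_code,
                    content_type=resp.headers.get("Content-Type", "application/octet-stream"))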

How to upload images to GCS bucket with multer and NodeJS?

不打扰是莪最后的温柔 Submitted on 2021-02-08 01:57:40
Question: I'm facing issues uploading local images to my Google Cloud Storage bucket. I've already tried two methods. The first one is uploading with multer:

var storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, './uploads/')
  },
  filename: (req, file, cb) => {
    cb(null, file.fieldname + '-' + Date.now())
  }
});
var upload = multer({storage: storage}).single('image');

app.post('/upload', function(req, res, next) {
  upload(req, res, (err) => {
    if (err) {
      console.log(err)
    } else {
      console.log(req
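The excerpt stops before the error message and the second method, so only a general illustration is possible here. The underlying operation, pushing a locally saved file into a GCS bucket, looks like the sketch below; note that it uses the Python google-cloud-storage client rather than the Node.js/multer stack from the question, and the bucket and file names are placeholders.

# Minimal sketch of the underlying operation: upload a locally saved file to a
# GCS bucket. Python google-cloud-storage is used here purely for illustration
# (the question itself uses Node.js + multer); names are placeholders.
from google.cloud import storage


def upload_to_bucket(bucket_name: str, local_path: str, destination_name: str) -> str:
    """Upload a local file to the bucket and return its gs:// URI."""
    client = storage.Client()  # uses application default credentials
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_name)
    blob.upload_from_filename(local_path)
    return f"gs://{bucket_name}/{destination_name}"


if __name__ == "__main__":
    print(upload_to_bucket("my-bucket", "./uploads/image-123.png", "images/image-123.png"))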

Google Cloud SDK code to execute via cron

試著忘記壹切 Submitted on 2021-02-07 21:02:27
Question: I am trying to automate shutting down and starting VM instances in my Google Cloud account via crontab. The OS is Ubuntu 12 LTS and it is set up with a Google service account so it can read/write to my Google Cloud account. My actual code is in the file /home/ubu12lts/cronfiles/resetvm.sh:

#!/bin/bash
echo Y | gcloud compute instances stop my-vm-name --zone us-central1-a
sleep 120s
gcloud compute instances start my-vm-name --zone us-central1-a
echo "completed"

When I call
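The excerpt ends before the actual failure, but a common variant of this task is to drive the stop/start from a small Python script instead of the gcloud CLI, which sidesteps PATH and prompt issues under cron. A hedged sketch using the google-cloud-compute client library (the project ID is a placeholder; zone and instance mirror the question's example):

# Hedged Python alternative to the bash/gcloud script: stop the VM, wait, then
# start it again, using the google-cloud-compute client library. Running this
# from cron avoids depending on the gcloud CLI being on cron's PATH.
import time

from google.cloud import compute_v1

PROJECT = "my-project-id"   # placeholder
ZONE = "us-central1-a"
INSTANCE = "my-vm-name"


def reset_vm() -> None:
    instances = compute_v1.InstancesClient()  # uses the VM's service account
    instances.stop(project=PROJECT, zone=ZONE, instance=INSTANCE)
    time.sleep(120)           # same 120 s pause as the shell script
    instances.start(project=PROJECT, zone=ZONE, instance=INSTANCE)
    print("completed")


if __name__ == "__main__":
    reset_vm()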

How to schedule a job to execute a Python script in the cloud to load data into BigQuery?

爷,独闯天下 Submitted on 2021-02-07 20:34:56
Question: I am trying to set up a scheduled job/process in the cloud to load CSV data from a Google Cloud Storage bucket into BigQuery using a Python script. I have managed to get hold of the Python code to do this, but I'm not sure where I need to put this code so that the task runs as an automated process rather than me running the gsutil commands manually.
Answer 1: Reliable Task Scheduling on Google Compute Engine | Solutions | Google Cloud Platform, the first link in Google for "google cloud schedule a cron job",
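For reference, the load step itself is short with the google-cloud-bigquery client; only the scheduling around it (cron on a Compute Engine VM, Cloud Scheduler triggering a function, etc.) differs between the options the answer points at. A minimal sketch with placeholder bucket, dataset and table names:

# Minimal sketch: load CSV files from a GCS bucket into a BigQuery table.
# Bucket, dataset and table names are placeholders, not values from the question.
from google.cloud import bigquery


def load_csv_to_bigquery() -> None:
    client = bigquery.Client()
    table_id = "my-project-id.my_dataset.my_table"  # placeholder

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,        # assume a header row
        autodetect=True,            # or pass an explicit schema instead
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/data/*.csv", table_id, job_config=job_config
    )
    load_job.result()  # wait for the load to finish
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")


if __name__ == "__main__":
    load_csv_to_bigquery()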