google-cloud-build

Where is the global limit on gcloud build step timeouts set for all builds?

Submitted by 瘦欲@ on 2021-02-10 23:05:28

Question: Where can I find the global limit for the gcloud build step timeout? This is my gcloud build config:

steps:
- name: 'gcr.io/cloud-builders/yarn'
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build-nginx-config']
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build']
  timeout: 3601s
...
timeout: 7200s

And this is what I get when I try to run this build:

[10:41:45] ERROR: (gcloud.builds.submit) INVALID_ARGUMENT: invalid build: invalid timeout in build step #2: build step timeout "1h0m1s" must be <= build
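The error itself spells out the rule being checked: each per-step timeout must be less than or equal to the overall build timeout, which defaults to 10 minutes (600s) when no top-level timeout is given. A minimal sketch of the presumably intended layout, with the 7200s value at the top level of cloudbuild.yaml rather than inside a step (the indentation is the assumption here; the step shown is only a placeholder):

steps:
- name: 'gcr.io/cloud-builders/yarn'
  args: ['build']
  timeout: 3601s   # per-step cap; must be <= the build-level timeout below
timeout: 7200s     # build-level cap for the whole build; defaults to 600s if omitted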

Issue using in-cluster Kubernetes configuration with the client-go library on Google Cloud Build

Submitted by 扶醉桌前 on 2021-02-10 12:23:08

Question: I'm having a bit of a challenge trying to build my app, which uses the Go client-go library. The app provides an API that then deploys a pod to a Kubernetes cluster. The app is able to deploy a pod successfully if I use an out-of-cluster Kubernetes config (i.e. minikube), which is found in $HOME/.kube/config. See the code below, which determines which config to use depending on the config path:

package kubernetesinterface

import (
    "log"
    "os"

    core "k8s.io/api/core/v1"
    v1 "k8s.io

GCP Cloud Function - ERROR fetching storage source during build/deploy

Submitted by 南楼画角 on 2021-02-07 05:48:46

Question: Running into problems building and deploying functions. When trying to programmatically deploy the function I get the following output (errors) in the builder logs:

2020-10-20T02:22:12.155866856Z starting build "1fc13f51-28b6-4052-9a79-d5d0bef9ed5c" I
2020-10-20T02:22:12.156015831Z FETCHSOURCE I
2020-10-20T02:22:12.156031384Z Fetching storage object: gs://gcf-sources-629360234120-us-central1/${FUNCTIONNAME}-63f501f1-a8d2-4837-b992-1173ced83036/version-1/function-source.zip#1603160527600655 I
2020-10

How can I grant the account permission to list enabled APIs?

Submitted by 纵饮孤独 on 2021-02-06 15:49:35

Question: During a build on Cloud Build, I get the following warning:

Step #2: WARNING: Unable to verify that the Appengine Flexible API is enabled for project [xxxxx]. You may not have permission to list enabled services on this project. If it is not enabled, this may cause problems in running your deployment. Please ask the project owner to ensure that the Appengine Flexible API has been enabled and that this account has permission to list enabled APIs.

I'd like to get rid of this warning to get a

How to authenticate to GCP from a containerized Dockerfile

Submitted by 自作多情 on 2021-01-29 09:50:43

Question: I am trying to build a new Docker image dynamically using a Cloud Build trigger job; however, I fail to see how to safely retrieve my credentials to authenticate against GCP with a service account. Here are the steps:

- A Dockerfile is created with the steps to build a Docker image. One of the steps includes downloading a file from a Google Storage bucket that I need to access as a GCP service account.
- The Docker image is built by using a Cloud Build trigger that is triggered after each change in the linked
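One pattern that keeps credentials out of the Dockerfile entirely is to fetch the object in a separate build step, which already runs as the build's service account, and let the Dockerfile COPY it from the build context. A sketch under that assumption (the bucket path, file name, and image name below are placeholders, not the asker's actual values):

steps:
# Fetch the file with the Cloud Build service account; no key file is baked into the image.
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['cp', 'gs://my-bucket/needed-file', 'needed-file']
# The Dockerfile can now simply COPY needed-file from the build context.
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-image', '.']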

Google Cloud Platform: Cloud Build rebuilds the Cloud Function but the content is not updated

Submitted by 孤人 on 2021-01-29 09:13:50

Question: I put the file on GitHub and connected it with Google Cloud Source Repositories. Below is the .yaml file. When I update my index.js file, Cloud Build rebuilds the Cloud Function, but why doesn't the content get updated? Setting up the Cloud Function manually works.

steps:
- name: 'gcr.io/cloud-builders/yarn'
  args: ['install']
  dir: 'functions/autodeploy'
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['functions', 'deploy', 'function-1', '--trigger-http', '--runtime', 'nodejs10', '--entry-point', 'firstci']
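A detail worth noting in the config above: the gcloud deploy step has no dir and no --source, so it runs from the repository root, and the source that gets uploaded may not be the folder that was just built. A sketch of the deploy step with the working directory pinned (the dir value and the --source flag reflect an assumed repo layout, not a confirmed fix):

- name: 'gcr.io/cloud-builders/gcloud'
  dir: 'functions/autodeploy'       # run the deploy from the function's folder
  args: ['functions', 'deploy', 'function-1',
         '--trigger-http', '--runtime', 'nodejs10',
         '--entry-point', 'firstci',
         '--source', '.']           # upload this folder as the function source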

How do I use Google Secrets Manager to create a docker ARG in Google Cloud Build?

Submitted by 一个人想着一个人 on 2021-01-29 07:05:00

Question: I'm doing a build on GCB in which I need to install private dependencies, so I am using Google Secrets Manager. My cloudbuild.yaml looks like this:

steps:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: 'bash'
  args: [ '-c', "gcloud secrets versions access latest --secret=PERSONAL_ACCESS_TOKEN_GITHUB --format='get(payload.data)' | tr '_-' '/+' | base64 -d > decrypted-pat.txt" ]
- name: 'gcr.io/cloud-builders/docker'
  args:
  - build
  - '--build-arg'
  - PERSONAL_ACCESS_TOKEN_GITHUB=$(cat decrypted
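One thing to keep in mind with the second step: Cloud Build passes args straight to the builder without a shell, so $(cat ...) is never expanded. A sketch of the built-in Secret Manager integration that avoids the temp file altogether (the secret name is taken from the question, the image tag is a placeholder, and $$ is how a literal $ is written in cloudbuild.yaml so that bash, not Cloud Build, expands the variable):

steps:
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - '-c'
  - 'docker build --build-arg PERSONAL_ACCESS_TOKEN_GITHUB="$$TOKEN" -t gcr.io/$PROJECT_ID/my-image .'
  secretEnv: ['TOKEN']
availableSecrets:
  secretManager:
  - versionName: projects/$PROJECT_ID/secrets/PERSONAL_ACCESS_TOKEN_GITHUB/versions/latest
    env: 'TOKEN'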

Cloud build service account permission to build

Submitted by 若如初见. on 2021-01-29 04:32:01

Question: I have my environment set up with the Cloud Build GitHub app to provision Terraform through Cloud Build to Google Cloud Platform. The build is a simple Cloud Composer setup with Cloud Functions that creates these resources along with the right service accounts and members. However, only the Owner permission can execute this successfully, and I want least privilege for the Cloud Build service account. I have tried a lot of roles and nothing seems to be successful, i.e. create service account, editor,

Problem with data transfer from Cloud Build container to Google Compute Engine instance

Submitted by 一个人想着一个人 on 2021-01-28 06:09:01

Question: Currently I'm using Cloud Build to produce some artifacts that I need to deploy to a GCE instance. I've tried to use the gcloud builder for this purpose with the following args:

- name: 'gcr.io/cloud-builders/gcloud'
  args: ['compute', 'scp', '--zone=<zone_id>', '<local_path>', '<google compute engine instance name>:<instance_path>']

and the build fails with the following error:

ERROR: (gcloud.compute.scp) Could not SSH into the instance. It is possible that your SSH key has not propagated to the
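The error text hints at timing: the first scp from a fresh build environment generates a new SSH key and writes it to instance or project metadata, and that key can take a little while to propagate. One commonly suggested workaround, sketched here with the same placeholders as above (and assuming the Cloud Build service account already has the compute and service-account-user permissions needed to push SSH keys), is to run the copy through bash with a small retry loop:

- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    for i in 1 2 3 4 5; do
      gcloud compute scp --zone=<zone_id> <local_path> <instance_name>:<instance_path> && exit 0
      echo "scp failed, retrying in 15s..."
      sleep 15
    done
    exit 1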

GCP Cloud build ignores timeout settings

Submitted by 旧时模样 on 2021-01-27 14:15:53

Question: I use Cloud Build to copy the configuration file from storage and deploy the app to App Engine flex. The problem is that the build fails every time it lasts more than 10 minutes. I've specified a timeout in my cloudbuild.yaml but it looks like it is ignored. Also, I configured app/cloud_build_timeout and set it to 1000. Could somebody explain to me what is wrong here? My cloudbuild.yaml looks like this:

steps:
- name: gcr.io/cloud-builders/gsutil
  args: ["cp", "gs://myproj-dev
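The 10-minute mark matches two separate defaults: the outer Cloud Build build has a default timeout of 600s unless a top-level timeout is set, and an App Engine flex deploy starts its own inner build governed by app/cloud_build_timeout, a gcloud property that needs to be in effect in the step that actually runs the deploy. A sketch combining both, with illustrative values and a bare gcloud app deploy standing in for the real deploy step:

steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # Applies to the inner App Engine flex build started by the deploy.
    gcloud config set app/cloud_build_timeout 1600
    gcloud app deploy
timeout: 1600s   # lifts the outer build's default 10-minute cap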