google-cloud-repository

Service account does not have storage.buckets.list access to project while pushing images to GCR via GitLab CI

三世轮回 submitted on 2020-08-10 05:01:52
Question: I am using GitLab CI to build Docker images and to push them to GCR. My script goes like this:

```yaml
build:
  image: google/cloud-sdk
  services:
    - docker:dind
  stage: build
  cache:
  script:
    - echo "$GCP_SERVICE_KEY" > gcloud-service-key.json # Google Cloud service accounts
    - gcloud auth activate-service-account --key-file gcloud-service-key.json
    - gcloud auth configure-docker --quiet
    - gcloud config set project $GCP_PROJECT_ID
    - echo ${IMAGE_NAME}:${IMAGE_TAG}
    - PYTHONUNBUFFERED=1 gcloud builds submit
```
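This error usually means the service account activated in the job lacks Cloud Storage permissions in the target project: `gcloud builds submit` stages the source in a GCS bucket before building, and pushing to GCR writes to a registry storage bucket. A minimal sketch of granting the missing roles; the project ID and service-account email below are placeholders, and the exact roles to grant are an assumption:

```sh
# Placeholder names: replace my-project and ci-builder@... with your own.
# cloudbuild.builds.editor lets the account submit builds; storage.admin
# covers listing/creating the staging bucket and writing GCR layers.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:ci-builder@my-project.iam.gserviceaccount.com" \
  --role="roles/cloudbuild.builds.editor"
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:ci-builder@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```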

Python cloud function is not updated after rebuild

天大地大妈咪最大 submitted on 2020-06-01 07:44:05
Question: Steps to reproduce:

1. Enable these APIs: Cloud Source Repositories, Cloud Build, and Cloud Functions.
2. Create a repository and push the content from here into it. It is a simple Python Flask app returning simple HTML, with a cloudbuild.yaml file.
3. Create a Cloud Function from the created repository, named la-repo-function-1 (the name referred to in the cloudbuild.yaml file), using Python 3.7, an HTTP trigger, and greetings_http as the function to execute.
4. Create a Cloud Build trigger on that repo and point it to use
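A Cloud Build trigger that only rebuilds the source does not, by itself, update an already deployed function; the build has to redeploy it explicitly. A minimal sketch of the deploy command such a build step would run, reusing the function name and entry point from the question (the runtime and source flags are assumptions about this setup):

```sh
# Sketch of an explicit redeploy; inside a cloudbuild.yaml trigger this
# would run in a cloud-sdk build step. Flags assume the question's setup:
# Python 3.7 runtime, HTTP trigger, entry point greetings_http,
# source in the repository root.
gcloud functions deploy la-repo-function-1 \
  --runtime=python37 \
  --trigger-http \
  --entry-point=greetings_http \
  --source=.
```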

Git Large File Storage with Google Cloud Storage

丶灬走出姿态 submitted on 2019-12-10 22:18:44
Question: I am part of a project where we use a git repository hosted on Google Cloud Source Repositories. Right now we use Google Cloud Storage to store raw and processed data. Everyone involved in the project downloads the data and places it locally in the ./data folder, which is .gitignore-d. I would prefer to use Git LFS instead, but it is required that if the data has to be stored somewhere externally, it may only be in GCS. Is it possible to configure git LFS, Google Cloud Source Repository and
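Git LFS talks to an LFS API server, and neither Cloud Source Repositories nor a bare GCS bucket exposes that API, so one pattern is to run a separate LFS server that uses a GCS bucket as its backing store. A minimal sketch of pointing a repository at such a server; the server URL below is hypothetical:

```sh
# Hypothetical LFS server URL; the server itself would store objects in GCS.
git lfs install
git config -f .lfsconfig lfs.url "https://lfs.example.com/my-project/my-repo"
# Track the data files that currently live in ./data
git lfs track "data/**"
git add .lfsconfig .gitattributes
```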