gsutil

gsutil ServiceException: 401 Anonymous caller does not have storage.objects.list access to bucket even though I'm logged in to gcloud

余生长醉, submitted on 2021-02-15 11:16:00
Question: I am trying to create an internal app to upload files to Google Cloud. I don't want each individual user of this app to log in, so I'm using a service account. I log in with the service account and everything is OK, but when I try to upload it gives me this error: ServiceException: 401 Anonymous caller does not have storage.objects.list access to bucket. As you can see I am logged in with both the service account and my personal account, and neither (service nor personal) works. Answer 1: I had a similar problem, and as …
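
A minimal sketch of the upload path for such an internal app, assuming the google-cloud-storage Python client is an option and a service account key file is available (the key file, bucket, and object names below are placeholders, not from the question):

    # Upload with explicit service-account credentials so the request is never anonymous.
    # "sa-key.json", "my-bucket" and "reports/data.csv" are placeholder names.
    from google.cloud import storage

    client = storage.Client.from_service_account_json("sa-key.json")
    bucket = client.bucket("my-bucket")
    blob = bucket.blob("reports/data.csv")
    blob.upload_from_filename("data.csv")  # local file -> gs://my-bucket/reports/data.csv

If the 401 persists with gsutil itself, a common cause is that gsutil is not picking up the activated credentials; re-running gcloud auth activate-service-account --key-file=sa-key.json, or using the gsutil bundled with the Cloud SDK rather than a standalone install, is a usual first step.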

Why does gsutil cp require storage.objects.delete on versioned bucket?

喜夏-厌秋, submitted on 2021-02-11 06:52:09
Question: I'm using a service account to upload a file to a Google Cloud Storage bucket that has versioning enabled. I want to keep the service account's privileges minimal; it only ever needs to upload files, so I don't want to give it permission to delete files. But the upload fails (only after streaming everything!) saying it requires delete permission. Shouldn't it be creating a new version instead of deleting? Here's the command: cmd-that-streams | gsutil cp -v - gs://my-bucket/${FILE}
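
One hedged workaround, assuming the google-cloud-storage Python client can replace the gsutil pipe (whether this actually sidesteps the delete-permission requirement on a versioned bucket is an assumption, not something the question confirms):

    # Stream stdin into the bucket as a single object; key file, bucket name
    # and the FILE environment variable are placeholders.
    import os
    import sys
    from google.cloud import storage

    client = storage.Client.from_service_account_json("sa-key.json")
    blob = client.bucket("my-bucket").blob(os.environ["FILE"])
    blob.chunk_size = 8 * 1024 * 1024  # chunked resumable upload for a non-seekable stream
    blob.upload_from_file(sys.stdin.buffer)

Used as, for example, cmd-that-streams | python upload.py, where upload.py is the sketch above.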

Listing all public links for all objects in a bucket using gsutil

妖精的绣舞, submitted on 2021-02-08 15:05:15
Question: Is there a way to list all public links for all the objects stored in a Google Cloud Storage bucket (or a directory in a bucket) using the Cloud SDK's gsutil or gcloud? Something like: $ gsutil ls --public-link gs://my-bucket/a-directory Answer 1: Public links for publicly visible objects are predictable. They just match this pattern: https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME . gsutil doesn't have a command to print URLs for objects in a bucket, but it can list the objects. You could …
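
Following the answer's URL pattern, a small sketch that lists objects under a prefix and prints their public URLs (the bucket and prefix names are placeholders, and it assumes the objects are already publicly readable):

    from google.cloud import storage

    client = storage.Client()
    for blob in client.list_blobs("my-bucket", prefix="a-directory/"):
        # public_url follows https://storage.googleapis.com/<bucket>/<object>
        print(blob.public_url)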

How do you successfully invoke gsutil rsync from a python script?

谁都会走, submitted on 2021-02-08 08:10:50
Question: I am trying to execute the command gsutil -m rsync s3://input gs://output from Python. When I run it in a shell terminal it works fine. However, when I run it from a Python script with subprocess.Popen(["gsutil", "-m", "rsync", "s3://input", "gs://output"]) it just hangs forever. It outputs the following: Building synchronization state... Starting synchronization... The bash command successfully prints: Building synchronization state... …
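
One approach to try, sketched here without knowing the actual cause of the hang, is subprocess.run, which waits for gsutil to exit and captures its output and errors explicitly:

    import subprocess

    result = subprocess.run(
        ["gsutil", "-m", "rsync", "s3://input", "gs://output"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"gsutil rsync failed: {result.stderr}")

If Popen must be used, calling .communicate() on the returned process object drains the output pipes and blocks until the sync finishes.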

How to schedule a job to execute a Python script in the cloud to load data into BigQuery?

爷，独闯天下, submitted on 2021-02-07 20:34:56
Question: I am trying to set up a scheduled job/process in the cloud to load CSV data into BigQuery from Google Cloud Storage buckets using a Python script. I have managed to get hold of the Python code to do this, but I'm not sure where I need to put this code so that the task runs as an automated process rather than by running the gsutil commands manually. Answer 1: Reliable Task Scheduling on Google Compute Engine | Solutions | Google Cloud Platform, the first link on Google for "google cloud schedule a cron job", …
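
Whatever scheduler is chosen (cron on a Compute Engine instance, as in the linked solution, or another trigger), the load step itself can be a short script using the BigQuery client. This is only a sketch: it assumes the google-cloud-bigquery package and uses placeholder project, dataset, and URI names.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,
        skip_leading_rows=1,  # assumes the CSV has a header row
    )
    job = client.load_table_from_uri(
        "gs://bucket/foldername/something.csv",
        "my-project.dataset.tableName",
        job_config=job_config,
    )
    job.result()  # block until the load job finishes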

Command to import multiple files from Cloud Storage into BigQuery

£可爱£侵袭症+, submitted on 2021-01-29 09:36:54
Question: I've figured out that this command lists the paths of all the files: gsutil ls "gs://bucket/foldername/*.csv" This command imports a file into BQ and auto-detects the schema: bq load --autodetect --source_format=CSV dataset.tableName gs://bucket/foldername/something.csv Now I need to make them work together to import all the files into their respective tables in BQ. If a table exists, it should be replaced. Could you give me a hand? Answer 1: First, create a file with the list of all the folders you want to load into BigQuery: …
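
A sketch that combines the listing and loading steps in Python rather than shell, assuming one table per CSV file named after the file (placeholder bucket and dataset names; WRITE_TRUNCATE gives the "replace if the table exists" behaviour):

    import os
    from google.cloud import bigquery, storage

    bq = bigquery.Client()
    gcs = storage.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # replace existing table
    )
    for blob in gcs.list_blobs("bucket", prefix="foldername/"):
        if not blob.name.endswith(".csv"):
            continue
        table = os.path.splitext(os.path.basename(blob.name))[0]  # table named after the file
        job = bq.load_table_from_uri(
            f"gs://bucket/{blob.name}", f"dataset.{table}", job_config=job_config
        )
        job.result()  # wait for each load before starting the next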