gsutil

How can I use gsutil with multiple accounts?

Submitted by 房东的猫 on 2019-12-09 07:30:27
Question: I frequently use Google Cloud Storage with at least two accounts: personal@gmail.com and work@corp.com. I used gsutil config to create .boto files for both accounts, which I've renamed to personal.boto and work.boto. It is tiring to have to remember to type cp personal.boto ~/.boto whenever I need to switch between these accounts. Is there a better way?
Answer 1: The Google Cloud SDK now includes the gcloud tool, which lets you log in to and easily switch between accounts: $ gcloud auth list
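The gcloud flow the answer points at can be sketched as follows; the two account names are the asker's examples and stand in for your own:

```shell
# Log in once per account; gcloud stores credentials for both.
gcloud auth login personal@gmail.com
gcloud auth login work@corp.com

# Show all stored credentials; the active account is marked with '*'.
gcloud auth list

# Switch the active account before running gsutil.
gcloud config set account work@corp.com
```

After switching, gsutil (when installed as part of the Cloud SDK) picks up the active gcloud credentials, so no .boto file shuffling is needed.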

AccessDeniedException: 403 when trying to copy file to Google Storage from VM using gsutil

Submitted by 穿精又带淫゛_ on 2019-12-08 02:08:09
Question: I'm trying to do the steps manually before automating the process, to understand how it works and to make sure I get all the commands straight. But when I run gsutil cp file_name gs://bucket_name/ I get the following error: AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation. It was supposed to be a very simple thing, but I can't get it right. I'm used to doing this in AWS, but I haven't been able to do the same in Google Cloud. Does anyone know
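A hedged sketch of diagnosing and fixing this, assuming the VM is authenticating via the default Compute Engine service account (whose OAuth scopes are fixed when the VM is created); INSTANCE_NAME and the storage-rw scope alias are illustrative choices:

```shell
# From inside the VM, ask the metadata server which scopes the
# instance's access token actually carries (only reachable on GCE).
curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"

# If no storage scope is listed, stop the VM and attach a broader
# scope, then start it again (requires the instance to be stopped).
gcloud compute instances stop INSTANCE_NAME
gcloud compute instances set-service-account INSTANCE_NAME --scopes=storage-rw
gcloud compute instances start INSTANCE_NAME
```

Alternatively, authenticating inside the VM as a user or dedicated service account (gcloud auth login / gcloud auth activate-service-account) sidesteps the instance scopes entirely.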

How to remove extension name from multiple files in google cloud storage?

Submitted by 天涯浪子 on 2019-12-08 02:01:48
Question: I have files like gs://bucketname/thumbnails/12321.jpg gs://bucketname/thumbnails/44666.jpg gs://bucketname/thumbnails/89774.jpg gs://bucketname/thumbnails/63333.jpg ... in Google Cloud Storage, and I want the end result to be gs://bucketname/thumbnails/12321 gs://bucketname/thumbnails/44666 gs://bucketname/thumbnails/89774 gs://bucketname/thumbnails/63333 ... I couldn't find an appropriate gsutil command for this.
Answer 1: gsutil doesn't have built-in support for what you're trying to do. You
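One way to script it, in the spirit of the answer, is to list the objects and strip the suffix in the shell; a sketch using the bucket and path from the question. Note gsutil has no server-side rename, so each mv is a copy followed by a delete:

```shell
# List every .jpg under thumbnails/ and rename each object to the
# same URL without the extension.
gsutil ls gs://bucketname/thumbnails/*.jpg | while read -r url; do
  gsutil mv "$url" "${url%.jpg}"
done
```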

Google Cloud Storage: How to get list of new files in bucket/folder using gsutil

Submitted by 柔情痞子 on 2019-12-07 09:47:45
Question: I have a bucket/folder into which a lot of files arrive every minute. How can I read only the new files, based on file timestamp? E.g.: list all files with timestamp > my_timestamp.
Answer 1: This is not a feature that gsutil or the GCS API provides, as there is no way to list objects by timestamp. Instead, you could subscribe to new objects using the GCS Cloud Pub/Sub feature.
Answer 2: You could use some bash-fu: gsutil ls -l gs://your-bucket-name | sort -k2n | tail -n1 | awk 'END {$1=$2=""; sub
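Building on Answer 2's idea, a sketch that filters the long listing by a cutoff instead of taking only the newest entry. It relies on the fact that ISO-8601 timestamps (column 2 of gsutil ls -l) compare correctly as plain strings; the bucket name and cutoff are placeholders. As Answer 1 notes, the filtering happens client-side after a full listing:

```shell
# Print the URL (column 3) of every object whose timestamp
# (column 2) is newer than the cutoff.
gsutil ls -l gs://your-bucket-name \
  | awk -v ts="2019-12-01T00:00:00Z" '$2 > ts {print $3}'
```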

Can I use gsutil with my local development server?

Submitted by 馋奶兔 on 2019-12-07 09:36:37
Question: I'm developing a Google App Engine application that uses Cloud Storage. I want a base set of files on Cloud Storage that is shared by every user of the application. I know I can use gsutil to copy these files to the production server, but I would also like to test my application on my local development server, so I need these files in the dev cloud storage as well. I can't find any way to copy the files. Is there a way to use gsutil to copy files to the development server's cloud

Reading appengine backup_info file gives EOFError

Submitted by 不想你离开。 on 2019-12-06 12:26:19
I'm trying to inspect my App Engine backup files to work out when a data corruption occurred. I used gsutil to locate and download the file: gsutil ls -l gs://my_backup/ > my_backup.txt gsutil cp gs://my_backup/LongAlphaString.Mymodel.backup_info file://1.backup_info I then created a small Python program that attempts to read the file and parse it using the App Engine libraries. #!/usr/bin/python APPENGINE_PATH='/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/' ADDITIONAL_LIBS = [ 'lib/yaml/lib' ] import sys sys.path

Pipe gsutil output to file

Submitted by 放肆的年华 on 2019-12-06 11:50:36
Greetings StackOverflow, I'm working on a small project on Windows which needs to read the output of gsutil's copy function. The problem is that the output of the copy function doesn't seem to go to standard output. gsutil's behavior is also inconsistent: piping output doesn't work with the copy function, but it does work with the list function. When I use the following command in my command prompt, the output is displayed in the command prompt but not redirected to the text file. This command doesn't work right: C:\gsutil> python gsutil cp "file://C:/test_files/*" gs://gs_teststore/ >
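A likely explanation is that gsutil writes its copy progress and per-file status to stderr rather than stdout, so redirecting only stdout misses it. A sketch of capturing both streams; the same redirection syntax works in cmd.exe, PowerShell, and bash (paths are the asker's):

```shell
# '2>&1' merges stderr into stdout, so both end up in output.txt.
python gsutil cp "file://C:/test_files/*" gs://gs_teststore/ > output.txt 2>&1
```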


Mass rename objects on Google Cloud Storage

Submitted by 喜夏-厌秋 on 2019-12-06 05:00:46
Is it possible to mass-rename objects on Google Cloud Storage using gsutil (or some other tool)? I am trying to figure out a way to rename a bunch of images from *.JPG to *.jpg.
gsutil supports URI wildcards: https://cloud.google.com/storage/docs/gsutil/addlhelp/WildcardNames
EDIT: From the gsutil 3.0 release notes: "As part of the bucket sub-directory support we changed the * wildcard to match only up to directory boundaries, and introduced the new ** wildcard..." Do you have directories under the bucket? If so, you may need to go down into each directory, or use **: gsutil -m mv gs://my_bucket/**.JPG gs://my
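Because a wildcard in the destination of mv is not expanded per-object, one way to change the extension on every object is to loop over a listing instead; a sketch assuming the bucket name from the question:

```shell
# ** matches across directory boundaries, so this covers objects
# nested under sub-directories as well.
gsutil ls gs://my_bucket/**.JPG | while read -r url; do
  gsutil mv "$url" "${url%.JPG}.jpg"
done
```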

Change storage class of (existing) objects in Google Cloud Storage

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-06 01:30:28
Good day! I recently learnt of the new storage tiers and reduced prices announced for the Google Cloud Storage service, so I wanted to change the default storage class for one of my buckets from Durable Reduced Availability to Coldline, as that is what's appropriate for the files I'm archiving in that bucket. I got this note, though: "Changing the default storage class only affects objects you add to this bucket going forward. It does not change the storage class of objects that are already in your bucket." Any advice/tips on how I can change the class of all existing objects in the
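For objects that are already in the bucket, newer gsutil versions provide a rewrite command that changes an object's storage class in place; a sketch, with the bucket name as a placeholder:

```shell
# -s sets the target storage class, -m parallelizes, and ** matches
# every object, including those under sub-directories.
gsutil -m rewrite -s coldline gs://your-bucket/**
```

Note that rewrite re-writes the object data, so normal operation charges (and, for some classes, early-deletion charges) may apply.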