gsutil

Google Cloud Storage: How to get list of new files in bucket/folder using gsutil

Submitted by 怎甘沉沦 on 2019-12-05 17:04:26
I have a bucket/folder into which a lot of files arrive every minute. How can I read only the new files, based on file timestamp? For example: list all files with timestamp > my_timestamp.

This is not a feature that gsutil or the GCS API provides, as there is no way to list objects by timestamp. Instead, you could subscribe to new objects using the GCS Cloud Pub/Sub feature.

Alternatively, you could use some bash-fu to find the most recently modified object:

gsutil ls -l gs://your-bucket-name | sort -k2 | tail -n1 | awk 'END {$1=$2=""; sub(/^[ \t]+/, ""); print }'

Breaking that down:

gsutil ls -l gs://your-bucket-name  # grab a detailed list of objects
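The same listing can also be filtered for the "timestamp > my_timestamp" case the question asks about. A sketch of that client-side filtering (the listing itself still costs a full `gsutil ls -l`; the sample lines and bucket name below are hypothetical):

```python
from datetime import datetime, timezone

def newer_than(listing_lines, cutoff):
    """Yield object URLs from `gsutil ls -l` lines newer than `cutoff`."""
    for line in listing_lines:
        parts = line.split()
        if len(parts) != 3:  # skip the trailing "TOTAL: ..." summary line
            continue
        size, stamp, url = parts
        ts = datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
        if ts > cutoff:
            yield url

sample = [
    "    1024  2019-12-05T10:00:00Z  gs://your-bucket-name/old.txt",
    "    2048  2019-12-05T12:00:00Z  gs://your-bucket-name/new.txt",
    "TOTAL: 2 objects, 3072 bytes (3 KiB)",
]
cutoff = datetime(2019, 12, 5, 11, 0, tzinfo=timezone.utc)
print(list(newer_than(sample, cutoff)))  # → ['gs://your-bucket-name/new.txt']
```

Note this is polling, not notification; for anything latency-sensitive the Pub/Sub route above is the better fit.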

Can I use gsutil with my local development server?

Submitted by 大城市里の小女人 on 2019-12-05 16:06:52
I'm developing a Google App Engine application that uses Cloud Storage. I want a base set of files in Cloud Storage that are shared by every user of the application. I know I can use gsutil to copy these files to the production server, but I would also like to test the application on my local development server, so I need the files in the dev server's Cloud Storage as well. I can't find any way to copy them. Is there a way to use gsutil to copy files to the development server's Cloud Storage simulation?

We don't currently support the full GCS API in the local dev server.
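Since gsutil cannot talk to the dev server, one workaround is to seed the files from inside the application itself. A sketch, assuming the App Engine Python runtime, where the dev server emulates GCS through the GoogleAppEngineCloudStorageClient library (the path below is hypothetical):

```python
# Outside the App Engine environment the library is absent, so the helper
# degrades to a no-op instead of raising.
try:
    import cloudstorage  # available inside the App Engine SDK/runtime only
except ImportError:
    cloudstorage = None

def seed_file(path, data):
    """Write `data` to a GCS path such as '/my-bucket/shared/base.txt'.

    Returns True on success, False when the cloudstorage library is
    unavailable (i.e. when not running under App Engine)."""
    if cloudstorage is None:
        return False
    with cloudstorage.open(path, "w") as handle:
        handle.write(data)
    return True
```

Run once on dev-server startup, this gives the local GCS emulation the same shared base files that gsutil copies to production.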

gsutil cannot copy to s3 due to authentication

Submitted by 你。 on 2019-12-04 19:17:55
Question: I need to copy many (1000+) files from GCS to S3 to leverage an AWS Lambda function. I have edited ~/.boto.cfg and commented out the two AWS authentication parameters, but even a simple gsutil ls s3://mybucket fails from either a GCE or an EC2 VM. The error is: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. I use gsutil version 4.28, and the GCS and S3 buckets are in US-CENTRAL1 and US East (Ohio) respectively, in case this is relevant.

Google cloud storage - Download file from web

Submitted by ♀尐吖头ヾ on 2019-12-04 06:23:29
I want to use Google Cloud Storage in my next project. My aim is to track various web sites and collect some photos. From reading the gsutil documentation, I know I can manually download a file to my server and then upload it to Google Cloud Storage with gsutil, but downloading and re-uploading this way generates a lot of traffic on my server. Is there a way to have Google Cloud download the file directly over HTTP?

This is very easy to do from the Google Cloud Shell, as long as your download is less than ~4.6 GB. Launch the Cloud Shell (the first icon at the top right after you log in to your project in GCP).
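From the Cloud Shell, the transfer can even be streamed so the file never touches the shell's disk. A sketch (URL and bucket name are hypothetical; the command is guarded so it is skipped where gsutil is not installed):

```shell
# Stream a web file straight into GCS: curl writes to stdout, and
# "gsutil cp -" reads the object data from stdin.
if command -v gsutil >/dev/null 2>&1; then
  curl -L "https://example.com/photos/cat.jpg" | gsutil cp - gs://my-bucket/photos/cat.jpg
fi
```

The `-` argument to `gsutil cp` means "read from stdin", which is what lets the download bypass local storage entirely.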

gsutil cannot copy to s3 due to authentication

Submitted by 徘徊边缘 on 2019-12-04 03:01:58
I need to copy many (1000+) files from GCS to S3 to leverage an AWS Lambda function. I have edited ~/.boto.cfg and commented out the two AWS authentication parameters, but even a simple gsutil ls s3://mybucket fails from either a GCE or an EC2 VM. The error is: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. I use gsutil version 4.28, and the GCS and S3 buckets are in US-CENTRAL1 and US East (Ohio) respectively, in case this is relevant. I am clueless, as the AWS key is valid and I have enabled http/https.
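The error message itself points at the likely cause: newer AWS regions such as US East (Ohio, us-east-2) only accept Signature Version 4, while boto's default signing scheme is older. A plausible fix, sketched here as a Boto config fragment (the endpoint hostname corresponds to us-east-2; adjust for your region), is to force SigV4 in the config file gsutil reads:

```ini
[s3]
# Force AWS Signature Version 4 and pin the regional endpoint;
# boto requires "host" to be set when use-sigv4 is enabled.
use-sigv4 = True
host = s3.us-east-2.amazonaws.com
```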

gsutil returning “no matches found”

Submitted by 筅森魡賤 on 2019-12-04 00:49:23
I'm trying to use gsutil to remove the contents of a Cloud Storage bucket (but not the bucket itself). According to the documentation, the command should be:

gsutil rm gs://bucket/**

However, whenever I run that (with my bucket name substituted, of course), I get the following response:

zsh: no matches found: gs://my-bucket/**

I've checked permissions, and I have owner permissions. Additionally, if I directly specify a file that is in the bucket, it is successfully deleted. Other information which may matter: my bucket name has a "-" in it (similar to "my-bucket").
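The "no matches found" message comes from zsh, not from gsutil: zsh interprets `**` as its own glob and aborts when nothing on the local filesystem matches, so gsutil never runs. Quoting the URL, e.g. `gsutil rm "gs://my-bucket/**"` (or prefixing with `noglob`), passes the wildcard through for gsutil to expand. A minimal sketch of why the quoting matters (bucket name hypothetical):

```shell
# Unquoted, zsh would try to expand ** itself before gsutil ever sees it;
# quoted, the literal pattern survives globbing and reaches the command.
pattern='gs://my-bucket/**'
echo "$pattern"   # prints gs://my-bucket/** unchanged
```

This also explains why deleting a single named file worked: a plain object path contains no glob characters for zsh to trip over.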

How can I use gsutil with multiple accounts?

Submitted by 梦想与她 on 2019-12-03 09:24:54
I frequently use Google Cloud Storage with at least two accounts: personal@gmail.com and work@corp.com. I used gsutil config to create .boto files for both accounts, which I've renamed to personal.boto and work.boto. It is tiring to have to remember to type cp personal.boto ~/.boto whenever I need to switch between these accounts. Is there a better way?

The Google Cloud SDK now includes the gcloud tool, which allows you to log in and easily switch between accounts:

$ gcloud auth list
Credentialed accounts:
- youremail@gmail.com (active)

To set the active account, run:

$ gcloud config set account youremail@gmail.com
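If you want to keep the two .boto files from the question, gsutil also honors the BOTO_CONFIG environment variable, so each invocation can point at a different credentials file with no copying. A sketch for a shell rc file (alias names and paths are hypothetical; the .boto filenames follow the question's setup):

```shell
# Per-account wrappers: BOTO_CONFIG selects the Boto file for a single
# command, leaving ~/.boto untouched.
alias gsutil-personal='BOTO_CONFIG="$HOME/personal.boto" gsutil'
alias gsutil-work='BOTO_CONFIG="$HOME/work.boto" gsutil'
```

After that, `gsutil-work ls` and `gsutil-personal ls` run against the respective accounts side by side.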

Fastest way to get Google Storage bucket size?

Submitted by 纵饮孤独 on 2019-12-03 06:34:43
Question: I'm currently doing this, but it's very slow since I have several terabytes of data in the bucket:

gsutil du -sh gs://my-bucket-1/

And the same for a sub-folder:

gsutil du -sh gs://my-bucket-1/folder

Is it possible to obtain the total size of a complete bucket (or a sub-folder) in some other, much faster fashion?

Answer 1: Unfortunately, no. If you need to know what size the bucket is right now, there's no faster way than what you're doing.
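The reason du is slow is that it must enumerate every object; the total is just the sum of the size column of that enumeration. If a detailed listing is already at hand for other purposes, the same total can be derived from it without a second pass. A sketch assuming `gsutil ls -l`'s "size timestamp url" line format (the sample lines are hypothetical):

```python
def total_bytes(listing_lines):
    """Sum the size column of `gsutil ls -l` output, skipping the summary."""
    return sum(int(line.split()[0])
               for line in listing_lines
               if len(line.split()) == 3)

sample = [
    "    1048576  2019-12-01T00:00:00Z  gs://my-bucket-1/a.bin",
    "    2097152  2019-12-02T00:00:00Z  gs://my-bucket-1/folder/b.bin",
    "TOTAL: 2 objects, 3145728 bytes (3 MiB)",
]
print(total_bytes(sample))  # → 3145728
```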

Installing gsutil on Windows

Submitted by 北慕城南 on 2019-12-02 01:07:54
Question: I'm relatively new to Python, so apologies if this is a dumb question. I'm having trouble installing gsutil on Windows. I'm following the directions here: https://developers.google.com/storage/docs/gsutil_install#specifications I've installed Python 2.7 and unzipped gsutil into C:\gsutil\gsutil. The directions say to run the following command in the Python command prompt:

python gsutil

I'm getting this error:

File "<interactive input>", line 1
    python gsutil
           ^
SyntaxError: invalid syntax

Thanks in advance.
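The `<interactive input>` in the traceback shows the command was typed at the interactive Python prompt. `python gsutil` is a shell command, not Python source, so the interpreter rejects it; it should be run from the Windows command prompt (cmd.exe) instead, with the path from the question's layout, e.g. `C:\> python C:\gsutil\gsutil\gsutil`. The same failure can be reproduced programmatically:

```python
# Compiling "python gsutil" as Python source fails exactly as the
# interactive prompt does, because it is a shell command, not Python code.
try:
    compile("python gsutil", "<interactive input>", "exec")
    failed = False
except SyntaxError:
    failed = True
print(failed)  # → True
```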

Using google cloud storage and gsutil not able to generate valid signedurl

Submitted by …衆ロ難τιáo~ on 2019-12-02 00:09:41
I have been trying to create a signed URL for Google Cloud Storage using gsutil, running Python on my Windows machine. So far I have:

1. Created a service account from Google's Developer Console.
2. Configured the service account from Python by running "gsutil config -e" and supplying my credential information at the console prompts.
3. Tried to create a signed URL for one of my objects with the following command:

python gsutil signurl -d 10m -p notasecret p12file.p12 gs://{my bucket}/{my object}

This produced output beginning with https://storage together with an error message.
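When debugging signurl failures it helps to know what is actually being signed. With the legacy V2 scheme that signurl's .p12 flow uses, gsutil RSA-SHA256-signs a canonical string built from the HTTP verb, optional content headers, an expiration timestamp, and the resource path; a mismatch in any of these (or signing with the wrong service-account email, or a skewed clock) yields an invalid signature. A sketch of that canonical string, with hypothetical bucket and object names (the real command additionally base64-encodes the RSA signature of it into the URL):

```python
def v2_string_to_sign(method, bucket, obj, expires_epoch,
                      content_md5="", content_type=""):
    """Canonical string for a GCS V2 signed URL (no extension headers):
    verb, Content-MD5, Content-Type, expiration, canonicalized resource."""
    return "\n".join([method, content_md5, content_type,
                      str(expires_epoch), "/%s/%s" % (bucket, obj)])

print(v2_string_to_sign("GET", "my-bucket", "my-object", 1575280181))
```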