Error “No URLs matched” when copying Google Cloud bucket data to my local computer?

Submitted by 浪尽此生 on 2019-12-09 17:56:21

Question


I am trying to download a folder that is inside my Google Cloud bucket. I read the Google docs for gsutil/commands/cp and executed the line below:

gsutil cp -r appengine.googleapis.com gs://my-bucket

But I am getting the error:

CommandException: No URLs matched: appengine.googleapis.com

Edit

When running the command below

gsutil cp -r gs://logsnotimelimit .

I get this error:

IOError: [Errno 22] invalid mode ('ab') or filename: u'.\logsnotimelimit\appengine.googleapis.com\nginx.request\2018\03\14\14:00:00_14:59:59_S0.json_.gstmp'


Answer 1:


What is the appengine.googleapis.com parameter in your command? Is that a local directory on your filesystem you are trying to copy to the cloud bucket?

The gsutil cp -r appengine.googleapis.com gs://my-bucket command you provided will recursively copy a local directory named appengine.googleapis.com to your cloud bucket named my-bucket. If that's not what you intend, you need to construct your command differently.

That is, to download a directory named folder from your cloud bucket my-bucket into the current location, try running gsutil cp -r gs://my-bucket/folder .

Update: It appears you're using a Windows machine (note the "\" directory separators instead of "/" in the error message). Since the object names contain the ":" character, which Windows does not allow in filenames, the cp command ends up failing when creating those files, with the error message you're seeing.




Answer 2:


Just wanted to help people out if they run into this problem on Windows. As administrator:

  • Open C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\gsutil\gslib\utils
  • Delete copy_helper.pyc
  • Change the permissions for copy_helper.py to allow writing
  • Open copy_helper.py
  • Go to the function _GetDownloadFile
  • On line 2312 (at time of writing), change the following line
download_file_name = _GetDownloadTempFileName(dst_url)

to the following (the objective is to remove the colons):

download_file_name = _GetDownloadTempFileName(dst_url).replace(':', '-')
  • Go to the function _ValidateAndCompleteDownload
  • On line 3184 (at time of writing), change the following line
final_file_name = dst_url.object_name

to the following (the objective is to remove the colons):

final_file_name = dst_url.object_name.replace(':', '-')
  • Save the file, and rerun the gsutil command
  • FYI, I was using the command gsutil -m cp -r gs://my-bucket/* . to download all my logs, whose names by default contain :, which does not bode well for Windows filenames!

Hope this helps someone. I know it's a somewhat hacky solution, but seeing as you should never have colons in Windows filenames anyway, it's fine to do and forget. Just remember that if you update the Google Cloud SDK you'll have to redo this.



Source: https://stackoverflow.com/questions/49491936/error-no-urls-matched-when-copying-google-cloud-bucket-data-to-my-local-comput
