Google Cloud SDK Python Client: How to list files inside of a Cloud Storage bucket?

Submitted by 我是研究僧i on 2021-02-11 15:49:57

Question


Trying to use Python to get and iterate through all of the files inside of a Cloud Storage bucket I own. I'm using the official library, google-cloud-storage.

Using gsutil, I can run commands like gsutil ls gs://my-composer-bucket/dags/composer_utils/. Does the google-cloud-storage library offer an equivalent method to gsutil ls? I'd like to use the Python client rather than shell out to gsutil (don't want to install and authenticate the GCloud SDK inside of a Docker image).

I've tried a few different things which have left me confused on how blobs work:

>>> # Bucket.blob() takes only the object name, not the bucket
>>> dag_folder_blob = cloud_composer_bucket.blob('dags/')
>>> dag_folder_blob.exists()
True
>>> util_folder_blob = cloud_composer_bucket.blob('dags/composer_utils/')  # directory exists
>>> util_folder_blob.exists()
False
>>> util_file_blob = cloud_composer_bucket.blob('dags/composer_utils/__init__.py')
>>> util_file_blob.exists()
True

Answer 1:


You will want to use the list_blobs method of a Bucket object. Read more about listing objects in Cloud Storage.




Answer 2:


# Replicates: gsutil ls -r gs://<bucket_name>/<prefix>
# (a prefix-only listing is recursive; pass delimiter="/" for
#  non-recursive, gsutil-ls-style output)
from google.cloud import storage

client = storage.Client(project="<project-id>")
bucket = client.bucket("<bucket_name>")
for blob in bucket.list_blobs(prefix="<prefix>"):
    print(blob.name)
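One detail worth knowing: a prefix-only `list_blobs` call is recursive, while `gsutil ls` stops at the first `/` and reports sub-folders separately. Passing `delimiter="/"` to `list_blobs` restores that directory-style view: blobs at the current level come back in the iterator, and sub-"directories" in its `prefixes` attribute. The split it performs can be sketched in plain Python (the helper and sample names below are hypothetical, for illustration only):

```python
def ls(names, prefix, delimiter="/"):
    """Split a recursive listing into immediate children and sub-prefixes,
    emulating list_blobs(prefix=..., delimiter='/')."""
    blobs, prefixes = [], set()
    for name in names:
        if not name.startswith(prefix) or name == prefix:
            continue
        rest = name[len(prefix):]
        if delimiter in rest:
            # Anything deeper than one level collapses into a sub-prefix.
            prefixes.add(prefix + rest.split(delimiter)[0] + delimiter)
        else:
            blobs.append(name)
    return blobs, sorted(prefixes)

names = [
    "dags/airflow_monitoring.py",
    "dags/composer_utils/__init__.py",
    "dags/composer_utils/helpers.py",
]
print(ls(names, "dags/"))
# (['dags/airflow_monitoring.py'], ['dags/composer_utils/'])
```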


Source: https://stackoverflow.com/questions/55636889/google-cloud-sdk-python-client-how-to-list-files-inside-of-a-cloud-storage-buck
