Question
I'm trying to use Python to get and iterate through all of the files inside a Cloud Storage bucket I own. I'm using the official library, google-cloud-storage.
Using gsutil, I can run commands like gsutil ls gs://my-composer-bucket/dags/composer_utils/. Does the google-cloud-storage library offer an equivalent of gsutil ls? I'd like to use the Python client rather than shell out to gsutil (I don't want to install and authenticate the Google Cloud SDK inside a Docker image).
I've tried a few different things which have left me confused on how blobs work:
>>> dag_folder_blob = cloud_composer_bucket.blob('dags/')
>>> dag_folder_blob.exists()
True
>>> util_folder_blob = cloud_composer_bucket.blob('dags/composer_utils/')  # directory exists
>>> util_folder_blob.exists()
False
>>> util_file_blob = cloud_composer_bucket.blob('dags/composer_utils/__init__.py')
>>> util_file_blob.exists()
True
Answer 1:
You will want to use the list_blobs method of a Bucket object. Read more about listing objects in Cloud Storage.
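Part of the confusion in the question comes from the fact that Cloud Storage has no real directories: an object name like dags/composer_utils/__init__.py is a single flat key, and a "folder" blob only exists() if some tool created a zero-byte placeholder object with that exact name. list_blobs(prefix=...) simply returns every object whose name starts with the prefix. A pure-Python sketch of that prefix matching (no GCS calls; the object names below are hypothetical):

```python
# Hypothetical flat listing of a bucket. Note that "dags/" exists as a
# zero-byte placeholder object, while "dags/composer_utils/" does not --
# which is why exists() can return True for one "folder" and False for
# another even though both appear to hold files.
object_names = [
    "dags/",
    "dags/my_dag.py",
    "dags/composer_utils/__init__.py",
    "dags/composer_utils/helpers.py",
]

def ls(names, prefix):
    """Return every object name that starts with prefix, the way
    list_blobs(prefix=...) matches objects in a flat namespace."""
    return [n for n in names if n.startswith(prefix)]

print(ls(object_names, "dags/composer_utils/"))
# → ['dags/composer_utils/__init__.py', 'dags/composer_utils/helpers.py']
```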
Answer 2:
# replicating command: gsutil ls gs://<bucketName>/<prefix>
from google.cloud import storage

bucket = storage.Client(<proj>).bucket(<bucketName>)
for blob in bucket.list_blobs(prefix=<prefix>):
    print(blob.name)
Source: https://stackoverflow.com/questions/55636889/google-cloud-sdk-python-client-how-to-list-files-inside-of-a-cloud-storage-buck