How to upload folder on Google Cloud Storage using Python API

Submitted by 痞子三分冷 on 2021-02-18 11:20:11

Question


I have successfully uploaded a single text file to Google Cloud Storage. But when I try to upload a whole folder, it gives a permission-denied error.

filename = "d:/foldername"   # here foldername is the folder


Error:
Traceback (most recent call last):
  File "test1.py", line 142, in <module>
    upload()
  File "test1.py", line 106, in upload
    media = MediaFileUpload(filename, chunksize=CHUNKSIZE, resumable=True)
  File "D:\jatin\Project\GAE_django\GCS_test\oauth2client\util.py", line 132, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "D:\jatin\Project\GAE_django\GCS_test\apiclient\http.py", line 422, in __init__
    fd = open(self._filename, 'rb')
IOError: [Errno 13] Permission denied: 'd:/foldername'

Answer 1:


This works for me. It recursively copies all content from a local directory to a specific bucket-name/full-path in Google Cloud Storage:

import glob
import os

from google.cloud import storage

def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    """Recursively upload everything under local_path to gcs_path in the given bucket."""
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            # Sub-directory: recurse, extending the destination prefix.
            upload_local_directory_to_gcs(local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            # File: strip the local prefix so the bucket mirrors the local layout.
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)


upload_local_directory_to_gcs(local_path, bucket, BUCKET_FOLDER_DIR)
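Here local_path, bucket and BUCKET_FOLDER_DIR have to be defined first; a minimal setup sketch (the bucket name and paths are placeholders, not values from the question):

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket('my-bucket')    # placeholder bucket name
local_path = 'd:/foldername'               # the local directory to upload
BUCKET_FOLDER_DIR = 'foldername'           # destination prefix inside the bucket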



Answer 2:


A folder is a cataloging structure containing references to files and directories. The library will not accept a folder as an argument.

As far as I understand, your use case is to upload to GCS while preserving the local folder structure. To accomplish that, you can use the Python os module and write a recursive function (e.g. process_folder) that takes a path as an argument. The following logic can be used (see the sketch after the two lists below):

  1. Use the os.listdir() method to get a list of objects within the source path (it returns both files and folders).
  2. Iterate over the list from step 1 to separate files from folders via the os.path.isdir() method.
  3. Iterate over the files and upload each with an adjusted path (e.g. path + "/" + file_name).
  4. Iterate over the folders, making a recursive call (e.g. process_folder(path + "/" + folder_name)).

It’ll be necessary to work with two paths:

  1. The real system path (e.g. "/Users/User/…/upload_folder/folder_name") used with the os module.
  2. The virtual path for GCS file uploads (e.g. "upload" + "/" + folder_name + "/" + file_name).
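
A minimal sketch of that logic, using the google-cloud-storage client for the actual upload (an assumption; the answer itself is library-agnostic, and the bucket object and paths are placeholders):

import os

def process_folder(bucket, system_path, gcs_path):
    """Recursively upload system_path, preserving the folder structure under gcs_path."""
    for name in os.listdir(system_path):                 # step 1: list files and folders
        local_item = os.path.join(system_path, name)
        if os.path.isdir(local_item):                    # step 2: separate files from folders
            # step 4: recurse with extended real and virtual paths
            process_folder(bucket, local_item, gcs_path + "/" + name)
        else:
            # step 3: upload the file under the adjusted virtual path
            bucket.blob(gcs_path + "/" + name).upload_from_filename(local_item)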

Don't forget to implement the exponential backoff referenced at [1] to deal with 500 errors. You can use the Drive SDK example at [2] as a reference; a minimal backoff sketch follows the links below.

[1] - https://developers.google.com/storage/docs/json_api/v1/how-tos/upload#exp-backoff
[2] - https://developers.google.com/drive/web/handle-errors
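
A minimal backoff sketch (the exception handling and retry limit are placeholders; in real code, narrow the except clause to the client library's retryable 5xx error type):

import random
import time

def with_exponential_backoff(request_fn, max_retries=5):
    """Call request_fn, doubling the wait after each failure and adding random jitter."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except Exception:  # placeholder: catch only retryable 5xx errors here
            if attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt + random.random())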




Answer 3:


I assume the bare filename = "D:\foldername" alone is not enough information about the source code. I am also not sure this is even possible: via the web interface you can likewise only upload files, or create folders into which you then upload the files.

You could save the folder's name, then create it (I've never used Google App Engine, but I guess that should be possible) and then upload the contents to the new folder.
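
Note that GCS has no real directories; a "folder" is just a shared prefix in object names. A minimal sketch using the google-cloud-storage client (an assumption; the bucket and file names are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket('my-bucket')  # placeholder bucket name

# GCS is flat: naming the object with a "foldername/" prefix is what
# makes the folder appear in the web interface.
blob = bucket.blob('foldername/file.txt')
blob.upload_from_filename('d:/foldername/file.txt')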




Answer 4:


Refer to https://hackersandslackers.com/manage-files-in-google-cloud-storage-with-python/

from os import listdir
from os.path import isfile, join

...

def upload_files(bucketName):
    """Upload all top-level files from localFolder to the GCP bucket (sub-folders are skipped)."""
    files = [f for f in listdir(localFolder) if isfile(join(localFolder, f))]
    for file in files:
        localFile = join(localFolder, file)
        blob = bucket.blob(bucketFolder + file)
        blob.upload_from_filename(localFile)
    return f'Uploaded {files} to "{bucketName}" bucket.'
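
The snippet assumes bucket, localFolder and bucketFolder are defined elsewhere (the linked article sets them up as module-level configuration); a plausible setup sketch with placeholder values:

from google.cloud import storage

# Placeholder values standing in for the linked article's configuration.
bucketName = 'my-bucket'
localFolder = '/path/to/upload_folder'
bucketFolder = 'remote_folder/'   # trailing slash so bucketFolder + file forms a valid object name
storage_client = storage.Client()
bucket = storage_client.get_bucket(bucketName)

print(upload_files(bucketName))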


Source: https://stackoverflow.com/questions/25599503/how-to-upload-folder-on-google-cloud-storage-using-python-api
