How to import a CSV file using Google Sheets API V4

Frontend · unresolved · 5 answers · 1751 views
独厮守ぢ · 2020-12-30 06:14

Background

I'm developing a Python 2.7 script that analyzes data from an SQL table and, at the end, generates a CSV file.

Once the file is generated…

5 Answers
  •  自闭症患者
     2020-12-30 07:20

    I've spent a couple of hours trying to make the other answers work. The libraries don't explain authentication well and don't play nicely with the Google-provided way of handling credentials. Sam's answer, on the other hand, doesn't elaborate on the details of using the API, which can be confusing. So here is a full recipe for uploading CSVs to Google Sheets. It builds on both Sam's and CapoChino's answers, plus some of my own research.

    1. Authenticate/set up. In general, refer to the docs:
      • The big blue button gets you credentials.json with no extra steps
      • quickstart.py can easily be adapted into authenticate.py
      • scopes should contain https://www.googleapis.com/auth/spreadsheets
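
    The steps above can be sketched as a small authenticate.py. This is a sketch adapted from the quickstart, not official API usage: the file names (credentials.json, Credentials/token.pickle) and the helper names load_cached_token/authenticate are assumptions to match the recipe below.

```python
import os
import pickle

SCOPES = ['https://www.googleapis.com/auth/spreadsheets']


def load_cached_token(path):
    # Return previously pickled credentials, or None if there is no cache yet.
    if not os.path.exists(path):
        return None
    with open(path, 'rb') as fh:
        return pickle.load(fh)


def authenticate(credentials_json='credentials.json',
                 token_path='Credentials/token.pickle'):
    creds = load_cached_token(token_path)
    if creds is None:
        # Lazy import: only needed on the first run, when the browser-based
        # OAuth consent flow has to run (requires google-auth-oauthlib).
        from google_auth_oauthlib.flow import InstalledAppFlow
        flow = InstalledAppFlow.from_client_secrets_file(credentials_json, SCOPES)
        creds = flow.run_local_server(port=0)
        os.makedirs(os.path.dirname(token_path), exist_ok=True)
        with open(token_path, 'wb') as fh:
            pickle.dump(creds, fh)
    return creds
```

    On the first run this opens a browser for consent and pickles the resulting credentials; every later run just loads the cached token.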

    Hopefully by now you have your credentials stored, so let's move on to the actual code.

    2. A recipe that should work out of the box:
    import pickle
    from googleapiclient.discovery import build
    
    SPREADSHEET_ID = '1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms' # Get this one from the link in browser
    worksheet_name = 'Sheet2'
    path_to_csv = 'New Folder/much_data.csv'
    path_to_credentials = 'Credentials/token.pickle'
    
    
    # convenience routines
    def find_sheet_id_by_name(sheet_name):
        # ugly, but works
        sheets_with_properties = API \
            .spreadsheets() \
            .get(spreadsheetId=SPREADSHEET_ID, fields='sheets.properties') \
            .execute() \
            .get('sheets')
    
        for sheet in sheets_with_properties:
            if 'title' in sheet['properties'].keys():
                if sheet['properties']['title'] == sheet_name:
                    return sheet['properties']['sheetId']
    
    
    def push_csv_to_gsheet(csv_path, sheet_id):
        with open(csv_path, 'r') as csv_file:
            csv_contents = csv_file.read()
        body = {
            'requests': [{
                'pasteData': {
                    "coordinate": {
                        "sheetId": sheet_id,
                        "rowIndex": 0,  # adapt this if you need different positioning
                        "columnIndex": 0,  # adapt this if you need different positioning
                    },
                    "data": csv_contents,
                    "type": 'PASTE_NORMAL',
                    "delimiter": ',',
                }
            }]
        }
        request = API.spreadsheets().batchUpdate(spreadsheetId=SPREADSHEET_ID, body=body)
        response = request.execute()
        return response
    
    
    # upload
    with open(path_to_credentials, 'rb') as token:
        credentials = pickle.load(token)
    
    API = build('sheets', 'v4', credentials=credentials)
    
    push_csv_to_gsheet(
        csv_path=path_to_csv,
        sheet_id=find_sheet_id_by_name(worksheet_name)
    )
    

    The nice thing about using batchUpdate directly is that it uploads thousands of rows in a second. Under the hood gspread does the same thing and should be just as performant. There is also gspread-pandas.
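
    If you want to unit-test the upload logic without credentials or network access, the pasteData body from the recipe can be factored into a pure helper. build_paste_request is a hypothetical name I'm introducing here, not part of the Sheets API:

```python
def build_paste_request(csv_text, sheet_id, row=0, col=0, delimiter=','):
    # Pure function: builds the batchUpdate body for a pasteData request,
    # so its shape can be asserted in tests with no API calls at all.
    return {
        'requests': [{
            'pasteData': {
                'coordinate': {
                    'sheetId': sheet_id,
                    'rowIndex': row,
                    'columnIndex': col,
                },
                'data': csv_text,
                'type': 'PASTE_NORMAL',
                'delimiter': delimiter,
            }
        }]
    }


body = build_paste_request('a,b\n1,2', sheet_id=123)
```

    push_csv_to_gsheet above could then shrink to reading the file and passing the result of this helper to API.spreadsheets().batchUpdate(...).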

    p.s. The code was tested with Python 3.5, but this thread seemed like the most appropriate place to submit it.
