python requests upload large file with additional data

Submitted by 落爺英雄遲暮 on 2020-12-01 02:49:31

Question


I've been looking around for ways to upload a large file with additional data, but there doesn't seem to be any solution. To upload the file, I've been using this code, and it has been working fine with small files:

with open("my_file.csv", "rb") as f:
    files = {"documents": ("my_file.csv", f, "application/octet-stream")}
    data = {"composite": "NONE"}
    headers = {"Prefer": "respond-async"}
    resp = session.post("my/url", headers=headers, data=data, files=files)

The problem is that the code loads the whole file into memory before sending it, so I run into a MemoryError when uploading large files. I've looked around, and the way to stream the data is to set

resp = session.post("my/url", headers=headers, data=f)

but I also need to include {"composite": "NONE"} in the data; otherwise, the server won't recognize the file.


Answer 1:


You can use the requests-toolbelt to do this:

import requests
from requests_toolbelt.multipart import encoder

session = requests.Session()
with open('my_file.csv', 'rb') as f:
    # MultipartEncoder builds the multipart/form-data body lazily,
    # reading the file in chunks instead of loading it into memory.
    form = encoder.MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    # The encoder generates its own boundary, so send its content type.
    headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
    resp = session.post("my/url", headers=headers, data=form)
session.close()

This will cause requests to stream the multipart/form-data upload for you.
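If you also want progress reporting while streaming, requests-toolbelt provides a MultipartEncoderMonitor that wraps the encoder and calls a callback as bytes are read. Below is a minimal sketch, reusing the placeholder URL and file name from the question:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder, MultipartEncoderMonitor

def report_progress(monitor):
    # monitor.bytes_read is the number of bytes sent so far;
    # monitor.len is the total size of the encoded request body.
    print(f"{monitor.bytes_read}/{monitor.len} bytes uploaded")

session = requests.Session()
with open("my_file.csv", "rb") as f:
    form = MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    monitor = MultipartEncoderMonitor(form, report_progress)
    headers = {"Prefer": "respond-async", "Content-Type": monitor.content_type}
    resp = session.post("my/url", headers=headers, data=monitor)
session.close()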



Source: https://stackoverflow.com/questions/35779879/python-requests-upload-large-file-with-additional-data
