Streaming download large file with python-requests interrupting

Submitted by 眉间皱痕 on 2021-01-02 06:14:21

Question


I have a problem streaming the download of a large file (about 1.5 GB) with python-requests v2.0.1:

with open("saved.rar", 'wb') as file:
    r = session.get(url, stream=True, timeout=3600)
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:  # filter out keep-alive chunks
            file.write(chunk)
            file.flush()

I tested it a few times on my VPS; sometimes it downloaded 200 MB, 500 MB, or 800 MB and saved the file without any error. It didn't hit the timeout, it just stopped as if the download had finished.

The host I'm downloading from is stable, because I have no problem downloading the same file in a browser.

Is there any way to download a large file with python-requests and be 100% sure it's the whole file?

@Edit

I've solved it using urllib; the problem only occurs with requests. Anyway, thanks for the help.
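(For anyone hitting the same problem with requests: one common workaround is to compare the bytes on disk against the server's Content-Length and resume an interrupted transfer with an HTTP Range header. The sketch below assumes the server honours Range requests and reports Content-Length; `download_with_resume`, `resume_header`, and their parameters are made-up names for illustration, not part of requests itself.)

```python
import os
import requests


def resume_header(done):
    """Range header asking the server to continue from byte offset `done`."""
    return {"Range": "bytes=%d-" % done} if done else {}


def download_with_resume(url, path, chunk_size=64 * 1024, max_retries=5):
    """Stream `url` to `path`, reopening the connection if the stream drops.

    Hypothetical helper: assumes the server supports Range requests
    and sends a Content-Length header.
    """
    with requests.Session() as session:
        expected = int(session.head(url, allow_redirects=True)
                       .headers.get("Content-Length", 0))
        for _ in range(max_retries):
            done = os.path.getsize(path) if os.path.exists(path) else 0
            if expected and done >= expected:
                return True  # whole file already on disk
            with session.get(url, headers=resume_header(done),
                             stream=True, timeout=60) as r:
                r.raise_for_status()
                with open(path, "ab") as f:  # append: keep bytes already saved
                    for chunk in r.iter_content(chunk_size=chunk_size):
                        f.write(chunk)
        return bool(expected) and os.path.getsize(path) >= expected
```

Appending (`"ab"`) rather than truncating means a dropped connection costs only one extra request, not a restart from zero.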


Answer 1:


There might be several issues that cause a download to be interrupted: network problems, etc. But the server reports the file size before the transfer starts, so you can check whether you have downloaded the whole file. Using urllib:

# Python 3; the original answer used Python 2's urllib.urlopen / meta.getheaders
from urllib.request import urlopen

site = urlopen("http://python.org")
meta = site.info()
print(meta.get("Content-Length"))

Using requests:

r = requests.get("http://python.org", stream=True)  # stream=True defers the body download
print(r.headers["Content-Length"])
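Building on that header, a minimal completeness check for the original snippet might look like the sketch below (`download_checked` and `sizes_match` are illustrative names; note that chunked responses may omit Content-Length, in which case `expected` is 0 and the check is skipped):

```python
import requests


def sizes_match(expected, actual):
    """True only when a known Content-Length equals the bytes received."""
    return expected > 0 and expected == actual


def download_checked(url, path):
    """Stream `url` to `path` and raise if fewer bytes arrive than promised."""
    with requests.get(url, stream=True, timeout=3600) as r:
        r.raise_for_status()
        expected = int(r.headers.get("Content-Length", 0))
        received = 0
        with open(path, "wb") as f:
            for chunk in r.iter_content(chunk_size=1024):
                f.write(chunk)
                received += len(chunk)
    if not sizes_match(expected, received) and expected:
        raise IOError("incomplete download: %d of %d bytes"
                      % (received, expected))
```

Raising on a short read turns the silent truncation the asker saw into an explicit error that a caller can catch and retry.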


Source: https://stackoverflow.com/questions/19726607/streaming-download-large-file-with-python-requests-interrupting
