Streaming large file from Heroku fails after 30 seconds timeout

Submitted by 元气小坏坏 on 2019-12-01 18:58:34

Question


I have a Python web worker that streams a large file upon client request. After 30 seconds the connection is terminated by Heroku. I'm using web.py and yielding the response in chunks. According to the Heroku docs:

Cedar supports HTTP 1.1 features such as long-polling and streaming responses. An application has an initial 30 second window to respond with a single byte back to the client. However, each byte transmitted thereafter (either received from the client or sent by your application) resets a rolling 55 second window. If no data is sent during the 55 second window, the connection will be terminated.

I send much more than 1 byte every 55 seconds, but the connection is still terminated.

These are the headers I'm using:

web.header('Content-type' , 'application/zip')
web.header('Content-Disposition', 'attachment; filename="images.zip"')

I even tried adding:

web.header('Transfer-Encoding','chunked')

Am I doing something wrong?
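
For context, a simplified sketch of the handler (the file path, chunk size, and class name are placeholders, not the exact code):

import web

urls = ('/download', 'Download')

class Download:
    def GET(self):
        web.header('Content-type', 'application/zip')
        web.header('Content-Disposition', 'attachment; filename="images.zip"')
        # Stream the file in chunks; each yielded chunk should reset
        # Heroku's rolling 55-second window.
        with open('images.zip', 'rb') as f:
            while True:
                chunk = f.read(8192)
                if not chunk:
                    break
                yield chunk

app = web.application(urls, globals())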


Answer 1:


It appears the problem was a result of bad gunicorn settings. Extending the gunicorn timeout in the Procfile did the trick:

--timeout 300
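
For example, assuming the WSGI callable is exposed as app:application (the module and variable names here are illustrative), the Procfile line would look something like:

web: gunicorn app:application --timeout 300

Without this, gunicorn's default worker timeout (30 seconds) kills the worker mid-stream, which is what shows up as the dropped connection.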


Source: https://stackoverflow.com/questions/17609342/streaming-large-file-from-heroku-fails-after-30-seconds-timeout
