Prevent timeout when opening large files from URL

时光毁灭记忆、已成空白 submitted on 2019-12-24 08:05:42

Question


I am writing a Ruby 1.8.7 script which has to request really large XML files (1–5 MB) from a server that is quite slow (1 min 30 sec for 1 MB). The requested file is written to disk.

I set the timeout in my script to a ridiculous number of seconds, since I really want to get the file rather than move on if it takes too long. Even with that high value I keep getting timeouts.

Is there a best practice for this?

Right now I use:

  require 'open-uri'   # provides Kernel#open for URLs
  require 'timeout'

  open(DIR + number.to_s + ".xml", 'wb') do |file|
    begin
      status = Timeout::timeout(6000000) do   # timeout value is in seconds
        file << open(url).read
      end
    rescue Timeout::Error => e
      Rails.logger.info "Timeout for: " + number.to_s
    end
  end

Now, I thought the timeout was set in seconds, which would make 6,000,000 way more than 1 min 30 sec, but somehow it isn't using my timeout in seconds. Note again that I'm restricted to Ruby 1.8.7.


Answer 1:


Unfortunately, this is problematic. In Ruby 1.9.x, the open-uri-extended open can take a :read_timeout parameter, which it passes on to the HTTP library. But in Ruby 1.8.x, which you're using, this parameter is not available.
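For illustration only, that 1.9.x usage would look roughly like the sketch below. It is not usable under the question's 1.8.7 constraint, and whether open-uri accepts :read_timeout depends on the exact Ruby/open-uri version; url, DIR and number are the question's own variables.

  require 'open-uri'

  # Ruby 1.9.x+ only (per the answer above): open-uri forwards
  # :read_timeout, in seconds, to Net::HTTP. Ruby 1.8.7 does not
  # recognise this option.
  open(url, :read_timeout => 600) do |remote|
    File.open(DIR + number.to_s + ".xml", 'wb') { |file| file << remote.read }
  end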

So you need to use net/http directly: call start/get there and set read_timeout to your liking. If you just use the open-uri wrapper, read_timeout stays at its 60-second default, which is shorter than what you want.
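A minimal sketch of that net/http approach on 1.8.7 (again, url, DIR and number come from the question; the 600-second read_timeout is just an illustrative value):

  require 'net/http'
  require 'uri'

  uri = URI.parse(url)

  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 600   # seconds allowed per read from the socket; default is 60
  http.open_timeout = 60    # seconds allowed to establish the connection

  http.start do |conn|
    response = conn.get(uri.request_uri)
    File.open(DIR + number.to_s + ".xml", 'wb') { |file| file << response.body }
  end

If memory use is a concern for the larger files, Net::HTTP#get also accepts a block and yields the body in segments, so each chunk can be written to the file as it arrives instead of buffering the whole response.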



Source: https://stackoverflow.com/questions/7642364/prevent-timeout-when-opening-large-files-from-url
