Question
Having discussed some failure handling in "Does Ruby's 'open_uri' reliably close sockets after read or on fail?", I wanted to dig into this a little deeper.
I'd like to attempt to pull data from an FTP server, and if that fails, fall back to pulling from an HTTP server. If both of these fail, I'd like to cycle around and retry several times with a short pause (perhaps 1 second) between attempts.
I read about the "retryable" method in "Retrying code blocks in Ruby (on exceptions, whatever)", but the retryable-rb gem may be more robust.
Would appreciate seeing an example from an old hand at this scenario, as I need a reliable mechanism in place for pulling data from a couple of semi-unreliable sources I have. As noted in the other thread, it seems that Typhoeus could offer a robust component of this solution.
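For reference, a hand-rolled version of the retryable pattern from that question might look roughly like the sketch below. The option names (tries, on, pause) and the ftp_url / http_url variables are illustrative placeholders, not the retryable-rb gem's actual API.

require 'open-uri'

# Hand-rolled retry helper in the spirit of the "retryable" pattern;
# retries the block up to `tries` times when `on` (an exception class) is raised.
def retryable(tries: 3, on: StandardError, pause: 1)
  attempts = 0
  begin
    yield
  rescue on
    attempts += 1
    raise if attempts >= tries
    sleep pause
    retry
  end
end

# Illustrative usage: each pass tries FTP first, then falls back to HTTP;
# if both raise, the whole pass is retried after a one-second pause.
data = retryable(tries: 3, on: StandardError, pause: 1) do
  begin
    open(ftp_url) { |f| f.read }
  rescue StandardError
    open(http_url) { |f| f.read }
  end
end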
Answer 1:
Using one of those gems might be a good idea, but it's pretty straightforward without them:
require 'open-uri'   # on Ruby 3+, use URI.open instead of Kernel#open

data = nil
until data           # or: 5.times do ... end for a fixed number of attempts
  # Try the FTP source first; fall back to the HTTP source if that fails
  data = open(ftp_url) { |f| f.read } rescue nil
  data ||= open(http_url) { |f| f.read } rescue nil
  break if data
  sleep 1            # short pause between attempts
end
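Since the question mentions Typhoeus, a variation that caps the number of attempts (the 5.times alternative hinted at in the comment above) and uses Typhoeus for the HTTP fallback might look like the following sketch. It assumes a recent Typhoeus version (the Typhoeus.get shorthand) and that ftp_url / http_url are defined elsewhere.

require 'open-uri'
require 'typhoeus'

data = nil
5.times do
  # FTP first via open-uri; swallow any error and fall back to HTTP via Typhoeus
  data = open(ftp_url) { |f| f.read } rescue nil
  if data.nil?
    response = Typhoeus.get(http_url)   # Typhoeus returns a response rather than raising on HTTP errors
    data = response.body if response.success?
  end
  break if data
  sleep 1
end
raise "Both sources failed after 5 attempts" if data.nil?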
Source: https://stackoverflow.com/questions/8706367/retryable-ftp-http-uri-reading-with-typhoeus