In my Ruby on Rails application I need to execute 50 background jobs in parallel. Each job creates a TCP connection to a different server, fetches some data and updates an ac
Some thoughts...
Just because you need to read 50 sites, and naturally want some parallel work, does not mean that you need 50 processes or threads. You need to balance the speedup you gain against the per-worker overhead. How about having 10 or 20 workers, each reading a few sites?
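For instance, a minimal worker-pool sketch using Ruby's standard Queue and Thread (threads rather than processes, for brevity; urls and fetch_and_update are hypothetical stand-ins for your 50 servers and your job body):

require 'thread'  # Queue lives here on older Rubies

# Hypothetical stand-ins for your 50 servers and your job body
urls = ['host1.example.com', 'host2.example.com']

def fetch_and_update(url)
  # open the TCP connection, fetch the data, update the record
end

queue = Queue.new
urls.each { |u| queue << u }

workers = Array.new(10) do
  Thread.new do
    loop do
      url = begin
              queue.pop(true)   # non-blocking pop
            rescue ThreadError
              break             # queue drained; this worker is done
            end
      fetch_and_update(url)
    end
  end
end
workers.each { |w| w.join }

With 10 workers the 50 jobs drain in roughly five waves, and the pool size is a single number you can tune.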
Depending on which Ruby you are using, be careful about green threads; you may not get the parallel execution you want.
You might want to structure it like a reverse, client-side inetd: use connect_nonblock and IO.select to get the parallel connections you want by making all the servers respond in parallel. You don't really need parallel processing of the results; you just need to get in line at all the servers in parallel, because that is where the latency really is.
So, something like this from the socket library... a sketch extending it for multiple outstanding connections follows the block...
require 'socket'
include Socket::Constants

socket   = Socket.new(AF_INET, SOCK_STREAM, 0)
sockaddr = Socket.sockaddr_in(80, 'www.google.com')

begin
  # Start the connect without blocking; it raises EINPROGRESS while in flight
  socket.connect_nonblock(sockaddr)
rescue Errno::EINPROGRESS
  # Wait until the socket becomes writable, i.e. the connect has finished
  IO.select(nil, [socket])
  begin
    socket.connect_nonblock(sockaddr)
  rescue Errno::EISCONN
    # Already connected: the success case
  end
end

socket.write("GET / HTTP/1.0\r\n\r\n")
# Here perhaps insert IO.select. You may not need multiple threads OR multiple
# processes with this technique, but if you do, insert them here.
results = socket.read
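Extending that to multiple outstanding connections might look like the sketch below (the host list is hypothetical, and error handling for failed connects is omitted). The key point is that every TCP handshake is in flight at once, so the total wait is roughly the slowest server's latency rather than the sum of all of them.

require 'socket'
include Socket::Constants

hosts = ['www.google.com', 'www.example.com']  # hypothetical: your 50 servers

# Kick off every connect without waiting for any of them
sockets = hosts.map do |host|
  s = Socket.new(AF_INET, SOCK_STREAM, 0)
  begin
    s.connect_nonblock(Socket.sockaddr_in(80, host))
  rescue Errno::EINPROGRESS
    # expected: the handshake is now in flight
  end
  s
end

# A socket becomes writable once its connect completes; send each request
sockets.each do |s|
  IO.select(nil, [s])
  s.write("GET / HTTP/1.0\r\n\r\n")
end

# Read every response; the servers have been working in parallel all along
results = sockets.map { |s| s.read }

Waiting on the sockets one at a time is fine here because all the connects are already progressing concurrently; a single IO.select over the whole set would let you handle whichever server answers first.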