There are several questions on Stack Overflow about Tornado, but I still haven't found an answer to mine. I have a big text file that I want to iterate over, sending each line as a POST HTTP request. I want to do this asynchronously (it needs to be fast) and then check the responses of the requests.
I have something like this:
    http_client = httpclient.AsyncHTTPClient()
    with open(filename) as log_file:
        for line in log_file:
            request = httpclient.HTTPRequest(self.destination, method="POST",
                                             headers=self.headers, body=json.dumps(line))
            response = http_client.fetch(request, callback=self.handle_request)
Watching tcpdump, this doesn't send anything; all I get is a series of "Future" objects. I also tried putting the fetch call behind "yield" and iterating over it while using the @gen.coroutine decorator on the method, but that didn't help. Can anyone tell me what I'm doing wrong?
thanks!
"fetch" just returns a Future; the request is only actually performed once the IOLoop is running, which is likely why tcpdump shows nothing. Here's how you'd use "fetch" in a coroutine:
    from tornado import gen, httpclient, ioloop

    filename = 'filename.txt'
    destination = 'http://localhost:5000'
    http_client = httpclient.AsyncHTTPClient()

    @gen.coroutine
    def post():
        with open(filename) as log_file:
            for line in log_file:
                request = httpclient.HTTPRequest(destination, body=line, method="POST")
                response = yield http_client.fetch(request)
                print response

    ioloop.IOLoop.current().run_sync(post)
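Note that because each fetch is yielded inside the loop, the lines are posted strictly one at a time: the coroutine waits for each response before sending the next request. If that isn't fast enough, see the parallel version at the end of this answer.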
You can test this with a little server that receives the lines and prints them:
    from tornado import ioloop, web

    class MyHandler(web.RequestHandler):
        def post(self):
            print self.request.body.rstrip()

    app = web.Application([
        web.URLSpec('/', MyHandler)
    ])
    app.listen(port=5000)
    ioloop.IOLoop.current().start()
First run the server code, and then the client.
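Since you also want to check the responses: keep in mind that "fetch" raises a tornado.httpclient.HTTPError for non-2xx responses (and typically as code 599 for timeouts and connection errors with the default SimpleAsyncHTTPClient), so a single failed line would abort the coroutine above. Here's a sketch of the same client that counts failures instead of stopping, reusing the placeholder filename and destination from above:

    from tornado import gen, httpclient, ioloop

    filename = 'filename.txt'
    destination = 'http://localhost:5000'
    http_client = httpclient.AsyncHTTPClient()

    @gen.coroutine
    def post():
        failed = 0
        with open(filename) as log_file:
            for line in log_file:
                request = httpclient.HTTPRequest(destination, body=line, method="POST")
                try:
                    response = yield http_client.fetch(request)
                    print response.code
                except httpclient.HTTPError as exc:
                    # Raised for non-2xx responses and for timeouts/connection errors.
                    failed += 1
                    print 'failed:', exc
        print 'done, %d lines failed' % failed

    ioloop.IOLoop.current().run_sync(post)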
If you want to post up to 10 log lines at a time in parallel, install Toro and do:
    from tornado import gen, ioloop
    from tornado.httpclient import AsyncHTTPClient, HTTPRequest
    from toro import JoinableQueue

    filename = 'tox.ini'
    destination = 'http://localhost:5000'

    AsyncHTTPClient.configure("tornado.simple_httpclient.SimpleAsyncHTTPClient",
                              max_clients=10)

    http_client = AsyncHTTPClient()
    q = JoinableQueue(maxsize=10)

    @gen.coroutine
    def read():
        with open(filename) as log_file:
            for line in log_file:
                yield q.put(line)

    @gen.coroutine
    def post():
        while True:
            line = yield q.get()
            request = HTTPRequest(destination, body=line, method="POST")

            # Don't yield, just keep going as long as there's work in the queue.
            future = http_client.fetch(request)

            def done_callback(future):
                q.task_done()
                try:
                    print future.result()
                except Exception as exc:
                    print exc

            future.add_done_callback(done_callback)

    # Start coroutines.
    read()
    post()

    # Arrange to stop the loop when the queue is finished.
    loop = ioloop.IOLoop.current()
    join_future = q.join()

    def done(future):
        loop.stop()

    join_future.add_done_callback(done)

    loop.start()
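A note on how this works: post never yields the fetch, so it keeps draining the queue while max_clients=10 caps how many requests are actually in flight; maxsize=10 on the queue stops read from pulling the whole file into memory at once; and the loop is stopped once q.join() resolves, i.e. once task_done has been called for every line.

On Tornado 4.2 and later, Toro's queues were merged into Tornado itself as tornado.queues, so you can drop the extra dependency. Here's a rough sketch under that assumption, using the same placeholder filename and destination, and yielding each fetch inside ten worker coroutines instead of using callbacks:

    from tornado import gen, ioloop
    from tornado.httpclient import AsyncHTTPClient, HTTPRequest
    from tornado.queues import Queue

    filename = 'tox.ini'
    destination = 'http://localhost:5000'

    http_client = AsyncHTTPClient()
    q = Queue(maxsize=10)

    @gen.coroutine
    def read():
        # Producer: pauses whenever the queue is full.
        with open(filename) as log_file:
            for line in log_file:
                yield q.put(line)

    @gen.coroutine
    def post():
        # Consumer: each worker has one request in flight at a time,
        # so ten workers give at most ten parallel requests.
        while True:
            line = yield q.get()
            try:
                response = yield http_client.fetch(
                    HTTPRequest(destination, body=line, method="POST"))
                print response
            except Exception as exc:
                print exc
            finally:
                q.task_done()

    @gen.coroutine
    def main():
        for _ in range(10):
            post()       # start the consumers
        yield read()     # read the whole file into the queue
        yield q.join()   # wait until every queued line has been handled

    ioloop.IOLoop.current().run_sync(main)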