Question
I am running a simple Flask app with Tornado, but the view only handles one request at a time. How can I make it handle multiple concurrent requests?
The fix I'm using is to fork and use multiple processes to handle requests, but I don't like that solution.
# flasky.py - the Flask application that the Tornado wrapper below imports
from flask import Flask

app = Flask(__name__)

@app.route('/flask')
def hello_world():
    return 'This comes from Flask ^_^'
from tornado.wsgi import WSGIContainer
from tornado.ioloop import IOLoop
from tornado.web import FallbackHandler, RequestHandler, Application

from flasky import app

class MainHandler(RequestHandler):
    def get(self):
        self.write("This message comes from Tornado ^_^")

# /tornado is served natively by Tornado; everything else falls back to the Flask app
tr = WSGIContainer(app)
application = Application([
    (r"/tornado", MainHandler),
    (r".*", FallbackHandler, dict(fallback=tr)),
])

if __name__ == "__main__":
    application.listen(8000)
    IOLoop.instance().start()
Answer 1:
The immediate answer is that you should use a dedicated WSGI server, such as uWSGI or Gunicorn, and configure it to use multiple workers. Do not use Tornado as a WSGI server.
Your fix of spawning processes is correct inasmuch as using WSGI with Tornado is "correct". WSGI is a synchronous protocol: one worker handles one request at a time. Flask doesn't know about Tornado, so it can't play nice with it by using coroutines: handling the request happens synchronously.
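A quick way to see this blocking in action (a hypothetical demonstration route, not part of the original app): add a view that sleeps to flasky.py and fire two requests at once. The second request, even one to /tornado, is not answered until the first finishes, because WSGIContainer runs the Flask view synchronously on the single IOLoop thread.

# Hypothetical route for illustration only, added to flasky.py
import time

@app.route('/flask/slow')
def slow():
    time.sleep(5)  # simulates slow work; blocks the entire IOLoop while it runs
    return 'This comes from Flask after 5 seconds'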
Tornado has a big warning in their docs about this exact thing.
WSGI is a synchronous interface, while Tornado’s concurrency model is based on single-threaded asynchronous execution. This means that running a WSGI app with Tornado’s WSGIContainer is less scalable than running the same app in a multi-threaded WSGI server like gunicorn or uwsgi. Use WSGIContainer only when there are benefits to combining Tornado and WSGI in the same process that outweigh the reduced scalability.
In other words: to handle more concurrent requests with a WSGI application, spawn more workers. The type of worker also matters: threads vs. processes vs. eventlets all have tradeoffs. You're spawning workers by creating processes yourself, but it's more common to use a WSGI server such as uWSGI or Gunicorn.
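As a rough sketch (assuming the Flask app is importable as flasky:app, as in the question, and that Gunicorn is installed), serving it with several workers looks like this:

# four worker processes, bound to the same port the question used
gunicorn -w 4 -b 127.0.0.1:8000 flasky:app

# or trade some processes for threads per worker
gunicorn -w 2 --threads 4 -b 127.0.0.1:8000 flasky:app

With synchronous workers, each worker handles one request at a time, so the number of workers (times threads per worker) bounds how many requests can be in flight concurrently.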
Source: https://stackoverflow.com/questions/39644247/flask-and-tornado-applciation-does-not-handle-multiple-concurrent-requests