Question
We are running a website built with Django and Piston and I want to implement celery to offload tasks to an external server. I don't really want to run Django on the secondary server and would like to simply run a pure Python celery worker. Is it possible for me to write simple function stubs on the Django server and write the actual function logic on the secondary server?
i.e.
Django side:

from celery import task

@task
def send_message(fromUser=None, toUser=None, msgType=None, msg=None):
    pass
Server side:

from celery import Celery

celery = Celery('hello', broker='amqp://guest@localhost//')

@celery.task
def send_message(fromUser=None, toUser=None, msgType=None, msg=None):
    # Do the actual send_message logic here
    ...
Answer 1:
That is easily possible.
If you have a pure-Python Celery worker, you can send tasks to it by name, as long as both sides use the same broker URL:
from celery import Celery
celery = Celery(broker='amqp://guest@localhost//')
then in some view:

celery.send_task('send_message', kwargs={
    'fromUser': ...,
})
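The key detail is that the string passed to send_task must match the task's registered name on the worker. A minimal sketch of the worker side, giving the task an explicit name so the match is guaranteed (the broker URL and the return value are illustrative assumptions, not from the original answer):

from celery import Celery

# Broker URL is an assumption; replace with your own.
celery = Celery('hello', broker='amqp://guest@localhost//')

# name= pins the registered task name, so it matches the string
# that celery.send_task('send_message', ...) uses on the Django side.
@celery.task(name='send_message')
def send_message(fromUser=None, toUser=None, msgType=None, msg=None):
    # Real delivery logic would go here; a string is returned
    # purely for illustration.
    return 'sent %s from %s to %s' % (msgType, fromUser, toUser)

Without an explicit name, Celery derives the name from the module path, which may differ between the Django project and the standalone worker. The registry lookup (celery.tasks) works locally, without touching the broker.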
Answer 2:
You can also schedule your tasks in configuration with CELERYBEAT_SCHEDULE:
CELERYBEAT_SCHEDULE = {
    'scheduled_task': {
        'task': 'name in your task decorator',
        'schedule': timedelta(...),
        'args': (..., ),
    },
}
But your worker has to be started with the celery beat scheduler (the -B option):
celery -A app.tasks worker -B -l info
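Filling in the placeholders above, a concrete schedule entry might look like this (the entry name, interval, and kwargs are hypothetical; the task name must match the one registered on the worker):

# timedelta comes from the standard library and must be imported
# wherever this configuration is defined.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    # Run the worker's send_message task every 30 minutes.
    'send-reminder-every-30-min': {
        'task': 'send_message',
        'schedule': timedelta(minutes=30),
        'kwargs': {'msgType': 'reminder'},
    },
}

Beat entries accept either 'args' (a tuple of positional arguments) or 'kwargs' (a dict of keyword arguments), so a task defined with keyword-only defaults like send_message is most naturally invoked via 'kwargs'.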
Source: https://stackoverflow.com/questions/12920974/mixing-django-celery-and-standalone-celery