Question
celery.py
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project', broker='amqp://foo:bar@remoteserver:5672', backend='amqp')
# app = Celery('project')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
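As a sanity check (not part of the original question), the broker URL the app has actually resolved can be printed; this is a sketch assuming Celery 4, where the merged configuration is exposed as `app.conf.broker_url`:

```python
# Hedged diagnostic sketch: build the app as above and inspect which
# broker URL Celery has actually resolved. If this prints a localhost
# URL, some later configuration step has overridden the constructor value.
from celery import Celery

app = Celery('project', broker='amqp://foo:bar@remoteserver:5672')
print(app.conf.broker_url)
```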
tasks.py (in the app folder)
from __future__ import absolute_import, unicode_literals
from celery import shared_task
@shared_task
def addnum(x, y):
    return x + y
When I call this task:
addnum.delay(3, 5)
It returns:
<AsyncResult: 82cb362a-5439-4c1c-9c64-b158a9a48786>
but the Celery worker just sits there waiting for tasks and never receives any:
[2017-03-17 13:48:36,869: INFO/MainProcess] celery@gauravrajput ready.
The problem is that the tasks are not being queued to the remote RabbitMQ server.
When I initialize Celery as:
app = Celery('project')
and then start the Celery worker, it starts receiving and completing tasks:
[2017-03-17 14:02:13,558: INFO/MainProcess] celery@gauravrajput ready.
[2017-03-17 14:02:13,560: INFO/MainProcess] Received task: app.tasks.addnum[82cb362a-5439-4c1c-9c64-b158a9a48786]
Answer 1:
I found out that a rabbitmq-server was also running on my localhost. I don't know why, but the tasks were being queued to localhost instead of the remote RabbitMQ server, even after explicitly declaring the remote server as the broker. Simply stopping the RabbitMQ server on localhost fixed the issue:
sudo -u rabbitmq rabbitmqctl stop
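A possible contributing factor (an assumption, not confirmed in the answer): Celery's `config_from_object()` can reset configuration that was set before it runs, which would discard a broker passed to the `Celery()` constructor and fall back to the default `amqp://localhost`. A sturdier setup with the `namespace='CELERY'` convention used in the question's `celery.py` is to declare the broker in the Django settings instead:

```python
# project/settings.py -- hedged sketch, assuming Celery 4's lowercase
# settings mapped through the CELERY_ prefix by
# app.config_from_object('django.conf:settings', namespace='CELERY').
# Declared here, the broker survives configuration reloading instead of
# being passed (and possibly lost) via the Celery() constructor.
CELERY_BROKER_URL = 'amqp://foo:bar@remoteserver:5672'
```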
Source: https://stackoverflow.com/questions/42860356/celery-not-queuing-tasks-to-broker-on-remote-server-adds-tasks-to-localhost-ins