Why can't Celery daemon see tasks?

Question


I have a Django 1.6.2 application running on Debian 7.8 with Nginx 1.2.1 as my proxy server and Gunicorn 19.1.1 as my application server. I've installed Celery 3.1.7 and RabbitMQ 2.8.4 to handle asynchronous tasks. I'm able to start a Celery worker as a daemon, but whenever I try to run the test "add" task as shown in the Celery docs, I get the following error:

Received unregistered task of type u'apps.photos.tasks.add'.
The message has been ignored and discarded.

Traceback (most recent call last):
File "/home/swing/venv/swing/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 455, in on_task_received
strategies[name](message, body,
KeyError: u'apps.photos.tasks.add'

All of my configuration files are kept in a "conf" directory that sits just below my "myproj" project directory. The "add" task is in apps/photos/tasks.py.

myproj
├── apps
│   └── photos
│       ├── __init__.py
│       └── tasks.py
└── conf
    ├── celeryconfig.py
    ├── celeryconfig.pyc
    ├── celery.py
    ├── __init__.py
    ├── middleware.py
    ├── settings
    │   ├── base.py
    │   ├── dev.py
    │   ├── __init__.py
    │   └── prod.py
    ├── urls.py
    └── wsgi.py

Here is the tasks file:

# apps/photos/tasks.py
from __future__ import absolute_import
from conf.celery import app

@app.task
def add(x, y):
    return x + y

Here are my Celery application and configuration files:

# conf/celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
from conf import celeryconfig

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'conf.settings')
app = Celery('conf')
app.config_from_object(celeryconfig)
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

# conf/celeryconfig.py
BROKER_URL = 'amqp://guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'amqp'
CELERY_ACCEPT_CONTENT = ['json', ]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

This is my Celery daemon config file. I commented out CELERY_APP because I've found that the Celery daemon won't even start if I uncomment it. I also found that I need to add the "--config" argument to CELERYD_OPTS in order for the daemon to start. I created a non-privileged "celery" user who can write to the log and pid files.

# /etc/default/celeryd
CELERYD_NODES="worker1"
CELERYD_LOG_LEVEL="DEBUG"
CELERY_BIN="/home/myproj/venv/myproj/bin/celery"
#CELERY_APP="conf"
CELERYD_CHDIR="/www/myproj/"
CELERYD_OPTS="--time-limit=300 --concurrency=8 --config=celeryconfig"
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERY_CREATE_DIRS=1
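
For comparison, Celery's daemonizing docs for the generic init script point the worker at the app via CELERY_APP rather than passing a bare --config module. Here is a sketch of that convention using the paths from the file above (whether this alone cures the startup failure mentioned is an assumption, not something the question confirms):

# /etc/default/celeryd - sketch following the Celery 3.1 generic init-script docs
CELERYD_NODES="worker1"
CELERY_BIN="/home/myproj/venv/myproj/bin/celery"
CELERY_APP="conf"                        # the Celery() instance defined in conf/celery.py
CELERYD_CHDIR="/www/myproj/"             # 'conf' and 'apps' must be importable from here
CELERYD_OPTS="--time-limit=300 --concurrency=8"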

I can see from the log file that when I run the command, "sudo service celeryd start", Celery starts without any errors. However, if I open the Python shell and run the following commands, I'll see the error I described at the beginning.

$ python manage.py shell
In [] from apps.photos.tasks import add
In [] result = add.delay(2, 2)

What's interesting is that if I examine Celery's registered tasks object, the task is listed:

In [] import celery
In [] celery.registry.tasks

Out [] {'celery.chain': ..., 'apps.photos.tasks.add': <@task: apps.photos.tasks.add of conf:0x16454d0> ...}
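
Note that this registry reflects the Python shell's own process, not the daemonized worker. One way to ask the running worker directly which tasks it has registered (a sketch, run from the directory where "conf" is importable, i.e. the CELERYD_CHDIR above):

$ celery -A conf inspect registered

If apps.photos.tasks.add is absent from the daemon's list, the worker imported the tasks module under a different name, or never imported it at all.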

Other similar questions here mention setting a PYTHONPATH environment variable, but I don't have one. I've never needed to set PYTHONPATH, and this project has been running fine for over a year without it.

I should also add that my production settings file is conf/settings/prod.py. It imports all of my tier-independent settings from base.py and adds some production-specific settings.

Can anyone tell me what I'm doing wrong? I've been struggling with this problem for three days now.

Thanks!


Answer 1:


It looks like this is caused by relative imports: Celery generates a task's name from the module path it was imported under, so the same function can end up registered under a different name than the one the client sends.

>>> from project.myapp.tasks import mytask
>>> mytask.name
'project.myapp.tasks.mytask'

>>> from myapp.tasks import mytask
>>> mytask.name
'myapp.tasks.mytask'

If you’re using relative imports you should set the name explicitly.

@task(name='proj.tasks.add')
def add(x, y):
    return x + y

See: http://celery.readthedocs.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports
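
Applied to the layout in the question, a minimal sketch of that fix would pin the name the client actually sends (this adapts the snippet above to the question's module; it illustrates the advice rather than a verified fix for this exact setup):

# apps/photos/tasks.py
from __future__ import absolute_import
from conf.celery import app

# Pin the fully qualified name so the worker registers this task under
# the same name the client sends, however the module happens to be imported.
@app.task(name='apps.photos.tasks.add')
def add(x, y):
    return x + y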




Answer 2:


I'm using Celery 4.0.2 with Django, and I created a celery user and group for use with celeryd and hit this same problem. Running the worker from the command line worked fine, but celeryd was not registering the tasks. It was NOT a relative-naming problem.

The solution was to add the celery user to the group that can access the Django project. In my case that group is www-data, with read and execute permission but no write.
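
A minimal sketch of that permission fix, assuming the project sits at /www/myproj (the CELERYD_CHDIR from the question) and the group is www-data as described above:

# add the celery user to the group that owns the Django project
sudo usermod -a -G www-data celery

# verify the worker user can now traverse and read the project tree
sudo -u celery ls /www/myproj

Restart the daemon afterwards so the worker picks up the new group membership.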



Source: https://stackoverflow.com/questions/29561655/why-cant-celery-daemon-see-tasks
