celery

Celery tries to connect to the wrong broker

拟墨画扇 submitted on 2019-12-05 09:46:42
I have in my Celery configuration:

    BROKER_URL = 'redis://127.0.0.1:6379'
    CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'

Yet whenever I run celeryd, I get this error:

    consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: [Errno 111] Connection refused. Trying again in 2.00 seconds...

Why is it not connecting to the Redis broker I set it up with, which is running, by the way?

Import your Celery app and add your broker explicitly:

    celery = Celery('task', broker='redis://127.0.0.1:6379')
    celery.config_from_object(celeryconfig)

If you followed the First Steps with Celery tutorial, specifically: app.config
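A minimal sketch of that fix, assuming the configuration above lives in a module named celeryconfig importable from the app directory; passing the broker explicitly (or loading the config before the worker starts) keeps celeryd from falling back to the built-in amqp:// default:

    # sketch: explicit Redis broker; 'celeryconfig' is assumed to be the module
    # holding the BROKER_URL / CELERY_RESULT_BACKEND settings shown above
    from celery import Celery
    import celeryconfig

    celery = Celery('task', broker='redis://127.0.0.1:6379')
    celery.config_from_object(celeryconfig)   # also picks up the result backend

    @celery.task
    def add(x, y):
        return x + y

The worker then has to be started against this module (for example celery -A task worker) so it reads the same configuration instead of the default AMQP URL.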

celery daemon - permission denied on log file

核能气质少年 submitted on 2019-12-05 09:40:21
I have been working on setting up my Celery task as a daemon in order to process data on a schedule. I have been following the docs in order to set up my daemon, but have been running into a log file permission error that has me stumped. Below is the configuration I have set up on an Ubuntu box on DigitalOcean.

/etc/default/celeryd:

    # here we have a single node
    CELERYD_NODES="w1"
    CELERY_BIN="/mix_daemon/venv/bin/celery"
    CELERYD_CHDIR="/mix_daemon/"
    CELERYD_OPTS="-A tasks worker --loglevel=info --beat"
    # %n will be replaced with the nodename.
    CELERYD_LOG_FILE="/var/log/celery/%n.log"
    CELERYD
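The error usually means the account celeryd runs as cannot write under /var/log/celery. A small sketch of a one-off fix, run with sufficient privileges; the celery user and group names are assumptions, so match them to whatever CELERYD_USER/CELERYD_GROUP you configure:

    # sketch: make sure the directory from CELERYD_LOG_FILE exists and belongs
    # to the daemon account ("celery" is an assumed user/group name)
    import os
    import shutil

    log_dir = "/var/log/celery"
    os.makedirs(log_dir, exist_ok=True)                    # create it if missing
    shutil.chown(log_dir, user="celery", group="celery")   # hand it to the daemon user
    os.chmod(log_dir, 0o755)

The same can of course be done once by hand with mkdir and chown; the point is that the directory must be writable by the user the init script drops privileges to.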

Route celery task to specific queue

我只是一个虾纸丫 submitted on 2019-12-05 09:39:07
I have two separate celeryd processes running on my server, managed by supervisor. They are set to listen on separate queues as such:

    [program:celeryd1]
    command=/path/to/celeryd --pool=solo --queues=queue1
    ...

    [program:celeryd2]
    command=/path/to/celeryd --pool=solo --queues=queue2
    ...

And my celeryconfig looks something like this:

    from celery.schedules import crontab

    BROKER_URL = "amqp://guest:guest@localhost:5672//"
    CELERY_DISABLE_RATE_LIMITS = True
    CELERYD_CONCURRENCY = 1
    CELERY_IGNORE_RESULT = True
    CELERY_DEFAULT_QUEUE = 'default'
    CELERY_QUEUES = {
        'default': {
            "exchange": "default",
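To make sure tasks actually land on those queues, the old-style settings above are typically paired with a CELERY_ROUTES mapping, or with an explicit queue= when the task is sent. A sketch in the same configuration style; the task names are placeholders, not anything from the question:

    # sketch: route specific tasks to the queues the two workers consume from
    CELERY_ROUTES = {
        'myapp.tasks.heavy_task': {'queue': 'queue1'},
        'myapp.tasks.light_task': {'queue': 'queue2'},
    }

    # or choose the queue at call time instead:
    # heavy_task.apply_async(args=[1, 2], queue='queue1')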

Running Celery worker inside an app context still raises “working outside of app context” error in task

拈花ヽ惹草 submitted on 2019-12-05 08:57:53
I am using Miguel Grinberg's article to set up Celery with the app factory pattern in order to send email with Flask-Mail. I've been calling various scripts that use Celery without any issues. However, I keep getting RuntimeError: working outside of application context with the following task, even though I am running the worker inside an app context. Why am I getting this error? How do I get Flask-Mail to work in Celery?

email.py:

    from flask import current_app, render_template
    from flask.ext.mail import Message
    from . import celery, mail

    @celery.task
    def send_async_email(msg):
        mail.send(msg)
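One common way out is to push an application context inside the task itself, since the worker executes each task outside any request or app context. A hedged sketch, assuming the package exposes a create_app() factory next to celery and mail:

    # sketch: give the task its own app context; create_app is assumed to be
    # the project's application factory
    from . import celery, mail, create_app

    @celery.task
    def send_async_email(msg):
        app = create_app()           # or reuse an app object built at import time
        with app.app_context():      # Flask-Mail reads its config from the app
            mail.send(msg)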

Celery Closes Unexpectedly After Longer Inactivity

北城余情 submitted on 2019-12-05 08:52:34
So I am using RabbitMQ + Celery to create a simple RPC architecture. I have one RabbitMQ message broker and one remote worker which runs the Celery daemon. There is a third server which exposes a thin RESTful API. When it receives an HTTP request, it sends a task to the remote worker, waits for the response, and returns a response. This works great most of the time. However, I have noticed that after a longer inactivity (say 5 minutes of no incoming requests), the Celery worker behaves strangely. The first 3 tasks received after a longer inactivity return this error:

    exchange.declare: connection closed
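This pattern often comes down to idle AMQP connections being silently dropped (by broker heartbeat timeouts, firewalls, or load balancers) while the publisher keeps reusing the dead connection from its pool. A hedged sketch of settings that are commonly tuned for it, in the same old-style config format; the exact values are illustrative:

    # sketch: keep idle broker connections alive and recover when they die anyway
    BROKER_HEARTBEAT = 30             # AMQP heartbeats so idle links are not cut
    BROKER_CONNECTION_TIMEOUT = 10
    BROKER_POOL_LIMIT = None          # disable connection pooling; slower, but no
                                      # stale pooled connections to trip over
    CELERY_TASK_PUBLISH_RETRY = True  # retry the publish if the connection is gone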

Cannot start Celery Worker (Kombu.asynchronous.timer)

断了今生、忘了曾经 submitted on 2019-12-05 07:21:45
I followed the first steps with Celery (Django) and am trying to run a heavy process in the background. I have the RabbitMQ server installed. However, when I try

    celery -A my_app worker -l info

it throws the following error:

    File "<frozen importlib._bootstrap>", line 994, in _gcd_import
    File "<frozen importlib._bootstrap>", line 971, in _find_and_load
    File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
    File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
    File "<frozen importlib._bootstrap_external>", line 678, in exec_module
    File "<frozen importlib._bootstrap>",
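The traceback is cut off, but this kind of import failure during worker startup frequently turns out to be a mismatch between the installed Celery/Kombu releases and the Python interpreter in use (for instance, older Kombu code using the name async, which Python 3.7 made a reserved word). A small diagnostic sketch to see which versions the worker would actually import:

    # sketch: print the interpreter and library versions in the active environment
    import sys

    import celery
    import kombu

    print("python :", sys.version.split()[0])
    print("celery :", celery.__version__)
    print("kombu  :", kombu.__version__)

If those turn out to be old releases, upgrading celery (which pulls in a matching kombu) is the usual way past this class of error.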

Installing and configuring python-django-celery_20191122

≯℡__Kan透↙ submitted on 2019-12-05 07:00:43
Introduction to Celery

Celery involves three core concepts: the task producer (the side that needs the email sent), which in our case is the project code itself; a task queue in the middle (the broker), for which we use Redis here; and the task consumer (the worker that actually sends the email).

The flow: the producer emits a task, but it cannot hand it to the consumer directly; the task first goes into the task queue. The consumer listens on the queue and executes tasks as they arrive. Celery itself does not provide the task queue; it relies on an external broker such as RabbitMQ or Redis, either of which can act as the middleman, and here we use Redis. With this design nothing blocks.

Install Celery: pip install celery. Install Redis: pip install redis. To verify that redis installed correctly, open a cmd window, run python, then import redis; if no "module not found" error appears, the installation succeeded.

How do we use Celery? Create a new package in the project directory, celery_tasks, and inside it a new file, tasks.py (see the sketch below). My consumer runs on Linux in a virtual machine, so to start tasks there I also need to copy the project code onto it and install Celery on the VM before starting:

1. workon lq_py3 (the virtualenv name): this activates the virtual environment;
2. pip freeze
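As a concrete illustration of the layout described above, a minimal sketch of celery_tasks/tasks.py with Redis as the broker; the host, database number and task body are placeholders to adapt:

    # celery_tasks/tasks.py -- minimal sketch, Redis acting as the task queue
    from celery import Celery

    # the Redis instance used as the broker; adjust host/port/db to your setup
    app = Celery('celery_tasks.tasks', broker='redis://127.0.0.1:6379/1')

    @app.task
    def send_register_email(to_email):
        # placeholder body: the real project would call Django's email machinery here
        print('sending activation email to', to_email)

The Django code (the producer) then calls send_register_email.delay('user@example.com'), and the worker on the VM is started with something like celery -A celery_tasks.tasks worker -l info.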

Check status of Celery worker

谁都会走 submitted on 2019-12-05 06:55:37
Question: I have a project that uses Celery. I am periodically running into a scenario where my requests are making it to Celery, but the tasks aren't being handed off to the workers; rather, the server is just returning a 500 error. When I restart Celery it starts working again. I am only guessing that the worker is hanging, which means there aren't any more workers available. If I start up another batch of workers, the requests start working again (which supports my theory). Questions: I
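One way to check worker liveness from code is Celery's inspect/control API; a hedged sketch, assuming app is the project's Celery instance (the import path is a placeholder):

    # sketch: ping the workers and list what each one is currently executing
    from myproject.celery import app   # assumed location of the Celery app

    insp = app.control.inspect(timeout=1)

    replies = insp.ping() or {}         # empty result means no worker answered
    print("alive workers:", list(replies))

    active = insp.active() or {}        # tasks currently running, per worker
    for worker, tasks in active.items():
        print(worker, "is running", len(tasks), "task(s)")

The same information is available from the command line via celery status and celery inspect active, which is handy for a health check outside the web process.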

Django Celery beat crashes on start

﹥>﹥吖頭↗ submitted on 2019-12-05 06:45:11
I have recently configured a new server with RabbitMQ and Celery. When I try to start Celery beat on the machine, it starts for a few seconds and stops. I have given the right permissions to the log files and changed the owner to the application user. I have also checked the celerybeat.log file and NO errors are registered. I tried to start it this way in the project folder:

    ./manage.py celerybeat

And I got this error:

    [2010-12-01 09:59:46,127: INFO/MainProcess] process shutting down

Could someone please point me in the right direction here?

First thing to do in my opinion and launch it for show
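Since the log shows nothing, two quiet failure modes worth ruling out are a stale pid file and a schedule file celerybeat cannot write. A small hedged sketch; the file names are the conventional defaults when beat is run from the project folder, so adjust them if yours differ:

    # sketch: look for the usual silent-exit suspects of celerybeat
    import os

    for path in ("celerybeat.pid", "celerybeat-schedule"):
        if os.path.exists(path):
            print(path, "exists, writable:", os.access(path, os.W_OK))
        else:
            print(path, "not present")

Running ./manage.py celerybeat --loglevel=DEBUG in the foreground also tends to surface the actual reason for the shutdown.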

django celery - how to send request.FILES['photo'] to task

痴心易碎 submitted on 2019-12-05 06:20:23
I'm trying to send request.FILES['photo'], an uploaded file from my site, to Celery via:

    tasks.upload_photos.delay(img=request.FILES['photo'])

I get a pickle error because it cannot serialize it. What is the way to send a file to a task? Error: "can't pickle StringO objects". Thanks.

Read the file contents into a string, then pack it with the content type in a dict and send that. If you plan on saving the file, you can save the file to a model, then pass the id/pk to a Celery task.

Source: https://stackoverflow.com/questions/4330719/django-celery-how-to-send-request-filesphoto-to-task
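A hedged sketch of the first suggestion; enqueue_photo and the dict layout are illustrative, only request.FILES['photo'] and tasks.upload_photos come from the question:

    # sketch: ship the upload as plain bytes plus metadata instead of the file object
    from myapp import tasks   # assumed module that defines upload_photos

    def enqueue_photo(request):
        upload = request.FILES['photo']
        payload = {
            'name': upload.name,
            'content_type': upload.content_type,
            'data': upload.read(),     # raw bytes serialize without the pickle error
        }
        tasks.upload_photos.delay(img=payload)

On the worker side the task writes payload['data'] wherever it needs to; alternatively, per the second suggestion, save the upload to a model in the view and pass only the primary key through the queue.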