celery

using class methods as celery tasks

Anonymous (unverified), submitted 2019-12-03 01:45:01
Question: I'm trying to use the methods of a class as django-celery tasks, marking them up with the @task decorator. The same situation is described here, asked by Anand Jeyahar. It's something like this:

```python
class A:
    @task
    def foo(self, bar):
        ...

def main():
    a = A()
    ...
    # what I need
    a.foo.delay(bar)  # executes as a celery task
    a.foo(bar)        # executes locally
```

The problem is that even if I use the class instance like this, a.foo.delay(bar), it says that foo needs at least two arguments, which means that the self pointer is missing. More information: I can't convert class to
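
A plain @task decorator never receives `self`, so a common workaround is to keep the task itself a module-level function that builds (or looks up) the instance. A minimal sketch, assuming an app named `app` and an illustrative Redis broker:

```python
from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')  # hypothetical broker

class A:
    def foo(self, bar):
        return bar * 2

@app.task
def foo_task(bar):
    # Construct the instance inside the task so the worker never
    # needs `self` serialized over the broker.
    return A().foo(bar)

# foo_task.delay(42)  # executes as a celery task
# A().foo(42)         # executes locally
```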

How to stop celery worker process

杀马特。学长 韩版系。学妹, submitted 2019-12-03 01:43:08
Question: I have a Django project on an Ubuntu EC2 node, which I have been using to set up asynchronous task processing with Celery. I am following this along with the docs. I've been able to get a basic task working at the command line, using:

```
(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=myproject.celery:app worker --loglevel=INFO
```

to start a worker. I have since made some changes to the Python code, but realized that I need to restart the worker. From the command line, I've tried: ps auxww | grep 'celery
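
Besides grepping for PIDs, Celery's remote-control API can request a warm shutdown, where workers finish their current tasks and exit; sending SIGTERM to the worker's main process does the same from the shell. A minimal sketch, reusing the app path from the question:

```python
from myproject.celery import app  # app module taken from the question

# Ask all running workers to finish current tasks and exit (warm shutdown).
app.control.shutdown()

# Or target a specific node by its hostname:
# app.control.shutdown(destination=['celery@ip-172-31-22-65'])
```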

Celery worker's log contains question marks (???) instead of correct unicode characters

Anonymous (unverified), submitted 2019-12-03 01:36:02
Question: I'm using Celery 3.1.18 with Python 2.7.8 on CentOS 6.5. In a Celery task module, I have the following code:

```python
# someapp/tasks.py
from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@shared_task()
def foo():
    logger.info('Test output: %s', u"测试中")
```

I use the init.d script here to run a Celery worker. I also put the following settings in /etc/default/celeryd:

```
CELERYD_NODES="bar"
# %N will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var
```
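
The usual culprit for `???` in daemonized worker logs is the locale of the init script environment rather than Celery itself, so exporting a UTF-8 `LANG`/`LC_ALL` in /etc/default/celeryd is worth trying first. As a Python-side alternative, here is a minimal sketch that re-attaches the task logger's file handler with an explicit encoding; the signal is standard Celery API, but treating this as a sufficient fix is an assumption:

```python
import logging

from celery.signals import after_setup_task_logger

@after_setup_task_logger.connect
def force_utf8_logfile(logger=None, logfile=None, **kwargs):
    # Swap any plain FileHandler for one that writes UTF-8, so unicode
    # log arguments survive a non-UTF-8 daemon locale.
    if not logfile:
        return
    for handler in list(logger.handlers):
        if isinstance(handler, logging.FileHandler):
            logger.removeHandler(handler)
    logger.addHandler(logging.FileHandler(logfile, encoding='utf-8'))
```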

Celery beat - different time zone per task

Anonymous (unverified), submitted 2019-12-03 01:34:02
Question: I am using celery beat to schedule some tasks. I'm able to use the CELERY_TIMEZONE setting to schedule tasks with the crontab schedule, and they run at the scheduled time in that time zone. But I want to be able to set up multiple such tasks for different time zones in the same application (a single django settings.py). I know which task needs to run in which time zone at the point where the task is scheduled. Is it possible to specify a different time zone for each of the tasks? I'm using django (1.4) with celery (3.0.11) and django
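
One approach that has been suggested for per-task time zones is giving each crontab schedule its own clock via crontab's `nowfun` argument, so each schedule is evaluated against that zone's local time. A minimal sketch, with the task name and zones as illustrative assumptions:

```python
from datetime import datetime
from functools import partial

import pytz
from celery.schedules import crontab

def tz_now(tz_name):
    # Clock used by beat when deciding whether the crontab is due.
    return datetime.now(pytz.timezone(tz_name))

CELERYBEAT_SCHEDULE = {
    'ny-report': {
        'task': 'reports.generate',  # hypothetical task name
        'schedule': crontab(hour=9, minute=0,
                            nowfun=partial(tz_now, 'America/New_York')),
    },
    'tokyo-report': {
        'task': 'reports.generate',
        'schedule': crontab(hour=9, minute=0,
                            nowfun=partial(tz_now, 'Asia/Tokyo')),
    },
}
```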

How do I run celery status/flower without the -A option?

Anonymous (unverified), submitted 2019-12-03 01:33:01
Question: Consider this bash session:

```
$ export DJANGO_SETTINGS_MODULE=web.settings
$ celery status -b redis://redis.businessoptics.dev:6379/1 -t 10
Error: No nodes replied within time constraint.
$ celery status -b redis://redis.businessoptics.dev:6379/1 -t 10 -A scaffold.tasks.celery_app
celery@worker.9e2c39a1c42c: OK
```

Why do I need the -A option? As far as I can tell, celery should be able to detect the necessary metadata on redis. Similarly, if I run celery flower -b <redis url>, it shows that it successfully connects to redis but doesn't show any real
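
`celery status` and flower are inspection commands: they broadcast over control exchanges whose names and transport options come from the app's configuration, not from anything persisted in redis, which is presumably why a bare -b is not enough. If importing the full project app is inconvenient, a tiny stub app can serve as the -A target; a minimal sketch, reusing the broker URL from the question:

```python
# inspect_app.py -- a deliberately tiny app, used as:
#   celery -A inspect_app status
from celery import Celery

app = Celery(broker='redis://redis.businessoptics.dev:6379/1')
```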

celeryev Queue in RabbitMQ Becomes Very Large

Anonymous (unverified), submitted 2019-12-03 01:33:01
Question: I am using celery on rabbitmq. I have been sending thousands of messages to the queue; they are being processed successfully and everything is working just fine. However, the number of messages in several rabbitmq queues is growing quite large (hundreds of thousands of items in the queue). The queues are named celeryev.[...] (see screenshot below). Is this appropriate behavior? What is the purpose of these queues, and shouldn't they be regularly purged? Is there a way to purge them more regularly? I think they are taking up quite a bit of
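
For context: the celeryev.* queues hold monitoring events, created when a monitor such as flower or `celery events` attaches; if no consumer keeps draining them, they grow without bound. The broker can be told to expire both the messages and idle queues. A minimal sketch using the event-queue settings (shown in the old uppercase style matching this era; newer Celery spells them event_queue_ttl / event_queue_expires):

```python
from celery import Celery

app = Celery('proj', broker='amqp://guest@localhost//')  # hypothetical broker

# Let RabbitMQ drop stale event messages and delete idle event queues.
app.conf.CELERY_EVENT_QUEUE_TTL = 10       # seconds a message may live
app.conf.CELERY_EVENT_QUEUE_EXPIRES = 60   # delete a queue unused for 60s

# If nothing consumes events at all, not emitting them avoids these
# queues entirely (and don't start workers with -E):
# app.conf.CELERY_SEND_EVENTS = False
```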

Airflow + Cluster + Celery + SQS - Airflow Worker: 'Hub' object has no attribute '_current_http_client'

Anonymous (unverified), submitted 2019-12-03 01:05:01
Question: I'm trying to cluster my Airflow setup and I'm using this article to do so. I just configured my airflow.cfg file to use the CeleryExecutor, I pointed my sql_alchemy_conn to my postgresql database that's running on the same master node, I set the broker_url to use AWS SQS (I didn't set the access_key_id or secret_key; since it's running on an EC2 instance it doesn't need those), and I set the celery_result_backend to my postgresql server too. I saved my new airflow.cfg changes, I ran airflow initdb, and then I ran airflow scheduler
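
This particular traceback has often been traced to kombu's asynchronous HTTP hub, which the SQS transport depends on and which in turn needs a working pycurl; whether that is the cause here is an assumption. A minimal sketch of a sanity check, run in the exact environment the airflow worker uses:

```python
# Verify pycurl is importable where the worker runs; a missing or
# broken pycurl build is a common cause of Hub attribute errors
# when Celery talks to SQS.
try:
    import pycurl
    print('pycurl OK:', pycurl.version)
except ImportError as exc:
    print('pycurl missing or broken:', exc)
```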

Why use Celery instead of RabbitMQ?

微笑、不失礼, submitted 2019-12-03 01:02:23
Question: From my understanding, Celery is a distributed task queue, which means the only thing it should do is dispatch tasks/jobs to other servers and get the results back. RabbitMQ is a message queue, and nothing more. However, a worker could just listen to the MQ and execute the task when a message is received. This achieves exactly what Celery offers, so why is Celery needed at all?

Answer 1: You are right, you don't need Celery at all. When you are designing a distributed system there are a lot of
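
To make the division of labor concrete: RabbitMQ only moves bytes between producers and consumers, while Celery supplies the worker machinery around it. A minimal sketch of what the one decorator buys you (serialization, acknowledgement, retries, a result handle), with broker and backend URLs as illustrative assumptions:

```python
from celery import Celery

app = Celery('proj',
             broker='amqp://guest@localhost//',  # RabbitMQ does the transport
             backend='rpc://')                   # Celery adds result handling

@app.task(bind=True, max_retries=3)
def resize_image(self, image_id):
    # With raw RabbitMQ you would hand-roll serialization, acking,
    # retry bookkeeping and result storage around this function.
    try:
        ...  # the actual work
    except IOError as exc:
        raise self.retry(exc=exc, countdown=5)

# result = resize_image.delay(42)
# result.get(timeout=10)
```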

ImportError: No module named celery for Celery 3.1 and Python 2.7

Anonymous (unverified), submitted 2019-12-03 01:00:01
Question: Using Python 2.7 and Celery 3.1.25 on Windows, when we run the Celery worker using celery -A proj worker -l info we get the error ImportError: No module named celery. Problem: the worker stopped working when we renamed the file celery.py to celeryApp.py and changed the import statement in tasks.py from `from .celery import app` to `from celeryApp import app`. Why is this happening? How can we fix the problem? Directory structure:

```
/proj/__init__.py
/proj/celeryApp.py
/proj/tasks.py
```

/proj/celeryApp.py:

```python
from __future__ import absolute_import,
```
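
The likely mismatch: `celery -A proj` imports the proj package and then looks for the conventional proj/celery.py module (or an app attribute on the package), so after the rename the worker has to be pointed at the new module path explicitly. A minimal sketch of the renamed layout, with the broker URL as an illustrative assumption:

```python
# proj/celeryApp.py
from __future__ import absolute_import

from celery import Celery

app = Celery('proj',
             broker='amqp://guest@localhost//',  # hypothetical broker
             include=['proj.tasks'])

# proj/tasks.py then uses a package-qualified import:
#   from proj.celeryApp import app
# and the worker is started with the full module path:
#   celery -A proj.celeryApp worker -l info
```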

Callback for celery apply_async

Anonymous (unverified), submitted 2019-12-03 00:59:01
Question: I use celery in my application to run periodic tasks. Let's look at the simple example below:

```python
from myqueue import Queue

@periodic_task(run_every=timedelta(minutes=1))
def process_queue():
    queue = Queue()
    uid, questions = queue.pop()
    if uid is None:
        return
    job = group(do_stuff(q) for q in questions)
    job.apply_async()

def do_stuff(question):
    try:
        ...
    except:
        ...
        raise
```

As you can see in the example above, I use celery to run an async task, but (since it's a queue) I need to do queue.fail(uid) in case of an exception in do_stuff, or queue.ack(uid) otherwise. In
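
One way to express this with Celery primitives is a chord, whose body runs once after every task in the group succeeds, plus an error link for the failure path. A minimal sketch: Queue comes from the question, the argument order relies on Celery prepending the group's results (or the failed task's id) to partial signature arguments, and Signature.on_error needs Celery 4+ (older versions pass link_error to apply_async):

```python
from celery import chord, group, shared_task

from myqueue import Queue  # the question's own queue class

@shared_task
def ack_queue(results, uid):
    # Runs once after every do_stuff in the group succeeded;
    # `results` (the group's return values) is prepended by Celery.
    Queue().ack(uid)

@shared_task
def fail_queue(task_id, uid):
    # Error link: the failed task's id is prepended instead.
    Queue().fail(uid)

# Inside process_queue(), replacing the bare group
# (do_stuff must itself be a task for .s() to work):
# callback = ack_queue.s(uid).on_error(fail_queue.s(uid))
# chord(group(do_stuff.s(q) for q in questions))(callback)
```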