celery

Kombu/Celery messaging

这一生的挚爱 submitted on 2019-12-12 03:37:14
Question: I have a simple application that sends and receives messages with Kombu, and uses Celery to run the message handling as a task. With Kombu alone, I receive the message properly: when I send "Hello", Kombu receives "Hello". But once I added the task, what Kombu receives is the Celery task ID. My purpose for this project is to be able to schedule when messages are sent and received, hence Celery. What I would like to know is why Kombu is receiving the task ID instead of the sent message. I have searched and…
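A likely explanation (an assumption here, since the question is cut off): calling a task's .delay() does not publish your payload as-is; it publishes a Celery task envelope to the broker, and the AsyncResult it returns essentially just carries the task ID. A minimal sketch contrasting the two, assuming a local AMQP broker and illustrative names:

```python
from celery import Celery
from kombu import Connection, Exchange, Queue

app = Celery('demo', broker='amqp://localhost//')

@app.task
def handle(body):
    return body

exchange = Exchange('messages', type='direct')
queue = Queue('messages', exchange, routing_key='messages')

with Connection('amqp://localhost//') as conn:
    producer = conn.Producer()
    # Raw Kombu publish: a consumer of 'messages' sees exactly "Hello".
    producer.publish('Hello', exchange=exchange, routing_key='messages',
                     declare=[queue])

# Celery dispatch: the broker message is a task envelope, not the bare
# string, and printing the returned AsyncResult shows the task ID.
result = handle.delay('Hello')
print(result.id)
```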

Sharing an Oracle database connection between simultaneous Celery tasks

痴心易碎 submitted on 2019-12-12 02:49:21
Question: I'm working with Python 2.7, Celery, and cx_Oracle to access an Oracle database. I create a lot of tasks; each task runs a query through cx_Oracle, and many of these tasks run simultaneously. All tasks should share the same database connection. If I launch only one task, the query runs correctly. However, if I launch several, I start getting this error message: [2016-04-04 17:12:43,846: ERROR/MainProcess] Task tasks.run_query[574a6e7f-f58e-4b74-bc84-af4555af97d6] raised unexpected:…
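The usual remedy for this pattern (an assumption, since the question is truncated) is to stop sharing one cx_Oracle connection across concurrent tasks and instead open one connection per worker process. A sketch using Celery's worker_process_init signal, with a hypothetical DSN:

```python
import cx_Oracle
from celery import Celery
from celery.signals import worker_process_init

app = Celery('tasks', broker='amqp://localhost//')

_conn = None  # one connection per worker process, never shared between them

@worker_process_init.connect
def init_connection(**kwargs):
    # Runs once in each freshly forked worker process.
    global _conn
    _conn = cx_Oracle.connect('user/password@dbhost/service')  # hypothetical DSN

@app.task
def run_query(sql):
    cursor = _conn.cursor()
    try:
        cursor.execute(sql)
        return cursor.fetchall()
    finally:
        cursor.close()
```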

python celery multi unregistered task

◇◆丶佛笑我妖孽 submitted on 2019-12-12 02:47:33
Question: I have two queues in Celery and one tasks.py with tasks. When I run celery with celery worker -A myapp -l info -Q messages1 and celery worker -A myapp -l info -Q messages2 in two different terminals, it works fine and runs all my tasks. But if I run it with celery multi start 2 -Q:1 messages1 -Q:2 messages2 --loglevel=DEBUG I get [2014-05-08 15:30:33,020: ERROR/MainProcess] Received unregistered task of type . What am I doing wrong? UPDATE: I've found that celery worker -A myapp -l info -Q…
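A plausible cause, judging only from the commands shown: the celery multi invocation never passes -A myapp, so the nodes start without importing tasks.py and every incoming task type is unregistered. A sketch of the corrected command:

```bash
# Sketch: give celery multi the same app argument the working
# celery worker commands used, so each node registers the tasks.
celery multi start 2 -A myapp -Q:1 messages1 -Q:2 messages2 --loglevel=DEBUG
```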

Why am I getting KeyError in Scrapy?

↘锁芯ラ submitted on 2019-12-12 01:24:47
Question: I am using Scrapy spiders inside Celery and I am getting this kind of error at random:

Unhandled Error
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/twisted/internet/base.py", line 428, in fireEvent
    DeferredList(beforeResults).addCallback(self._continueFiring)
  File "/usr/lib/python2.7/site-packages/twisted/internet/defer.py", line 321, in addCallback
    callbackKeywords=kw)
  File "/usr/lib/python2.7/site-packages/twisted/internet/defer.py", line 310, in addCallbacks
…
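One mitigation often suggested for random Twisted errors when crawling from Celery (offered as an assumption; the question is cut off before any answer): run each crawl in a fresh child process, so the reactor is never reused across tasks. A rough sketch with a hypothetical spider name:

```python
import multiprocessing

from celery import Celery
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

app = Celery('tasks', broker='amqp://localhost//')

def _crawl(spider_name):
    # Runs in a child process: a brand-new Twisted reactor every time.
    process = CrawlerProcess(get_project_settings())
    process.crawl(spider_name)
    process.start()  # blocks until the crawl finishes, then the child exits

@app.task
def run_spider(spider_name):
    p = multiprocessing.Process(target=_crawl, args=(spider_name,))
    p.start()
    p.join()
```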

Permission problems prevent celery from running as daemon?

点点圈 submitted on 2019-12-12 00:32:49
Question: I'm currently having some trouble running celery as a daemon. I use Apache to serve my Django application, so in the celery settings I set the uid and gid to "www-data". There are two places I know of so far that need access permissions: /var/log/celery/*.log and /var/run/celery/*.pid, and I have already set them to be owned by "www-data". However, celery couldn't start when I ran sudo service celeryd start. If I drop the --uid and --gid options from the command, celery starts as user "root".
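For reference, a sketch of the daemon configuration this setup typically relies on; the variable names come from Celery's generic celeryd init scripts, while the paths and app name here are assumptions:

```bash
# /etc/default/celeryd -- sketch, values are illustrative
CELERYD_NODES="worker1"
CELERY_BIN="/usr/local/bin/celery"
CELERY_APP="myproject"
CELERYD_USER="www-data"
CELERYD_GROUP="www-data"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_LOG_LEVEL="INFO"
# Let the init script create the pid/log directories with the right owner,
# instead of hand-managing permissions on them.
CELERY_CREATE_DIRS=1
```

Note that /var/run is often a tmpfs, so a directory chowned to www-data disappears on reboot; CELERY_CREATE_DIRS sidesteps that.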

Celery import and SQS connection issue

淺唱寂寞╮ submitted on 2019-12-12 00:28:55
Question: I'm trying to follow the documentation to get started with celery, but I'm running into hard-to-debug problems with the sample code. I can't tell if I'm hitting two sides of the same problem or two separate problems. I can make a connection to the SQS queue through the shell, but not from Django. I don't know how that behavior relates to the problems importing Celery vs. importing task. The "Getting Started" guide here: http://celery.github.com/celery/getting-started/first-steps-with…
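For comparison, a minimal sketch of an SQS broker setup in Django settings, using the old-style setting names current at the time of the question; the region and prefix are assumptions:

```python
# settings.py -- sketch; URL-encode the AWS keys if they contain '/' or '+'
BROKER_URL = 'sqs://AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY@'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',           # illustrative region
    'queue_name_prefix': 'celery-',  # keeps Celery's queues grouped in SQS
}
```

If the shell works but Django does not, it is worth checking that these settings are actually loaded in the Django process (e.g. via DJANGO_SETTINGS_MODULE) rather than only in the shell session.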

Starting worker with dynamic routing_key?

走远了吗. submitted on 2019-12-11 20:24:15
Question: I have one queue with several task types and I need to run a worker for a specific task type. Something like: 'celery worker --routing_key task.type1 --app=app'. Queue configuration: CELERY_QUEUES = ( Queue('myqueue', routing_key='task.#'), ) CELERY_DEFAULT_EXCHANGE_TYPE = 'topic'. Using pika this is easy to solve: http://www.rabbitmq.com/tutorials/tutorial-five-python.html but how do I do it with celery? Answer 1: No, you can't bind a worker to a routing_key. Workers consume queues, not routing_keys. Producers…
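A sketch of the queue-per-type arrangement the answer implies (queue names are illustrative): declare a dedicated queue per routing key on the shared topic exchange, then point each worker at just the queue it should drain.

```python
# celeryconfig sketch: one queue per task type on a shared topic exchange
from kombu import Exchange, Queue

exchange = Exchange('tasks', type='topic')

CELERY_QUEUES = (
    Queue('type1', exchange, routing_key='task.type1'),
    Queue('type2', exchange, routing_key='task.type2'),
)
```

A worker started with celery worker --app=app -Q type1 then consumes only messages routed with task.type1.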

Trouble killing celery process

99封情书 submitted on 2019-12-11 19:27:38
Question: I have a Django project on an Ubuntu EC2 node, which I have been using to set up an asynchronous task queue with Celery. I am following http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/ along with the docs. I've been able to get a basic task working at the command line, using (env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=myproject.celery:app worker --loglevel=INFO to start a worker. I have since made some changes to the Python…
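Since the question is cut off, here is only the generic shell recipe for finding and stopping stray workers (standard tools, not the accepted answer):

```bash
# List running worker processes
ps aux | grep 'celery worker'

# Warm shutdown: workers finish their current tasks, then exit
pkill -TERM -f 'celery worker'

# Last resort: force-kill, which abandons in-flight tasks
pkill -9 -f 'celery worker'
```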

Long celery task causes MySQL timeout in Django - options?

為{幸葍}努か submitted on 2019-12-11 19:08:16
Question: I have a celery task which takes about 6 hours. At the end of it, Django (or possibly Celery) raises the exception "MySQL server has gone away". After doing some reading, it appears that this is a known issue with long tasks. I don't (think I) have control over pinging or otherwise keeping the connection alive mid-task, but the exception is raised after the time-consuming call has finished (though still within the task function). Is there a call I can make within the function to re-establish the connection? (I have run…
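One commonly cited fix (offered here as an assumption, since the answer is truncated): explicitly close Django's stale connection after the long step, and let the ORM open a fresh one on the next query. A sketch with hypothetical names:

```python
import time

from django.db import connection
from myproject.celery import app   # hypothetical app module
from myapp.models import Report    # hypothetical model

@app.task
def long_task():
    time.sleep(6 * 60 * 60)        # stands in for the real six-hour step
    # MySQL has dropped the idle connection by now (its wait_timeout is
    # often lowered well below the 8-hour default); discard it explicitly.
    connection.close()
    # Django reconnects automatically on the next query.
    Report.objects.create(status='done')
```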

Celery task for file uploading in django wizard

懵懂的女人 submitted on 2019-12-11 18:18:47
Question: I have a WizardView covering two forms, and the second one has a FileField. Is it possible to create a Celery task for uploading a file from that FileField? Should I create another FILE_UPLOAD_HANDLER? All the information about handling files with the wizard that I found at https://docs.djangoproject.com is about having to add a file_storage to the WizardView subclass. Answer 1: Actually, uploading a file is a request, therefore you need to handle it with a view and then do whatever you want to do,…
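A sketch of the pattern the answer points toward (field, URL, and class names are illustrative): let the wizard save the upload to storage during the request, then hand only the stored path to a Celery task for the slow part.

```python
from django.core.files.storage import default_storage
from django.shortcuts import redirect
from formtools.wizard.views import SessionWizardView

from myproject.celery import app  # hypothetical app module

@app.task
def process_upload(path):
    # Slow post-processing runs out of band, after the request has returned.
    with default_storage.open(path) as f:
        data = f.read()
    # ... do the slow work with `data` here

class UploadWizard(SessionWizardView):  # assumes file_storage is configured
    def done(self, form_list, **kwargs):
        upload = self.get_cleaned_data_for_step('1')['attachment']  # hypothetical field
        path = default_storage.save(upload.name, upload)
        process_upload.delay(path)      # pass the stored path, not the file object
        return redirect('upload-done')  # hypothetical URL name
```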