kombu

Celery Error 'No such transport: amqp'

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-10 21:34:22
Question: Celery was working fine; one day the command-line worker failed to start up with the following trace:

Traceback (most recent call last):
  File "/home/buildslave/venv/bin/celery", line 9, in <module>
    load_entry_point('celery==3.0.7', 'console_scripts', 'celery')()
  File "/home/buildslave/venv/local/lib/python2.7/site-packages/celery/__main__.py", line 14, in main
    main()
  File "/home/buildslave/venv/local/lib/python2.7/site-packages/celery/bin/celery.py", line 942, in main
    cmd.execute_from
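Below is a minimal sketch (not from the original question) of how kombu resolves the transport named in the broker URL; the broker address and credentials are placeholder assumptions. With celery/kombu of that era, "No such transport: amqp" typically means the AMQP transport entry cannot be loaded, for example because the py-amqp package is missing or incompatible.

```python
# Hedged sketch: check whether kombu can resolve the 'amqp' transport at all.
# URL and credentials are placeholders, not values from the question.
from kombu import Connection

with Connection('amqp://guest:guest@localhost:5672//') as conn:
    # Raises KeyError("No such transport: amqp") when the transport cannot be
    # resolved, e.g. if the py-amqp package is absent or incompatible.
    conn.connect()
    print(conn.transport_cls)   # name of the transport kombu picked
```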

Python: Kombu+RabbitMQ Deadlock - queues are either blocked or blocking

Submitted by 徘徊边缘 on 2019-12-10 13:21:49
Question: The problem: I have a RabbitMQ server that serves as a queue hub for one of my systems. In the last week or so, its producers come to a complete halt every few hours. What I have tried: Brute force: stopping the consumers releases the lock for a few minutes, but then the blocking returns; restarting RabbitMQ solves the problem for a few hours. I have an automatic script that does the ugly restarts, but it's obviously far from a proper solution. Allocating more memory: following cantSleepNow's
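A small diagnostic sketch may help with this symptom (an assumption about the environment, not something stated in the excerpt): when RabbitMQ's memory or disk alarm fires, the broker blocks publishing connections, which looks exactly like producers halting. With the management plugin enabled (default port 15672; guest/guest credentials assumed), the connection states can be inspected over HTTP:

```python
# Hedged sketch: list RabbitMQ connections and their states via the management API.
# Host, port and credentials are placeholder assumptions.
import requests

resp = requests.get('http://localhost:15672/api/connections', auth=('guest', 'guest'))
resp.raise_for_status()

for conn in resp.json():
    # 'blocked'/'blocking' means the broker is throttling publishers, typically
    # because the vm_memory_high_watermark or disk free limit was reached.
    print(conn.get('name'), conn.get('state'))
```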

Celery beat not starting EOFError('Ran out of input')

Submitted by 孤者浪人 on 2019-12-10 11:02:03
Question: Everything worked perfectly fine until:

celery beat v3.1.18 (Cipater) is starting.
Configuration ->
    . broker -> amqp://user:**@staging-api.user-app.com:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> /tmp/beat.db
    . logfile -> [stderr]@%INFO
    . maxinterval -> now (0s)
[2015-09-25 17:29:24,453: INFO/MainProcess] beat: Starting...
[2015-09-25 17:29:24,457: CRITICAL/MainProcess] beat raised exception <class 'EOFError'>:

Consuming a rabbitmq message queue with multiple threads (Python Kombu)

Submitted by 半腔热情 on 2019-12-09 18:08:12
Question: I have a single RabbitMQ exchange with a single queue. I wish to create a daemon that runs multiple threads and works through this queue as quickly as possible. The "work" involves communicating with external services, so there will be a fair amount of blocking going on within each consumer. As such, I want to have multiple threads all dealing with messages from the same queue. I can achieve this by consuming the queue on my primary thread, and then farming the incoming work off to a pool of

Celery beat not starting EOFError('Ran out of input')

Submitted by 不羁的心 on 2019-12-06 04:28:33
Everything worked perfectly fine until:

celery beat v3.1.18 (Cipater) is starting.
Configuration ->
    . broker -> amqp://user:**@staging-api.user-app.com:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> /tmp/beat.db
    . logfile -> [stderr]@%INFO
    . maxinterval -> now (0s)
[2015-09-25 17:29:24,453: INFO/MainProcess] beat: Starting...
[2015-09-25 17:29:24,457: CRITICAL/MainProcess] beat raised exception <class 'EOFError'>: EOFError('Ran out of input',)
Traceback (most recent call last):
  File "/home/user/staging/venv/lib/python3.4
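The config dump points at a likely culprit worth illustrating (a hedged guess, not a confirmed answer from the thread): beat's PersistentScheduler stores its state in a shelve database, here /tmp/beat.db, and an empty or truncated file raises EOFError('Ran out of input') when it is unpickled. Removing the file lets beat recreate it on the next start:

```python
# Hedged sketch: check whether the beat schedule file is readable and remove it
# if it is corrupted, so celery beat can rebuild it.  The path comes from the
# config dump above; the recovery strategy itself is an assumption.
import os
import shelve

SCHEDULE_FILE = '/tmp/beat.db'

try:
    with shelve.open(SCHEDULE_FILE, flag='r') as db:
        print(dict(db))                          # file is intact and readable
except Exception as exc:                         # dbm error / EOFError on corruption
    print('schedule file unreadable (%r); removing it' % (exc,))
    for suffix in ('', '.db', '.dat', '.dir', '.bak'):   # shelve may split files
        path = SCHEDULE_FILE + suffix
        if os.path.exists(path):
            os.remove(path)
```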

What are the django-celery (djcelery) tables for?

Submitted by 萝らか妹 on 2019-12-05 12:24:48
Question: When I run syncdb, I notice a lot of tables created, like: djcelery_crontabschedule ... djcelery_taskstate. django-kombu is providing the transport, so it can't be related to the actual queue. Even when I run tasks, I still see nothing populated in these tables. What are these tables used for? Monitoring purposes only, if I enable it? If so, is it also true that a lookup of AsyncResult() is actually looking up the task result via the django-kombu tables instead of

Differentiate celery, kombu, PyAMQP and RabbitMQ/ironMQ

Submitted by 半城伤御伤魂 on 2019-12-04 11:42:32
Question: I want to upload images to an S3 server, but before uploading I want to generate thumbnails in 3 different sizes, and I want it done outside the request/response cycle, hence I am using celery. I have read the docs; here is what I have understood. Please correct me if I am wrong. Celery helps you manage your task queues outside the request/response cycle. Then there is something called carrot/kombu; it's a django middleware that packages tasks that get created via celery. Then the third layer

Consuming a rabbitmq message queue with multiple threads (Python Kombu)

Submitted by [亡魂溺海] on 2019-12-04 04:59:04
I have a single RabbitMQ exchange with a single queue. I wish to create a daemon that runs multiple threads and works through this queue as quickly as possible. The "work" involves communicating with external services, so there will be a fair amount of blocking going on within each consumer. As such, I want to have multiple threads all dealing with messages from the same queue. I can achieve this by consuming the queue on my primary thread, and then farming the incoming work off to a pool of other threads, but is there a way to launch multiple consumers, each within their own threaded context?
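One hedged way to get what the question asks for, sketched below rather than taken from the thread: give every worker thread its own kombu Connection and Consumer, since connections and channels should not be shared across threads. Broker URL, exchange and queue names are placeholder assumptions.

```python
# Hedged sketch: several threads, each with its own kombu connection/consumer,
# all draining the same queue.  Names and URL are assumptions for illustration.
import threading
from kombu import Connection, Exchange, Queue

BROKER_URL = 'amqp://guest:guest@localhost:5672//'
task_exchange = Exchange('tasks', type='direct')
task_queue = Queue('tasks', task_exchange, routing_key='tasks')

def worker(thread_no):
    def handle(body, message):
        # the slow, blocking call to the external service would go here
        print('thread %d got %r' % (thread_no, body))
        message.ack()

    # each thread builds its own connection; prefetch_count=1 keeps at most one
    # unacknowledged message per consumer so work is spread across the threads
    with Connection(BROKER_URL) as conn:
        with conn.Consumer(task_queue, callbacks=[handle], prefetch_count=1):
            while True:
                conn.drain_events()

threads = [threading.Thread(target=worker, args=(i,), daemon=True) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```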

What are the django-celery (djcelery) tables for?

Submitted by 隐身守侯 on 2019-12-04 01:35:54
When I run syncdb, I notice a lot of tables created, like: djcelery_crontabschedule ... djcelery_taskstate. django-kombu is providing the transport, so it can't be related to the actual queue. Even when I run tasks, I still see nothing populated in these tables. What are these tables used for? Monitoring purposes only, if I enable it? If so, is it also true that a lookup of AsyncResult() is actually looking up the task result via the django-kombu tables instead of djcelery? Thanks.

Mauro Rocco: The celery task_state table, populated by the daemon celerycam, is just
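For context, here is a hedged sketch of a djcelery-era settings.py showing which setting feeds which group of tables; the app names and settings follow django-celery/django-kombu documentation of that period, and anything beyond what the excerpt states is an assumption.

```python
# Hedged sketch of settings.py for the django-celery / django-kombu stack.
import djcelery
djcelery.setup_loader()

INSTALLED_APPS = [
    # ...
    'djcelery',   # creates the djcelery_* tables (periodic tasks, task state, results)
    'djkombu',    # django-kombu: the table that actually holds queued messages
]

BROKER_URL = 'django://'              # kombu's Django transport, provided by django-kombu
CELERY_RESULT_BACKEND = 'database'    # AsyncResult() then reads djcelery's result table
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # uses djcelery_crontabschedule etc.

# djcelery_taskstate is only populated by the snapshot camera, started separately:
#   python manage.py celerycam
```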

Differentiate celery, kombu, PyAMQP and RabbitMQ/ironMQ

Submitted by 久未见 on 2019-12-03 07:28:37
I want to upload images to an S3 server, but before uploading I want to generate thumbnails in 3 different sizes, and I want it done outside the request/response cycle, hence I am using celery. I have read the docs; here is what I have understood. Please correct me if I am wrong. Celery helps you manage your task queues outside the request/response cycle. Then there is something called carrot/kombu; it's a django middleware that packages tasks that get created via celery. Then there is the third layer, PyAMQP, which facilitates the communication of carrot to a broker, e.g. RabbitMQ, AmazonSQS, ironMQ etc.
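A hedged sketch of where each piece actually sits (names below are placeholders, and it corrects the "middleware" guess in the question rather than quoting it): Celery is the task framework; kombu, which replaced carrot, is the messaging library Celery uses to talk to a broker; py-amqp is one of kombu's transports; RabbitMQ, IronMQ and SQS are brokers. None of them are Django middleware.

```python
# Hedged sketch: a celery task for the thumbnail use case in the question.
# Module name, broker URL and the task body are assumptions for illustration.
from celery import Celery

# Celery (task framework) -> kombu (messaging library) -> py-amqp (transport)
# -> RabbitMQ (broker); changing the broker URL swaps the transport/broker.
app = Celery('thumbs', broker='amqp://guest:guest@localhost:5672//')

@app.task
def make_thumbnails(image_path):
    # generate the 3 thumbnail sizes and upload them to S3 here,
    # outside the web request/response cycle
    return image_path
```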