celery

Python redis and celery too many clients, different errors on each execution | Tasks connect to MySQL using pymysql

Submitted by 折月煮酒 on 2020-01-05 07:52:06
Question: I am currently working on an app that has to process several long-running tasks. I am using Python 3, Flask, Celery, and Redis. I have a working solution on localhost, but on Heroku there are many errors, and each execution of the app triggers a different set of errors. I know it can't be random, so I am trying to figure out where to start looking. I have a feeling something must be wrong with Redis, and I am trying to understand what clients are and where they come from, but I am…
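A common cause of "too many clients" on hosted Redis is that every worker process opens its own broker and result-backend connections, quickly exhausting the plan's client limit. A minimal sketch of settings that cap those pools (the setting names are real lowercase Celery config keys; the specific limits are assumptions to tune against your Redis plan):

```python
# Hypothetical settings fragment: caps how many Redis connections Celery opens,
# so several dynos/worker processes don't exceed the Redis client limit.
CELERY_SETTINGS = {
    "broker_pool_limit": 1,         # broker connections kept open per process
    "redis_max_connections": 20,    # upper bound for the result-backend pool
    "broker_connection_timeout": 30,
}

# Applied to an app with: app.conf.update(**CELERY_SETTINGS)
```

With these in place, the Redis `CLIENT LIST` count should stay roughly proportional to the number of worker processes rather than growing per task.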

Getting the latest tasks from Celery and displaying them using Django

Submitted by 余生颓废 on 2020-01-05 07:00:43
Question: I have a Django 1.5.1 webapp using Celery 3.0.23 with RabbitMQ 3.1.5 and sqlite3. I can submit jobs using a simple result = status.tasks.mymethod.delay(parameter), and all tasks execute correctly:

[2013-09-30 17:04:11,369: INFO/MainProcess] Got task from broker: status.tasks.prova[a22bf0b9-0d5b-4ce5-967a-750f679f40be]
[2013-09-30 17:04:11,566: INFO/MainProcess] Task status.tasks.mymethod[a22bf0b9-0d5b-4ce5-967a-750f679f40be] succeeded in 0.194540023804s: u'Done'

I want to display in a page the…
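One way to show the latest tasks in a page is to record each task's id and submission time at the moment you call .delay(), then query the most recent entries in the view. A stdlib-only sketch of that bookkeeping (the record structure and helper names are illustrative, not Celery or Django API; in a real app task_log would be a database table):

```python
from datetime import datetime

# Each dispatched task is recorded here; in Django this would be a model row
# created right after calling mymethod.delay(...).
task_log = []

def record_task(task_id, submitted_at=None):
    """Remember a dispatched task id and when it was submitted."""
    task_log.append({"id": task_id,
                     "submitted_at": submitted_at or datetime.utcnow()})

def latest_tasks(n=10):
    """Return the n most recently submitted tasks, newest first."""
    return sorted(task_log, key=lambda t: t["submitted_at"], reverse=True)[:n]
```

In the view, each stored id can then be turned into a live status with Celery's result API, e.g. AsyncResult(entry["id"]).state, before rendering the page.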

django-celery works in development, fails in wsgi production: How to debug?

Submitted by 帅比萌擦擦* on 2020-01-04 14:09:13
Question: I'm using the django-celery task queue, and it works fine in development but not at all in WSGI production. Even more frustrating, it used to work in production, but I somehow broke it. "sudo rabbitmqctl status" tells me that the RabbitMQ server is working. Everything also seems peachy in Django: objects are created and routed to the task manager without problems. But then their status just stays as "queued" indefinitely. The way I've written my code, they should switch to "error" or "ready…
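A quick way to isolate this kind of failure is to temporarily run tasks eagerly: with the eager flag set, tasks execute synchronously inside the web process and never touch RabbitMQ. If statuses then update correctly, the task code is fine and the problem is in the broker/worker wiring of the WSGI deployment. A sketch using the old-style uppercase setting names from the django-celery era (treat this as a debugging toggle, not production configuration):

```python
# Hypothetical debugging fragment for settings.py: run tasks in-process so
# the broker and worker are taken out of the equation entirely.
DEBUG_CELERY_SETTINGS = {
    "CELERY_ALWAYS_EAGER": True,
    # Re-raise task exceptions in the caller instead of swallowing them,
    # so a failing task shows up as a normal Django traceback.
    "CELERY_EAGER_PROPAGATES_EXCEPTIONS": True,
}
```

If tasks succeed eagerly but stay "queued" otherwise, the usual suspects are the WSGI process reading different settings (wrong BROKER_URL) or no worker actually consuming from the queue the web process publishes to.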

Updating many objects as result of Celery task: Django post_save or 2nd Celery task?

Submitted by 你离开我真会死。 on 2020-01-04 05:59:11
Question: I have a Celery task that runs periodically and may update a few objects on each run (an order of magnitude of ~10 would be the most per run). There are two other steps that need to occur if models are updated by that Celery task, and I had planned to use the Django model post_save signal to accomplish them. However, I realized that handling the update of a large number of objects (potentially tens or hundreds of thousands) in a post_save handler might not be ideal. Here's the flow of what I'm trying to…
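Instead of firing one post_save handler per object, the periodic task can collect the ids of everything it updated and enqueue a single follow-up task with that list. A stdlib sketch of the batching shape (process_batch and enqueue_followup are illustrative names; in Celery, enqueue_followup would be a second task's .delay() call):

```python
def process_batch(objects, update, enqueue_followup):
    """Update a batch of objects, collect the ids of those that actually
    changed, then hand the whole batch to ONE follow-up step instead of
    triggering a signal handler per object."""
    updated_ids = [obj["id"] for obj in objects if update(obj)]
    if updated_ids:
        # e.g. followup_task.delay(updated_ids) in the real Celery flow
        enqueue_followup(updated_ids)
    return updated_ids
```

The follow-up task can then do its work with bulk queries over the id list, which scales far better to tens of thousands of objects than per-instance signal handlers.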

Celery immediately exceeds memory on Heroku

Submitted by 南楼画角 on 2020-01-04 01:33:09
Question: I'm deploying a Celery process to Heroku, and every time it starts it immediately starts to rack up memory usage and crashes after it exceeds the maximum. I only have one task, called "test_task", that prints once per minute. This is a Django app using Celery with a Redis backend, hosted on Heroku. Procfile:

web: daphne chatbot.asgi:channel_layer --port $PORT --bind 0.0.0.0 --verbosity 1
chatworker: python manage.py runworker --verbosity 1
celeryworker: celery -A chatbot worker -l info

Heroku logs:…
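Two worker-side knobs usually tame memory on small Heroku dynos: lower the prefork concurrency (by default Celery forks one child per CPU, and each child is a full Python interpreter with the whole Django app loaded) and recycle children before they can grow. A hedged variant of the Procfile line, using Celery 4 flags (the values are assumptions for a 512 MB dyno, not measured numbers):

```
celeryworker: celery -A chatbot worker -l info --concurrency=2 --max-tasks-per-child=100
```

If memory still climbs, comparing usage with `--concurrency=1` quickly shows whether the footprint is per-child (app import weight) or a leak inside the task itself.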

Celery

Submitted by 一个人想着一个人 on 2020-01-04 01:12:15
Celery

1. What is Celery

Celery is a "stalk of celery" that understands asynchronous tasks, scheduled tasks, and periodic tasks. It is a module implemented in Python for executing asynchronous, scheduled, and periodic tasks. Its structure consists of:

1. The user's task application (the app).
2. The pipe (broker), which stores tasks — Redis and RabbitMQ are the officially recommended brokers — plus the backend, which stores task execution results.
3. The worker (the "employee").

Asynchronous multi-task flow: app --- task --- broker --- worker --- backend --- task --- app
Scheduled task flow: task --- "run after this delay" --> broker --- "run after this delay" --> worker waits --- backend --- task
Periodic tasks: the same cycle, repeated on a schedule.

2. A simple Celery example

```python
from celery import Celery
import time

# Create a Celery instance; this is the user's application (app)
my_task = Celery("tasks", broker="redis://127.0.0.1:6379", backend="redis://127.0.0.1:6379")

# Register a task on the app: func1
@my_task.task
def func1(x, y):
    time.sleep(15)
    # the original snippet is truncated here; returning the sum is an assumed completion
    return x + y
```

Dynamically add/remove threads to the worker pool in celery

Submitted by 拥有回忆 on 2020-01-03 09:04:48
Question: How do I add more threads (and remove threads) to the current multiprocessing pool from within a task (i.e. celeryd was run with CELERYD_CONCURRENCY = 10, but I want to change it on the fly to CELERYD_CONCURRENCY = 15)? There is a function called celery.concurrency.processes.TaskPool.Pool.grow, but I have no idea how to call that from a running task, or whether it is the correct function to do that.

Answer 1: Read the source: https://github.com/ask/celery/blob/master/celery/concurrency/processes/_…
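For the record, Celery 3.0+ exposes this without touching pool internals: workers understand pool_grow/pool_shrink remote-control commands, broadcast over the broker. A hedged example of the CLI form (requires a running worker and broker; "proj" is a placeholder app name, and the numbers are arbitrary):

```shell
# Ask all workers to add 5 processes to their prefork pool...
celery -A proj control pool_grow 5
# ...and later remove 2 again.
celery -A proj control pool_shrink 2
```

The same commands are available programmatically as app.control.pool_grow(n) and app.control.pool_shrink(n), which can be called from within a task if the worker should resize itself.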

How to connect celery to rabbitMQ using SSL

Submitted by 末鹿安然 on 2020-01-03 05:14:06
Question: I'm trying to connect Celery to a RabbitMQ broker using SSL certificates. This is the code:

```python
from celery import Celery
import ssl

broker_uri = 'amqp://user:pwd@server:5672/vhost'
certs_conf = {
    "ca_certs": "/certs/serverca/cacert.pem",
    "certfile": "/certs/client/rabbit-cert.pem",
    "keyfile": "/certs/client/rabbit-key.pem",
    "cert_reqs": ssl.CERT_REQUIRED
}

app = Celery('tasks', broker=broker_uri)
app.conf.update(BROKER_USE_SSL=certs_conf)
app.send_task('task.name', [{'a': 1}])
```

When I try to…

How to make two tasks mutually exclusive in Celery?

Submitted by ♀尐吖头ヾ on 2020-01-03 05:07:12
Question: Is there a way to disallow two different tasks from running simultaneously in Celery? I was thinking about defining a new queue with concurrency level 1 and sending those tasks to that queue, but I couldn't find an example. Is that possible? Thanks!

Answer 1: Yes, if you don't need to worry about overall throughput, it is possible to create a separate queue and have a dedicated worker with concurrency set to 1. You can create as many queues as you want and configure which of those queues each worker…
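The routing half of that answer can be sketched as plain configuration: send both tasks to one dedicated queue, then run a single-process worker for that queue so the two tasks can never overlap. The queue and task names here ("exclusive", "tasks.task_a", "tasks.task_b") are placeholders:

```python
# Hypothetical routing fragment: both mutually-exclusive tasks share one queue.
# In Celery this would be assigned to app.conf.task_routes.
TASK_ROUTES = {
    "tasks.task_a": {"queue": "exclusive"},
    "tasks.task_b": {"queue": "exclusive"},
}

# A dedicated worker consumes only that queue with a single process, which
# serializes the two tasks:
#   celery -A proj worker -Q exclusive --concurrency=1
```

Other workers keep consuming the default queue at full concurrency, so overall throughput only drops for the two serialized tasks.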

Delay sending an email using Mandrill send_at or Celery countdown/eta

Submitted by 混江龙づ霸主 on 2020-01-02 08:11:12
Question: I commonly send transactional emails in response to certain actions on my website, some of which I delay sending by a couple of hours. The function that actually queues the email is a Celery task function called with .delay() that eventually makes an API call to Mandrill using djrill. I discovered that Mandrill offers a send_at parameter when sending an email, which will have Mandrill delay sending until the specified time. Celery also offers eta or countdown parameters when calling…
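Whichever side does the delaying, the arithmetic is the same: turn "a couple of hours from now" into an absolute timestamp. A stdlib sketch (compute_eta is an illustrative helper; the resulting datetime would be passed as eta= to Celery's apply_async, or formatted as Mandrill's send_at):

```python
from datetime import datetime, timedelta, timezone

def compute_eta(delay_hours, now=None):
    """Absolute UTC time 'delay_hours' from now. Celery's eta= and Mandrill's
    send_at both want an absolute moment like this rather than a duration."""
    now = now or datetime.now(timezone.utc)
    return now + timedelta(hours=delay_hours)

# e.g. send_email_task.apply_async(args=[user_id], eta=compute_eta(2))
# or the equivalent shorthand: send_email_task.apply_async(args=[user_id], countdown=2 * 3600)
```

One practical difference: with eta/countdown the message sits in the broker until the worker releases it, whereas send_at hands the waiting over to Mandrill and the Celery task finishes immediately.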