celery

Using multiprocessing pool from celery task raises exception

妖精的绣舞 submitted on 2019-12-03 12:32:13
FOR THOSE READING THIS: I have decided to use RQ instead, which doesn't fail when running code that uses the multiprocessing module. I suggest you use that. I am trying to use a multiprocessing pool from within a celery task using Python 3 and redis as the broker (running it on a Mac). However, I don't seem to be able to even create a multiprocessing Pool object from within the Celery task! Instead, I get a strange exception that I really don't know what to do with. Can anyone tell me how to accomplish this? The task:

    from celery import Celery
    from multiprocessing.pool import Pool

    app = Celery(
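
The "strange exception" here is typically multiprocessing's "daemonic processes are not allowed to have children": Celery's prefork pool runs tasks inside daemonized child processes, which may not spawn a Pool of their own. A minimal sketch of the setup being described, with the broker URL assumed (local Redis):

    from celery import Celery
    from multiprocessing.pool import Pool

    # Broker URL is a placeholder; the question runs Redis locally.
    app = Celery('tasks', broker='redis://localhost:6379/0')

    @app.task
    def parallel_work(items):
        # Under the default prefork worker this line is what usually fails:
        # the worker's child processes are daemonic and may not have children.
        with Pool(4) as pool:
            return pool.map(str, items)

If switching to RQ is not an option, common workarounds are to run the worker with a non-forking pool (celery worker --pool=solo or --pool=threads) or to use billiard's Pool in place of multiprocessing's.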

Assign different tasks to different celery workers

别来无恙 submitted on 2019-12-03 12:30:08
Question: I am running my server using this command: celery worker -Q q1,q2 -c 2, which means my server will handle all the tasks on queues q1 and q2, and I have 2 worker processes running. My server should support 2 different tasks:

    @celery.task(name='test1')
    def test1():
        print "test1"
        time.sleep(3)

    @celery.task(name='test2')
    def test2():
        print "test2"

If I send my test1 tasks to queue q1 and test2 to q2, both workers will run both tasks, so the result will be: test1 test2 test1 test2 ... Now what I need
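
What this needs is explicit routing plus one worker per queue. A hedged sketch, reusing the question's task and queue names (the broker URL is a placeholder; on older Celery versions the equivalent setting is CELERY_ROUTES):

    from celery import Celery

    celery = Celery('tasks', broker='redis://localhost:6379/0')

    # Send each task only to its own queue.
    celery.conf.task_routes = {
        'test1': {'queue': 'q1'},
        'test2': {'queue': 'q2'},
    }

Then start one worker per queue instead of a single worker consuming both, e.g. celery worker -Q q1 -c 1 and celery worker -Q q2 -c 1, so test1 only ever runs on the q1 worker and test2 on the q2 worker.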

The Celery Framework

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-03 12:15:51
I. Getting to know Celery. 1. What is celery: a distributed task queue and an asynchronous task scheduling tool. The celery framework ships with its own socket, so it runs on its own as an independent service. 2. Documentation — Celery official site: http://www.celeryproject.org/ ; official docs (English): http://docs.celeryproject.org/en/latest/index.html ; official docs (Chinese): http://docs.jinkan.org/docs/celery/ 3. Components: the celery framework is made up of three parts — the broker, which stores the tasks to be executed; the worker, the object that executes the tasks; and the backend, which stores the task results. 4. Use cases — Asynchronous tasks: hand time-consuming work to Celery to execute asynchronously, e.g. sending SMS/email, push notifications, audio/video processing, and so on. Scheduled tasks: run something on a schedule, e.g. daily statistics. II. Basic usage of Celery (asynchronous tasks). 1. Install the celery module: pip install celery 2. Create the folders and .py files:

    project               # usually named after the project, e.g. 大路飞
    ├── celery_task       # the celery package
    │   ├── __init__.py   # package file
    │   ├── celery.py     # celery connection and configuration; the file must be named celery.py
    │   └── tasks.py      #
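
A hedged sketch of what the two files in that layout typically contain (the Redis URLs and the add task are placeholders for illustration):

    # celery_task/celery.py
    from celery import Celery

    broker = 'redis://127.0.0.1:6379/1'    # where pending tasks are kept
    backend = 'redis://127.0.0.1:6379/2'   # where results are stored
    app = Celery(broker=broker, backend=backend, include=['celery_task.tasks'])

    # celery_task/tasks.py
    from .celery import app

    @app.task
    def add(x, y):
        return x + y

The worker is then started from the project root with celery worker -A celery_task -l info (on Celery 5+, celery -A celery_task worker -l info).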

The Celery Framework

主宰稳场 submitted on 2019-12-03 12:08:06
I. The Celery framework — a framework that runs independently. 1.1 The celery framework ships with its own socket as a low-level communication interface; it is effectively a project running as a 24/7, never-stopping service (a server side), rather like an endless loop, and it does not depend on the Django framework (where wsgiref provides the concurrency when we start a Django project). MySQL likewise ships with its own socket and effectively starts a process that accepts all the client task requests coming from the outside world. 1.2 Purpose: the celery framework is started in order to execute the tasks held by the service; the service carries an object that executes tasks, runs whatever tasks are ready, and saves the results of those tasks. 1.3 The three main components of the Celery framework: (1) the broker (middleware), which stores the tasks to be executed; (2) the worker, the object that executes the tasks; (3) the backend, which stores the execution results. 1.4 The installed celery module itself only provides the worker object by default; you combine it with other technologies (RabbitMQ, Redis) to provide the broker (the tasks) and the backend that stores the results (results we have to go and fetch ourselves). (Workflow diagram.) 1.5 Use cases: 1. hand large numbers of time-consuming tasks to celery to run asynchronously (socket); 2. crawlers — hand daily work to celery to run at a fixed time each day, at some interval, or after a delay (scheduled tasks). 3. Installing celery — run: pip install celery 4.
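
"Results we have to go and fetch ourselves" looks roughly like the sketch below: the caller hands the task to the broker with delay() and later pulls the result out of the backend by task id. The add task and module paths are assumptions carried over from the layout above.

    from celery.result import AsyncResult
    from celery_task.celery import app
    from celery_task.tasks import add

    async_result = add.delay(2, 3)   # push the task onto the broker; returns at once
    task_id = async_result.id

    # Later, from any process that can reach the backend:
    res = AsyncResult(id=task_id, app=app)
    if res.successful():
        print(res.get())             # 5
    elif res.failed():
        print('task failed')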

Celery and transaction.atomic

冷暖自知 submitted on 2019-12-03 11:48:40
Question: In some Django views, I used a pattern like this to save changes to a model and then do some asynchronous updating (such as generating images, or further altering the model) based on the new model data. mytask is a celery task:

    with transaction.atomic():
        mymodel.save()
        mytask.delay(mymodel.id).get()

The problem is that the task never returns. Looking at celery's logs, the task gets queued (I see "Received task" in the log), but it never completes. If I move the mytask.delay...get call out of
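
One likely culprit is that the worker receives the task before the surrounding transaction commits, while the view blocks on .get() inside the atomic block. A common remedy, sketched here on the assumption of Django 1.9+ (the model, task, and function names are placeholders), is to enqueue only after commit and not to wait on the result in the request:

    from django.db import transaction
    from myapp.models import MyModel   # placeholder names
    from myapp.tasks import mytask

    def save_and_process(data):
        with transaction.atomic():
            mymodel = MyModel.objects.create(**data)
            # Enqueue only once the transaction has committed, so the worker can
            # see the new row; don't call .get() -- let the task run in background.
            transaction.on_commit(lambda: mytask.delay(mymodel.id))
        return mymodel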

Running Django-Celery in Production

有些话、适合烂在心里 submitted on 2019-12-03 11:18:39
I've built a Django web application and some Django-Piston services. Using a web interface, a user submits some data, which is POSTed to a web service; that web service in turn uses Django-celery to start a background task. Everything works fine in the development environment using manage.py. Now I'm trying to move this to production on a proper apache server. The web application and web services work fine in production, but I'm having serious issues starting celeryd as a daemon. Based on these instructions: http://docs.celeryproject.org/en/latest/tutorials/daemonizing.html#running-the-worker
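
For reference, the linked daemonizing guide drives its generic init script from an /etc/default/celeryd file along these lines; every path, the app name, and the user below are placeholders for this project (the django-celery variant of the same guide points the script at manage.py instead of the celery binary):

    # /etc/default/celeryd  (sketch)
    CELERYD_NODES="worker1"
    CELERY_BIN="/path/to/project/venv/bin/celery"
    CELERY_APP="myproject"
    CELERYD_CHDIR="/path/to/project/"
    CELERYD_OPTS="--time-limit=300 --concurrency=4"
    CELERYD_LOG_FILE="/var/log/celery/%n.log"
    CELERYD_PID_FILE="/var/run/celery/%n.pid"
    CELERYD_USER="celery"
    CELERYD_GROUP="celery"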

Using celeryd as a daemon with multiple django apps?

泄露秘密 submitted on 2019-12-03 11:06:54
Question: I'm just starting to use django-celery and I'd like to set up celeryd to run as a daemon. The instructions, however, appear to suggest that it can be configured for only one site/project at a time. Can celeryd handle more than one project, or can it handle only one? And, if this is the case, is there a clean way to set up celeryd to be started automatically for each configuration, without requiring me to create a separate init script for each one? Answer 1: Like all interesting questions, the
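
A single celeryd/worker instance is bound to one app and one settings module, so the usual pattern is one worker per project, each with its own app, working directory, pidfile, and logfile. A hedged sketch using celery multi (every path and name below is a placeholder):

    /path/to/siteA/venv/bin/celery multi start workerA -A projectA \
        --workdir=/path/to/siteA --pidfile=/var/run/celery/%n-A.pid \
        --logfile=/var/log/celery/%n-A.log
    /path/to/siteB/venv/bin/celery multi start workerB -A projectB \
        --workdir=/path/to/siteB --pidfile=/var/run/celery/%n-B.pid \
        --logfile=/var/log/celery/%n-B.log

Both invocations can live in one init script or one supervisor config, so a separate init script per site isn't strictly necessary.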

supervisor - how to run multiple commands

我只是一个虾纸丫 submitted on 2019-12-03 11:06:22
Question: I'm managing a Celery worker that processes a queue via Supervisor. Here's my /etc/supervisor/celery.conf:

    [program:celery]
    command = /var/worker/venv/bin/celery worker -A a_report_tasks -Q a_report_process --loglevel=INFO
    directory=/var/worker
    user=nobody
    numprocs=1
    autostart=true
    autorestart=true
    startsecs=10
    stopwaitsecs = 60
    stdout_logfile=/var/log/celery/worker.log
    stderr_logfile=/var/log/celery/worker.log
    killasgroup=true
    priority=998

How do I add this second command to run? /var/worker
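
Supervisor runs exactly one command per [program:x] section, so the usual answer is a second section (optionally grouped with the first) rather than chaining commands. A sketch; the second worker's command line below is a placeholder, since the question's own second command is cut off:

    [program:celery_second]
    command = /var/worker/venv/bin/celery worker -A another_tasks -Q another_queue --loglevel=INFO
    directory=/var/worker
    user=nobody
    autostart=true
    autorestart=true
    stdout_logfile=/var/log/celery/worker2.log
    stderr_logfile=/var/log/celery/worker2.log
    killasgroup=true

    [group:celery_workers]
    programs=celery,celery_second

After adding the section, supervisorctl reread followed by supervisorctl update picks it up.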

How do I add authentication and endpoint to Django Celery Flower Monitoring?

白昼怎懂夜的黑 submitted on 2019-12-03 10:38:23
I've been using flower locally and it seems easy enough to set up and run, but I can't see how I would set it up in a production environment. In particular, how can I add authentication, and how would I define a url to access it? For a custom address, use the --address flag. For auth, use the --basic_auth flag. See below:

    # celery flower --help
    Usage: /usr/local/bin/celery [OPTIONS]

    Options:
      --address      run on the given address
      --auth         regexp of emails to grant access
      --basic_auth   colon separated user-password to enable basic auth
      --broker_api   inspect broker e.g. http://guest:guest@localhost:15672
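
Putting those flags together, a production invocation might look like the following; the credentials, port, and broker URL are placeholders, and --url_prefix (for serving flower under a sub-path behind a reverse proxy) is one way to give it a dedicated URL:

    celery flower --address=0.0.0.0 --port=5555 \
        --basic_auth=admin:s3cret \
        --url_prefix=flower \
        --broker=redis://localhost:6379/0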

Python-based asynchronous workflow modules: What is the difference between a celery workflow and a luigi workflow?

让人想犯罪 __ submitted on 2019-12-03 10:34:32
Question: I am using django as a web framework. I need a workflow engine that can run both synchronous and asynchronous (batch) chains of tasks. I found celery and luigi as batch-processing workflow engines. My first question is: what is the difference between these two modules? Luigi allows us to rerun a failed chain of tasks, and only the failed sub-tasks get re-executed. What about celery: if we rerun the chain (after fixing the failed sub-task's code), will it rerun the already-succeeded sub-tasks? Suppose I have two
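
On the Celery side, a chain is the closest analogue to a Luigi pipeline. A hedged two-task sketch (task bodies and broker/backend URLs are invented for illustration); note that re-running the chain re-executes every link, because Celery does not track already-completed sub-tasks the way Luigi's output targets do, so such checks have to be added by hand:

    from celery import Celery, chain

    app = Celery('wf',
                 broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/1')

    @app.task
    def extract(source):
        return [source, 'row1', 'row2']

    @app.task
    def load(rows):
        return len(rows)

    # extract runs first; its return value is passed as the first argument to load.
    workflow = chain(extract.s('db://placeholder'), load.s())
    result = workflow.apply_async()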