celery

OSError: dlopen(libSystem.dylib, 6): image not found

若如初见. Submitted on 2020-01-22 17:25:08

Question: Just updated my Mac to El Capitan 10.11. I am trying to run Django 1.6 with Celery 3.1 and I am now getting this error:

Unhandled exception in thread started by <function wrapper at 0x10f861050>
Traceback (most recent call last):
  File "/Library/Python/2.7/site-packages/django/utils/autoreload.py", line 93, in wrapper
    fn(*args, **kwargs)
  File "/Library/Python/2.7/site-packages/django/core/management/commands/runserver.py", line 101, in inner_run
    self.validate(display_num_errors=True)
  File "
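A quick way to confirm the likely cause (El Capitan's new System Integrity Protection strips DYLD_LIBRARY_PATH from spawned processes, which breaks dlopen for some Python setups) is to attempt the same load directly. A minimal diagnostic sketch, not from the original post:

import ctypes
import ctypes.util

try:
    # Attempt the same dlopen that fails in the traceback above.
    ctypes.CDLL('libSystem.dylib')
    print('libSystem.dylib loads fine')
except OSError as exc:
    print('dlopen failed:', exc)
    # Show where (or whether) the resolver can find the library at all.
    print('find_library says:', ctypes.util.find_library('System'))

If the load fails, the common remedies at the time were to run under a virtualenv or a Homebrew-installed Python instead of the system interpreter, so the process no longer depends on the stripped environment.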

Celery auto reload on ANY changes

六眼飞鱼酱① Submitted on 2020-01-22 13:21:10

Question: I was able to make celery reload itself automatically when there are changes to modules listed in CELERY_IMPORTS in settings.py. I tried listing only parent modules, hoping changes in their child modules would also be detected, but they were not. That makes me think celery does not watch modules recursively. I searched the documentation but found no answer to this problem. It is really bothering me to add everything related to the celery part of my project to CELERY_IMPORTS to detect
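One workaround sketch (not from the post): build CELERY_IMPORTS recursively at settings-load time, so child modules are included without naming each one by hand. The package name myproject.tasks is a hypothetical placeholder:

import importlib
import pkgutil

def collect_modules(package_name):
    """Return the package name plus every submodule under it, recursively."""
    package = importlib.import_module(package_name)
    names = [package_name]
    for _, name, _ in pkgutil.walk_packages(package.__path__, package_name + '.'):
        names.append(name)
    return names

# settings.py -- the worker's (experimental) --autoreload option can then
# watch every listed module, children included.
CELERY_IMPORTS = tuple(collect_modules('myproject.tasks'))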

Interact with celery ongoing task

ⅰ亾dé卋堺 Submitted on 2020-01-22 10:24:06

Question: We have a distributed architecture based on RabbitMQ and Celery. We can launch multiple tasks in parallel without any issue, and scalability is good. Now we need to control tasks remotely: PAUSE, RESUME, CANCEL. The only solution we have found is to have the Celery task make an RPC call to another task, which returns the command after a DB request. The Celery task and the RPC task are not on the same machine, and only the RPC task has access to the DB. Do you have any advice on how to improve this and easily
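For CANCEL specifically, Celery ships an abortable-task pattern that avoids the extra RPC hop. A hedged sketch follows; PAUSE/RESUME would still need custom signalling, AbortableTask requires a result backend that can store the abort flag (classically the database backend), and process() plus the app module path are placeholders:

from celery.contrib.abortable import AbortableTask
from proj.celery import app  # placeholder for your Celery app module

@app.task(bind=True, base=AbortableTask)
def long_job(self, items):
    for item in items:
        if self.is_aborted():   # polls the result backend for the abort flag
            return 'cancelled'
        process(item)           # hypothetical unit of work
    return 'done'

# From the controlling process:
#   result = long_job.delay(items)
#   result.abort()   # sets the flag; the worker sees it on the next loop pass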

Celery: list all tasks, scheduled, active *and* finished

匆匆过客 Submitted on 2020-01-22 05:40:26

Question: Update for the bounty: I'd like a solution that does not involve a monitoring thread, if possible. I know I can view scheduled and active tasks using the Inspect class of my app's Control:

i = myapp.control.inspect()
currently_running = i.active()
scheduled = i.scheduled()

But I could not find any function to show already finished tasks. I know that this information must be at least temporarily accessible, because I can look up a finished task by its task_id:

>>> r = mytask.AsyncResult(task
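Celery keeps no global registry of finished tasks; they exist only in the result backend, keyed by id. So one common approach, sketched here under the assumption that you record task ids yourself at dispatch time (myapp.celery is a placeholder module path), is to poll AsyncResult for each stored id:

from celery.result import AsyncResult
from myapp.celery import app  # placeholder app module

dispatched_ids = []  # append result.id wherever you call .delay()/.apply_async()

def finished_tasks():
    done = []
    for task_id in dispatched_ids:
        result = AsyncResult(task_id, app=app)
        if result.ready():  # True once the state is SUCCESS or FAILURE
            done.append((task_id, result.state))
    return done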

Celery with multiple django sites

三世轮回 Submitted on 2020-01-16 20:02:20

Question: I have one django backend for a few customers' sites:

my_proj
|- my_proj
   |- __init__.py
   |- settings.py
   |- settings_development.py
   |- settings_production_1.py
   |- settings_production_2.py
   |- settings_production_3.py
|- my_app_1
|- my_app_2
...

settings_production_1.py:

from settings import *

DEBUG = False
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'customer_1_db',
        'USER': 'some_user',
        'PASSWORD': 'some_passw',
        'HOST': '127.0.0.1',
        'PORT': '',
    }
}
MEDIA_ROOT =
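One hedged way to make a single broker serve several settings modules (an assumption on my part, not taken from the post) is to give each customer site its own default queue and run one worker per site:

# settings_production_1.py (Celery 3.x-style setting names)
BROKER_URL = 'amqp://localhost//'
CELERY_DEFAULT_QUEUE = 'customer_1'

# Start one worker per site, each bound to the matching settings module:
#   DJANGO_SETTINGS_MODULE=my_proj.settings_production_1 \
#       celery -A my_proj worker -Q customer_1

With distinct queues, tasks dispatched by customer 1's site can only be consumed by the worker running customer 1's settings and database.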

Django Celery Redis asynchronous task execution demo

六眼飞鱼酱① Submitted on 2020-01-16 09:50:00

1. Install Redis on Windows

For the installation steps, see 《在windows x64上部署使用Redis》 (Deploying and Using Redis on Windows x64).

2. Prepare the environment

requirements.txt:

Django==1.10.5
celery==3.1.23
redis==2.10.5

Note: Celery 4.x and later does not support Windows.

pip install -r requirements.txt

3. Create the Django project celery_proj and the app celery_demo

>>django-admin startproject celery_proj
>>cd celery_proj
>>django-admin startapp celery_demo

4. Add the celery configuration

1. In the celery_proj/celery_proj directory, add the following celery.py file:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery_proj'
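The celery.py listing is cut off above. For reference, here is the standard Celery 3.1 + Django version of that file as given in the Celery documentation, reconstructed here as a sketch:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'celery_proj.settings')

app = Celery('celery_proj')

# read Celery settings from the Django settings module
app.config_from_object('django.conf:settings')

# find tasks.py modules in every installed app
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)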

celery -- distributed task queue

一曲冷凌霜 Submitted on 2020-01-16 08:51:17

celery -- distributed task queue

1. Introduction

Celery is a distributed asynchronous message task queue written in Python. It is used to process large volumes of messages and provides the tools needed to operate and maintain such a system. It is a task queue focused on real-time processing that also supports task scheduling. If your business scenario calls for asynchronous tasks, celery is worth considering.

2. Example scenarios

1. You want to run a batch command on 100 machines, which may take a long time, but you don't want your program to block waiting for the result. Instead you get back a task ID immediately, and after a while you can use that task ID to fetch the result; while the task is still running you can keep doing other things.
2. You want a scheduled task, for example checking all your customers' records every day and sending a birthday SMS to any customer whose birthday is today.

3. Advantages

1. Simple: once you are familiar with celery's workflow, configuration and usage are fairly straightforward.
2. Highly available: when a task fails or the connection drops during execution, celery automatically retries the task.
3. Fast: a single celery process can handle millions of tasks per minute.
4. Flexible: almost every celery component can be extended and customized.

4. Getting started

Celery needs a way to send and receive messages. Usually this comes in the form of a separate service called a message broker. There are several options, including:

1: RabbitMQ (a message queue, a way for programs to communicate with each other). RabbitMQ is feature-complete, stable, durable, and easy to install. It is an excellent choice for production environments.
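To make the task-ID workflow described above concrete, here is a minimal first-app sketch; the broker URL and result backend are assumptions you would adapt to your setup:

from celery import Celery

app = Celery('tasks', broker='amqp://localhost//', backend='rpc://')

@app.task
def add(x, y):
    return x + y

# result = add.delay(4, 4)   # returns immediately with an AsyncResult
# result.id                  # the task ID you can store and check later
# result.get(timeout=10)     # 8, once a worker has executed the task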

Is there a way to receive a notification as soon as a certain task with a certain task id is successful or fails using Celery for Python?

点点圈 Submitted on 2020-01-15 20:00:55

Question: I want to know if there is a way to monitor whether a task completes or fails, as soon as it does, using python celery. I have an event I want to fire based on the results of a certain task.

Answer 1: You can run your task as a celery @shared_task with a try/except block inside:

from celery import shared_task

@shared_task
def my_task(input1, input2):  # ... plus whatever other arguments you need
    # setting up...
    try:
        # do stuff
        fire_success_event()   # your success event
    except Exception:
        # the above stuff failed
        fire_fail_event()      # your fail event
        return 1               # fail
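An alternative sketch (my addition, not part of the answer) uses Celery's task_success/task_failure signals, which fire per task without wrapping the task body; note that these handlers run inside the worker process, not the client. WATCHED_ID and the fire_* handlers are placeholders:

from celery.signals import task_success, task_failure

WATCHED_ID = 'your-task-id-here'  # placeholder

@task_success.connect
def on_success(sender=None, result=None, **kwargs):
    # sender is the task instance; its request carries the running task's id
    if sender.request.id == WATCHED_ID:
        fire_success_event()  # hypothetical handler

@task_failure.connect
def on_failure(sender=None, task_id=None, exception=None, **kwargs):
    if task_id == WATCHED_ID:
        fire_fail_event()  # hypothetical handler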