celery

Debugging djcelery's celeryd via pdb

ε祈祈猫儿з submitted on 2019-12-03 22:24:16
Has anybody tried debugging a celeryd worker using pdb? Whenever a breakpoint is encountered (either by running celeryd via pdb, or by pdb.set_trace()), I hit the following error: Error while handling action event.

    Traceback (most recent call last):
      File "/home/jeeyo/workspace3/uwcr/subscriptions/tasks.py", line 79, in process_action_event
        func(action_event)
      File "/home/jeeyo/workspace3/uwcr/subscriptions/tasks.py", line 36, in new_user_email
        send_registration_email(username, new_user.get_profile().plaintext_password)
      File "/home/jeeyo/workspace3/uwcr/looers/email.py", line 18, in send
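Celery ships a documented alternative to plain pdb for exactly this situation: the remote debugger in celery.contrib.rdb, which avoids fighting the worker's forked child processes for stdin. A minimal sketch (the task name is illustrative, not from the original post):

```python
# Sketch: debug inside a worker with Celery's remote pdb instead of
# pdb.set_trace(). When hit, rdb opens a TCP port (6900 by default)
# that you attach to with telnet; the worker log prints the port.
from celery.contrib import rdb
from celery.decorators import task

@task()
def process_action_event(action_event_id):
    rdb.set_trace()  # connect with: telnet localhost 6900
    ...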

celery + django - how to write task state to database

折月煮酒 submitted on 2019-12-03 22:15:12
I'm running Celery with Django and RabbitMQ and want to see the task states in the database table. Unfortunately no entries are written into the table djcelery_taskstate and I can't figure out why. My settings:

    CELERY_ENABLE_UTC = True
    BROKER_URL = "amqp://guest:guest@localhost:5672/"
    CELERY_RESULT_BACKEND = "database"
    CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
    CELERY_TRACK_STARTED = True
    CELERY_SEND_EVENTS = True
    CELERY_IMPORTS = ("project_management.tasks", "accounting.tasks", "time_tracking.tasks", )
    CELERY_ALWAYS_EAGER = False
    import djcelery
    djcelery.setup_loader()

My
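A detail worth noting here (documented djcelery behavior, offered as the likely cause rather than the thread's confirmed answer): djcelery_taskstate is populated by the snapshot camera, not by the workers themselves, so besides CELERY_SEND_EVENTS = True a camera process has to be running:

```shell
# The djcelery snapshot camera consumes the events the workers emit and
# writes them into djcelery_taskstate; without it the table stays empty
# even with CELERY_SEND_EVENTS = True. The frequency value is illustrative.
python manage.py celerycam --frequency=2.0
```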

Using celery in a Django project

ε祈祈猫儿з submitted on 2019-12-03 21:19:26
    # CeleryTest/celery.py
    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery

    # Set the default Django settings module for the 'celery' program.
    # 'CeleryTest' is the project name.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'CeleryTest.setting')

    app = Celery('CeleryTest')

    # Using a string here means the worker doesn't have to serialize
    # the configuration object to child processes.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Load task modules from all registered Django app configs.
    app.autodiscover_tasks()
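The autodiscover_tasks() call above scans every installed app for a tasks.py module; a minimal companion sketch (the app name demoapp is an assumption for illustration):

```python
# demoapp/tasks.py -- picked up automatically by app.autodiscover_tasks().
# shared_task keeps this module importable without the project's app instance.
from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
```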

A celery summary

女生的网名这么多〃 submitted on 2019-12-03 20:36:07
Celery configuration — basic configuration:

    class Config:
        """Configuration class"""
        CACHE_TYPE = 'redis'
        CACHE_REDIS_HOST = os.environ.get('CACHE_REDIS_HOST') or '127.0.0.1'
        CACHE_REDIS_PORT = os.environ.get('CACHE_REDIS_PORT') or 6379
        CACHE_REDIS_DB = os.environ.get('CACHE_REDIS_DB') or '1'
        CACHE_REDIS_PASSWORD = os.environ.get('CACHE_REDIS_PASSWORD') or 'greenvalley'
        SQLALCHEMY_TRACK_MODIFICATIONS = False
        # 'redis://auth:password@redishost:6379/0'
        CELERY_BROKER_URL = 'redis://auth:%s@%s:%s/10' % (CACHE_REDIS_PASSWORD, CACHE_REDIS_HOST, CACHE_REDIS_PORT, )
        CELERY_RESULT_BACKEND = 'redis://auth:%s@%s:%s/11' % (CACHE_REDIS_PASSWORD, CACHE
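The %-format for the broker and backend URLs above is written out twice; a small helper keeps the two consistent. This is a sketch, with defaults mirroring the Config class above:

```python
# Build the redis:// URL used for both CELERY_BROKER_URL (db 10) and
# CELERY_RESULT_BACKEND (db 11); defaults mirror the Config class.
def redis_url(db, host='127.0.0.1', port=6379, password='greenvalley'):
    return 'redis://auth:%s@%s:%s/%s' % (password, host, port, db)

# CELERY_BROKER_URL = redis_url(10)
# CELERY_RESULT_BACKEND = redis_url(11)
```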

Celery 3.1.9 Django integration, specifying settings file, without using djcelery

ε祈祈猫儿з submitted on 2019-12-03 20:16:31
Question: I began using celery 3.1.9 today with Django. This newer version has a tighter integration with Django that removes the need to use django-celery. I use multiple settings files, and I was wondering if there is an easy way to specify which settings file to use when initializing the celery worker? With djcelery it is quite simple, since it uses the manage.py commands. I naively tried to check whether settings.DEBUG was true in the celery.py file, but of course that failed because the settings were
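One approach (a sketch, not the asker's confirmed answer): let celery.py defer to an already-exported DJANGO_SETTINGS_MODULE and only fall back to a default, so the environment selects the settings file. The module path proj.settings.dev is an assumption:

```python
# Resolve the Django settings module the worker should boot with:
# an exported DJANGO_SETTINGS_MODULE wins; otherwise use the default.
import os

def resolve_settings(default='proj.settings.dev'):
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', default)
    return os.environ['DJANGO_SETTINGS_MODULE']

# Starting the worker then picks settings per environment:
#   DJANGO_SETTINGS_MODULE=proj.settings.prod celery -A proj worker
```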

django & celery - notes on concurrency capacity and memory usage

前提是你 submitted on 2019-12-03 20:12:46
Background: As is well known, celery is the Python world's trusty helper for distributed tasks; it gives us powerful tools for handling asynchronous requests, distributed tasks, periodic tasks, and other complex scenarios. What we discuss today, however, is how to use celery better, focusing mainly on memory usage.

django & celery & django-celery: my project combines celery with django; the versions are python == 2.7, celery == 3.1.25, Django == 1.11.7, django-celery == 3.2.2.

celery and concurrency: the project uses celery beat to trigger scheduled tasks and, per business requirements, two celery workers to handle asynchronous requests. In the development environment the OS has 4 processors and 8 GB of memory. With the defaults, after starting celery beat and the two workers, you can see that celery spawns one worker process per processor (4 of them). celery lets us control worker concurrency through the 'CELERYD_CONCURRENCY' setting. When the configuration of the tasksWorker worker is changed to: CELERYD
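The settings the post is heading toward can be sketched like this for celery 3.x; the numbers are illustrative, not the post's actual values:

```python
# settings.py sketch for celery 3.x: pin the prefork pool size instead of
# the CPU-count default, and recycle children to bound memory growth.
CELERYD_CONCURRENCY = 2            # worker processes per celeryd (default: number of CPUs)
CELERYD_MAX_TASKS_PER_CHILD = 100  # respawn a child after 100 tasks, releasing leaked memory
```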

Django - Executing a task through celery from a model

梦想的初衷 submitted on 2019-12-03 17:50:09
Question: In my models.py:

    from django.db import models
    from core import tasks

    class Image(models.Model):
        image = models.ImageField(upload_to='images/orig')
        thumbnail = models.ImageField(upload_to='images/thumbnails', editable=False)

        def save(self, *args, **kwargs):
            super(Image, self).save(*args, **kwargs)
            tasks.create_thumbnail.delay(self.id)

In my tasks.py:

    from celery.decorators import task
    from core.models import Image

    @task()
    def create_thumbnail(image_id):
        ImageObj = Image.objects.get(id=image_id
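Note that the excerpt's tasks.py imports core.models while models.py imports core.tasks, a classic circular import. A common fix (a sketch, not the thread's confirmed answer) defers the model import into the task body:

```python
# core/tasks.py sketch: import the model lazily so that models.py can
# import this module at load time without creating an import cycle.
from celery.decorators import task

@task()
def create_thumbnail(image_id):
    from core.models import Image  # deferred import breaks the cycle
    image = Image.objects.get(id=image_id)
    # ... generate the thumbnail and save image.thumbnail here ...
```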

How to execute tasks in Celery using datetime from MySQL?

给你一囗甜甜゛ submitted on 2019-12-03 17:31:26
News items are stored in a MySQL database with a datetime of publication. At any time a user can add a new item to the table with a delayed publication date. How can Celery be used to listen to the database table and check whether it is time to publish data? Another process (a Python script) is responsible for publishing the data, so Celery should call this script for each row in the MySQL table according to its datetime. How do I do that with Celery? Another way, I think, is to create a queue with the date and id of each publication, then add data from the user form directly into this queue. There should therefore be a process (Celery) that observes the queue for further
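A common pattern for this kind of problem (a sketch under assumptions: each row carries an id and a publish_at datetime, and publish_news is a hypothetical task wrapping the publishing script) is a periodic celery beat task that picks up rows due within the next polling window and schedules each one with an exact eta:

```python
# Select rows whose publish datetime falls inside the next polling window;
# a celery beat task would then schedule each one at its exact time with:
#   publish_news.apply_async(args=[row['id']], eta=row['publish_at'])
from datetime import datetime, timedelta

def rows_due_within(rows, horizon_minutes=5, now=None):
    now = now or datetime.utcnow()
    horizon = now + timedelta(minutes=horizon_minutes)
    return [row for row in rows if now <= row['publish_at'] <= horizon]
```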

celery usage in detail

て烟熏妆下的殇ゞ submitted on 2019-12-03 17:29:29
What is celery? An instance of the asynchronous producer/consumer design pattern, written in Python. An example: suppose there are two processes, producer process A and consumer process B. The logic runs: A produces chestnuts, B wants to eat chestnuts, so B necessarily depends on A — tight coupling, and the wait is expensive:

    B ---> (sends a request to A) ---> (waits, perhaps a long time, for A to produce a chestnut) ---> (A responds with the chestnut) ---> (B gets the chestnut)

B may be a very busy backend integrating many services and doesn't want to sit there waiting. celery's job, then, is to do the waiting on B's behalf:

    B (fetch my chestnut for me) ---> C (sends the request to A) ---> (waits for A to produce the chestnut) ---> (A responds to C) ---> (B gets the chestnut)

(C can also store the chestnut somewhere so that B simply picks it up.) So we know celery's essence: a middleman, like a courier or an errand runner. Its value: (1) it avoids blocking threads, improving performance; (2) low coupling, high cohesion, high reuse. The payoff: it can execute millions of tasks per minute. Its characteristics: it uses a message queue (broker) to coordinate between clients and consumers; the core is the message queue plus that coordination, and either end can have multiple participants.

Now for celery's structure. Celery's architecture consists of three parts: the message middleware (message broker), the task execution unit (worker
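The three-part architecture (broker, worker, result backend) can be sketched in a few lines of code; the broker/backend URLs and the task itself are assumptions for illustration:

```python
# Minimal celery app matching the description above: the broker queues the
# request, a worker produces the result, and the result backend is where
# B later "picks up the chestnut".
from celery import Celery

app = Celery('demo',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def roast_chestnut(n):
    return n * 2

# Caller B never blocks on A directly:
#   result = roast_chestnut.delay(21)  # hand the job to the middleman
#   result.get(timeout=10)             # collect the result later
```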

Celery in daemon mode

為{幸葍}努か submitted on 2019-12-03 16:54:07
I use GNU screen for running Celery in console mode, but that's a hack I don't want to use on a production server. I want to know how to daemonize Celery. I have a virtualenv with celery set up, and I want to run %venv%/bin/celeryd in daemon mode. I tried ./celeryd start and got: Unrecognized command line arguments: start. What else should I try to run it in daemon mode? Try this /etc/init.d/celeryd script:

    #!/bin/sh -e
    ### BEGIN INIT INFO
    # Provides:          celeryd
    # Required-Start:    $network $local_fs $remote_fs
    # Required-Stop:     $network $local_fs $remote_fs
    # Default-Start:     2 3 4 5
    # Default-Stop:      0 1 6
    #
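Besides an init script, celery itself ships a detaching launcher — celery multi in 3.x (celeryd-multi in older releases). A sketch with illustrative paths:

```shell
# Start one detached worker named w1; celery multi daemonizes it and
# manages the pid/log files (%n expands to the node name).
%venv%/bin/celery multi start w1 \
    --pidfile=/var/run/celery/%n.pid \
    --logfile=/var/log/celery/%n.log

# Stop it again, waiting for running tasks to finish:
%venv%/bin/celery multi stopwait w1 --pidfile=/var/run/celery/%n.pid
```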