celery

Periodic tasks in Django/Celery - How to notify the user on screen?

。_饼干妹妹 submitted on 2020-08-22 05:15:28
Question: I have now successfully set up django-celery to check my existing tasks and remind the user by email when a task is due: @periodic_task(run_every=datetime.timedelta(minutes=1)) def check_for_tasks(): tasks = mdls.Task.objects.all() now = datetime.datetime.utcnow().replace(tzinfo=utc, second=0, microsecond=0) for task in tasks: if task.reminder_date_time == now: sendmail(...) So far so good, but what if I also wanted to display a popup to the user as a reminder? Twitter bootstrap
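The core of the excerpt's approach is truncating both timestamps to the minute so a once-per-minute periodic task can use an exact equality check. A minimal, self-contained sketch of that comparison (the helper name `due_now` is mine, and timezone handling from the original is omitted for brevity):

```python
import datetime

def due_now(reminder_dt, now=None):
    """Return True when a reminder timestamp falls in the current UTC minute.

    Mirrors the excerpt's trick: zero out seconds and microseconds on both
    sides so a task that runs every minute can match with plain equality.
    (The original also attaches tzinfo=utc; aware datetimes work the same way.)
    """
    now = now or datetime.datetime.utcnow()
    now = now.replace(second=0, microsecond=0)
    return reminder_dt.replace(second=0, microsecond=0) == now
```

As for the on-screen popup: email can be sent straight from the worker, but a browser popup needs a push channel from server to client, e.g. WebSockets (django-channels) or periodic AJAX polling of a "pending reminders" endpoint; the Celery task would only mark the reminder as ready.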

How to route a chain of tasks to a specific queue in celery?

最后都变了- submitted on 2020-08-22 04:34:27
Question: When I route a task to a particular queue it works: task.apply_async(queue='beetroot') But if I create a chain: chain = task | task and then write chain.apply_async(queue='beetroot'), it seems to ignore the queue keyword and assigns everything to the default 'celery' queue. It would be nice if celery supported routing in chains - all tasks executed sequentially in the same queue. Answer 1: I do it like this: subtask = task.s(*myargs, **mykwargs).set(queue=myqueue) mychain = celery.chain(subtask, subtask2, .

Django/Celery multiple queues on localhost - routing not working

大兔子大兔子 submitted on 2020-08-21 10:28:19
Question: I followed the celery docs to define 2 queues on my dev machine. My celery settings: CELERY_ALWAYS_EAGER = True CELERY_TASK_RESULT_EXPIRES = 60 # 1 min CELERYD_CONCURRENCY = 2 CELERYD_MAX_TASKS_PER_CHILD = 4 CELERYD_PREFETCH_MULTIPLIER = 1 CELERY_CREATE_MISSING_QUEUES = True CELERY_QUEUES = ( Queue('default', Exchange('default'), routing_key='default'), Queue('feeds', Exchange('feeds'), routing_key='arena.social.tasks.#'), ) CELERY_ROUTES = { 'arena.social.tasks.Update': { 'queue': 'fs_feeds', }

celery shutdown worker after particular task

邮差的信 submitted on 2020-08-21 07:52:52
Question: I'm using celery (solo pool with concurrency=1) and I want to be able to shut down the worker after a particular task has run. The caveat is that I want to rule out any possibility of the worker picking up further tasks after that one. Here's my attempt, in outline: from __future__ import absolute_import, unicode_literals from celery import Celery from celery.exceptions import WorkerShutdown from celery.signals import task_postrun app = Celery() app.config_from_object('celeryconfig') @app

A maddening marathon debugging session: when multiple people work with celery, environment isolation is a must!!!

天涯浪子 submitted on 2020-08-15 07:16:54
Symptoms: the project tested perfectly on the local machine, but after deploying to production, the celery periodic tasks misbehaved in all sorts of bizarre ways. Code inside celery seemed to execute out of order, and the log lines it printed never showed up in the log file. Two of us debugged until the middle of the night with no clue; it felt haunted. Then it suddenly dawned on us: the local celery worker and beat were still running, and because we had to migrate data to production, the local machine was also temporarily connected to the production database. We shut them down immediately and everything went back to normal. Lessons: one environment (database, broker) must run only one set of celery processes (worker, beat) at a time. When multiple people develop celery tasks, each person's environment must be isolated. Individual machines should stay off production entirely; if connecting to production is unavoidable, put a strict procedure in place, ideally with someone supervising, or use a dedicated person and machine for production access. Source: oschina Link: https://my.oschina.net/u/2396236/blog/4297495
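One way to enforce the isolation lesson above is to derive the broker location from the environment, so two developers' machines can never consume from the same queues even by accident. This scheme (one RabbitMQ vhost per person and environment, names all illustrative) is my suggestion, not from the original post:

```python
import os

def isolated_broker_url(base='amqp://guest@localhost', env=None, user=None):
    """Build a per-developer, per-environment broker URL.

    Each (environment, developer) pair gets its own vhost, so a worker or
    beat left running on a laptop can only ever drain its own queues,
    never production's.
    """
    env = env or os.environ.get('APP_ENV', 'dev')
    user = user or os.environ.get('USER', 'local')
    return f'{base}/{env}-{user}'
```

Usage would be something like `app = Celery('myproject', broker=isolated_broker_url())`, with APP_ENV set to 'prod' only on the production host (and the corresponding vhosts created up front with `rabbitmqctl add_vhost`).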

Notes on resolving django-celery-beat not dispatching tasks after deployment

筅森魡賤 submitted on 2020-08-13 08:27:32
django-celery-beat was deployed to the server and started with celery beat -A name -l info, but the periodic tasks configured in the database never ran at all. The beat log showed that after startup it dispatched nothing. The test environment was fine; only production had the problem. Investigation revealed that production's django_celery_beat_periodictask table had been copied directly from the test environment, so its last_run_at column still held the last run times recorded on the test machine. Reading the django-celery-beat source: on each tick, beat takes last_run_at from the table, adds the configured interval, and compares the result against the server's current time; the task is dispatched only once that moment has passed. The test VM's clock and the production server's clock differed, leaving the copied last_run_at values ahead of production's clock, so production kept concluding that the next dispatch time had not yet arrived and never dispatched anything. Correcting the timestamps in the table restored normal behavior. Source: oschina Link: https://my.oschina.net/u/2396236/blog/4297473
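The scheduler check described above reduces to a one-line comparison, which makes the failure mode easy to see. A simplified sketch (this is the gist of the logic, not django-celery-beat's actual code):

```python
from datetime import datetime, timedelta

def is_due(last_run_at, interval, now):
    """Simplified version of the beat scheduler's check: a periodic task
    fires only once last_run_at + interval has passed. If last_run_at was
    copied from a machine whose clock ran ahead, it sits in the future
    relative to the production clock and the task never becomes due."""
    return last_run_at + interval <= now

# Timestamp copied from a test box whose clock was ahead of production:
copied = datetime(2020, 8, 13, 9, 0)     # "last run" recorded in the future
prod_now = datetime(2020, 8, 13, 8, 30)  # production's current time
```

One common remedy (an assumption to verify against your django-celery-beat version, not from the original post) is to clear the stale values instead of hand-editing them, e.g. `PeriodicTask.objects.update(last_run_at=None)`, so beat recomputes the schedule from scratch.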