celery

How to call a Celery shared_task?

Posted by 冷眼眸甩不掉的悲伤 on 2020-02-08 10:09:31
Question: I'm trying to use stream_framework in my application (NOT Django) but I'm having a problem calling the stream_framework shared tasks. Celery seems to find the tasks:

     -------------- celery@M3800 v3.1.25 (Cipater)
    ---- **** -----
    --- * ***  * -- Linux-4.15.0-34-generic-x86_64-with-Ubuntu-18.04-bionic
    -- * - **** ---
    - ** ---------- [config]
    - ** ---------- .> app:         task:0x7f8d22176dd8
    - ** ---------- .> transport:   redis://localhost:6379/0
    - ** ---------- .> results:     redis://localhost:6379/0
    - *

Async tasks with Python 3 + Celery + Redis

Posted by 别说谁变了你拦得住时间么 on 2020-02-08 00:26:23
1. How it works

Celery is a distributed task queue framework written in Python. It uses task queues to schedule work onto distributed machines, processes, and threads. Although it is a Python library, the protocol it implements can also be used from Ruby, PHP, JavaScript, and other languages. Besides background execution through a message queue, asynchronous work can also take the form of scheduled (cron-style) tasks. Celery decouples task execution completely from the main program; tasks can even be dispatched to other hosts. We typically use it for asynchronous tasks (async tasks) and periodic tasks (crontab). Its architecture consists of the following components (diagram in the original post):

1. Tasks: user-defined functions that implement the actual work, for example a long-running job.
2. Broker: where tasks are held. It has to cope with potentially very large numbers of tasks while guaranteeing that workers can fetch them.
3. Worker: executes the tasks, i.e. actually calls the functions we defined as tasks.
4. Backend: stores the results returned by tasks so that users can inspect or retrieve them.

2. Implementation steps

1. Install the environment (RabbitMQ/Redis, Celery, django-celery, flower).
2. Create the project (the required files are marked in a screenshot in the original post). Files to modify under web_order: celery.py, __init__.py

Running Celery in the background with supervisor

Posted by ◇◆丶佛笑我妖孽 on 2020-02-06 16:12:07
1. Install supervisor

Install command:
$ pip install supervisor
If that fails in a sandboxed environment, use:
$ apt-get install supervisor

2. Install celery

$ pip install celery

3. Configure supervisor

1. Generate the default config file:
$ echo_supervisord_conf > /etc/supervisord.conf
You can also choose where the generated config file lives, provided you first create the target directory under /etc. Say we create a folder named supervisor and put the default config file inside it:
$ echo_supervisord_conf > /etc/supervisor/supervisord.conf
2. Edit the config file:
$ vim /etc/supervisor/supervisord.conf
At the end, enable the include section and point it at the conf.d-style directory:
[include]
;files = /etc/supervisor/conf.d/*.conf
files = /etc/supervisor/supervisord.conf.d/*.conf
Then go into the supervisord.conf.d folder and create celeryd
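The per-program file created under supervisord.conf.d might look like the following minimal sketch; the program name, project directory, Celery app name, and log paths are all assumptions:

```ini
; /etc/supervisor/supervisord.conf.d/celeryd.conf -- hypothetical example
[program:celeryd]
; command and directory must match your own project layout
command=celery -A proj worker --loglevel=INFO
directory=/path/to/project
autostart=true
autorestart=true
; create the log directory beforehand
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.err
```

After adding it, reload supervisor (`supervisorctl reread && supervisorctl update`) so the new program is picked up.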

Django Celery Periodic Task at specific time

Posted by 爷，独闯天下 on 2020-02-06 10:20:19
Question: I am using celery==4.1.1 in my project. In my settings.py, I have the following:

    from celery.schedules import crontab

    CELERY_BROKER_URL = "redis://127.0.0.1:6379/1"
    CELERY_TIMEZONE = 'Asia/Kolkata'
    CELERY_ACCEPT_CONTENT = ['application/json']
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/1"
    CELERY_BEAT_SCHEDULE = {
        'task-number-one': {
            'task': 'mathematica.core.tasks.another_test',
            'schedule': crontab(minute=45, hour=00)
        },

Is there any way to change the Celery config programmatically, after app init?

Posted by 好久不见. on 2020-02-04 06:23:42
Question: I have set up a testing environment where I have Celery workers actually running in other processes, so that the full functionality of my system with Celery can be tested. This way, tasks actually run in a worker process and communicate back to the test runner, so I don't need CELERY_ALWAYS_EAGER to test this functionality. That being said, in some situations I have tasks which trigger other tasks without caring when they finish, and I'd like to create tests which do - that is, to

How do I override the backend for celery tasks

Posted by ☆樱花仙子☆ on 2020-02-03 17:46:07
Question: We're using Redis as our result backend. However, for one task we'd like to override this to use RabbitMQ instead. The documentation for Task.backend says:

    The result store backend to use for this task. Defaults to the CELERY_RESULT_BACKEND setting.

So I had assumed that we could set Task.backend to a string of the same format accepted by CELERY_RESULT_BACKEND. So I try this:

celeryconfig.py:
    CELERY_RESULT_BACKEND = "redis://redis-host:7777"

tasks.py:
    @app.task(backend='amqp://guest@localhost

Celery-to-Redis connection error

Posted by 梦想与她 on 2020-02-02 04:54:53
1. Error:
django.core.exceptions.ImproperlyConfigured: mysqlclient 1.3.13 or newer is required; you have 0.9.3

2. Fix:
1. Comment out the following two lines in "D:\project\Django\venv\lib\site-packages\django\db\backends\mysql\base.py":
    if version < (1, 3, 13):
        raise ImproperlyConfigured('mysqlclient 1.3.13 or newer is required; you have %s.' % Database.__version__)
2. In "D:\project\Django\venv\lib\site-packages\django\db\backends\mysql\operations.py", change decode to encode.

Source: CSDN, by 明宇李: https://blog.csdn.net/mingyuli/article/details/104132930

Celery beat: the "Pidfile already exists" problem

Posted by 拜拜、爱过 on 2020-02-01 11:51:34
Background

While testing a Celery periodic task, the task did not run at the scheduled time. Checking the log showed the following error when celery beat was started, so celery beat never started successfully:

    (hzinfo) E:\PythonWorkSpace\hzinfo>celery beat v3.1.0 (Cipater) is starting.
    ERROR: Pidfile (celerybeat.pid) already exists.
    Seems we're already running? (pid: 22220)
     * Restarting with stat

Investigation

When celery beat runs, it automatically creates two files:
- pidfile: defaults to celerybeat.pid, stored in the project root.
- scheduler: defaults to celerybeat-schedule, stored in the project root.
The error means the pidfile already exists. The official docs say:

    --pidfile File used to store the process pid. Defaults to celerybeat.pid. The program won't start if this file already exists and the pid is still alive.

The previous run had already automatically created
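When the pid recorded in the file is no longer alive, removing the stale pidfile (or relocating it with --pidfile) lets beat start again. A sketch; the project name and paths are assumptions:

```shell
# Simulate the stale pidfile a crashed run can leave behind:
touch celerybeat.pid

# If no live process owns the recorded pid, the file is safe to remove:
rm -f celerybeat.pid

# Then restart beat, optionally pinning the pidfile to an explicit path
# (requires a Celery project, so shown as a comment only):
# celery beat -A proj --pidfile=/tmp/celerybeat.pid
```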

Installing and starting RabbitMQ on macOS

Posted by 浪尽此生 on 2020-01-31 13:41:55
Introduction

RabbitMQ is open-source message broker software (also called message-oriented middleware) that implements the Advanced Message Queuing Protocol (AMQP). The RabbitMQ server is written in Erlang, with clustering and failover built on the Open Telecom Platform framework. Client libraries for talking to the broker exist for all major programming languages. RabbitMQ is open-source (MPL) message queue software, an open-source implementation of AMQP originally provided by LShift, written in Erlang, a language known for high performance, robustness, and scalability.

Main part

When Django needs asynchronous operations, Celery is a commonly used library, and in real projects Celery in turn often depends on RabbitMQ, so installing and using RabbitMQ is the key to everything. On macOS it is usually installed with brew; the steps are:
1. Update brew: brew update
2. Install rabbitmq: brew install rabbitmq
The rabbitmq scripts are installed under sbin in /usr/local/Cellar/rabbitmq, which is also reachable as /usr/local/opt/rabbitmq/sbin.
Next, add to your environment variables: export PATH=$PATH:/usr/local/opt/rabbitmq/sbin and you're done:
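The PATH step above can be scripted as follows; the sbin path comes from the post, while the start-up commands are the usual Homebrew ones and assume a completed Homebrew install:

```shell
# Make the RabbitMQ scripts reachable (path from the post above):
export PATH="$PATH:/usr/local/opt/rabbitmq/sbin"

# Typical next steps on macOS (need Homebrew + RabbitMQ installed,
# so shown as comments only):
# rabbitmq-server                # run the broker in the foreground
# brew services start rabbitmq   # or run it as a background service
```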