celery

Django Celery: Celery task does not create record in DB

安稳与你 submitted on 2019-12-06 15:27:05
I want to create database records from a Celery task, but for some reason the object.save() method does not work with task.apply_async() (apply tasks asynchronously). The same record (Ticker) is saved to the database when the task runs locally via get_all_tickers.apply(), but it is not saved in asynchronous mode via get_all_tickers.apply_async(). In both cases the INSERT statement is visible in the server log.

models.py:

class Ticker(TimeStampedModel):
    ask = models.DecimalField(max_digits=18, decimal_places=8)
    bid = models.DecimalField(max_digits=18, decimal_places=8)
    pair = models.ForeignKey(Pair)
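For reference, a minimal sketch of what such a task might look like; the task body below is an assumption, since the question does not show it:

# tasks.py -- illustrative sketch only; the real get_all_tickers body is not shown
from decimal import Decimal
from celery import shared_task
from .models import Pair, Ticker

@shared_task
def get_all_tickers():
    for pair in Pair.objects.all():
        # placeholder quote values; a real task would fetch live prices
        Ticker.objects.create(ask=Decimal('0.10'), bid=Decimal('0.09'), pair=pair)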

147: CMS system: implementing asynchronous email and SMS sending with Celery

我与影子孤独终老i submitted on 2019-12-06 15:16:58
How Celery works. Official Celery documentation: https://docs.celeryproject.org/en/latest/ Install with: pip install celery. On Windows you also need eventlet for task scheduling: pip install eventlet. Start Redis; the Redis password has been set to 123456.

A simple example:

import time
from celery import Celery

celery = Celery('tasks',
                broker='redis://:123456@192.168.223.128:6379/0',
                backend='redis://:123456@192.168.223.128:6379/0')

@celery.task
def send_email():
    print('Email sending started...')
    time.sleep(2)
    print('Email sending finished...')

Run the worker monitoring command: celery -A tasks.celery --pool=eventlet worker --loglevel=info, then trigger the send-email operation. To implement asynchronous sending of email and SMS verification codes, Flask's recommended way of using Celery is described at https://flask.palletsprojects.com/en/1.0.x/patterns/celery/ with the settings placed in config
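A minimal sketch of how the task is then enqueued from application code (assuming the tasks module above is importable):

from tasks import send_email

result = send_email.delay()    # push the task onto the Redis broker
result.get(timeout=10)         # optional: block until the worker finishes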

Celery asynchronous task queue

浪子不回头ぞ submitted on 2019-12-06 15:07:17
Celery is a full-featured, plug-and-play asynchronous task queue system. It is well suited to asynchronous processing: time-consuming operations such as sending email, uploading files, or processing images can be executed asynchronously, so users do not have to wait long, improving the user experience. Documentation: http://docs.jinkan.org/docs/celery/getting-started/index.html

Celery's strengths:
Simple: easy to use and maintain, with rich documentation.
Efficient: a single celery process can handle millions of tasks per minute.
Flexible: almost every part of celery can be customized and extended.

A task queue is a mechanism that works across threads and across machines. A task queue contains units of work called tasks. Dedicated worker processes continuously monitor the queue, pick up new tasks, and process them. Celery communicates via messages, usually through a broker (intermediary) that mediates between clients (which issue tasks) and workers (which process them): clients put messages onto the queue, and the broker dispatches them to workers.

Celery's architecture consists of three parts: the message broker, the task execution unit (worker), and the task result store (as sketched below). A celery system can contain many workers and brokers. Celery itself does not provide a message queue
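A minimal sketch of those three parts in code (the Redis URLs are placeholder assumptions):

# app.py -- the broker carries task messages, the backend stores results
from celery import Celery

app = Celery('demo',
             broker='redis://localhost:6379/0',   # message broker
             backend='redis://localhost:6379/1')  # task result store

@app.task
def add(x, y):
    return x + y

# start a worker (the task execution unit):
#   celery -A app worker --loglevel=info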

Celery: Why Does a Task Stay in the Queue?

那年仲夏 submitted on 2019-12-06 14:00:51
Question: So I am using Celery with RabbitMQ. I have a RESTful API that registers a user. I am using a remote Celery worker to send the registration email asynchronously so my API can return a fast response.

from .tasks import send_registration_email

def register_user(user_data):
    # save user to the database etc
    send_registration_email.delay(user.id)
    return {'status': 'success'}

This works fine. The email is sent in a non-blocking, asynchronous way (and can be retried if it fails, which is cool). The problem is
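For context, a hedged sketch of what a retrying send_registration_email might look like, assuming a Django setup; the retry settings and mail call are assumptions, not code from the question:

# tasks.py -- illustrative only
from smtplib import SMTPException
from celery import shared_task
from django.contrib.auth.models import User
from django.core.mail import send_mail

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def send_registration_email(self, user_id):
    user = User.objects.get(pk=user_id)
    try:
        send_mail('Welcome', 'Thanks for registering.',
                  'noreply@example.com', [user.email])
    except SMTPException as exc:
        raise self.retry(exc=exc)  # requeue the task and try again later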

Celery

倾然丶 夕夏残阳落幕 submitted on 2019-12-06 13:46:53
Official Celery resources: website: http://www.celeryproject.org/ Documentation (English): http://docs.celeryproject.org/en/latest/index.html Documentation (Chinese): http://docs.jinkan.org/docs/celery/

Celery architecture: Celery's architecture consists of three parts: the message broker, the task execution unit (worker), and the task result store.

Message broker: Celery itself does not provide a messaging service, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store saves the results of tasks executed by workers; Celery supports storing task results in several ways, including AMQP, Redis, and others.

Use cases:
Asynchronous tasks: hand time-consuming operations to Celery for asynchronous execution, e.g. sending SMS/email, push notifications, audio/video processing.
Scheduled tasks: run something on a schedule, e.g. daily statistics.

Basic usage (see the sketch below): celery.py # 1) create the app + tasks # 2) start the celery (app) service: # non-Windows #
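A minimal sketch of the two steps the entry breaks off at (the file and task names and the Redis URLs are illustrative assumptions):

# celery_app.py -- step 1: create the app + a task
from celery import Celery

app = Celery('proj',
             broker='redis://127.0.0.1:6379/1',
             backend='redis://127.0.0.1:6379/2')

@app.task
def send_sms(phone):
    print('sending SMS to', phone)

# step 2: start the celery service (non-Windows):
#   celery worker -A celery_app -l info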

Django - How to use an asynchronous task queue with Celery and Redis

房东的猫 submitted on 2019-12-06 13:27:28
Question:

# In my views.py file
pi1 = None
pis1 = None

def my_func():
    # Essentially this function sets a random integer to pi1 and pis1
    global pi1, pis1
    pi1 = randint(0, 9)
    pis1 = randint(0, 9)
    return

def index(request):
    my_func()
    context = {
        "pi1": pi1,
        "pis1": pis1,
    }
    return render(request, "index.html", context)

# In the index.html file
<h1>{{ pi1 }}</h1>
<h1>{{ pis1 }}</h1>

I've removed a lot of my code for simplicity, but this is the gist of it. Despite the code that I've posted for my_func, it is a
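A hedged sketch of the direction the title points in: compute the values in a Celery task rather than in module-level globals (the task name and return shape are assumptions):

# tasks.py -- illustrative only
from random import randint
from celery import shared_task

@shared_task
def compute_values():
    # return the values instead of mutating globals; the caller can
    # fetch them from the result backend (e.g. Redis) later
    return {'pi1': randint(0, 9), 'pis1': randint(0, 9)}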

Implementing asynchronous task execution with the Celery framework

北战南征 submitted on 2019-12-06 12:49:17
Official Celery resources: website: http://www.celeryproject.org/ Documentation (English): http://docs.celeryproject.org/en/latest/index.html Documentation (Chinese): http://docs.jinkan.org/docs/celery/

Celery architecture: Celery's architecture consists of three parts: the message broker, the task execution unit (worker), and the task result store.

Message broker: Celery itself does not provide a messaging service, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store saves the results of tasks executed by workers; Celery supports storing task results in several ways, including AMQP, Redis, and others.

Use cases:
Asynchronous tasks: hand time-consuming operations to Celery for asynchronous execution, e.g. sending SMS/email, push notifications, audio/video processing.
Scheduled tasks: run something on a schedule, e.g. daily statistics (a celery beat sketch follows below).

Installing and configuring Celery: pip install celery. Message broker: RabbitMQ/Redis. app=Celery(
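A minimal sketch of the scheduled-task use case, using celery beat (the schedule, module, and task names are illustrative assumptions):

# beat_app.py -- run a statistics task once a day at midnight
from celery import Celery
from celery.schedules import crontab

app = Celery('stats', broker='redis://127.0.0.1:6379/1')
app.conf.beat_schedule = {
    'daily-stats': {
        'task': 'tasks.compute_daily_stats',
        'schedule': crontab(hour=0, minute=0),
    },
}
# start the scheduler with:  celery beat -A beat_app -l info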

Database is not updated in Celery task with Flask and SQLAlchemy

青春壹個敷衍的年華 submitted on 2019-12-06 12:28:32
I'm writing a web application with Flask and SQLAlchemy. My program needs to process some stuff in the background and then mark that stuff as processed in the database. Using the standard Flask/Celery example, I have something like this:

from flask import Flask
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task =
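A hedged sketch of a task body under this ContextTask pattern; the model and session names are assumptions, but an explicit commit inside the task is the usual missing piece when a worker's updates never reach the database:

# illustrative only -- assumes a Flask-SQLAlchemy `db` and an `Item` model
@celery.task()
def mark_processed(item_id):
    item = Item.query.get(item_id)
    item.processed = True
    db.session.commit()  # without this, the change stays in the worker's session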

Celery Consumer for SQS Messages

那年仲夏 submitted on 2019-12-06 12:19:10
Question: I am new to Celery and SQS, and would like to use them to periodically check messages stored in SQS and then fire a consumer. The consumer and Celery both live on EC2, while the messages are sent from GAE using the boto library. Currently, I am confused about: In the message body of creating_msg_gae.py, what task information should I put there? I assume this information would be the name of my celery task? In the message body of creating_msg_gae.py, is url considered as the argument to be
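For context, a minimal sketch of pointing Celery at SQS as its broker (the region and task are assumptions; AWS credentials are read from the environment, and the celery[sqs] extra must be installed):

# sqs_app.py -- illustrative only
from celery import Celery

app = Celery('tasks', broker='sqs://')
app.conf.broker_transport_options = {'region': 'us-east-1'}

@app.task
def process_url(url):
    print('processing', url)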

I'm getting an “ERROR (spawn error)” when I try to start my celery/supervisor instance

限于喜欢 submitted on 2019-12-06 12:14:53
Question: I've gone through how to use celery on my django production server using supervisor. However, when I try to start supervisor with sudo supervisorctl start app-celery, it returns: app-celery: ERROR (spawn error)

Here is my config /etc/supervisor/conf.d/app-celery.conf:

[program:app-celery]
command=/home/zorgan/app/env/bin/celery worker -A draft1 --loglevel=INFO
directory=/home/zorgan/app/draft1
numprocs=1
stdout_logfile=/var/log/supervisor/celery.log
stderr_logfile=/var/log/supervisor/celery