celery

Using Celery to implement scheduled tasks

Submitted by ☆樱花仙子☆ on 2020-01-01 01:21:33
Using Celery to implement scheduled tasks

Preface: Python has many ways to implement scheduled tasks, such as Celery, the schedule module, Timer in the threading module, the sched module, and the APScheduler scheduling framework, each with its own strengths. After comparing them several times I settled on Celery; below is an introduction to using it.

Architecture (summary of the diagram): beat, the task scheduler, dispatches commands; the broker (an MQ) forwards them; workers, the task execution units, run them; and the backend (an MQ or a database) stores the execution results.

1. Prepare the environment

First install Celery and its companions:

    pip install django
    pip install celery
    pip install django_celery_beat
    pip install mysqlclient

Install RabbitMQ (Redis can be used instead):

    brew install rabbitmq

and start the service once installation finishes:

    brew services start rabbitmq

2. Create the project

Initialize the project:

    django-admin startproject pystudy

Edit the Django settings:

    # integrate django_celery_beat
    INSTALLED_APPS = [
        ...,
        'django_celery_beat',
    ]

    # change the default language and time zone
    LANGUAGE_CODE = 'zh-hans'
    TIME_ZONE =

Celery AttributeError: async error

Submitted by 一曲冷凌霜 on 2019-12-31 08:54:24
Question: I have RabbitMQ and Celery running locally on my Mac (OS X 10.13.4); the following code works locally when I run add.delay(x, y):

    #!/usr/bin/env python
    from celery import Celery
    from celery.utils.log import get_task_logger

    logger = get_task_logger(__name__)

    app = Celery('tasks',
                 broker='pyamqp://appuser:xx@c2/appvhost',
                 backend='db+mysql://appuser:xx@c2/pigpen')

    @app.task(bind=True)
    def dump_context(self, x, y):
        print('Executing task id {0.id}, args: {0.args!r} kwargs {0.kwargs!r}'.format
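For context (an assumption about this particular traceback, but the usual cause of this error): Python 3.7 made `async` a reserved keyword, so Celery releases before 4.2, whose kombu dependency shipped a module literally named `async`, fail at import time on 3.7+. A stdlib-only check:

```python
import keyword
import sys

# On Python 3.7+ `async` is a hard keyword, so any old celery/kombu code
# that still uses `async` as a module or attribute name breaks on import.
is_reserved = keyword.iskeyword("async")
print(sys.version_info[:2], "async reserved:", is_reserved)
```

If that prints True, upgrading to Celery 4.2+ (which renamed the module to `asynchronous`) is the commonly cited fix.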

Twisted or Celery? Which is right for my application with lots of SOAP calls?

Submitted by 陌路散爱 on 2019-12-31 08:44:06
Question: I'm writing a Python application that needs both concurrency and asynchronicity. I've had a few recommendations each for Twisted and Celery, but I'm having trouble determining which is the better choice for this application (I have no experience with either). The application (which is not a web app) primarily centers around making SOAP calls out to various third-party APIs. To process a given piece of data, I'll need to call several APIs sequentially. And I'd like to be able to have a pool of

How to keep a request context in a celery task, in Python Flask?

Submitted by 白昼怎懂夜的黑 on 2019-12-30 19:26:16
Question: Is there a way to copy the request to a Celery task in Flask so that the task executes inside the request context that initiated it? I need to access the flask-security current user in a Celery task, but since the task runs outside the request context, I cannot do that. I need additional information from the request, so just forwarding the current user to the task would not do the trick. My task performs inserts on the database. It needs the current user to save the id of the

How to test celery with django on a windows machine

Submitted by ≯℡__Kan透↙ on 2019-12-30 12:22:56
Question: I'm looking for a resource, documentation, or advice on how to test Django Celery on my Windows machine before deploying to a Linux-based server. Any useful answer would be appreciated and accepted.

Answer 1: Celery (since version 4, as pointed out by another answer) does not support Windows (source: http://docs.celeryproject.org/en/latest/faq.html#does-celery-support-windows). Even so, you have some options: 1) Use task_always_eager=True. This will run your tasks synchronously; with this, you can

Running celeryd_multi with supervisor

Submitted by 我是研究僧i on 2019-12-30 08:30:14
Question: I'm working with djcelery and supervisor. I was running Celery under supervisor and everything worked fine; once I realized I needed to change it to celery multi, everything broke. If I run celeryd_multi in a terminal it works, but it always runs in the background, while supervisor needs the command to run in the foreground; that is where the problem is. This is my celery.ini:

    [program:celery_{{ division }}]
    command = {{ virtualenv_bin_dir }}/python manage.py celeryd_multi start default mailchimp -c
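A commonly suggested fix (a sketch with placeholder paths, not the asker's actual config): supervisor expects its child process to stay in the foreground, while celeryd_multi daemonizes and exits immediately, so one foreground worker per [program:...] section replaces the multi call:

```ini
; one foreground worker per supervisor program instead of celeryd_multi
[program:celery_default]
command = /path/to/virtualenv/bin/python manage.py celery worker -Q default -c 2
directory = /path/to/project
autostart = true
autorestart = true
stopwaitsecs = 600
killasgroup = true
```

Multiple queues then become multiple [program:...] sections, which also lets supervisor restart each worker independently.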

Adding extra celery configs to Airflow

Submitted by 那年仲夏 on 2019-12-30 06:57:11
Question: Does anyone know where I can add extra Celery configs to the Airflow Celery executor? For instance, I want the worker-pool-restarts property (http://docs.celeryproject.org/en/latest/userguide/configuration.html#worker-pool-restarts), but how do I pass extra Celery properties?

Answer 1: Use the just-released Airflow 1.9.0; this is now configurable. In airflow.cfg there is this line:

    # Import path for celery configuration options
    celery_config_options = airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG

which points
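The pattern the answer describes, sketched with a stand-in default dict (the real one lives at airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG; the module path and values here are placeholders): copy the defaults, override what you need, and point celery_config_options in airflow.cfg at the new dict's import path.

```python
# Stand-in for airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG;
# the real dict carries broker/backend settings derived from airflow.cfg.
DEFAULT_CELERY_CONFIG = {
    "broker_url": "redis://localhost:6379/0",
    "worker_concurrency": 16,
}

# Custom dict to reference from airflow.cfg, e.g.:
#   celery_config_options = my_package.celery_config.CELERY_CONFIG
CELERY_CONFIG = {
    **DEFAULT_CELERY_CONFIG,
    "worker_pool_restarts": True,  # the property asked about in the question
}
print(CELERY_CONFIG["worker_pool_restarts"])  # True
```

Starting from the default dict keeps Airflow's own broker/result settings intact while layering the extra Celery options on top.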

Python Flask with celery out of application context

Submitted by …衆ロ難τιáo~ on 2019-12-30 05:53:09
Question: I am building a website using Python Flask. Everything was going well, and now I am trying to implement Celery. That was going well too, until I tried to send an email using flask-mail from Celery. Now I am getting a "working outside of application context" error. The full traceback is:

    Traceback (most recent call last):
      File "/usr/lib/python2.7/site-packages/celery/task/trace.py", line 228, in trace_task
        R = retval = fun(*args, **kwargs)
      File "/usr/lib/python2.7/site-packages/celery/task/trace

Running background Celery task in Flask

Submitted by ≯℡__Kan透↙ on 2019-12-30 05:27:11
Question: (The problem has been updated to include progress made.) I have the following code and my Celery tasks kick off fine; I just don't know where I should store the async result so that I can look at it again later.

    #!/usr/bin/env python
    """Page views."""
    from flask import render_template, request
    from flask import Flask
    from celerytest import add
    from time import sleep

    app = Flask(__name__)

    async_res = []

    @app.route('/', methods=['GET', 'POST'])
    def run():
        if request.method == 'GET':
            return render