celery

How can I run a celery periodic task from the shell manually?

Anonymous (unverified), submitted 2019-12-03 02:45:02
Question: I'm using celery and django-celery. I have defined a periodic task that I'd like to test. Is it possible to run the periodic task from the shell manually, so that I can view the console output?

Answer 1: Have you tried just running the task from the Django shell? You can use the .apply method of a task to ensure that it is run eagerly and locally. Assuming the task is called my_task, in the Django app myapp, in a tasks submodule:

$ python manage.py shell
>>> from myapp.tasks import my_task
>>> eager_result = my_task.apply()

The result instance has the same …

Class notes

谁说我不能喝, submitted 2019-12-03 02:44:00
Contents: review; today; the celery framework.

Review:
1. redis
   - start the server: redis-server &
   - client: redis-cli -h localhost -p 6379 -n 10
   - setex key time value
   - pip install redis: use the client directly, or through a connection pool
   - pip install django-redis: points Django's cache at redis, so Django's cache framework can be used:
     cache.set(key, value, exp)  # value can be the serialized output of a DRF serializer class
     cache.get(key)
2. Phone verification and the verification-code interface
   - the frontend submits a phone number; the backend checks whether it is already registered
   - the frontend submits a phone number; the backend validates it, generates a code (stored in the cache), and hands the number to a third party that sends the SMS

Today: login and registration; interface caching; the celery framework; course home and detail pages.

The celery framework:
1. celery ships with its own socket handling, so it runs as an independent service.
2. Starting the celery service is what executes its tasks: the service contains an object that executes tasks (the worker), runs tasks as they become ready, and saves their results.
3. The celery framework has three parts: a broker that holds the tasks to execute, a worker that executes them, and a backend that stores the results.
4. The installed celery package only provides the worker by default; the broker and backend must be supplied by other technologies.

Updating a Haystack search index with Django + Celery

怎甘沉沦, submitted 2019-12-03 02:41:07
Question: In my Django project I am using Celery. I switched a command over from crontab to a periodic task, and it works well, but it is just calling a method on a model. Is it possible to update my Haystack index from a periodic task as well? Has anyone done this?

./manage.py update_index

That's the command to update the index from the Haystack documentation, but I'm not sure how to call it from a task.

Answer 1: The easiest way to do this would probably be to run the management command directly from …

Issues with Celery configuration on AWS Elastic Beanstalk - “No config updates to processes”

Anonymous (unverified), submitted 2019-12-03 02:38:01
Question: I have a Django 2 application deployed on AWS Elastic Beanstalk, and I'm trying to configure Celery in order to execute async tasks on the same machine. My files:

02_packages.config:

files:
  "/usr/local/share/pycurl-7.43.0.tar.gz":
    mode: "000644"
    owner: root
    group: root
    source: https://pypi.python.org/packages/source/p/pycurl/pycurl-7.43.0.tar.gz
packages:
  yum:
    python34-devel: []
    libcurl-devel: []
commands:
  01_download_pip3:
    # run this before pip installs requirements, as it needs to be compiled with OpenSSL
    command: 'curl -O https://bootstrap.pypa…
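Setting the packaging error aside, the usual pattern on Elastic Beanstalk is to run the celery worker as a supervisord program rather than as part of the web process. A sketch of such a program entry (the paths and program name follow the classic Amazon Linux Python platform conventions and are assumptions, not taken from the question):

```ini
; celery.conf -- dropped into supervisord's include dir by an .ebextensions file
[program:celeryd]
command=/opt/python/run/venv/bin/celery worker -A myproject -l info
directory=/opt/python/current/app
autostart=true
autorestart=true
```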

Celery dies with DBPageNotFoundError

Anonymous (unverified), submitted 2019-12-03 02:31:01
Question: I have 3 machines with celery workers and rabbitmq as the broker; one worker is running with the beat flag. All of this is managed by supervisor, and sometimes celery dies with the error below. The error appears only on the beat worker, but when it appears, the workers on all machines die. (celery==3.1.12, kombu==3.0.20)

[2014-07-05 08:37:04,297: INFO/MainProcess] Connected to amqp://user:**@192.168.15.106:5672//
[2014-07-05 08:37:04,311: ERROR/Beat] Process Beat
Traceback (most recent call last):
  File "/var/projects/env/local/lib/python2.7/site-packages…

Unit testing an AsyncResult in celery

人盡茶涼, submitted 2019-12-03 02:29:15
I am trying to test some celery functionality in Django's unit-testing framework, but whenever I try to check an AsyncResult the tests act as if it was never started. I know this code works in a real environment with RabbitMQ, so I was just wondering why it doesn't work under the testing framework. Here is an example:

@override_settings(CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                   CELERY_ALWAYS_EAGER=True,
                   BROKER_BACKEND='memory')
def test_celery_do_work(self):
    result = myapp.tasks.celery_do_work.AsyncResult('blat')
    applied_task = myapp.tasks.celery_do_work.apply_async((), task_id='blat'…

How to structure celery tasks

房东的猫, submitted 2019-12-03 02:26:29
I have 2 types of task: async tasks and scheduled tasks. Here is my directory structure:

proj
|-- tasks
|   |-- __init__.py
|   |-- celeryapp.py    => celery instance defined in this file
|   |-- celeryconfig.py
|   |-- async
|   |   |-- __init__.py
|   |   |-- task1.py    => from proj.tasks.celeryapp import celery
|   |   |-- task2.py    => from proj.tasks.celeryapp import celery
|   |-- schedule
|       |-- __init__.py
|       |-- task1.py    => from proj.tasks.celeryapp import celery
|       |-- task2.py    => from proj.tasks.celeryapp import celery

But when I run the celery worker as below, it does not work: it cannot accept the task from celery beat …

Celery workers unable to connect to redis on docker instances

Anonymous (unverified), submitted 2019-12-03 02:20:02
Question: I have a dockerized setup running a Django app, within which I use Celery tasks. Celery uses Redis as the broker.

Versions:
Docker version 17.09.0-ce, build afdb6d4
docker-compose version 1.15.0, build e12f3b9
Django==1.9.6
django-celery-beat==1.0.1
celery==4.1.0
celery[redis]
redis==2.10.5

Problem: my celery workers appear to be unable to connect to the redis container located at localhost:6379. I am able to telnet into the redis server on the specified port. I am able to verify redis-server is running on the container. When I manually …
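Inside a compose network each container has its own localhost, so the worker must reach Redis by the redis service's name rather than by localhost. A minimal sketch of the idea (service names and the project name are assumptions):

```yaml
version: '3'
services:
  redis:
    image: redis
  worker:
    build: .
    # point celery at the redis *service name*, not localhost
    command: celery -A proj worker -l info
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
```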

Celery & RabbitMQ: WARNING/MainProcess] Received and deleted unknown message. Wrong destination?!? - an experiment on a Git project

Anonymous (unverified), submitted 2019-12-03 02:20:02
Question: Recently I have been doing an experiment on a Git project to understand this big-data processing framework.

Git project: https://github.com/esperdyne/celery-message-processing

We have the following components:
1. AMQP broker (RabbitMQ): works as a message buffer, a mail-box used to exchange messages between different users.
2. worker: works as the server providing services to the various service clients.
3. queue ("celery"): works as a multi-processing container used to handle the various worker instances at the same …

Invoke a celery task from Tornado [duplicate]

Anonymous (unverified), submitted 2019-12-03 02:20:02
Question: This question already has an answer here: Tornado celery integration hacks (4 answers). How can someone invoke a celery task from Tornado, and get the result via a callback? This post claims that you must simply put a message via RabbitMQ and the task will then be executed. This makes sense, but can someone give an example in Python (even better in Tornado, with a callback)? Personally, I use mongodb as my message broker, but I can switch to Redis or RabbitMQ as well.

EDIT: To clarify things, I want an example with a callback. For example …