celery

Sending email asynchronously with Celery

无人久伴 · submitted 2019-12-23 02:59:58
1. settings.py

```python
# Email configuration
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
# SMTP server address
EMAIL_HOST = 'smtp.163.com'
EMAIL_PORT = 25
# Mailbox that sends the mail
EMAIL_HOST_USER = 'smartli_it@163.com'
# Client authorization password set in the mailbox
EMAIL_HOST_PASSWORD = 'smartli123'
# Sender shown to recipients
EMAIL_FROM = '天天生鲜<smartli_it@163.com>'
```

2. Writing tasks.py. Create a package and a .py file in the project root. The email-sending code is as follows:

```python
from django.core.mail import send_mail
from django.conf import settings
from celery import Celery

# Add these lines on the task-worker side
import os
import django
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "dailyfresh.settings")
django.setup()
```
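Putting the two fragments together, a sketch of a complete tasks.py for this setup; the Redis broker URL and the task name/body are assumptions, as the post shows neither:

```python
# tasks.py in the project root. The django.setup() lines must run
# before any Django mail/model imports are used on the worker side.
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "dailyfresh.settings")

import django
django.setup()

from django.conf import settings
from django.core.mail import send_mail
from celery import Celery

app = Celery("tasks", broker="redis://127.0.0.1:6379/1")  # assumed broker URL

@app.task
def send_register_email(to_email, username):
    """Example task (hypothetical): send a welcome mail asynchronously."""
    send_mail(
        "Welcome",                          # subject (placeholder wording)
        "",                                 # plain-text body
        settings.EMAIL_FROM,
        [to_email],
        html_message="<h1>%s, welcome</h1>" % username,
    )
```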

Sending email with Celery in Django

╄→尐↘猪︶ㄣ · submitted 2019-12-23 02:59:49
Asynchronous tasks: sending email with Celery. Install the Python packages:

```shell
pip install celery==3.1.25
pip install django-celery==3.2.1
pip install celery-with-redis==3.0
```

Create a file named task.py under your app to wrap the time-consuming tasks.

Configure the mailbox in settings:

```python
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
EMAIL_USE_TLS = False
EMAIL_HOST = 'smtp.163.com'
EMAIL_PORT = 25
EMAIL_HOST_USER = 'xxxxx@163.com'
EMAIL_HOST_PASSWORD = 'xxxxx'
EMAIL_FROM = 'xxxxx@163.com'

import djcelery
djcelery.setup_loader()

# Configure Redis: password@ip:port/database number
BROKER_URL = 'redis://:pzl123456@47.106.37.80:6379/0'
CELERY_IMPORTS = ('users.task',)
```

python manage.py
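A sketch of what users/task.py might contain in the django-celery 3.x style described above; the task name, subject and body are placeholders, not from the post:

```python
# users/task.py -- django-celery 3.x style task module (sketch).
from celery import task
from django.conf import settings
from django.core.mail import send_mail

@task
def send_mail_task(to_addr):
    """Send a plain-text mail in the worker instead of the request cycle."""
    send_mail(
        'subject',            # placeholder subject
        'message body',       # placeholder body
        settings.EMAIL_FROM,
        [to_addr],
    )
```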

Sending email asynchronously with Django + Celery + Redis

我只是一个虾纸丫 · submitted 2019-12-23 02:59:16
Reference: http://blog.csdn.net/Ricky110/article/details/77205291

Environment: CentOS 7 + Python 3.6.1 + Django 2.0.1 + Celery 4.1.0 + Redis 3.2.10

```shell
yum install -y redis
pip3 install redis celery django
```

Getting started: create the Django project my_report and the app celery_test, then register the app in INSTALLED_APPS.

Celery configuration in settings:

```python
# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379'

#: Only add pickle to this list if your broker is secured
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_TASK_SERIALIZER = 'json'
CELERY_ENABLE_UTC = True
CELERY_TIMEZONE = 'Asia/Shanghai'
```

Mail configuration in settings: EMAIL
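A sketch of the celery.py bootstrap module that typically pairs with those CELERY_*-namespaced settings in a Celery 4 + Django 2 project; the module path my_report/celery.py is an assumption:

```python
# my_report/celery.py -- Celery 4 bootstrap sketch.
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_report.settings')

app = Celery('my_report')
# Read the CELERY_* keys from Django settings (matches the namespace above).
app.config_from_object('django.conf:settings', namespace='CELERY')
# Discover tasks.py modules in each app listed in INSTALLED_APPS.
app.autodiscover_tasks()
```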

Celery + gevent using only one CPU core

拈花ヽ惹草 · submitted 2019-12-23 01:01:14
Question: I have performance problems running Celery with gevent: everything runs on the same core of my VPS. Here's a screenshot of 4 Celery instances with a gevent concurrency of 20 each. How do I fix this? What am I doing wrong? Here's my first task:

```python
def update_sender():
    items = models.Item.objects.filter(active=True).all()
    count = items.count()
    items = [i.id for i in items]
    step = count / settings.WORKERS
    for job in list(chunks(items, step)):
        update_item.apply_async(args=[job])
```

Calling the
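As an aside, on Python 3 `count / settings.WORKERS` yields a float, which would break the slicing inside `chunks`. A self-contained sketch of the chunking step with integer division; the `chunks` helper here is an assumption, since the asker's version is not shown:

```python
def chunks(items, step):
    """Yield successive slices of `items` of length `step`."""
    for i in range(0, len(items), step):
        yield items[i:i + step]

def split_for_workers(ids, workers):
    """Split a list of ids into roughly one chunk per worker."""
    # Integer division: on Python 3, `count / workers` is a float,
    # which range() and slicing would reject.
    step = max(1, len(ids) // workers)
    return list(chunks(ids, step))
```

For example, 10 ids across 3 workers gives chunks of 3 plus a remainder chunk of 1.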

Celery 4 not auto-discovering tasks

落爺英雄遲暮 · submitted 2019-12-22 18:30:38
Question: I have a Django 1.11 and Celery 4.1 project, and I've configured it according to the setup docs. My celery_init.py looks like:

```python
from __future__ import absolute_import
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.settings'

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
#app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) # does
```
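For reference, in Celery 4 the argument-free call is enough once the settings module is set, because `autodiscover_tasks()` defaults to scanning INSTALLED_APPS. A sketch of how the tail of such a celery_init.py usually looks (only the last line differs from the snippet in the question):

```python
from __future__ import absolute_import
import os
from celery import Celery

os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.settings'

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Celery 4: no lambda needed -- by default this scans INSTALLED_APPS
# for a tasks.py in each app once Django is set up.
app.autodiscover_tasks()
```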

Why can't Celery daemon see tasks?

南笙酒味 · submitted 2019-12-22 18:25:06
Question: I have a Django 1.6.2 application running on Debian 7.8 with Nginx 1.2.1 as my proxy server and Gunicorn 19.1.1 as my application server. I've installed Celery 3.1.7 and RabbitMQ 2.8.4 to handle asynchronous tasks. I'm able to start a Celery worker as a daemon, but whenever I try to run the test "add" task shown in the Celery docs, I get the following error:

Received unregistered task of type u'apps.photos.tasks.add'. The message has been ignored and discarded.

Traceback (most recent call
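The usual cause of "Received unregistered task" is that the worker never imports the module defining the task. A hedged sketch of one fix in the Celery 3.1 settings style; the broker URL is a placeholder, and only the module path 'apps.photos.tasks' comes from the error above:

```python
# Sketch: force the worker to import the task module at startup so the
# tasks it defines get registered.
from celery import Celery

app = Celery('myapp', broker='amqp://guest@localhost//')  # placeholder broker
app.conf.update(
    CELERY_IMPORTS=('apps.photos.tasks',),  # registers apps.photos.tasks.add
)
```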

Flask-Mail breaks Celery

那年仲夏 · submitted 2019-12-22 13:48:26
Question: I've got a Flask app where Celery works fine and Flask-Mail on its own works fine as well.

```python
from flask import Flask
from celery import Celery
from flask_mail import Mail, Message

app = Flask(__name__)
mail = Mail(app)
celery = Celery('main_app',
                broker='mongodb://localhost',
                backend='mongodb://localhost')

@celery.task
def cel_test():
    return 'cel_test'

@app.route('/works_maybe')
def works_maybe():
    return cel_test.delay()
```

SO FAR, SO GOOD. cel_test works fine with the celery worker; everything shows up in mongo. But here
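One common culprit in this combination (an assumption; the question is truncated before the failing part) is that Flask-Mail needs an application context inside the worker process. A sketch of sending from within the task, reusing the names from the snippet above:

```python
from flask import Flask
from flask_mail import Mail, Message
from celery import Celery

app = Flask(__name__)
mail = Mail(app)
celery = Celery('main_app',
                broker='mongodb://localhost',
                backend='mongodb://localhost')

@celery.task
def send_async_email(subject, recipients, body):
    # Flask-Mail reads its configuration from the Flask app, so the
    # worker must push an application context before sending.
    with app.app_context():
        mail.send(Message(subject, recipients=recipients, body=body))
```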

Django Celery and multiple databases (Celery, Django and RabbitMQ)

北城以北 · submitted 2019-12-22 11:31:41
Question: Is it possible to set a different database to be used with Django Celery? I have a project with multiple databases in its configuration and don't want Django Celery to use the default one. It would be nice if I could still use the django-celery admin pages and read the results stored in this other database :)

Answer 1: It should be possible to set up a separate database for the django-celery models using Django database routers: https://docs.djangoproject.com/en/1.4/topics/db/multi-db/#automatic-database
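A router sketch along those lines, following the Django 1.4-era router protocol referenced by the answer; the 'celery_db' alias and the app-label set are assumptions, not from the thread:

```python
class CeleryRouter(object):
    """Route django-celery models to a dedicated database alias (sketch)."""
    APP_LABELS = {'djcelery'}   # app label used by the django-celery models
    DB_ALIAS = 'celery_db'      # hypothetical alias from settings.DATABASES

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.APP_LABELS:
            return self.DB_ALIAS
        return None             # fall through to other routers / default

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.APP_LABELS:
            return self.DB_ALIAS
        return None

    def allow_syncdb(self, db, model):
        # Django 1.4-era hook: only create the celery tables in celery_db.
        if model._meta.app_label in self.APP_LABELS:
            return db == self.DB_ALIAS
        return None
```

Added to DATABASE_ROUTERS in settings, this keeps reads, writes, and table creation for the django-celery models on the separate database while leaving everything else untouched.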

Run Unittest On Main Django Database

不羁岁月 · submitted 2019-12-22 10:56:53
Question: I'm looking for a way to run a full Celery setup during Django tests, asked in this other SO question. After thinking about it, I think I could settle for running a unittest (it's more of an integration test) in which I run the test script against the main Django (development) database. Is there a way to write unittests, run them with Nose, and do so against the main database? I imagine it would be a matter of telling Nose (or whatever other framework) about the Django settings. I've looked at
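One approach (a sketch under assumptions, not from the thread): point TEST_RUNNER at a runner whose database setup/teardown are no-ops, so tests run against whatever settings.DATABASES already points at. The module path and class name are hypothetical:

```python
# myproject/testrunner.py -- hypothetical module.
from django.test.runner import DiscoverRunner

class NoDbTestRunner(DiscoverRunner):
    """Run tests against the live/development database by skipping the
    creation and destruction of the throwaway test database."""

    def setup_databases(self, **kwargs):
        return None  # keep the configured databases as-is

    def teardown_databases(self, old_config, **kwargs):
        pass         # nothing was created, so nothing to destroy

# settings.py:
# TEST_RUNNER = 'myproject.testrunner.NoDbTestRunner'
```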