celery

Celery Flower Security in Production

≯℡__Kan透↙ submitted on 2019-11-29 20:26:39
I am looking to use Flower ( https://github.com/mher/flower ) to monitor my Celery tasks in place of the django-admin, as recommended in their docs ( http://docs.celeryproject.org/en/latest/userguide/monitoring.html#flower-real-time-celery-web-monitor ). However, because I am new to this, I am a little confused that Flower's page is served only over HTTP, not HTTPS. How can I secure my Celery tasks so that any old user can't just visit the no-login-needed site http://flowerserver.com:5555 and change something? I have considered Celery's own documentation on this, but
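One common way to lock Flower down (a sketch, not the only approach) is to bind it to localhost, enable its basic-auth option, and put an HTTPS-terminating reverse proxy such as nginx in front of it. The file name, credentials, and proxy setup below are assumptions; check the Flower docs for your version.

    # flowerconfig.py -- a minimal sketch of Flower's config-file options.
    # Start it with something like:  celery -A proj flower --conf=flowerconfig.py
    address = '127.0.0.1'             # bind to localhost only, never to 0.0.0.0
    port = 5555
    basic_auth = ['admin:change-me']  # HTTP basic auth; pair it with TLS terminated
                                      # at the reverse proxy so credentials are not
                                      # sent over plain HTTP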

Celery and Django simple example

Deadly submitted on 2019-11-29 19:26:53
Let's take a simple Django example.

app/models.py

    from django.db import models
    from django.contrib.auth.models import User

    class UserProfile(models.Model):
        user = models.OneToOneField(User)
        token = models.CharField(max_length=32)

app/views.py

    from django.http import HttpResponse
    from django.views.decorators.csrf import csrf_exempt
    from forms import RegisterForm
    from utils.utilities import create_user

    @csrf_exempt
    def register_view(request):
        if request.method == 'POST':
            form = RegisterForm(request.POST)
            if form.is_valid():
                create_user(form.cleaned_data)
                return HttpResponse('success')

utils
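If the goal is to push create_user onto Celery, one sketch is to make it a shared task and enqueue it from the view with .delay(). The module path, form field names, and profile fields below are assumptions based on the truncated excerpt.

    # utils/utilities.py (hypothetical module path, matching the import in views.py)
    from celery import shared_task
    from django.contrib.auth.models import User
    from app.models import UserProfile   # app label is an assumption

    @shared_task
    def create_user(cleaned_data):
        # runs on a worker instead of inside the request/response cycle
        user = User.objects.create_user(username=cleaned_data['username'],
                                        password=cleaned_data['password'])
        UserProfile.objects.create(user=user, token=cleaned_data.get('token', ''))

    # in views.py the call would then become:
    #     create_user.delay(form.cleaned_data)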

Celery parallel distributed task with multiprocessing

和自甴很熟 submitted on 2019-11-29 19:19:25
I have a CPU-intensive Celery task. I would like to use all the processing power (cores) across lots of EC2 instances to get this job done faster (a Celery parallel distributed task with multiprocessing, I think). The terms threading, multiprocessing, distributed computing, and distributed parallel processing are all terms I'm trying to understand better. Example task:

    @app.task
    for item in list_of_millions_of_ids:
        id = item  # do some long complicated equation here, very CPU heavy!!!
        database.objects(newid=id).save()

Using the code above (with an example if possible), how would one go
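One way to spread this across every core on every instance (a sketch, reusing the app and database objects from the question, with a hypothetical compute() standing in for the heavy equation) is to split the id list into chunks and dispatch one subtask per chunk; workers on all machines then pull chunks from the broker in parallel.

    from celery import group

    @app.task
    def process_chunk(id_chunk):
        for item in id_chunk:
            newid = compute(item)                 # the CPU-heavy equation (hypothetical)
            database.objects(newid=newid).save()

    def process_all(list_of_millions_of_ids, chunk_size=10000):
        chunks = [list_of_millions_of_ids[i:i + chunk_size]
                  for i in range(0, len(list_of_millions_of_ids), chunk_size)]
        # one subtask per chunk; each worker process handles chunks independently,
        # so all cores across all EC2 instances are kept busy
        return group(process_chunk.s(chunk) for chunk in chunks)()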

Python: Using Celery in Django

橙三吉。 submitted on 2019-11-29 19:13:54
1. Requests in Django

In a Django web application, the flow from an incoming HTTP request to the HTML response is roughly:

    HTTP request arrives
    passes through middleware
    HTTP handling (request parsing)
    URL mapping (the URL is matched to the corresponding view)
    the view runs its logic (including Model calls for database create/read/update/delete)
    passes through middleware again
    the corresponding template/response is returned

Synchronous request: all logic and data processing finish inside the view before the response is returned; the user waits until the page comes back with the result.
Asynchronous request: the view returns a response first and the task is processed in the background; the user does not have to wait and can keep browsing the site, and we can notify them once the task completes.

2. Using Celery in Django

Installation:

    pip3 install django-celery

Configuration: first create a Django project (the original post shows its layout in a screenshot), then add a celeryconfig.py configuration file in the same directory as settings.py; see the official documentation for more configuration options.

    import djcelery
    from datetime import timedelta

    djcelery.setup_loader()

    # import tasks
    CELERY
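The excerpt cuts off at CELERY. As a rough sketch of how a djcelery-era celeryconfig.py usually continues (the task module, broker URL, and example schedule below are assumptions, not the original author's values):

    import djcelery
    from datetime import timedelta

    djcelery.setup_loader()

    # import tasks
    CELERY_IMPORTS = ('app.tasks',)                       # module name is an assumption
    BROKER_URL = 'amqp://guest:guest@localhost:5672//'    # broker URL is an assumption

    # optional periodic schedule for celerybeat
    CELERYBEAT_SCHEDULE = {
        'add-every-30-seconds': {
            'task': 'app.tasks.add',                      # hypothetical task
            'schedule': timedelta(seconds=30),
            'args': (2, 3),
        },
    }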

how to setup sqlalchemy session in celery tasks with no global variable

佐手、 submitted on 2019-11-29 18:55:45
Question: Summary: I want to use a SQLAlchemy session in Celery tasks without having a global variable containing that session. I am using SQLAlchemy in a project with Celery tasks. Currently, I have a global variable 'session' defined along with my Celery app setup (celery.py), with a worker signal to set it up.

    session = scoped_session(sessionmaker())

    @celeryd_init.connect
    def configure_workers(sender=None, conf=None, **kwargs):
        # load the application configuration
        # db_uri = conf['db
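One pattern that avoids the module-level global (a sketch, not necessarily the poster's eventual solution) is to hang the session off a custom Task base class, so each worker process creates it lazily on first use. The database URI and query below are placeholders.

    from celery import Celery, Task
    from sqlalchemy import create_engine, text
    from sqlalchemy.orm import scoped_session, sessionmaker

    app = Celery('proj')   # stands in for the app defined in celery.py

    class DatabaseTask(Task):
        _session = None

        @property
        def session(self):
            if self._session is None:
                # the real URI would come from the worker configuration
                engine = create_engine('sqlite:///example.db')
                self._session = scoped_session(sessionmaker(bind=engine))
            return self._session

    @app.task(base=DatabaseTask, bind=True)
    def do_work(self, item_id):
        # self.session is created on first use in each worker process -- no global needed
        return self.session.execute(text('SELECT 1')).scalar()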

Setting up periodic tasks in Celery (celerybeat) dynamically using add_periodic_task

人盡茶涼 submitted on 2019-11-29 18:50:25
Question: I'm using Celery 4.0.1 with Django 1.10 and I'm having trouble scheduling tasks (running a task works fine). Here is the Celery configuration:

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')
    app = Celery('myapp')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
    app.conf.BROKER_URL = 'amqp://{}:{}@{}'.format(settings.AMQP_USER, settings.AMQP_PASSWORD, settings.AMQP_HOST)
    app.conf.CELERY_DEFAULT_EXCHANGE = 'myapp.celery'
    app.conf.CELERY_DEFAULT_QUEUE = 'myapp.celery
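For Celery 4, the documented way to register schedules in code is the on_after_configure signal together with add_periodic_task. A sketch using the same app object shown above; the task name and interval are examples:

    @app.on_after_configure.connect
    def setup_periodic_tasks(sender, **kwargs):
        # schedule my_periodic_task every 10 seconds
        sender.add_periodic_task(10.0, my_periodic_task.s(), name='tick every 10s')

    @app.task
    def my_periodic_task():
        print('periodic tick')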

Django Celery tutorial not returning results

拥有回忆 submitted on 2019-11-29 18:27:16
Question: UPDATE3: found the issue. See the answer below. UPDATE2: It seems I might have been dealing with an automatic naming and relative imports problem by running the djcelery tutorial through the manage.py shell; see below. It is still not working for me, but now I get new log error messages. See below. UPDATE: I added the log at the bottom of the post. It seems the example task is not registered? Original Post: I am trying to get django-celery up and running. I was not able to get through the
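If the symptom is an unregistered task, one quick check on reasonably recent Celery versions (a sketch; older django-celery releases may expose the registry differently) is to print the task registry and confirm the tutorial task appears under the exact name the worker expects:

    from celery import current_app

    # current_app.tasks maps fully qualified task names to task objects;
    # the tutorial's example task should show up in this list
    print(sorted(current_app.tasks.keys()))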

RabbitMQ on EC2 Consuming Tons of CPU

∥☆過路亽.° submitted on 2019-11-29 18:07:33
Question: I am trying to get RabbitMQ with Celery and Django going on an EC2 instance to do some pretty basic background processing. I'm running rabbitmq-server 2.5.0 on a large EC2 instance. I downloaded and installed the test client per the instructions here (at the very bottom of the page). I have been just letting the test script go and am getting the expected output:

    recving rate: 2350 msg/s, min/avg/max latency: 588078478/588352905/588588968 microseconds
    recving rate: 1844 msg/s, min/avg/max

Python Celery - How to call celery tasks inside other task

混江龙づ霸主 submitted on 2019-11-29 14:42:37
Question: I'm calling a task within a task in Django-Celery. Here are my tasks.

    @shared_task
    def post_notification(data, url):
        url = "http://posttestserver.com/data/?dir=praful"  # when in production, remove this line
        headers = {'content-type': 'application/json'}
        requests.post(url, data=json.dumps(data), headers=headers)

    @shared_task
    def shipment_server(data, notification_type):
        notification_obj = Notification.objects.get(name=notification_type)
        server_list = ServerNotificationMapping.objects.filter
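A sketch of how the second task typically queues the first one with .delay(), based on the tasks shown above; the filter keyword, loop body, and server_url field are assumptions about the truncated code.

    from celery import shared_task

    @shared_task
    def shipment_server(data, notification_type):
        notification_obj = Notification.objects.get(name=notification_type)
        server_list = ServerNotificationMapping.objects.filter(notification=notification_obj)
        for mapping in server_list:
            # .delay() enqueues post_notification as its own task instead of
            # executing it synchronously inside this one
            post_notification.delay(data, mapping.server_url)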

django celery: how to set task to run at specific interval programmatically

半腔热情 submitted on 2019-11-29 14:42:30
Question: I found that I can set a task to run at a specific interval or at specific times from here, but that was only done at task declaration. How do I set a task to run periodically, dynamically?

Answer 1: The schedule is derived from a setting, and thus seems to be immutable at runtime. You can probably accomplish what you're looking for using Task ETAs. This guarantees that your task won't run before the desired time, but doesn't promise to run the task at the designated time; if the workers are
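A sketch of the ETA/countdown approach the answer refers to, assuming my_task is an already-defined task and the arguments are placeholders:

    from datetime import datetime, timedelta, timezone

    # won't run before run_at, though it may run later if the workers are busy
    run_at = datetime.now(timezone.utc) + timedelta(minutes=30)
    my_task.apply_async(args=(1, 2), eta=run_at)

    # equivalently, relative to now:
    my_task.apply_async(args=(1, 2), countdown=30 * 60)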