celery

How to configure Celery with Redis in a Django project on Microsoft Azure?

Posted by 流过昼夜 on 2019-12-20 04:09:51
Question: I have a Django locator project deployed on Azure. My Redis cache host name (DNS) is mycompany.azure.microsoft.net. I created it in Azure, but I am not sure where to find the password for the Redis server. This is the configuration in my settings.py; I am using Redis as the broker for my Celery setup: BROKER_URL = 'redis://:passwordAzureAccessKey=@mycompany.redis.cache.windows.net:6380/0' I could not connect. Is there somewhere else I need to put the password or username?
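A sketch of one likely fix, assuming the usual Azure Cache for Redis setup: the "password" is the cache's access key (Azure portal, under Access keys; there is no separate username), and port 6380 is the SSL-only port, so the URL scheme generally needs to be rediss:// rather than redis://. The key and host below are placeholders:

```python
from urllib.parse import quote

# Hypothetical values -- substitute your own access key and cache host.
access_key = "tHeAzUrEpRiMaRyAcCeSsKeY="   # Azure portal -> your cache -> Access keys
host = "mycompany.redis.cache.windows.net"

# Port 6380 requires SSL, hence rediss:// (note the extra "s"). The key must be
# URL-quoted because Azure access keys usually end in "=".
BROKER_URL = f"rediss://:{quote(access_key)}@{host}:6380/0"
print(BROKER_URL)
```

Depending on the Celery version, an alternative is a plain redis:// URL combined with the BROKER_USE_SSL setting; older releases do not understand the rediss:// scheme.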

How to run Celery workers as superuser?

Posted by 老子叫甜甜 on 2019-12-20 03:30:39
Question: When I run Celery workers with sudo, I get the following error: Running a worker with superuser privileges when the worker accepts messages serialized with pickle is a very bad idea! If you really want to continue then you have to set the C_FORCE_ROOT environment variable (but please think about this before you do). User information: uid=0 euid=0 gid=0 egid=0 Also, my C_FORCE_ROOT environment variable is true: echo $C_FORCE_ROOT true More information: Python 2.7.6, Celery 3.1.23
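The likely culprit here is sudo itself: it starts the worker with a mostly fresh environment, so a variable exported in the login shell is not what the worker sees. The sketch below simulates that with a child process given an emptied environment (the app name in the comment is hypothetical):

```python
import os
import subprocess
import sys

# Exported in the parent shell -- but sudo will not pass this through.
os.environ["C_FORCE_ROOT"] = "true"

# Child process with an emptied environment, which is roughly what sudo gives you:
out = subprocess.run(
    [sys.executable, "-c",
     "import os; print(os.environ.get('C_FORCE_ROOT', 'unset'))"],
    env={}, capture_output=True, text=True,
).stdout.strip()
print("worker sees:", out)

# The usual fix is to set the variable on the sudo command line itself, e.g.:
#   sudo C_FORCE_ROOT=true celery -A proj worker -l info   (hypothetical app name)
```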

Django - Should external API requests always be made through a task handler (e.g. Celery)?

Posted by 老子叫甜甜 on 2019-12-20 03:09:22
Question: I have a Django app in which I have created a custom middleware. It works as follows: the middleware intercepts a token (which identifies the user) in each request and makes a request to an external API with that token. The external API returns the permissions of the user making the original request. The middleware completes, and the user gets data back based on their permissions. My question: because my app has to wait for the API request to return before it can process the…

Celery and Django - No module named 'django'

Posted by 别说谁变了你拦得住时间么 on 2019-12-20 02:34:52
Question: I am following the instructions described here, with Python 2.7 and Celery 3.1.17. At the beginning of celery.py I have: from __future__ import absolute_import import os from celery import Celery from django.conf import settings When I run: celery -A proj worker -l info I get an error: from django.conf import settings ImportError: No module named 'django' But Django is installed and my project works. How can I fix this? Thanks! Answer 1: For me the issue was that I had kombu.transport.django in INSTALLED_APPS. I was…
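Another common cause of this exact traceback (a guess, not from the answer above): the `celery` script on PATH belongs to a different interpreter than the one where Django is installed, e.g. a system-wide Celery with Django in a virtualenv. A small diagnostic sketch to compare the two:

```python
import shutil
import sys

# Which interpreter is the current shell using?
shell_python = sys.executable
print("shell python :", shell_python)

# Which interpreter will the `celery` script use? Its shebang line says.
celery_bin = shutil.which("celery")
if celery_bin:
    with open(celery_bin) as f:
        print("celery shebang:", f.readline().strip())
else:
    print("celery script : not on PATH")
```

If the two paths differ, installing Celery into the same environment as Django (pip install celery inside the virtualenv) usually resolves the import error.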

Can I use Java to send tasks to Celery through RabbitMQ?

Posted by China☆狼群 on 2019-12-20 02:30:40
Question: I have only been working with Celery and Java for two days. :( Right now my task is to have a Java client send tasks through RabbitMQ, with Celery as the worker that handles them. I know Python -> RabbitMQ -> Celery is easy, but can I do Java -> RabbitMQ -> Celery? The rough idea is to serialize the call as JSON, send it via RabbitMQ, and have Celery handle it. Example code that can be run directly would be appreciated, thanks. Answer 1: You can certainly send messages through RabbitMQ from Java.
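The key is that a Celery worker will accept any message on its queue that follows Celery's message format. A hedged sketch of the original (version 1) task body as JSON, shown in Python for brevity; a Java client that publishes this JSON to the worker's queue, with content_type application/json and content_encoding utf-8, should have the task picked up by a worker that registers `tasks.add` (a hypothetical task name):

```python
import json
import uuid

# Celery protocol v1 task body: this is what a Python client would put on the
# queue, and what a Java client needs to reproduce byte-for-byte as JSON.
body = {
    "task": "tasks.add",        # task name registered on the worker
    "id": str(uuid.uuid4()),    # unique task id, any UUID string
    "args": [2, 2],             # positional arguments
    "kwargs": {},               # keyword arguments
}
message = json.dumps(body)
print(message)
```

Note that you are sending a named task plus arguments, not a serialized Java function; the task implementation must already exist on the Python side.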

Airflow 1.10 - Scheduler Startup Fails

Posted by 旧街凉风 on 2019-12-20 02:19:07
Question: I have just (painfully) installed Airflow 1.10, thanks to my previous post here. We have a single EC2 instance running, our queue is AWS ElastiCache Redis, and our metadata database is AWS RDS for PostgreSQL. Airflow works fine with this setup on version 1.9, but on version 1.10 we hit an issue when starting the scheduler: [2018-08-15 16:29:14,015] {jobs.py:385} INFO - Started process (PID=15778) to work on /home/ec2-user/airflow/dags/myDag.py

Initializing Different Celery Workers with Different Values

Posted by 霸气de小男生 on 2019-12-19 17:40:50
Question: I am using Celery to run long-running tasks on Hadoop. Each task executes a Pig script on Hadoop, which runs for about 30 minutes to 2 hours. My current Hadoop setup has four queues: a, b, c, and default. All tasks are currently executed by a single worker, which submits each job to a single queue. I want to add three more workers that submit jobs to the other queues, one worker per queue. The problem is that the queue is currently hard-coded, and I want to make it a per-worker variable. I searched a lot but I…
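One common way to make the queue a per-worker variable is to pass it in through the environment when each worker process is started (for example HADOOP_QUEUE=a before launching the worker for queue a), so the task code itself stays generic. A minimal sketch, with the variable name and Pig property as assumptions:

```python
import os

# Read the queue once at worker startup; each worker is launched with its own
# HADOOP_QUEUE value, so nothing is hard-coded in the task.
HADOOP_QUEUE = os.environ.get("HADOOP_QUEUE", "default")

def build_pig_command(script: str) -> str:
    # The Pig job targets the worker's Hadoop queue via mapreduce.job.queuename.
    return f"pig -Dmapreduce.job.queuename={HADOOP_QUEUE} -f {script}"

print(build_pig_command("etl.pig"))
```

On the Celery side, each worker would also consume from its own Celery queue (the -Q option), so that tasks routed to queue a are only picked up by the worker configured for Hadoop queue a.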

Celery Explained (1)

Posted by 妖精的绣舞 on 2019-12-19 12:51:20
Before learning Celery, I first took a quick look at what the producer-consumer pattern is.

The producer-consumer pattern

In real-world software development you often run into this scenario: one module produces data, and another module processes it ("module" here in the broad sense: it can be a class, a function, a thread, a process, and so on). The module that produces the data is naturally called the producer; the module that processes it is called the consumer.

Abstracting out a producer and a consumer is not yet enough to make this the producer-consumer pattern. The pattern also requires a buffer that sits between the producer and consumer as an intermediary: the producer puts data into the buffer, and the consumer takes data out of it, as shown in the figure below:

The producer-consumer pattern uses a container to break the tight coupling between producer and consumer. The two do not communicate directly; they communicate through a message queue (the buffer). After producing data, the producer does not wait for the consumer to process it but simply drops it onto the message queue, and the consumer does not ask the producer for data but takes it straight from the queue. The message queue acts as a buffer that balances the processing capacity of the producer and consumer, and it is what decouples the two. -------------> Which raises another question: what does "decoupling" mean?

Decoupling: suppose the producer and consumer are two classes. If the producer calls a method of the consumer directly, the producer depends on (is coupled to) the consumer, and future changes to the consumer's code may affect the producer. If instead both depend on some buffer rather than directly on each other, the coupling is correspondingly reduced. Having the producer call the consumer directly has another drawback as well…
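The pattern described above can be sketched with the standard library alone; here queue.Queue plays the role of the buffer, and the producer and consumer threads never call each other directly (the sentinel value -1 is an arbitrary choice for this sketch):

```python
import queue
import threading

# The bounded buffer that decouples producer from consumer.
buf: "queue.Queue[int]" = queue.Queue(maxsize=5)

def producer() -> None:
    for i in range(10):
        buf.put(i)       # blocks when the buffer is full
    buf.put(-1)          # sentinel: signals "no more data"

def consumer(results: list) -> None:
    while True:
        item = buf.get()  # blocks when the buffer is empty
        if item == -1:
            break
        results.append(item * 2)   # "processing" the data

results: list = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)
```

In Celery's terms, the code that calls task.delay() is the producer, the broker (RabbitMQ, Redis) is the buffer, and the worker is the consumer.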

Celery: revoke a task before it executes, using the Django database

Posted by 霸气de小男生 on 2019-12-19 11:35:26
Question: I am using the Django database instead of RabbitMQ for concurrency reasons, but I cannot solve the problem of revoking a task before it executes. I found some answers on this topic, but they do not seem complete, or I cannot get enough help from them: first answer, second answer. How can I extend the Celery task table with a model, adding a boolean field (revoked) that I set when I do not want the task to execute? Thanks. Answer 1: Since Celery tracks tasks by an ID, all you really need is to be able to tell which IDs have…
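The idea the answer starts to describe can be sketched without any Celery machinery: keep the revoked task IDs in a table, and have every task check that table before doing real work. Here a plain set stands in for a hypothetical Django model with a revoked boolean field:

```python
# Stand-in for a DB table, e.g. RevokedTask.objects.filter(revoked=True)
revoked_ids: set = set()

def revoke(task_id: str) -> None:
    # In Django this would set revoked=True on the row for task_id.
    revoked_ids.add(task_id)

def run_task(task_id: str) -> str:
    # First thing every task does: bail out if it has been revoked.
    if task_id in revoked_ids:
        return "skipped"
    return "done"          # the expensive work would happen here

revoke("42")
print(run_task("42"))   # skipped
print(run_task("43"))   # done
```

With the Django database backend this check must run inside the task itself, since there is no broker-level revoke as there is with RabbitMQ.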

Installing Airflow

Posted by 穿精又带淫゛_ on 2019-12-19 09:56:15
1. Environment preparation
1.1 Installation environment
1.2 Create a user
2. Install Airflow
2.1 Install Python
2.2 Install pip
2.3 Install the database
2.4 Install Airflow
2.4.1 Install the main module
2.4.2 Install the database and password modules
2.5 Configure Airflow
2.5.1 Set environment variables
2.5.2 Edit the configuration file
3. Start Airflow
3.1 Initialize the database
3.2 Create a user
3.3 Start Airflow
4. Run a task
5. Install Celery
5.1 Install the Celery module
5.2 Install a Celery broker
5.2.1 Using RabbitMQ as the broker
5.2.2 Using Redis as the broker
5.3 Edit the Airflow config file to enable Celery
5.4 Test Celery
5.5 Deploy multiple workers
6. Problems

Official documentation: http://airflow.incubator.apache.org/project.html

1. Environment preparation
1.1 Installation environment
centos 6.7 (docker)
python 2.7.13
docker run --name airflow -h airflow -dti --net hadoopnet --ip=172.18.0.20 -p 10131:22 -v /dfs/centos/airflow/home:/home -v /dfs…
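Step 5.3 above (enabling Celery in the Airflow config file) typically amounts to a fragment like the following in airflow.cfg. The hosts and credentials are placeholders, and the exact key names vary between Airflow versions, so check the airflow.cfg shipped with your release:

```ini
[core]
executor = CeleryExecutor

[celery]
# Placeholder broker and result backend; substitute your Redis/RabbitMQ
# host and metadata database credentials.
broker_url = redis://localhost:6379/0
result_backend = db+postgresql://airflow:airflow@localhost/airflow
```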