celery

Celery: stuck in infinitely repeating timeouts (Timed out waiting for UP message)

Posted by Anonymous (unverified) on 2019-12-03 07:50:05
Question: I defined some tasks with a time limit of 1200:

    @celery.task(time_limit=1200)
    def create_ne_list(text):
        c = Client()
        return c.create_ne_list(text)

I'm also using the worker_process_init signal to do some initialization each time a new process starts:

    @worker_process_init.connect
    def init(sender=None, conf=None, **kwargs):
        init_system(celery.conf)
        init_pdf(celery.conf)

This initialization function takes several seconds to execute. Besides that, I'm using the following configuration:

    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_TASK_SERIALIZER =
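The error in the title typically means the slow `worker_process_init` handler exceeds the prefork pool's startup deadline: the parent process waits only a few seconds for each child's "UP" message before killing and respawning it, which produces the endless timeout loop. A commonly suggested workaround for Celery versions of this era (a sketch, assuming the prefork pool; the chosen value is illustrative) is to raise that deadline before the worker starts:

```python
# Sketch: raise the pool's startup deadline so slow per-process init can finish.
# PROC_ALIVE_TIMEOUT lives in celery.concurrency.asynpool (default is a few seconds).
from celery.concurrency import asynpool

asynpool.PROC_ALIVE_TIMEOUT = 60.0  # seconds; pick a value larger than your init time
```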

Getting Keras (with Theano) to work with Celery

Posted by 老子叫甜甜 on 2019-12-03 07:47:11
I have some Keras code which works synchronously to predict a given input. I have even amended it so it can work with standard multi-threading (using locks in a separate class), but when running via asynchronous Celery (even with one worker and one task) I get an error when calling predict on the Keras model.

    @app.task
    def predict_task(param):
        """Run task."""
        json_file = open('keras_model.json', 'r')
        loaded_model_json = json_file.read()
        json_file.close()
        model = model_from_json(loaded_model_json)
        model.load_weights('keras_weights.h5')
        tokenizer_file = open('tokenizer.pickle',
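The usual advice for this situation is to load the model once per worker process rather than inside every task. Independent of Keras, the lazy per-process singleton pattern looks like this (a plain-Python sketch; `load_model` is a stand-in for the `model_from_json`/`load_weights` steps above, with a counter standing in for the expensive load):

```python
# Per-process lazy singleton: load the heavy model once, reuse it in every task.
# "load_model" is a stand-in for the Keras model_from_json/load_weights dance.
LOAD_COUNT = 0
_model = None

def load_model():
    global LOAD_COUNT
    LOAD_COUNT += 1                      # the expensive work would happen here
    return {"name": "demo-model"}

def get_model():
    global _model
    if _model is None:                   # first call in this worker process
        _model = load_model()
    return _model

def predict_task(param):
    model = get_model()                  # every later task reuses the instance
    return (model["name"], param)

predict_task("a")
predict_task("b")
print(LOAD_COUNT)  # -> 1: the model was loaded only once
```

With a prefork pool, the natural place to trigger `get_model()` eagerly is a `worker_process_init` handler, so the load cost is paid at startup rather than on the first task.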

Django Asynchronous Processing

Posted by 时间秒杀一切 on 2019-12-03 07:44:17
Question: I have a bunch of Django requests which execute mathematical computations (written in C and run via a Cython module) that may take an indeterminate amount of time (on the order of 1 second). The requests don't need to access the database and are all independent of each other and of Django. Right now everything is synchronous (using Gunicorn with sync worker types), but I'd like to make this asynchronous and nonblocking. In short, I'd like to do something like: Receive

Using Celery with existing RabbitMQ messages

Posted by 你离开我真会死。 on 2019-12-03 07:42:19
Question: I have an existing RabbitMQ deployment that a few Java applications are using to send out log messages as string JSON objects on various channels. I would like to use Celery to consume these messages and write them to various places (e.g. DB, Hadoop, etc.). I can see that Celery is designed to be both the producer and consumer of RabbitMQ messages, since it tries to hide the mechanism by which those messages are delivered. Is there any way to get Celery to consume messages created by
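Whatever ends up consuming the queue (Celery via a custom consumer, or plain kombu), the per-message work is deserializing the JSON string the Java apps send and routing it by channel to a sink. That routing core is broker-agnostic and can be sketched in plain Python (channel names and sink functions here are made up for illustration):

```python
import json

# Route a raw JSON log message to a per-channel sink (DB, Hadoop, ...).
# The channel names and sink functions are illustrative only.
def write_db(record):
    return ("db", record["msg"])

def write_hadoop(record):
    return ("hadoop", record["msg"])

SINKS = {"app.errors": write_db, "app.metrics": write_hadoop}

def handle_message(channel, body):
    record = json.loads(body)            # the Java side sends string JSON objects
    sink = SINKS.get(channel, write_db)  # default sink for unknown channels
    return sink(record)

print(handle_message("app.errors", '{"msg": "disk full"}'))  # -> ('db', 'disk full')
```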

django/celery: Best practices to run tasks on 150k Django objects?

Posted by 亡梦爱人 on 2019-12-03 07:36:50
Question: I have to run tasks on approximately 150k Django objects. What is the best way to do this? I am using the Django ORM as the broker. The database backend is MySQL, and it chokes and dies during the task.delay() of all the tasks. Relatedly, I also wanted to kick this off from the submission of a form, but the resulting request produced a very long response time that timed out.

Answer 1: I would also consider using something other than the database as the "broker". It really isn't suitable for
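Besides switching brokers, a common companion fix is to enqueue in batches rather than calling `task.delay()` 150k times: pass slices of primary keys so each task handles many objects. The chunking itself is plain Python (a sketch; `process_batch` would be a hypothetical Celery task taking a list of ids):

```python
# Split a large id list into fixed-size batches; each batch becomes ONE task
# (e.g. process_batch.delay(chunk) in Celery) instead of 150k tiny tasks.
def chunked(ids, size):
    return [ids[i:i + size] for i in range(0, len(ids), size)]

ids = list(range(10))          # stand-in for ~150k primary keys
batches = chunked(ids, 4)
print(batches)  # -> [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Kicking the enqueue loop itself into a single background task also fixes the form-submission timeout: the request only schedules one task and returns immediately.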

Differentiate celery, kombu, PyAMQP and RabbitMQ/ironMQ

Posted by 久未见 on 2019-12-03 07:28:37
I want to upload images to an S3 server, but before uploading I want to generate thumbnails of 3 different sizes, and I want this done outside the request/response cycle, hence I am using Celery. I have read the docs, and here is what I have understood. Please correct me if I am wrong. Celery helps you manage your task queues outside the request/response cycle. Then there is something called carrot/kombu; it's a Django middleware that packages tasks created via Celery. Then the third layer, PyAMQP, facilitates the communication of carrot to a broker, e.g. RabbitMQ, AmazonSQS, ironMQ, etc.
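One correction to the layering described above: kombu is Celery's messaging library, not Django middleware (carrot was its now-obsolete predecessor), and py-amqp is one of the transports kombu can use to reach an AMQP broker. In practice the whole stack is selected by a single broker URL; a hedged sketch (host and credentials are illustrative):

```python
# Sketch: the carrot/kombu/PyAMQP stack is picked by the broker URL scheme.
# "amqp://" -> kombu's AMQP transport (py-amqp) -> RabbitMQ; other schemes
# (redis://, sqs://, ...) select other transports. Host/credentials made up.
from celery import Celery

app = Celery('thumbnails', broker='amqp://guest:guest@localhost:5672//')
```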

Why is RabbitMQ not persisting messages on a durable queue?

Posted by 假如想象 on 2019-12-03 07:28:26
Question: I am using RabbitMQ with Django through Celery, with the most basic setup:

    # RabbitMQ connection settings
    BROKER_HOST = 'localhost'
    BROKER_PORT = '5672'
    BROKER_USER = 'guest'
    BROKER_PASSWORD = 'guest'
    BROKER_VHOST = '/'

I imported a Celery task and queued it to run one year later. From the IPython shell:

    In [1]: from apps.test_app.tasks import add
    In [2]: dt = datetime.datetime(2012, 2, 18, 10, 00)
    In [3]: add.apply_async((10, 6), eta=dt)
    DEBUG:amqplib:Start from server, version: 8.0,
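Worth noting for this question: queue durability and message persistence are separate flags in AMQP. The queue must be declared durable and each message published with a persistent delivery mode, or RabbitMQ drops the messages on restart even though the queue itself survives. In the old-style Celery settings used above, this looked roughly like the following sketch:

```python
# Sketch (old-style Celery settings): durable queue + persistent messages.
CELERY_DEFAULT_QUEUE = 'celery'
CELERY_QUEUES = {
    'celery': {'durable': True},             # queue survives a broker restart...
}
CELERY_DEFAULT_DELIVERY_MODE = 'persistent'  # ...and so do the queued messages
```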

Django - Executing a task through celery from a model

Posted by 好久不见. on 2019-12-03 07:26:10
In my models.py:

    from django.db import models
    from core import tasks

    class Image(models.Model):
        image = models.ImageField(upload_to='images/orig')
        thumbnail = models.ImageField(upload_to='images/thumbnails', editable=False)

        def save(self, *args, **kwargs):
            super(Image, self).save(*args, **kwargs)
            tasks.create_thumbnail.delay(self.id)

In my tasks.py:

    from celery.decorators import task
    from core.models import Image

    @task()
    def create_thumbnail(image_id):
        ImageObj = Image.objects.get(id=image_id)
        # other stuff here

This is returning the following:

    Exception Type: ImportError
    Exception Value:
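An ImportError in this shape is typically a circular import: models.py imports tasks at the top, and tasks.py imports models straight back. The usual fix is to defer one of the imports into the function body so it resolves at call time. A self-contained stand-in demonstrating the fix (module and function names are made up, not Django's):

```python
import os
import sys
import tempfile
import textwrap

# Reproduce the models <-> tasks cycle in a scratch directory, with the fix
# already applied: the task imports its models module lazily, inside the body.
pkg = tempfile.mkdtemp()
with open(os.path.join(pkg, "demo_models.py"), "w") as f:
    f.write(textwrap.dedent("""
        import demo_tasks                    # top-level import of the task module
        def save(image_id):
            return demo_tasks.create_thumbnail(image_id)
    """))
with open(os.path.join(pkg, "demo_tasks.py"), "w") as f:
    f.write(textwrap.dedent("""
        def create_thumbnail(image_id):
            import demo_models               # deferred import breaks the cycle
            return "thumbnail-for-%d" % image_id
    """))

sys.path.insert(0, pkg)
import demo_models                           # succeeds despite the cycle
print(demo_models.save(7))                   # -> thumbnail-for-7
```

With a top-level `import demo_models` in demo_tasks.py instead, the first import would hit a partially initialized module; deferring it means it runs only once both modules are fully loaded.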

The Celery asynchronous framework

Posted by 不问归期 on 2019-12-03 07:23:23
1. What is Celery

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages: an asynchronous task queue focused on real-time processing that also supports task scheduling.

2. Celery architecture

Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store.

2.1 Message broker

Celery does not provide a message service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

2.2 Task execution unit

The worker is the unit of task execution that Celery provides; workers run concurrently across the nodes of a distributed system.

2.3 Task result store

The task result store holds the results of tasks executed by the workers. Celery supports several result backends, including AMQP, Redis, and others.

2.4 Supported versions

Celery 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7; from the next version (Celery 5.x) on, Python 3.5 or newer is required. If you're running an
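The three components above come together in a minimal app definition. A sketch, assuming a local Redis serves as both broker and result store (the URLs and database numbers are illustrative):

```python
# Sketch: message broker and result store wired into one Celery app.
from celery import Celery

app = Celery(
    'demo',
    broker='redis://localhost:6379/1',    # message broker (RabbitMQ/Redis/...)
    backend='redis://localhost:6379/2',   # task result store
)

@app.task
def add(x, y):
    return x + y
```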

A brief introduction to Celery and basic usage

Posted by 我们两清 on 2019-12-03 07:22:17
The Celery framework

1. Celery is a simple, flexible distributed system whose main job is processing asynchronous tasks.
2. The Celery framework has its own socket handling, so it runs as a standalone service.
3. Starting the Celery service is what executes the tasks in it: the service holds a task-executing object that runs tasks as they become ready and saves the result of each execution.
4. The Celery framework consists of three parts: a broker that holds the tasks to be executed, a worker object that executes them, and a backend that stores the task results.
5. Installing the Celery package provides only the worker by default; the broker and backend (the two storage components) must be supplied by other technologies.

Celery architecture

Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store (backend).

Message broker: Celery does not provide a message service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: The worker is the unit of task execution that Celery provides; workers run concurrently across the nodes of a distributed system.

Task result store: The task result store holds the results of tasks executed by the workers; Celery supports several result backends, including AMQP, Redis, and others.

Use cases

Asynchronous tasks
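The broker/worker/backend split described above can be modeled in a few lines of plain Python (a toy stand-in for teaching purposes, not Celery itself): the broker is a queue of pending tasks, the worker pulls and executes them, and the backend keeps the results.

```python
import queue

# Toy model of the three Celery components: broker holds ready tasks,
# a worker executes them, and the backend maps task id -> result.
broker = queue.Queue()   # message broker: pending tasks
backend = {}             # task result store

def add(x, y):
    return x + y

def delay(task_id, func, *args):
    broker.put((task_id, func, args))    # producer enqueues work and returns

def worker_run_once():
    task_id, func, args = broker.get()   # worker pulls one ready task...
    backend[task_id] = func(*args)       # ...executes it and stores the result

delay("t1", add, 2, 3)
worker_run_once()
print(backend["t1"])  # -> 5
```

In real Celery the producer and worker are separate processes and the queue and result dict are external services (RabbitMQ/Redis), which is exactly why the broker and backend must be installed separately from the Celery package.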