celery

How to execute two tasks at the same time

Submitted by Deadly on 2019-12-13 03:29:22
Question: I'm working with Celery and I've encountered a problem. I have two functions:

1) This function is activated when the program starts, and it runs forever:

from celery.signals import worker_ready

@worker_ready.connect()
def message_poll_start(sender=None, headers=None, body=None, **kwargs):
    while True:
        time.sleep(2)
        print("hello")

2) This function is activated every ten seconds and writes a date to a txt file:

@periodic_task(run_every=timedelta(seconds=10))
def last_record
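
A likely cause of the two functions not running together: the while True loop executes inside the worker_ready signal handler, in the worker's main process, and blocks the worker before it ever consumes the periodic task. A minimal sketch of one workaround under that assumption: turn the loop into an ordinary task and just queue it from the signal handler (the app instance and broker URL below are placeholders):

import time

from celery import Celery
from celery.signals import worker_ready

app = Celery('tasks', broker='amqp://')  # broker URL is an assumption

@app.task
def message_poll():
    # Runs in a pool process, so it no longer blocks the worker itself.
    while True:
        time.sleep(2)
        print("hello")

@worker_ready.connect
def message_poll_start(sender=None, **kwargs):
    # Queue the polling loop as a normal task instead of looping here;
    # the 10-second periodic task can then run on another pool process.
    message_poll.delay()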

Celery worker crashing on Heroku

Submitted by 放肆的年华 on 2019-12-13 03:25:20
Question: I am working on a Django project that I have pushed to Heroku; for background tasks I use Celery. Celery works fine locally, but on the Heroku server the Celery worker keeps crashing. I have set CLOUDAMQP_URL properly in settings.py and configured the worker in the Procfile, but the worker still crashes.

Procfile:

web: gunicorn my_django_app.wsgi --log-file -
worker: python manage.py celery worker --loglevel=info

settings.py:

... # Celery
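
One frequent cause on Heroku, assuming the CloudAMQP add-on, is the broker URL not being read from the environment, combined with CloudAMQP's low connection limits. A settings sketch (setting names follow older, pre-4.0 Celery, matching the django-celery style implied by manage.py celery worker):

# settings.py (sketch)
import os

# Heroku's CloudAMQP add-on exposes the broker URL as an env var.
BROKER_URL = os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost//')

# Free CloudAMQP plans allow very few connections; keep the pool small.
BROKER_POOL_LIMIT = 1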

Django Celery - missing something, but I have no idea what? I have results but can't retrieve them

Submitted by 一世执手 on 2019-12-13 03:18:56
Question: My task goes into Celery and produces results. I know this because I can do the following:

>>> ts = TaskState.objects.all()[0]
>>> ts
Out[31]: <TaskState: SUCCESS apps.checklist.tasks.bulk_checklist_process(ec01461b-3431-478d-adfc-6d6cf162e9ad) ts:2012-07-20 14:35:41>
>>> ts.state
Out[32]: u'SUCCESS'
>>> ts.result
Out[33]: u'{\'info\': ["Great",]}'

But when I attempt to use the documented way to get the result, all hell breaks loose:

>>> from celery.result import BaseAsyncResult
>>> result =
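
For comparison, a sketch of the usual way to fetch a result by task id; BaseAsyncResult is an old alias, and AsyncResult needs the configured result backend to be reachable (the task id below is the one printed above):

from celery.result import AsyncResult

result = AsyncResult('ec01461b-3431-478d-adfc-6d6cf162e9ad')
if result.ready():           # True once the task has finished
    print(result.state)      # e.g. 'SUCCESS'
    print(result.result)     # whatever the task returned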

Syncing a database with Elasticsearch using Logstash

Submitted by 谁说胖子不能爱 on 2019-12-13 03:02:54
Question: I'm trying to implement something like https://www.elastic.co/blog/how-to-keep-elasticsearch-synchronized-with-a-relational-database-using-logstash , which uses the Logstash jdbc input and elasticsearch output. I can make it work with simple queries, but it gets harder to properly prepare (serialize) the data for output when there are multiple joins and so on. (When you have data from multiple tables and need to format it properly for ES, it is hard to do in a SQL query.) I'm wondering if I could
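
One alternative once the serialization outgrows SQL is to shape the documents in application code and index them directly, bypassing Logstash. A rough Python sketch with the official elasticsearch client (index name, id field, and row shape are all assumptions):

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch(['http://localhost:9200'])

def to_action(row):
    # Flatten a joined row (a dict built by your own query/ORM code)
    # into the document shape Elasticsearch should store.
    return {'_index': 'my_index', '_id': row['id'], '_source': row}

def sync(rows):
    bulk(es, (to_action(r) for r in rows))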

celeryd with RabbitMQ hangs on “mingle: searching for neighbors”, but plain celery works

Submitted by 痞子三分冷 on 2019-12-13 02:37:04
Question: I'm banging my head against the wall with celeryd and RabbitMQ. This example from the tutorial works just fine:

from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://')

@app.task
def add(x, y):
    return x + y

I run:

celery -A tasks worker --loglevel=info

And I get the output:

[2014-11-18 19:47:58,874: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2014-11-18 19:47:58,881: INFO/MainProcess] mingle: searching for neighbors
[2014-11-18 19:47:59,889: INFO
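
A common diagnostic step, given that the plain tutorial app works, is to start the worker with mingle and gossip disabled; if it then runs normally, the hang is in worker-to-worker discovery over the broker rather than in task handling. These flags exist in Celery 3.1+:

celery -A tasks worker --without-mingle --without-gossip --loglevel=info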

"Received unregistered task" error in Celery

Submitted by 梦想的初衷 on 2019-12-13 01:13:54
Question: I get an "unregistered task" error when I run a worker to take jobs from a queue. This is what I do:

celery -A Tasks beat

The above command schedules a job at a specific time. After that, the task is added to the default queue. Then I run a Celery worker in another terminal:

celery worker -Q default

But I get the following error:

[2014-08-19 19:34:02,466: ERROR/MainProcess] Received unregistered task of type 'TasksReg.vodafone_v2'. The message has been ignored and discarded. Did
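
Note that beat is started with -A Tasks but the worker is not, so the worker never loads the app that registers TasksReg.vodafone_v2. A sketch of pinning the registration down (module names are taken from the error message; the broker URL is a placeholder):

# Tasks.py (sketch)
from celery import Celery

app = Celery('Tasks', broker='amqp://')
app.conf.CELERY_IMPORTS = ('TasksReg',)  # force the task module to be imported

Then start the worker against the same app, e.g. celery -A Tasks worker -Q default --loglevel=info.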

Django REST Framework serializer update method: instance is not saved immediately

Submitted by 牧云@^-^@ on 2019-12-12 21:32:50
Question: The instance to be updated has instance.email = abc@mail.com; the email is to be updated (changed) to xyz@mail.com in UserUpdateSerializer's update method:

def update(self, instance, validated_data):
    email_updated = False
    email = self.validated_data["email"]
    print(instance.email)  # abc@email.com
    if email != instance.email:
        if User.objects.filter(email=email).exists():
            raise serializers.ValidationError("email is not available")
        else:
            email_updated = True
    instance.__dict__.update(**validated_data)
    instance
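
The snippet cuts off before any save, but two pitfalls are already visible: instance.__dict__.update(...) bypasses Django's field descriptors, and nothing is persisted until instance.save() runs. A conventional sketch of the same method (User and serializers are assumed imported as in the question):

def update(self, instance, validated_data):
    email = validated_data.get('email', instance.email)
    if email != instance.email and User.objects.filter(email=email).exists():
        raise serializers.ValidationError("email is not available")
    for attr, value in validated_data.items():
        setattr(instance, attr, value)  # setattr keeps field descriptors working
    instance.save()                     # nothing hits the database until save()
    return instance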

Server Push with SocketIO from Celery Task

Submitted by 假如想象 on 2019-12-12 18:24:51
Question: I have a Flask application within which I have many long-running asynchronous tasks (~hours). It's important that the state of these tasks is communicated to the client. I use Celery to manage the background task queue, and I'm currently trying to broadcast updates to the client from each background thread via SocketIO. Is this possible? Is there a better-suited strategy for achieving what I'd like?

Answer 1: You did not say, but I assume you plan on using Flask-SocketIO to handle the server
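
For context, Flask-SocketIO supports emitting from an external process such as a Celery worker through a shared message queue. A minimal sketch, assuming Redis at its default URL and an existing Celery app instance named celery:

# In the Celery worker process (not the Flask server):
from flask_socketio import SocketIO

# Connect only to the message queue; no Flask app is needed here.
socketio = SocketIO(message_queue='redis://localhost:6379/0')

@celery.task            # `celery` is your existing Celery app (assumption)
def long_task():
    # ... do a slice of the hours-long work ...
    socketio.emit('task_update', {'progress': 42})

The Flask server creates its own SocketIO instance with the same message_queue URL, so events emitted by the worker reach connected clients.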

Celery - importing models in tasks.py

Submitted by a 夏天 on 2019-12-12 16:13:50
Question: I'm having trouble getting access to models in my tasks.py. My goal is to send an email at various points in the application (user registration, password reset, etc.). To do this I pass the user id(s) to a Celery task called 'send_email':

@shared_task()
def send_email(sender_id=None, receiver_id=None, type=None, message=None):
    sender = User.objects.get(id=sender_id)
    receiver = User.objects.get(id=receiver_id)
    logger.info("Starting send email")
    Email.send_email(sender, receiver, type, message)
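
A common cause is an import-order problem: tasks.py is imported before Django's app registry is ready, so module-level model imports fail. One workaround sketch is to resolve the model lazily inside the task (logger and Email are assumed to exist as in the question):

from celery import shared_task

@shared_task()
def send_email(sender_id=None, receiver_id=None, type=None, message=None):
    # Import here so the app registry is fully loaded when the task runs.
    from django.contrib.auth import get_user_model
    User = get_user_model()
    sender = User.objects.get(id=sender_id)
    receiver = User.objects.get(id=receiver_id)
    logger.info("Starting send email")
    Email.send_email(sender, receiver, type, message)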

Celery timers (periodic tasks)

Submitted by 佐手、 on 2019-12-12 15:33:34
Celery

1. What is Celery?

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages: an asynchronous task queue focused on real-time processing that also supports task scheduling.

Celery architecture

Celery's architecture consists of three parts: the message broker, the task execution units (workers), and the task result store.

Message broker: Celery does not provide a message service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store holds the results of the tasks executed by the workers. Celery supports storing task results in different backends, including AMQP, Redis, and others.

Supported versions

Celery version 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7; from the next version (Celery 5.x), Python 3.5 or newer will be required. If you're running an older version of
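
A minimal sketch tying the three parts together: Redis as both broker and result store, one task, and a beat entry that runs it every 10 seconds (module/app names and URLs are placeholders; beat_schedule is the Celery 4.x setting name):

# proj.py (sketch)
from celery import Celery

app = Celery('proj',
             broker='redis://127.0.0.1:6379/1',   # message broker
             backend='redis://127.0.0.1:6379/2')  # task result store

@app.task
def add(x, y):
    return x + y

# celery beat reads this schedule and queues add(4, 4) every 10 seconds;
# a worker started with `celery -A proj worker` then executes it.
app.conf.beat_schedule = {
    'add-every-10s': {
        'task': 'proj.add',
        'schedule': 10.0,
        'args': (4, 4),
    },
}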