celery

AsyncResult(task_id) returns “PENDING” state even after the task started

Submitted by 十年热恋 on 2019-12-05 06:14:08
In this project I poll task.state of a long-running task to update its running status. It worked in development, but it stopped working after I moved the project to the production server: I keep getting 'PENDING' even though I can see in Flower that the task has started. I do still get the result once the task finishes, i.e. when task.state == 'SUCCESS'. Production runs Python 2.6, Django 1.6 and Celery 3.1 with the AMQP result backend.

    @csrf_exempt
    def poll_state(request):
        data = 'Fail'
        if request.is_ajax():
            if 'task_id' in request.POST.keys() and request.POST['task_id']:
                task_id =
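
A likely explanation (not confirmed by the excerpt): Celery reports PENDING for any task id its result backend does not know about, the STARTED state is only recorded when CELERY_TRACK_STARTED is enabled, and the AMQP backend delivers each result only once, so repeated polling can come up empty. A minimal sketch of the setting plus a polling view; the JSON response shape is my assumption:

    # settings.py -- assumption: this was missing in production
    CELERY_TRACK_STARTED = True  # record the STARTED state; off by default

    # views.py -- polling view sketch
    import json
    from django.http import HttpResponse
    from django.views.decorators.csrf import csrf_exempt
    from celery.result import AsyncResult

    @csrf_exempt
    def poll_state(request):
        data = {'state': 'Fail'}
        if request.is_ajax() and request.POST.get('task_id'):
            result = AsyncResult(request.POST['task_id'])
            data = {'state': result.state}  # PENDING until the backend has seen this id
        return HttpResponse(json.dumps(data), content_type='application/json')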

celery + django - how to write task state to database

Submitted by ℡╲_俬逩灬. on 2019-12-05 06:04:03
Question: I'm running Celery with Django and RabbitMQ and want to see the task states in the database table. Unfortunately no entries are written into the table djcelery_taskstate and I can't figure out why. My settings:

    CELERY_ENABLE_UTC = True
    BROKER_URL = "amqp://guest:guest@localhost:5672/"
    CELERY_RESULT_BACKEND = "database"
    CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
    CELERY_TRACK_STARTED = True
    CELERY_SEND_EVENTS = True
    CELERY_IMPORTS = ("project_management.tasks", "accounting
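
Worth noting (background, not from the excerpt): in django-celery the djcelery_taskstate table is populated by the events snapshot camera, not by the result backend, so besides CELERY_SEND_EVENTS a camera process has to be running. A sketch of the two processes, assuming django-celery's manage.py integration:

    # worker that emits events (-E has the same effect as CELERY_SEND_EVENTS)
    python manage.py celery worker -E -l info

    # snapshot camera that writes the events into djcelery_taskstate
    python manage.py celerycam --frequency=2.0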

can't import django model into celery task

Submitted by 眉间皱痕 on 2019-12-05 06:00:55
I have the following task:

    from __future__ import absolute_import
    from myproject.celery import app
    from myapp.models import Entity

    @app.task
    def add(entity_id):
        entity = Entity.objects.get(pk=entity_id)
        return entity.name

I get the following error:

    django.core.exceptions.ImproperlyConfigured: Requested setting DEFAULT_INDEX_TABLESPACE, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.

If I take out the Entity import, everything is fine and no error occurs. When I add back: from myapp
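
The usual fix, per Celery's documented Django integration (a sketch assuming the standard myproject/celery.py layout): set DJANGO_SETTINGS_MODULE before any task module imports Django models.

    # myproject/celery.py -- Celery 3.1-style Django setup
    from __future__ import absolute_import
    import os
    from celery import Celery

    # must run before anything imports Django models
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

    from django.conf import settings

    app = Celery('myproject')
    app.config_from_object('django.conf:settings')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)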

DRF: listing the URLs generated by a DefaultRouter

Submitted by 穿精又带淫゛_ on 2019-12-05 04:50:46
Command:

    python manage.py show_urls

urls.py:

    from user.router import core_router

    urlpatterns = [
        path('user/login/', views.LoginView.as_view(), name='login'),
        path('user/logout/', views.LogoutView.as_view(), name='logout'),
        path('user/register/', views.UserRegisterView.as_view(), name='register'),
        path('user/forget/', views.UserForgetView.as_view(), name='forget'),
        # path('code/image/', views.LoginView.as_view()),
    ]
    urlpatterns += core_router.urls

router.py:

    from rest_framework.routers import DefaultRouter
    from user import views

    core_router = DefaultRouter()
    core_router.register('user', views
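
One note (background, not in the original): show_urls comes from the django-extensions package, not from Django or DRF. The router's generated patterns can also be printed directly; a minimal sketch, assuming Django settings are already configured and SomeViewSet is a hypothetical viewset:

    # list the URL patterns a DefaultRouter generates
    from rest_framework.routers import DefaultRouter

    router = DefaultRouter()
    # router.register('user', SomeViewSet)  # hypothetical registration

    for pattern in router.urls:  # Django URLPattern objects, including api-root
        print(pattern)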

Shared XMPP connection between Celery workers

Submitted by 旧巷老猫 on 2019-12-05 04:22:47
My web app needs to be able to send XMPP messages (Facebook Chat), and I thought Celery might be a good solution for this. A task would consist of querying the database and sending the XMPP message to a number of users. However, with that approach I would have to connect to the XMPP server every time I run a task, which is not a great idea. From the Facebook Chat API docs:

    Best Practices: Your Facebook Chat integration should only be used for sessions that are expected to be long-lived. Clients should not rapidly churn on and off.

Is there a way to share an XMPP connection between workers so I
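
A pattern often suggested for this (a sketch, not from the thread itself): connections cannot be shared across processes, but each prefork worker process can open one long-lived connection at startup via the worker_process_init signal and reuse it for every task it runs. xmpp_connect() and send_message() below are hypothetical:

    # tasks.py -- one long-lived XMPP connection per worker process
    from celery import Celery
    from celery.signals import worker_process_init

    app = Celery('tasks', broker='amqp://')

    xmpp_client = None  # reused by every task in this worker process

    def xmpp_connect():
        """Hypothetical helper: log in once and return a client object."""
        raise NotImplementedError

    @worker_process_init.connect
    def init_xmpp(**kwargs):
        global xmpp_client
        xmpp_client = xmpp_connect()

    @app.task
    def send_chat(user_id, text):
        xmpp_client.send_message(user_id, text)  # hypothetical client API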

How to execute tasks in Celery using datetime from MySQL?

Submitted by 最后都变了- on 2019-12-05 03:15:14
Question: News items are stored in a MySQL database together with their datetime of publication, and at any time a user can add a row whose publication date lies in the future. How can I use Celery to watch the database table and check whether it is time to publish a row? A separate Python script is responsible for the actual publishing, so Celery should invoke that script for each row in the MySQL table according to its datetime. How do I do that with Celery? Another way I can think of is to create a queue with the date and id of the publication, then directly add data from
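
A common answer to this kind of question (a hedged sketch, not the thread's accepted one): instead of polling the table, schedule the task at insert time with apply_async and an eta equal to the publication datetime. publish_row is a hypothetical task name:

    # sketch: schedule publication at the row's datetime instead of polling
    from celery import Celery

    app = Celery('tasks', broker='amqp://')

    @app.task
    def publish_row(row_id):
        # hypothetical: load the row and invoke the publishing script
        pass

    # at insert time, assuming row.publish_at is a timezone-aware datetime:
    # publish_row.apply_async(args=[row.id], eta=row.publish_at)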

Maximum clients reached on Heroku and Redistogo Nano

Submitted by 强颜欢笑 on 2019-12-05 03:03:47
I am using celerybeat on Heroku with the Redis To Go Nano add-on. There is one web dyno and one worker dyno, and the celerybeat worker is set to perform a task every minute. The problem: whenever I deploy a new commit, the dynos restart and I get this error:

    2014-02-27T13:19:31.552352+00:00 app[worker.1]: Traceback (most recent call last):
    2014-02-27T13:19:31.552352+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/celery/worker/consumer.py", line 389, in start
    2014-02-27T13:19:31.552352+00:00 app[worker.1]:     self.reset_connection()
    2014-02-27T13:19:31.552352+00:00 app[worker.1]:
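
For context (background, not in the excerpt): the Redis To Go Nano plan allows only 10 simultaneous connections, and each Celery process keeps a pool of broker connections, so overlapping old and new dynos during a deploy can exceed the cap. Shrinking the pools is the usual workaround; a settings sketch:

    # settings sketch for Celery 3.x with a tiny Redis connection quota
    BROKER_POOL_LIMIT = 1             # at most one broker connection per process
    CELERY_REDIS_MAX_CONNECTIONS = 2  # cap result-backend connections (if the backend is Redis)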

Get progress from async python celery chain by chain id

Submitted by 北城余情 on 2019-12-05 02:43:38
I'm trying to get the progress of a task chain by querying each task's status, but when I retrieve the chain by its id I get an object that behaves differently. In tasks.py:

    from celery import Celery

    celery = Celery('tasks')
    celery.config_from_object('celeryconfig')

    def unpack_chain(nodes):
        # walk the AsyncResult parent links, yielding last -> first
        while nodes.parent:
            yield nodes
            nodes = nodes.parent
        yield nodes

    @celery.task
    def add(num, num2):
        return num + num2

When querying from ipython:

    In [43]: from celery import chain
    In [44]: from tasks import celery, add, unpack_chain
    In [45]: c = chain(add.s(3,3), add.s(10).set(countdown=100))
    In [46]
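
A usage sketch (my addition, assuming the chain has been applied): walk the AsyncResult parent links to count how many tasks have finished.

    # count finished tasks in the chain, last -> first
    result = c.apply_async()            # AsyncResult of the final task
    tasks = list(unpack_chain(result))
    done = sum(1 for t in tasks if t.ready())
    print('%d/%d tasks finished' % (done, len(tasks)))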

delete Task / PeriodicTask in celery

Submitted by 心不动则不痛 on 2019-12-05 02:19:04
How can I delete a regular Task or PeriodicTask in celery?

rlotun: You revoke the task. See the documentation:

    Control.revoke(task_id, destination=None, terminate=False, signal='SIGTERM', **kwargs)

Tell all (or specific) workers to revoke a task by id. If a task is revoked, the workers will ignore the task and not execute it after all.

Parameters:
    task_id – Id of the task to revoke.
    terminate – Also terminate the process currently working on the task (if any).
    signal – Name of the signal to send to the process if terminate. Default is TERM.

Source: https://stackoverflow.com/questions/2557424/delete-task
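
A usage sketch (assuming task_id was captured when the task was applied):

    # revoke a queued task; terminate=True also kills it if already running
    from celery.task.control import revoke  # Celery 3.x import path

    revoke(task_id, terminate=True)
    # equivalently, on an app instance: app.control.revoke(task_id, terminate=True)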

A brief introduction to Celery and its configuration in Django

Submitted by 一笑奈何 on 2019-12-05 02:08:39
Celery

1. What is Celery?

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages. It is an asynchronous task queue focused on real-time processing, and it also supports task scheduling.

Celery architecture

Celery's architecture consists of three parts: the message broker, the task execution units (workers), and the task result store.

Message broker: Celery does not provide a message service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store holds the results of the tasks executed by workers. Celery supports storing task results in several backends, including AMQP, Redis, and others.

Supported versions:

Celery version 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required. If you're running an older version of
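
To make the three parts concrete, a minimal sketch assuming a local Redis instance serves as both broker and result store:

    # tasks.py -- minimal Celery app
    from celery import Celery

    app = Celery(
        'tasks',
        broker='redis://localhost:6379/0',   # message broker
        backend='redis://localhost:6379/1',  # task result store
    )

    @app.task
    def add(x, y):
        return x + y

    # run a worker:       celery -A tasks worker -l info
    # call from a client: add.delay(2, 3).get()  -> returns 5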