celery

RuntimeError: Never call result.get() within a task Celery

Submitted by 半世苍凉 on 2019-12-08 17:39:55
Question: I am using Celery to send a task to a remote server and am trying to get the result back. The task's state is constantly updated via update_state on the remote server. I send the task with app.send_task('task_name'). Getting the result of a Celery task is a blocking call, and I don't want my Django app to wait for the result and time out, so I tried running another Celery task to fetch the result:
@app.task(ignore_result=True)
def catpure_res(task_id):
    task_obj = AsyncResult(task_id)
    task_obj.get(on…
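A minimal sketch of two ways around this error, assuming a configured Celery app named `app` and a Redis broker (both illustrative, not taken from the question): either explicitly allow joining on a subtask result, or, usually better, attach a callback so nothing blocks at all.

```python
from celery import Celery
from celery.result import AsyncResult, allow_join_result

app = Celery('example', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

@app.task(ignore_result=True)
def capture_res(task_id):
    task_obj = AsyncResult(task_id, app=app)
    # Option 1: Celery forbids result.get() inside a task by default because
    # it can deadlock the worker pool; allow_join_result() overrides that.
    with allow_join_result():
        return task_obj.get(timeout=30)

# Option 2: don't poll at all -- let the worker run a callback once the
# remote task finishes.
@app.task
def handle_result(result):
    print('remote task returned:', result)

# app.send_task('task_name', link=handle_result.s())
```

The callback route keeps every worker slot non-blocking, which is the behavior the RuntimeError is trying to protect.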

Starting celery in flask: AttributeError: 'Flask' object has no attribute 'user_options'

Submitted by 天涯浪子 on 2019-12-08 16:49:09
Question: I try to start a Celery worker from the command line: celery -A server application worker --loglevel=info. The code and folder layout: server.py, application/controllers/routes.py.
server.py:
app = Flask(__name__)
from application.controllers import routes
app.run(host='127.0.0.1', port=5051, debug=True)
routes.py:
from flask import Flask
from celery import Celery
from server import app
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis:/…
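The error usually means `-A` resolved to the Flask `app` object rather than a Celery instance (a Flask object has no `user_options` attribute, which the Celery CLI expects). A sketch of the usual wiring, with illustrative names:

```python
from flask import Flask
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'

# A separate Celery instance; this is what the CLI must be pointed at.
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])

# Start the worker by naming the module that defines `celery`, e.g.:
#   celery -A application.controllers.routes worker --loglevel=info
```

The `-A` argument takes a module (or `module:attribute`) containing the Celery app, so `celery -A server application worker` cannot work as written.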

Django celery - asyncio - daemonic process are not allowed to have children

Submitted by 泄露秘密 on 2019-12-08 16:29:23
Question: I can see similar questions have been asked before, but those run multiprocessing rather than executors, so I am unsure how to fix this. The GitHub issue also says it was resolved in 4.1: https://github.com/celery/celery/issues/1709. I am using celery==4.1.1, django-celery==3.2.1, django-celery-beat==1.0.1, django-celery-results==1.0.1. My script is as follows; I've cut it down to show the relevant code only:
@asyncio.coroutine
def snmp_get(ip, oid, snmp_user, snmp_auth, snmp_priv): …
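One way to sidestep "daemonic processes are not allowed to have children" is to run the coroutines on an event loop inside the worker process instead of forking anything; an event loop needs no child processes. A sketch (the `fake_snmp_get` coroutine is a hypothetical stand-in for the question's `snmp_get`):

```python
import asyncio

async def _gather(coros):
    return await asyncio.gather(*coros)

def run_async_in_task(coros):
    # Prefork Celery workers execute tasks in daemonic child processes,
    # which may not spawn children of their own. An event loop runs
    # entirely in-process, so it avoids the daemonic-children error.
    return asyncio.run(_gather(coros))

async def fake_snmp_get(ip, oid):
    # Stand-in for the snmp_get coroutine in the question.
    await asyncio.sleep(0)
    return (ip, oid)

results = run_async_in_task([fake_snmp_get('10.0.0.1', '1.3.6.1'),
                             fake_snmp_get('10.0.0.2', '1.3.6.1')])
```

Alternatively, starting the worker with `--pool=solo` (or a greenlet pool such as gevent) avoids prefork's daemonic children entirely.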

Celery (Django) Rate limiting

Submitted by 扶醉桌前 on 2019-12-08 16:07:21
Question: I'm using Celery to process multiple data-mining tasks. One of these tasks connects to a remote service that allows a maximum of 10 simultaneous connections per user (in other words, it CAN exceed 10 connections globally but CANNOT exceed 10 connections per individual job). I THINK a token bucket (rate limiting) is what I'm looking for, but I can't seem to find any implementation of it.
Answer 1: Celery features rate limiting and contains a generic token bucket implementation. Set rate…
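For illustration, a minimal token bucket looks like the sketch below (Celery itself also accepts a per-task rate limit, e.g. `@app.task(rate_limit='10/m')`, though that limits throughput per worker rather than concurrent connections per job):

```python
import time

class TokenBucket:
    """Minimal token-bucket limiter (illustrative sketch only)."""

    def __init__(self, fill_rate, capacity):
        self.fill_rate = float(fill_rate)   # tokens added per second
        self.capacity = float(capacity)     # maximum burst size
        self.tokens = float(capacity)       # start full
        self.last = time.monotonic()

    def consume(self, n=1):
        # Refill according to elapsed time, then try to take n tokens.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.fill_rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False

# Capacity matches the question's cap of 10; refill rate is illustrative.
bucket = TokenBucket(fill_rate=1, capacity=10)
allowed = [bucket.consume() for _ in range(12)]   # first 10 pass, rest fail
```

For a strict cap on *simultaneous* connections (rather than connection rate), a semaphore shared per job is often the better fit than a token bucket.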

Retrieve queue length with Celery (RabbitMQ, Django)

Submitted by 为君一笑 on 2019-12-08 15:55:42
Question: I'm using Celery in a Django project, my broker is RabbitMQ, and I want to retrieve the length of the queues. I went through the Celery code but did not find a tool to do that. I found this question on Stack Overflow (Check RabbitMQ queue size from client), but I don't find it satisfying. Everything is set up in Celery, so there should be some kind of magic method to retrieve what I want without specifying a channel/connection. Does anyone have any idea about this? Thanks!
Answer 1: …
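A sketch of one common approach, assuming a running RabbitMQ broker and the default queue name "celery" (both illustrative): borrow a connection from the app's own pool and issue a passive `queue_declare`, which inspects the queue without creating it and reports its message count.

```python
from celery import Celery

app = Celery('proj', broker='amqp://guest:@localhost//')

def queue_length(queue_name='celery'):
    # connection_or_acquire reuses the app's connection pool, so no
    # channel/connection has to be configured by hand.
    with app.connection_or_acquire() as conn:
        return conn.default_channel.queue_declare(
            queue=queue_name, passive=True).message_count
```

Note this counts only messages ready in the broker; tasks already prefetched by workers are no longer in the queue.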

django/celery - celery status: Error: No nodes replied within time constraint

Submitted by 回眸只為那壹抹淺笑 on 2019-12-08 15:41:33
Question: I'm trying to deploy a simple Celery example on my production server. I followed the tutorial on the Celery website about running Celery as a daemon (http://docs.celeryproject.org/en/latest/tutorials/daemonizing.html#daemonizing), and I have this config file in /etc/default/celeryd:
# Name of nodes to start
# here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"

# Where to chdir at start.
CELERYD_CHDIR="/home/audiwime/cidec_sw" …
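"No nodes replied within time constraint" commonly means the `celery status` invocation is not talking to the same app, broker, or user as the daemonized worker. A sketch of settings worth checking in /etc/default/celeryd; the values below are guesses modeled on the generic tutorial config, not taken from the asker's setup:

```shell
# App instance the daemon should load -- must match what "status" is run
# against (illustrative value):
CELERY_APP="cidec_sw"

# Absolute path to the celery binary, important if it lives in a virtualenv:
CELERY_BIN="/usr/local/bin/celery"

# Check status as the same app and user the daemon runs under, e.g.:
#   celery -A cidec_sw status
```

Worker logs (by default under /var/log/celery/) usually show whether the node started and which broker it connected to.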

psycopg2 error: DatabaseError: error with no message from the libpq

Submitted by 扶醉桌前 on 2019-12-08 15:20:36
Question: I have an application that parses and loads data from CSV files into a Postgres 9.3 database. In serial execution, insert statements/cursor executions work without an issue. I added Celery to the mix to parse and insert the data files in parallel. Parsing works fine; however, when I run the insert statements I get:
[2015-05-13 11:30:16,464: ERROR/Worker-1] ingest_task.work_it: Exception
Traceback (most recent call last):
  File "ingest_tasks.py", line 86, in work_it
    rowcount = ingest…
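"error with no message from the libpq" often points to a psycopg2 connection that was created in the parent process and then inherited by Celery's forked workers; libpq connections cannot be shared across forks. A sketch of the fix under that assumption, with hypothetical names (`work_it`, DSN, table):

```python
import psycopg2
from celery import Celery

app = Celery('ingest', broker='redis://localhost:6379/0')

DSN = 'dbname=mydb user=me'   # illustrative connection string

@app.task
def work_it(csv_path):
    # Open the connection INSIDE the task, i.e. after the worker forked,
    # so every worker process gets its own libpq connection.
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:   # `with conn:` commits on success
            cur.execute('INSERT INTO rows (path) VALUES (%s)', (csv_path,))
    finally:
        conn.close()
```

The rule of thumb: no module-level database connections in code that Celery workers import.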

Print statement in Celery scheduled task doesn't appear in terminal

Submitted by 空扰寡人 on 2019-12-08 14:56:09
Question: When I run celery -A tasks2.celery worker -B, I want to see "celery task" printed every second. Currently nothing is printed. Why isn't this working?
from app import app
from celery import Celery
from datetime import timedelta

celery = Celery(app.name, broker='amqp://guest:@localhost/', backend='amqp://guest:@localhost/')
celery.conf.update(CELERY_TASK_RESULT_EXPIRES=3600,)

@celery.task
def add(x, y):
    print "celery task"
    return x + y

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': …
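One likely cause: a module-level CELERYBEAT_SCHEDULE dict is never read by beat; the schedule has to be put into the app's configuration. A configuration sketch under that assumption (entry name, args, and one-second interval are illustrative; the task must be referenced by its registered name):

```python
from datetime import timedelta
from celery import Celery

celery = Celery('tasks2', broker='amqp://guest:@localhost/')
celery.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
    CELERYBEAT_SCHEDULE={
        'add-every-second': {
            'task': 'tasks2.add',            # registered task name
            'schedule': timedelta(seconds=1),
            'args': (1, 2),
        },
    },
)
```

Also note that a worker prints task output to its own log, not necessarily to the terminal running beat, so `--loglevel=info` on the worker helps confirm the task is firing.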

celery for different timezones

Submitted by 雨燕双飞 on 2019-12-08 13:34:36
Question: I am using django-celery to send scheduled emails to users. It works fine if all users are in the same timezone, but a user in a different timezone will get the email at the wrong time. For example, I scheduled an email to be sent to user A and user B at 8am every day with CrontabSchedule; the server is on GMT, user A is in GMT, and user B is in GMT+1, so user A gets the email at 8am but user B gets it at 9am. How can I schedule tasks for different timezones with Celery?
Answer 1: When user B has his timezone set to …
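The core of any per-user schedule is converting "8am in the user's timezone" to a UTC instant the server can schedule against. A minimal sketch of that conversion (function name and the `Etc/GMT*` zone names are illustrative; note the POSIX sign reversal, where `Etc/GMT-1` means UTC+1):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def next_send_utc(local_hour, tz_name, now_utc):
    """Next UTC datetime at which it is `local_hour` o'clock in tz_name."""
    tz = ZoneInfo(tz_name)
    local_now = now_utc.astimezone(tz)
    target = local_now.replace(hour=local_hour, minute=0,
                               second=0, microsecond=0)
    if target <= local_now:         # today's 8am already passed locally
        target += timedelta(days=1)
    return target.astimezone(ZoneInfo('UTC'))

now = datetime(2019, 12, 8, 6, 0, tzinfo=ZoneInfo('UTC'))
a = next_send_utc(8, 'Etc/GMT', now)     # user A: 8am GMT   -> 08:00 UTC
b = next_send_utc(8, 'Etc/GMT-1', now)   # user B: 8am GMT+1 -> 07:00 UTC
```

Newer schedulers also support this directly; django-celery-beat's CrontabSchedule, for instance, carries a timezone field, so each user's crontab can be stored in their own zone.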

Script needs to be run as a Celery task. What consequences does this have?

Submitted by 别等时光非礼了梦想. on 2019-12-08 09:45:49
Question: My task is to write a script using OpenCV that will later run as a Celery task. What consequences does this have? What do I have to pay attention to? Is it enough, in the end, to include two lines of code, or could it be that I have to rewrite my whole script? I read that Celery is an "asynchronous task queue/job queuing system based on distributed message passing", but I won't pretend to fully know what all that entails. I'll update the question as soon as I get more details.
Answer 1: …
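In practice, the main consequence is structural: the script's work has to live in an importable function whose arguments and return value are serializable, because workers receive messages, not live Python objects. A sketch with illustrative names and parameters:

```python
from celery import Celery

app = Celery('vision', broker='redis://localhost:6379/0')

@app.task
def process_image(path):
    import cv2                      # heavy import kept inside the task
    img = cv2.imread(path)          # pass file paths, not numpy arrays
    edges = cv2.Canny(img, 100, 200)
    out_path = path + '.edges.png'
    cv2.imwrite(out_path, edges)
    return out_path                 # return something JSON-serializable

# Caller side:
#   process_image.delay('/data/frame0001.png')
```

So it is rarely just "two lines": top-level script code must move into the task function, global state goes away (each worker process is independent), and large data such as images is exchanged via paths or shared storage rather than task arguments.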