celery

Celery scheduled list returns None

限于喜欢 Submitted on 2019-11-29 14:17:38
I'm fairly new to Celery and I've been attempting to set up a simple script to schedule and unschedule tasks. However, I feel like I'm running into a weird issue. I have the following setup:

from celery import Celery

app = Celery('celery_test', broker='amqp://', backend='amqp')

@app.task
def add(x, y):
    return x + y

I start up my Celery server just fine and can add tasks. Now when I want to get a list of active tasks, things seem to get weird. When I go to use inspect to get a list of scheduled tasks, it works exactly once, then returns None every time afterwards.

>>> i = app.control.inspect()
>>> print
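A hedged note on this symptom (the excerpt cuts off before any answer): a frequently reported cause is reusing a single inspect() instance, whose reply channel can go stale, and inspect methods also legitimately return None when no worker replies in time. A minimal sketch that creates a fresh instance per query, reusing the app from the question:

from celery import Celery

app = Celery('celery_test', broker='amqp://', backend='amqp')

def scheduled_tasks():
    # Build a fresh inspect instance for every query instead of
    # reusing one; a stale instance is a common cause of None replies.
    i = app.control.inspect()
    # inspect methods return None when no worker answers in time,
    # so normalize that to an empty dict.
    return i.scheduled() or {}

print(scheduled_tasks())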

Django Celery implementation - OSError: [Errno 38] Function not implemented

*爱你&永不变心* Submitted on 2019-11-29 12:50:02
Question: I installed django-celery and I tried to start up the worker server, but I get an OSError that a function isn't implemented. I'm running CentOS release 5.4 (Final) on a VPS:

. broker -> amqp://guest@localhost:5672/
. queues ->
    . celery -> exchange:celery (direct) binding:celery
. concurrency -> 4
. loader -> djcelery.loaders.DjangoLoader
. logfile -> [stderr]@WARNING
. events -> OFF
. beat -> OFF

[2010-07-22 17:10:01,364: WARNING/MainProcess] Traceback (most recent call last):
[2010-07-22 17
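A hedged note, since the traceback is cut off: [Errno 38] (ENOSYS) during worker startup usually means the multiprocessing pool cannot create POSIX semaphores on this VPS (commonly a missing or unmounted /dev/shm). One workaround, sketched in old-style settings matching the djcelery era of the question, is to avoid the multiprocessing pool:

# settings.py sketch: run the worker with the single-threaded 'solo'
# pool, which does not need the POSIX semaphores that multiprocessing
# requires and that this VPS kernel does not provide.
CELERYD_POOL = 'solo'

# The same effect from the command line: celeryd --pool=solo
# (newer versions: celery worker --pool=solo)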

How do I tell celery worker to stop accepting tasks? How can I check if any celery worker tasks are running?

帅比萌擦擦* Submitted on 2019-11-29 12:47:09
The scenario: a system running on a server, consisting of a Python/Flask web application and background tasks using Celery. Both the web application and the Celery workers run as upstart jobs (the web app behind Nginx). Deployment to production is done with a script that:

1. Stops the upstart jobs
2. Pushes code to the server
3. Runs any db migrations
4. Starts the upstart jobs

How can I enhance the deployment script so it does the following?

1. Tell the celery worker to stop accepting tasks
2. Wait until any currently running celery tasks are finished
3. Stop the upstart jobs
4. Push code to the server
5. Run any db migrations
6. Start the
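A sketch of the first two new steps (not the thread's accepted answer; the app import path and queue name are assumptions): tell the workers to stop consuming, then poll until no tasks are active:

import time

from myapp.celery import app  # assumed location of the Celery app

def drain_workers(queue='celery', poll_interval=5):
    # Step 1: tell every worker to stop accepting tasks from the queue.
    app.control.cancel_consumer(queue, reply=True)
    # Step 2: wait until no worker reports a task still executing.
    while True:
        active = app.control.inspect().active() or {}
        if not any(tasks for tasks in active.values()):
            break
        time.sleep(poll_interval)

drain_workers()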

Celery does not release memory

这一生的挚爱 Submitted on 2019-11-29 12:30:41
Question: It looks like Celery does not release memory after a task finishes. Every time a task finishes, there is a 5m-10m memory leak, so with thousands of tasks it will soon use up all memory.

BROKER_URL = 'amqp://user@localhost:5672/vhost'
# CELERY_RESULT_BACKEND = 'amqp://user@localhost:5672/vhost'
CELERY_IMPORTS = ('tasks.tasks', )
CELERY_IGNORE_RESULT = True
CELERY_DISABLE_RATE_LIMITS = True
# CELERY_ACKS_LATE = True
CELERY_TASK_RESULT_EXPIRES = 3600
# maximum time for a task to execute
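A commonly suggested mitigation (hedged: it works around the growth rather than explaining it) is to recycle worker processes after a fixed number of tasks, so whatever memory a child accumulates is returned when it is replaced. In the same old-style settings format the question uses:

# Restart each pooled worker process after it has executed 100 tasks,
# releasing any memory it accumulated; the count is an assumption,
# tune it to the workload.
CELERYD_MAX_TASKS_PER_CHILD = 100

# On Celery 4+ the same setting is spelled worker_max_tasks_per_child.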

Celery is rerunning long running completed tasks over and over

谁说我不能喝 Submitted on 2019-11-29 11:55:25
I have a Python celery-redis queue processing uploads and downloads worth gigs and gigs of data at a time. A few of the uploads take up to a few hours. However, once such a task finishes, I'm witnessing this bizarre Celery behaviour: the scheduler reruns the just-concluded task by sending it to the worker again (I'm running a single worker), and it just happened twice on the same task! Can someone help me understand why this is happening and how I can prevent it? The tasks are definitely finishing cleanly, with no errors reported; it's just that these are extremely long running tasks. I
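A hedged note on the usual culprit: with the Redis broker, a task that runs longer than the transport's visibility_timeout (one hour by default) is assumed lost and redelivered to a worker, which matches the "reran the same long task" symptom exactly. One mitigation is to raise the timeout above the longest expected run time (the figure below is an assumption):

# Redis transport option: raise the visibility timeout well above the
# longest expected task run time so slow-but-healthy tasks are not
# assumed lost and redelivered.
BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200}  # 12 hours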

passing django request object to celery task

佐手、 Submitted on 2019-11-29 10:53:42
I have a task in tasks.py like so:

@app.task
def location(request):
    ....

I am trying to pass the request object directly from a view to the task like so:

def tag_location(request):
    tasks.location.delay(request)
    return JsonResponse({'response': 1})

I am getting an error that it can't be serialized, I guess? How do I fix this? The trouble is I have file upload objects as well; it's not all simple data types. Because the request object contains references to things which aren't practical to serialize (like uploaded files, or the socket associated with the request), there's no general purpose way to
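Picking up where the answer is cut off, a sketch of the usual alternative (field and module names are illustrative): pass only plain, serializable data, saving any upload to storage first so the task can refer to it by path:

from django.core.files.storage import default_storage
from django.http import JsonResponse

import tasks  # the tasks module from the question

def tag_location(request):
    # Save the upload and pass its storage path, not the file object.
    path = None
    if 'photo' in request.FILES:  # 'photo' is an assumed field name
        path = default_storage.save('uploads/photo', request.FILES['photo'])
    # Hand the task only serializable primitives.
    tasks.location.delay(
        user_id=request.user.id,
        data=request.POST.dict(),
        upload_path=path,
    )
    return JsonResponse({'response': 1})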

How to list the queued items in celery?

浪尽此生 Submitted on 2019-11-29 10:35:18
I have a Django project on an Ubuntu EC2 node, which I have been using to set up an asynchronous task queue using Celery. I am following http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/ along with the docs. I've been able to get a basic task working at the command line, using:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=myproject.celery:app worker --loglevel=INFO

I just realized that I have a bunch of tasks in my queue that had not executed:

[2015-03-28 16:49:05,916: WARNING/MainProcess] Restoring 4 unacknowledged message(s).
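A hedged sketch of one way to peek at what is sitting in the queue, assuming the Redis broker from that tutorial (pending messages live in a Redis list named after the queue; 'celery' is the default queue name):

import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)
queue_name = 'celery'  # Celery's default queue name; adjust if yours differs

# Each pending (not yet delivered) task is one entry in this list.
print('messages waiting:', r.llen(queue_name))

# Peek at the first few raw message payloads without consuming them.
for raw in r.lrange(queue_name, 0, 9):
    print(raw)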

Cannot import name _uuid_generate_random in heroku django

帅比萌擦擦* Submitted on 2019-11-29 10:30:38
Question: I am working on a project which scans a user's Gmail inbox and provides a report. I have deployed it on Heroku with the following specs:

Language: Python 2.7
Framework: Django 1.8
Task scheduler: Celery (Rabbitmq-bigwig for the broker URL)

Now when Heroku executes it, Celery is not giving me the output. On Heroku push it shows a Collectstatic configuration error. I have tried using the whitenoise package, and also tried executing:

heroku run python manage.py collectstatic --dry-run --noinput

Still getting
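The excerpt cuts off, so two hedged notes rather than the thread's answer: the _uuid_generate_random import error typically comes from old kombu/celery code reaching for a private uuid helper that later Python 2.7 point releases removed (upgrading kombu/celery usually clears it), and the collectstatic failure is commonly a missing STATIC_ROOT. A minimal settings.py sketch for the whitenoise setup the question mentions (paths and names are illustrative):

import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# collectstatic needs a destination; a missing STATIC_ROOT is a common
# cause of a "Collectstatic configuration error" during a Heroku push.
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
STATIC_URL = '/static/'

# whitenoise 2.x era (matches Django 1.8): wrap the WSGI app in wsgi.py:
#   from django.core.wsgi import get_wsgi_application
#   from whitenoise.django import DjangoWhiteNoise
#   application = DjangoWhiteNoise(get_wsgi_application())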

How do I schedule a task with Celery that runs on 1st of every month?

橙三吉。 Submitted on 2019-11-29 08:06:17
Question: How do I schedule a task with celery that runs on the 1st of every month?

Answer 1: Since Celery 3.0 the crontab schedule supports day_of_month and month_of_year arguments: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#crontab-schedules

Answer 2: You can do this using crontab schedules, and you can define it either in your django settings.py:

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'my_periodic_task': {
        'task': 'my_app.tasks.my_periodic_task',
        'schedule'
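The second answer is cut off above; a minimal sketch of what the completed entry would look like, reusing the task name from the answer (the midnight timing is an assumption):

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'my_periodic_task': {
        'task': 'my_app.tasks.my_periodic_task',
        # Fire at midnight on the 1st of every month.
        'schedule': crontab(minute=0, hour=0, day_of_month='1'),
    },
}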

Celery Task Chain and Accessing **kwargs

孤街醉人 Submitted on 2019-11-29 07:49:29
Question: I have a situation similar to the one outlined here, except that instead of chaining tasks with multiple arguments, I want to chain tasks that return a dictionary with multiple entries. This is, very loosely and abstractly, what I'm trying to do:

tasks.py

@task()
def task1(item1=None, item2=None):
    item3 = # do some stuff with item1 and item2 to yield item3
    return_object = dict(item1=item1, item2=item2, item3=item3)
    return return_object

def task2(item1=None, item2=None, item3=None):
    item4
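The excerpt is truncated; as a hedged sketch of one common way to chain dict-returning tasks (not necessarily this thread's answer): in a chain, the previous task's return value arrives as the next task's first positional argument, so the receiving task can take the whole dict and unpack it itself. The app name and placeholder computations below are illustrative:

from celery import Celery, chain

app = Celery('example', broker='amqp://')

@app.task
def task1(item1=None, item2=None):
    item3 = (item1 or 0) + (item2 or 0)  # placeholder computation
    return dict(item1=item1, item2=item2, item3=item3)

@app.task
def task2(previous):
    # `previous` is task1's returned dict, injected by the chain.
    previous['item4'] = previous['item3'] * 2  # placeholder computation
    return previous

result = chain(task1.s(item1=1, item2=2), task2.s())()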