Running Celery tasks periodically (without Django)

Posted by 流过昼夜 on 2019-12-24 02:25:28

Question


I am trying to run a few functions (tasks) periodically, say every 3 seconds, with Celery.

The closest I'm getting is to just run the tasks once.

This is my Celery configuration file:

# celeryconfig.py
from datetime import timedelta

BROKER_URL = 'amqp://guest@localhost//'

CELERY_RESULT_BACKEND = 'rpc://'

CELERYBEAT_SCHEDULE = {
    'f1-every-3-seconds': {
        'task': 'tasks.f1',
        'schedule': timedelta(seconds=3),
        'args': (1, 2)
    },
    'f2-every-3-seconds': {
        'task': 'tasks.f2',
        'schedule': timedelta(seconds=3),
        'args': (3, 4)
    },
}
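As an aside, the `schedule` key does not have to be a `timedelta`: Celery also accepts a plain number of seconds, or a `crontab` expression for cron-style timing. A minimal sketch of both forms, reusing the same task names (the `f2` entry's morning schedule is just an illustrative assumption):

```python
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'f1-every-3-seconds': {
        'task': 'tasks.f1',
        'schedule': 3.0,  # a bare float means seconds; same as timedelta(seconds=3)
        'args': (1, 2),
    },
    'f2-every-morning': {
        'task': 'tasks.f2',
        'schedule': crontab(hour=7, minute=0),  # cron-style: every day at 07:00
        'args': (3, 4),
    },
}
```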

This is where I declare the tasks:

# tasks.py:
import celeryconfig
from celery import Celery

dbwapp = Celery('tasks')
dbwapp.config_from_object(celeryconfig)

@dbwapp.task()
def f1(a, b):
    print "F1: {0}, {1}".format(a, b)

@dbwapp.task()
def f2(a, b):
    print "F2: {0}, {1}".format(a, b)

And this is where my main program would run:

#tasks_runner.py:
from tasks import f1, f2, dbwapp


f1.delay(5, 6)
f2.delay(7, 8)

I run my code with python tasks_runner.py, but I don't manage to make those two functions run periodically. This is the output that I get:

[2016-03-31 23:36:16,108: WARNING/Worker-9] F1: 5, 6
[2016-03-31 23:36:16,109: WARNING/Worker-6] F2: 7, 8

What am I doing wrong? How do I make f1 and f2 run periodically?


Answer 1:


Using your code, I was able to start Celery, including the scheduled tasks, this way:

$ celery beat                                                                         (env: celery) 
celery beat v3.1.23 (Cipater) is starting.
__    -    ... __   -        _
Configuration ->
    . broker -> redis://localhost:6379/0
    . loader -> celery.loaders.default.Loader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> now (0s)
[2016-04-01 00:15:05,377: INFO/MainProcess] beat: Starting...
[2016-04-01 00:15:08,402: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:08,410: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:11,403: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:11,411: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:14,404: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:14,412: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:17,404: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:17,412: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:20,405: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:20,413: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:23,406: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:23,413: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:26,407: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:26,414: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)

Apparently it loads the default Celery configuration, and once the beat service starts it begins firing the scheduled tasks according to that configuration.

However, beat only sends requests to perform the tasks; there is no worker to actually execute them. The worker can be started in another console:

$ celery worker -A tasks
[2016-04-01 00:31:46,950: WARNING/MainProcess] celery@zen ready.
[2016-04-01 00:31:47,029: WARNING/Worker-4] F2: 3, 4
[2016-04-01 00:31:47,029: WARNING/Worker-2] F1: 1, 2
[2016-04-01 00:31:47,036: WARNING/Worker-3] F2: 3, 4
[2016-04-01 00:31:47,036: WARNING/Worker-1] F1: 1, 2
[2016-04-01 00:31:48,829: WARNING/Worker-4] F2: 3, 4
[2016-04-01 00:31:48,829: WARNING/Worker-2] F1: 1, 2

If you want to use only one process, you can start the worker together with the embedded beat service:

$ celery worker -A tasks -B



Answer 2:


Instead of running tasks_runner.py, you need to start the celery beat scheduler as a separate process:

celery -A proj beat

As described here: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
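A note for readers on newer Celery (4.x and later): the uppercase setting names used above were renamed to lowercase, e.g. `BROKER_URL` became `broker_url` and `CELERYBEAT_SCHEDULE` became `beat_schedule`. A minimal configuration sketch in the newer style (same hypothetical tasks as above):

```python
from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')
app.conf.result_backend = 'rpc://'
app.conf.beat_schedule = {
    'f1-every-3-seconds': {
        'task': 'tasks.f1',
        'schedule': 3.0,  # seconds
        'args': (1, 2),
    },
}
```

The beat command is unchanged: `celery -A tasks beat` (plus a separate worker, or `-B` on the worker for development).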



Source: https://stackoverflow.com/questions/36344523/running-celery-tasks-periodically-without-django
