celery

Running Celery as root

浪子不回头ぞ submitted on 2019-12-02 18:19:27
I need to run my Django app along with Celery as root for access reasons. It says I need to set the C_FORCE_ROOT environment variable. How/where do I set the environment variable?

You can set it to true like this:

    # export C_FORCE_ROOT="true"

Then make sure it is set as an environment variable:

    # echo $C_FORCE_ROOT
    true

But make sure to make it permanent, as this will vanish with the next restart. Have fun :) !!

1st solution - manually type the command at a terminal:

    $ export C_FORCE_ROOT='true'

2nd solution - edit your shell configuration:

    $ vi ~/.bashrc
    # add the following line
    export C_FORCE_ROOT='true'
    $ source ~/.bashrc
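If you would rather not depend on shell state, the variable can also be set from Python, provided it runs in the worker's process before Celery boots (for example at the top of your celery.py module). A minimal sketch, assuming that ordering holds:

    import os

    # C_FORCE_ROOT tells Celery to allow running the worker as root.
    # This must execute before the Celery app/worker starts.
    os.environ.setdefault('C_FORCE_ROOT', 'true')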

Why do CELERY_ROUTES have both a “queue” and a “routing_key”?

三世轮回 submitted on 2019-12-02 17:56:37
My understanding of AMQP is that messages only have the following components:

- The message body
- The routing key
- The exchange

Queues are attached to exchanges. Messages can't have any knowledge of queues; they just post to an exchange, and then, based on the exchange type and routing key, the messages are routed to one or more queues.

In Celery, the recommended way of routing tasks is through the CELERY_ROUTES setting. From the docs, CELERY_ROUTES is "a list of routers, or a single router used to route tasks to queues."

http://celery.readthedocs.org/en/latest/configuration.html#message-routing
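For context, a router entry can carry both keys at once; a sketch in the documented CELERY_ROUTES style (task and queue names are illustrative):

    CELERY_ROUTES = {
        'myapp.tasks.compress_video': {
            'queue': 'video',            # which queue declaration Celery uses
            'routing_key': 'video.compress',  # the key stamped on the AMQP message
        },
    }

As I read the docs, 'queue' lets Celery look up (or declare) the queue and its bindings, while 'routing_key' overrides the key actually published with the message, which is why both can appear.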

Django, ImportError: cannot import name Celery, possible circular import?

半世苍凉 submitted on 2019-12-02 17:53:34
I went through this example here: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html

All my tasks are in files called tasks.py. After updating Celery and adding the file from the example, Django throws the following error no matter what I try:

    ImportError: cannot import name Celery

Is the problem possibly caused by the following?

    app.autodiscover_tasks(settings.INSTALLED_APPS, related_name='tasks')

Because it goes through all tasks.py files, which all have the following import:

    from cloud.celery import app

cloud/celery.py:

    from __future__ import absolute_import
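For comparison, a minimal cloud/celery.py following the linked tutorial (Celery 3.1 style; the project name cloud is taken from the question). The key point is that the absolute_import line must come first, so that from celery import Celery resolves to the installed library rather than circling back to this module:

    from __future__ import absolute_import

    import os

    from celery import Celery
    from django.conf import settings

    # Set the default Django settings module before the app is created.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cloud.settings')

    app = Celery('cloud')
    app.config_from_object('django.conf:settings')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)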

Django & Celery — Routing problems

非 Y 不嫁゛ submitted on 2019-12-02 17:35:12
I'm using Django and Celery and I'm trying to set up routing to multiple queues. When I specify a task's routing_key and exchange (either in the task decorator or using apply_async()), the task isn't added to the broker (which is Kombu connecting to my MySQL database). If I specify the queue name in the task decorator (which means the routing key is ignored), the task works fine. It appears to be a problem with the routing/exchange setup. Any idea what the problem could be?

Here's the setup:

settings.py:

    INSTALLED_APPS = (
        ...
        'kombu.transport.django',
        'djcelery',
    )
    BROKER_BACKEND = 'django'
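One thing worth checking in a setup like this: if a task is published with a routing_key and exchange but no queue is bound to that exchange with a matching key, the message has nowhere to go. A sketch of explicit queue declarations that bind routing keys to queues (all names are illustrative):

    from kombu import Exchange, Queue

    CELERY_DEFAULT_QUEUE = 'default'
    CELERY_QUEUES = (
        # Each queue is bound to an exchange with a routing key; a task
        # published with routing_key='tasks.high' lands in 'highprio'.
        Queue('default', Exchange('default', type='direct'), routing_key='default'),
        Queue('highprio', Exchange('tasks', type='direct'), routing_key='tasks.high'),
    )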

Examples of Django and Celery: Periodic Tasks

僤鯓⒐⒋嵵緔 submitted on 2019-12-02 17:34:07
I have been fighting the Django/Celery documentation for a while now and need some help. I would like to be able to run periodic tasks using django-celery. I have seen several different formats and schemas around the internet (and in the documentation) for how one should go about achieving this with Celery. Can someone help with a basic, functioning example of the creation, registration, and execution of a django-celery periodic task? In particular, I want to know whether I should write a task that extends the PeriodicTask class and register that, or whether I should use the @periodic_task decorator.
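For what it's worth, a minimal working example in the decorator form (django-celery / Celery 3.x era; the schedule and task body are illustrative):

    from datetime import timedelta

    from celery.task import periodic_task

    @periodic_task(run_every=timedelta(minutes=30))
    def refresh_feeds():
        # Runs every 30 minutes once the beat scheduler is up,
        # e.g.: python manage.py celery beat
        print('refreshing feeds')

The decorator registers the task and its schedule in one step; subclassing PeriodicTask achieves the same thing with more ceremony.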

How to set up celery workers on separate machines?

心已入冬 submitted on 2019-12-02 16:33:52
I am new to Celery. I know how to install and run one server, but I need to distribute the task to multiple machines. My project uses Celery to assign user requests passing through a web framework to different machines and then returns the result. I read the documentation, but it doesn't mention how to set up multiple machines. What am I missing?

My understanding is that your app will push requests into a queueing system (e.g. RabbitMQ), and then you can start any number of workers on different machines (with access to the same code as the app which submitted the task). They will pick tasks off the queue and run them.
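Concretely, every machine just needs the same task code and the same broker URL; a sketch (host, credentials, and project name are illustrative):

    # settings shared by the web app and every worker machine
    BROKER_URL = 'amqp://user:password@rabbit-host:5672//'

    # then, on each worker machine, start a worker against the same app:
    #   celery -A proj worker --loglevel=info

The broker does the distribution: whichever worker is free next consumes the next task, so adding capacity is just starting more workers on more machines.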

How can I run a celery periodic task from the shell manually?

荒凉一梦 submitted on 2019-12-02 16:28:28
I'm using celery and django-celery. I have defined a periodic task that I'd like to test. Is it possible to run the periodic task from the shell manually so that I can view the console output?

Have you tried just running the task from the Django shell? You can use the .apply() method of a task to ensure that it is run eagerly and locally. Assuming the task is called my_task in Django app myapp, in a tasks submodule:

    $ python manage.py shell
    >>> from myapp.tasks import my_task
    >>> eager_result = my_task.apply()

The result instance has the same API as the usual AsyncResult type, except that the result has already been evaluated eagerly and locally.
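The returned EagerResult can then be inspected like any other result; a hypothetical continuation of the shell session above:

    >>> eager_result.successful()  # True if the task raised no exception
    >>> eager_result.get()         # the task's return value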

Updating a Haystack search index with Django + Celery

我的梦境 submitted on 2019-12-02 16:16:22
In my Django project I am using Celery. I switched a command over from crontab to be a periodic task, and it works well, but it is just calling a method on a model. Is it possible to update my Haystack index from a periodic task as well? Has anyone done this?

    ./manage.py update_index

That's the command to update the index from the Haystack documentation, but I'm not sure how to call it from a task.

The easiest way to do this would probably be to run the management command directly from Python inside your task:

    from haystack.management.commands import update_index
    update_index.Command().handle()
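An alternative that avoids importing the command class directly is Django's call_command; a sketch of the periodic task (the task name is illustrative, and @shared_task assumes Celery 3.1+):

    from celery import shared_task
    from django.core.management import call_command

    @shared_task
    def update_search_index():
        # Equivalent to running ./manage.py update_index
        call_command('update_index')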

Why a Python service won't run inside a Docker container

最后都变了- submitted on 2019-12-02 15:38:10
During development we run MySQL, Redis, Celery, and the other services in Docker containers while the project itself runs locally, which makes debugging easier:

    docker-compose -f docker-compose-dev.yml up db redis celery

Only when I moved the service itself into Docker did the problem show up. The error pointed at manage.py, which was baffling at first. After some searching, I added an interpreter path to the first line of manage.py:

    #! /usr/bin/env python3.6

That didn't help; the error still complained about the path. The eventual fix was to remove the execute permission from manage.py in the Dockerfile:

    RUN chmod -x /app/manage.py

Source: https://www.cnblogs.com/lutt/p/11751953.html

Flask with create_app, SQLAlchemy and Celery

落花浮王杯 submitted on 2019-12-02 15:17:39
I'm really struggling to get the proper setup for Flask, SQLAlchemy, and Celery. I have searched extensively and tried different approaches; nothing really seems to work. Either I miss the application context, or I can't run the workers, or there are some other problems. The structure is very general so that I can build a bigger application. I'm using Flask 0.10.1, SQLAlchemy 1.0, Celery 3.1.13. My current setup is the following:

app/__init__.py:

    # Empty

app/config.py:

    import os

    basedir = os.path.abspath(os.path.dirname(__file__))

    class Config:
        @staticmethod
        def init_app(app):
            pass

    class
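For reference, the pattern from the Flask documentation of that era ties Celery tasks to the application context, which is usually what's missing when SQLAlchemy fails inside workers; a sketch assuming a create_app factory and a CELERY_BROKER_URL config key:

    from celery import Celery

    def make_celery(app):
        celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
        celery.conf.update(app.config)
        TaskBase = celery.Task

        class ContextTask(TaskBase):
            abstract = True

            def __call__(self, *args, **kwargs):
                # Run every task inside the Flask app context so that
                # SQLAlchemy sessions and extensions resolve correctly.
                with app.app_context():
                    return TaskBase.__call__(self, *args, **kwargs)

        celery.Task = ContextTask
        return celery

With this, the worker is built from the same factory as the web app (celery = make_celery(create_app())), so both sides share one configuration.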