celery

Failure to start celeryd - Error: conflicting option string(s): --no-color

Anonymous (unverified), submitted 2019-12-03 00:56:02
Question: I'm using django v1.7.0b4 and celery v3.1.1. I followed the steps in the django installation guide, but I'm stuck with the error below.

$ ./manage.py celeryd --help
Starting server in DEVELOPMENT Mode
Traceback (most recent call last):
  File "./manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/Library/Python/2.7/site-packages/django/core/management/__init__.py", line 427, in execute_from_command_line
    utility.execute()
  File "/Library/Python/2.7/site-packages/django/core/management/_
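The question is truncated above, but this error is commonly reported when the django-celery management command and Django 1.7 each register their own --no-color option. A hedged workaround, assuming a project package named proj (a placeholder, not from the question), is to bypass manage.py and start the worker with Celery's own CLI:

```shell
# Workaround sketch: start the worker directly instead of via ./manage.py
# celeryd, so Django's management machinery never re-registers --no-color.
# "proj" is a placeholder for the actual project/app name.
celery -A proj worker --loglevel=info
```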

Airflow: Tasks queued but not running

Anonymous (unverified), submitted 2019-12-03 00:56:02
Question: I am new to airflow and am trying to set it up to run ETL pipelines. I was able to install airflow, postgres, celery, and rabbitmq, and I can run the tutorial dag manually. When I try to schedule jobs, the scheduler picks them up and queues them (which I can see in the UI), but the tasks never run. Could somebody help me fix this issue? I believe I am missing a basic airflow concept here. Here is my airflow.cfg:

[core]
airflow_home = /root/airflow
dags_folder = /root/airflow/dags
base_log
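The poster's config is truncated, but "queued but not running" is the classic symptom of an executor/worker mismatch. As a hedged sketch (all values below are illustrative assumptions, not taken from the question), the executor and broker sections of airflow.cfg must agree with the worker that is actually running:

```ini
# Sketch of the sections that must line up for tasks to leave the queue;
# every value here is an illustrative assumption.
[core]
executor = CeleryExecutor

[celery]
broker_url = amqp://guest:guest@localhost:5672//
result_backend = db+postgresql://airflow:airflow@localhost/airflow
```

In Airflow 1.10-era configs these keys live under [celery]; if executor is left at the default SequentialExecutor, queued tasks wait forever for a celery worker that will never pick them up.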

Celery task that runs more tasks

守給你的承諾、 submitted 2019-12-03 00:55:19
I am using celerybeat to kick off a primary task that kicks off a number of secondary tasks. I have both tasks written already. Is there a way to easily do this? Does Celery allow tasks to be run from within tasks? My example:

@task
def compute(users=None):
    if users is None:
        users = User.objects.all()
    tasks = []
    for user in users:
        tasks.append(compute_for_user.subtask((user.id,)))
    job = TaskSet(tasks)
    job.apply_async()  # raises an IOError: Socket closed

@task
def compute_for_user(user_id):
    # do some stuff

compute gets called from celerybeat, but causes an IOError when it tries to run apply

ImportError: No module named dateutil

Anonymous (unverified), submitted 2019-12-03 00:53:01
Question: I am trying to follow the example in the "First Steps with Celery" document. I have installed Celery using pip. I created a file called tasks.py in ~/python/celery, and it contains the following:

from celery import Celery

celery = Celery('tasks', broker='amqp://guest@localhost//')

@celery.task
def add(x, y):
    return x + y

I started a worker using celery -A tasks worker --loglevel=info while in the ~/python/celery directory, and it seems to be running. In a separate Terminal window, I launched Python and ran the following: from tasks import
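Judging from the title alone (the question body is cut off), the failing import is dateutil, whose pip package name differs from its import name. A small stdlib-only sketch of how to check for it; the helper name is mine, not from the question:

```python
# The import name is "dateutil", but the pip package is "python-dateutil".
# Installing "dateutil" by mistake (or not at all) produces exactly
# "ImportError: No module named dateutil".
import importlib.util

def dateutil_available():
    # True when "import dateutil" would succeed in this interpreter
    return importlib.util.find_spec('dateutil') is not None

# if this returns False, the usual fix is:  pip install python-dateutil
```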

Airflow worker is not listening to default rabbitmq queue

Anonymous (unverified), submitted 2019-12-03 00:48:01
Question: I have configured Airflow with the rabbitmq broker. The services

airflow worker
airflow scheduler
airflow webserver

are running without any errors. The scheduler is pushing the tasks onto the default rabbitmq queue. Even when I tried airflow worker -q=default, the worker still does not receive tasks to run. My airflow.cfg settings file:

[core]
# The home folder for airflow, default is ~/airflow
airflow_home = /home/my_projects/ksaprice_project/airflow

# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository
#
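One hypothesis worth checking (not confirmed by the truncated question): with -q=default, the literal string =default can be taken as the queue name, so the worker subscribes to a queue nothing publishes to. Passing the value with a space removes the ambiguity:

```shell
# Sketch: pass the queue name as a separate argument, then confirm on the
# rabbitmq side which queue actually holds the messages.
airflow worker -q default
# rabbitmqctl list_queues  # shows queue names and message counts
```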

django/celery - celery status: Error: No nodes replied within time constraint

Anonymous (unverified), submitted 2019-12-03 00:46:02
Question: I'm trying to deploy a simple example of celery on my production server. I've followed the tutorial on the celery website about running celery as a daemon ( http://docs.celeryproject.org/en/latest/tutorials/daemonizing.html#daemonizing ), and I have this config file in /etc/default/celeryd:

# Name of nodes to start
# here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"

# Where to chdir at start.
CELERYD_CHDIR="/home/audiwime/cidec_sw"

# Python interpreter from environment.
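"No nodes replied within time constraint" usually means celery status cannot reach any live worker over the broker. A hedged first round of checks (the app name proj is a placeholder; the truncated question does not name it):

```shell
# Is a worker process actually alive? The init script can exit 0 while the
# worker itself died during startup (check the CELERYD_LOG_FILE too).
ps aux | grep "[c]elery worker"

# Can the broker round-trip a ping? This must use the same app and broker
# settings the daemonized worker was started with.
celery -A proj inspect ping
```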

Correct setup of django redis celery and celery beats

Anonymous (unverified), submitted 2019-12-03 00:46:02
Question: I have been trying to set up django + celery + redis + celery_beats, but it is giving me trouble. The documentation is quite straightforward, but when I run the django server, redis, celery and celery beats, nothing gets printed or logged (all my test task does is log something). This is my folder structure:

- aenima
  - aenima
    - __init__.py
    - celery.py
  - criptoball
    - tasks.py

celery.py looks like this:

from __future__ import absolute_import, unicode_literals
import os
from django.conf import settings
from celery import Celery

# set the

Django & Celery ― Routing problems

Anonymous (unverified), submitted 2019-12-03 00:44:02
Question: I'm using Django and Celery and I'm trying to set up routing to multiple queues. When I specify a task's routing_key and exchange (either in the task decorator or using apply_async()), the task isn't added to the broker (which is Kombu connecting to my MySQL database). If I specify the queue name in the task decorator (which means the routing key is ignored), the task works fine. It appears to be a problem with the routing/exchange setup. Any idea what the problem could be? Here's the setup:

settings.py

INSTALLED_APPS = (
    ...
    'kombu

A Brief Look at the Celery Distributed Queue

Anonymous (unverified), submitted 2019-12-03 00:39:02
Q1: When developing a web project with Django, how do you handle a large number of users registering at the same moment, with SMS delivery falling behind?
A1:
  ① Wrap SMS sending in a function
  ② Spawn processes, threads, or coroutines to call that function
Q2: But those processes/threads/coroutines run on the same machine as the Django server, and their scheduling order is nondeterministic, so A1 is out.
A2: Celery (asynchronous task queue):
  ① In celery, the task producer, the broker, and the task worker can each live on a different machine.
  ② Tasks in celery are ordered; tasks added first are executed first by the workers.

1. Introduction to Celery
Celery is a distributed task scheduling module written in Python. With it we can easily run tasks asynchronously. Its main advantages:
  - Task execution is fully decoupled from the main program, and can even run on a different host.
  - It can address complex performance problems in a system while remaining flexible and easy to use.
Take the registration example again: suppose 100 users register at the same moment while the network is poor. Requests take a long time to reach the SMS system, and if the SMS system is slow to respond, the subsequent code cannot run, users wait a long time, and the experience suffers. With Celery's asynchronous message queue, the publisher only hands the SMS task to the broker and does nothing more; a worker listening on the task queue executes it.
As I understand it, Celery is made up of three main components: ① the task producer, ② the broker, and ③ the task worker.

Celery Asynchronous Tasks

Anonymous (unverified), submitted 2019-12-03 00:39:02
In real development you run into many time-consuming operations. Without countermeasures, the program blocks until the slow task finishes. To keep the whole project responsive, such tasks are usually handled asynchronously. The steps are:
  1. Create a celery_tasks package to hold the celery async tasks.
  2. In the celery_tasks directory, create a config.py file holding celery's configuration:

broker_url = "redis://127.0.0.1/14"

  3. In the celery_tasks directory, create a main.py file as celery's entry point:

from celery import Celery

# point celery at the django settings module
import os
if not os.getenv('DJANGO_SETTINGS_MODULE'):
    os.environ['DJANGO_SETTINGS_MODULE'] = 'xxx.settings.dev'

# create the celery app
app = Celery('xxx')

# load the celery configuration
app.config_from_object('celery_tasks.config')

# auto-register celery tasks
app.autodiscover_tasks(['celery_tasks.xxx'