celery

Celery: how to limit number of tasks in queue and stop feeding when full?

Submitted by 房东的猫 on 2019-11-27 07:50:58
Question: I am very new to Celery and here is my question: suppose I have a script that is supposed to constantly fetch new data from a DB and send it to workers using Celery.

tasks.py:

    # Celery task
    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def process_data(x):
        # Do something with x
        pass

fetch_db.py:

    # Fetch new data from the DB and dispatch it to workers.
    from tasks import process_data

    while True:
        # Run DB query here to fetch new data from the DB
        fetched
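
One common approach (not from the original post) is to check the broker's queue depth before dispatching more work and to pause feeding once it exceeds a threshold. Below is a minimal sketch, assuming RabbitMQ as the broker and the default queue name 'celery'; the threshold and sleep interval are illustrative values, and the passive queue_declare call is the standard Kombu/AMQP way to read the current message count without creating the queue.

    # Sketch: stop feeding new tasks while the broker queue is "full".
    # Assumes RabbitMQ and the default queue name 'celery' (assumptions).
    import time
    from tasks import app, process_data

    MAX_QUEUE_LENGTH = 1000  # hypothetical threshold

    def queue_length(queue_name='celery'):
        # passive=True only inspects the queue; it does not create it.
        with app.connection_or_acquire() as conn:
            return conn.default_channel.queue_declare(
                queue=queue_name, passive=True).message_count

    while True:
        if queue_length() >= MAX_QUEUE_LENGTH:
            time.sleep(5)  # back off until workers drain the queue
            continue
        # Run the DB query here, then dispatch each fetched row, e.g.:
        # for row in fetched_rows:
        #     process_data.delay(row)

The same idea works with other brokers, but the way you read the queue length differs per transport.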

Celery and SQLAlchemy - This result object does not return rows. It has been closed automatically

Submitted by 空扰寡人 on 2019-11-27 07:39:39
Question: I have a Celery project connected to a MySQL database. One of the tables is defined like this:

    class MyQueues(Base):
        __tablename__ = 'accepted_queues'
        id = sa.Column(sa.Integer, primary_key=True)
        customer = sa.Column(sa.String(length=50), nullable=False)
        accepted = sa.Column(sa.Boolean, default=True, nullable=False)
        denied = sa.Column(sa.Boolean, default=True, nullable=False)

Also, in the settings I have THREADS = 4, and I am stuck in a function in code.py:

    def load_accepted_queues(session,
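
This "result object does not return rows" error frequently points to a single SQLAlchemy session (and its underlying connection) being shared across concurrent threads or tasks, which is a plausible suspect here given THREADS = 4. A minimal sketch of one common remedy, using a thread-local scoped_session so each thread gets its own session; the connection URL and helper name are illustrative assumptions, not taken from the original post.

    # Sketch: give each thread/task its own SQLAlchemy session.
    # The connection URL below is a placeholder (assumption).
    import sqlalchemy as sa
    from sqlalchemy.orm import scoped_session, sessionmaker

    engine = sa.create_engine('mysql+pymysql://user:password@localhost/mydb')
    Session = scoped_session(sessionmaker(bind=engine))

    def load_accepted_queues_safe():
        session = Session()  # thread-local session from the registry
        try:
            return session.query(MyQueues).filter_by(accepted=True).all()
        finally:
            Session.remove()  # release the session owned by this thread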

How can I schedule a Task to execute at a specific time using celery?

Submitted by 余生长醉 on 2019-11-27 06:55:50
I've looked into PeriodicTask, but the examples only cover making it recur. I'm looking for something more like cron's ability to say "execute this task every Monday at 1 a.m."

The recently released version 1.0.3 supports this now, thanks to Patrick Altman! Example:

    from celery.task.schedules import crontab
    from celery.decorators import periodic_task

    @periodic_task(run_every=crontab(hour=7, minute=30, day_of_week="mon"))
    def every_monday_morning():
        print("This runs every Monday morning at 7:30 a.m.")

See the changelog for more information: http://celeryproject.org/docs/changelog.html Use
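
In current Celery releases the decorator-based API shown above has been replaced by beat configuration. A minimal sketch of the equivalent schedule using app.conf.beat_schedule (Celery 4.x style); the module name 'tasks' and the schedule entry name are illustrative assumptions.

    # Sketch: cron-style scheduling with Celery 4.x beat configuration.
    from celery import Celery
    from celery.schedules import crontab

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def every_monday_morning():
        print("This runs every Monday morning at 7:30 a.m.")

    app.conf.beat_schedule = {
        'monday-morning-report': {
            'task': 'tasks.every_monday_morning',  # assumes this file is tasks.py
            'schedule': crontab(hour=7, minute=30, day_of_week='mon'),
        },
    }
    # Run with:  celery -A tasks beat   (plus a worker to execute the task)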

Celery Received unregistered task of type (run example)

Submitted by 夙愿已清 on 2019-11-27 06:42:59
I'm trying to run the example from the Celery documentation. I run:

    celeryd --loglevel=INFO

    /usr/local/lib/python2.7/dist-packages/celery/loaders/default.py:64: NotConfigured: No 'celeryconfig' module found! Please make sure it exists and is available to Python.
      "is available to Python." % (configname, )))
    [2012-03-19 04:26:34,899: WARNING/MainProcess]
     -------------- celery@ubuntu v2.5.1
    ---- **** -----
    --- * ***  * -- [Configuration]
    -- * - **** ---   . broker:  amqp://guest@localhost:5672//
    - ** ----------   . loader:  celery.loaders.default.Loader
    - ** ----------   . logfile: [stderr]@INFO
    - ** ----------   .
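
The NotConfigured warning above means the worker cannot find a celeryconfig module on the Python path, and the "unregistered task" error usually follows because the module defining the tasks is never imported by the worker. A minimal sketch of a celeryconfig.py for that era of Celery (2.x setting names); the module name 'tasks' is an assumption.

    # celeryconfig.py -- sketch, placed next to tasks.py so both are importable
    BROKER_URL = "amqp://guest@localhost:5672//"
    CELERY_RESULT_BACKEND = "amqp"

    # Make sure the module that defines your tasks is imported by the worker;
    # otherwise it reports "Received unregistered task of type ...".
    CELERY_IMPORTS = ("tasks",)

Start celeryd from the directory containing both files (or add that directory to PYTHONPATH) so the loader can find them.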

Celery with Amazon SQS

Submitted by ▼魔方 西西 on 2019-11-27 06:20:38
I want to use Amazon SQS as the broker backend for Celery. There is an SQS transport implementation for Kombu, which Celery depends on. However, there is not enough documentation for using it, so I cannot figure out how to configure SQS with Celery. Has anybody succeeded in configuring SQS with Celery?

I ran into this question several times but still wasn't entirely sure how to set up Celery to work with SQS. It turns out that it is quite easy with the latest versions of Kombu and Celery. As an alternative to the BROKER_URL syntax mentioned in another answer, you can simply set the transport,
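
For reference, here is a minimal sketch of the BROKER_URL-style configuration the answer alludes to, in Celery 4.x form. The credentials and region are placeholders, and the transport options shown are the commonly documented SQS ones; none of this is quoted from the excerpt itself.

    # Sketch: pointing Celery at SQS via the broker URL.
    # AWS credentials and region below are placeholders (assumptions).
    from urllib.parse import quote
    from celery import Celery

    aws_access_key = quote("YOUR_ACCESS_KEY", safe="")
    aws_secret_key = quote("YOUR_SECRET_KEY", safe="")

    app = Celery(
        'tasks',
        broker='sqs://{0}:{1}@'.format(aws_access_key, aws_secret_key),
    )
    app.conf.broker_transport_options = {
        'region': 'us-east-1',
        'visibility_timeout': 3600,      # seconds a task stays invisible after delivery
        'polling_interval': 1,           # seconds between SQS polls
        'queue_name_prefix': 'celery-',  # namespace the SQS queues
    }

URL-quoting the credentials matters because AWS secret keys often contain characters that are not valid in a URL.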

celery doesn't work with global variable

Submitted by China☆狼群 on 2019-11-27 06:07:34
Question:

    from celery import Celery

    app = Celery('tasks', backend='amqp://guest@localhost//', broker='amqp://guest@localhost//')

    a_num = 0

    @app.task
    def addone():
        global a_num
        a_num = a_num + 1
        return a_num

This is the code I used to test Celery. I expected the return value to increase every time I call addone(), but it is always 1. Why?

Results:

    python
    >> from tasks import addone
    >> r = addone.delay()
    >> r.get()
    1
    >> r = addone.delay()
    >> r.get()
    1
    >> r = addone.delay()
    >> r.get()
    1

Answer 1: By default when
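
The counter resets because the default prefork pool runs tasks in separate worker processes (which may also be recycled), so a module-level global is not shared between them. A minimal sketch of one workaround, keeping the counter in Redis instead of process memory; the Redis URL and key name are illustrative assumptions, not part of the original question.

    # Sketch: a counter that survives across worker processes, using Redis.
    # The Redis URL and key name are placeholders (assumptions).
    import redis
    from celery import Celery

    app = Celery('tasks',
                 backend='amqp://guest@localhost//',
                 broker='amqp://guest@localhost//')

    r = redis.Redis.from_url('redis://localhost:6379/0')

    @app.task
    def addone():
        # INCR is atomic on the Redis server, so concurrent workers are safe.
        return r.incr('a_num')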

Parsing Markdown into HTML in Go

Submitted by 情到浓时终转凉″ on 2019-11-27 05:39:33
1. The code

    package main

    import (
        "fmt"
        "github.com/microcosm-cc/bluemonday"
        "github.com/russross/blackfriday"
        "io/ioutil"
        "os"
    )

    func ReadAll(filePth string) ([]byte, error) {
        f, err := os.Open(filePth)
        if err != nil {
            return nil, err
        }
        return ioutil.ReadAll(f)
    }

    func MarkdownToHTML(md string) string {
        myHTMLFlags := 0 |
            blackfriday.HTML_USE_XHTML |
            blackfriday.HTML_USE_SMARTYPANTS |
            blackfriday.HTML_SMARTYPANTS_FRACTIONS |
            blackfriday.HTML_SMARTYPANTS_DASHES |
            blackfriday.HTML_SMARTYPANTS_LATEX_DASHES
        myExtensions := 0 |
            blackfriday.EXTENSION_NO_INTRA_EMPHASIS |
            blackfriday.EXTENSION_TABLES |

Celery Worker Database Connection Pooling

Submitted by て烟熏妆下的殇ゞ on 2019-11-27 05:17:34
Question: I am using Celery standalone (not within Django). I am planning to have one worker task type running on multiple physical machines. The task does the following: accept an XML document, transform it, and make multiple database reads and writes. I'm using PostgreSQL, but this would apply equally to other store types that use connections. In the past, I've used a database connection pool to avoid creating a new database connection on every request or keeping a connection open too long.
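
A common pattern (not quoted from the original post) is to create one SQLAlchemy engine, which owns a connection pool, per worker process, using the worker_process_init signal so pooled sockets are never inherited across the prefork fork(). A minimal sketch under those assumptions; the database URL, pool sizes, table, and task name are illustrative.

    # Sketch: one connection pool per Celery worker process.
    # The database URL, pool sizes, and SQL are placeholders (assumptions).
    from celery import Celery
    from celery.signals import worker_process_init
    from sqlalchemy import create_engine, text

    app = Celery('tasks', broker='amqp://guest@localhost//')
    engine = None

    @worker_process_init.connect
    def init_db_pool(**kwargs):
        # Build the engine after the fork so child processes get their own pool.
        global engine
        engine = create_engine(
            'postgresql+psycopg2://user:password@dbhost/mydb',
            pool_size=5, max_overflow=2, pool_pre_ping=True)

    @app.task
    def process_document(xml_doc):
        # engine.begin() borrows a pooled connection and commits on success.
        with engine.begin() as conn:
            conn.execute(text("INSERT INTO documents (payload) VALUES (:p)"),
                         {"p": xml_doc})

Because each physical machine runs its own worker processes, each process keeps a small pool of warm connections instead of opening one per task.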

A fix for Celery ValueError: not enough values to unpack (expected 3, got 0)

Submitted by 隐身守侯 on 2019-11-27 05:06:38
Recently, while using the Celery task queue for a project, I ran into the error in the title. I eventually found the fix on GitHub and am writing it down here.

Environment: Windows 10 + Python 3 + redis 2.10.6 + celery 4.2.1

Running Celery 4.x on Windows 10 triggers this problem: the task queue starts up normally (the "ready" message is shown), but as soon as the worker accepts a task it raises ValueError: not enough values to unpack (expected 3, got 0). The fix is as follows:

1. First install the eventlet extension:

    pip install eventlet

2. Then add the -P eventlet option when starting the worker:

    celery -A <mymodule> worker -l info -P eventlet

These two steps completely solved the problem; the underlying cause remains to be investigated.

Source: https://www.cnblogs.com/tjp40922/p/11345529.html

Django Celery Logging Best Practice

Submitted by 半腔热情 on 2019-11-27 04:12:54
Question: I'm trying to get Celery logging working with Django. I have logging set up in settings.py to go to the console (that works fine, as I'm hosting on Heroku). At the top of each module, I have:

    import logging
    logger = logging.getLogger(__name__)

And in my tasks.py, I have:

    from celery.utils.log import get_task_logger
    logger = get_task_logger(__name__)

That works fine for logging calls from a task, and I get output like this:

    2012-11-13T18:05:38+00:00 app[worker.1]: [2012-11-13 18:05:38,527: INFO
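
One frequently recommended pattern (an assumption here, not quoted from this excerpt) is to stop Celery from hijacking the root logger and apply your own dictConfig via the setup_logging signal, so the worker, beat, and Django all use one logging configuration. A minimal sketch with a console-only config, since the question targets Heroku's stdout/stderr logging:

    # Sketch: let your own logging config win over Celery's defaults.
    import logging.config
    from celery import Celery
    from celery.signals import setup_logging

    app = Celery('proj', broker='amqp://guest@localhost//')
    app.conf.worker_hijack_root_logger = False  # keep Celery off the root logger

    @setup_logging.connect
    def configure_logging(**kwargs):
        # Connecting this signal makes Celery skip its own logging setup.
        logging.config.dictConfig({
            'version': 1,
            'disable_existing_loggers': False,
            'handlers': {'console': {'class': 'logging.StreamHandler'}},
            'root': {'handlers': ['console'], 'level': 'INFO'},
        })

With this in place, both logging.getLogger(__name__) in regular modules and get_task_logger(__name__) in tasks end up writing to the console handler.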