celery

Delay sending an email using Mandrill send_at or Celery countdown/eta

Submitted by 我怕爱的太早我们不能终老 on 2020-01-02 08:11:06
Question: I commonly send transactional emails in response to certain actions on my website, and I delay sending some of them by a couple of hours. The function that actually queues the email is a Celery task called with .delay(), which eventually makes an API call to Mandrill using djrill. I discovered that Mandrill offers a send_at parameter that makes Mandrill hold the email until the specified time. Celery also offers eta or countdown parameters when calling
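Both delay mechanisms mentioned here can be expressed in a few lines. A minimal sketch, assuming a Celery task named send_transactional_email and a djrill-backed Django EmailMessage (the task name, recipient, and two-hour delay are illustrative, not from the question; djrill passing a send_at attribute through to Mandrill is an assumption to verify against the djrill docs):

from datetime import datetime, timedelta
from django.core.mail import EmailMessage
from myapp.tasks import send_transactional_email  # hypothetical task module

# Option 1: let Celery hold the task. countdown is relative seconds, eta an absolute datetime.
send_transactional_email.apply_async(args=["user@example.com"], countdown=2 * 60 * 60)
send_transactional_email.apply_async(args=["user@example.com"],
                                     eta=datetime.utcnow() + timedelta(hours=2))

# Option 2: queue the task immediately and let Mandrill hold the email via send_at.
msg = EmailMessage("Subject", "Body", to=["user@example.com"])
msg.send_at = datetime.utcnow() + timedelta(hours=2)  # assumed djrill pass-through option
msg.send()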

Can I use Python requests with celery?

Submitted by 折月煮酒 on 2020-01-02 07:59:06
Question: I have the following defined in a Celery module named tasks.py, with the requests library imported: @celery.task def geturl(url): res = requests.get(url) return res.content Whenever I call the task (either from tasks.py or the REPL) with: res = geturl.delay('http://www.google.com') print res.get() Here are the log entries on the celery server: [2012-12-19 18:49:58,400: INFO/MainProcess] Starting new HTTP connection (1): www.google.com [2012-12-19 18:49:58,594: INFO/MainProcess] Starting new HTTP
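For reference, the snippet quoted above reassembled into a self-contained tasks.py (a sketch only; the question does not show the broker or backend URLs, so the Redis URLs below are assumptions, and a result backend is needed for res.get() to return anything):

# tasks.py
import requests
from celery import Celery

celery = Celery('tasks',
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/0')

@celery.task
def geturl(url):
    res = requests.get(url)
    return res.content

# Caller, e.g. from the REPL (Python 2 syntax to match the question):
#   res = geturl.delay('http://www.google.com')
#   print res.get(timeout=10)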

Pickle is refusing to serialize content with celery reporting ContentDisallowed: Refusing to deserialize untrusted content of type pickle

Submitted by 丶灬走出姿态 on 2020-01-02 00:46:06
Question: I am trying to put some Python objects, which are mostly JSON-serializable except for datetime.datetime, onto a RabbitMQ queue, so I am using pickle to serialize them. celery_config file: CELERY_TASK_SERIALIZER = 'pickle' CELERY_RESULT_SERIALIZER = 'pickle' It throws an exception: File "/usr/local/lib/python2.7/dist-packages/kombu/serialization.py", line 174, in loads raise self._for_untrusted_content(content_type, 'untrusted') ContentDisallowed: Refusing to deserialize untrusted content of type pickle
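The exception is Celery's default security behaviour: since Celery 3.1, workers only accept JSON content unless other serializers are explicitly whitelisted. A minimal sketch of the config file above with the missing setting added:

# celery_config.py
CELERY_TASK_SERIALIZER = 'pickle'
CELERY_RESULT_SERIALIZER = 'pickle'
# Without this, the worker rejects pickled payloads with ContentDisallowed.
CELERY_ACCEPT_CONTENT = ['pickle', 'json']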

Celery: stuck in infinitely repeating timeouts (Timed out waiting for UP message)

Submitted by ≡放荡痞女 on 2020-01-01 16:09:12
Question: I defined some tasks with a time limit of 1200: @celery.task(time_limit=1200) def create_ne_list(text): c = Client() return c.create_ne_list(text) I'm also using the worker_process_init signal to do some initialization each time a new process starts: @worker_process_init.connect def init(sender=None, conf=None, **kwargs): init_system(celery.conf) init_pdf(celery.conf) This initialization function takes several seconds to execute. Besides that, I'm using the following configuration: CELERY
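The "Timed out waiting for UP message" error is raised when a newly forked pool process does not report back to the parent quickly enough, which a slow worker_process_init handler can easily cause (newer Celery versions expose a worker_proc_alive_timeout setting for this; availability depends on the version in use). One possible workaround, sketched under the assumption that init_system/init_pdf can safely run lazily, is to keep process start-up fast and defer the expensive setup to the first task call:

_initialized = False

def ensure_initialized(conf):
    # run the slow setup once per worker process, on first use
    global _initialized
    if not _initialized:
        init_system(conf)
        init_pdf(conf)
        _initialized = True

@celery.task(time_limit=1200)
def create_ne_list(text):
    ensure_initialized(celery.conf)
    c = Client()
    return c.create_ne_list(text)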

Celery ------- Periodic Tasks

Submitted by 不想你离开。 on 2020-01-01 14:24:32
Building on the example in the project directory, modify the celery file:

from celery import Celery
from celery.schedules import crontab

celery_task = Celery("task",
                     broker="redis://127.0.0.1:6379",
                     backend="redis://127.0.0.1:6379",
                     include=["Celery_task.task_1", "Celery_task.task_2"])

# Configure beat task production: run the Celery_task.task_1 task every 10 seconds
# with arguments (10, 10). PS: we don't actually use the arguments here; it's just an example.
celery_task.conf.beat_schedule = {
    "each10s_task": {
        "task": "Celery_task.task_1.func1",
        "schedule": 10,  # run every 10 seconds
        "args": (10, 10)
    },
    "each1m_task": {
        "task": "Celery_task.task_1.func1",
        "schedule": crontab(minute=1),  # run at minute 1 of every hour
        "args": (10, 10)
    },
    "each24hours_task": {
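As a side note on the schedules used above (these examples are illustrative, not from the original post): a plain number is interpreted as seconds, while crontab(minute=1) fires at minute 1 of every hour, not once per minute. Both the beat scheduler and a worker must be running for the entries to execute.

from celery.schedules import crontab

every_10_seconds   = 10                        # plain numbers are seconds
every_minute       = crontab()                 # default crontab: every minute
hourly_at_minute_1 = crontab(minute=1)         # at xx:01 each hour
daily_at_4am       = crontab(minute=0, hour=4) # once a day at 04:00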

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

Submitted by 我的梦境 on 2020-01-01 10:52:08
Question: I am trying to read meta info from a Celery task in case of a timeout (if the task is not finished in the given time). I have 3 Celery workers. When I execute tasks on the 3 workers serially, my timeout logic (getting meta info from the Redis backend) works fine. But when I execute tasks in parallel using threads, I get the error 'AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for''. Main script: from threading import Thread from util.tasks import app from celery.exceptions import
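The DisabledBackend error usually means the AsyncResult was built from a Celery app that has no result backend configured, so the fix is to configure the backend on the app and always construct results through that app. A minimal sketch, assuming util/tasks.py defines the app and Redis is the backend (the URLs are assumptions, not from the question):

# util/tasks.py
from celery import Celery

app = Celery('util.tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')  # omit backend= and you get DisabledBackend

# main script (Python 2 syntax to match the question)
task_id = '...'                  # id returned by an earlier .delay()/apply_async() call
res = app.AsyncResult(task_id)   # built through the configured app, also works across threads
print res.state
print res.get(timeout=30)        # raises celery.exceptions.TimeoutError if not done in time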

Retrieve result from 'task_id' in Celery from unknown task

Submitted by 我与影子孤独终老i on 2020-01-01 09:08:39
Question: How do I pull the result of a task if I do not know beforehand which task was performed? Here's the setup. Given the following source ('tasks.py'): from celery import Celery app = Celery('tasks', backend="db+mysql://u:p@localhost/db", broker = 'amqp://guest:guest@localhost:5672//') @app.task def add(x,y): return x + y @app.task def mul(x,y): return x * y with RabbitMQ 3.3.2 running locally: marcs-mbp:sbin marcstreeter$ ./rabbitmq-server RabbitMQ 3.3.2. Copyright (C) 2007-2014 GoPivotal, Inc. #
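Given only a task id, an AsyncResult can be constructed without knowing whether add or mul produced it. A short sketch against the tasks.py shown above (the task_id placeholder stands in for an id saved when the task was queued):

from celery.result import AsyncResult
from tasks import app

task_id = '...'  # saved earlier, e.g. from add.delay(2, 3).id

res = AsyncResult(task_id, app=app)   # equivalently: app.AsyncResult(task_id)
print(res.state)                      # PENDING / STARTED / SUCCESS / FAILURE
if res.ready():
    print(res.get())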

celery: daemonic processes are not allowed to have children

Submitted by て烟熏妆下的殇ゞ on 2020-01-01 08:17:52
Question: In Python (2.7) I am trying to create processes (with multiprocessing) in a Celery task (celery 3.1.17), but it gives the error: daemonic processes are not allowed to have children. Googling it, I found that the most recent versions of billiard fix the "bug", but I have the most recent version (3.3.0.20) and the error still happens. I also tried to implement this workaround in my Celery task, but it gives the same error. Does anybody know how to do it? Any help is appreciated, Patrick. EDIT: snippets
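One commonly suggested workaround (only a sketch here, not necessarily what the asker settled on) is to avoid forking inside the pool worker altogether and parallelise with threads, which daemonic processes are allowed to create; on Python 2.7 this requires the futures backport of concurrent.futures:

from concurrent.futures import ThreadPoolExecutor  # pip install futures on Python 2.7

@celery.task
def process_many(items):
    # placeholder for the real per-item work the child processes were doing
    def work(item):
        return item * 2

    # daemonic pool workers cannot fork children, but they may start threads
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(work, items))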

Running Django-Celery in Production

Submitted by 假装没事ソ on 2020-01-01 04:23:44
Question: I've built a Django web application and some Django-Piston services. Using a web interface, a user submits some data, which is POSTed to a web service, and that web service in turn uses django-celery to start a background task. Everything works fine in the development environment using manage.py. Now I'm trying to move this to production on a proper Apache server. The web application and web services work fine in production, but I'm having serious issues starting celeryd as a daemon. Based on
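Since the question is cut off, only the Django-side configuration can be sketched here; the worker itself is normally daemonized with Celery's generic init script plus an /etc/default/celeryd file rather than through Apache. A minimal django-celery settings fragment (the broker URL and app layout are assumptions):

# settings.py
import djcelery
djcelery.setup_loader()

BROKER_URL = 'amqp://guest:guest@localhost:5672//'

INSTALLED_APPS = (
    # ... project apps ...
    'djcelery',
)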