celery

How to properly configure djcelery results backend to database

Submitted by a 夏天 on 2019-12-03 07:04:40
Question: I'm trying to set up django-celery to store task results in the database. I set CELERY_RESULT_BACKEND = 'djcelery.backends.database.DatabaseBackend', then synced and migrated the database (no errors). Celery is working and tasks get processed (I can retrieve the results), but the admin shows no tasks. The database contains two tables, celery_taskmeta and djcelery_taskmeta: the first holds the results, while the second is the one displayed in the admin. Does anyone have insight into how to configure this properly? Answer 1: Check …
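A hedged sketch of the configuration the answer points toward: with django-celery's loader active, the short backend name routes results into djcelery_taskmeta, the table the admin reads. Verify the names against your installed djcelery version.

```python
# settings.py sketch (django-celery / Celery 3.x era; names hedged).
# With the djcelery loader installed, e.g.
#
#   import djcelery
#   djcelery.setup_loader()
#
# the short backend name below resolves to
# djcelery.backends.database.DatabaseBackend, which writes results into
# djcelery_taskmeta (the table the Django admin displays) instead of
# plain Celery's celery_taskmeta.
CELERY_RESULT_BACKEND = "database"
```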

Simulating the passing of time in unittesting

Submitted by 南楼画角 on 2019-12-03 06:57:54
Question: I've built a paywalled CMS + invoicing system for a client and I need to get more stringent with my testing. I keep all my data in the Django ORM and have a bunch of Celery tasks that run at different intervals, make sure new invoices and invoice reminders get sent, and cut off access when users don't pay their invoices. For example, I'd like to be able to run a test that: creates a new user and generates an invoice for X days of access to the site; simulates the passing of X + 1 days, …
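One common approach is to make the billing check read an injectable clock, so a test can jump to any date without actually waiting. The sketch below is illustrative: `invoice_is_overdue` is a hypothetical stand-in for the app's real logic, not code from the question.

```python
import datetime as dt

def invoice_is_overdue(issued_on, days_of_access, today=None):
    """Hypothetical overdue check: True once `today` is past the paid window."""
    today = today or dt.date.today()
    return today > issued_on + dt.timedelta(days=days_of_access)

issued = dt.date(2019, 1, 1)   # invoice for X = 10 days of access
# Day X: still inside the access window, nothing should be cut off.
assert not invoice_is_overdue(issued, 10, today=dt.date(2019, 1, 11))
# Day X + 1: the reminder / cut-off task should now fire.
assert invoice_is_overdue(issued, 10, today=dt.date(2019, 1, 12))
```

Libraries such as freezegun take the other route and patch `datetime` globally during the test, which avoids threading a `today` argument through the code.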

Is Celery as efficient on a local system as python multiprocessing is?

Submitted by 耗尽温柔 on 2019-12-03 06:55:34
Question: I'm having a bit of trouble deciding whether to use Python multiprocessing, Celery, or pp for my application. My app is very CPU-heavy but currently uses only one CPU, so I need to spread the work across all available CPUs (which led me to look at Python's multiprocessing library), but I've read that this library doesn't scale to other machines if required. Right now I'm not sure whether I'll need more than one server to run my code, but I'm thinking of running Celery locally, and then scaling would only …
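For the single-machine case the question starts from, the standard library alone can spread a CPU-bound function across every local core; a minimal sketch, with a toy placeholder for the real workload:

```python
from multiprocessing import Pool, cpu_count

def cpu_heavy(n):
    # toy stand-in for the real CPU-bound work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # one worker process per core; Pool.map fans the inputs out across them
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(cpu_heavy, [100_000] * cpu_count())
```

Celery's advantage begins where this stops: workers only need to reach the broker, so the same task code can later run on other machines, at the cost of running a broker even for purely local use.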

celery-django can't find settings

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-03 06:22:31
I have a Django project that uses Celery for running asynchronous tasks. I'm doing my development on a Windows XP machine. Starting my Django server (python manage.py runserver 80) works fine, but attempting to start the Celery daemon (python manage.py celeryd start) fails with the following error: ImportError: Could not import settings 'src.settings' (Is it on sys.path? Does it have syntax errors?): No module named src.settings. sys.path includes 'C:\development\SpaceCorps\src', so I'm not sure why it can't find this module. Here's the full output from starting the daemon: C:\development …
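A hedged sketch of the usual diagnosis: if C:\development\SpaceCorps\src is on sys.path, the module is importable as plain `settings`, whereas the dotted name `src.settings` needs the parent directory on sys.path (and an `src\__init__.py`). The paths below mirror the question and are illustrative only.

```python
import os
import sys

# Either make the dotted name resolvable by putting the PARENT of src\
# on sys.path (src\ must also contain an __init__.py) ...
sys.path.insert(0, r"C:\development\SpaceCorps")
os.environ["DJANGO_SETTINGS_MODULE"] = "src.settings"

# ... or keep src\ itself on sys.path and use the undotted module name:
# os.environ["DJANGO_SETTINGS_MODULE"] = "settings"
```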

Celery raises ValueError: not enough values to unpack

Submitted by 谁说胖子不能爱 on 2019-12-03 06:20:53
Question: Trying to run a simple example with Celery, I receive an exception. RabbitMQ is started in Docker (I also tried starting it locally), and Celery runs on a local Windows host: from celery import Celery app = Celery('tasks', broker='amqp://192.168.99.100:32774') @app.task() def hello(): print('hello') if __name__ == '__main__': hello.delay() Excerpt of my error text: [2017-08-18 00:01:08,628: INFO/MainProcess] Received task: tasks.hello[8d33dbea-c5d9-4938-ab1d-0646eb1a3858] [2017-08-18 00:01:08,632: …
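This combination, a Celery 4.x worker on Windows, has two commonly reported workarounds; both are hedged sketches to verify against your Celery version, since the default prefork pool does not fully support Windows.

```python
# Workaround 1 -- start the worker with a single-process pool:
#
#   celery -A tasks worker --loglevel=info --pool=solo
#
# Workaround 2 -- set this environment flag before the worker starts, so
# billiard (Celery's multiprocessing fork) behaves as if it were forked:
import os

os.environ.setdefault("FORKED_BY_MULTIPROCESSING", "1")
```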

Python Celery versus Threading Library for running async requests [closed]

Submitted by 醉酒当歌 on 2019-12-03 06:10:32
Closed: this question is opinion-based and is not currently accepting answers. I am running a Python method that parses a lot of data. Since it is time-intensive, I would like to run it asynchronously on a separate thread so the user can still access the website/UI. Do threads created with the threading module ("from threading import Thread") terminate if a user exits the site, or do they continue to run on the server? What would be the advantages of using Celery versus simply …
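On the threading half of the question: a thread started with `threading.Thread` lives inside the server process, not the user's session, so it keeps running after the browser leaves and dies only when the process does. A small runnable illustration:

```python
import threading

results = []

def parse_lots_of_data(chunk):
    # stand-in for the time-intensive parsing
    results.append(chunk * 2)

# daemon=True only means the thread will not keep the *process* alive at
# exit; it has no connection to whether the *user* stays on the site.
t = threading.Thread(target=parse_lots_of_data, args=(21,), daemon=True)
t.start()
t.join()   # a real view would return immediately instead of joining
```

Celery's advantages start where threads stop: queued work sits in the broker and survives server restarts, can run on other machines, and gets retry and result bookkeeping for free.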

Recover from task failed beyond max_retries

Submitted by 江枫思渺然 on 2019-12-03 05:46:30
Question: I am attempting to consume a web service asynchronously because it takes up to 45 seconds to return. Unfortunately, this web service is also somewhat unreliable and can throw errors. I have set up django-celery and have my tasks executing, which works fine until the task fails beyond max_retries. Here is what I have so far: @task(default_retry_delay=5, max_retries=10) def request(xml): try: server = Client('https://www.whatever.net/RealTimeService.asmx?wsdl') xml = server.service …
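The usual shape of the answer is: let the final `retry()` raise `celery.exceptions.MaxRetriesExceededError` and catch it to run a recovery path. A library-free sketch of that control flow, with hypothetical helper names:

```python
def call_with_recovery(fn, arg, max_retries=10, on_give_up=lambda a: "gave-up"):
    """Retry fn(arg) up to max_retries extra times, then recover instead of crashing.

    In django-celery the equivalent is calling retry(exc=exc) inside the
    task and catching celery.exceptions.MaxRetriesExceededError around it.
    """
    for _ in range(max_retries + 1):
        try:
            return fn(arg)
        except Exception:
            continue
    return on_give_up(arg)   # the "failed beyond max_retries" hook

calls = {"n": 0}

def flaky_service(xml):
    # fails twice, then succeeds -- mimics the unreliable web service
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("service error")
    return "parsed:" + xml

assert call_with_recovery(flaky_service, "<q/>") == "parsed:<q/>"
```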

How to disallow pickle serialization in celery

Submitted by 倾然丶 夕夏残阳落幕 on 2019-12-03 05:26:45
Question: Celery defaults to using pickle as its serialization method for tasks. As noted in the FAQ, this represents a security hole. Celery allows you to configure how tasks get serialized using the CELERY_TASK_SERIALIZER configuration parameter. But this doesn't solve the security problem: even if tasks are serialized with JSON or similar, the workers will still execute tasks inserted into the queue with pickle serialization, since they just respond to the content-type parameter in the message. So …
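The setting that actually closes this hole is the accept list, which controls what a worker will deserialize at all (Celery 3.x names shown; newer versions use the lowercase forms such as `accept_content`):

```python
# settings sketch: serialize with JSON *and* refuse to deserialize pickle.
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
# Workers reject any message whose content-type is not listed here, so a
# pickle-encoded task injected into the queue is refused, not executed.
CELERY_ACCEPT_CONTENT = ["json"]
```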

Examples of Django and Celery: Periodic Tasks

Submitted by 我的梦境 on 2019-12-03 05:08:36
Question: I have been fighting the Django/Celery documentation for a while now and need some help. I would like to be able to run periodic tasks using django-celery. I have seen several different formats and schemas around the internet (and in the documentation) for how one should go about achieving this with Celery... Can someone help with a basic, functioning example of the creation, registration, and execution of a django-celery periodic task? In particular, I want to know whether I should write a task …
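One of the simplest of those schemas is a schedule declared in settings (django-celery / Celery 3.x style shown; the dotted task path is illustrative, not from the question):

```python
# settings.py sketch: run myapp.tasks.send_reminders once an hour. The
# task itself is an ordinary @task-decorated function; no registration is
# needed beyond the module being importable.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    "send-reminders-every-hour": {
        "task": "myapp.tasks.send_reminders",   # hypothetical task path
        "schedule": timedelta(hours=1),
    },
}
# The beat scheduler must run alongside the worker, e.g. with django-celery:
#   python manage.py celeryd -B    (worker and beat in one process)
```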

Starting Celery: AttributeError: 'module' object has no attribute 'celery'

Submitted by 女生的网名这么多〃 on 2019-12-03 05:08:06
I'm trying to start a Celery worker server from the command line: celery -A tasks worker --loglevel=info The code in tasks.py: import os os.environ['DJANGO_SETTINGS_MODULE'] = "proj.settings" from celery import task @task() def add_photos_task( lad_id ): ... I get the following error: Traceback (most recent call last): File "/usr/local/bin/celery", line 8, in <module> load_entry_point('celery==3.0.12', 'console_scripts', 'celery')() File "/usr/local/lib/python2.7/site-packages/celery-3.0.12-py2.7.egg/celery/__main__.py", line 14, in main main() File "/usr/local/lib/python2.7/site-packages/celery-3.0.12 …
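The error message hints at the likely cause: with Celery 3.0, `celery -A tasks` imports the tasks module and looks for an application attribute (historically one literally named `celery`), and the decorator-only module above never creates one. A hedged sketch of a tasks.py that does:

```python
# tasks.py sketch for Celery 3.0.x (names hedged -- verify against your
# version). The old bare @task decorator from `from celery import task`
# does not create the application object that `celery -A tasks` looks for.
import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

from celery import Celery

celery = Celery("tasks")   # the attribute `-A tasks` resolves

@celery.task()
def add_photos_task(lad_id):
    ...
```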