django-celery

How to debug “could not receive data from client: Connection reset by peer”

Submitted by 非 Y 不嫁゛ on 2021-02-18 05:12:25
Question: I'm running a django-celery application on Ubuntu 12.04. When I run a Celery task from my web interface, I get the following error, taken from the postgresql-9.3 log file (maximum log level): 2013-11-12 13:57:01 GMT tss_usr 8113 LOG: could not receive data from client: Connection reset by peer. tss_usr is the PostgreSQL user of the Django application database and (in this example) 8113 is, I guess, the PID of the process that killed the connection. Have you got any idea why this happens or at
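One common source of this log line with django-celery is a forked worker process inheriting (and later abandoning) the parent's PostgreSQL connection, so the server sees the socket reset mid-stream. A minimal `settings.py` sketch, assuming the fix is simply not to keep connections alive across requests/tasks; the database name, password, host, and port are placeholders, and only `CONN_MAX_AGE` is the point of the example:

```python
# settings.py fragment -- a sketch, not the asker's actual config.
# NAME/PASSWORD/HOST/PORT are placeholders; tss_usr is the user from the log.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "tss_db",        # placeholder
        "USER": "tss_usr",
        "PASSWORD": "secret",    # placeholder
        "HOST": "localhost",
        "PORT": "5432",
        # 0 = close the connection at the end of each request/task, so a
        # worker never reuses a connection whose peer already tore it down.
        "CONN_MAX_AGE": 0,
    }
}
```

With persistent connections disabled, each worker opens a fresh connection per task, at the cost of a little connection-setup overhead.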

Django-Celery in production?

Submitted by 风格不统一 on 2021-02-16 13:38:28
Question: So I've been trying to figure out how to make scheduled tasks. I found Celery and have been able to make simple scheduled tasks. To do this I need to open a command line and run celery -A proj beat for the tasks to happen. This works fine in a development environment, but putting it into production will be an issue. So how can I get Celery to work without using the command line? When my production server is online, how can I make sure my scheduler comes up with it? Can Celery do
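The usual production answer is to hand the `celery -A proj beat` command to a process supervisor so it starts at boot and restarts on failure. A sketch of a systemd unit, where the paths, the `www-data` user, and the `proj` app name are assumptions to adapt:

```ini
# /etc/systemd/system/celerybeat.service -- sketch; adjust user and paths.
[Unit]
Description=Celery beat scheduler
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/srv/myproject
ExecStart=/srv/myproject/venv/bin/celery -A proj beat -l info
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

`systemctl enable --now celerybeat` then brings the scheduler up with the server, with no interactive command line involved.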

Running celery as daemon does not create PID file (no permission issue)

Submitted by 可紊 on 2021-02-11 16:34:44
Question: I am trying to run celery (worker) as a daemon / service on an Ubuntu server. I've followed their documentation (https://docs.celeryproject.org/en/stable/userguide/daemonizing.html). However, when I start the daemon it says: celery multi v5.0.4 (singularity) > Starting nodes... > worker1@ubuntuserver: OK. But when I check the status it says: celery init v10.1. Using config script: /etc/default/celeryd celeryd down: no pidfiles found. I've seen some info on the internet about permissions, but not sure
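A frequent cause of "no pidfiles found" is that the PID file path in `/etc/default/celeryd` points at a directory the worker's user cannot write (or that does not exist after a reboot). A sketch of the config file, assuming the standard init-script variables from the Celery daemonizing guide; the paths and node name are assumptions:

```shell
# /etc/default/celeryd -- sketch; paths, app name and node name are assumptions.
CELERYD_NODES="worker1"
CELERY_BIN="/usr/local/bin/celery"
CELERY_APP="proj"
# Put PID/log files somewhere the unprivileged celery user can write;
# %n expands to the node name.
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# Ask the init script to create the directories above if they are missing.
CELERY_CREATE_DIRS=1
```

Checking that the init script and `celery multi` agree on the same `CELERYD_PID_FILE` path is what lets `status` find the PID files that `start` wrote.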

celery consume send_task response

Submitted by 流过昼夜 on 2021-02-11 15:19:10
Question: In a Django application I need to call an external RabbitMQ, running on a Windows server and using some application there, where the Django app runs on a Linux server. I'm currently able to add a task to the queue by using celery's send_task: app.send_task('tasks', kwargs=self.get_input(), queue=Queue('queue_async', durable=False)). My settings look like: CELERY_BROKER_URL = CELERY_CONFIG['broker_url'] BROKER_TRANSPORT_OPTIONS = {"max_retries": 3, "interval_start": 0, "interval_step": 0.2,
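`send_task` returns an `AsyncResult`, but consuming the response requires a result backend to be configured; without one, `result.get()` has nowhere to read from. A settings sketch, where the choice of the `rpc://` backend (which reuses the existing RabbitMQ broker) is an assumption:

```python
# Settings fragment (sketch): a result backend must be configured before the
# caller can read anything back; "rpc://" piggybacks on the RabbitMQ broker.
CELERY_RESULT_BACKEND = "rpc://"

# Caller-side usage (sketch, mirroring the send_task call from the question):
#   result = app.send_task('tasks', kwargs=self.get_input(),
#                          queue=Queue('queue_async', durable=False))
#   value = result.get(timeout=30)  # blocks until the remote worker replies
```

The worker on the Windows side must also not run the task with `ignore_result`, or nothing is ever published back.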

Celery doesn't process the task request in a single hit?

Submitted by ◇◆丶佛笑我妖孽 on 2021-02-11 14:17:58
Question: I have set up a Django project with Celery and Redis. I am trying to send an OTP to a mobile number. The problem is that whenever I try to execute the task by running sms_queue_processor.delay(phone, message), the Celery worker doesn't receive it. I tried executing the same from the shell, but it doesn't receive it at all. When I executed the statement from the shell twice in rapid succession, the Celery worker received the task and I was able to receive the SMS. This is something weird and can't be
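When `.delay()` appears to publish into the void, the first thing to rule out is a broker/queue mismatch: the caller and the worker must agree on both. A settings sketch, where the Redis URL and queue name are assumptions:

```python
# Settings fragment (sketch; URL and queue name are assumptions): the caller
# and the worker must point at the same broker database and the worker must
# actually consume the queue that .delay() publishes to.
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_TASK_DEFAULT_QUEUE = "celery"
```

From there, `celery -A proj inspect registered` confirms the worker knows the task by the exact name the caller sends, and starting the worker with `-Q celery` pins it to the queue above.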

How to run Django's “python manage.py runserver”, Celery's “celery -A app_name worker -l info” and redis-server in one command

Submitted by 老子叫甜甜 on 2021-02-11 13:42:10
Question: I have recently started with Django and began a small project. I've been using Celery with a Redis worker, and every time I want to use Celery and Redis I have to run the Celery and Redis servers and then the Django server, which is a bit of a lengthy process. I have two questions: 1. Am I doing the right thing by running the servers every time, or is there another, correct method for this process? 2. If I'm in the right direction, is there any method to do this? I tried circus.ini, but it did not work. Answer 1
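A lightweight alternative to circus for development is a Procfile run by a process manager such as honcho (`pip install honcho`). A sketch, where `app_name` is taken from the title above and everything is assumed to be on the PATH of one virtualenv:

```
redis: redis-server
worker: celery -A app_name worker -l info
web: python manage.py runserver
```

A single `honcho start` in the project directory then launches all three processes with interleaved, labelled output, and Ctrl-C stops them together.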

Celery tasks profiling

Submitted by 我只是一个虾纸丫 on 2021-02-07 06:26:25
Question: As I can see in the top utility, Celery processes consume a lot of CPU time, so I want to profile them. I can do it manually on a developer machine like so: python -m cProfile -o test-`date +%Y-%m-%d-%T`.prof ./manage.py celeryd -B. But to have accurate timings I need to profile it on the production machine. On that machine (Fedora 14) Celery is launched by init scripts, e.g. service celeryd start. I have figured out that these scripts eventually call manage.py celeryd_multi. So my question is how can
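Rather than wrapping the whole `celeryd_multi` launch in cProfile, one alternative is to profile individual tasks from inside the already-daemonized worker. A self-contained sketch using only the standard library; `heavy_task` is a hypothetical stand-in for a real task body:

```python
import cProfile
import functools
import os
import tempfile
import time

def profiled(func):
    """Wrap a callable with cProfile and dump stats to a per-call file,
    so daemonized workers can be profiled without touching init scripts."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        prof = cProfile.Profile()
        try:
            return prof.runcall(func, *args, **kwargs)
        finally:
            stamp = time.strftime("%Y-%m-%d-%H%M%S")
            prof.dump_stats(os.path.join(tempfile.gettempdir(),
                                         f"{func.__name__}-{stamp}.prof"))
    return wrapper

@profiled
def heavy_task(n):
    # Hypothetical stand-in for a CPU-heavy Celery task body.
    return sum(i * i for i in range(n))
```

The resulting `.prof` files can be read afterwards with the stdlib `pstats` module (`python -m pstats /tmp/heavy_task-....prof`), mirroring what the manual `python -m cProfile -o ...` invocation produces per process.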
