upstart

Upstart node.js working directory

て烟熏妆下的殇ゞ submitted on 2019-11-29 17:20:53
Question: When starting Node.js with Upstart, the Node.js process cannot access files without using their full paths. I need it to use the working directory.

start on startup
stop on shutdown

script
    echo $$ > /var/run/mynodeapp.pid
    exec sudo -u mynodeapp node server.js >> /var/log/mynodeapp.sys.log 2>&1
end script

pre-start script
    echo "Starting" >> /var/log/mynodeapp.sys.log
end script

pre-stop script
    rm /var/run/mynodeapp.pid
    echo "Stopping" >> /var/log/mynodeapp.sys.log
end script
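One hedged fix, assuming the application lives in a directory such as /home/mynodeapp/app (a hypothetical path): change into that directory inside the script block before exec'ing node, so relative paths in server.js resolve against it. Upstart 1.4+ also offers a chdir stanza for the same purpose.

    # Sketch only: /home/mynodeapp/app is a placeholder for the real app directory.
    script
        cd /home/mynodeapp/app
        echo $$ > /var/run/mynodeapp.pid
        exec sudo -u mynodeapp node server.js >> /var/log/mynodeapp.sys.log 2>&1
    end script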

running node.js server using upstart causes 'terminated with status 127' on 'ubuntu 10.04'

时光怂恿深爱的人放手 submitted on 2019-11-29 14:04:37
I have written an Upstart script for Ubuntu to launch my Node.js server, either manually or on startup, but it always terminates with status 127 and I can't find more information about what is going wrong. If I execute the server manually it works, and it also works on Ubuntu 12.10; it only fails on Ubuntu 10.04, which is the production server I'm using. The script:

description ""
author ""

start on started mountall
stop on shutdown

respawn
respawn limit 20 5

# Max open files are at 1024 by default. A bit few.
limit nofile 32768 32768

env HOME=/home/projects/<project_name>
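Exit status 127 conventionally means "command not found", so one plausible cause is that the node binary is not on the PATH that init uses on 10.04. A minimal hedged sketch of the launch stanza, assuming node was installed under /usr/local/bin (adjust to whatever `which node` reports on that machine):

    script
        cd $HOME
        # Use an absolute path so the job does not depend on init's PATH.
        exec /usr/local/bin/node server.js >> /var/log/node-server.log 2>&1
    end script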

How do I tell celery worker to stop accepting tasks? How can I check if any celery worker tasks are running?

帅比萌擦擦* submitted on 2019-11-29 12:47:09
The scenario: a system running on a server, consisting of a Python/Flask web application and background tasks using Celery. Both the web application and the Celery workers run as upstart jobs (the web app behind Nginx). Deployment to production is done with a script that:

    Stops the upstart jobs
    Pushes code to the server
    Runs any db migrations
    Starts the upstart jobs

How can I enhance the deployment script so it does the following?

    Tells the celery worker to stop accepting tasks
    Waits until any currently running celery tasks are finished
    Stops the upstart jobs
    Pushes code to the server
    Runs any db migrations
    Starts the upstart jobs
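A hedged sketch of the first two steps, assuming the Celery application object can be imported as celery_app from a module named proj (both names are assumptions) and that the worker consumes from the default queue named celery: cancel the consumer so no new tasks are picked up, then poll until no worker reports active tasks.

    # deployment helper sketch (run before stopping the upstart jobs)
    import time
    from proj import celery_app  # hypothetical import path

    # Tell workers to stop consuming from the default queue.
    celery_app.control.cancel_consumer('celery')

    # Wait until no worker reports any active (currently executing) task.
    while True:
        active = celery_app.control.inspect().active() or {}
        if not any(active.values()):
            break
        time.sleep(5)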

How to write an Ubuntu Upstart job for Celery (django-celery) in a virtualenv

℡╲_俬逩灬. submitted on 2019-11-28 18:52:25
I really enjoy using upstart. I currently have upstart jobs to run different gunicorn instances in a number of virtualenvs. However, the 2-3 examples I found for Celery upstart scripts on the interwebs don't work for me. So, with the following variables, how would I write an Upstart job to run django-celery in a virtualenv?

Path to Django project: /srv/projects/django_project
Path to this project's virtualenv: /srv/environments/django_project
Path to celery settings (the Django project settings file, django-celery): /srv/projects/django_project/settings.py
Path to the log file for this
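A hedged sketch of such a job, built from the paths in the question; the log file location is an assumption, and the worker is started through the project's manage.py with the virtualenv's python (django-celery's classic celeryd management command):

    description "django-celery worker for django_project"

    start on started mountall
    stop on shutdown
    respawn

    script
        cd /srv/projects/django_project
        # Log path below is a placeholder; point it at the project's real log file.
        exec /srv/environments/django_project/bin/python manage.py celeryd \
            --loglevel=INFO >> /var/log/django_project-celery.log 2>&1
    end script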

How to set a global nofile limit to avoid “many open files” error?

随声附和 submitted on 2019-11-28 18:40:30
I have a websocket service. The strange thing is that it still gets the error "too many open files", even though I have set the system configuration:

/etc/security/limits.conf
    * soft nofile 65000
    * hard nofile 65000

/etc/sysctl.conf
    net.ipv4.ip_local_port_range = 1024 65000

ulimit -n
    # output: 6500

So I think my system configuration is right. My service is managed by supervisor; could supervisor be imposing its own limits? Checking a process started by supervisor:

cat /proc/815/limits
    Max open files    1024    4096    files

Checking a process started manually:

cat /proc/900/limits
    Max open files    65000    65000    files

So the cause is that the service is managed by supervisor. If I
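One hedged fix: raise the minfds setting in supervisord's [supervisord] section, since supervisord adjusts its own open-file limit to that value and child processes inherit it. The config path below is an assumption (some installs use /etc/supervisor/supervisord.conf instead), and supervisord itself must be restarted, e.g. with supervisorctl reload, for the new limit to take effect.

    ; /etc/supervisord.conf
    [supervisord]
    minfds=65000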

Run python script as daemon at boot time (Ubuntu)

梦想与她 submitted on 2019-11-28 17:20:21
I've created a small web server using werkzeug and I'm able to run it in the usual Python way with python my_server.py. Pages load, everything works fine. Now I want to start it when my PC boots. What's the easiest way to do that? I've been struggling with upstart, but it doesn't seem to "live in the background", because after I execute start my_server I immediately receive:

kernel: [ 8799.793942] init: my_server main process (7274) terminated with status 1

my_server.py:

...
if __name__ == '__main__':
    from werkzeug.serving import run_simple
    app = create_app()
    run_simple('0.0.0.0', 4000, app)

upstart
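A hedged sketch of an Upstart job for this server; the interpreter and script paths are assumptions. Status 1 immediately after start usually means the process itself exited with an error (often an import or path problem), so redirecting output to a log file makes the real traceback visible.

    # /etc/init/my_server.conf (sketch)
    description "werkzeug web server"

    start on runlevel [2345]
    stop on runlevel [!2345]
    respawn

    # Absolute paths below are placeholders for the real python and script locations.
    exec /usr/bin/python /home/user/my_server.py >> /var/log/my_server.log 2>&1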

Daemon vs Upstart for python script

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-28 15:25:45
I have written a module in Python and want it to run continuously once started, and I need to stop it when I update other modules. I will likely be using monit to restart it if the module has crashed or is otherwise not running. I was looking at different techniques like a daemon, Upstart, and many others. Which is the best way to go, so that I can use the same approach for all my new modules to keep them running forever?

From your mention of Upstart I will assume that this question is for a service being run on an Ubuntu server. On an Ubuntu server an upstart job is really the simplest and
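Assuming such a job exists at, say, /etc/init/mymodule.conf (a hypothetical name), day-to-day control for the "stop it while I update other modules" workflow is just Upstart's own commands:

    sudo stop mymodule      # halt the service before deploying updates
    sudo start mymodule     # bring it back up afterwards
    sudo status mymodule    # check whether it is currently running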

Ubuntu, upstart, and creating a pid for monitoring

依然范特西╮ submitted on 2019-11-28 14:45:05
Question: Below is an upstart script for redis. How do I create a pid file so I can use monit for monitoring?

#!upstart
description "Redis Server"

env USER=redis

start on startup
stop on shutdown
respawn

exec sudo -u $USER sh -c "/usr/local/bin/redis-server /etc/redis/redis.conf 2>&1 >> /var/log/redis/redis.log"

Answer 1: If start-stop-daemon is available on your machine, I would highly recommend using it to launch your process. start-stop-daemon will handle launching the process as an unprivileged user without
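A hedged sketch of that approach, assuming redis.conf keeps the server in the foreground (daemonize no): start-stop-daemon drops privileges to the redis user and writes the pid file, which monit can then watch with a "check process redis with pidfile /var/run/redis.pid" entry.

    exec start-stop-daemon --start --chuid redis \
        --make-pidfile --pidfile /var/run/redis.pid \
        --exec /usr/local/bin/redis-server -- /etc/redis/redis.conf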
