celery

How do you deal with an exception raised by celery (not your code)?

Submitted by £可爱£侵袭症+ on 2019-12-24 06:58:26
Question: In my Flask app I am using Celery to deploy servers on remote machines. I have an enum, status, which tracks the lifecycle of my deployment process:

    @celery.task(bind=True)
    def deploy_server(self, server_id):
        server = Server.query.get(server_id)
        if not server.can_launch():
            return
        try:
            server.status = RemoteStatus.LAUNCHING
            db.session.commit()
            verify_DNS(server)
            host = server.server.ssh_user + '@' + server.server.ip
            execute(fabric_deploy_server, self, server, hosts…
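
One hedged approach (a sketch, not the poster's code): hook the task_failure signal, which also fires for exceptions raised inside Celery itself rather than in the task body, and roll the server's status back there. This reuses the Server/RemoteStatus/db names from the question; RemoteStatus.FAILED is an assumed enum member.

    from celery.signals import task_failure

    @task_failure.connect
    def on_deploy_failure(sender=None, task_id=None, exception=None, args=None, **extra):
        # Filter by registered task name; adjust to the real module path.
        if not sender or not sender.name.endswith('deploy_server'):
            return
        server_id = args[0] if args else None  # first positional task argument
        server = Server.query.get(server_id)
        server.status = RemoteStatus.FAILED    # assumption: a FAILED member exists
        db.session.commit()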

Is it possible to use only Django's models module in my project?

Submitted by 匆匆过客 on 2019-12-24 06:36:37
Question: I am developing a small standalone Python application that uses Celery. I built it with the Django framework, but the application is back end only: users never visit a site, and it exists solely to receive tasks from the Celery queue and perform operations on the database. To perform those operations I need Django's modules. What I am trying to do is eliminate the rest of my Django application and use ONLY…
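
A hedged sketch of the usual standalone-ORM pattern: configure settings by hand and call django.setup() before importing any models. The app label 'myapp', the model name, and the database values are placeholders.

    import django
    from django.conf import settings

    settings.configure(
        INSTALLED_APPS=['myapp'],                    # placeholder app label
        DATABASES={'default': {
            'ENGINE': 'django.db.backends.sqlite3',  # swap for your backend
            'NAME': 'db.sqlite3',
        }},
    )
    django.setup()

    from myapp.models import MyModel  # models now work without urls/views/templates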

Is it possible to control the datefmt of the Celery task logs?

Submitted by 一个人想着一个人 on 2019-12-24 05:32:26
Question: With normal Python loggers you might configure the format string and date format like so:

    {
        'format': '%(asctime)s.%(msecs)d [%(process)d] %(levelname)s [%(name)s] %(message)s',
        'datefmt': '%d/%m/%Y-%H:%M:%S'
    }

But I am completely unable to find any way to pass a datefmt to the Celery task logger. It seems like such a basic bit of functionality that I'd be surprised if it were impossible. The closest I got was by following the technique laid out in this article. The gist of it is to use the after…
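
A minimal sketch of that signal-based technique, assuming the goal is only to swap in a custom datefmt: re-install a formatter on every handler once Celery has set up its loggers.

    import logging
    from celery.signals import after_setup_logger, after_setup_task_logger

    def set_datefmt(logger=None, **kwargs):
        fmt = '%(asctime)s.%(msecs)03d [%(process)d] %(levelname)s [%(name)s] %(message)s'
        for handler in logger.handlers:
            handler.setFormatter(logging.Formatter(fmt, datefmt='%d/%m/%Y-%H:%M:%S'))

    after_setup_logger.connect(set_datefmt)       # the worker's own logger
    after_setup_task_logger.connect(set_datefmt)  # the per-task logger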

Django celery running only two tasks at once?

Submitted by ℡╲_俬逩灬. on 2019-12-24 05:32:20
Question: I have a Celery task like this:

    @celery.task
    def file_transfer(password, source12, destination):
        result = subprocess.Popen(
            ['sshpass', '-p', password, 'rsync', '-avz', source12, destination],
            stderr=subprocess.PIPE, stdout=subprocess.PIPE).communicate()[0]
        return result

I call it from a Django view. The user can select more than one file to copy to the destination; if the user selects, say, 4 files at once, Celery accepts only 2 tasks. What's wrong?

Answer 1: Have you checked the concurrency…
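
A hedged guess at where that answer is going: a worker defaults to one process per CPU core, so on a 2-core machine exactly two file_transfer tasks run at once and the rest wait in the queue. A sketch of raising the limit:

    # Either start the worker with more processes:
    #   celery -A proj worker --concurrency=4
    # or set it in the configuration (old-style setting name, matching the
    # CELERYBEAT_SCHEDULE style used elsewhere on this page):
    CELERYD_CONCURRENCY = 4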

How to run celery schedule instantly?

Submitted by 我怕爱的太早我们不能终老 on 2019-12-24 03:23:42
Question: I have a Celery schedule configured like this:

    CELERYBEAT_SCHEDULE = {
        "runs-every-30-seconds": {
            "task": "tasks.refresh",
            "schedule": timedelta(hours=1)
        },
    }

After testing I find that this schedule first runs after 1 hour, but I want it to run instantly and then again after 1 hour.

Answer 1: If you mean at startup, do it in AppConfig.ready() (new in Django 1.7):

    # my_app/__init__.py
    class MyAppConfig(AppConfig):
        def ready(self):
            tasks.refresh.delay()

Also see: https://docs…
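
Outside Django, a hedged alternative sketch is to enqueue the task once when the beat process starts, via the beat_init signal; the tasks module path is assumed from the schedule above.

    from celery.signals import beat_init

    @beat_init.connect
    def run_refresh_at_startup(sender=None, **kwargs):
        from tasks import refresh  # assumed from 'tasks.refresh' in the schedule
        refresh.delay()            # enqueue immediately; beat handles the rest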

Unable to connect to celery task from a celery signal?

Submitted by 被刻印的时光 ゝ on 2019-12-24 02:33:25
Question: I am trying to connect task2 to the task_success signal:

    from celery.signals import task_success
    from celery import Celery

    app = Celery()

    @app.task
    def task1():
        return 't1'

    @app.task
    def task2():
        return 't2'

    task_success.connect(task2, sender=task1)

When I run this code, it throws TypeError: cannot create weak reference to 'PromiseProxy' object. If I remove the app.task decorator from task2, it works perfectly. But why is it unable to connect to a Celery task?

Answer 1: The technical detail is that the…
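
The usual workaround (a sketch, not taken from the answer): connect a plain function as the receiver, which can be weakly referenced, and have it enqueue task2.

    from celery.signals import task_success

    @task_success.connect
    def launch_task2(sender=None, result=None, **kwargs):
        # Filter inside the handler; the registered name of task1 is assumed.
        if sender and sender.name.endswith('task1'):
            task2.delay()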

Running Celery tasks periodically (without Django)

Submitted by 流过昼夜 on 2019-12-24 02:25:28
Question: I am trying to run a few functions (tasks) periodically, say every 3 seconds, with Celery. The closest I have gotten is to just run the tasks once. This is my Celery configuration file:

    # celeryconfig.py
    from datetime import timedelta

    BROKER_URL = 'amqp://guest@localhost//'
    CELERY_RESULT_BACKEND = 'rpc://'

    CELERYBEAT_SCHEDULE = {
        'f1-every-3-seconds': {
            'task': 'tasks.f1',
            'schedule': timedelta(seconds=3),
            'args': (1, 2)
        },
        'f2-every-3-seconds': {
            'task': 'tasks.f2',
            'schedule': timedelta…
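
A schedule like this only fires while a beat process is running alongside the worker. A hedged sketch of the missing pieces, with the tasks module contents assumed from the config above:

    # tasks.py (sketch; the bodies of f1/f2 are assumptions)
    from celery import Celery

    app = Celery('tasks')
    app.config_from_object('celeryconfig')

    @app.task
    def f1(x, y):
        return x + y

    @app.task
    def f2():
        return 'f2'

    # Run worker and beat together (or start 'celery beat' separately):
    #   celery -A tasks worker --beat --loglevel=info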

Celery + Redis - .get() hangs indefinitely after running smoothly for ~70 hours

Submitted by 依然范特西╮ on 2019-12-24 00:54:46
Question: Everything runs fine for multiple days, but then I get an indefinite hang on .get(). The time it takes for the hang to occur varies, but it is somewhere between 24 and 72 hours of running. My suspicion is that this has something to do with the Redis broker: the output of CLIENT LIST in redis-cli shows a large number of connections with a very high idle value (see below), but I don't know whether that is the issue or why it would cause Celery's .get() to hang indefinitely. I have confirmed that…
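
One hedged mitigation sketch (my suggestion, not from the post): bound the wait instead of blocking forever, and enable keepalive on the Redis transport so half-dead connections surface as errors rather than hangs. some_task is a placeholder.

    app.conf.broker_transport_options = {
        'socket_timeout': 30,          # error out instead of blocking forever
        'socket_connect_timeout': 5,
        'socket_keepalive': True,      # let the OS notice dead peers
    }

    result = some_task.delay()
    value = result.get(timeout=60)     # raises TimeoutError instead of hanging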

Mac Installation

Submitted by 强颜欢笑 on 2019-12-23 23:48:28
Create the database (SQL):

    create database opsmanage DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
    grant all privileges on opsmanage.* to root@'%' identified by 'password';

Install dependencies:

    brew install libmagic

Configuration file, conf/opsmanage.ini:

    [db]
    engine = mysql
    host = 127.0.0.1
    port = 3307
    user = root
    password = 123456
    database = opsmanage

    [redis]
    host = 127.0.0.1
    port = 6379
    password =
    ansible_db = 3
    celery_db = 4
    default_db = 0

    [deploy]
    path = /Users/lijingjing/OpsManage/workspaces

Initialize the data:

    python3 manage.py makemigrations wiki
    python manage.py createsuperuser   # create the admin account

https://github…
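
A hedged sketch of the typical remaining steps, assuming a standard Django layout and a Celery app module named opsmanage (both assumptions, not confirmed by the text above):

    python3 manage.py migrate            # apply the generated migrations
    python3 manage.py runserver          # start the web backend
    celery -A opsmanage worker -l info   # start the Celery worker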