celery

Django-Celery in production?

Posted by 风格不统一 on 2021-02-16 13:38:28
Question: So I've been trying to figure out how to make scheduled tasks. I found Celery and have been able to make simple scheduled tasks, but to do this I need to open a command line and run celery -A proj beat for the tasks to happen. That works fine in a development environment, but it will be an issue in production. So how can I get Celery to work without the command-line step? When my production server comes online, how can I make sure my scheduler goes up with it? Can Celery do…
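A common way to keep beat running without an open terminal (a sketch, not from the excerpt) is to put it under a process supervisor so it starts with the server. A minimal systemd unit, assuming a hypothetical project at /opt/proj with a virtualenv at /opt/proj/venv:

    [Unit]
    Description=Celery beat scheduler for proj
    After=network.target

    [Service]
    User=celery
    WorkingDirectory=/opt/proj
    ExecStart=/opt/proj/venv/bin/celery -A proj beat --loglevel=info
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target

Saved as /etc/systemd/system/celery-beat.service and enabled with systemctl enable --now celery-beat.service, the scheduler starts on boot and is restarted on failure.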

Celery: Rate limit on tasks with the same parameters

Posted by 微笑、不失礼 on 2021-02-15 05:52:37
Question: I am looking for a way to restrict when a function is called, but only when the input parameters are different, that is:

    @app.task(rate_limit="60/s")
    def api_call(user):
        do_the_api_call()

    for i in range(0, 100):
        api_call("antoine")
        api_call("oscar")

So I would like api_call("antoine") to be called 60 times per second and api_call("oscar") 60 times per second as well. Any help on how I can do that? --EDIT 27/04/2015: I have tried calling a subtask with rate_limit within a task, but it does not…
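Celery's rate_limit applies to the task name as a whole, so per-argument throttling has to be done by hand. A minimal sketch, assuming a Redis server is available and a budget of 60 calls per second per user (key naming and retry delay are illustrative):

    import time

    import redis
    from celery import Celery

    app = Celery("proj", broker="redis://localhost:6379/0")
    r = redis.Redis()

    def do_the_api_call(user):
        ...  # the real API call from the question

    @app.task(bind=True, max_retries=None)
    def api_call(self, user):
        # One counter per (user, current second); expire it so Redis stays clean.
        key = "rate:%s:%d" % (user, int(time.time()))
        pipe = r.pipeline()
        pipe.incr(key)
        pipe.expire(key, 2)
        calls, _ = pipe.execute()
        if calls > 60:
            # This user's 60/s budget is spent: retry in a second.
            raise self.retry(countdown=1)
        do_the_api_call(user)

Each user gets an independent counter, so api_call.delay("antoine") and api_call.delay("oscar") are throttled separately.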

How to email report screenshots directly — using the Schedule Email feature in Superset 0.37

Posted by 吃可爱长大的小学妹 on 2021-02-12 03:49:43
Superset's charts look great, but older versions could only be viewed in the web UI. The latest 0.37 release can capture a chart as a screenshot and send it directly by email, which is very convenient. This article walks through the scheduled email feature in Superset 0.37. If you run into any problems during installation, follow the "实时流式计算" account to get in touch with me; all the offline install packages have been prepared — reply "superset0928" in the backend to download them.

Enabling the email feature

Superset 0.37's email feature is disabled by default. It lets users send two kinds of email reports:

- charts and dashboards (as attachments or embedded in the mail body)
- chart data (as a CSV attachment)

Edit config.py (vi config.py) and switch the feature on:

    ENABLE_SCHEDULED_EMAIL_REPORTS = True

Sending email also requires SMTP settings:

    EMAIL_NOTIFICATIONS = True
    SMTP_HOST = "email-smtp.eu-west-1.amazonaws.com"
    SMTP_STARTTLS = True
    SMTP_SSL = False
    SMTP_USER = "smtp_username"
    SMTP_PORT = 25
    SMTP_PASSWORD = os.environ.get("SMTP_PASSWORD")
    SMTP_MAIL_FROM = "insights@komoot.com"

Before starting, remember to run…
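The schedules themselves are executed by a Celery worker plus beat; per the 0.37 documentation, the same config.py also carries a CeleryConfig with a beat entry for the email tasks (a trimmed sketch; verify the names against your Superset version):

    from celery.schedules import crontab

    class CeleryConfig(object):
        BROKER_URL = "redis://localhost:6379/0"
        CELERY_IMPORTS = ("superset.sql_lab", "superset.tasks")
        CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
        CELERYBEAT_SCHEDULE = {
            "email_reports.schedule_hourly": {
                "task": "email_reports.schedule_hourly",
                "schedule": crontab(minute=1, hour="*"),
            },
        }

    CELERY_CONFIG = CeleryConfig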

Running celery as daemon does not create PID file (no permission issue)

Posted by 可紊 on 2021-02-11 16:34:44
Question: I am trying to run the Celery worker as a daemon/service on an Ubuntu server. I've followed the documentation (https://docs.celeryproject.org/en/stable/userguide/daemonizing.html). However, when I start the daemon it says:

    celery multi v5.0.4 (singularity)
    > Starting nodes...
    > worker1@ubuntuserver: OK

But when I check the status it says:

    celery init v10.1.
    Using config script: /etc/default/celeryd
    celeryd down: no pidfiles found

I've seen some information on the internet about permissions, but not sure…
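A frequent cause (an assumption; the excerpt doesn't confirm it) is that the pid/log directories named in /etc/default/celeryd don't exist or aren't writable by the worker's user. A sketch of the relevant variables from the daemonizing guide, with illustrative paths and user:

    CELERYD_NODES="worker1"
    CELERY_BIN="/home/ubuntu/venv/bin/celery"
    CELERY_APP="proj"
    CELERYD_USER="celery"
    CELERYD_GROUP="celery"
    CELERYD_PID_FILE="/var/run/celery/%n.pid"
    CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
    CELERYD_LOG_LEVEL="INFO"

The directories must exist and belong to that user before the init script runs, e.g. mkdir -p /var/run/celery /var/log/celery && chown celery:celery /var/run/celery /var/log/celery.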

celery consume send_task response

Posted by 流过昼夜 on 2021-02-11 15:19:10
Question: From a Django application I need to call an external RabbitMQ broker, running on a Windows server, and use an application there; the Django app runs on a Linux server. I'm currently able to add a task to the queue by using Celery's send_task:

    app.send_task('tasks', kwargs=self.get_input(), queue=Queue('queue_async', durable=False))

My settings look like:

    CELERY_BROKER_URL = CELERY_CONFIG['broker_url']
    BROKER_TRANSPORT_OPTIONS = {"max_retries": 3, "interval_start": 0, "interval_step": 0.2, …
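To read the remote task's reply, a result backend has to be configured on both ends; send_task then returns an AsyncResult that can be polled. A minimal sketch with placeholder broker and backend URLs (the question's CELERY_CONFIG and get_input() are stood in for):

    from celery import Celery

    app = Celery(
        "proj",
        broker="amqp://user:password@windows-host:5672//",  # placeholder URL
        backend="rpc://",  # assumption: an RPC result backend over the broker
    )

    result = app.send_task(
        "tasks",
        kwargs={"some": "input"},  # stands in for self.get_input()
        queue="queue_async",
    )
    # Blocks until the worker on the Windows side stores the return value.
    print(result.get(timeout=30))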

Celery Task Custom tracking method

Posted by 混江龙づ霸主 on 2021-02-11 14:39:44
Question: My main problem is that I need to know whether a task is still queued, started, or revoked. I can't do this with Celery and Redis alone, because results are deleted from Redis 24 hours after they are stored. I had some ideas, but I think the most solid one is to keep a tracking database and manually add the critical information I need about the task a user is running. There are methods for that which can run before a task starts, and I can also manually work with the database when I create a task or…
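One way to hook those lifecycle points (a sketch; save_status is a hypothetical helper that would write to your tracking table) is Celery's signals, which fire when a task is published, started, and finished:

    from celery import Celery
    from celery.signals import before_task_publish, task_prerun, task_postrun

    app = Celery("proj", broker="redis://localhost:6379/0")

    def save_status(task_id, status):
        # Hypothetical: in Django this would update a tracking model row.
        print(task_id, status)

    @before_task_publish.connect
    def on_publish(sender=None, headers=None, **kwargs):
        # Runs in the client process when the task is queued.
        save_status(headers["id"], "QUEUED")

    @task_prerun.connect
    def on_start(task_id=None, **kwargs):
        save_status(task_id, "STARTED")

    @task_postrun.connect
    def on_done(task_id=None, state=None, **kwargs):
        save_status(task_id, state)

Rows written this way outlive the 24-hour Redis result expiry, and revocations can be recorded the same way via the task_revoked signal.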

celery monitoring with sqs broker

Posted by 一个人想着一个人 on 2021-02-10 20:32:41
Question: We are using Airflow (1.10.3) with the Celery executor (4.1.1 (latentcall)) and SQS as the broker. While debugging an issue we tried the Celery CLI and found that the SQS broker is not supported by any of the inspect commands or monitoring tools, e.g. Flower. Is there any way we can monitor the tasks or events on the Celery workers? We have tried the curses monitor as below:

    celery events -b sqs://

But it shows no worker discovered, no tasks selected. The inspect commands directly show: Availability: RabbitMQ…
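The inspect/events machinery needs a broadcast channel that SQS doesn't provide, so worker-side introspection won't work there. One workaround (an assumption, not from the excerpt) is to watch the queue itself with boto3; the queue name and region below are illustrative:

    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    url = sqs.get_queue_url(QueueName="celery")["QueueUrl"]  # default queue name

    attrs = sqs.get_queue_attributes(
        QueueUrl=url,
        AttributeNames=[
            "ApproximateNumberOfMessages",
            "ApproximateNumberOfMessagesNotVisible",
        ],
    )["Attributes"]
    print("queued:", attrs["ApproximateNumberOfMessages"])
    print("in flight:", attrs["ApproximateNumberOfMessagesNotVisible"])

This gives queue depth rather than per-task events; for the latter, task signals writing to your own store (see the tracking entry above) are a common substitute.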

How can Celery distribute users' tasks in a fair way?

Posted by 血红的双手。 on 2021-02-10 20:16:30
Question: The task I'm implementing scrapes some basic info about a URL, such as the title, description, and OGP metadata. If User A requests 200 URLs to scrape, and User B then requests 10 URLs, User B may wait much longer than they expect. What I'm trying to achieve is to rate-limit a specific task on a per-user basis or, at least, to be fair between users. Celery's rate-limiting implementation is too broad, since it keys on the task name only. Do you have any suggestions for achieving this…
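One fairness trick that needs no extra infrastructure (a sketch; scrape() and the URL lists are placeholders, and this technique is a suggestion rather than anything confirmed by the excerpt) is to enqueue a single self-requeuing task per user, so each user has one message in flight at a time and large batches cannot starve small ones:

    from celery import Celery

    app = Celery("proj", broker="redis://localhost:6379/0")

    def scrape(url):
        ...  # fetch title, description, and OGP metadata

    @app.task
    def scrape_next(user, urls):
        if not urls:
            return
        scrape(urls[0])
        # Re-enqueue the remainder at the back of the queue, so other
        # users' tasks interleave with this one.
        scrape_next.delay(user, urls[1:])

    # User A's 200 URLs and user B's 10 URLs now alternate fairly:
    scrape_next.delay("userA", ["https://a.example/%d" % i for i in range(200)])
    scrape_next.delay("userB", ["https://b.example/%d" % i for i in range(10)])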