logging

logging.handlers.SMTPHandler raises smtplib.SMTPAuthenticationError

这一生的挚爱 submitted on 2020-04-11 01:42:06
Question: I tried this with Verizon and Gmail. Both servers denied authentication. Gmail emailed me that it denied a login attempt because the connection was not using "modern security". I would like to know how I can use modern security with this logging handler: logging.handlers.SMTPHandler(mailhost=('', 25), fromaddr='', toaddrs='', subject='', credentials=('username','password'), secure=())

Answer 1: Gmail problem: not addressed here; see other answers about Gmail's app authentication. Verizon problem:
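A minimal sketch of one likely fix (an assumption, not the truncated answer's text): SMTPHandler issues STARTTLS when secure is given as a tuple, so pointing it at the submission port with an empty tuple upgrades the connection before logging in. The host, addresses, and credentials below are placeholders, and Gmail usually also requires an app password.

```python
import logging
import logging.handlers

# An empty tuple for `secure` makes SMTPHandler call starttls() (with no
# keyfile/certfile) before authenticating; 587 is the usual STARTTLS
# submission port. Every address and credential here is a placeholder.
handler = logging.handlers.SMTPHandler(
    mailhost=('smtp.gmail.com', 587),
    fromaddr='sender@example.com',
    toaddrs=['admin@example.com'],
    subject='Application error',
    credentials=('username', 'app-password'),  # Gmail typically wants an app password
    secure=(),
)
handler.setLevel(logging.ERROR)
logging.getLogger().addHandler(handler)
```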

How to log from a .NET Core Web API into Elasticsearch on its own index

魔方 西西 submitted on 2020-04-10 08:19:13
Question: I have a .NET Core Web API written in C# and an Elasticsearch instance. On the Elasticsearch side I have an index "logging" that I want to push my logs from the API into, but I cannot figure out how to get the logs from the C# API into that index. I read documentation like Logging with ElasticSearch..., but I have no Logstash available for my Elasticsearch. So I'm searching for a package that makes this kind of logging easy. I think I need to hand over the index "logging" once, so it knows where to log

Celery logger configuration

六月ゝ 毕业季﹏ submitted on 2020-04-08 09:23:45
Question: I'm using Django 1.10, Python 3.5 and Celery 4.1.0. I'm trying to log Celery task info into a file, so I tried what the Celery documentation suggests: from celery.utils.log import get_task_logger; logger = get_task_logger(__name__), and then logged a message inside the task with logger.info(message). I expected it to go to my default logger, but it didn't. So I added a dedicated logger named 'celery.task' to settings (as I understand from the documentation): LOGGING = { 'version': 1, 'disable_existing
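A sketch of the kind of LOGGING dict the question is building toward (handler, formatter, and file names are assumptions): get_task_logger() returns a child of the 'celery.task' logger, so attaching a file handler there captures task messages. Note, as a hedged aside, that the Celery 4 worker reconfigures logging on startup by default; setting worker_hijack_root_logger = False in the Celery config is commonly suggested to keep it from overriding a setup like this.

```python
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {'format': '%(asctime)s %(levelname)s %(name)s %(message)s'},
    },
    'handlers': {
        'celery_file': {
            'class': 'logging.FileHandler',
            'filename': 'celery_tasks.log',  # path is an assumption
            'formatter': 'simple',
        },
    },
    'loggers': {
        # get_task_logger(__name__) returns a child of 'celery.task',
        # so its records propagate up to this handler.
        'celery.task': {
            'handlers': ['celery_file'],
            'level': 'INFO',
            'propagate': True,
        },
    },
}
```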

Log4j Implicit String Formatting

吃可爱长大的小学妹 submitted on 2020-04-08 08:48:13
Question: I am using log4j v1.2.14 for logging in my project, and I am also using Java 7's String.format() to put variables in my output. Currently I am writing LOGGER.info(String.format("Your var is [%s] and you are [%s]", myVar, myVar1)); Is this really the best way to output strings? I feel that log4j should implement this implicitly, as below: LOGGER.info("Your var is [%s] and you are [%s]", myVar, myVar1); Have I missed something? Further, are there any Java logging frameworks that support this

How to insert a newline in Python logging?

我怕爱的太早我们不能终老 submitted on 2020-04-07 13:50:32
Question: Given
import logging
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(levelname)s %(message)s', datefmt='%H:%M:%S')
logging.info('hello')
logging.warning('\n new hello')
the output is:
11:15:01 INFO hello
11:16:49 WARNING
 new hello
Because the log is crowded, I want to explicitly insert a newline before asctime and levelname. Is this possible without modifying format? I looked into the logging module and googled a bit, and could not find a viable way.

Answer 1: I have two solutions; the first is very easy,
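One common workaround (a sketch that may or may not match the truncated answer; it assumes briefly touching the handlers is acceptable): temporarily swap each root handler's formatter for a bare one, emit an empty record, then restore the original format string untouched.

```python
import logging

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(levelname)s %(message)s',
                    datefmt='%H:%M:%S')

def log_blank_line():
    """Emit a bare newline without changing the configured format string."""
    root = logging.getLogger()
    saved = [(h, h.formatter) for h in root.handlers]
    for h in root.handlers:
        h.setFormatter(logging.Formatter(fmt=''))
    root.info('')                    # prints only the empty message plus '\n'
    for h, formatter in saved:
        h.setFormatter(formatter)    # restore the original format

logging.info('hello')
log_blank_line()
logging.warning('new hello')         # now preceded by a blank line
```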

/var/log/daemon.log is taking too much space; how can I reduce it?

自古美人都是妖i submitted on 2020-04-07 08:36:09
Question: Below are the files:
-rw-r----- 1 root adm 4.4G Mar 6 09:04 daemon.log
-rw-r----- 1 root adm 6.2G Mar 1 06:26 daemon.log.1
-rw-r----- 1 root adm 50M Feb 23 06:26 daemon.log.2.gz
-rw-r----- 1 root adm 41M Feb 17 06:25 daemon.log.3.gz
-rw-r----- 1 root adm 72K Feb 9 06:25 daemon.log.4.gz
How can I remove them? Will anything be affected if I delete them directly? Thanks in advance.
Source: https://stackoverflow.com/questions/60560729/var-log-daemon-log-taking-more-space-how-to-reduce-it

Redirect STDOUT and STDERR to a Python logger and also to the Jupyter notebook

吃可爱长大的小学妹 submitted on 2020-03-25 16:06:45
Question: Important to know: I am working in a Jupyter notebook. I want to create a logger to which I will redirect STDOUT and STDERR, but I also want to see those outputs in the Jupyter notebook output console. So far what I have implemented is: import logging import sys class StreamToLogger(object): """ Fake file-like stream object that redirects writes to a logger instance. """ def __init__(self, logger, log_level=logging.INFO): self.logger = logger self.log_level = log_level self.linebuf = '' def
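A sketch completing the truncated class (the write()/flush() bodies and all wiring below are assumptions, not the asker's code): keep a reference to the notebook's current sys.stdout before replacing it and attach it as a StreamHandler, so redirected output both lands in the logger's file and still shows up in the cell.

```python
import logging
import sys

class StreamToLogger(object):
    """Fake file-like stream object that redirects writes to a logger instance."""
    def __init__(self, logger, log_level=logging.INFO):
        self.logger = logger
        self.log_level = log_level
        self.linebuf = ''

    def write(self, buf):
        # One log record per written line, ignoring bare trailing newlines.
        for line in buf.rstrip().splitlines():
            self.logger.log(self.log_level, line.rstrip())

    def flush(self):
        pass

notebook_stdout = sys.stdout              # keep the notebook's display stream
logger = logging.getLogger('notebook')
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(notebook_stdout))  # echo to the cell
logger.addHandler(logging.FileHandler('notebook.log'))     # filename is an assumption

sys.stdout = StreamToLogger(logger, logging.INFO)
sys.stderr = StreamToLogger(logger, logging.ERROR)
```

The handler writes to the saved notebook_stdout rather than the replaced sys.stdout, which avoids infinite recursion between the fake stream and the logger.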

Sending a subset of a tag to a second output

ぃ、小莉子 submitted on 2020-03-23 08:54:53
Question: I've got a service I'm trying to record logs from, sending them to different locations depending on what they refer to, and Fluentd makes sense for this. Currently I just have everything shipped off to S3, and that works fine. However, I now want a small subset (any line beginning with the phrase "ACTION:") to also be sent to a MongoDB database, while everything still goes to S3. I have this config file, which of course isn't going to work: <source> @type forward @id input @label mainstream port
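A sketch of one way to wire this up, using Fluentd's copy/relabel/grep pattern (the record key "message" and all connection details are assumptions): duplicate the stream with the copy output, keep one branch going to S3 as before, relabel the other, filter it for lines beginning with "ACTION:", and send the survivors to the mongo output.

```
<label @mainstream>
  <match **>
    @type copy
    <store>
      # everything still goes to S3, exactly as before
      @type s3
      # ... existing S3 settings ...
    </store>
    <store>
      # a second copy of every event is handed to the @actions label
      @type relabel
      @label @actions
    </store>
  </match>
</label>

<label @actions>
  <filter **>
    @type grep
    <regexp>
      # the field that holds the log line is an assumption
      key message
      pattern /^ACTION:/
    </regexp>
  </filter>
  <match **>
    # requires the fluent-plugin-mongo gem; all values are placeholders
    @type mongo
    host 127.0.0.1
    port 27017
    database logs
    collection actions
  </match>
</label>
```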

Does Python's logging.FileHandler use block buffering by default?

爷,独闯天下 submitted on 2020-03-22 09:24:52
Question: The logging handler classes have a flush() method, and looking at the code, logging.FileHandler does not pass a specific buffering mode when calling open(). Therefore, when you write to a log file, it will be buffered using a default block size. Is that correct? It surprises me, because when I manage my own systems I am used to watching log files as a live (or near-live) view of the system, and for this use case line buffering is desired. Also, traditional syslog() to a logging daemon does not
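One point worth adding, since the question is cut off (this is a fact about CPython's logging source, not part of the original entry): although FileHandler opens its file with the platform's default block buffering, it inherits StreamHandler.emit(), which calls flush() after writing each record, so records normally reach the file almost immediately. A minimal sketch to check this (the filename is an assumption):

```python
import logging
import os

handler = logging.FileHandler('demo.log', mode='w')
logger = logging.getLogger('flush_demo')
logger.addHandler(handler)

logger.warning('first line')

# Read the file back while the handler still holds it open: the record is
# already on disk because StreamHandler.emit() ends with self.flush().
with open('demo.log') as f:
    print(repr(f.read()))   # -> 'first line\n'

handler.close()
os.remove('demo.log')
```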