logstash-forwarder

JBoss access logs with log rotation

Submitted by 流过昼夜 on 2021-02-19 03:59:19
Question: I'm trying to tell my JBoss to write an access log with all the information I need and to use daily log rotation. So far this is not an issue. The ultimate goal is to send all access log entries to an ELK stack using logstash-forwarder. Also not that big a deal. The problem I'm having now is defining the access logs' names. JBoss offers log rotation out of the box but adds a timestamp to each log file, so today's file also has a timestamp suffix. What I want to achieve is the same behavior as Tomcat…
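
A hedged sketch of one way around the suffix problem: let Logstash (or the forwarder) watch the whole directory with a glob, so timestamped rotated files are picked up regardless of their exact names. The path below is hypothetical, not taken from the question:

    input {
      file {
        # Hypothetical JBoss log directory; the glob matches the current
        # file and date-suffixed rotated files alike
        path => "/opt/jboss/standalone/log/access_log*"
        start_position => "beginning"
      }
    }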

Read log file from a remote machine with file input plugin using logstash

Submitted by 我的梦境 on 2020-08-02 20:54:01
Question: Presently I have my logs and Logstash running on the same machine, so I read the logs placed on my local machine with this config (using a pull model): input { file { path => "/home/Desktop/Logstash-Input/**/*_log" start_position => "beginning" } } Now we have Logstash running on a different machine and want to read the logs from the remote machine. Is there a way to set the IP in the file input of the config file? EDIT: I managed to do this with logstash-forwarder, which is a push model (log shipper/Logstash…
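
The file input only reads local paths, so the push model mentioned in the edit is the usual answer: a shipper on the remote machine sends events to a listening Logstash. A minimal sketch of the receiving side for logstash-forwarder, which speaks the lumberjack protocol; the certificate paths are assumptions:

    input {
      lumberjack {
        port            => 5043
        # Hypothetical TLS material; logstash-forwarder will not
        # connect without a certificate it trusts
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key         => "/etc/pki/tls/private/logstash-forwarder.key"
      }
    }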

Filter specific Message with logstash before sending to ElasticSearch

Submitted by 北城余情 on 2019-12-13 05:45:48
Question: I'd like to know if it is possible to send only specific log messages to Elasticsearch via Logstash. E.g. let's say I have these messages in my log file:

2015-08-14 12:21:03 [31946] PASS 10.249.10.70 http://google.com
2015-08-14 12:25:00 [2492] domainlist \"/etc/ufdbguard/blacklists\
2015-08-14 12:21:03 [31946] PASS 10.249.10.41 http://yahoo.com

I'd like to skip the second line when logstash/log-forwarder processes this log. Is it possible to instruct it to skip any log message with the…
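
One way to do this on the Logstash side is a conditional around the drop filter. A sketch, assuming the unwanted lines can be recognized by the literal token domainlist:

    filter {
      # Drop any event whose message mentions "domainlist";
      # everything else continues to the outputs untouched
      if [message] =~ /domainlist/ {
        drop { }
      }
    }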

Logstash security

Submitted by 久未见 on 2019-12-13 05:38:15
Question: I am wondering if it is possible to implement something like mutual-handshake authorization between Logstash and logstash-forwarder. At the moment I know that Logstash provides SSL certificates for security, but I am not sure this is the best way to protect my log flow. The certificates are not safe enough in my case: if they get stolen, you are in danger. Looking for something else that may help. Thanks!

Answer 1: The logstash-forwarder project has been deprecated in favor of the…
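
Mutual TLS is configurable on the receiving side: the Beats input (successor to the lumberjack protocol) can require a client certificate signed by a CA you control, so a stolen server certificate alone is not enough. A sketch with hypothetical paths; note the client-auth option name has varied across plugin versions (ssl_verify_mode in older releases, ssl_client_authentication in newer ones):

    input {
      beats {
        port                        => 5044
        ssl                         => true
        ssl_certificate             => "/etc/pki/tls/certs/logstash.crt"
        ssl_key                     => "/etc/pki/tls/private/logstash.key"
        # Only clients presenting a certificate signed by this CA pass
        ssl_certificate_authorities => ["/etc/pki/tls/certs/ca.crt"]
        ssl_verify_mode             => "force_peer"
      }
    }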

Docker apps logging with Filebeat and Logstash

Submitted by 試著忘記壹切 on 2019-12-02 14:42:58
Question: I have a set of dockerized applications scattered across multiple servers and am trying to set up production-level centralized logging with ELK. I'm OK with the ELK part itself, but I'm a little confused about how to forward the logs to my Logstash instances. I'm trying to use Filebeat because of its load-balancing feature. I'd also like to avoid packing Filebeat (or anything else) into all my Docker images and to keep it separate, dockerized or not. How can I proceed? I've been trying the following. My containers log to stdout, so with a non-dockerized Filebeat configured to read from stdin I do: docker logs -f…
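
Whichever shipping arrangement wins out, the Logstash end stays the same: a single Beats input that any number of Filebeat instances can target (Filebeat's load balancing is configured on its own side, by listing several Logstash hosts). A minimal sketch; 5044 is the conventional default port, not a requirement:

    input {
      beats {
        # Default port for Filebeat's Logstash output
        port => 5044
      }
    }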

JSON parser in logstash ignoring data?

Submitted by 寵の児 on 2019-12-02 10:07:37
Question: I've been at this a while now, and I feel like the JSON filter in Logstash is removing data. I originally followed the tutorial from https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04 and I've made some changes, but it's mostly the same. My grok filter looks like this:

# uuid and fingerprint to avoid duplicates
uuid {
  target => "@uuid"
  overwrite => true
}
fingerprint {
  key => "78787878"
  concatenate_sources => true
}
# get device name from the name of the log
grok {
  match => { "source" => "%{GREEDYDATA}%{IPV4:DEVICENAME}%…
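
A common cause of "disappearing" fields is the json filter writing parsed keys to the event root, where they collide with fields earlier filters already set; pointing it at a target namespace keeps the two apart. A sketch, with the field names assumed rather than taken from the question:

    filter {
      # Parse the JSON payload into its own subtree instead of the event
      # root, so parsed keys cannot overwrite fields set by earlier filters
      json {
        source => "message"
        target => "payload"
      }
    }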