filebeat

Windows docker: permission denied /var/run/docker.sock

你离开我真会死。 Submitted on 2021-02-18 11:30:10
Question: When I try to run Filebeat with autodiscover I get the following error: Exiting: error in autodiscover provider settings: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get http://%2Fvar%2Frun%2Fdocker.sock/v1.22/containers/json?limit=0: dial unix /var/run/docker.sock: connect: permission denied. I exposed the daemon on tcp://localhost:2375 from Docker settings, and I checked that my user is a member of the "docker-users" group. docker-compose
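A sketch of one workaround, assuming Filebeat runs on the Windows host itself: the docker autodiscover provider accepts a host option, so it can be pointed at the TCP endpoint exposed in Docker settings instead of the default unix socket (the container log path below is the Linux default and is illustrative):

filebeat.autodiscover:
  providers:
    - type: docker
      # use the TCP endpoint exposed in Docker Desktop settings instead of
      # the default unix:///var/run/docker.sock, which Windows cannot reach
      host: "tcp://localhost:2375"
      templates:
        - config:
            - type: container
              paths:
                - /var/lib/docker/containers/${data.docker.container.id}/*.log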

Filebeat “error loading config file: yaml: did not find expected key”

拜拜、爱过 Submitted on 2021-02-11 07:08:42
Question: I'm stuck on this issue. I have an Elasticsearch server with x-pack security enabled and a client with Filebeat that sends output to that server. Everything works fine without x-pack security enabled, but when it is enabled, I get this error message on the client: ./filebeat test config -v Exiting: error loading config file: yaml: line 157: did not find expected key The lines that cause the error are the "username" and "password" lines. When they are commented out, the config test is OK, but when username and
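The usual cause of "did not find expected key" at a credentials line is YAML indentation: username and password must sit at exactly the same level as the other keys under output.elasticsearch, indented with spaces rather than tabs. A minimal sketch (host and credentials are placeholders):

output.elasticsearch:
  hosts: ["https://my-es-host:9200"]   # placeholder host
  # these two lines must be indented level with hosts; a tab character or
  # an extra/missing space here produces "did not find expected key"
  username: "beats_writer"             # placeholder credentials
  password: "changeme"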

Logstash pipeline not working with CSV file

家住魔仙堡 Submitted on 2021-02-08 11:33:50
Question: I set it up like below:

wget https://artifacts.elastic.co/downloads/logstash/logstash-6.6.2.deb
sudo dpkg -i logstash-6.6.2.deb
sudo systemctl enable logstash.service
sudo systemctl start logstash.service

and I added a pipeline script like below:

input {
  file {
    path => "/root/dev/Intuseer-PaaS/backend/airound_sv_logs.log"
    start_position => "beginning"
  }
}
output {
  stdout {}
  file {
    path => "/root/dev/output/output-%{+YYYY-MM-dd}.log"
  }
}

The log file looks like below: timestamp, server_cpu, server
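Nothing in the pipeline above parses the lines as CSV, so the events reach the output as raw message strings. A csv filter is the usual fix. A sketch, assuming comma-separated columns; only the first two column names are visible in the truncated question, so the remaining names are placeholders:

filter {
  csv {
    separator => ","
    # "timestamp" and "server_cpu" come from the sample line above;
    # the remaining column names are hypothetical
    columns => ["timestamp", "server_cpu", "col3", "col4"]
  }
}

While testing it can also help to set sincedb_path => "/dev/null" on the file input so Logstash re-reads the file from the beginning on every run.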

Resend old logs from filebeat to logstash

与世无争的帅哥 Submitted on 2021-02-08 08:13:20
Question: Thanks in advance for your help. I would like to reload some logs to customize additional fields. I have noticed that the registry file in the Filebeat configuration keeps track of the files already picked up. However, if I remove the content of that file, I do not get the old logs back. I have also tried changing the timestamp of the source in the registry file, with no success. What changes are needed to resend old logs from Filebeat to Logstash? How can I get the logs back? Update: This is the last log
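Filebeat keeps the registry state in memory and rewrites the file on shutdown, which is why editing the registry while Filebeat is running has no effect. A sketch of the usual procedure, assuming a deb/rpm install (verify the registry location against path.data in your setup):

sudo systemctl stop filebeat             # stop first, or the registry is written back on shutdown
sudo rm -rf /var/lib/filebeat/registry   # assumed default data path; check path.data
sudo systemctl start filebeat            # inputs now re-read the files from the beginning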

Suricata to Filebeat to Kafka, routing to topics by event-type

梦想的初衷 Submitted on 2021-01-29 10:11:57
Question: I discovered Filebeat a couple of days ago. I have it sending data to Kafka directly if I hard-code the topic name in filebeat.yml, but I can't figure out how to compute the topic name dynamically from the Suricata event type. I've enabled the Filebeat suricata module and tried a number of things for the topic value in filebeat.yml, like: topic: 'suricata-%{[fields.suricata.eve.event_type]}' But I always get this error in the log: 2020-01-14T23:44:49.550Z INFO kafka/log.go:53 kafka message:
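One plausible fix, assuming the suricata module has populated the event by publish time: the module stores the event type at suricata.eve.event_type, not under fields.*, so the fields. prefix in the format string points at a key that does not exist. A sketch with the prefix dropped (broker address is a placeholder):

output.kafka:
  hosts: ["kafka:9092"]   # placeholder broker
  # reference the field where the suricata module actually puts the value
  topic: 'suricata-%{[suricata.eve.event_type]}'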

Generating filebeat custom fields

女生的网名这么多〃 Submitted on 2020-12-05 08:17:49
Question: I have an Elasticsearch cluster (ELK) and some nodes sending logs to Logstash using Filebeat. All the servers in my environment run CentOS 6.5. The filebeat.yml file on each server is enforced by a Puppet module (both my production and test servers get the same configuration). I want each document to have a field that tells whether it came from a production or a test server, so I wanted to generate a dynamic custom field in every document that indicates the environment (production/test) using
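A sketch of one common approach: declare a static custom field in filebeat.yml and let the Puppet template substitute the value per server (the field name "environment" and the value are illustrative; depending on the Filebeat version, fields may need to sit under each prospector/input rather than at the top level):

# fragment of filebeat.yml as rendered by the Puppet template
fields:
  environment: production   # the template would emit "production" or "test" per node
fields_under_root: true     # expose "environment" at the top level of each document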