I have the following scenario:
FileBeat ----> Kafka -----> Logstash -----> Elastic ----> Kibana
In Filebeat I have 2 prospectors defined in the YML file, and I add some fields to identify the log data. The issue is that in Logstash I haven't been able to match these fields in a conditional.
The configuration files are:
1. filebeat.yml
```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /opt/jboss/server.log*
    tags: ["log_server"]
    fields:
      environment: integracion
      log_type: log_server
    document_type: log_server
    fields_under_root: true
  - input_type: log
    paths:
      - /var/todo1_apps/ebanTX.log*
    tags: ["log_eban"]
    fields:
      environment: integracion
      log_type: log_ebanking
    document_type: log_ebanking
    fields_under_root: true

output.kafka:
  enabled: true
  hosts: ["192.168.105.68:9092"]
  topic: "sve_logs"
  timeout: 30s
```
2. logstash.conf
```
input {
  kafka {
    bootstrap_servers => "192.xxx.xxx.xxx:9092"
    group_id => "sve_banistmo"
    topics => ["sve_logs"]
    decorate_events => true
    codec => "plain"
  }
}

filter {
  if [type] == "log_ebanking" {
    grok {
      patterns_dir => ["patterns/patterns"]
      match => { "message" => "%{TIMESTAMP_ISO8601:logdate}%{SPACE}%{LOGLEVEL:level}%{SPACE}\[%{DATA:thread}]%{SPACE}-%{SPACE}%{GREEDYDATA:message_log}" }
    }
  }
}

output {
  if [type] == "log_ebanking" {
    elasticsearch {
      hosts => ["192.168.105.67:9200"]
      index => "sve-banistmo-ebanking-%{+YYYY.MM.dd}"
    }
    stdout { codec => json }
  }
}
```
The problem is in the conditionals in the filter and output sections. I've tried:

- `[@metadata][type]`
- `@metadata][type]`
- `@metadata.type`
- `metadata.type`
- `[type]`

with both the `type` and `log_type` fields, and nothing works. If I remove the conditionals, the data flows through without problems, so it is not a connection issue.
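For reference, this is the shape I would expect the working configuration to take. This is only a sketch based on my own assumption that the events arriving from Kafka are Filebeat's JSON payloads, which `codec => "plain"` would leave serialized inside the `message` field instead of expanding into event fields:

```
input {
  kafka {
    bootstrap_servers => "192.xxx.xxx.xxx:9092"
    topics => ["sve_logs"]
    # assumption: parse the Filebeat JSON so its fields become top-level event fields
    codec => "json"
  }
}

filter {
  # with fields_under_root: true, the custom field should sit at the top level,
  # so the reference would be [log_type] rather than [fields][log_type]
  if [log_type] == "log_ebanking" {
    # grok filter here
  }
}
```

I am not sure whether the codec or the field reference (or both) is the actual culprit, which is why I am asking.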
Please help me. I've reviewed all the related documentation, but in my case the conditional doesn't work.
Thanks in advance,
Dario R