Logstash pipeline not working with csvfile

Posted by 家住魔仙堡 on 2021-02-08 11:33:50

Question


I set it up like below:

wget https://artifacts.elastic.co/downloads/logstash/logstash-6.6.2.deb
sudo dpkg -i logstash-6.6.2.deb
sudo systemctl enable logstash.service
sudo systemctl start logstash.service

and I added a pipeline config like below:

input {
        file {
                path => "/root/dev/Intuseer-PaaS/backend/airound_sv_logs.log"
                start_position => "beginning"
        }
}
output {
        stdout {}
        file {
                path => "/root/dev/output/output-%{+YYYY-MM-dd}.log"
        }
}

The log file looks like this:

timestamp, server_cpu, server_memory_used, server_memory_free, process_cpu, process_memory
1582787287, 1, 1176, 2759, 0, 9.05
1582787288, 1, 1176, 2759, 1, 8.97
1582787289, 2, 1176, 2759, 0, 9.04
1582787290, 1, 1177, 2758, 0, 8.98
1582787291, 0, 1176, 2759, 1, 9.04
1582787292, 1, 1176, 2759, 0, 8.96
1582787293, 1, 1177, 2758, 0, 9.03
1582787294, 1, 1176, 2759, 1, 9.08
1582787295, 0, 1177, 2758, 0, 9.02
1582787296, 1, 1176, 2759, 1, 9.05

I've tried many times to get this log into the local output directory. I checked the status of Logstash, but nothing changed after the steps below, and the output-%{+YYYY-MM-dd}.log file was never created.

The result of $ systemctl status logstash.service:

Please help me, thank you. I'm trying this with Logstash 6.6.2.


Answer 1:


So, in summary, everything was working, but since you didn't see anything in the log you assumed it wasn't working. Adding --debug confirmed that everything was working properly.
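For reference, a minimal way to repeat that check is to run Logstash in the foreground against your pipeline file with debug logging (on 6.x the equivalent of --debug is --log.level debug; the config path below is just a placeholder for wherever your pipeline file lives):

# run the pipeline in the foreground with debug-level logging
sudo /usr/share/logstash/bin/logstash \
  -f /etc/logstash/conf.d/csv-pipeline.conf \
  --log.level debug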

A few notes, though:

  • Don't forget to add sincedb_path, otherwise you run the risk of not being able to reprocess your file repeatedly.
  • Try not to do stuff in /root, as the user under which Logstash runs might not always have the rights to read/write that folder.
  • Leverage the csv filter in order to parse your rows (see the sketch after this list).
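
Putting those three points together, a sketch of the pipeline could look like the following. The paths are placeholders (anywhere outside /root), the column names are taken from the header row of your log, and sincedb_path => "/dev/null" is handy while testing because it makes Logstash re-read the file on every run:

input {
        file {
                path => "/home/ubuntu/logs/airound_sv_logs.log"    # placeholder path outside /root
                start_position => "beginning"
                sincedb_path => "/dev/null"                        # re-read the file on every run (testing only)
        }
}
filter {
        csv {
                separator => ","
                columns => ["timestamp", "server_cpu", "server_memory_used", "server_memory_free", "process_cpu", "process_memory"]
        }
}
output {
        stdout {}
        file {
                path => "/home/ubuntu/output/output-%{+YYYY-MM-dd}.log"    # placeholder output path
        }
}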


Source: https://stackoverflow.com/questions/60428021/logstash-pipeline-not-working-with-csvfile
