Creating Index based on pattern matching in logstash


Question


I'm trying to build a centralised logging system for a group of Windows and Linux servers using Elasticsearch, Logstash, and Kibana. The input is syslog from both kinds of systems (a single input stream). I'm trying to understand whether there is a way to use grok to match a pattern and, based on the match, route the logs to different indices (one for Windows logs and one for Linux logs).

Any pointers in the right direction would be appreciated.

Thanks,


Answer 1:


You can assign a 'type' based on which system the logs come from, and then use that type in the output.

Below is the configuration:

input {
  file {
    path => "path/to/system1_log_file"
    type => "sys1logs"
  }

  file {
    path => "path/to/system2_log_files"
    type => "sys2logs"
  }
}

output {
  if [type] == "sys1logs" {
    # output to the sys1 index
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "sys1"
    }
  }

  if [type] == "sys2logs" {
    # output to the sys2 index
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "sys2"
    }
  }
}
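
The question mentions a single syslog stream rather than separate log files, so the same idea can be applied after a syslog input: tag each event in the filter stage, then route on that tag in the output. Below is a minimal sketch; the port, the [program] test, and the index naming are illustrative assumptions, not part of the original answer (which field actually distinguishes the two platforms depends on how the Windows machines ship their syslog):

input {
  # One listener receives syslog from both Windows and Linux hosts
  syslog {
    port => 5514
  }
}

filter {
  # Illustrative check only: pick whichever field reliably
  # distinguishes Windows events in your setup.
  if [program] =~ /^Microsoft/ {
    mutate { add_field => { "log_source" => "windows" } }
  } else {
    mutate { add_field => { "log_source" => "linux" } }
  }
}

output {
  # sprintf the field into the index name instead of writing
  # two conditional elasticsearch blocks
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{log_source}-logs-%{+YYYY.MM.dd}"
  }
}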



Answer 2:


You can grok{} the events and add a field when the grok matches:

grok {
    match => { "message" => "foo %{WORD}" }
    add_field => { "send_to" => "fooindex" }
}

or take the value directly from a field captured in the pattern:

grok {
    match => { "message" => "foo %{WORD:send_to} bar" }
}

And then use %{send_to} in your output:

elasticsearch {
    index => "%{send_to}-%{+YYYY.MM.dd}"
}

Note that if your indices aren't named "logstash-*", you won't get the default index template (and thus the default mapping) that Logstash provides.
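
Putting the pieces of this answer together, a minimal end-to-end sketch might look like the following; the pattern, the send_to field name, and the fallback index are illustrative assumptions:

filter {
  grok {
    # Events that do not match are tagged _grokparsefailure
    match => { "message" => "foo %{WORD:send_to} bar" }
  }
}

output {
  if "_grokparsefailure" in [tags] {
    # Catch-all index so unmatched events are not lost
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "unmatched-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "%{send_to}-%{+YYYY.MM.dd}"
    }
  }
}

Without the fallback branch, events whose message never matches the grok pattern would be indexed with a literal "%{send_to}" in the index name.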



Source: https://stackoverflow.com/questions/31277641/creating-index-based-on-pattern-matching-in-logstash
