Logstash: Merge two logs into one output document

Asked by 天命终不由人 on 2021-01-12 17:09 · 1 answer · 1160 views

I have set up syslog to send logs to Logstash, with the following output configuration:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    document_id => "
1 Answer
  • 2021-01-12 17:48

    You can use the aggregate filter to do this. The aggregate filter can merge several log lines into a single event based on a common field value. In your case, the common field would be the job_id field.

    Then we need another field to detect the first event vs the second event that should be aggregated. In your case, this would be the state field.

    So you simply need to add another filter to your existing Logstash configuration, like this:

    filter {
        # ... your other filters

        if [state] == "processing" {
            aggregate {
                task_id => "%{job_id}"
            }
        } else if [state] == "failed" {
            aggregate {
                task_id => "%{job_id}"
                end_of_task => true
                timeout => 120
            }
        }
    }
    

    You are free to adjust the timeout (in seconds) depending on how long your jobs are running.
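    If you also want fields from the first event to be copied onto the final merged document, the aggregate filter's code option lets you stash values in the shared map and read them back on the closing event. A minimal sketch, assuming the first event carries a @timestamp you want to keep as a hypothetical started_at field:

    filter {
        if [state] == "processing" {
            aggregate {
                task_id => "%{job_id}"
                # store fields from the first event in the shared map
                code => "map['started_at'] = event.get('@timestamp')"
                map_action => "create"
            }
        } else if [state] == "failed" {
            aggregate {
                task_id => "%{job_id}"
                # copy the stored value onto the final event
                code => "event.set('started_at', map['started_at'])"
                map_action => "update"
                end_of_task => true
                timeout => 120
            }
        }
    }

    The started_at field name is only an example; replace it with whichever fields from the first event you actually need in the merged output.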
