Logstash – Multiple Kafka Config In A Single File

Kafka is a great tool for collecting logs from various environments and building central logging. Sometimes you need more than one Kafka input and output to ship those logs to the ELK stack.

Here is the basic concept of the log flow: applications ship logs to Kafka, Logstash consumes and processes them, and Elasticsearch stores them so Kibana can visualize the result.

Logstash parses the logs and makes sense of them so they can be analyzed and stored. To send logs to the ELK stack, we deal with three sections in the pipeline configuration (a minimal skeleton follows the list):

Input
Filter
Output
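
As a minimal sketch, every Logstash pipeline file follows the same shape; the comments just indicate what typically goes in each section:

input {
  # where events come from (kafka, beats, file, ...)
}

filter {
  # how events are parsed and transformed (dissect, grok, mutate, ...)
}

output {
  # where events end up (elasticsearch, stdout, ...)
}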

For multiple inputs, we can use "tags" to separate where logs come from:

input {

  kafka {
    codec => "json"
    bootstrap_servers => "172.16.1.15:9092"
    topics => ["APP1_logs"]
    tags => ["app1logs"]
  }

  kafka {
    codec => "json"
    bootstrap_servers => "172.16.1.25:9094"
    topics => ["APP2_logs"]
    tags => ["app2logs"]
  }

}
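
One caveat: every kafka input uses the default consumer group_id of "logstash", so when several inputs consume from the same cluster it is safer to give each one its own group_id and client_id. A sketch, with made-up names:

kafka {
  codec => "json"
  bootstrap_servers => "172.16.1.15:9092"
  topics => ["APP1_logs"]
  tags => ["app1logs"]
  group_id => "logstash_app1"   # hypothetical name; the default for every kafka input is "logstash"
  client_id => "logstash_app1"  # hypothetical name; makes this consumer identifiable in Kafka
}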

Then filter them according to your requirements. I also used the "mutate" filter's gsub to strip double quotes from the logs; note that filters run in order, so the mutate has to come before dissect removes the message field:

#FILTER1

filter {

  if "app1logs" in [tags] {

    # Strip double quotes from the raw message before dissecting it.
    mutate {
      gsub => ["message", '"', ""]
    }

    dissect {
      mapping => {
        "message" => "%{field1} %{field2} %{field3}"
      }
      remove_field => ["message"]
    }

  }
}
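
For example, given a made-up APP1 message like the one below, dissect splits on the spaces and produces three new fields:

Input message:  2023-04-01 INFO started
After dissect:  field1 => "2023-04-01", field2 => "INFO", field3 => "started"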


#FILTER2

filter {

  if "app2logs" in [tags] {

    mutate {
      gsub => ["message", '"', ""]
    }

    dissect {
      mapping => {
        "message" => "%{field1} %{field2} %{field3}"
      }
      remove_field => ["message"]
    }

  }
}
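
The two filter blocks above are kept separate for readability, but since Logstash merges all filter sections into one pipeline anyway, they could also be written as a single block with an else if:

filter {
  if "app1logs" in [tags] {
    # app1 mutate + dissect from FILTER1 here
  } else if "app2logs" in [tags] {
    # app2 mutate + dissect from FILTER2 here
  }
}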

In the last section, here is how multiple outputs send logs to Elasticsearch (and on to Kibana):

output {

  if "app1logs" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      user => "elastic"
      password => "xxx"
      index => "app1logs"
    }
    stdout { codec => rubydebug }
  }

  if "app2logs" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      user => "elastic"
      password => "xxx"
      index => "app2logs"
    }
    stdout { codec => rubydebug }
  }

}
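
Before starting the pipeline, it's worth validating the combined file. The path below is just an example; adjust it to wherever your config lives:

bin/logstash -f /etc/logstash/conf.d/kafka-pipeline.conf --config.test_and_exit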

Hope this helps anyone who needs multiple configurations in a single Logstash file.