Monitor Kong API Logs Using ELK


We are using ELK (Elasticsearch, Logstash, Kibana) version 8.x to collect logs from Kong API Gateway version 2.8 using the tcp-log plugin.

We have configured the tcp-log plugin to send the logs to Logstash, which then forwards them to Elasticsearch.

Kong TCP-Logs Plugin -> Logstash -> Elasticsearch

I would appreciate your help clarifying the following:

How can I display the Kong API Gateway logs in Kibana? Where should I start? Is an index for the Kong logs created by default in Elasticsearch? Which Elasticsearch index pattern do I need to use to get the Kong API logs?

Note: I am not using the Filebeat agent on the Kong API nodes. I am using the tcp-log plugin to send the Kong logs to Logstash.
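For context, the tcp-log plugin can be enabled on a Kong service through the Admin API. This is only a sketch: the service name `my-service`, the Logstash hostname, and the port are placeholders for whatever your environment actually uses; the port must match the port your Logstash input listens on.

```shell
# Enable Kong's tcp-log plugin on one service via the Admin API.
# "my-service", "logstash.example.com", and the port are assumptions --
# replace them with your own service name, Logstash host, and port.
curl -X POST http://localhost:8001/services/my-service/plugins \
  --data "name=tcp-log" \
  --data "config.host=logstash.example.com" \
  --data "config.port=5044"
```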

The content of /etc/logstash/conf.d/beats.conf:

input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["Elastic_IP_Address:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

Thanks so much for your support!

CodePudding user response:

To fix this issue, set index => "transaction" in the /etc/logstash/conf.d/beats.conf configuration file.

Then use the transaction index pattern to display the logs in Kibana.

input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["Elastic_IP_Address:9200"]
    index => "transaction"
  }
}
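One caveat worth noting: the config above uses a beats input, but Kong's tcp-log plugin ships each log entry as a JSON object over a raw TCP connection, not the Beats protocol. A tcp input with a json codec may therefore be a better match for what the plugin actually sends. This is only a sketch under that assumption; the port (5045) and the index name are placeholders to adjust to your setup.

```conf
# Sketch: receive Kong tcp-log output directly over TCP.
# The tcp-log plugin sends one JSON document per log entry, so the
# json codec parses each entry into structured fields automatically.
input {
  tcp {
    port => 5045        # must match config.port set on the tcp-log plugin
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["Elastic_IP_Address:9200"]
    index => "transaction"
  }
}
```

With the entries indexed this way, creating a data view (index pattern) named `transaction` in Kibana should make the Kong logs browsable in Discover.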