Timeout reached in KV filter with value entry too large


I'm trying to build a new ELK project. I'm a newbie here, so I'm not sure what I'm missing. I'm trying to ship some very large logs into ELK, and while doing so the pipeline times out in the kv filter with the error "Timeout reached in KV filter with value entry too large".

My Logstash filter is in the format below:

   grok {
     match => [ "message", "(?<timestamp>%{MONTHDAY:monthday} %{MONTH:month} %{YEAR:year} %{TIME:time}) \[%{LOGLEVEL:loglevel}\] %{DATA:requestId} \(%{DATA:thread}\) %{JAVAFILE:className}: %{GREEDYDATA:logMessage}" ]
   }
   kv {
     source => "logMessage"
   }

Is there a way to skip the kv filter when the logs are huge? If so, can someone guide me on how that can be done?
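One possible approach, sketched below (untested; the 10 KB threshold and the [@metadata][skip_kv] field name are placeholders, not from any real config): use a ruby filter to mark events whose logMessage is oversized, then wrap kv in a conditional so it only runs on unmarked events. Newer versions of the kv filter also expose a timeout_millis option, if raising the timeout is preferable to skipping.

   ruby {
     # Mark events whose logMessage is too large for kv to parse in time.
     # The 10 KB threshold is an arbitrary placeholder; tune it to your logs.
     code => "event.set('[@metadata][skip_kv]', true) if event.get('logMessage').to_s.bytesize > 10_000"
   }
   # Only run kv on events that were not marked as oversized.
   if ![@metadata][skip_kv] {
     kv {
       source => "logMessage"
     }
   }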

Thank you

I have tried multiple things but nothing seemed to work.

CodePudding user response:

I solved this by using dissect. Unlike grok and kv, dissect splits the message on fixed delimiters instead of running regular expressions, so it is much cheaper on large messages and does not hit this timeout.

The filter was something along the lines of:

   dissect {
     mapping => {
       "message" => "%{[@metadata][timestamp]} %{+[@metadata][timestamp]} %{+[@metadata][timestamp]} %{+[@metadata][timestamp]} [%{loglevel}] %{requestId} (%{thread}) %{classname}: %{logMessage}"
     }
   }
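As a follow-up (my addition, not part of the original answer), the timestamp that dissect reassembles in [@metadata][timestamp] could then be parsed into @timestamp with a date filter. The "dd MMM yyyy HH:mm:ss,SSS" format string is a guess based on the grok pattern in the question and would need adjusting to the actual logs:

   date {
     # Parse the timestamp reassembled by dissect into @timestamp.
     # The format string is an assumption; adjust it to match the real log lines.
     match => [ "[@metadata][timestamp]", "dd MMM yyyy HH:mm:ss,SSS" ]
   }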