How to use a nested JSON field as the Elasticsearch doc in Logstash


Say the event looks like this:

{
  "name": "xxx", 
  "data": {
    "a": xxx
  }
}

With Logstash, how can I use just the inner data field as the document source sent to Elasticsearch, like:

{
  "a": xxx
}

Any response would be appreciated!


I tried to use the json filter:

filter {
  json {
    source => "data"
  }
}

but it seems the field is already parsed as JSON, and the terminal just prints this error message:

Error parsing json {:source=>"data", :raw=>{"a"=>xxx}, :exception=>java.lang.ClassCastException: org.jruby.RubyHash cannot be cast to org.jruby.RubyIO}
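
From the error, the json filter only works on a field that holds a JSON string, while data here is already a parsed hash. For comparison, a minimal sketch of the case the json filter is designed for, assuming (my assumption, not from the original event) that the raw JSON arrives as text in the message field:

filter {
  json {
    # "message" holds an unparsed JSON string, e.g. from a plain-text input
    source => "message"
  }
}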

CodePudding user response:

FYI, I found an answer that works: https://discuss.elastic.co/t/move-subarrays-to-document-root/143876

Just use Ruby code to move the nested fields to the document root, then remove all the other fields:

  # copy every key/value pair from the nested "data" hash to the event root
  ruby {
    code => 'event.get("data").each { |k, v| event.set(k, v) }'
  }

  # drop the original top-level fields so only the promoted keys remain
  mutate {
    remove_field => [ "name", "data" ]
  }
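
For context, a fuller sketch of the same approach with a nil check, so events that lack a data field pass through without a Ruby exception. The stdin/stdout plugins here are just for local testing and are my addition, not part of the original answer:

input { stdin { codec => json } }

filter {
  ruby {
    code => '
      data = event.get("data")
      # promote nested keys only when "data" is present
      data.each { |k, v| event.set(k, v) } unless data.nil?
    '
  }
  mutate {
    remove_field => [ "name", "data" ]
  }
}

output { stdout { codec => rubydebug } }

Piping a sample event (with a concrete value in place of the xxx placeholder) through the pipeline, e.g. echo '{"name":"xxx","data":{"a":1}}' | bin/logstash -f test.conf, should print an event whose root contains only a, plus Logstash's own @timestamp and @version metadata.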