Mutate data in Logstash with nested JSON

Below is a sample log entry (JSON) that I am working on:

{
    "Order": {
        "content": {
            "seqnum": "107",
            "type": "DAIRY",
            "section": "A1",
            "time": "2022-03-02T14:21:45",
            "version": "24",
            "src": "EAST",
            "status": "3"
        },
        "crc": {
            "crcvalue": "45BD2E93"
        }
    }
}

Below is my Logstash filter.

filter {
        json {
                source => "message"
        }

        mutate { add_field => { "Order_Version" => "%{Order.content.version}" } }
        mutate { add_field => { "Order_State" => "%{Order.content.status}" } }
        mutate { convert => { "Order_Version" => "integer" } }
        mutate { convert => { "Order_State" => "integer" } }

        if [Order_State] == 3 {
            mutate {
                add_field => { "Order_State" => "Processing" }
            }
        }
        if [Order_State] == 1 {
            mutate {
                add_field => { "Order_State" => "Shipped" }
            }
        }
        if [Order_State] == 4 {
            mutate {
                add_field => { "Order_State" => "Pending" }
            }
        }
        mutate { remove_field => ["message", "@timestamp", "path", "host", "@version", "Order.content.version", "Order.content.status", "Order.content.seqnum", "crcvalue"] }
}

When I apply the above filter, all the data is parsed into fields properly, but the mutate steps do not behave as expected: the "if" conditions never add the new fields, and the values of the "version" and "status" fields are not copied into the new fields. The new fields are created, but both contain just "0", maybe because I am converting them to integer. Also, "remove_field" works only on the generated metadata fields, not on the JSON message fields. Below is the data from Elasticsearch:

# Data in Elasticsearch
{
  "_index" : "order-sample-1",
  "_type" : "_doc",
  "_id" : "uXh-a38ByXB23tWqi2r3",
  "_score" : 1.0,
  "_source" : {
    "Order" : {
      "content" : {
        "section" : "A1",
        "src" : "EAST",
        "status" : "3",
        "seqnum" : "107",
        "time" : "2022-03-07T14:21:45",
        "version" : "24"
      },
      "crc" : {
        "crcvalue" : "45BD2E93"
      }
    },
    "Order_State" : 0,
    "Order_Version" : 0
  }
}

# Fields in Kibana

Order.content.seqnum: 107
Order.content.type: DAIRY
Order.content.section: A1
Order.content.time: 2022-03-02T14:21:45
Order.content.version: 24
Order.content.src: EAST
Order.content.status: 3
Order.crc.crcvalue: 45BD2E93

How can I copy the contents of the JSON fields into new fields? Or should I just use gsub to substitute the contents of the fields?

All I am trying to achieve is the output below, with the rest of the fields removed.

section: A1
time: 2022-03-02T14:21:45
src: EAST
Order_Version: 24
Order_State: Processing

Any help is appreciated. Thank you.

CodePudding user response:

For an object { "Order" : { "content" : { "section" : "A1" } } }, Elasticsearch and Kibana reference the nested field as Order.content.section.

Logstash, however, uses the syntax [Order][content][section]. The advantage of this is that if you have another object { "Order" : { "content.section" : "A1" } }, Logstash can unambiguously distinguish between [Order][content][section] and [Order][content.section].

So in the mutate filters use "%{[Order][content][version]}", "%{[Order][content][status]}", and so on.
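Applied to the filter from the question, that looks roughly like the sketch below. Two extra assumptions beyond the syntax fix: "replace" is used instead of "add_field" for the state labels, because "add_field" on a field that already exists appends to it (producing an array) rather than overwriting it, and the whole [Order][crc] object is dropped in "remove_field" in place of the bare "crcvalue" entry.

filter {
        json {
                source => "message"
        }

        # Bracket syntax inside the %{...} sprintf references
        mutate { add_field => { "Order_Version" => "%{[Order][content][version]}" } }
        mutate { add_field => { "Order_State" => "%{[Order][content][status]}" } }
        mutate { convert => { "Order_Version" => "integer" } }
        mutate { convert => { "Order_State" => "integer" } }

        # replace overwrites the existing value; add_field would append to it
        if [Order_State] == 3 {
            mutate { replace => { "Order_State" => "Processing" } }
        }
        if [Order_State] == 1 {
            mutate { replace => { "Order_State" => "Shipped" } }
        }
        if [Order_State] == 4 {
            mutate { replace => { "Order_State" => "Pending" } }
        }

        # Nested fields in remove_field need the bracket syntax as well
        mutate { remove_field => ["message", "@timestamp", "path", "host", "@version", "[Order][content][version]", "[Order][content][status]", "[Order][content][seqnum]", "[Order][crc]"] }
}

You can check the result with a stdout { codec => rubydebug } output before sending the events back to Elasticsearch.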
