Elasticsearch: add JSON in historic indices


We are currently importing data using Logstash. One of the fields ("request") is a JSON object stored as a string. We now need a section of this stored JSON available as fields in the searchable index. I have updated the Logstash filter using:

filter {
    json {
        source => "request"
        target => "[@metadata][request_json]"
    }
    if [@metadata][request_json][merchant] {
        # in the Request, pull out the Merchant-ID
        mutate {
            add_field => {
                "merchant_id" => "%{[@metadata][request_json][merchant][id]}"
                "merchant_name" => "%{[@metadata][request_json][merchant][name]}"
            }
        }
    }
}

This works great for new data.

How can I update the indices for the historic data? I'm using Elasticsearch, Logstash, and Kibana 8.5.3.

CodePudding user response:

@Steven I would suggest creating an ingest pipeline that performs the same field-extraction operation, using processors to set those values on the document.

You can then run update_by_query on your old indexes using this pipeline. See: https://www.elastic.co/guide/en/elasticsearch/reference/8.5/docs-update-by-query.html#docs-update-by-query-api-ingest-pipeline
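A sketch of what that pipeline could look like, mirroring the Logstash filter above. The pipeline name `extract_merchant`, the temporary field `_tmp_request`, and the index name `my-old-index` are placeholders; adjust them to your setup. The `json` processor parses the stored string, the `set` processors with `copy_from` lift out the merchant fields, and a final `remove` drops the temporary object:

```
PUT _ingest/pipeline/extract_merchant
{
  "processors": [
    {
      "json": {
        "field": "request",
        "target_field": "_tmp_request",
        "ignore_failure": true
      }
    },
    {
      "set": {
        "field": "merchant_id",
        "copy_from": "_tmp_request.merchant.id",
        "ignore_empty_value": true
      }
    },
    {
      "set": {
        "field": "merchant_name",
        "copy_from": "_tmp_request.merchant.name",
        "ignore_empty_value": true
      }
    },
    {
      "remove": {
        "field": "_tmp_request",
        "ignore_missing": true
      }
    }
  ]
}

POST my-old-index/_update_by_query?pipeline=extract_merchant
```

Note that `_update_by_query` rewrites every matching document, so it's worth testing the pipeline first with the `_ingest/pipeline/extract_merchant/_simulate` endpoint against a few sample documents.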
