failed to parse field [duration] of type [long] in document


In a Kubernetes cluster, when pushing Jaeger tracing data to Elasticsearch, I am getting the error below in the jaeger-tracing-collector. I do not have any filter like Logstash/Filebeat in between; the Jaeger data is pushed directly into Elasticsearch. I am very new to Elasticsearch and related things, so any help would be greatly appreciated.

Thanks in advance...

Error:

{
  "level": "error",
  "ts": 1656982524.1294773,
  "caller": "config/config.go:137",
  "msg": "Elasticsearch part of bulk request failed",
  "map-key": "index",
  "response": {
    "_index": "jaeger-span-2022-07-05",
    "_type": "_doc",
    "_id": "9EHay4EBv4T2qdA80Ei7",
    "status": 400,
    "error": {
      "type": "mapper_parsing_exception",
      "reason": "failed to parse field [duration] of type [long] in document with id '9EHay4EBv4T2qdA80Ei7'. Preview of field's value: '18446744073709550616'",
      "caused_by": {
        "reason": "Numeric value (18446744073709550616) out of range of long (-9223372036854775808 - 9223372036854775807)n at [Source: (ByteArrayInputStream); line: 1, column: 199]",
        "type": "i_o_exception"
      }
    }
  },
  "stacktrace": "github.com/jaegertracing/jaeger/pkg/es/config.(*Configuration).NewClient.func2\\n\\tgithub.com/jaegertracing/jaeger/pkg/es/config/config.go:137\\ngithub.com/olivere/elastic.(*bulkWorker).commit\\n\\tgithub.com/olivere/[email protected] incompatible/bulk_processor.go:588\\ngithub.com/olivere/elastic.(*bulkWorker).work\\ntgithub.com/olivere/[email protected] incompatible/bulk_processor.go:501"
}

Jaeger version: 1.21.0, Elasticsearch version: 7.17.5

CodePudding user response:

TL;DR

The value is out of the range of the long type, as the error message says:

"reason": "Numeric value (18446744073709550616) out of range of long (-9223372036854775808 - 9223372036854775807)n at [Source: (ByteArrayInputStream); line: 1, column: 199]"

Solution

As @Val mentioned in the comments, sharing your mapping would help.
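
You can retrieve the current mapping of the failing index (using the index name from the error response) like this:

GET jaeger-span-2022-07-05/_mapping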

First of all, to store the value 18446744073709550616 you need to choose a field type from this list:

  • float
  • double
  • unsigned_long

Note that unsigned_long (available since Elasticsearch 7.10, so usable with your 7.17.5) can hold this value exactly, since its range goes up to 18446744073709551615, while float and double will store it with rounding.

Then when creating your index, you set the type for the field:

PUT 74388060/
{
  "mappings": {
    "properties": {
      "big_number": {
        "type": "float"
      }
    }
  }
}

and then

POST /74388060/_doc/
{
  "big_number": "18446744073709550616"
}

And you should be out of the woods.
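
If you need the exact integer value, for example because duration is a count of microseconds, the same sketch with unsigned_long instead of float keeps full precision (74388060_exact is just an illustrative index name):

PUT 74388060_exact/
{
  "mappings": {
    "properties": {
      "big_number": {
        "type": "unsigned_long"
      }
    }
  }
}

POST /74388060_exact/_doc/
{
  "big_number": "18446744073709550616"
}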
