When trying to index in Elasticsearch 7.8.1, an error occurs saying "field" is too large


When trying to index into Elasticsearch 7.8.1, an error occurs saying "testField" is too large, must be <= 32766. Is there a solution?

Field Info

"testField":{ "type": "keyword", "index": false }

CodePudding user response:

This is a known issue, and there is no clear consensus yet on the best way to solve it. Lucene enforces a maximum term length of 32766 bytes, beyond which the document is rejected.

Until this gets solved, there are two immediate options you can choose from:

A. Use a script ingest processor to truncate the value so it stays within the 32766-byte limit.

PUT _ingest/pipeline/truncate-pipeline
{
  "description": "truncate",
  "processors": [
    {
      "script": {
        "source": """
          // Guard against shorter values: substring throws if the end index
          // exceeds the string length. The Lucene limit is in bytes, so
          // multi-byte text may need an even smaller cutoff.
          if (ctx.testField != null && ctx.testField.length() > 32766) {
            ctx.testField = ctx.testField.substring(0, 32766);
          }
        """
      }
    }
  ]
}

PUT my-index/_doc/123?pipeline=truncate-pipeline
{ "testField": "hgvuvhv....sjdhbcsdc" }

B. Use a text field with an appropriate analyzer that would truncate the value, but you'd lose the ability to aggregate and sort on that field.
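For option B, a possible sketch of such a mapping, assuming a hypothetical index my-index and using the built-in truncate token filter on top of the keyword tokenizer. Note the filter's length is counted in characters while the Lucene limit is in bytes, so a lower value may be needed for multi-byte text:

PUT my-index
{
  "settings": {
    "analysis": {
      "filter": {
        "truncate_32k": {
          "type": "truncate",
          "length": 32766
        }
      },
      "analyzer": {
        "truncating_keyword": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [ "truncate_32k" ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "testField": {
        "type": "text",
        "analyzer": "truncating_keyword"
      }
    }
  }
}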

If you want to keep your field as a keyword, I'd go with option A.
