I was writing the mapping to insert data into an Elasticsearch index, but I got the following error.
elasticsearch.BadRequestError: BadRequestError(400, 'mapper_parsing_exception', 'not_x_content_exception: Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes')
mapper = {
    "mappings": {
        "event_info": {
            "type": "nested",
            "properties": {
                "type_info": {"type": "text"},
                "op_type": {"type": "text"},
                "file_name": {"type": "text"},
                "file_ext": {"type": "text"},
                "process_id": {"type": "text"},
                "time_stamp": {"type": "text"}
            }
        }
    }
}
data = [{'event_info': [{'type_info': 'INFO', 'op_type': 'WRITE', 'file_name': '0.txt', 'file_ext': '.txt', 'process_id': '1234', 'time_stamp': '2022-10-17 05:23:06.8620427 0000 UTC'}]}]
I need to create the correct mapping to insert this data. Any help would be appreciated.
CodePudding user response:
With Elasticsearch 8.5.0, this works for me. The field definitions need to sit under a top-level "properties" key inside "mappings"; in your mapping, "event_info" is placed directly under "mappings", so the parser rejects it.
Mapping:
{
"mappings": {
"properties": {
"event_info": {
"type": "nested",
"properties": {
"type_info": {"type": "text"},
"op_type": {"type": "text"},
"file_name": {"type": "text"},
"file_ext": {"type": "text"},
"process_id": {"type": "text"},
"time_stamp": {"type": "text"}
}
}
}
}
}
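To tie this back to the Python client in your traceback, here is a minimal sketch of creating the index with the corrected mapping and indexing one document. The index name "events" and the localhost URL are just placeholders; the client calls are commented out so the snippet runs without a live cluster, but they reflect the 8.x `elasticsearch` client API.

```python
import json

# Corrected mapping: field definitions live under "mappings" -> "properties".
mapping = {
    "mappings": {
        "properties": {
            "event_info": {
                "type": "nested",
                "properties": {
                    "type_info": {"type": "text"},
                    "op_type": {"type": "text"},
                    "file_name": {"type": "text"},
                    "file_ext": {"type": "text"},
                    "process_id": {"type": "text"},
                    "time_stamp": {"type": "text"},
                },
            }
        }
    }
}

# Document shaped to match the mapping above.
doc = {
    "event_info": {
        "type_info": "INFO",
        "op_type": "WRITE",
        "file_name": "0.txt",
        "file_ext": ".txt",
        "process_id": "1234",
        "time_stamp": "2022-10-17 05:23:06.8620427 0000 UTC",
    }
}

# With the official client ("events" is a placeholder index name):
# from elasticsearch import Elasticsearch
# es = Elasticsearch("http://localhost:9200")
# es.indices.create(index="events", mappings=mapping["mappings"])
# es.index(index="events", document=doc)

# Sanity check: both structures serialize cleanly to JSON.
print(json.dumps(mapping)[:30])
```

Note that in the 8.x client, `indices.create` takes the body of "mappings" via the `mappings=` keyword rather than a raw request body.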
Document format to insert:
{
"event_info" : {
"type_info": "INFO",
"op_type": "WRITE",
"file_name": "0.txt",
"file_ext": ".txt",
"process_id": "1234",
"time_stamp": "2022-10-17 05:23:06.8620427 0000 UTC"
}
}
If you are using the bulk API, you can index the document with:
{"index": {"_index": "your_index"}}
{"event_info": {"type_info": "INFO", "op_type": "WRITE", "file_name": "0.txt","file_ext": ".txt", "process_id": "1234", "time_stamp": "2022-10-17 05:23:06.8620427 0000 UTC"}}
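If you prefer to build that bulk payload from Python, here is a sketch that turns a list of documents into the NDJSON body shown above. "your_index" is a placeholder, and the client call is commented out so the snippet runs offline.

```python
import json

# Source data in the same shape as the question.
data = [
    {
        "event_info": {
            "type_info": "INFO",
            "op_type": "WRITE",
            "file_name": "0.txt",
            "file_ext": ".txt",
            "process_id": "1234",
            "time_stamp": "2022-10-17 05:23:06.8620427 0000 UTC",
        }
    }
]

# Build the NDJSON bulk body: one action line, then one document line, per doc.
lines = []
for doc in data:
    lines.append(json.dumps({"index": {"_index": "your_index"}}))
    lines.append(json.dumps(doc))
bulk_body = "\n".join(lines) + "\n"  # the bulk API requires a trailing newline

# Send it with the official 8.x client:
# from elasticsearch import Elasticsearch
# es = Elasticsearch("http://localhost:9200")
# es.bulk(operations=bulk_body)

print(bulk_body)
```

The `elasticsearch.helpers.bulk` helper can do the action/document pairing for you, but building the body by hand makes the NDJSON format explicit.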