How to provide Field type for HashMap of temporal accessors in Spring Data Elasticsearch

Time:11-04

Given

private Map<String, ZonedDateTime> timePoints = new HashMap<>();

How can I hint the field type to Spring Data Elasticsearch?

When the annotation is put directly on the map property, the converter tries to parse the key and the value together as if they were a date string.

@Field(type = FieldType.Date)
private Map<String, ZonedDateTime> timePoints = new HashMap<>();

When no field type is provided, the following warning appears:

Type .. of property .. is a TemporalAccessor class but has neither a @Field annotation defining the date type nor a registered converter for writing! It will be mapped to a complex object in Elasticsearch!

CodePudding user response:

Putting the annotation on the property like

@Field(type = FieldType.Date)
private Map<String, ZonedDateTime> timePoints = new HashMap<>();

cannot work, because a Map is not a temporal type and thus cannot be converted as such.

If you leave the annotation off, the Map<String, ZonedDateTime> will be interpreted as an object. If, for example, you have

Map<String, ZonedDateTime> map = new HashMap<>();
map.put("utc", ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("UTC")));
map.put("paris", ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("Europe/Paris")));

then, on storing this object, Spring Data Elasticsearch will try to create an object to be sent to Elasticsearch (the JSON representation) that looks like this:

{
  "utc": {
    
  },
  "paris": {
    
  }
}

The inner objects that should represent the temporals are stored as nested objects and not as some converted value, because it is not possible to add a field type to the values of a map - you see the warnings about this in the logs.

But using a Map as a property in Elasticsearch is problematic anyway. The keys are interpreted as properties of a sub-object, and it is not possible to define a mapping for them in the index up front, because it is not known what names these properties might have. In my example they were "utc" and "paris", but they could be any String. Each of these values will be added by Elasticsearch as a dynamically mapped field to the index. This can lead to what is called a mapping explosion, which is why Elasticsearch limits the number of fields in an index to a default of 1000. You might want to rethink the way you store this data in Elasticsearch.
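One possible restructuring, sketched here with made-up names (the TimePoint class and its fields are hypothetical, not from the question): turn each map entry into an object with fixed fields, so the index mapping can be declared up front and no dynamic fields are created.

```java
import java.time.ZonedDateTime;
import java.util.List;

import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

// Hypothetical restructuring: each former map entry becomes a
// fixed-shape object, so the index mapping stays bounded no matter
// how many entries a document carries.
class TimePoint {
    @Field(type = FieldType.Keyword)
    private String name;              // the former map key, now a regular keyword field

    @Field(type = FieldType.Date, format = DateFormat.date_time)
    private ZonedDateTime value;      // the former map value, now mappable as a date
}

// and in the document class:
// @Field(type = FieldType.Nested)
// private List<TimePoint> timePoints;
```

This way "utc" and "paris" become field *values* instead of field *names*, and the mapping has exactly two known fields per entry.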

If you want to stick to a Map, you will need to write a custom converter that is able to convert your Map<String, ZonedDateTime> to a Map<String, String> and back.
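A minimal sketch of the two conversion directions, using plain java.time (the class and method names here are made up; in a real application these would sit inside @WritingConverter/@ReadingConverter classes registered via an ElasticsearchCustomConversions bean):

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;

// Hypothetical helper showing the two conversion directions a
// custom converter pair would implement.
public class TimePointMapConverter {

    private static final DateTimeFormatter FORMAT = DateTimeFormatter.ISO_ZONED_DATE_TIME;

    // Writing direction: Map<String, ZonedDateTime> -> Map<String, String>
    public static Map<String, String> write(Map<String, ZonedDateTime> source) {
        Map<String, String> target = new HashMap<>();
        source.forEach((key, value) -> target.put(key, FORMAT.format(value)));
        return target;
    }

    // Reading direction: Map<String, String> -> Map<String, ZonedDateTime>
    public static Map<String, ZonedDateTime> read(Map<String, String> source) {
        Map<String, ZonedDateTime> target = new HashMap<>();
        source.forEach((key, value) -> target.put(key, ZonedDateTime.parse(value, FORMAT)));
        return target;
    }
}
```

Note that this does not remove the dynamic-mapping concern from the previous paragraph; it only ensures the values are written as parseable date strings.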

CodePudding user response:

I don't know if I have understood your question 100%, but you could try implementing Persistable and adding auditing-annotated properties; these values will be maintained automatically for entities stored in Elasticsearch, as in this example:

@Document(indexName = "person")
public class Person implements Persistable<Long> {

    @Nullable @Id
    private Long id;

    @CreatedDate
    @Nullable @Field(type = FieldType.Date, format = DateFormat.basic_date_time)
    private Instant created;

    @Nullable @CreatedBy
    private String createdBy;

    @LastModifiedDate
    @Nullable @Field(type = FieldType.Date, format = DateFormat.basic_date_time)
    private Instant lastModified;

    @Nullable @LastModifiedBy
    private String lastModifiedBy;

    @Nullable
    public Long getId() { return id; }

    @Override
    public boolean isNew() { return id == null || (createdBy == null && created == null); }

    // other properties, getter, setter...
}

CodePudding user response:

From the documentation, it looks like you might have to create custom converters and register them in a custom ElasticsearchCustomConversions bean. The converters below seem to work:

    @Bean
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(
                Arrays.asList(new ZonedDateTimeWriteConverter(), new ZonedDateTimeReadConverter()));
    }
    
    @WritingConverter
    static class ZonedDateTimeWriteConverter implements Converter<ZonedDateTime, String> {

        @Override
        public String convert(ZonedDateTime source) {
            return DateTimeFormatter.ISO_ZONED_DATE_TIME.format(source);
        }
    }

    @ReadingConverter
    static class ZonedDateTimeReadConverter implements Converter<String, ZonedDateTime> {

        @Override
        public ZonedDateTime convert(String source) {
            return ZonedDateTime.from(DateTimeFormatter.ISO_ZONED_DATE_TIME.parse(source));
        }
    }
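To sanity-check the round-trip these converters rely on, here is a small standalone snippet using only java.time (no Spring needed for this check); it performs the same format/parse calls as the two converter classes above:

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class RoundTripCheck {
    public static void main(String[] args) {
        ZonedDateTime original = ZonedDateTime.parse("2021-11-04T10:15:30+01:00[Europe/Paris]");
        // writing direction, as in ZonedDateTimeWriteConverter
        String written = DateTimeFormatter.ISO_ZONED_DATE_TIME.format(original);
        // reading direction, as in ZonedDateTimeReadConverter
        ZonedDateTime read = ZonedDateTime.from(DateTimeFormatter.ISO_ZONED_DATE_TIME.parse(written));
        System.out.println(written);
        System.out.println(original.equals(read)); // true
    }
}
```

ISO_ZONED_DATE_TIME keeps the zone id in brackets, so the zone (not just the offset) survives the round trip.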