
I'm currently migrating from Spring Data Elasticsearch 3.2.x to 4.0.0.

I'm removing a JacksonEntityMapper that defined a custom ZonedDateTimeDeserializer, in order to use the ElasticsearchEntityMapper.

I have a ZonedDateTime field defined as follows:

    @Field(type = FieldType.Date, format = DateFormat.date_time)
    private final ZonedDateTime loggedIn;

However, the deserialization of this loses the zone information, so that a comparison between the field before and after being stored fails:

before

loggedIn=2020-06-01T09:50:27.389589+01:00[Europe/London]

after

loggedIn=2020-06-01T09:50:27.389+01:00

I expect the zone information to be lost as only the timezone offset is being stored. With the Jackson ZonedDateTimeDeserializer I was able to apply the Zone during the ZonedDateTime construction.
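To illustrate, the loss can be reproduced with plain java.time, independent of Elasticsearch (a minimal sketch; the class name is illustrative):

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class ZoneLossDemo {
    // Round-trip a ZonedDateTime through its ISO offset representation,
    // as the store does: only the offset survives, not the zone id.
    static ZonedDateTime roundTrip(ZonedDateTime value) {
        String stored = value.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        return ZonedDateTime.parse(stored, DateTimeFormatter.ISO_OFFSET_DATE_TIME);
    }

    public static void main(String[] args) {
        ZonedDateTime before = ZonedDateTime.of(2020, 6, 1, 9, 50, 27, 389_000_000,
                ZoneId.of("Europe/London"));
        ZonedDateTime after = roundTrip(before);
        System.out.println(before); // 2020-06-01T09:50:27.389+01:00[Europe/London]
        System.out.println(after);  // 2020-06-01T09:50:27.389+01:00
        // equals() fails because the zone differs (region id vs bare offset) ...
        System.out.println(before.equals(after)); // false
        // ... but the instant is unchanged, so re-attaching the region restores equality
        System.out.println(after.withZoneSameInstant(ZoneId.of("Europe/London")).equals(before)); // true
    }
}
```

The instant itself is never wrong; only the `[Europe/London]` region id is dropped, which is why a zone-aware deserializer could reconstruct it.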

Ideally, I'd like to define a custom date format and converter classes to handle my scenario.

I've tried the following field configuration:

    @Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ")
    private final ZonedDateTime loggedIn;

with the following Reading/WritingConverters:

@WritingConverter
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String>  {

    @Override
    public String convert(ZonedDateTime source) {
        return source.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
    }
}

@ReadingConverter
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime>  {

    @Override
    public ZonedDateTime convert(String source) {
        return ZonedDateTime.parse(source, DateTimeFormatter.ISO_OFFSET_DATE_TIME.withZone(ZoneId.systemDefault()));
    }
}

and configuration

public class ElasticConfiguration extends AbstractElasticsearchConfiguration {

    @Bean
    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(List.of(new ZonedDateTimeToStringConverter(),
                                                          new StringToZonedDateTimeConverter()));
    }
}

However, the reading of the field fails with an exception

Caused by: java.time.DateTimeException: Unable to obtain LocalDate from TemporalAccessor: {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123 of type java.time.format.Parsed
    at java.base/java.time.LocalDate.from(LocalDate.java:396)
    at java.base/java.time.ZonedDateTime.from(ZonedDateTime.java:560)
    at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:109)
    at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:114)
    ...

Looking at the exception, when comparing the parsing against the successful DateFormat.date_time read, I may have an error in the pattern. The TemporalAccessor for the DateFormat.date_time is {OffsetSeconds=3600, InstantSeconds=1597918271},ISO resolved to 2020-08-20T11:11:11.123, whereas my custom pattern parses to {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123

But it also seems that the custom converters I specified aren't being picked up. Note: I have other custom converters specified that are being picked up, so I don't believe it's a configuration issue.

Any help would be appreciated. I'm not sure why the custom pattern fails, but I think I could avoid it if the custom converters were picked up. I can work around the issue for now, but ideally I'd like everything to be consistent before and after the upgrade.

Matt Garner

2 Answers


Don't use `yyyy` in the date pattern; change it to `uuuu` (see the Elasticsearch docs):

    pattern = "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ"

By defining the property as FieldType.Date, a converter is created internally for this property and used; the custom converters aren't needed.
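The difference between the two year letters can be shown with plain java.time (a sketch; STRICT mimics the strict resolution the exception in the question points to):

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.format.ResolverStyle;

public class YearPatternDemo {
    static final String TEXT = "2020-08-20T11:11:11.123456+0100";

    static ZonedDateTime parse(String yearLetters) {
        return ZonedDateTime.parse(TEXT, DateTimeFormatter
                .ofPattern(yearLetters + "-MM-dd'T'HH:mm:ss.SSSSSSZ")
                .withResolverStyle(ResolverStyle.STRICT));
    }

    public static void main(String[] args) {
        // 'uuuu' is the proleptic year and resolves on its own:
        System.out.println(parse("uuuu")); // 2020-08-20T11:11:11.123456+01:00

        // 'yyyy' is year-of-era; with no era field present, strict resolution
        // cannot build a LocalDate, mirroring the exception in the question:
        try {
            parse("yyyy");
        } catch (DateTimeParseException e) {
            System.out.println("yyyy failed: " + e.getCause());
        }
    }
}
```

Under the default SMART resolver `yyyy` would happen to work, which is why the problem only surfaces once the strict converter parses the stored value.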

P.J.Meisch
  • Ah. Thanks. I missed that in their documentation. One follow-up question: if it isn't possible to specify a custom converter, is it possible to specify a zone to default to, so that the deserialized object has [Europe/London] set? – Matt Garner Jun 03 '20 at 08:24
  • 1
    To explain the handling of the custom converter in Date fields: For these, an internal converter is created according to the annotation parameter that is attached to the property. When the entity is written or read, this property-converter is preferred to a globally registered one. This allows for having different properties in an entity with different time formats. This would not be possible if the global converter had a higher priority. As for the time zone-id, did you try adding/using `V` (see https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html) – P.J.Meisch Jun 03 '20 at 10:07
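The `V` suggestion in the comment above can be sketched in isolation (illustrative only: a pattern containing `VV` preserves the region id across a java.time round trip, though Elasticsearch itself would not accept such a value for a date field):

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class ZoneIdPatternDemo {
    // 'VV' prints and parses the full zone id, so the region survives a round trip.
    static final DateTimeFormatter WITH_ZONE_ID =
            DateTimeFormatter.ofPattern("uuuu-MM-dd'T'HH:mm:ss.SSSVV");

    public static void main(String[] args) {
        ZonedDateTime original = ZonedDateTime.of(2020, 6, 1, 9, 50, 27, 389_000_000,
                ZoneId.of("Europe/London"));
        String text = original.format(WITH_ZONE_ID);
        System.out.println(text); // 2020-06-01T09:50:27.389Europe/London
        ZonedDateTime restored = ZonedDateTime.parse(text, WITH_ZONE_ID);
        System.out.println(restored.equals(original)); // true
    }
}
```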

ElasticsearchDateConverter is a final class and fails on custom date patterns.

ElasticsearchCustomConversions only work on "non-mapped" date types.

This is a limitation of the newer versions of spring-data-elasticsearch: date fields in Elasticsearch can accept many formats, but Spring blocks this.

Solution: use only the rest client and Jackson with custom date formats:

private ObjectMapper jacksonMapper;

private ObjectMapper getJacksonObjectMapper() {
    if (jacksonMapper == null) {
        jacksonMapper = new ObjectMapper();
        jacksonMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true);
        jacksonMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        SimpleModule module = new SimpleModule();
        module.addDeserializer(LocalDateTime.class, new CustomLocalDateTimeDeserializer());
        module.addDeserializer(ZonedDateTime.class, new CustomZonedDateTimeDeserializer());
        module.addDeserializer(Date.class, new CustomDateDeserializer());
        jacksonMapper.registerModule(module);
    }
    return jacksonMapper;
}


public class CustomLocalDateTimeDeserializer extends JsonDeserializer<LocalDateTime> {

    @Override
    public LocalDateTime deserialize(JsonParser jsonparser, DeserializationContext context)
            throws IOException {
        String dateAsString = jsonparser.getText();
        try {
            return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
        } catch (DateTimeParseException e) {
            try {
                return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
            } catch (DateTimeParseException e1) {
                // a date-only value has no time part, so parse it as a LocalDate first
                return LocalDate.parse(dateAsString, DateTimeFormatter.BASIC_ISO_DATE).atStartOfDay();
            }
        }
    }
}


@Bean(name = "customConverter")
public ElasticsearchConverter elasticsearchConverter(SimpleElasticsearchMappingContext mappingContext,
        ElasticsearchCustomConversions elasticsearchCustomConversions) {
    DefaultConversionService cs = new DefaultConversionService();
    MappingElasticsearchConverter converter = new MappingElasticsearchConverter(mappingContext, cs) {
        @Override
        public <R> R read(Class<R> type, org.springframework.data.elasticsearch.core.document.Document source) {
            return getJacksonObjectMapper().convertValue(source, type);
        }
    };
    converter.setConversions(elasticsearchCustomConversions);
    return converter;
}

@Bean
public ElasticsearchRestTemplate elasticSearchTemplate(@Qualifier("customConverter") ElasticsearchConverter elasticsearchConverter) {
    return new ElasticsearchRestTemplate(client(), elasticsearchConverter);
}

Glaucio