Generally: I'm reading serialized objects (as JSON) from a Kafka Stream and trying to save them to Redis using a Spring Data repository.
After two calls to repository.save() (the objects are never actually saved to Redis) I get a StackOverflowError:
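The consuming side looks roughly like this (a minimal sketch, not my actual code: the topic name, the JSON serde for Student and the repository wiring are placeholders):

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;

// Sketch of the "processOffers" topology: consume Student records and
// persist each one through the Spring Data repository.
public class ProcessOffers {

    private final StudentRepository repository;
    private final Serde<Student> studentSerde; // some JSON serde, placeholder

    public ProcessOffers(StudentRepository repository, Serde<Student> studentSerde) {
        this.repository = repository;
        this.studentSerde = studentSerde;
    }

    public Topology topology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("students-topic", Consumed.with(Serdes.Long(), studentSerde))
                .foreach((id, student) -> repository.save(student)); // blows up here
        return builder.build();
    }
}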
Exception in thread "processOffers-applicationId-1c24ef63-baae-47b9-beb7-5e6517736bc4-StreamThread-1" java.lang.StackOverflowError
at org.springframework.data.util.Lazy.get(Lazy.java:94)
at org.springframework.data.mapping.model.AnnotationBasedPersistentProperty.usePropertyAccess(AnnotationBasedPersistentProperty.java:277)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:134)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:115)
at org.springframework.data.redis.core.convert.MappingRedisConverter.lambda$writeInternal$2(MappingRedisConverter.java:601)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:353)
at org.springframework.data.redis.core.convert.MappingRedisConverter.writeInternal(MappingRedisConverter.java:597)
at org.springframework.data.redis.core.convert.MappingRedisConverter.lambda$writeInternal$2(MappingRedisConverter.java:639)
The serialized POJO looks like this:
import java.util.HashMap;
import java.util.Map;
import org.springframework.data.annotation.Id;
import org.springframework.data.redis.core.RedisHash;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.With;

@Data
@With
@NoArgsConstructor
@AllArgsConstructor
@RedisHash("students")
public class Student {

    @Id
    @JsonProperty("student_id")
    private long id;

    @JsonProperty("entities")
    private Map<String, Object> entities = new HashMap<>();
}
The entities map contains 100+ entries, including nested maps (objects).
The interesting part: if I make the map empty, everything works fine and the data is instantly saved to Redis.
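A minimal snippet showing both behaviors (the ids, keys and values here are made up):

import java.util.HashMap;
import java.util.Map;

// Hypothetical repro: a map with nested maps triggers the error, an empty map does not.
void reproduce(StudentRepository repository) {
    Map<String, Object> nested = new HashMap<>();
    nested.put("grade", 5);
    Map<String, Object> entities = new HashMap<>();
    entities.put("math", nested);

    repository.save(new Student(1L, entities));        // StackOverflowError
    repository.save(new Student(2L, new HashMap<>())); // saved instantly
}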
The corresponding repository for the POJO:
@Repository
public interface StudentRepository extends CrudRepository<Student, Long> {
}
Also, I've defined custom converters (registered via RedisCustomConversions) for the Long id field:
import java.nio.ByteBuffer;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.stereotype.Component;

@Component
@ReadingConverter
public class BytesToLongConverter implements Converter<byte[], Long> {

    @Override
    public Long convert(final byte[] source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.put(source);
        buffer.flip();
        return buffer.getLong();
    }
}

import java.nio.ByteBuffer;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.stereotype.Component;

@Component
@WritingConverter
public class LongToBytesConverter implements Converter<Long, byte[]> {

    @Override
    public byte[] convert(final Long source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.putLong(source);
        return buffer.array();
    }
}
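As far as I can tell, the converters round-trip correctly in isolation:

// Sanity check, not application code: write a long and read it back.
byte[] bytes = new LongToBytesConverter().convert(42L);
Long back = new BytesToLongConverter().convert(bytes); // back == 42L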
The Redis configuration class looks like this:
import java.util.Arrays;
import org.springframework.boot.autoconfigure.data.redis.RedisProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.RedisStandaloneConfiguration;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.core.convert.RedisCustomConversions;
import org.springframework.data.redis.repository.configuration.EnableRedisRepositories;
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer;

@Configuration
@EnableRedisRepositories
public class RedisConfiguration {

    @Bean
    @Primary
    public RedisProperties redisProperties() {
        return new RedisProperties();
    }

    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        var config = new RedisStandaloneConfiguration();
        var props = redisProperties();
        config.setHostName(props.getHost());
        config.setPort(props.getPort());
        return new JedisConnectionFactory(config);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        var template = new RedisTemplate<String, Object>();
        template.setConnectionFactory(redisConnectionFactory());
        template.setDefaultSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }

    @Bean
    public RedisCustomConversions redisCustomConversions(LongToBytesConverter longToBytes,
                                                         BytesToLongConverter bytesToLong) {
        return new RedisCustomConversions(Arrays.asList(longToBytes, bytesToLong));
    }
}
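For comparison, going through the redisTemplate bean directly (which serializes with GenericJackson2JsonRedisSerializer and, as far as I understand, doesn't involve MappingRedisConverter) would look like this; the key here is made up:

// Illustration only: the template writes the object as one JSON blob,
// so the nested map is handled by Jackson rather than the mapping converter.
redisTemplate.opsForValue().set("students:1", student);
Student fromRedis = (Student) redisTemplate.opsForValue().get("students:1");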
UPD: I've found this issue in the Spring Data Redis Jira, but its resolution is set to "Fixed", which seems strange to me.