I am reading log messages from Kafka and storing them into Elasticsearch from my reactive Spring application, and I am getting the following 2 errors:
1. reactor.retry.RetryExhaustedException:
org.springframework.web.client.HttpServerErrorException:
503 POST request to /hotels-transactional-logs-local-2020.11.17/_doc
returned error code 503.
2. reactor.retry.RetryExhaustedException:
io.netty.handler.timeout.ReadTimeoutException
My index settings:
{
  "logs-dev-2020.11.17": {
    "settings": {
      "index": {
        "highlight": {
          "max_analyzed_offset": "5000000"
        },
        "number_of_shards": "3",
        "provided_name": "logs-dev-2020.11.17",
        "creation_date": "1604558592095",
        "number_of_replicas": "2",
        "uuid": "wjIOSfZOSLyBFTt1cT-whQ",
        "version": {
          "created": "7020199"
        }
      }
    }
  }
}
Save method in Spring WebClient:
public Mono<Log> saveDataIntoIndex(Log esLog, String index) {
    return reactiveElasticsearchOperations.save(esLog, index, "_doc")
            .publishOn(Schedulers.newElastic("es", 5))
            .retryWhen(
                    Retry.anyOf(Exception.class)
                            .randomBackoff(Duration.ofSeconds(5), Duration.ofMinutes(1))
                            .retryMax(2)
            );
}
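In case the retry setup matters: I am using reactor-extra's `reactor.retry.Retry`, which is deprecated in recent Reactor versions. My understanding is that the core `reactor.util.retry.Retry` API is roughly equivalent; here is a self-contained sketch of the retry behaviour I think I have (the `Mono.defer` stands in for the actual Elasticsearch save, which here fails twice like a 503 before succeeding):

```java
import java.time.Duration;
import java.util.concurrent.atomic.AtomicInteger;
import reactor.core.publisher.Mono;
import reactor.util.retry.Retry;

public class RetrySketch {
    public static void main(String[] args) {
        AtomicInteger attempts = new AtomicInteger();

        // Simulated save: fails twice (like a 503), then succeeds.
        Mono<String> save = Mono.defer(() ->
                attempts.incrementAndGet() < 3
                        ? Mono.error(new RuntimeException("503"))
                        : Mono.just("saved"));

        // Up to 2 retries with jittered exponential backoff,
        // capped at 50ms here (minutes in the real config).
        String result = save
                .retryWhen(Retry.backoff(2, Duration.ofMillis(10))
                        .maxBackoff(Duration.ofMillis(50)))
                .block();

        System.out.println(result + " after " + attempts.get() + " attempts");
    }
}
```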
ReactiveElasticsearchOperations creation code:
@Bean
public ReactiveElasticsearchClient reactiveElasticsearchClient() {
    ClientConfiguration clientConfiguration = ClientConfiguration.builder()
            .connectedTo(elasticSearchEndpoints.split(","))
            .withWebClientConfigurer(webClient -> {
                ExchangeStrategies exchangeStrategies = ExchangeStrategies.builder()
                        .codecs(configurer -> configurer.defaultCodecs().maxInMemorySize(-1))
                        .build();
                return webClient.mutate().exchangeStrategies(exchangeStrategies).build();
            })
            .build();
    return ReactiveRestClients.create(clientConfiguration);
}

@Bean
public ElasticsearchConverter elasticsearchConverter() {
    return new MappingElasticsearchConverter(elasticsearchMappingContext());
}

@Bean
public SimpleElasticsearchMappingContext elasticsearchMappingContext() {
    return new SimpleElasticsearchMappingContext();
}

@Bean
public ReactiveElasticsearchOperations reactiveElasticsearchOperations() {
    return new ReactiveElasticsearchTemplate(reactiveElasticsearchClient(), elasticsearchConverter());
}
Other info:
- Elasticsearch version 7.2.1
- Cluster health is good and there are 3 nodes in the cluster
- Indices are created on a daily basis, with 3 shards per index
This occurs while running a load test.
How do we control this?
Do I need to slow down the reads and writes somewhere? If so, how can I do that?
I have tried increasing the read timeout, but the issue still persists:
withSocketTimeout(Duration.ofSeconds(100))
@Bean
public ReactiveElasticsearchClient reactiveElasticsearchClient() {
    ClientConfiguration clientConfiguration = ClientConfiguration.builder()
            .connectedTo(elasticSearchEndpoints.split(","))
            // the read timeout: the max amount of time the client should wait while
            // receiving no data from the server and the response is incomplete
            .withSocketTimeout(Duration.ofSeconds(100))
            .withWebClientConfigurer(webClient -> {
                ExchangeStrategies exchangeStrategies = ExchangeStrategies.builder()
                        .codecs(configurer -> configurer.defaultCodecs().maxInMemorySize(-1))
                        .build();
                return webClient.mutate().exchangeStrategies(exchangeStrategies).build();
            })
            .build();
    return ReactiveRestClients.create(clientConfiguration);
}
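One idea I have is to bound how many index requests are in flight at once, instead of letting the Kafka consumer push unbounded writes at Elasticsearch. Here is a self-contained sketch of what I mean (`Flux.range` stands in for my Kafka stream and the `Function` stands in for `saveDataIntoIndex`; `flatMap`'s concurrency argument caps the parallel saves) — is this the right direction?

```java
import java.time.Duration;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class ConcurrencyCapSketch {
    public static void main(String[] args) {
        AtomicInteger inFlight = new AtomicInteger();
        AtomicInteger maxSeen = new AtomicInteger();

        // Stand-in for saveDataIntoIndex: records how many saves run at once.
        Function<Integer, Mono<Integer>> save = n ->
                Mono.fromCallable(() -> {
                    maxSeen.accumulateAndGet(inFlight.incrementAndGet(), Math::max);
                    return n;
                })
                .delayElement(Duration.ofMillis(20))       // simulate ES latency
                .doFinally(sig -> inFlight.decrementAndGet());

        Flux.range(1, 50)           // stand-in for the Kafka message stream
                .flatMap(save, 5)   // cap: at most 5 saves in flight at a time
                .blockLast();

        System.out.println("max saves in flight: " + maxSeen.get());
    }
}
```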
I have gone through many sites but found no solution. Kindly help.