Personal background: I read on javacodegeeks: "... SimpleAsyncTaskExecutor is ok for toy projects but for anything larger than that it’s a bit risky since it does not limit concurrent threads and does not reuse threads. So to be safe, we will also add a task executor bean..." and, on Baeldung, a very simple example of how to add our own task executor. But I cannot find any guidance explaining the consequences, or worthwhile cases in which to apply one.
Personal goal: I am working hard to provide a corporate architecture for publishing our microservices' logs to Kafka topics. The statement that it is "risky" because it neither limits concurrent threads nor reuses them seems reasonable, especially in my case, which is based on logs.
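To illustrate what that warning means, as I understand it: SimpleAsyncTaskExecutor starts a brand-new thread for every task and never pools them, so under a burst of log events the thread count is unbounded. A minimal standalone sketch (not part of my project; the class name is mine):

import org.springframework.core.task.SimpleAsyncTaskExecutor;

public class UnboundedThreadsDemo {
    public static void main(String[] args) {
        SimpleAsyncTaskExecutor executor = new SimpleAsyncTaskExecutor("log-");
        // each execute() call creates a fresh Thread; nothing is reused or capped,
        // so 10_000 log events means 10_000 short-lived threads
        for (int i = 0; i < 10_000; i++) {
            final int n = i;
            executor.execute(() -> System.out.println(
                    Thread.currentThread().getName() + " handled log event " + n));
        }
    }
}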
I am running the code below successfully on my local desktop, but I am wondering whether I am providing a custom task executor properly.
My question: given that I am already using KafkaTemplate (i.e., synchronized, singleton, and thread-safe by default, at least for producing/sending messages, as far as I understand it), does the configuration below really go in the right direction to reuse threads and to avoid the accidental thread creation that SimpleAsyncTaskExecutor would allow?
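For context, my understanding is that the @Async support only uses a custom executor when it finds a unique TaskExecutor bean or one named exactly taskExecutor; outside Spring Boot it otherwise falls back to SimpleAsyncTaskExecutor. To rule out any ambiguity, I believe the executor can be pinned by bean name (a sketch; the service name is mine):

import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class AsyncPinningExample {

    // "taskExecutor" matches the @Bean method name in KafkaProducerConfig below;
    // naming it explicitly means @Async cannot silently pick another executor
    @Async("taskExecutor")
    public void doWork() {
        System.out.println("running on " + Thread.currentThread().getName());
    }
}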
Producer config
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.Executor;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@EnableAsync
@Configuration
public class KafkaProducerConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaProducerConfig.class);

    @Value("${kafka.brokers}")
    private String servers;

    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2);
        executor.setMaxPoolSize(2);
        // beyond 2 busy threads, tasks wait here; a full queue rejects submissions
        executor.setQueueCapacity(500);
        executor.setThreadNamePrefix("KafkaMsgExecutor-");
        executor.initialize();
        return executor;
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // values are plain Strings here, so the value side also needs StringSerializer
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }
}
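One assumption I had to make: as far as I understand, Spring Boot auto-configures its own KafkaTemplate from spring.kafka.* properties and would ignore this producerConfigs map, so for these properties to actually take effect I believe the template must be wired from them explicitly. A sketch of the extra beans (they would live inside KafkaProducerConfig; the bean names are mine):

// extra beans inside KafkaProducerConfig
// (imports needed: org.springframework.kafka.core.DefaultKafkaProducerFactory,
//  KafkaTemplate, ProducerFactory): build the template from producerConfigs()
// so the properties above are the ones the producer really uses
@Bean
public ProducerFactory<String, String> producerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}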
Producer
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;

@Service
public class Producer {

    private static final Logger LOGGER = LoggerFactory.getLogger(Producer.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Async
    public void send(String topic, String message) {
        ListenableFuture<SendResult<String, String>> future = kafkaTemplate.send(topic, message);
        future.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
            @Override
            public void onSuccess(final SendResult<String, String> result) {
                LOGGER.info("sent message= " + message + " with offset= " + result.getRecordMetadata().offset());
            }

            @Override
            public void onFailure(final Throwable throwable) {
                LOGGER.error("unable to send message= " + message, throwable);
            }
        });
    }
}
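A side note on my own reasoning: kafkaTemplate.send() is itself already asynchronous (the KafkaProducer batches and transmits records on its own internal I/O thread), so the @Async layer mainly moves the usually quick hand-off off the caller's thread. If that layer turned out to be unnecessary, my understanding is the method could shrink to something like this (sketch; the method name is mine):

// without @Async: send() only enqueues the record and returns;
// the producer's internal sender thread does the network I/O
public void sendDirect(String topic, String message) {
    kafkaTemplate.send(topic, message).addCallback(
            result -> LOGGER.info("sent message= " + message
                    + " with offset= " + result.getRecordMetadata().offset()),
            throwable -> LOGGER.error("unable to send message= " + message, throwable));
}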
for demo purposes:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class KafkaDemoApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(KafkaDemoApplication.class, args);
    }

    @Autowired
    private Producer p;

    @Override
    public void run(String... strings) throws Exception {
        p.send("test", "any demo message");
    }
}