
I have a Kafka producer which sends messages to Kafka, and I log each message in the database in both onSuccess and onFailure with the help of a stored procedure. As shown in the code below, I am sending asynchronously.

  1. Should I mark my callStoredProcedure method in the repository as synchronized to avoid deadlocks? I believe synchronized is not needed, as the callbacks will be executed sequentially on a single thread.

  2. The KafkaProducer javadoc (link below) says:

    https://kafka.apache.org/10/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html

Note that callbacks will generally execute in the I/O thread of the producer and so should be reasonably fast or they will delay the sending of messages from other threads. If you want to execute blocking or computationally expensive callbacks it is recommended to use your own Executor in the callback body to parallelize processing.

Should I execute the callbacks on another thread? And could you share a code snippet showing how to execute the callback on another thread, e.g. parallelise the callbacks across 3 threads?

My code snippet

@Autowired
private Myrepository myrepository;

public void sendMessageToKafka(List<String> messages) {

    for (String s : messages) {

        // send each message individually (the original sent the whole list)
        ListenableFuture<SendResult<String, String>> future = kafkaTemplate.send(topicName, s);

        future.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {

            @Override
            public void onSuccess(SendResult<String, String> result) {
                System.out.println("Message sent " + result.getRecordMetadata().timestamp());
                myrepository.callStoredProcedure(result, "SUCCESS");
            }

            @Override
            public void onFailure(Throwable ex) {
                // 'result' is not in scope here; log the failed message itself
                // (assumes an overload of callStoredProcedure that takes the message)
                System.out.println("Sending failed for " + s + ": " + ex.getMessage());
                myrepository.callStoredProcedure(s, "FAILED");
            }
        });
    }
}
Pale Blue Dot

1 Answer


private final ExecutorService exec = Executors.newSingleThreadExecutor();


...

this.exec.submit(() -> myrepository.callStoredProcedure(result,"SUCCESS"));

The tasks will still be run on a single thread (but not the Kafka IO thread).
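For the "3 threads" case from the question, the same idea works with `Executors.newFixedThreadPool(3)`. A minimal, JDK-only sketch (the `CallbackPool` class and `runTasks` helper are hypothetical stand-ins; in the real code each `onSuccess`/`onFailure` body would submit its `myrepository.callStoredProcedure(...)` call to the pool instead of incrementing a counter):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CallbackPool {

    // Fixed pool of three threads, so callback work runs off the
    // producer's IO thread and up to three tasks run in parallel.
    private static final ExecutorService exec = Executors.newFixedThreadPool(3);

    // Submits n tasks to the pool and waits for them to finish.
    // Each task stands in for a myrepository.callStoredProcedure(...) call.
    static int runTasks(int n) throws InterruptedException {
        AtomicInteger processed = new AtomicInteger();
        for (int i = 0; i < n; i++) {
            exec.submit(processed::incrementAndGet);
        }
        exec.shutdown();
        exec.awaitTermination(5, TimeUnit.SECONDS);
        return processed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runTasks(10)); // prints 10
    }
}
```

Note that ordering is no longer guaranteed across the three threads; if the stored-procedure calls for a given message must stay in order, the single-thread executor above is the safer choice.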

If it can't keep up with your publishing rate, you might need to use a different executor such as a cached thread pool executor or Spring's ThreadPoolTaskExecutor.

Gary Russell
  • Thanks for the reply. I need to publish around 20,000 messages: I would read the 20k messages, say from a REST endpoint, publish each message to Kafka, and call the stored procedure (which does simple inserts) for each message. So would you recommend using ExecutorService or ThreadPoolTaskExecutor? – Pale Blue Dot Aug 28 '20 at 21:32
  • 1
    You'll have to experiment; it depends on your DB performance. You might need to batch up the results and submit them to the DB in batches to improve performance (at the risk of losing data in the event of a server crash). – Gary Russell Aug 28 '20 at 21:36
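The batching idea from the last comment could be sketched like this, using only JDK classes (the `ResultBuffer` class and its method names are hypothetical; a background task would call `drainBatch()` periodically and run one batched insert per drained list):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ResultBuffer {

    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    private final int batchSize;

    public ResultBuffer(int batchSize) {
        this.batchSize = batchSize;
    }

    // Called from the producer callback: cheap and non-blocking,
    // so the IO thread is not delayed.
    public void add(String messageAndStatus) {
        queue.add(messageAndStatus);
    }

    // Called from a background thread: drains up to batchSize entries,
    // which would then go to the DB in a single batched insert.
    public List<String> drainBatch() {
        List<String> batch = new ArrayList<>(batchSize);
        queue.drainTo(batch, batchSize);
        return batch;
    }
}
```

As the answer notes, anything buffered but not yet flushed is lost if the server crashes, so the batch size is a trade-off between throughput and how much you can afford to lose.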