
I've read a ton of questions and answers about this topic, but I can't solve my problem.

I initialized a Spring Boot project with Kafka and spring-data-jdbc. What I'm trying to do is:

  1. Configure a Kafka JDBC Connector in order to push record changes from a PostgreSQL DB into a Kafka topic
  2. Set up a Kafka consumer that consumes the records pushed into the topic and inserts them into another PostgreSQL DB.

Point 1 is working fine. It's point 2 that I'm having problems with.

This is how the project is organized:

com.migration
 - MigrationApplication.java
com.migration.config
 - KafkaConsumerConfig.java
com.migration.db
 - JDBCConfig.java
 - RecordRepository.java
com.migration.listener
 - MessageListener.java
com.migration.model
 - Record.java
 - AbstractRecord.java
 - PostgresRecord.java

This is the MessageListener class

@EnableJdbcRepositories("com.migration.db")
@Transactional
@Configuration
public class MessageListener {
    @Autowired
    private RecordRepository repository;

    @KafkaListener(topics={"author"}, groupId = "migrator", containerFactory = "migratorKafkaListenerContainerFactory")
    public void listenGroupMigrator(Record record) {
        repository.insert(record);
        // thrown on purpose to check whether @Transactional rolls back the insert
        throw new RuntimeException();
    }
}

I think it's pretty clear: it sets up a Kafka consumer that listens on the "author" topic and consumes each record by inserting it into the DB.

As you can see, inside the listenGroupMigrator() method the record is inserted into the DB and then a RuntimeException is thrown, because I'm checking whether @Transactional works and the insert gets rolled back.

But no: the rollback is not performed, even though the class is annotated with @Transactional.
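
For reference, my understanding is that @Transactional only takes effect when transaction management is enabled and a PlatformTransactionManager is bound to the same DataSource the repository writes through. A minimal sketch of what I assume that setup looks like (the class name TransactionConfig is mine):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement // enables the proxy that commits/rolls back @Transactional methods
public class TransactionConfig {

    // Transaction manager bound to the DataSource the repository should write through
    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}

(Spring Boot may already auto-configure an equivalent DataSourceTransactionManager, so this is just the configuration I assume has to be in effect, not something I'm sure is missing.)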

For completeness these are other classes

RecordRepository class

@Repository
public class RecordRepository {
    public RecordRepository() {}

    public void insert(Record record) {
        JDBCConfig jdbcConfig = new JDBCConfig();
        SimpleJdbcInsert messageInsert = new SimpleJdbcInsert(jdbcConfig.postgresDataSource());

        messageInsert.withTableName(record.tableName()).execute(record.content());
    }
}

JDBCConfig class

@Configuration
public class JDBCConfig {

    @Bean
    public DataSource postgresDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.postgresql.Driver");
        dataSource.setUrl("jdbc:postgresql://localhost:5432/db");
        dataSource.setUsername("postgres");
        dataSource.setPassword("root");

        return dataSource;
    }
}

KafkaConsumerConfig class

@EnableKafka
@Configuration
public class KafkaConsumerConfig {
    @Value(value = "${kafka.bootstrap-server}")
    private String bootstrapServer;

    private <T extends Record> ConsumerFactory<String, T> consumerFactory(String groupId, Class<T> clazz) {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(clazz));
    }

    private <T extends Record> ConcurrentKafkaListenerContainerFactory<String, T> kafkaListenerContainerFactory(String groupId, Class<T> clazz) {
        ConcurrentKafkaListenerContainerFactory<String, T> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory(groupId, clazz));
        return factory;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, PostgresRecord> migratorKafkaListenerContainerFactory() {
        return kafkaListenerContainerFactory("migrator", PostgresRecord.class);
    }
}

MigrationApplication class

@SpringBootApplication
public class MigrationApplication {
    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(MigrationApplication.class, args);
        MessageListener listener = context.getBean(MessageListener.class);
    }
}

How can I make the listenGroupMigrator method transactional?

  • Note: This is not "Kafka Connect" – OneCricketeer Oct 02 '20 at 15:47
  • This question really has nothing to do with Kafka per se, it is just about the JDBC transaction. Have you set `@EnableTransactionManagement` on some `@Configuration` class? You need that for the framework to apply `@Transactional`. You can run it in a debugger, put a breakpoint in the listener, and you should see a transaction interceptor in the stack (it is that interceptor that commits or rolls back the transaction). – Gary Russell Oct 02 '20 at 16:25
  • I set `@EnableTransactionManagement` on the `MessageListener` class (and also tried the `JDBCConfig` class), but no result. I went into `listenGroupMigrator()` in debug mode: the first time, the record is inserted; then the listener is invoked another 10 times with an error saying I'm trying to insert a record with the same pk. I see some debug variables referring to interceptors, but nothing about a transaction interceptor. What am I doing wrong? – Vin Oct 05 '20 at 08:05

0 Answers