
I want to implement a Kafka producer and consumer that send and receive Java serialized objects. I tried this:

Producer:

@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Send object:

@Autowired
private KafkaTemplate<String, SaleRequestFactory> kafkaTemplate;

private static String topic = "tp-sale";

private void perform(){

    Transaction transaction = new Transaction();
    transaction.setStatus(PaymentTransactionStatus.IN_PROGRESS.getText());

    Transaction insertedTransaction = transactionService.save(transaction);

    SaleRequestFactory obj = new SaleRequestFactory();
    obj.setId(100);

    ListenableFuture<SendResult<String, SaleRequestFactory>> send = kafkaTemplate.send(topic, obj);
}

Consumer:

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    private String groupId = "test";

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String>
    kafkaListenerContainerFactory() {

        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Receive Object

    private static String topic = "tp-sale";

    @KafkaListener(topics = "tp-sale")
    public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf, @Headers MessageHeaders headers) throws Exception {

        System.out.println(tf.getId());

        SaleResponseFactory resObj = new SaleResponseFactory();
        resObj.setUnique_id("123123");

        return resObj;
    }

Custom Objects:

import org.apache.kafka.common.serialization.Serializer;

@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@Builder(toBuilder = true)
public class SaleRequestFactory implements Serializable, Serializer {

    private static final long serialVersionUID = 1744050117179344127L;
    
    private int id;

    @Override
    public byte[] serialize(String s, Object o) {
        return new byte[0];
    }
}


import org.apache.kafka.common.serialization.Deserializer;

@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@Builder(toBuilder = true)
public class SaleResponseFactory implements Serializable, Deserializer {

    private static final long serialVersionUID = 1744050117179344127L;

    private String unique_id;

    @Override
    public Object deserialize(String s, byte[] bytes) {
        return null;
    }
}

When I deploy the producer, I get this error during deployment:

Caused by: org.apache.kafka.common.KafkaException: class org.engine.plugin.transactions.factory.SaleResponseFactory is not an instance of org.apache.kafka.common.serialization.Deserializer

How do I properly implement the serialize and deserialize methods in the custom objects?

Do you know how I can fix this issue?

Peter Penzov
  • 1) Show your import statements 2) What's wrong with using JSON/Avro/Protobuf? 3) You shouldn't rely on JVM serialization if you plan on sharing that topic data with clients in other languages, or even other JVM versions – OneCricketeer Nov 08 '20 at 16:46
  • And you can edit or reply to previous posts. Doesn't look like your error has changed – OneCricketeer Nov 08 '20 at 16:51
  • 1) I added the import statements for the custom Objects. 2) I prefer to use Object, not JSON due to performance reasons. 3) I will use Java into every module with one JVM version. – Peter Penzov Nov 08 '20 at 16:54
  • Please edit your previous question with that information. Also, I'm not entirely certain Java serialization is actually faster over the network for speed or size – OneCricketeer Nov 08 '20 at 16:58
  • Will do. Can you guide what I'm missing into the code or mistake that I have? – Peter Penzov Nov 08 '20 at 16:59
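
One way this could be fixed (a sketch, not tested against this exact setup): keep `SaleRequestFactory` and `SaleResponseFactory` as plain `Serializable` POJOs, and move the Kafka serialization logic into separate classes. The error occurs because the domain classes themselves are being passed as the serializer/deserializer config values while not actually implementing those interfaces correctly. The core byte round-trip can be done with standard JDK object streams; the class name `ObjectSerde` below is illustrative:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Core JDK-serialization round-trip. A Kafka Serializer/Deserializer pair
// would simply delegate to these two methods from serialize()/deserialize().
public final class ObjectSerde {

    private ObjectSerde() {
    }

    // Serialize any Serializable object to a byte array.
    public static byte[] toBytes(Serializable obj) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(obj);
            out.flush();
            return bos.toByteArray();
        } catch (IOException e) {
            throw new IllegalStateException("Serialization failed", e);
        }
    }

    // Deserialize a byte array back into an object.
    public static Object fromBytes(byte[] bytes) {
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException("Deserialization failed", e);
        }
    }
}
```

In Kafka terms, a hypothetical `SaleRequestSerializer implements Serializer<SaleRequestFactory>` would return `ObjectSerde.toBytes(data)` from its `serialize(String topic, SaleRequestFactory data)` method, and the matching `Deserializer<SaleRequestFactory>` would cast the result of `fromBytes`. The producer config would then set `VALUE_SERIALIZER_CLASS_CONFIG` to the serializer class and the consumer config `VALUE_DESERIALIZER_CLASS_CONFIG` to the deserializer class, the generic types on `ProducerFactory`/`KafkaTemplate`/`ConsumerFactory` would be changed from `String` to the payload type, and the domain classes would stop implementing `Serializer`/`Deserializer` themselves.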

0 Answers