
I'm trying to consume records from a MySQL table that has three columns (Axis, Price, lastname) with the datatypes (int, decimal(14,4), varchar(50)) respectively.

I inserted one record which has the following data (1, 5.0000, John).

The following Java code (which consumes the Avro records from a topic created by the MySQL connector on the Confluent platform) reads the decimal column Price as a java.nio.HeapByteBuffer, so I can't get at the column's value when I receive it.

Is there a way to extract or convert the received data to a Java BigDecimal or double?

Here is the MySQL connector configuration:

    {
      "name": "mysql-source",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "key.converter.schema.registry.url": "http://localhost:8081",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://localhost:8081",
        "incrementing.column.name": "Axis",
        "tasks.max": "1",
        "table.whitelist": "ticket",
        "mode": "incrementing",
        "topic.prefix": "mysql-",
        "name": "mysql-source",
        "validate.non.null": "false",
        "connection.url": "jdbc:mysql://localhost:3306/ticket?user=user&password=password"
      }
    }
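
For context: the JDBC source connector emits DECIMAL columns as Avro bytes with the decimal logical type, i.e. the unscaled integer value plus a scale recorded in the schema, which is why the consumer sees a ByteBuffer. Newer releases of kafka-connect-jdbc also offer a numeric.mapping option that maps numeric columns to Avro primitives at the source; if a lossy double is acceptable, adding something like the following line to the "config" block may avoid the ByteBuffer entirely (this is an assumption; check your connector version's documentation for whether the option and the "best_fit" value are supported):

    "numeric.mapping": "best_fit"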

Here is the consumer code:

    import java.io.IOException;
    import java.util.Arrays;
    import java.util.Properties;

    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.consumer.Consumer;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public static void main(String[] args) throws InterruptedException, IOException {

        Properties props = new Properties();

        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        String topic = "sql-ticket";
        final Consumer<String, GenericRecord> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList(topic));

        try {
            while (true) {
                ConsumerRecords<String, GenericRecord> records = consumer.poll(100);
                for (ConsumerRecord<String, GenericRecord> record : records) {
                    System.out.printf("value = %s%n", record.value().get("Price"));
                }
            }
        } finally {
            consumer.close();
        }
    }


  • Your consumer code is using the old, deprecated Consumer API... Where did you get that code from? – OneCricketeer Sep 13 '18 at 22:04
  • @cricket_007 i'm using confluent 4.0.0 and the java example in this version https://docs.confluent.io/4.0.0/schema-registry/docs/serializer-formatter.html if i use the latest version, will it solve the problem? – Mahmoud Elbably Sep 14 '18 at 05:53
  • Well, that section of the documentation is outdated... I can file an issue and work on updating it, but please see this. Specifically, Zookeeper is no longer necessary for a client to interact with Kafka https://www.confluent.io/blog/tutorial-getting-started-with-the-new-apache-kafka-0-9-consumer-client/ – OneCricketeer Sep 14 '18 at 06:46
  • Otherwise, try `avroRecord.get("Price").getFloat())` – OneCricketeer Sep 14 '18 at 06:50
  • @cricket_007 alright now i'm using confluent 5.0.0 and i updated the code from this document https://docs.confluent.io/current/schema-registry/docs/serializer-formatter.html but i'm getting the same results, also it says there is no method `getFloat()` – Mahmoud Elbably Sep 14 '18 at 09:27
  • I got that method from the heapbytebuffer javadoc – OneCricketeer Sep 14 '18 at 18:34
  • This might help https://stackoverflow.com/questions/26676733/how-to-convert-java-nio-heapbytebuffer-to-string – OneCricketeer Sep 14 '18 at 18:41
  • @cricket_007 tried that but the output is useless, some random string. is there any other way around it? this problem is very annoying and only that data type (decimal) is stopping the whole process! – Mahmoud Elbably Sep 15 '18 at 00:28
  • 1) I haven't used the JDBC source connector so I unfortunately don't know of a solution for you. There has to be some method of `HeapByteBuffer` that takes bytes into the type you want. 2) I typically use SpecificRecord avro types built with the Avro Maven plugin rather than plain GenericRecords. – OneCricketeer Sep 15 '18 at 06:36

1 Answer


Alright, I finally found the solution.

The HeapByteBuffer needs to be converted to a byte[] array. I then used BigInteger, which constructs its value from that byte array, created a BigDecimal from the BigInteger, and moved the decimal point with movePointLeft(4), which is the scale (in my case: 4). Everything worked as expected.

    // The decimal column arrives as a ByteBuffer holding the unscaled value.
    ByteBuffer buf = (ByteBuffer) record.value().get("Price");

    // Copy the remaining bytes out of the buffer.
    byte[] arr = new byte[buf.remaining()];
    buf.get(arr);

    // Interpret the bytes as the unscaled integer. Signum 1 assumes a
    // non-negative value; new BigInteger(arr) would also handle negative
    // two's-complement values.
    BigInteger bi = new BigInteger(1, arr);

    // Re-apply the scale (4, from decimal(14,4)).
    BigDecimal bd = new BigDecimal(bi).movePointLeft(4);
    System.out.println(bd);
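
If you'd rather not hardcode the scale, the Avro schema already carries it in the decimal logical type, and Avro's built-in Conversions.DecimalConversion can do the bytes-to-BigDecimal step for you. A minimal sketch (the helper name decimalField and its wrapper class are mine; it assumes the field uses Avro's decimal logical type, possibly wrapped in a nullable union):

    import java.math.BigDecimal;
    import java.nio.ByteBuffer;

    import org.apache.avro.Conversions;
    import org.apache.avro.LogicalType;
    import org.apache.avro.LogicalTypes;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;

    public class AvroDecimals {

        // Hypothetical helper: reads the scale from the Avro schema
        // instead of hardcoding movePointLeft(4).
        public static BigDecimal decimalField(GenericRecord record, String fieldName) {
            Schema fieldSchema = record.getSchema().getField(fieldName).schema();

            // The connector may declare the column as a union with null; unwrap it.
            if (fieldSchema.getType() == Schema.Type.UNION) {
                for (Schema member : fieldSchema.getTypes()) {
                    if (member.getType() != Schema.Type.NULL) {
                        fieldSchema = member;
                        break;
                    }
                }
            }

            LogicalType logical = fieldSchema.getLogicalType();
            if (!(logical instanceof LogicalTypes.Decimal)) {
                throw new IllegalArgumentException(fieldName + " is not a decimal field");
            }

            // Avro applies the schema's scale for us.
            ByteBuffer buf = (ByteBuffer) record.get(fieldName);
            return new Conversions.DecimalConversion().fromBytes(buf, fieldSchema, logical);
        }
    }

With that helper, System.out.println(decimalField(record.value(), "Price")) should print 5.0000 without knowing the scale in advance.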

Here are the results of the original snippet (left is the output, right is MySQL):

[screenshot: consumer output on the left, the MySQL table on the right]
