
My use case is to push Avro data from Kafka to HDFS, and Camus seems to be the right tool for it; however, I am not able to make it work. I am new to Camus and am trying to get camus-example (https://github.com/linkedin/camus) working, but I am still facing issues.

Code snippet for DummyLogKafkaProducerClient:

package com.linkedin.camus.example.schemaregistry;

import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import java.util.Random;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

import com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageEncoder;
import com.linkedin.camus.example.records.DummyLog;

public class DummyLogKafkaProducerClient {


    public static void main(String[] args) {

        Properties props = new Properties();

        props.put("metadata.broker.list", "localhost:6667");
        // props.put("serializer.class", "kafka.serializer.StringEncoder");
        // props.put("partitioner.class", "example.producer.SimplePartitioner");
        //props.put("request.required.acks", "1");

        ProducerConfig config = new ProducerConfig(props);

        Producer<String, byte[]> producer = new Producer<String, byte[]>(config);

        KafkaAvroMessageEncoder encoder = get_DUMMY_LOG_Encoder();

        for (int i = 0; i < 500; i++) {
            KeyedMessage<String, byte[]> data = new KeyedMessage<String, byte[]>("DUMMY_LOG", encoder.toBytes(getDummyLog()));
            producer.send(data);

        }
    }

    public static DummyLog getDummyLog() {
        Random random = new Random();
        DummyLog dummyLog = DummyLog.newBuilder().build();
        dummyLog.setId(random.nextLong());
        dummyLog.setLogTime(new Date().getTime());
        Map<CharSequence, CharSequence> machoStuff = new HashMap<CharSequence, CharSequence>();
        machoStuff.put("macho1", "abcd");
        machoStuff.put("macho2", "xyz");
        dummyLog.setMuchoStuff(machoStuff);
        return dummyLog;
    }

    public static KafkaAvroMessageEncoder get_DUMMY_LOG_Encoder() {
        KafkaAvroMessageEncoder encoder = new KafkaAvroMessageEncoder("DUMMY_LOG", null);
        Properties props = new Properties();
        props.put(KafkaAvroMessageEncoder.KAFKA_MESSAGE_CODER_SCHEMA_REGISTRY_CLASS, "com.linkedin.camus.example.schemaregistry.DummySchemaRegistry");
        encoder.init(props, "DUMMY_LOG");
        return encoder;

    }
}

I also added a default no-arg constructor to DummySchemaRegistry because it was throwing an InstantiationException (the sketch after the class below shows why).

package com.linkedin.camus.example.schemaregistry;

import org.apache.avro.Schema;
import org.apache.hadoop.conf.Configuration;

import com.linkedin.camus.example.records.DummyLog;
import com.linkedin.camus.example.records.DummyLog2;
import com.linkedin.camus.schemaregistry.MemorySchemaRegistry;

/**
 * This is a little dummy registry that just uses a memory-backed schema registry to store two dummy Avro schemas. You
 * can use this with camus.properties
 */
public class DummySchemaRegistry extends MemorySchemaRegistry<Schema> {
    public DummySchemaRegistry(Configuration conf) {
        super();
        super.register("DUMMY_LOG", DummyLog.newBuilder().build().getSchema());
        super.register("DUMMY_LOG_2", DummyLog2.newBuilder().build()
                .getSchema());
    }
    public DummySchemaRegistry() {
        super();
        super.register("DUMMY_LOG", DummyLog.newBuilder().build().getSchema());
        super.register("DUMMY_LOG_2", DummyLog2.newBuilder().build().getSchema());
    }
}
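For context, the reason the no-arg constructor is needed shows up in the trace below: KafkaAvroMessageEncoder.init instantiates the registry class reflectively. My rough understanding (a sketch only, not the actual Camus source; props here is the Properties passed to encoder.init) is something like:

// Sketch only (assumption based on the Class.newInstance() frame in the trace below):
// the registry class named in the properties is instantiated reflectively, which
// requires a public no-arg constructor.
String registryClassName = props.getProperty(KafkaAvroMessageEncoder.KAFKA_MESSAGE_CODER_SCHEMA_REGISTRY_CLASS);
Object registry = Class.forName(registryClassName).newInstance();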

Below is the exception trace I get after running the program:

Exception in thread "main" com.linkedin.camus.coders.MessageEncoderException: org.apache.avro.AvroRuntimeException: org.apache.avro.AvroRuntimeException: Field id type:LONG pos:0 not set and has no default value
    at com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageEncoder.init(KafkaAvroMessageEncoder.java:55)
    at com.linkedin.camus.example.schemaregistry.DummyLogKafkaProducerClient.get_DUMMY_LOG_Encoder(DummyLogKafkaProducerClient.java:57)
    at com.linkedin.camus.example.schemaregistry.DummyLogKafkaProducerClient.main(DummyLogKafkaProducerClient.java:32)
Caused by: org.apache.avro.AvroRuntimeException: org.apache.avro.AvroRuntimeException: Field id type:LONG pos:0 not set and has no default value
    at com.linkedin.camus.example.records.DummyLog$Builder.build(DummyLog.java:214)
    at com.linkedin.camus.example.schemaregistry.DummySchemaRegistry.<init>(DummySchemaRegistry.java:16)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at java.lang.Class.newInstance(Class.java:438)
    at com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageEncoder.init(KafkaAvroMessageEncoder.java:52)
    ... 2 more
Caused by: org.apache.avro.AvroRuntimeException: Field id type:LONG pos:0 not set and has no default value
    at org.apache.avro.data.RecordBuilderBase.defaultValue(RecordBuilderBase.java:151)
    at com.linkedin.camus.example.records.DummyLog$Builder.build(DummyLog.java:209)
    ... 9 more

Sandy
4 Answers


I suppose Camus expects the Avro schema to have default values. I changed my dummyLog.avsc to the following and recompiled:

{ "namespace": "com.linkedin.camus.example.records", "type": "record", "name": "DummyLog", "doc": "Logs for not so important stuff.", "fields": [ { "name": "id", "type": "int", "default": 0 }, { "name": "logTime", "type": "int", "default": 0 } ] }

Let me know if it works for you.

Thanks, Ambarish


You can make any string or long field nullable by declaring it as a union with null, as follows:

  {"type":"record","name":"CounterData","namespace":"org.avro.usage.tutorial","fields":[{"name":"word","type":["string","null"]},{"name":"count","type":["long","null"]}]}
akshat thakar

Camus doesn't assume the schema will have default values. I used Camus recently and ran into the same issue. The way the schema registry is used in the default example is actually not correct. I have made some modifications to the Camus code; you can check out https://github.com/chandanbansal/camus, which has minor changes to make it work. They also don't have a decoder for Avro records, so I have written that as well.

Garry

I was getting this issue because I was initializing the registry like so:

super.register("DUMMY_LOG_2", LogEvent.newBuilder().build().getSchema());

When I changed it to:

super.register("logEventAvro", LogEvent.SCHEMA$);

That got me past the exception.
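Applied to the DummySchemaRegistry from the question, that change would look roughly like this (a sketch, assuming the generated DummyLog and DummyLog2 classes expose the usual Avro SCHEMA$ constant):

// Sketch: register the generated SCHEMA$ constants directly, so no record instance
// (and therefore no default value) is needed just to obtain the schema.
public DummySchemaRegistry() {
    super();
    super.register("DUMMY_LOG", DummyLog.SCHEMA$);
    super.register("DUMMY_LOG_2", DummyLog2.SCHEMA$);
}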

I also used Garry's com.linkedin.camus.etl.kafka.coders.AvroMessageDecoder.

I also found this blog (Alvin Jin's Notebook) very useful. It pinpoints every issue you could run into with the Camus example and solves it!

hba