
The producer code is below.

static PltPage pltPage;

public static void main(String[] args) throws IOException {

    Short itemtype = 1;
    Properties props = new Properties();
    props.put("metadata.broker.list", "localhost:9092");
    props.put("partitioner.class", "com.rms.com.SimplePartitioner");
    props.put("serializer.class", "com.rms.com.CustomSerializer");
    props.put("request.required.acks", "1");

    ProducerConfig config = new ProducerConfig(props);
    Producer<String, PltResultPage> producer = new Producer<String, PltResultPage>(config);

    String folder = new File(".").getAbsoluteFile().getPath();
    String parent = new File(folder).getParentFile().getParent();
    String path = parent + "/KafkaProducerSparkConsumer/src/resources/PortfolioPLT.txt";
    FileReader fr = new FileReader(path);
    BufferedReader br = new BufferedReader(fr);
    String sCurrentLine;
    List<Integer> periodIds = new ArrayList<Integer>();
    List<Integer> sampleIds = new ArrayList<Integer>();
    List<Integer> eventIds = new ArrayList<Integer>();
    List<Integer> dates = new ArrayList<Integer>();
    List<Double> losses = new ArrayList<Double>();

    while ((sCurrentLine = br.readLine()) != null) {
        String[] entries = sCurrentLine.split("~");
        if (entries[1].equalsIgnoreCase("GR")) {
            periodIds.add(Integer.parseInt(entries[2]));
            sampleIds.add(Integer.parseInt(entries[3]));
            eventIds.add(Integer.parseInt(entries[4]));
            dates.add(2040);
            losses.add(Double.parseDouble(entries[6]));
        }
    }

    pltPage = ExportUtilities.generatepltpage(periodIds, sampleIds, eventIds, dates, losses,
            periodIds.size(), 1000000000002L, Short.parseShort("1"));
    PltResultPage resultPage = new PltResultPage();
    resultPage.setAnalysisId(1);
    resultPage.setExternalID("1");
    resultPage.setItemId(1L);
    resultPage.setItemType(itemtype);
    resultPage.setOutputProperty(itemtype);
    resultPage.setResultType(itemtype);
    resultPage.setResultPage(pltPage);
    resultPage.setJobId(1L);

    KeyedMessage<String, PltResultPage> message = new KeyedMessage<String, PltResultPage>("test", resultPage);
    producer.send(message);
    producer.close();
}

The consumer code is below.

Properties props = new Properties();
props.put("zookeeper.connect", a_zookeeper);
props.put("group.id", a_groupId);
props.put("zookeeper.session.timeout.ms", "20000");
props.put("zookeeper.sync.time.ms", "3000");
props.put("auto.commit.interval.ms", "2000");
props.put("auto.offset.reset", "smallest");
props.put("serializer.class", "com.rms.com.CustomSerializer");

Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
topicCountMap.put(topic, new Integer(a_numThreads));
final StringDecoder decoder =
        new StringDecoder(new VerifiableProperties(returnProperties(zooKeeper, groupId)));
final CustomSerializer decoder2 =
        new CustomSerializer(new VerifiableProperties(returnProperties(zooKeeper, groupId)));
final Map<String, List<KafkaStream<String, PltResultPage>>> consumerMap =
        this.consumer.createMessageStreams(topicCountMap, decoder, decoder2);
final List<KafkaStream<String, PltResultPage>> streams = consumerMap.get(topic);
executor = Executors.newFixedThreadPool(a_numThreads);
int threadNumber = a_numThreads;
for (KafkaStream stream : streams) {
    executor.submit(new ExecuteConsumerClient(stream, threadNumber));
    threadNumber++;
}

System.out.println("calling ExecuteConsumerClient.run()");
        ConsumerIterator<String, PltResultPage> it = m_stream.iterator();

        while (it.hasNext()) {
            try {
                CreateJavaSparkContext();
                System.out.println("Converting to ResultPage");
                PltResultPage pltResultPage = (PltResultPage) it.next().message();
                System.out.println("Before Impl Accept");
                sparkExportPLTToFile.accept(pltResultPage.getJobId(), pltResultPage.getItemId(),
                        pltResultPage.getItemType(), pltResultPage.getOutputProperty(),
                        pltResultPage.getResultType(), pltResultPage.getResultPage(),
                        pltResultPage.getAnalysisId(), pltResultPage.getExternalID());
            } catch (Exception e) {
                System.out.println("Exception in it.Run" + e.getStackTrace().toString());
            }
            System.out.println("Executed impl for thread " + m_threadNumber);
        }
        System.out.println("Shutting down Thread: " + m_threadNumber);
    }

It's failing when I try to convert the message back to an object. I got the custom serializer code from one of the posts; it is shown below. Can anyone point out what's wrong with the implementation? I tried using fromBytes from the custom serializer and it did not help: the serializer returns a null object.

Custom Serializer

public class CustomSerializer implements Encoder<PltResultPage>, Decoder<PltResultPage> {
    public CustomSerializer(VerifiableProperties verifiableProperties) {
        /* This constructor must be present for successful compile. */
    }

    public CustomSerializer() {

    }


    @Override
    public byte[] toBytes(PltResultPage o) {
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(baos);
            oos.writeObject(o);
            oos.close();
            byte[] b = baos.toByteArray();
            return b;
        } catch (IOException e) {
            return new byte[0];
        }
    }

    @Override
    public PltResultPage fromBytes(byte[] bytes) {
        try {
            return (PltResultPage) new ObjectInputStream(new ByteArrayInputStream(bytes)).readObject();
        } catch (Exception e) {
            return null;
        }
    }
}

PltResultPage is below.

public class PltResultPage implements Serializable {
    private static final long serialVersionUID = 0L;

    private Long jobId;
    private Long itemId;
    private Short itemType;
    private Short outputProperty;
    private Short resultType;
    private LossPage resultPage;
    private Integer analysisId;
    private String externalID;

    public Long getJobId() { return this.jobId; }

    public Long getItemId() { return this.itemId; }

    public String getExternalID() { return this.externalID; }

    public Short getItemType() { return this.itemType; }

    public Short getOutputProperty() { return this.outputProperty; }

    public Short getResultType() { return this.resultType; }

    public LossPage getResultPage() { return this.resultPage; }

    public Integer getAnalysisId() { return this.analysisId; }

    public void setJobId(Long jobId) { this.jobId = jobId; }

    public void setOutputProperty(Short output) { this.outputProperty = output; }

    public void setItemId(Long itemId) { this.itemId = itemId; }

    public void setItemType(Short type) { this.itemType = type; }

    public void setResultType(Short resultType) { this.resultType = resultType; }

    public void setResultPage(LossPage page) { this.resultPage = page; }

    public void setAnalysisId(Integer id) { this.analysisId = id; }

    public void setExternalID(String externalID) { this.externalID = externalID; }
}
user3897533
  • Please post the code for PltResultPage. Also ensure that PltResultPage (1) implements Serializable and (2) has a serialVersionUID variable declared – Chris Gerken Dec 15 '15 at 12:05
  • Thanks Chris for the response. Can you please look at the code? I added PltResultPage; the custom serializer is returning null from fromBytes, and I get an index out of bounds exception. – user3897533 Dec 15 '15 at 14:31

1 Answer


Try adding

private static final long serialVersionUID = 0L;

to PltResultPage. You don't see it, but this value gets serialized along with the other fields, and on deserialization it is compared to the value declared in the class loaded in the current JVM. If the values differ, deserialization fails and you'll get a null result, even if you use the exact same source code for PltResultPage in both the producer and consumer JVMs. If you don't specify serialVersionUID for a class, the JVM makes up a value for you, and it's a safe bet that the generated value in the consumer JVM will be different from the generated value in the producer JVM.

In short, if you use the default Java serialization/deserialization in the custom serializer, you must declare serialVersionUID in the classes of the objects being serialized.
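To illustrate the point (this sketch is not part of the original answer, and `Payload`, `toBytes`, and `fromBytes` are stand-in names, not the asker's actual classes): the round trip the custom serializer performs can be checked in isolation, without Kafka, by serializing and deserializing a small `Serializable` class that declares an explicit serialVersionUID.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationRoundTrip {

    // Stand-in for PltResultPage: the explicit serialVersionUID is what makes
    // producer and consumer JVMs agree on the class version.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 0L;
        Long jobId;
        String externalID;
    }

    // Mirrors the encoder side of the custom serializer.
    static byte[] toBytes(Payload p) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(p);
        }
        return baos.toByteArray();
    }

    // Mirrors the decoder side; a serialVersionUID mismatch would throw
    // InvalidClassException here rather than silently returning null.
    static Payload fromBytes(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (Payload) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Payload in = new Payload();
        in.jobId = 1L;
        in.externalID = "1";
        Payload out = fromBytes(toBytes(in));
        System.out.println(out.jobId + " " + out.externalID); // prints: 1 1
    }
}
```

Note that this sketch rethrows exceptions instead of swallowing them; the posted serializer's `catch (Exception e) { return null; }` in `fromBytes` hides exactly the InvalidClassException this answer is describing.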

Chris Gerken
  • Chris, I still get an index out of bounds. Can you please look at the above code in its entirety? I am passing <String, PltResultPage> in the producer, and in the consumer I added the custom serializer; the KafkaStream is typed <String, PltResultPage> in the consumer. Can you please check the properties at both consumer and producer and confirm everything is specified correctly? – user3897533 Dec 15 '15 at 19:25
  • Can you confirm that you're not reading any old messages that were written before you added serialVersionUID to your code? I've had to change the topic name in cases like this just to be sure. – Chris Gerken Dec 15 '15 at 19:53
  • I tried, changing the topic name, and the group name, it did not help. Can you please look at the complete code above and point out any thing obvious i am missing.. i am out of ideas right now! – user3897533 Dec 15 '15 at 22:05
  • The only thing I see is that you assume a StringDecoder in the consumer but you didn't specify the key serializer in the producer config. Maybe that's your mismatch. Other than that, you'll have to debug the code to see what exception is being thrown deep in the bowels of the deserialization. – Chris Gerken Dec 15 '15 at 22:44
  • props.put("serializer.class", "com.rms.com.CustomSerializer"); props.put("key.serializer.class", "kafka.serializer.StringEncoder"); is what I have. It did not work. Did you look at the code by any chance? – user3897533 Dec 15 '15 at 23:57
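Picking up the key-serializer point from the comments above (an editor's sketch, not the asker's verbatim code; the class name `ProducerPropsSketch` is made up for illustration): with the old 0.8-era producer API, the value and key serializers are configured separately, and the producer's key encoder has to pair with the consumer's StringDecoder.

```java
import java.util.Properties;

public class ProducerPropsSketch {
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("metadata.broker.list", "localhost:9092");
        // values are PltResultPage objects, handled by the custom serializer
        props.put("serializer.class", "com.rms.com.CustomSerializer");
        // keys are plain Strings; this must pair with StringDecoder on the consumer side
        props.put("key.serializer.class", "kafka.serializer.StringEncoder");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("key.serializer.class"));
    }
}
```

If key and value codecs line up on both sides and the topic is fresh (no messages written before serialVersionUID was added), the remaining place to look is inside fromBytes, by letting the exception propagate instead of returning null.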