
I have MapReduce code that emits a 2D array using TwoDArrayWritable. When I try to emit the array, I get an exception related to initialization. I searched and found that the emitted class requires a default (no-argument) constructor, which TwoDArrayWritable does not provide. How do I provide a default constructor for TwoDArrayWritable, or is there something else I am getting wrong?
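From what I've read, Hadoop's WritableSerialization instantiates key/value classes reflectively via their no-argument constructor, which is why one is mandatory. Here is a minimal, Hadoop-free sketch of that failure mode (the class names `NoDefaultCtor` and `WithDefaultCtor` are just stand-ins for TwoDArrayWritable and a subclass, not real Hadoop types):

```java
import java.lang.reflect.Constructor;

// Stand-in for TwoDArrayWritable: its only constructor takes the element
// class, so reflective no-arg instantiation fails.
class NoDefaultCtor {
    NoDefaultCtor(Class<?> valueClass) {}
}

// Stand-in for a subclass that adds the no-arg constructor and delegates a
// fixed element class to the parent.
class WithDefaultCtor extends NoDefaultCtor {
    WithDefaultCtor() { super(Integer.class); }
}

public class ReflectionDemo {
    static boolean canInstantiate(Class<?> c) {
        try {
            // Throws NoSuchMethodException when no no-arg constructor exists,
            // just like ReflectionUtils.newInstance does inside Hadoop.
            Constructor<?> ctor = c.getDeclaredConstructor();
            ctor.setAccessible(true);
            ctor.newInstance();
            return true;
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(canInstantiate(NoDefaultCtor.class));   // false
        System.out.println(canInstantiate(WithDefaultCtor.class)); // true
    }
}
```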

Here is the mapper code:

public class JaccardMapper extends Mapper<LongWritable, Text, IntTextPair, TwoDArrayWritable> {

    Hashtable<String, String> movieInfo = new Hashtable<String, String>();
    String[] genres, actors, entities;
    String[] attributes = new String[] {"genre", "actors", "directors", "country", "year", "ratings"};
    double p,q,r,s;
    double result = 0.0;
    String[] input = null;
    Set<String> keys;

    TwoDArrayWritables array2d = new TwoDArrayWritables();
    //TwoDArrayWritable array2d = new TwoDArrayWritable(IntWritable.class);
    IntWritable[][] jaccard = new IntWritable[2][];
    //int[][] jaccard = new int[2][];


    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        p = 0;
        q = 0;
        r = 0;
        s = 0;

        /*
         * dataset format
         * 0 -> movieid
         * 1 -> title
         * 2 -> year
         * 3 -> actors
         * 4 -> directors
         * 5 -> genre
         * 6 -> country
         * 7 -> ratings
         * 8 -> cost
         * 9 -> revenue
         * */

        /*
         * input format
         * 0 -> genre
         * 1 -> actors
         * 2 -> directors
         * 3 -> country
         * 4 -> year
         * 5 -> ratings
         * */

        /*
         * (q + r) / (p + q + r) 
         * p -> number of variables positive for both objects 
         * q -> number of variables positive for the ith objects and negative for jth objects
         * r -> number of variables negative for the ith objects and positive for jth objects
         * s -> number of variables negative for both objects
         * */


        input = value.toString().toLowerCase().split(",");
        keys = movieInfo.keySet();


        //the jaccard 2d array's column length depends on the user input: the best case is 6, but the worst case depends on the sub-attribute count, e.g. more than one actor/director/genre/country.
        int columnlength = input[1].split("\\|").length + input[2].split("\\|").length + input[3].split("\\|").length + input[4].split("\\|").length + 2;
        jaccard = new IntWritable[2][columnlength];
        for (int i = 0; i < jaccard.length; i++)
        {
            for (int j = 0; j < jaccard[i].length; j++)
            {
                jaccard[i][j] = new IntWritable(0);
            }
        }

        if (input.length > 0)
        {
            //iterate through the dataset in cache
            for(String keyy : keys)
            {
                //iterate to user's input attributes
                for (int attribute = 1; attribute < attributes.length; attribute++)
                {
                    if (!input[attribute].equals("-")) 
                    {
                        entities = input[attribute].toLowerCase().split("\\|");
                        int subattributecount = 0;

                        for(String entity : entities)
                        {
                            if (movieInfo.get(keyy).toString().toLowerCase().contains(entity))
                            {
                                //if user criteria match with the data set, mark 1, 1
                                jaccard[0][attribute + subattributecount] = new IntWritable(1);
                                jaccard[1][attribute + subattributecount] = new IntWritable(1);
                            }
                            else
                            {
                                //if user criteria doesn't match with the data set, mark 1, 0
                                jaccard[0][attribute + subattributecount] = new IntWritable(1);
                                jaccard[1][attribute + subattributecount] = new IntWritable(0);
                            }
                            subattributecount += 1;
                        }
                    }
                }
                IntTextPair pair = new IntTextPair(Integer.parseInt(input[0]), movieInfo.get(keyy).toString());

                array2d.set(jaccard);
                context.write(pair, array2d);
            }


        }

    }
}
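As an aside, the (q + r) / (p + q + r) dissimilarity described in the comments inside map() can be sketched stand-alone like this (the method name `jaccardDistance` and the sample rows are illustrative, not part of my actual job):

```java
public class JaccardDemo {
    // Jaccard dissimilarity over two binary rows, like jaccard[0][] and
    // jaccard[1][] in the mapper: d = (q + r) / (p + q + r).
    static double jaccardDistance(int[] a, int[] b) {
        int p = 0, q = 0, r = 0;
        for (int i = 0; i < a.length; i++) {
            if (a[i] == 1 && b[i] == 1) p++;      // positive for both objects
            else if (a[i] == 1 && b[i] == 0) q++; // positive for i only
            else if (a[i] == 0 && b[i] == 1) r++; // positive for j only
            // both negative -> s, which the formula ignores
        }
        return (double) (q + r) / (p + q + r);
    }

    public static void main(String[] args) {
        int[] user  = {1, 1, 1, 0};
        int[] movie = {1, 0, 1, 1};
        // p = 2, q = 1, r = 1 -> (1 + 1) / (2 + 1 + 1)
        System.out.println(jaccardDistance(user, movie)); // prints 0.5
    }
}
```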

Here is the TwoDArrayWritables wrapper class:

import org.apache.hadoop.io.TwoDArrayWritable;

public class TwoDArrayWritables extends TwoDArrayWritable
{
    public TwoDArrayWritables() {
        super(TwoDArrayWritable.class);
    }

    public TwoDArrayWritables(Class valueClass) {
        super(valueClass);
    }
}
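If I understand the ArrayWritable pattern correctly, the value class handed to super() should be the element type the array actually stores (IntWritable in my mapper), not TwoDArrayWritable itself; as written, deserialization tries to reflectively construct TwoDArrayWritable elements and hits the missing no-arg constructor. My guess at the fix would look something like this (the class name `TwoDIntArrayWritable` is just illustrative):

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.TwoDArrayWritable;

public class TwoDIntArrayWritable extends TwoDArrayWritable {
    public TwoDIntArrayWritable() {
        // No-arg constructor for Hadoop's reflective instantiation; the
        // argument is the element type stored in the 2D array.
        super(IntWritable.class);
    }
}
```

But I am not sure this is the proper way to do it.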

Here is the exception:

14/12/26 16:15:32 INFO mapreduce.Job: Task Id : attempt_1419259182533_0112_r_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.io.TwoDArrayWritable.<init>()
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:66)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
        at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:146)
        at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
        at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:307)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.io.TwoDArrayWritable.<init>()
        at java.lang.Class.getConstructor0(Class.java:2892)
        at java.lang.Class.getDeclaredConstructor(Class.java:2058)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
        ... 13 more
user1584253
  • Did you even try googling that exact error message? Sorry if that seems obvious to do, but there are a TON of Google results on this exact topic by typing in that exact error message. – austin wernli Dec 26 '14 at 18:06
  • I have googled it. I found the solution to make a default constructor. But I am implementing TwoDArrayWritable, so how do I make its default constructor? – user1584253 Dec 26 '14 at 18:31
  • You could try the answer posted here: http://stackoverflow.com/questions/4386781/implementation-of-an-arraywritable-for-a-custom-hadoop-type/4390928#4390928.. I got the link to that from this answer: http://stackoverflow.com/questions/4233214/hadoop-spill-failure Hopefully those point you in the right direction. Sounds like they use a wrapper class to define the constructor – austin wernli Dec 26 '14 at 19:12
  • I have implemented the wrapper class. I have modified it in the above code. But getting the following exception: Error: java.lang.RuntimeException: java.lang.InstantiationException: org.apache.hadoop.io.TwoDArrayWritable. Kindly guide me where I am wrong. – user1584253 Dec 26 '14 at 22:33
  • No idea if this will help, but this person is also trying to do something similar heh. Sorry i keep throwing you at posts, i'm not really pro status with Hadoopishness http://stackoverflow.com/questions/24904782/customizing-twodarraywritable-in-hadoop-and-not-able-to-iterate-the-same-in-redu – austin wernli Dec 26 '14 at 22:40
  • The post you referred won't help as it is different. I need to make a default constructor for TwoDArrayWritable. But I am confused how to implement it properly. – user1584253 Dec 27 '14 at 09:11
  • For the future reader like me who gets directed here, I'd also warn you against making your WritableComparable implementation an inner class (in my case, it was an inner class of my mapper). I got the same exceptions as above. When I moved it to a class of its own, it worked fine. – Nishant Kelkar Apr 23 '15 at 01:04

0 Answers