
I am trying to create a Hadoop job that is chained with a few other jobs, so that it looks like Map1 -> Reduce -> Map2 -> Reduce. All of my classes are implemented in the same file. I am getting the following error on my first job.

java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 5 more
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodException: work.graph.WorkGraph$Map1.<init>()
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
... 10 more
Caused by: java.lang.NoSuchMethodException: work.graph.WorkGraph$Map1.<init>()
at java.lang.Class.getConstructor0(Class.java:2706)
at java.lang.Class.getDeclaredConstructor(Class.java:1985)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:109)

I looked at Hadoop: No Such Method Exception already, but that didn't fix my problem. Does anyone know of other reasons this error comes up?

EDIT: My code looks like the following:

public class WorkGraph {
    public static HashMap<String, String> dictionary = new HashMap<String, String>();

    public static class Map1 extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable key, Text value,
                OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
        }
    }

    public static class Map2 extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {
    }

    public static void main(String[] args) throws Exception {
        JobConf job1 = new JobConf(WorkGraph.class);
        job1.setJobName("WorkGraph1");

        job1.setInputFormat(TextInputFormat.class);
        job1.setOutputFormat(TextOutputFormat.class);

        job1.setOutputKeyClass(Text.class);
        job1.setOutputValueClass(Text.class);

        job1.setMapperClass(Map1.class);

        FileInputFormat.addInputPath(job1, new Path(args[0]));
        FileOutputFormat.setOutputPath(job1, new Path(args[1] + "/map1"));

        JobClient.runJob(job1);
    }
}
Annika Peterson

1 Answer

My guess without seeing your source code is that you have your Map1 class defined as a non-static inner class of WorkGraph. Hadoop needs to be able to create an instance of your map/reduce classes using reflection, and this requires a default (no-argument) constructor for your class.

If your code looks like the following block, then your Map1 class is actually an inner class of the parent WorkGraph class, and its constructor requires a reference to the parent class to be passed in as an argument at construction (the compiler hides all this from you):

public class WorkGraph {
  public class Map1 extends Mapper {

  }
}

This should actually read:

public class WorkGraph {
  public static class Map1 extends Mapper {

  }
}

You may find it useful to run the javap utility on your Map1 class file, as this should show you what the compiler has generated.
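You can also see the difference with plain JDK reflection, independent of Hadoop. The sketch below (class and method names are invented for illustration) shows that a non-static inner class has no no-arg constructor — its only compiler-generated constructor takes the enclosing instance — which is exactly what triggers the `NoSuchMethodException` in the question:

```java
// Demo: why reflective instantiation (as Hadoop does it) fails on
// non-static inner classes but works on static nested classes.
public class CtorDemo {
    static class StaticNested { }   // gets an implicit no-arg constructor
    class Inner { }                 // implicit constructor takes a CtorDemo

    public static void main(String[] args) throws Exception {
        // Works: a static nested class really has a no-arg constructor.
        StaticNested ok = StaticNested.class.getDeclaredConstructor().newInstance();
        System.out.println("created " + ok.getClass().getSimpleName());

        // Fails the same way Hadoop does: there is no Inner.<init>().
        try {
            Inner.class.getDeclaredConstructor();
        } catch (NoSuchMethodException e) {
            System.out.println("no no-arg constructor: " + e.getMessage());
        }

        // The compiler-generated constructor takes the outer instance instead.
        System.out.println("Inner ctor params: "
                + Inner.class.getDeclaredConstructors()[0].getParameterCount());
    }
}
```

Running javap on the generated `CtorDemo$Inner.class` would show the same thing: a single constructor whose parameter is the outer `CtorDemo`.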

Chris White
  • Please post your code (link to a github gist, pastebin etc) if this isn't your problem - it's really difficult to help you if you're not willing to at least help us to help you – Chris White Feb 17 '13 at 14:15
  • Sorry for not posting the code earlier. This was once given as a homework question (this isn't homework for me), so I don't think I should post my answer to it. I did edit above to show you how I'm creating my static class. Thanks for your help – Annika Peterson Feb 17 '13 at 14:36
  • Your 'pseudo' code looks like it shouldn't have a problem - are you still seeing the original error? If so have you defined a constructor for the Map1 class (you don't need one)? – Chris White Feb 17 '13 at 15:04
  • I believe what happened is when I did this fix, I did not recompile correctly. I'm working on submitting my code again, but I'm pretty sure that this was my problem. Thanks for your help! – Annika Peterson Feb 17 '13 at 15:10
  • Do you happen to know why Hadoop requires it to be a static class rather than just having users define a constructor? – Annika Peterson Feb 17 '13 at 15:16
  • Hadoop uses reflection to create an instance of your mapper for each map task. For it to do this, the mapper must have a default (no-args) constructor; otherwise it would not know how to create an instance of that class. The static keyword on the nested class tells Java that this class does not need an instance of the parent class to be provided at construction time (the 'inner' class is actually independent of the parent class) – Chris White Feb 17 '13 at 18:47