When I run a Hadoop .jar file from the command prompt, it throws a NoSuchMethodException for my StockKey class.

StockKey is my custom class defined for my own type of key.

Here is the exception:

12/07/12 00:18:47 INFO mapred.JobClient: Task Id : attempt_201207082224_0007_m_000000_1, Status : FAILED

java.lang.RuntimeException: java.lang.NoSuchMethodException: SecondarySort$StockKey.<init>()
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
    at org.apache.hadoop.io.WritableComparator.newKey(WritableComparator.java:109)
    at org.apache.hadoop.io.WritableComparator.<init>(WritableComparator.java:95)
    at org.apache.hadoop.io.WritableComparator.get(WritableComparator.java:51)
    at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:795)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:817)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:383)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)
London guy
    For the future reader like me who gets directed here: I'd also warn you against making your WritableComparable implementation an inner class (in my case, it was an inner class of my mapper). I got the same exceptions as above. When I moved it to a class of its own, it worked fine – Nishant Kelkar Apr 23 '15 at 01:04

6 Answers

There's another thing to check when getting errors like this for classes that are Writables, mappers, reducers, etc.

If the class is an inner class, make sure it's declared static (i.e. doesn't need an instance of the enclosing class). Otherwise, Hadoop cannot instantiate your inner class and will give this same error - that a zero-arg constructor is needed.
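A minimal sketch of why this happens, using plain Java reflection (the same zero-arg-constructor lookup Hadoop performs via ReflectionUtils); the names `Outer`, `InnerKey`, and `NestedKey` are hypothetical:

```java
public class Outer {
    // Non-static inner class: its "no-arg" constructor implicitly takes an
    // Outer instance, so a reflective lookup for a true zero-arg constructor fails.
    class InnerKey {
        public InnerKey() {}
    }

    // Static nested class: a genuine zero-arg constructor exists.
    static class NestedKey {
        public NestedKey() {}
    }

    public static void main(String[] args) throws Exception {
        // Works: NestedKey really has a zero-arg constructor.
        System.out.println(NestedKey.class.getDeclaredConstructor().newInstance().getClass());

        try {
            // Fails: InnerKey's only constructor is InnerKey(Outer).
            InnerKey.class.getDeclaredConstructor().newInstance();
        } catch (NoSuchMethodException e) {
            System.out.println("NoSuchMethodException, as in the stack trace above");
        }
    }
}
```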

Chris
    You just got my butt out of a sling, too. Thanks for the explanation about static inner classes! – Stevens Miller Jul 03 '14 at 19:22
    I didn't know the inner class had to be static to instantiate outside of the enclosing class, this saved me so much time and frustration. – DragonDTG May 01 '15 at 16:09
You have to provide an empty default constructor in your key class. Hadoop uses reflection to create the key instance, and it cannot guess any parameters to feed to a constructor.

So just add the default constructor:

public StockKey(){}
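Since the question doesn't show StockKey's fields, here is a hedged sketch of what a complete key with the required constructor might look like. The symbol/timestamp fields are assumptions, and the real class would declare `implements WritableComparable<StockKey>`; plain `Comparable` is used here so the sketch compiles without Hadoop on the classpath, while `write`/`readFields` keep Hadoop's signatures:

```java
import java.io.*;

public class StockKey implements Comparable<StockKey> {
    private String symbol = "";   // assumed field
    private long timestamp;       // assumed field

    // The constructor Hadoop's ReflectionUtils needs: public and zero-arg.
    public StockKey() {}

    public StockKey(String symbol, long timestamp) {
        this.symbol = symbol;
        this.timestamp = timestamp;
    }

    // Same signature as Writable.write
    public void write(DataOutput out) throws IOException {
        out.writeUTF(symbol);
        out.writeLong(timestamp);
    }

    // Same signature as Writable.readFields; Hadoop calls this on a
    // reflectively created, empty instance.
    public void readFields(DataInput in) throws IOException {
        symbol = in.readUTF();
        timestamp = in.readLong();
    }

    @Override
    public int compareTo(StockKey other) {
        int c = symbol.compareTo(other.symbol);
        return c != 0 ? c : Long.compare(timestamp, other.timestamp);
    }
}
```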
Thomas Jungblut
    I also had this issue when implementing `Writable` and `WritableComparable` with nested `Writable` fields - not only do you have to provide a default constructor, but that default constructor also has to instantiate any other `Writable` instance variables; otherwise you get a SpillError. – bbengfort Nov 13 '13 at 15:07
Make sure you have the default constructor, but I also had to add the static keyword to my class declaration. That is,

public class SecondarySort {
  public static void main(String[] args) {...}

  public static class StockKey extends ... {}
}
For Scala too, I fixed the problem by adding a default constructor, as below:

import java.io.{DataInput, DataOutput}
import org.apache.hadoop.io.{IntWritable, WritableComparable}

class IntPair(private val first: IntWritable, private val second: IntWritable)
    extends WritableComparable[IntPair] {

  // Hadoop instantiates the key reflectively, so a no-arg constructor is required
  def this() = this(new IntWritable(), new IntWritable())

  def getFirst: IntWritable = first
  def getSecond: IntWritable = second

  override def write(out: DataOutput): Unit = { first.write(out); second.write(out) }

  override def readFields(in: DataInput): Unit = { first.readFields(in); second.readFields(in) }

  override def compareTo(o: IntPair): Int = {
    val cmp = first.compareTo(o.first)
    if (cmp != 0) cmp else second.compareTo(o.second)
  }
}
prayagupa
None of the answers helped me.

In my case, it happened because I had reduced a constructor's visibility by mistake or in a hurry.

E.g. the parent constructor is public, but the inherited class's constructor is package-private or protected!
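A quick way to see this with plain reflection: `Class.getConstructor` only finds public constructors, so a no-arg constructor whose visibility was accidentally reduced looks exactly like a missing one. The names `GoodKey` and `HiddenKey` are hypothetical; whether a given Hadoop code path relaxes access with `setAccessible` can vary, but a public-constructor lookup like this one will not:

```java
public class VisibilityDemo {
    public static class GoodKey {
        public GoodKey() {}            // visible to reflection
    }

    public static class HiddenKey {
        protected HiddenKey() {}       // visibility accidentally reduced
    }

    public static void main(String[] args) {
        check(GoodKey.class);
        check(HiddenKey.class);
    }

    static void check(Class<?> c) {
        try {
            // Class.getConstructor only returns *public* constructors.
            c.getConstructor().newInstance();
            System.out.println(c.getSimpleName() + ": ok");
        } catch (ReflectiveOperationException e) {
            System.out.println(c.getSimpleName() + ": " + e.getClass().getSimpleName());
        }
    }
}
```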

CodeToLife
I was facing this same issue and fixed it by following the pointers from @Thomas and @Chris.

It looks like both of these solutions are needed to solve the problem:

  • The answer from @Thomas is required because Hadoop uses reflection to instantiate the key, and when building large projects.

  • The answer from @Chris is required when using inner classes and invoking mappers/reducers from main().

vmorusu