I am trying to make a custom partitioner to allocate each unique key to a single reducer. This was after the default HashPartitioner failed (see Alternative to the default hashpartioner provided with hadoop).

I keep getting the following error. From what I can tell from doing some research, it has something to do with the constructor not receiving its arguments. But in this context, with Hadoop, aren't the arguments passed automatically by the framework? I can't find an error in the code.

18/04/20 17:06:51 INFO mapred.JobClient: Task Id : attempt_201804201340_0007_m_000000_1, Status : FAILED
java.lang.RuntimeException: java.lang.NoSuchMethodException: biA3pipepart$parti.<init>()
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:587)

This is my partitioner:

public class Parti extends Partitioner<Text, Text> {
    String partititonkey;
    int result = 0;

    @Override
    public int getPartition(Text key, Text value, int numPartitions) {
        String partitionKey = key.toString();

        if (numPartitions >= 9) {
            if (partitionKey.charAt(0) == '0') {
                if (partitionKey.charAt(2) == '0')
                    result = 0;
                else if (partitionKey.charAt(2) == '1')
                    result = 1;
                else
                    result = 2;
            } else if (partitionKey.charAt(0) == '1') {
                if (partitionKey.charAt(2) == '0')
                    result = 3;
                else if (partitionKey.charAt(2) == '1')
                    result = 4;
                else
                    result = 5;
            } else if (partitionKey.charAt(0) == '2') {
                if (partitionKey.charAt(2) == '0')
                    result = 6;
                else if (partitionKey.charAt(2) == '1')
                    result = 7;
                else
                    result = 8;
            }
        } else {
            result = 0;
        }

        return result;
    } // close method
} // close class
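For the nine keys this job emits, the logic above reduces to partition = 3 * firstDigit + secondDigit. A minimal standalone sketch of that same mapping (plain Java, no Hadoop dependency; the class and method names `PartiCheck`/`partitionFor` are just for illustration):

```java
public class PartiCheck {
    // Mirrors Parti.getPartition for numPartitions >= 9 and keys like "1,2":
    // first digit picks the group of three, second digit picks within it.
    static int partitionFor(String key) {
        int first = key.charAt(0) - '0';   // '0'..'2'
        int second = key.charAt(2) - '0';  // '0'..'2'
        return 3 * first + second;
    }

    public static void main(String[] args) {
        String[] keys = {"0,0", "0,1", "0,2", "1,0", "1,1",
                         "1,2", "2,0", "2,1", "2,2"};
        for (String k : keys) {
            System.out.println(k + " -> reducer " + partitionFor(k));
        }
    }
}
```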

My mapper signature:

public static class JoinsMap extends Mapper<LongWritable,Text,Text,Text>{
    public void Map(LongWritable key, Text value, Context context) throws IOException, InterruptedException{

My reducer signature:

public static class JoinsReduce extends Reducer<Text,Text,Text,Text>{
public void Reduce (Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {

main class:

public static void main(String[] args) throws Exception {

    Configuration conf1 = new Configuration();
    Job job1 = new Job(conf1, "biA3pipepart");

    job1.setJarByClass(biA3pipepart.class);
    job1.setNumReduceTasks(9); // ***

    job1.setOutputKeyClass(Text.class);
    job1.setOutputValueClass(Text.class);

    job1.setMapperClass(JoinsMap.class);
    job1.setReducerClass(JoinsReduce.class);

    job1.setInputFormatClass(TextInputFormat.class);
    job1.setOutputFormatClass(TextOutputFormat.class);

    job1.setPartitionerClass(Parti.class); // +++

    // inputs to map.
    FileInputFormat.addInputPath(job1, new Path(args[0]));

    // single output from reducer.
    FileOutputFormat.setOutputPath(job1, new Path(args[1]));

    job1.waitForCompletion(true);
}

The keys emitted by the Mapper are the following:

0,0
0,1
0,2
1,0
1,1
1,2
2,0
2,1
2,2

and the Reducer only writes keys and values it receives.

zaranaid
  • Please show the entire `biA3pipepart` class, the job configuration, and capitalize all your class names – OneCricketeer Apr 20 '18 at 22:48
  • Are you by any chance mixing the two different Hadoop APIs: import org.apache.hadoop.mapred.Partitioner (older) vs. import org.apache.hadoop.mapreduce.Partitioner (newer)? – user238607 Apr 21 '18 at 09:52
  • all my imports start with org.apache.hadoop.mapreduce I am using the older API – zaranaid Apr 21 '18 at 10:00
  • I have added the full code to the question @cricket_007 – zaranaid Apr 21 '18 at 10:13
  • I have added full code @user238607 – zaranaid Apr 21 '18 at 10:13
  • Define the partitioner as static as well, just as you have done for the mapper and reducer. Just a suggestion for you to try out, based on this: https://stackoverflow.com/questions/11022812/hadoop-no-such-method-exception. This question is exactly similar to yours: https://stackoverflow.com/questions/9437895/error-while-using-hadoop-partitioning – user238607 Apr 21 '18 at 15:40
  • You're GREAT @user238607!! It worked – zaranaid Apr 21 '18 at 15:56
  • Update your post with the solution. So that future readers can know what worked. People don't read comments. – user238607 Apr 21 '18 at 16:00

1 Answer


SOLVED

I just added static to my Parti class, like the mapper and reducer classes, as suggested in the comments by user238607.

public static class Parti extends Partitioner<Text, Text> {
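The reason static matters here: Hadoop's ReflectionUtils.newInstance creates the partitioner via its no-argument constructor, and a non-static inner class has no true no-arg constructor, since its implicit constructor takes an instance of the enclosing class. A minimal illustration outside Hadoop (the class names `Outer`, `Inner`, and `Nested` are just for demonstration):

```java
import java.lang.reflect.Constructor;

public class Outer {
    class Inner {}          // non-static: implicit constructor takes an Outer instance
    static class Nested {}  // static: has a genuine no-arg constructor

    public static void main(String[] args) throws Exception {
        // Works: a static nested class can be instantiated reflectively.
        Nested n = Nested.class.getDeclaredConstructor().newInstance();
        System.out.println("Nested instantiated: " + n);

        try {
            // Throws NoSuchMethodException, the same failure Hadoop reports.
            Constructor<Inner> c = Inner.class.getDeclaredConstructor();
        } catch (NoSuchMethodException e) {
            System.out.println("Inner has no no-arg constructor: " + e);
        }
    }
}
```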
zaranaid