
Driver Class:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class DRIVER {

    public static void main(String[] args) {
        try {
            Path in = new Path("aamazon.txt");
            Path out = new Path("/output");

            Configuration conf = new Configuration();

            Job job = Job.getInstance(conf);
            job.setJarByClass(DRIVER.class);
            job.setMapperClass(MAPPER.class);
            job.setReducerClass(REDUCER.class);
            job.setNumReduceTasks(0);

            FileInputFormat.addInputPath(job, in);
            FileOutputFormat.setOutputPath(job, out);

            job.waitForCompletion(true);

            System.out.println("Successful");
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}

Mapper Class:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MAPPER extends Mapper<LongWritable, Text, LongWritable, Text> {

    @Override
    public void map(LongWritable key, Text value, Context con) {
        try {
            System.out.println(key + "\n" + value);
            con.write(key, value);
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}

Reducer Class:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class REDUCER extends Reducer<LongWritable, Text, LongWritable, Text> {

    @Override
    public void reduce(LongWritable key, Iterable<Text> value, Context con) {
        System.out.println("reducer");
        try {
            for (Text t : value) {
                con.write(key, t);
            }
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}

Problems:

  1. Execution works up until the Mapper.
  2. The Reducer never gets called.
  3. If I set setNumReduceTasks(0), then the Mapper is not called either.

Any ideas what is wrong?

theblindprophet
Ash

1 Answer


You have set the number of reduce tasks to zero.

    Job job = Job.getInstance(conf);
    job.setJarByClass(DRIVER.class);
    job.setMapperClass(MAPPER.class);
    job.setReducerClass(REDUCER.class);
    job.setNumReduceTasks(0); // this should be greater than 0

If it still does not work after that, check that you have write permissions on the "/output" path used on the following line:

    Path out = new Path("/output"); // it is in the root folder. change it to "./output"
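
Putting both fixes together, a corrected driver might look like the sketch below. This is only a suggestion, not code from the question: the relative "./output" path, the explicit output key/value classes, and the exit-code handling are assumptions layered on top of the original MAPPER and REDUCER classes.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DRIVER {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        Job job = Job.getInstance(conf);
        job.setJarByClass(DRIVER.class);
        job.setMapperClass(MAPPER.class);
        job.setReducerClass(REDUCER.class);
        // Do NOT call job.setNumReduceTasks(0): zero reduce tasks makes this
        // a map-only job and reduce() is never invoked. Leaving the default
        // (1) lets the REDUCER run.

        // Declare the job's output types explicitly to match the REDUCER's
        // <LongWritable, Text> output.
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path("aamazon.txt"));
        // Use a relative path the submitting user can write to, rather than
        // "/output" at the filesystem root.
        FileOutputFormat.setOutputPath(job, new Path("./output"));

        // Exit with the job's real status instead of printing "Successful"
        // unconditionally.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Checking the boolean returned by waitForCompletion(true) also surfaces failures that the original try/catch silently swallowed.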
ViKiG
  • 764
  • 9
  • 21
  • even if I remove the 0 or don't include that code, the reducer is still not working – Ash May 26 '16 at 22:04
  • I checked it by running the program. It works just fine. If I give the number of reduce tasks as zero, it simply skips the reduce and exits successfully without any errors. If you are still getting errors, post the stack trace or any logged information (which you should have done while posting this question). – ViKiG May 27 '16 at 07:19
  • you are a life saver... thanks mate. "./output" worked for me... appreciate it – Ash May 27 '16 at 10:45