
I have written a map-only Hadoop job in which I have used the MultipleOutputs concept. The problem is that I want to test this code with MRUnit, but I don't see any working example of MultipleOutputs testing.

My mapper code looks like this:

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {

        String inputString = value.toString();
        String outputString = null;
        Text resultValue = null;

        String finalResult = null;
        String exceptionMessage = null;

        try {
            outputString = processInput(dataSet, inputString);
        } catch (MalformedURLException me) {
            System.out.println("MalformedURLException Occurred in Mapper:"
                    + me.getMessage());
            exceptionMessage = me.getMessage();
        } catch (SolrServerException se) {
            System.out.println("SolrServerException Occurred in Mapper:"
                    + se.getMessage());
            exceptionMessage = se.getMessage();
        }

        // Parentheses needed: && binds more tightly than || in Java
        if ((outputString == null || outputString.isEmpty())
                && exceptionMessage != null) {
            exceptionMessage = exceptionMessage.replaceAll("\n", ", ");
            finalResult = inputString + "\t[Error] =" + exceptionMessage;
            resultValue = new Text(finalResult);
            multipleOutputs.write(SearchConstants.FAILURE_FILE, NullWritable.get(), resultValue);
        } else {
            finalResult = inputString + outputString;
            resultValue = new Text(finalResult);
            multipleOutputs.write(SearchConstants.SUCCESS_FILE, NullWritable.get(), resultValue);
        }
    }
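One subtlety in the guard above: in Java, `&&` binds more tightly than `||`, so `a || b && c` parses as `a || (b && c)`. A small self-contained sketch of the difference (the `out`/`err` variables here are hypothetical stand-ins for `outputString`/`exceptionMessage`, not part of the job):

```java
public class PrecedenceDemo {
    public static void main(String[] args) {
        String out = null;  // processInput produced nothing...
        String err = null;  // ...but no exception message was recorded either

        // Without parentheses this parses as: out == null || (out.isEmpty() && err != null)
        boolean unparenthesized = out == null || out.isEmpty() && err != null;

        // Intended meaning: (output is missing) AND (an exception message exists)
        boolean intended = (out == null || out.isEmpty()) && err != null;

        System.out.println(unparenthesized); // true  -> record routed to the failure branch
        System.out.println(intended);        // false -> no exception recorded, skip it
    }
}
```

With both variables null, the unparenthesized form sends the record down the failure branch and then throws a NullPointerException at `exceptionMessage.replaceAll(...)`.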

Can anyone give me a working example of an MRUnit test with MultipleOutputs?

Jahathesh

1 Answer


Here's an example with a slightly simplified version of your class:

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

import java.io.IOException;

public class SomeMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
    public static final String SUCCESS_FILE = "successFile";
    private static MultipleOutputs<NullWritable, Text> multipleOutputs;
    private static Text result = new Text();

    @Override
    public void setup(Context context) throws IOException, InterruptedException {
        multipleOutputs = new MultipleOutputs<>(context);
        super.setup(context);
    }

    @Override
    public void cleanup(Context context) throws IOException, InterruptedException {
        // Close MultipleOutputs; in a real job, skipping this can lose buffered output
        multipleOutputs.close();
        super.cleanup(context);
    }

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        String outputString = "some result";  // logic here

        result.set(outputString);
        multipleOutputs.write(SUCCESS_FILE, NullWritable.get(), result);
    }
}

and the test:

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
@PrepareForTest({MultipleOutputs.class, SomeMapper.class})
public class SomeMapperTest {
    @Test
    public void someTest() throws Exception {
        MapDriver<LongWritable, Text, NullWritable, Text> mapDriver = MapDriver.newMapDriver(new SomeMapper());
        mapDriver.withInput(new LongWritable(0), new Text("some input"))
            .withMultiOutput(SomeMapper.SUCCESS_FILE, NullWritable.get(), new Text("some result"))
            .runTest();
    }
}

and the build.gradle:

apply plugin: "java"
sourceCompatibility = 1.7
targetCompatibility = 1.7

repositories {
  mavenCentral()
}

dependencies {
  compile "org.apache.hadoop:hadoop-client:2.4.0"
  testCompile "junit:junit:4.12"
  testCompile("org.apache.mrunit:mrunit:1.1.0:hadoop2") {
    exclude(group: "org.mockito")
  }
  testCompile "org.powermock:powermock-module-junit4:1.6.2"
  testCompile "org.powermock:powermock-api-mockito:1.6.2"
}

Note the Mockito exclusion. Without it, I got the exception java.lang.NoSuchMethodError: org.mockito.mock.MockCreationSettings.getSerializableMode()Lorg/mockito/mock/SerializableMode; because that Hadoop dependency pulled in org.mockito:mockito-core:1.9.5, which conflicted with the Mockito version Powermock wanted to use.
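Since Maven comes up in the comments below: the same exclusion can be expressed in a pom.xml roughly like this (version and classifier copied from the Gradle setup above; adjust to your build):

```xml
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>1.1.0</version>
  <classifier>hadoop2</classifier>
  <scope>test</scope>
  <exclusions>
    <!-- Exclude the Mockito that mrunit pulls in, so Powermock's version wins -->
    <exclusion>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```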

You can find additional examples in MRUnit's org.apache.hadoop.mrunit.mapreduce.TestMultipleOutput unit tests.

Keegan
    I am using Maven. I got the below error when I tried to use PowerMock, and I have excluded the Mockito dependency as well: `java.lang.NoClassDefFoundError: org/mockito/cglib/proxy/Enhancer at org.powermock.api.extension.proxyframework.ProxyFrameworkImpl.isProxy(ProxyFrameworkImpl.java:50)` `<dependency> <groupId>org.apache.mrunit</groupId> <artifactId>mrunit</artifactId> <version>1.1.0</version> <classifier>hadoop2</classifier> <exclusions> <exclusion> <groupId>org.mockito</groupId> <artifactId>mockito-core</artifactId> </exclusion> </exclusions> </dependency>` – Jahathesh May 26 '15 at 12:32
  • Were any of your dependency versions different than the ones in my example? Obviously `mvn dependency:tree` might give some insight. It kinda sounds like your version of Powermock doesn't include Mockito. – Keegan May 26 '15 at 12:58
  • I edited my answer to use the highest Hadoop version [supported](http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-plan-hadoop-version.html) by EMR, since you tagged your question as EMR. – Keegan May 26 '15 at 21:15