
So I am trying to create a unit test for my Hadoop map function, using Mockito. I have created the Mapper class as follows:

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Mapper

class XmlMapper extends Mapper[LongWritable, Text, Text, LongWritable] {

    override def map(key: LongWritable, value: Text, context: Mapper[LongWritable, Text, Text, LongWritable]#Context): Unit = {
        // does stuff
    }
}

And then I have the following test:

import org.apache.hadoop.io.{LongWritable, Text}
import org.scalatest.FlatSpec
import org.scalatest.mockito.MockitoSugar
import org.mockito.Mockito.verify
import org.mockito.Mockito.times

class XmlMapperTest extends FlatSpec with MockitoSugar {

    "XmlMapper" should "output" in {
        val mapper = new XmlMapper
        val context = mock[mapper.Context]
        // does nothing yet
    }
}

But I get the following error:

- should output *** FAILED ***
[info]   org.mockito.exceptions.base.MockitoException: Mockito cannot mock this class: class org.apache.hadoop.mapreduce.Mapper$Context.
[info]
[info] Mockito can only mock non-private & non-final classes.
[info] If you're not sure why you're getting this error, please report to the mailing list.

This doesn't make sense to me, because Mapper.Context is a public abstract class.

Here is the stack trace:

[info] Underlying exception : java.lang.UnsupportedOperationException: Cannot define class using reflection
[info]   at org.scalatest.mockito.MockitoSugar.mock(MockitoSugar.scala:73)
[info]   at org.scalatest.mockito.MockitoSugar.mock$(MockitoSugar.scala:72)
[info]   at XmlMapperTest.mock(XmlMapperTest.scala:7)
[info]   at XmlMapperTest.$anonfun$new$1(XmlMapperTest.scala:11)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
Vineet Patel
  • What version of Mockito and Hadoop are you using? I just tried your code and it works... – ultrasecr.eth Feb 26 '19 at 18:40
  • I am using hadoop-mapreduce-client-core 2.8.1 and mockito-core 2.8.47. Using Java version 10, Java HotSpot(TM) 64-Bit Server VM version 10, all on Windows 10. – Vineet Patel Feb 26 '19 at 20:06
  • Hmm, that version of mockito-core seems quite old, anyway, if you're using scala you would probably wanna move to mockito-scala, I tried v1.1.5 and it works with your hadoop dependency... That said, I have to point out that ideally you want to avoid mocking classes that you don't own as a good practice – ultrasecr.eth Feb 26 '19 at 20:36
  • Thank you! Moving to mockito-scala worked. – Vineet Patel Feb 26 '19 at 21:04
  • Also what do you suggest for unit testing for Hadoop, if mocking classes isn't ideal? – Vineet Patel Feb 26 '19 at 21:04
  • I've put everything together as an answer, please check below – ultrasecr.eth Feb 26 '19 at 21:16

1 Answer


It seems you're using an old version of mockito-core. Given that you're using Scala, I'd strongly suggest moving to the latest version of mockito-scala.
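For example, here is a minimal sketch of the switch, assuming an sbt build and the version I tried in the comments (1.1.5); the only real change in the test itself is which MockitoSugar you import:

// build.sbt: swap mockito-core for mockito-scala (version matches the one tried in the comments)
libraryDependencies += "org.mockito" %% "mockito-scala" % "1.1.5" % Test

// XmlMapperTest.scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.scalatest.FlatSpec
import org.mockito.MockitoSugar // mockito-scala's sugar, replacing org.scalatest.mockito.MockitoSugar

class XmlMapperTest extends FlatSpec with MockitoSugar {

    "XmlMapper" should "output" in {
        val mapper = new XmlMapper
        val context = mock[mapper.Context] // created through the newer mockito-core that mockito-scala pulls in
        // exercise mapper.map(...) and verify calls on context here
    }
}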

Also, whenever possible and as a good practice, you should avoid mocking classes you don't own. A better approach is to wrap the interaction with the third-party library in a class (or set of classes) and use integration tests to prove that those wrappers work as expected.

In the rest of the system you inject these wrapper classes, so in your tests you mock the wrappers instead. This lets you control (and simplify) the exposed interface, and it also minimises the disruption if a future version of the third-party library changes its API.
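As a rough illustration of that idea (the XmlRecordWriter trait and the class names below are made up just to show the shape of the wrapper, not an API from Hadoop or Mockito):

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Mapper

// A tiny interface you own; this is the only thing your unit tests need to mock.
trait XmlRecordWriter {
    def emit(key: Text, value: LongWritable): Unit
}

// Thin adapter over the Hadoop Context, covered by integration tests rather than mocks.
class ContextXmlRecordWriter(context: Mapper[LongWritable, Text, Text, LongWritable]#Context)
    extends XmlRecordWriter {
    override def emit(key: Text, value: LongWritable): Unit = context.write(key, value)
}

// The mapping logic depends only on the wrapper, so it never touches Mapper.Context directly.
class XmlMapperLogic(writer: XmlRecordWriter) {
    def process(key: LongWritable, value: Text): Unit = {
        // does stuff, calling writer.emit(...) for each output record
    }
}

XmlMapper.map then just wraps its context in a ContextXmlRecordWriter and delegates to XmlMapperLogic, so the Hadoop-specific surface stays thin and easy to cover with a small integration test.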

For more details, check this StackOverflow answer and this post on the Mockito blog.

ultrasecr.eth
  • Thank you! Updating mockito-core or swapping it for mockito-scala helped me resolve the issues I was seeing when running tests for an old project. – Onema Jan 22 '22 at 02:06