
I want to inject a dependency into my Mapper class.

Example Mapper Class:

public class Mapper {
    private MyInterface myObject;

    public void map() {
        // Map code here
    }
}

I would like to inject an implementation of MyInterface into myObject using Spring. This is not possible directly with Spring, since the Hadoop framework itself instantiates the Mapper objects.

The only way I can come up with is to add a configure() method to my Mapper class and then do something like:

public void configure() {
    // create application context here, then
    myObject = (MyInterface) applicationContext.getBean("bean.myImplementation1");
}
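
Filled in, it would look roughly like the sketch below. Since creating an ApplicationContext is expensive, caching it in a static field seems sensible (the file name applicationContext.xml and the bean id are placeholders of mine):

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Mapper {
    // Build the Spring context once per JVM rather than once per map task
    private static final ApplicationContext CONTEXT =
            new ClassPathXmlApplicationContext("applicationContext.xml"); // placeholder file name

    private MyInterface myObject;

    public void configure() {
        myObject = (MyInterface) CONTEXT.getBean("bean.myImplementation1"); // placeholder bean id
    }

    public void map() {
        // Map code here
    }
}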

Is there a better way to do this?

Thanks in advance

Water
  • I can't understand the following sentence: **I would like to inject an implementation of MyInterface to myObject using Spring**. Can you explain it in a better way? – Skizzo Jul 16 '14 at 07:15
  • @Skizzo: Hi, suppose I have 2 implementations of MyInterface, imp1 & imp2. I may choose to inject one of those into the myObject variable at runtime. Usually this is possible by injecting Spring beans. – Water Jul 16 '14 at 09:19

3 Answers


I went through a couple of books on Hadoop. It seems the configure() method is the only way to do this.

I have already added the code in the question.

Water

This is a common dilemma on Hadoop, because the Mapper and Reducer are handed to you by the framework. I found it best to call out to a lightweight DI framework from the setup() methods. Read my blog post about Dependency Injection on Hadoop. I wrote a single class to handle DI, called Spit-DI, which is available on GitHub and uses the JSR-250 @Resource annotation for injections.

It ends up looking like this:

class MovieMapper extends Mapper {
   @Resource
   private Movie movie;

   @Override
   protected void setup(Context context) {
      DependencyInjector.instance().using(context).injectOn(this);
   }
}

class Movie {
   @Resource
   private Counter numMoviesRequested;
   
   public Integer getYear(String title) { 
     numMoviesRequested.increment(1);
     // more code...
   }
}

/**
 * You can have a wrapper class around Spit-DI for all your configuration.
 * (We have a TestDependencyInjector as well for the context of unit testing.)
 */
class DependencyInjector {
   private SpitDI spit = new SpitDI();

   public void injectOn(Object instance) {
      spit.inject(instance);
   }

   public DependencyInjector using(final Mapper.Context context) {
      spit.bindByType(Movie.class, new Movie());
      spit.bindByName(Counter.class, "numMoviesRequested", context.getCounter("movies", "numMoviesRequested"));
      return this;
   }
}
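
For unit testing, the TestDependencyInjector mentioned in the comment above can bind stubs instead of real Hadoop objects. A minimal sketch, using only the Spit-DI calls already shown here (StubMovie is a hypothetical test double):

class TestDependencyInjector {
   private SpitDI spit = new SpitDI();

   public void injectOn(Object instance) {
      spit.inject(instance);
   }

   public TestDependencyInjector usingStubs() {
      // Bind a stub so mapper tests need no real Mapper.Context
      spit.bindByType(Movie.class, new StubMovie()); // StubMovie is hypothetical
      return this;
   }
}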
Paul Mazak

The default way to do injection in Spring is based on the type of the object. In your case you can't use this kind of injection, because you have two different implementations of the same interface. Instead, you can use the following strategy to inject these objects (I'm supposing that you have an XML configuration for Spring):

<beans>
   <context:annotation-config/>
   <bean class="example.MyFirstImpl">
      <qualifier value="first"/>
   </bean>
   <bean class="example.MySecondImpl">
      <qualifier value="second"/>
   </bean>
   <bean class="example.TestComponent" />
</beans>

Then in your component class you can use:

public class TestComponent {

   @Autowired
   @Qualifier("first")
   MyInterface myInterface;
}
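
For completeness, the two implementations referenced in the XML are just ordinary classes, sketched here (each would live in its own file in the example package):

// MyFirstImpl.java
public class MyFirstImpl implements MyInterface {
   // first implementation
}

// MySecondImpl.java
public class MySecondImpl implements MyInterface {
   // second implementation
}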
Skizzo
  • I think you answered a different question. The question asks how to inject dependencies, which is not possible because the Hadoop framework does not use a DI framework to instantiate mappers/reducers. Adding @Autowired to a Hadoop mapper will not result in that field being set to anything when Hadoop runs the mapper. – Carl G Jan 14 '15 at 21:31