
I am using a native method in my mapper code.

class Map extends Mapper<LongWritable, Text, LongWritable, Text> {
   static {
      // Loads libmyjni.so once per JVM; throws UnsatisfiedLinkError
      // if the library is not found on java.library.path.
      System.loadLibrary("myjni");
   }
   public native String getRow(String record, String query);

   @Override
   public void map(LongWritable key, Text value, Context context) {
      //...
   }
}

I performed all the necessary steps to build the JNI library (.so) file, and I also copied the .so file to HDFS. But Hadoop still does not recognize the path; it gives the error: no myjni in java.library.path.

How do I let the mapper know the path of the native library file? Please help. Thanks.

user1612078

1 Answer


Files in HDFS are essentially unknown to everything except software explicitly designed to be HDFS-aware. In particular, Java and its class loaders know nothing of HDFS, so System.loadLibrary cannot find a .so that exists only in HDFS. To make HDFS-based files appear in the native file system for such uses, use the "DistributedCache" API in your MapReduce job. This is a mechanism for copying HDFS-based files and archives to the local file system of each task node (in YARN, a similar feature is called "resource localization"). See the following for help:

Stackoverflow yahoo tutorial
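A minimal driver sketch of the approach above, assuming Hadoop 2.x (Job.addCacheFile); the HDFS path hdfs:///libs/libmyjni.so and the class names are placeholders, not from the question:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class JniJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "jni example");
        job.setJarByClass(JniJobDriver.class);

        // Cache the .so from HDFS; the "#libmyjni.so" fragment makes Hadoop
        // create a symlink with that name in each task's working directory.
        job.addCacheFile(new URI("hdfs:///libs/libmyjni.so#libmyjni.so"));

        // Point the task JVM's native-library search path at its working
        // directory, so System.loadLibrary("myjni") can find the symlink.
        job.getConfiguration().set("mapreduce.map.java.opts",
                "-Djava.library.path=.");

        // ... set mapper class, input/output formats and paths, then submit.
    }
}

Note the naming convention: System.loadLibrary("myjni") looks for a file literally named libmyjni.so, so the symlink name after "#" must include the lib prefix and .so suffix.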

Wheezil