I am trying to build my project using Eclipse on Windows and execute it on a Linux cluster. The project depends on some external jars, which I bundled into the jar using Eclipse's "Export -> Runnable JAR -> Package required libraries into generated JAR" option. I checked that the jar contains my classes in their package folder structure and that the external jars sit in the jar's root folder.

In Hadoop standalone mode, on Cygwin and on Linux, this works fine, but on an actual Hadoop Linux cluster it fails with a ClassNotFoundException as soon as it tries to access a class from the first external jar.

Is there a way to force Hadoop to look inside the packaged jar? I thought this approach would just work. This is the error I get:

10/07/16 11:44:59 INFO mapred.JobClient: Task Id : attempt_201007161003_0005_m_000001_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.jfree.data.xy.XYDataset
 at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
 at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
 at org.akintayo.analysis.ecg.preprocess.ReadPlotECG.plotECG(ReadPlotECG.java:27)
 at org.akintayo.analysis.ecg.preprocess.BuildECGImages.writeECGImages(BuildECGImages.java:216)
 at org.akintayo.analysis.ecg.preprocess.BuildECGImages.converSingleECGToImage(BuildECGImages.java:305)
 at org.akintayo.analysis.ecg.preprocess.BuildECGImages.main(BuildECGImages.java:457)
 at org.akintayo.hadoop.HadoopECGPreprocessByFile$MapTest.map(HadoopECGPreprocessByFile.java:208)
 at org.akintayo.hadoop.HadoopECGPreprocessByFile$MapTest.map(HadoopECGPreprocessByFile.java:1)
 at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
 at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
 at org.apache.hadoop.mapred.Child.main(Child.java:170)

2 Answers

Java cannot load classes from jars that are nested inside another jar (the standard classloaders can't handle this).

So what you have to do is either install those packages separately on each machine in the cluster or, if that is not possible, add the jars at run time. To do that, pass the -libjars option when launching your job, e.g. hadoop jar myjar.jar -libjars mylib.jar, and this should work.
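
One caveat worth knowing: -libjars is a generic option, so it only takes effect if your main class hands its arguments to GenericOptionsParser, which ToolRunner does for you. Below is a minimal sketch of such a driver, using the old mapred API that appears in your stack trace; the class name and the job wiring are made up and would need to be adapted to your code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // getConf() already reflects the generic options (-libjars, -D, ...)
        // that GenericOptionsParser stripped from the command line.
        JobConf conf = new JobConf(getConf(), MyDriver.class);
        conf.setJobName("ecg-preprocess");
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        // conf.setMapperClass(...), output key/value classes, etc.
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner runs GenericOptionsParser, so -libjars is honoured here.
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
    }
}

With such a driver the generic options must come before your own arguments, e.g. hadoop jar myjar.jar MyDriver -libjars mylib.jar /input/path /output/path (several jars can be given as a comma-separated list).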

wlk

Wojtek's answer is correct. Using -libjars will put your external jars in the distributed cache and make them available to all of your Hadoop nodes.
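
If you prefer to do it from the driver code rather than on the command line, adding a jar to the distributed cache yourself has roughly the same effect as -libjars. A small sketch, assuming the jar has already been uploaded to HDFS (the HDFS path and class name below are made up):

import java.io.IOException;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

public class CacheJarExample {
    public static void addExternalJar(JobConf conf) throws IOException {
        // The jar must already live in HDFS; this puts it on the classpath
        // of every map and reduce task of the job configured by conf.
        DistributedCache.addFileToClassPath(
                new Path("/user/akintayo/lib/mylib.jar"), conf);
    }
}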

However, if your external jars do not change frequently, you may find it more convenient to copy the jar files into hadoop/lib on each node manually. Once you restart Hadoop, your external jars will be on the classpath of all your jobs.

gras