I know this question might already have been answered, but I haven't found a proper answer. I am using Hadoop MapReduce in Eclipse and I want to create an executable jar to put on the Linux server where my HDFS cluster runs. The problem: when I run the jar on my HDFS cluster (and in Eclipse), an exception appears caused by java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory. So I added the external jar commons-logging, but when I ran it again another NoClassDefFoundError appeared, this time for log4j's Level class, and so on. I wondered how many external jars I would have to add, but I don't think that's the best solution. Has anyone had the same problem, and how did you solve it? Thank you very much.
There are a bunch of jars you'll need to add. Do you use plain vanilla Hadoop or CDH/HDP? In either case you can use Maven to manage your dependencies, and if it's the latter, I can give you my Maven dependencies. – aa8y Jul 03 '14 at 20:13
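As a sketch of the Maven route the comment suggests: for plain vanilla Hadoop, a single hadoop-client dependency pulls in commons-logging, log4j, and the rest of the Hadoop runtime transitively. The version below is an assumption; match it to your cluster, and use provided scope if the cluster already supplies these jars at runtime:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <!-- assumed version: pick the one matching your cluster -->
        <version>2.4.1</version>
        <!-- provided, because the "hadoop jar" command supplies
             the Hadoop runtime jars when the job is launched -->
        <scope>provided</scope>
    </dependency>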
1 Answer
Usually, this error occurs when an earlier exception was thrown during logging initialisation.
Hadoop is quite fickle about dependencies. You should either make sure that your third-party libs are on your Hadoop classpath, or create a shaded jar that bundles all dependencies into your job jar. You can use the Maven Shade or Assembly plugin to create that jar.
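A minimal sketch of the shaded-jar route using the maven-shade-plugin; the plugin version and the com.example.MyJob main class are placeholders for illustration:

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <!-- assumed plugin version -->
          <version>2.3</version>
          <executions>
            <execution>
              <!-- bind the shade goal to the package phase so that
                   "mvn package" produces one self-contained jar -->
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
              <configuration>
                <transformers>
                  <!-- write Main-Class into the manifest;
                       the class name is a placeholder -->
                  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass>com.example.MyJob</mainClass>
                  </transformer>
                </transformers>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

Running mvn package then leaves a jar in target/ that contains your classes plus every dependency, and you can submit it with hadoop jar without adding external jars one by one. The classpath alternative works too: leave the dependencies out of the job jar and point HADOOP_CLASSPATH at them (or pass them with -libjars) when you launch the job.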

– Erik Schmiegelow