
I am a newbie and I am trying to find a solution to this problem. I followed this tutorial to set up Hadoop 2.7.2 on Ubuntu 15.10:

http://idroot.net/tutorials/how-to-install-apache-hadoop-on-ubuntu-14-04/

When I launch "hdfs namenode format" I keep receiving this error:

Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

This is my .bashrc content:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

Can anyone help me solve this (I suspect trivial) problem?

Many thanks, Kama

Roberto G.
  • I have updated my answer, however if that doesn't work you can use my installation script for Hadoop [here](https://github.com/user501254/BD_STTP_2016/blob/master/InstallHadoop.sh). Remember to go through it before running. – Ashesh Jan 30 '16 at 20:23
  • See related question: https://stackoverflow.com/questions/40888460/error-could-not-find-or-load-main-class-org-apache-hadoop-hdfs-server-namenode/ – Gabor Szarnyas Feb 04 '20 at 22:54

4 Answers


First, make sure that the namenode and datanode directories already exist at the locations specified in your hdfs-site.xml file. You can use the mkdir command to create them.
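For illustration, a minimal hdfs-site.xml could look like the sketch below; the /usr/local/hadoop_store paths (and the hduser:hadoop owner further down) are only examples, so substitute whatever your own setup uses:

<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
  </property>
</configuration>

The matching directories can then be created with:

mkdir -p /usr/local/hadoop_store/hdfs/namenode
mkdir -p /usr/local/hadoop_store/hdfs/datanode
# if Hadoop runs as a dedicated user, hand it ownership, e.g.:
# sudo chown -R hduser:hadoop /usr/local/hadoop_store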

Then try to format the namenode using

hdfs namenode -format

or

/usr/local/hadoop/bin/hdfs namenode -format

Please note the hyphen.


My .bashrc configuration for Hadoop:

#HADOOP VARIABLES START
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
# JAVA_HOME must already be exported (as in the question's .bashrc) for the next line
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
#HADOOP VARIABLES END
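After updating .bashrc, reload it and check that the shell now resolves the Hadoop binaries; a quick sanity check along these lines should do (expected output assumes the /usr/local/hadoop install from the question):

source ~/.bashrc
which hdfs       # should print /usr/local/hadoop/bin/hdfs
hadoop version   # should report Hadoop 2.7.2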
Ashesh
  • Hi! I have checked the directories but I still receive this error. I am using an Oracle JRE instead of a specific (OpenJDK) JDK. Could this be responsible? My opinion is that the issue is related to a class not being found, so the hdfs command is unable to locate the desired class. – Roberto G. Jan 31 '16 at 12:39
  • Try changing all occurrences of "HADOOP_INSTALL" to "HADOOP_HOME" or "HADOOP_PREFIX" in your bashrc file. – Ashesh Jan 31 '16 at 14:25
  • IMO, the JRE should work just fine as long as you aren't building any applications of your own. – Ashesh Jan 31 '16 at 14:31
  • @RobertoG. 2.7.2, which is the most current version. – Ashesh Jan 31 '16 at 18:44

Problem solved using Ashesh's script. The main difference is the usage of OpenJDK instead of the Oracle JRE.

Thanks for the help!

Roberto G.

I had this error too. In my case it was because some files in the share/hadoop/yarn/ folder were missing, caused by an incomplete download of hadoop.tar.gz that could nevertheless still be extracted on the command line. This may help you, cheers.
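A quick way to catch a truncated download before installing is to verify the tarball first; a sketch, assuming the hadoop-2.7.2.tar.gz archive from an Apache mirror:

# compare against the checksum published alongside the release
sha256sum hadoop-2.7.2.tar.gz

# or at least confirm the whole archive can be read without errors
tar -tzf hadoop-2.7.2.tar.gz > /dev/null && echo "archive looks complete"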


One cause behind this problem might be a user-defined HDFS_DIR environment variable. It is picked up by the Hadoop scripts, for example in these lines of libexec/hadoop-functions.sh:

HDFS_DIR=${HDFS_DIR:-"share/hadoop/hdfs"}
...
if [[ -z "${HADOOP_HDFS_HOME}" ]] &&
   [[ -d "${HADOOP_HOME}/${HDFS_DIR}" ]]; then
  export HADOOP_HDFS_HOME="${HADOOP_HOME}"
fi

The solution is to avoid defining an HDFS_DIR environment variable.
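To check whether this applies to you, something along these lines works in a shell:

# see whether HDFS_DIR is set in the current environment
printenv HDFS_DIR

# if it prints a value, unset it and remove the definition
# from your shell startup files (e.g. ~/.bashrc or ~/.profile)
unset HDFS_DIR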

In my case, using sudo helped, but for the wrong reason: there was no problem with permissions; the problem was the environment variables, which sudo typically resets.

Gabor Szarnyas