I encountered a problem while setting up a 4-node Hadoop cluster following this tutorial. I have the following 4 machines (virtualized):

  • master-node
  • node1
  • node2
  • node3

I set all my conf files up on master-node and copied them to the other nodes with scp (roughly as sketched after the log below). master-node can access the slave nodes through ssh, and I set JAVA_HOME in .bashrc on all machines. However, this is what I am getting:

hadoop@master-node:~$ start-dfs.sh
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
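
For reference, this is roughly how the conf files were pushed out (illustrative; my Hadoop install lives in ~/hadoop on every machine, as the jar paths in the warnings suggest):

# Push the Hadoop configuration from master-node to the other nodes
# (paths reflect my layout: Hadoop unpacked in ~/hadoop)
for node in node1 node2 node3; do
    scp ~/hadoop/etc/hadoop/* hadoop@"$node":~/hadoop/etc/hadoop/
done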

[3 possibilities] There seems to be an issue with using OpenJDK 11, though I am not quite sure that this is what is causing this mess. The errors suggest an issue with ssh, but i) I uploaded my conf files without any problem and ii) I can access all nodes from master-node. Might this have anything to do with the way I set the JAVA_HOME path? Here is the end of my .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=PATH:$PATH/bin

Thanks in advance for any leads (I don't use Java much and I feel a bit lost here).

[edit] Same behavior with Oracle JDK 8:

hadoop@master-node:~$  readlink -f /usr/bin/java
/usr/lib/jvm/java-8-oracle/jre/bin/java
hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
hadoop@master-node:~$ start-dfs.sh
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 

0.0.0.0: Error: JAVA_HOME is not set and could not be found.


2 Answers


Can you export the path like this:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then, after appending the JAVA_HOME and PATH variables in your .bashrc file, run the following command so the changes take effect in your current shell:

source ~/.bashrc

Then check the output of echo $PATH; if it contains the JAVA_HOME value, it should work.
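
For example, a quick sanity check (assuming the OpenJDK 11 path from the export above):

source ~/.bashrc
# should print /usr/lib/jvm/java-11-openjdk-amd64
echo $JAVA_HOME
# should show the appended .../java-11-openjdk-amd64/bin entry on its own line
echo $PATH | tr ':' '\n' | grep "$JAVA_HOME"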

  • Unfortunately, it does not (just tried): exactly the same behavior. It looks more and more like an access-rights issue – zar3bski May 23 '18 at 14:28

Found it! It turns out JAVA_HOME is lost across the ssh connection (why, I don't know; this led me to the answer).
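
A quick way to see it from master-node (illustrative; node1 is one of my workers, and the exact behavior depends on how the remote .bashrc handles non-interactive shells):

# In an interactive shell on master-node, JAVA_HOME is set
echo $JAVA_HOME

# But a remote command runs in a non-interactive shell that does not pick up
# the export from .bashrc, so this typically prints an empty line
ssh node1 'echo $JAVA_HOME'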

To overcome the issue,

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

should also be added to

hadoop/etc/hadoop/hadoop-env.sh
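
In my case that file sits under ~/hadoop (the jar paths in the warnings give the layout away), so the fix can be applied once on master-node and pushed out to the workers, roughly like this:

# Append the export to hadoop-env.sh on master-node
echo 'export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64' >> ~/hadoop/etc/hadoop/hadoop-env.sh

# Copy the updated file to every worker node
for node in node1 node2 node3; do
    scp ~/hadoop/etc/hadoop/hadoop-env.sh hadoop@"$node":~/hadoop/etc/hadoop/
done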