
I am trying to run a Spark application on a cluster in standalone mode. When I log into some of the remote machines and use the "java -version" command, I get the version information, but on others I get an error:

-bash: command not found

So I thought that maybe Java is not installed on those nodes, and I tried:

sudo apt -get install openjdk-8-jdk

But I get the same error, so I want to know how to fix this. I also have some questions:

- Is it necessary to install Java on all the remote machines, or is installing it only on the master node enough?

- If I have to install it on each node, how can I fix the problem I explained above (not being able to use the install command)?

- My code uses expressions that are only supported by JDK 8, but some of the nodes (the ones where "java -version" works) have JDK 7 installed. Do I have to install JDK 8 on them?

hammad
  • I do not believe that you get "the same error" when using `sudo apt-get` (be careful about `apt-get` vs `apt get`). If you do, check that you are using a .deb-based distribution (`uname -a` should show ubuntu or debian). – Felix Aug 26 '16 at 13:31
  • `-bash: sudo : command not found`, that is what I get as an error – hammad Aug 26 '16 at 13:36
  • @Mickaël B, no it is not, because I have already checked that – hammad Aug 26 '16 at 13:40
  • I don't know Spark, but here are a few elements of an answer: 1) not only must `java` be installed, its binaries directory must also be included in the `$PATH` environment variable for `bash` to be able to execute it from anywhere without specifying its absolute path; 2) JDK 7 won't be able to execute Java 8-specific code, but you can install multiple JDKs on the same machine; however, only one should be referenced in the `$PATH` – Aaron Aug 26 '16 at 13:58
  • @Aaron how can I change the path from JDK 7 to JDK 8? Using the command line? – hammad Aug 26 '16 at 14:40
  • I think @the.Legend mentioned it in his answer, thank you – hammad Aug 26 '16 at 14:42
  • This probably belongs on http://unix.stackexchange.com – the8472 Aug 26 '16 at 16:32

1 Answer


"command not found" error means that particular command you're trying to invoke is not found in neither of directories listed in $PATH system variable.

There are two ways to fix this:

1) Specify the full path when running the executable:

/opt/jdk-12345/bin/java -version
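
(The /opt/jdk-12345 path is a placeholder; substitute your actual JDK directory. On a node where a java is already runnable, you can resolve its real location like this, though the output will differ per machine:)

readlink -f "$(command -v java)"   # prints the fully resolved path of the java binary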

2) Add that same path to the beginning of $PATH (the change applies to the current session only):

export PATH=/opt/jdk-12345/bin:$PATH
java -version

To fix this permanently, add that line (export PATH=/opt/jdk-12345/bin:$PATH) to ~/.bashrc (if bash is the default shell for that user) or to ~/.profile.
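
For example, on each node (again substituting the real JDK path for the placeholder):

echo 'export PATH=/opt/jdk-12345/bin:$PATH' >> ~/.bashrc
source ~/.bashrc    # re-read the file so the current session picks it up
java -version       # should now report the expected version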

Also, because this is Java on Unix, make sure to set up the LD_LIBRARY_PATH and CLASSPATH variables if you are running server applications. Usually this is done in the application's startup scripts, so there is no need to set them globally.
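
A minimal sketch of such a startup script; every path in it is illustrative, and com.example.Main stands in for your application's entry point:

#!/bin/sh
JAVA_HOME=/opt/jdk-12345                                  # illustrative JDK location
export PATH="$JAVA_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$JAVA_HOME/lib:$LD_LIBRARY_PATH"  # native libraries, if any
export CLASSPATH="/opt/myapp/lib/*:."                     # application jars
exec java com.example.Main "$@"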

Please verify which server OS you are running (uname -a or /bin/uname -a), because different Unix systems have different package managers: apt-get for Ubuntu/Debian, rpm/yum for RedHat, emerge (Portage) for Gentoo, Entropy for Sabayon, etc...
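
For example, /etc/os-release identifies the distribution on most modern systems, and the install command follows from that (the package names assume OpenJDK 8 is available in the distribution's repositories):

cat /etc/os-release                        # shows the distribution name and version
sudo apt-get install openjdk-8-jdk         # Debian/Ubuntu
sudo yum install java-1.8.0-openjdk-devel  # RedHat/CentOS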

the.Legend
  • Is that link helpful? http://www.tecmint.com/install-java-jdk-jre-in-linux/ – hammad Aug 26 '16 at 14:47
  • This is kinda wrong: you should insert the binary's directory at the start of the `$PATH` rather than the end, because `bash` will use the first reference it finds. This is especially important in the case where JDK 7 is already installed and on the path and OP wants to add JDK 8. See [this ideone snippet](https://ideone.com/qIK42y); it fails to execute the program, probably for reasons specific to ideone, but you can see from `stderr` that it tries to execute the first one rather than the last one. – Aaron Aug 26 '16 at 14:50
  • @Aaron would you write it correctly please? – hammad Aug 26 '16 at 14:52
  • I've edited @the.Legend's answer – Aaron Aug 26 '16 at 14:58
  • Thank you, it worked :) – hammad Aug 27 '16 at 10:02
  • Hello, I used the second option to set it (export PATH=/home/java/jdk1.7.0_71/bin:$PATH), and when I checked with java -version it was fine, but if I move to another node and then come back, I find that it is not set anymore?? – hammad Aug 27 '16 at 11:02
  • An `export` only works for your current shell and its subshells. Maybe you have a Spark-specific way to specify environment variables for your Spark nodes (see the sketch below)? If so, that is what you should use. If not, follow the.Legend's advice to modify your .bashrc or .profile file. – Aaron Aug 27 '16 at 15:04
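
For reference, Spark's standalone mode does provide such a mechanism: each node sources conf/spark-env.sh on startup, so the JDK can be pinned there (the path below is a placeholder for the actual JDK 8 location on each node):

# in $SPARK_HOME/conf/spark-env.sh on every node
export JAVA_HOME=/opt/jdk1.8.0
export PATH="$JAVA_HOME/bin:$PATH"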