
I am trying to set up an environment for Apache Spark and found out it is incompatible with Java 9. (I regret not finding this out earlier.) I can neither make Spark work nor uninstall Java 9.

I tried both approaches here and here

None of these are yielding any results.

If I run 'java --version' in my terminal, this is the output:

java 9.0.4
Java(TM) SE Runtime Environment (build 9.0.4+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.4+11, mixed mode)

What I need to do now is uninstall Java 9, reinstall Java 8, and then reconfigure Spark.

Any leads/help on this?

lpt

2 Answers


Try this: go to /Library/Java/JavaVirtualMachines and remove the jdk-9.0.4.jdk folder.

This should work for you.
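The removal above can also be done from a terminal. A minimal sketch, assuming the JDK was installed by the standard Oracle installer into the default macOS location (confirm the exact folder name first, since it varies by version):

```shell
# List installed JDKs to confirm the exact folder name.
ls /Library/Java/JavaVirtualMachines

# Remove the JDK 9 bundle (requires admin rights).
sudo rm -rf /Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk

# If a Java 8 JDK is installed, point JAVA_HOME at it and verify.
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
java -version
```

Setting JAVA_HOME this way only affects the current shell session; add the export line to your shell profile to make it permanent.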

Rohith Joseph

Spark is not compatible with Java 9 yet. You would need Hadoop 3.0 for Java 9 support, but I have not yet seen a Spark distribution built against Hadoop 3.0. Your best option is to use a Docker container that already has Spark configured. I use this one: https://github.com/jupyter/docker-stacks/tree/master/pyspark-notebook; there are many more on Docker Hub.
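For reference, launching that image locally might look like the following sketch (the image name comes from the linked repository; the port mapping follows the usual Jupyter convention, so treat the details as assumptions and check the project's README):

```shell
# Run the PySpark notebook image; Spark and a compatible Java are
# bundled inside the container, so the host's Java 9 is irrelevant.
docker run -it --rm -p 8888:8888 jupyter/pyspark-notebook
# Then open the notebook URL (with token) that the container prints.
```

This sidesteps the uninstall problem entirely, since the container ships its own JVM.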

  • Thanks, user9382513. Any other advice on how to get rid of Java 9? I cannot uninstall it with the methods described above. This is a real pain in the neck. – lpt Mar 03 '18 at 14:52