
I am trying to install Spark 2.3.0, more specifically, spark-2.3.0-bin-hadoop2.7.

'D:\spark\bin' is already added to the PATH environment variable. JDK 10 is installed; Hadoop is not, but according to what I found on Google, Spark can work without Hadoop.

Here is the error message:

C:\Users\a>spark-shell
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2464)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:222)
    at org.apache.spark.deploy.SparkSubmit$.secMgr$lzycompute$1(SparkSubmit.scala:393)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:393)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:400)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(Unknown Source)
    at java.base/java.lang.String.substring(Unknown Source)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
    ... 21 more

Does anyone have an idea what I should do to install Spark?

BlueSun
Mengge
  • Hi, please go with JDK 1.8; it's stable. Also, please tell us whether you are trying to install Spark and use it via spark-shell, or use it in an IDE such as Eclipse. – Rajnish Kumar Apr 03 '18 at 06:49
  • Related [Why does spark-shell fail with “SymbolTable.exitingPhase…java.lang.NullPointerException”?](https://stackoverflow.com/questions/48036762/why-does-spark-shell-fail-with-symboltable-exitingphase-java-lang-nullpointer) – Naman Apr 03 '18 at 13:28

1 Answer


The current Spark version (2.3) supports neither JDK 9 nor JDK 10. The latest supported version is JDK 8. You should downgrade your Java installation.
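For context on why this particular exception appears: the `Caused by` line points at the static initializer of Hadoop's `Shell` class, which parses the `java.version` system property assuming the pre-JDK-9 `1.x.y` numbering and takes the first three characters. JDK 9+ changed that property to plain `9`, `10`, and so on, so the fixed-length substring runs past the end of the string. A minimal sketch of the failing pattern (an illustration of the same substring call, not Hadoop's actual code):

```java
public class JavaVersionParsing {
    public static void main(String[] args) {
        // Pre-JDK-9 version strings look like "1.8.0_222"; JDK 9+ reports "10", "10.0.1", ...
        String jdk8Style = "1.8.0_222";
        String jdk10Style = "10";

        // Taking the first three characters works on the old scheme
        System.out.println(jdk8Style.substring(0, 3)); // prints "1.8"

        // ...but fails when the property is shorter than three characters
        try {
            System.out.println(jdk10Style.substring(0, 3));
        } catch (StringIndexOutOfBoundsException e) {
            // On JDK 9+ the message matches the one in the stack trace:
            // "begin 0, end 3, length 2"
            System.out.println(e.getMessage());
        }
    }
}
```

This is why the crash happens before Spark does any real work: it occurs while Hadoop's utility classes are being class-loaded, so installing Hadoop would not help, but running under JDK 8 would.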

OneCricketeer
user9590153
  • I'm using openjdk version "1.8.0_222" and spark-2.4.3-bin-hadoop2.7, and I'm still facing this issue. Please help me. – kashyap Aug 26 '19 at 12:44