
After installing Spark, I type spark-shell in the terminal, but it doesn't work. It shows:

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
Note that as of 2.8 scala does not assume use of the java classpath.
For the old behavior pass -usejavacp to scala, or if using a Settings
object programmatically, settings.usejavacp.value = true.

Following the solution I found on this page: http://blog.csdn.net/xiaomin1991222/article/details/50981584, I edited /spark-2.2.0-bin-hadoop2.7/bin/spark-class2.cmd and added this code:

rem Set JAVA_OPTS to be able to load native libraries and to set heap size
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Xms%SPARK_MEM% -Xmx%SPARK_MEM% -Dscala.usejavacp=true
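For what it's worth, the same JVM flag can also be passed without editing the launcher script, via spark-submit's `--driver-java-options` flag (which spark-shell forwards to the driver JVM). This is only a sketch of the same `-Dscala.usejavacp=true` workaround, not a confirmed fix for this error:

```shell
# Pass the Scala REPL flag to the driver JVM at launch time
# instead of hard-coding it into spark-class2.cmd.
spark-shell --driver-java-options "-Dscala.usejavacp=true"
```

On Windows the same command works from cmd.exe, since `--driver-java-options` is a standard spark-submit option rather than a shell feature.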

But it doesn't work either. My JDK and JRE work normally.

Alper t. Turker
