
I've installed Spark 1.5 on Ubuntu 14.04 LTS. When I run the build with `build/mvn -Dscala-2.11 -DskipTests clean package`, I get the following error while compiling the Spark SQL project:

    [error] missing or invalid dependency detected while loading class file 'WebUI.class'.
        [error] Could not access term eclipse in package org,
        [error] because it (or its dependencies) are missing. Check your build definition for
        [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
        [error] A full rebuild may help if 'WebUI.class' was compiled against an incompatible version of org.
        [error] missing or invalid dependency detected while loading class file 'WebUI.class'.
        [error] Could not access term jetty in value org.eclipse,
        [error] because it (or its dependencies) are missing. Check your build definition for
        [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
        [error] A full rebuild may help if 'WebUI.class' was compiled against an incompatible version of org.eclipse.
        [warn] 22 warnings found
        [error] two errors found
        [error] Compile failed at Sep 18, 2015 6:09:38 PM [17.330s]
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Spark Project Parent POM ........................... SUCCESS [  6.723 s]
        [INFO] Spark Project Core ................................. SUCCESS [03:07 min]
    ...
        [INFO] Spark Project Catalyst ............................. SUCCESS [ 58.166 s]
        [INFO] Spark Project SQL .................................. FAILURE [ 19.912 s]
        [INFO] Spark Project Hive ................................. SKIPPED
        [INFO] Spark Project Unsafe ............................... SKIPPED
...
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------

Below are my environment variables from `.bashrc`:

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64

export SCALA_HOME=/usr/local/src/scala/scala-2.11.7
export PATH=$SCALA_HOME/bin:$PATH
export PATH=/home/ubuntu/apache-maven-3.3.3/bin:$PATH

export SPARK_HOME=/home/ubuntu/spark-1.5.0
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

Update: I tried to re-run with `-Ylog-classpath`, but it didn't work:

Unable to parse command line options: Unrecognized option: -Ylog-classpath
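A possible explanation for that last error: `-Ylog-classpath` is a flag for the Scala compiler (scalac), not for Maven, so Maven rejects it when it appears on its own command line. One way to forward it to the compiler, assuming the Spark build uses scala-maven-plugin (which honors the `addScalacArgs` user property), might be:

```shell
# Forward the scalac flag through Maven instead of passing it directly.
# addScalacArgs is a scala-maven-plugin user property; this assumes the
# Spark build uses that plugin (an assumption, untested here).
build/mvn -Dscala-2.11 -DskipTests -DaddScalacArgs=-Ylog-classpath clean package
```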
guzu92
  • I'm getting the same error when attempting to build spark 1.5 on 2.11, There is a note on the Building Spark page (http://spark.apache.org/docs/1.5.0/building-spark.html) "Spark does not yet support its JDBC component for Scala 2.11." that might be related. – Angelo Genovese Sep 18 '15 at 19:27
  • @Angelo: you're right, I've omitted the -Dscala-2.11 option in the command and build was successful. Thanks ! – guzu92 Sep 21 '15 at 07:44
  • 1
    If you aren't using the SQL module, you could probably just comment it out of the top level pom and rebuild. I haven't tested that though, so YMMV. – Angelo Genovese Sep 24 '15 at 18:21
  • Can anyone comment on whether it matters using the mvn bundled with the Spark installation or a different version of mvn on the path? I used the latter and solved the problem by including -Dscala-2.11 (which I didn't use initially). – kon psych Feb 26 '16 at 17:25

5 Answers


Just run `./dev/change-scala-version.sh 2.11` from your Spark directory to switch all of the code to Scala 2.11. Then build with mvn (3.3.3+) or `make-distribution.sh` with your flags set.
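As a sketch, the full sequence might look like this, run from the top of the Spark source tree (the flags are the ones from the question, so treat this as illustrative rather than tested):

```shell
# Rewrite the poms for Scala 2.11, then build.
./dev/change-scala-version.sh 2.11
build/mvn -Dscala-2.11 -DskipTests clean package

# Or, to produce a runnable distribution tarball instead:
./make-distribution.sh --tgz -Dscala-2.11 -DskipTests
```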

Martin Tapp

As Angelo Genovese noted in the comments, do not include `-Dscala-2.11` in the build command.
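In other words, the build that succeeded for the asker was the default Scala 2.10 build, with the flag simply dropped:

```shell
# Same command as in the question, minus -Dscala-2.11:
build/mvn -DskipTests clean package
```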

guzu92

If you don't specifically need spark-sql, just exclude the SQL-related modules from the build:

mvn clean package -Dscala-2.11 -DskipTests -pl '!sql/core,!sql/catalyst,!sql/hive'

Utgarda

I was running into this problem also, in a project that I'd imported into IntelliJ from a Maven pom.xml. My co-worker helped me figure out that although <scope>runtime</scope> is okay for most dependencies, this particular dependency needs to be <scope>compile</scope> (for reasons we don't understand):

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>${scala.version}</version>
    <scope>compile</scope>
</dependency>
Ken Williams

This build issue can be overcome by first switching the Scala version from 2.10 to 2.11 with the `change-scala-version.sh` script in the dev directory, e.g. `spark-1.6.1/dev/change-scala-version.sh 2.11`.

Refer to the link below for detailed info: http://gibbons.org.uk/spark-on-windows-feb-2016