
I am trying to compile this simple code in Scala to run it in Spark:

import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // require an input file argument
    if (args.length < 1) {
      System.err.println("Usage: HDFStEST <file>")
      System.exit(1)
    }
    // get (or create) a session; the master is supplied by spark-submit/config
    val spark = SparkSession.builder.appName("TesteHelena").getOrCreate()
    println("hellooo")
    spark.stop()
  }
}
 

I don't know how to make scalac find the dependency org.apache.spark.sql.SparkSession. I tried to tell it where the jar files are with the following command:

scalac main.scala -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars -d main.jar

which returns the error:

main.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.SparkSession
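
A directory given to -cp only exposes loose .class files; the jars inside it are not scanned, which is why the import stays unresolved. A sketch of a classpath that should resolve it, assuming the jar names bundled in the Spark 2.4.0 / Hadoop 2.7 distribution (Windows separates classpath entries with ;, and further bundled jars may be needed for a full build):

scalac -cp "C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\spark-sql_2.11-2.4.0.jar;C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\spark-core_2.11-2.4.0.jar" main.scala -d main.jar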
 

If I instead pass every jar file with the command:

scalac main.scala -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\* org.apache.spark.sql.SparkSession -d main.jar

it returns the error:

error: IO error while decoding C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\aircompressor-0.10.jar with UTF-8

for every jar file (the wildcard apparently ends up expanded into source-file arguments, so scalac tries to read each jar as UTF-8 source code).

The command:

scalac main.scala -cp org.apache.spark.sql.SparkSession -d main.jar

returns:

main.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.SparkSession

So, is there a way to use the Spark dependency with scalac to compile a program? I cannot use a dependency manager such as sbt or Gradle, because my terminal has no internet access due to security policies at my job, and those tools fetch their dependencies from online repositories.


1 Answer


I solved my issue with the command:

scalac -cp C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\ -extdirs C:\Spark\spark-2.4.0-bin-hadoop2.7\jars\ main.scala -d main1.jar

So I added scalac's -extdirs option, which overrides the location of installed extensions: the compiler then picks up every jar in the given directory. And it worked!
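
To actually run the compiled jar, the usual route is spark-submit, which puts the Spark jars on the runtime classpath by itself. A minimal sketch, where input.txt is a hypothetical argument that only exists to satisfy the args.length check:

C:\Spark\spark-2.4.0-bin-hadoop2.7\bin\spark-submit --class Main main1.jar input.txt

Here --class Main names the object whose main method should run, and main1.jar is the jar produced by the scalac command above.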
