I'm new to Scalatra and have been experimenting with Spark, and I want to put a Scalatra web interface in front of Spark. So far I have found two examples of code integrating the two. One is from GitHub (link below), but it does not seem to follow the normal build configuration from Scalatra's documentation, so I was not able to create a new Scalatra-Spark project using it as a model.
https://github.com/ytaras/scalate_spark
My own attempt at integrating a simple Spark operation fails with the following error:

java.lang.NoSuchMethodError: org.eclipse.jetty.server.AbstractConnector: method <init>()V not found

My guess is that this is a Jetty version conflict between the Jetty that the Scalatra build provides and the one Spark pulls in transitively, but I don't know how to confirm or fix that.
My code that is throwing the error:
import org.apache.spark._
import org.apache.spark.SparkContext._
import scala.math.random

/** Computes an approximation to pi by Monte Carlo sampling. */
object SparkPi {
  // One local SparkContext, created on first use
  lazy val sc = new SparkContext("local", "Spark Pi")

  def calcPi(slices: Int = 2): Double = {
    println("SparkPi.calcPi - sc: " + sc)
    val n = 100000 * slices
    // Count random points in the unit square that land inside the unit circle
    val count = sc.parallelize(1 to n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    val pi = 4.0 * count / n
    println("Pi is roughly " + pi)
    pi
  }
}
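To make the problem easier to reproduce outside the web app, calcPi can also be exercised from a throwaway driver like the one below (this object is just an illustration, not part of my project):

object SparkPiMain {
  def main(args: Array[String]): Unit = {
    // Runs the same computation with no servlet container involved
    println("Standalone result: " + SparkPi.calcPi())
  }
}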
I'm not sure of the best way to show my Scalatra project structure, which is probably very relevant for anyone trying to help with this, but I created the project following the normal Scalatra project setup from their documentation. I'm calling this SparkPi object from a normal Scalatra servlet route:
get("/pi") {
calcPi()
}
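In case the dependencies are the culprit, here is roughly what my build.sbt looks like. This is a sketch rather than the exact file: the version numbers are representative of what the Scalatra template generates plus the Spark dependency I added, and may differ slightly from my actual build.

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Added by me for the pi computation
  "org.apache.spark" %% "spark-core" % "1.1.0",
  // From the standard Scalatra template
  "org.scalatra" %% "scalatra" % "2.3.0",
  "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "container",
  "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided"
)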