
I just installed spark-notebook on some old PCs where a test Spark cluster is running. I created a notebook from the following template:

{
  "Simple" : {
    "profile" : "standalone",
    "name" : "Simple Test Spark Cluster",
    "status" : "stopped",
    "template" : {
      "customLocalRepo" : null,
      "customRepos" : null,
      "customDeps" : null,
      "customImports" : null,
      "customSparkConf" : {
        "spark.app.name" : "Notebook",
        "spark.master" : "spark://mymaster:7077",
        "spark.eventLog.enabled" : "true",
        "spark.eventLog.dir" : "hdfs://mymaster:8020/var/log/spark",
        "spark.shuffle.service.enabled" : "true",
        "spark.dynamicAllocation.enabled" : "true"
      }
    }
  }
}
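For context, the two bottom settings belong together: with `spark.dynamicAllocation.enabled`, standalone-mode workers must also run the external shuffle service, which is switched on via `spark.shuffle.service.enabled` on the worker side. A sketch of the corresponding worker-side configuration (file path and values are assumptions, not taken from my cluster):

```
# conf/spark-defaults.conf on each standalone worker (sketch)
spark.shuffle.service.enabled    true
spark.dynamicAllocation.enabled  true
```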

I first created some dummy data, which worked:

// Build dummy data and aggregate it into a case class
val someValues = sc.parallelize(1L to 10000L)
case class Foo(key: Long, value: Long)
val someDummyGrouping = someValues
  .map(v => (v / 100) -> v)
  .reduceByKey(_ + _)
  .map(t => Foo(t._1, t._2))
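(As a quick sanity check, forcing an action on the RDD confirms the cluster actually computes it; this is a sketch and assumes the workers are reachable:)

```scala
// Trigger a job so the data is materialized, not just defined lazily
someDummyGrouping.take(5).foreach(println)
println(someDummyGrouping.count())
```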

Now, I wanted to register a temporary table, as the documentation on spark-notebook's GitHub page describes:

import org.apache.spark.sql.SQLContext
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

someDummyGrouping.toDF.registerTempTable("foo")

Output:

import org.apache.spark.sql.SQLContext
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@5c42083d
import sqlContext.implicits._

However, when I try to query the table, I only get the schema back, not the rows:

:sql select * from foo

Output is:

import notebook.front.widgets.Sql
import notebook.front.widgets.Sql._
res13: notebook.front.widgets.Sql = <Sql widget>

[key: bigint, value: bigint]
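(For comparison, querying the table directly through the `SQLContext` rather than the `:sql` widget forces the rows to be computed; a sketch, assuming the temp table `foo` registered above:)

```scala
// Run the same query programmatically and force evaluation
val result = sqlContext.sql("select * from foo")
result.show()            // prints the first rows as a table
println(result.count())  // forces a full job
```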

When I run the same code on a spark-notebook instance I installed on AWS, it works out of the box.

Did I forget to configure something, and if so, what am I missing?

rabejens
