This question has been asked here, but I don't think there is an answer.
My environment: Spark 2.2.1 and Scala 2.11.8.
As shown in the original question, spark-shell seems to require fully qualified names:
import java.sql.Timestamp
case class Crime(
caseNumber: String, date: Timestamp,
description: String, detail: String,
arrest: Boolean
)
//<console>:12: error: not found: type Timestamp
// caseNumber: String, date: Timestamp,
// ^
However, if Timestamp is fully qualified, there is no issue:
case class Crime(
caseNumber: String, date: java.sql.Timestamp,
description: String, detail: String,
arrest: Boolean
)
// defined class Crime
The same happens with other names such as org.apache.spark.sql.Dataset or org.apache.spark.sql.functions.{lit, col}: importing them first doesn't work either.
Any idea why? And is it possible to avoid the fully qualified notation?
PS: Databricks seems to not impose this constraint.
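For what it's worth, the identical definition compiles without complaint as a regular Scala source file, which makes me think this is a spark-shell/REPL quirk rather than a Scala issue. A minimal sketch (the sample values below are made up for illustration):

```scala
import java.sql.Timestamp

// Same shape as the definition that fails in spark-shell:
// the plain import of java.sql.Timestamp is enough here.
case class Crime(
  caseNumber: String, date: Timestamp,
  description: String, detail: String,
  arrest: Boolean
)

object Demo extends App {
  // Construct an instance to show the imported type resolves.
  val c = Crime("HX123", new Timestamp(0L), "THEFT", "POCKET-PICKING", arrest = false)
  println(c.caseNumber) // prints HX123
}
```

Compiled with plain scalac (no Spark on the classpath), this works as expected, so the problem seems specific to how spark-shell wraps each pasted line.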