
I tried to create a variable/val named "org" in spark-shell (Scala version 2.10.5), but it throws an error.

I tried with both var and val.

 var org = List(1)

error: value apache is not a member of List[Int]
       org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this)

  • Is this your whole script? It looks like somewhere below you try to `import org.apache.....` and your local variable `org` broke that. – Thilo Jul 19 '19 at 07:54
  • @Thilo, I just started spark-shell and then executed `var org = List(1)`. – Santhosh Kumar Jul 19 '19 at 08:02
  • Well, probably a bug in spark-shell. It seems that it wants to call `org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope` internally. Just call the variable something else. – Thilo Jul 19 '19 at 08:20
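
As Thilo's comment suggests, a top-level `val` named `org` shadows the `org` package in the shell's scope, so the code spark-shell injects behind the scenes can no longer resolve `org.apache.....`. A minimal sketch of the clash and the suggested workaround (the alternative name below is only a hypothetical choice):

 val org = List(1)      // shadows the `org` package; the shell's injected call to
                        // org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this)
                        // then resolves `org` to a List[Int] and fails to compile

 val orgIds = List(1)   // hypothetical alternative name: no clash with the `org` package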

1 Answer


Well, I am able to create lists and DataFrames with the name 'org':

scala> val org = List(1)
org: List[Int] = List(1)

scala> var org = List(1)
org: List[Int] = List(1)

scala> val org = Seq((1,2,3), (2,3,4)).toDF()
org: org.apache.spark.sql.DataFrame = [_1: int, _2: int ... 1 more field]

scala> org
res0: org.apache.spark.sql.DataFrame = [_1: int, _2: int ... 1 more field]

scala> org.show
+---+---+---+
| _1| _2| _3|
+---+---+---+
|  1|  2|  3|
|  2|  3|  4|
+---+---+---+

I am using Spark version 2.2.0 and Scala 2.11.8. I think this issue might have been rectified in later versions. However, I am aware that you cannot create values or variables using reserved keywords. For example:

scala> val new = Seq((1,2),(2,3)).toDF()
<console>:1: error: illegal start of simple pattern
val new = Seq((1,2),(2,3)).toDF()
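
If you really need an identifier that collides with a reserved word, Scala lets you escape it with backticks. A minimal sketch (plain Scala, not Spark-specific):

scala> val `new` = List(1, 2)   // backticks allow a reserved word to be used as an identifier
new: List[Int] = List(1, 2)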

'org' might have been treated like a reserved name in the earlier version. Hope that helps!

Sarath Chandra Vema