
I have recently started learning Scala. I am trying to execute the `jar tf` command at the spark-shell prompt:

jar tf C:/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar

and it throws this error:

error: ';' expected but double literal found.

Could someone please help me find the issue?

Andrey Tyukin
Sam

1 Answer


The Spark shell is a slightly customized Scala REPL; it's not that kind of "shell". It knows nothing about the usual bash/cmd commands, and it assumes that the input consists only of valid Scala definitions and expressions to be evaluated. `C:/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar` is not a valid Scala expression.

Note the exact position where it fails: right after `1.6`. Everything before that point can be interpreted as the Scala expression `C.:/(spark)./(lib)./(spark).-(examples).-(1.6)`, but the parser then chokes on the `.0` that follows the double literal `1.6` and reports the error you saw.
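To see what the REPL would actually need in order to accept `jar tf ...`, here is a small sketch of how Scala desugars infix notation into a method call. The names `jar` and `tf` below are hypothetical, defined purely for illustration; no such object exists in the Spark shell.

```scala
// Sketch: the Scala REPL reads `x op y` as the method call `x.op(y)`.
// `jar` and `tf` are made-up names defined here only to show what
// would have to exist for `jar tf someFile` to compile.
object jar {
  def tf(fileName: String): String = s"contents of $fileName"
}

val viaInfix  = jar tf "sample.jar"   // infix notation
val viaMethod = jar.tf("sample.jar")  // the equivalent explicit call
println(viaInfix == viaMethod)        // prints: true
```

Since no `jar` object with a `tf` method is in scope in the Spark shell, the input is handed to the Scala parser as an ordinary expression, which is where the double-literal error comes from.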

Andrey Tyukin
  • Thanks Andrey for the response. I have renamed the jar file to sample and modified the command as - – Sam Jul 16 '18 at 05:51
  • @Sam The actual name of the jar-file doesn't matter. In Scala / Spark there is no object `jar`, and it has no method `tf`, therefore the expression `jar tf someFileName` is meaningless to the Scala repl. You will have to either exit the scala repl, or open another `cmd`/`bash`-shell and issue the `jar` command there. – Andrey Tyukin Jul 16 '18 at 05:54
  • 1
    Ohh..Okay..Thank you..:) – Sam Jul 16 '18 at 05:55