
I am using Spark with Scala to do time-series analysis. I end up rewriting the same scripts in spark-shell every time I close and reopen it. How can I save my scripts from spark-shell and reuse them later?

Do I need to download the Scala IDE, save the file there, and then run it in spark-shell?
Thank you.

Magg_rs
    Possible duplicate of [How to run Scala script using spark-submit (similarly to Python script)?](https://stackoverflow.com/questions/44346776/how-to-run-scala-script-using-spark-submit-similarly-to-python-script) – eliasah Aug 01 '17 at 12:47
  • You can use the following : https://stackoverflow.com/a/44347237/3415409 – eliasah Aug 01 '17 at 12:47
  • And where would i write this .scala file? in scala IDE? – Magg_rs Aug 01 '17 at 12:54
  • I usually write scripts with intellij, it allows me quick prototyping with the library or/and app I'm developing – eliasah Aug 01 '17 at 12:56

2 Answers


Write your script in a file, for example script.scala.

Then you can run:

spark-shell -i script.scala

That will launch the spark-shell and execute your script.

If you want to leave the shell at the end of the script, add System.exit(0) at the end of your script.
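A minimal script.scala might look like the sketch below. The file name, input path, and column names are placeholders; inside spark-shell the `spark` (SparkSession) and `sc` (SparkContext) values are already defined, so the script can use them directly.

```scala
// script.scala — run with: spark-shell -i script.scala
// `spark` and `sc` are predefined by spark-shell; no SparkSession setup needed.

// Hypothetical time-series input file
val df = spark.read
  .option("header", "true")
  .csv("data/timeseries.csv")

// Example analysis step: count rows per date column (assumed column name)
df.groupBy("date").count().show()

// Exit the shell once the script finishes
System.exit(0)
```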

Fabich

If your code is simple, the easiest way is to save it as a text file and paste it into spark-shell when you need it.

If not, you are better off using a Scala IDE (IntelliJ IDEA is recommended), writing a proper Spark application, and running it with spark-submit.
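As a rough sketch of that workflow (project layout, class name, and JAR path are placeholders and depend on your build configuration):

```shell
# Build the application JAR (assuming an sbt project)
sbt package

# Submit it to Spark; --class must match your app's main object
spark-submit \
  --class com.example.TimeSeriesApp \
  --master "local[*]" \
  target/scala-2.11/timeseriesapp_2.11-0.1.jar
```

The exact Scala version suffix and artifact name come from your build.sbt, so adjust the JAR path accordingly.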

flyhighzy