
I have a Spark 2.4 installation on my Windows machine. This is required because my production environment uses Spark 2.4. Now I also want to test Spark 3.0 features. Can I install the Spark 3.0 binaries on the same Windows machine without disturbing the Spark 2.4 installation? I don't want to use the Linux Subsystem or VMs.

HimanshuSPaul
  • I don't see why that should not be possible. Separate directories, different launch commands, and you're all set. You might just need to configure the environment files for each installation to point them to different Java installations if needed. – ernest_k May 14 '21 at 15:23
  • How do I set a different launch command for the Spark 3 shells (like spark-shell or pyspark)? Any link to follow? – HimanshuSPaul May 14 '21 at 15:25
  • That may be irrelevant depending on how you launch Spark. I guess the best approach is to just run them separately and see if anything breaks. Just note that you can't run both of them at the same time, as there will be port conflicts. – ernest_k May 14 '21 at 15:28
  • Yup, I won't be running Spark 3 and Spark 2 at the same time. Mostly I will be launching spark-shell for both. I am confused about how the system will know which version to launch, or how to tell the system which version to launch. I don't want to change environment variables every time I want to switch between the two versions. – HimanshuSPaul May 14 '21 at 15:33
  • I would expect that you'll be calling them with absolute paths... something like `\bin\spark-shell...` and `\bin\spark-shell...`. You can also just create two symlinks/shortcuts to each of those scripts (something like `spark2.cmd` pointing to `\bin\spark-shell.cmd` and so on), with these files placed in a directory added to your path variable. – ernest_k May 14 '21 at 15:42
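
Building on the wrapper-script suggestion in the last comment, here is a minimal sketch of two Windows `.cmd` wrappers. The installation directories, the optional `JAVA_HOME`/`HADOOP_HOME` paths, and the file names (`spark2-shell.cmd`, `spark3-shell.cmd`) are assumptions; adjust them to your actual layout. Each wrapper sets `SPARK_HOME` only for its own session via `setlocal`, so the global environment variables never need to change when switching versions.

```
:: File: spark2-shell.cmd -- launches the Spark 2.4 shell
:: (all paths below are assumptions; point them at your real installs)
@echo off
setlocal
set "SPARK_HOME=C:\spark\spark-2.4.8-bin-hadoop2.7"
set "HADOOP_HOME=C:\hadoop"
:: Uncomment if this version needs a specific JDK:
:: set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_281"
set "PATH=%SPARK_HOME%\bin;%PATH%"
call "%SPARK_HOME%\bin\spark-shell.cmd" %*
endlocal

:: File: spark3-shell.cmd -- same idea, pointing at the Spark 3.0 install
@echo off
setlocal
set "SPARK_HOME=C:\spark\spark-3.0.3-bin-hadoop2.7"
set "HADOOP_HOME=C:\hadoop"
set "PATH=%SPARK_HOME%\bin;%PATH%"
call "%SPARK_HOME%\bin\spark-shell.cmd" %*
endlocal
```

Put both wrappers in a directory that is already on `PATH`; then running `spark2-shell` or `spark3-shell` from any command prompt launches the corresponding version, and only one should be running at a time to avoid the port conflicts mentioned above.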

0 Answers