
I'm facing a version-compatibility problem when using Spark with the almond-sh kernel on Windows. I have tried many different versions without success (I also tried checking the compatibility table). Which versions should I choose for the following?

coursier (tried: 2.1.2, 2.1.5)
almond-sh (tried: 0.10.9, 0.13.0, 0.13.1, 0.13.14)
scala (tried: 2.12.10, 2.12.15 (the Scala version Spark was built with))
spark = 3.2.2, built for Hadoop 2.7.4, using Scala 2.12.15
java = 1.8.0_121

CLI

bitsadmin /transfer downloadCoursierCli https://github.com/coursier/launchers/raw/master/coursier "%cd%\coursier"
bitsadmin /transfer downloadCoursierBat https://github.com/coursier/launchers/raw/master/coursier.bat "%cd%\coursier.bat"
    
coursier bootstrap --hybrid almond:<version> --scala <version> -o almond
almond --install --force
jupyter kernelspec list
jupyter-lab    (to use the kernel)
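
Inside the notebook, Spark is then loaded from a cell. A minimal sketch of that cell, assuming the Spark 3.2.2 / Scala 2.12.15 combination above and following the usual almond-spark pattern (`NotebookSparkSession` is provided by almond-spark, which almond pulls in automatically when spark-sql is imported); adjust the versions to whatever combination is being tested:

    // Pull spark-sql via coursier from inside the notebook (version as listed above)
    import $ivy.`org.apache.spark::spark-sql:3.2.2`

    import org.apache.spark.sql._

    // almond's wrapper around SparkSession: shows progress in the notebook
    // and ships REPL-compiled classes to the executors
    val spark = NotebookSparkSession.builder()
      .master("local[*]")
      .getOrCreate()

    // quick smoke test: should print 2 if the versions are compatible
    import spark.implicits._
    println(Seq(1, 2).toDF("n").count())
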
  • What error do you get? When doing what? – Gaël J Jul 06 '23 at 06:46
  • Many errors about class not found, method not found... related to conflicting versions – qxk71551 Jul 06 '23 at 11:44
  • Hi. Welcome to Stack Overflow. I suggest reading the article [how to ask](https://stackoverflow.com/help/how-to-ask). With that limited information it is hard to suggest a solution. Please try to provide more debugging details, such as full error messages, the steps you followed to get there, a link if you are following a tutorial, etc. – Gastón Schabas Jul 06 '23 at 23:52

0 Answers