I have a regular sbt Scala Spark project and would like to integrate it with Zeppelin. As you can imagine, the project does not consist of a single big file but rather of several classes that modularize the functionality.
It is unclear to me how this could best translate to Zeppelin in a way where both worlds are integrated. So far I have only managed to copy-paste all the code into a single Zeppelin notebook, which is not really what I want to achieve.
Edit:
Zeppelin works well when all the code is written in, and fits into, a single notebook. However, when the code gets more complex, I want to encapsulate (reusable) modules as classes in separate files. Can I simply open a directory in Zeppelin and have regular Scala import statements work (like in Jupyter with Python)?
.
├── README.md
├── build.sbt
├── project
│ ├── build.properties
│ ├── plugins.sbt
├── scalastyle-config.xml
├── src
│ ├── main
│ │ ├── resources
│ │ │ ├── log4j.properties
│ │ │ └── zcta510.csv
│ │ └── scala
│ │ └── myOrg
│ │ ├── MainApplication.scala
│ │ ├── moduleAReferencedinMain.scala
│ │ └── bigModuleBAllSubPartsReferencedInMainViaImportStatements
│ │ ├── B_1.scala
│ │ ├── B_2.scala
│ │ └── B_3.scala
│ └── test
│ ├── resources
│ │ └── zcta510.csv
│ ├── scala
│ │ └── myOrg
│ │ └── ModuleATest.scala
└── version.sbt
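What I would ideally like is something along the following lines in Zeppelin, sketched under the assumption that the project is first packaged as a jar with `sbt package` (the jar path, artifact name, and Scala version below are hypothetical; only `myOrg.MainApplication` comes from the layout above):

```scala
// Paragraph 1 — load the compiled jar via Zeppelin's dependency loader.
// The %dep paragraph must run before the Spark interpreter starts.
%dep
z.load("/path/to/project/target/scala-2.11/myproject_2.11-0.1.0.jar")

// Paragraph 2 — with the jar on the interpreter classpath,
// regular Scala imports of the project's classes should work:
%spark
import myOrg.MainApplication
```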
Also, this is related to https://forums.databricks.com/questions/1861/importing-functions-from-other-notebooks.html