I have a main notebook in Databricks that runs my base set of code. Right now, I always have to pass "spark" and "dbutils" into my function to get it to work properly.
Main notebook code:
from subfolder import awesome
awesome.somefunction(spark, dbutils, parameterC)
The code within the awesome.py file is the following (this file lives in a folder called "subfolder", one level below the main notebook, alongside an __init__.py file):
def somefunction(spark, dbutils, parameterC):
    # spark is used in this function
    # dbutils is used in this function
    # parameterC is used in this function
    # a Spark view is created at the end
    return None
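For illustration, a fleshed-out version of the body might look like this (the parquet read and view name are placeholders, not my real code):

def somefunction(spark, dbutils, parameterC):
    # dbutils checks that the input path exists (illustrative only)
    if not dbutils.fs.ls(parameterC):
        raise ValueError(f"no files found at {parameterC}")
    # spark reads the data and registers a view at the end
    df = spark.read.parquet(parameterC)
    df.createOrReplaceTempView("awesome_view")
    return None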
If I remove spark and dbutils from the function's parameters, I get an error saying the "spark" or "dbutils" module has not been found.
How can I set things up so that I don't have to explicitly pass spark and dbutils to my .py file?
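For reference, one pattern I have seen suggested is to build both handles inside the module instead of passing them in. A minimal sketch (assuming the module only ever runs on a Databricks cluster) would be:

# subfolder/awesome.py
from pyspark.sql import SparkSession

def get_spark():
    # returns the SparkSession that is already active on the cluster
    return SparkSession.builder.getOrCreate()

def get_dbutils(spark):
    # on a Databricks cluster, DBUtils can be constructed from the SparkSession
    from pyspark.dbutils import DBUtils
    return DBUtils(spark)

def somefunction(parameterC):
    spark = get_spark()
    dbutils = get_dbutils(spark)
    # same body as before, now without the extra parameters
    return None

Is this the recommended approach, or is there a cleaner way?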