I have a PySpark project in which I use ConfigParser to load some settings from a config.properties file.
Locally it works fine without any issue so far, but on the cluster it throws a NoSectionError.
I searched the internet and found a suggested solution: ship the properties file to the executors through the --files argument of spark-submit, then get the file path with SparkFiles.get('file'), like this:
import os
import ConfigParser  # Python 2; on Python 3 this module is configparser
from pyspark import SparkFiles

config = ConfigParser.SafeConfigParser(os.environ)
config.read(SparkFiles.get('file'))
But when I access a value inside a function using config.get("SECTION", "name"),
it throws a NoSectionError.
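For what it's worth, the same error can be reproduced without Spark at all: ConfigParser.read() silently skips any path it cannot open, so if SparkFiles.get() returns a path that does not exist on that node, the later get() call raises NoSectionError. A minimal standalone sketch of that failure mode (the nonexistent path below is made up for illustration):

```python
try:
    # Python 2
    from ConfigParser import SafeConfigParser as Parser, NoSectionError
except ImportError:
    # Python 3
    from configparser import ConfigParser as Parser, NoSectionError

config = Parser()

# read() does NOT raise on a missing file -- it returns the list of
# files it actually managed to parse, which here is empty.
parsed = config.read('/nonexistent/path/config.properties')
print(parsed)  # []

# With nothing parsed, any section lookup fails the same way it does
# on the cluster.
try:
    config.get("SECTION", "name")
except NoSectionError as e:
    print("NoSectionError:", e)
```

So checking the return value of config.read() is a quick way to tell whether the resolved path was actually readable on the node where the code ran.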
Here is the code flow.
In main.py I initialize the SparkContext. Then, using an import statement, I import Utility.py, where at the top I do this:

config = ConfigParser.SafeConfigParser(os.environ)
config.read(SparkFiles.get('file'))

Next I call a function from Utility.py that tries to access config.get("SECTION", "name"). That is where the error occurs.
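Note that those two lines at the top of Utility.py run exactly once, at the moment main.py imports the module, not when the function is later called. A small self-contained sketch of that import-time behavior (the utility_sketch module name and temp directory are made up for illustration):

```python
import os
import sys
import tempfile
import textwrap

# Write a tiny module whose top-level code mirrors the
# config.read(...) call sitting at the top of Utility.py.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, 'utility_sketch.py'), 'w') as f:
    f.write(textwrap.dedent("""
        executed_at_import = True
        print('top-level code ran at import time')
    """))

sys.path.insert(0, tmpdir)
import utility_sketch  # the print above fires here, at import time

print(utility_sketch.executed_at_import)  # True
```

So whatever path SparkFiles.get('file') resolves to at import time is the one config.read() uses, regardless of where the function that calls config.get() later runs.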