When I run the following command:

spark-submit --name "My app" --master "local[*]" --py-files main.py --driver-memory 12g --executor-memory 12g

With the following code in my main.py:

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
print(sc.getConf().getAll())

Driver memory and executor memory do not appear in the configuration. Even though I'm running in local mode, I would expect at least the driver memory to show up in the configuration.

Any ideas why this is happening?


1 Answer


Your submit command is not correct: the configuration options must come before the application .py file. See Launching Applications with spark-submit:

 ./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]

[...] For Python applications, simply pass a .py file in the place of <application-jar> instead of a JAR, and add Python .zip, .egg or .py files to the search path with --py-files.

That said, your command should look like this:

spark-submit --name "My app" \
--master "local[*]" \
--driver-memory 12g \
--conf spark.executor.memory=12g \
/path_to/main.py
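
To verify, re-run your main.py with the corrected command and print the configuration again. The snippet below is a minimal sketch assuming the 12g values from your original command; both keys should now be present:

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
conf = dict(sc.getConf().getAll())

# Both settings were passed to spark-submit before the application file,
# so they should now appear in the SparkConf.
print(conf.get("spark.driver.memory"))    # expected: 12g
print(conf.get("spark.executor.memory"))  # expected: 12g

Note that in local mode the executors run inside the driver JVM, so spark.executor.memory has no practical effect there; spark.driver.memory is what actually sizes the JVM heap.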