
In Spark up to 1.3.x, a system property for the driver could be set with the --conf option, which was shared between Spark properties and system properties:

spark-submit --conf xxx.xxx=vvvvv

In Spark 1.4.0 this feature was removed; the driver instead logs the following warning:

Warning: Ignoring non-spark config property: xxx.xxx=vvvvv 

How do I set a driver system property in 1.4.0? Is there a reason it was removed without a deprecation warning?

Thanks a lot for your advice.

tribbloid
  • Possible duplicate of [How to pass -D parameter or environment variable to Spark job?](https://stackoverflow.com/questions/28166667/how-to-pass-d-parameter-or-environment-variable-to-spark-job) – Alex K Jun 07 '18 at 08:41

1 Answer


This behavior was introduced in commit 336f7f5373e5f6960ecd9967d3703c8507e329ec, with the JIRA discussion in https://issues.apache.org/jira/browse/SPARK-7037. According to the JIRA, previous versions of spark-submit just silently ignored conf options that did not start with spark.. You might be able to set spark.driver.extraJavaOptions to pass the options you want (depending on what you are trying to accomplish).
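For example, a JVM system property can be forwarded to the driver as a -D flag inside spark.driver.extraJavaOptions. A minimal sketch, reusing the placeholder xxx.xxx=vvvvv from the question (the class name and jar are hypothetical):

```shell
# Sketch: pass a -D system property to the driver JVM via
# spark.driver.extraJavaOptions instead of a bare --conf entry.
# com.example.Main and app.jar are placeholders.
spark-submit \
  --class com.example.Main \
  --conf "spark.driver.extraJavaOptions=-Dxxx.xxx=vvvvv" \
  app.jar
```

Note that in client mode the driver JVM is already running by the time SparkConf is read, so this property has to be supplied on the spark-submit command line (or in the properties file), not set programmatically.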

Holden
  • I've tested your alternative and it doesn't seem to work either: I set the following Spark property: spark.driver.extraJavaOptions -Dbull.value.property=500, and tried to read it from the driver program; the result was: System.getProperty("bull.value.property") == null. – tribbloid Aug 16 '15 at 01:25
  • 2
    An interesting thing is that from Spark 1.4.1, SparkConf again accepts whatever property is set, and they will all be displayed in the Spark console! (You just can no longer read it with System.getProperty.) This is awesome; they should have done it long ago, but I'm glad they eventually made the right decision. – tribbloid Aug 16 '15 at 01:30