
I'm working with Zeppelin (0.7.1) on Spark (2.1.1) on my localhost, and trying to add some configuration values to the jobs I run.

Specifically, I'm trying to set the es.nodes value for elasticsearch-hadoop.

I tried adding the key and value to the interpreter configuration, but it didn't show up in sc.getConf. Adding "--conf mykey:myvalue" to the interpreter's "args" configuration key didn't register either. Isn't that what the Spark interpreter configuration is supposed to do?
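For reference, this is roughly how I'm checking whether the value reached the Spark context (`es.nodes` is just the setting I'm after; the output is what I inspect):

```scala
// Zeppelin %spark paragraph (sketch) -- dump everything the SparkConf holds
// and look for the interpreter setting; in my case the key never shows up.
sc.getConf.getAll.foreach(println)
println(sc.getConf.getOption("es.nodes"))   // prints None instead of the configured hosts
```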

  • Hi, Did you try using `sc.setConf`? https://www.elastic.co/guide/en/elasticsearch/hadoop/current/spark.html#spark-native-cfg – 1ambda Jun 04 '17 at 05:34
  • AFAIK Zeppelin creates the SparkContext on its own and I can't recreate it within the paragraph, nor can I add configuration flags to it afterwards. If someone has this working please let me know. – Oren Jun 04 '17 at 10:54

1 Answer


Apparently this is an intentional change in Zeppelin, implemented not long ago: only properties prefixed with `spark.` are delegated to the SparkConf. I have submitted a comment asking to change this, as I believe it is problematic. https://github.com/apache/zeppelin/pull/1970
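A sketch of what that implies in practice (the host and index names below are placeholders, not from my setup): either prefix the property as `spark.es.nodes` in the interpreter settings, since elasticsearch-hadoop also reads its settings from the SparkConf under the `spark.` prefix, or skip the SparkConf entirely and pass the setting per read:

```scala
// Zeppelin %spark paragraph (sketch) -- pass elasticsearch-hadoop settings
// directly on the DataFrame read instead of relying on SparkConf delegation.
val df = spark.read
  .format("org.elasticsearch.spark.sql")
  .option("es.nodes", "es-host:9200")   // placeholder host
  .load("myindex/mytype")               // placeholder index/type
df.show()
```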
