I am creating a JsObject using the following code:
import play.api.libs.json._

val json = Json.obj(
  "spec" -> Json.obj(
    "type" -> "Scala",
    "mode" -> "cluster",
    "image" -> sparkImage,
    "imagePullPolicy" -> "Always",
    "mainClass" -> mainClass,
    "mainApplicationFile" -> jarFile,
    "sparkVersion" -> "2.4.4",
    // Nested object holding the Spark configuration key-value pairs
    "sparkConf" -> Json.obj(
      "spark.kubernetes.driver.volumes.persistentVolumeClaim.jar-volume.mount.path" -> "/opt/spark/work-dir/",
      "spark.kubernetes.driver.volumes.persistentVolumeClaim.files-volume.mount.path" -> "/opt/spark/files/",
      "spark.kubernetes.executor.volumes.persistentVolumeClaim.files-volume.mount.path" -> "/opt/spark/files/",
      "spark.kubernetes.executor.volumes.persistentVolumeClaim.jar-volume.mount.path" -> "/opt/spark/work-dir/",
      "spark.kubernetes.driver.volumes.persistentVolumeClaim.log-volume.mount.path" -> "/opt/spark/event-logs/",
      "spark.eventLog.enabled" -> "true",
      "spark.eventLog.dir" -> "/opt/spark/event-logs/"
    )
  )
)
Now I will be fetching some additional sparkConf parameters from my database. Once fetched, I will store them in a regular Scala Map (Map[String, String]) containing the key-value pairs that should go into sparkConf.
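For illustration, here is a made-up example of such a map and its conversion to a JsObject (the entries are invented; the real ones come from the database, and Json.toJsObject relies on the built-in Writes for Map[String, String] that recent Play JSON versions provide):

// Hypothetical stand-in for the entries fetched from the database.
val dbSparkConf: Map[String, String] = Map(
  "spark.eventLog.enabled"    -> "true",
  "spark.executor.instances"  -> "4"
)

// Map[String, String] converts directly to a JsObject.
val extraConf: JsObject = Json.toJsObject(dbSparkConf)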
I need to update the sparkConf within the spec inside my JsObject. Ideally, I would apply a transformation like this:
val sparkSession = Map[String, JsString]("spark.eventLog.enabled" -> JsString("true"))
val transformer = (__ \ "spec" \ "sparkConf").json.update(
  __.read[JsObject].map(e => e + sparkSession)
)
However, this does not compile: JsObject's + method expects a single (String, JsValue) pair, not a Map, and I cannot find the right way to merge the whole map into the object.
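From what I can tell from the Play JSON transformer docs, something along these lines (reusing the hypothetical dbSparkConf from above) might be close, but I am not sure it is correct or idiomatic:

// Merge the converted map into the existing "spec" \ "sparkConf" object;
// on duplicate keys the entries from dbSparkConf win.
val confTransformer = (__ \ "spec" \ "sparkConf").json.update(
  __.read[JsObject].map(conf => conf ++ Json.toJsObject(dbSparkConf))
)

// transform returns a JsResult: JsError if "spec" \ "sparkConf" is
// missing from the input, JsSuccess with the updated object otherwise.
json.transform(confTransformer) match {
  case JsSuccess(updated, _) => println(updated)
  case JsError(errors)       => println(errors)
}

Is this the right way to merge a Map[String, String] into a nested JsObject, or is there a cleaner approach?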