
I am creating a JsObject using the following code:

        val json = Json.obj(
            "spec" -> Json.obj(
                "type" -> "Scala",
                "mode" -> "cluster",
                "image" -> sparkImage,
                "imagePullPolicy" -> "Always",
                "mainClass" -> mainClass,
                "mainApplicationFile" -> jarFile,
                "sparkVersion" -> "2.4.4",
                "sparkConf" -> Json.obj(
                    "spark.kubernetes.driver.volumes.persistentVolumeClaim.jar-volume.mount.path" -> "/opt/spark/work-dir/",
                    "spark.kubernetes.driver.volumes.persistentVolumeClaim.files-volume.mount.path" -> "/opt/spark/files/",
                    "spark.kubernetes.executor.volumes.persistentVolumeClaim.files-volume.mount.path" -> "/opt/spark/files/",
                    "spark.kubernetes.executor.volumes.persistentVolumeClaim.jar-volume.mount.path" -> "/opt/spark/work-dir/",
                    "spark.kubernetes.driver.volumes.persistentVolumeClaim.log-volume.mount.path" -> "/opt/spark/event-logs/",
                    "spark.eventLog.enabled" -> "true",
                    "spark.eventLog.dir" -> "/opt/spark/event-logs/"
                )
            )
        )

Now I will be fetching some additional sparkConf parameters from my database. Once fetched, I will store them in a regular Scala Map (Map[String, String]) containing the key-value pairs that should go into the sparkConf.
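
For reference, a plain Map[String, String] can be converted to a JsObject with Play's built-in Writes for string maps. A minimal sketch of that conversion (`dbConf` here is just a hypothetical placeholder for the values fetched from the database):

        import play.api.libs.json._

        // Hypothetical placeholder for the key-value pairs fetched from the database
        val dbConf: Map[String, String] = Map("spark.eventLog.enabled" -> "true")

        // Play provides a Writes[Map[String, String]], so toJson produces a JsObject
        val dbConfJson: JsObject = Json.toJson(dbConf).as[JsObject]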

I need to update the sparkConf within the spec of my JsObject. Ideally, I would apply a transformation like this:

        val sparkSession = Map[String, JsString]("spark.eventLog.enabled" -> JsString("true"))
        val transformer = (__ \ "spec" \ "sparkConf").json.update(
            __.read[JsObject].map(e => e + sparkSession)
        )

However, I haven't found a way to do this.
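
One direction that might work is to merge a whole JsObject with `++` instead of appending a single pair with `+`. A sketch I have not verified, assuming Play JSON 2.x (`extraConf` is a hypothetical placeholder for the map fetched from the database):

        import play.api.libs.json._

        // Hypothetical placeholder for the parameters fetched from the database
        val extraConf: Map[String, String] = Map("spark.eventLog.enabled" -> "true")

        // JsObject.++ merges an entire object, whereas + takes a single (String, JsValue) pair
        val transformer = (__ \ "spec" \ "sparkConf").json.update(
            __.read[JsObject].map(conf => conf ++ Json.toJson(extraConf).as[JsObject])
        )

        // transform returns a JsResult; a JsSuccess wraps the updated document
        val updated: JsResult[JsObject] = json.transform(transformer)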

Sparker0i
    "I'm not getting ways to do this" meaning? Do you encountered error? unexpected result? If yes, which one ? – cchantep Sep 16 '21 at 15:06
  • Trying to do this throws a compiler error, because `sparkSession` is a `Map[String, JsString]` whereas `JsObject.+` expects a single `(String, JsValue)` pair. I need to store all the values of my map in the updated `sparkConf`. – Sparker0i Sep 20 '21 at 00:40
  • Does this answer your question? [How to update a nested json using scala play framework?](https://stackoverflow.com/questions/64280699/how-to-update-a-nested-json-using-scala-play-framework) – Tomer Shetah Sep 23 '21 at 07:46

0 Answers