
I am running a Spark job with the Spark Job Server, passing job parameters via an HTTP POST (much like the word count example here: https://github.com/spark-jobserver/spark-jobserver).

At the moment I can successfully pass these parameters as a CSV list. For example:

curl -d "param1 = val1, param2 = val2" 'localhost:8090/jobs?appName=app&classPath=class&sync=true&context=contextName'

Is it possible to encapsulate these parameters in JSON format? I have tried, without success, things like:

curl -H "Content-Type: application/json" -X POST -d '{"param1":"val1","param2":"val2"}' 'localhost:8080/...'
– rengenin

3 Answers

curl -d "@/tmp/test.json" 'localhost:8080/jobs?appName...'

You can pass the JSON file using the @ symbol. Please check the curl manual for details.
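
For example, a minimal sketch, assuming a file at /tmp/test.json that holds the question's two parameters and reusing the endpoint from the question:

# write the parameters to a file, then let curl read the request body from it
echo '{"param1": "val1", "param2": "val2"}' > /tmp/test.json
curl -d "@/tmp/test.json" 'localhost:8090/jobs?appName=app&classPath=class&sync=true&context=contextName'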

– Siva
  • While this code snippet may solve the question, [including an explanation](http://meta.stackexchange.com/questions/114762/explaining-entirely-code-based-answers) really helps to improve the quality of your post. Remember that you are answering the question for readers in the future, and those people might not know the reasons for your code suggestion. – Rosário Pereira Fernandes Sep 19 '17 at 03:21

Alright, I am able to pass the JSON contents in the POST body with:

curl -d "{"param1":"val1","param2":"val2"}" 'localhost:8080/jobs?appName...'

However, I would still like to be able to pass the actual JSON file in the POST as opposed to just passing the JSON contents. Can anyone please enlighten me?

– rengenin

Try

curl --data-binary @path/to/config.json 'localhost:8090/jobs?appName=...'
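
For reference, --data-binary posts the file exactly as it is, whereas -d/--data strips carriage returns and newlines when reading from a file, so --data-binary is the safer choice for a multi-line JSON file. A minimal sketch, assuming a multi-line config at /tmp/config.json and the endpoint from the question:

# create a multi-line JSON file and post it byte-for-byte
printf '{\n  "param1": "val1",\n  "param2": "val2"\n}\n' > /tmp/config.json
curl --data-binary "@/tmp/config.json" 'localhost:8090/jobs?appName=app&classPath=class&sync=true&context=contextName'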
– Glenn