
We are submitting Spark jobs through the REST API to the Spark master/cluster manager in a DC/OS cluster.

The job below works perfectly:

REST endpoint => service/spark/v1/submissions/create

{
  "action": "CreateSubmissionRequest",
  "appArgs": ["100"],
  "appResource": "https://<s3 location>/spark-examples-1.5.1-hadoop2.4.0.jar",
  "clientSparkVersion": "1.6.1",
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1",
    "SPARK_JAVA_OPTS": "-Dspark.mesos.coarse=true -Dspark.mesos.executor.docker.image=mesosphere/spark:1.0.0-1.6.1-2"
  },
  "mainClass": "org.apache.spark.examples.SparkPi",
  "sparkProperties": {
    "spark.jars": "https://<s3 location>/spark-examples-1.5.1-hadoop2.4.0.jar",
    "spark.app.name": "SparkPi",
    "spark.submit.deployMode": "cluster",
    "spark.master": "mesos://<dcos mesos master>/service/spark/",
    "spark.executor.cores": "1",
    "spark.executor.memory": "2048m",
    "spark.cores.max": "2",
    "spark.mesos.executor.docker.image": "mesosphere/spark:1.0.0-1.6.1-2"
  }
}

with the Authorization header set to Authorization: token=${token}.
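For reference, this is roughly how the working submission is sent from the shell. This is a sketch: the host name is a placeholder, the token value is an example, and submission.json is assumed to hold the CreateSubmissionRequest body shown above.

```shell
# Sketch of the direct REST submission that works (placeholder host/token).
TOKEN="example-token"   # in practice: the DC/OS ACS token from ${token}
SUBMIT_URL="https://<dcos url>/service/spark/v1/submissions/create"

# The -H flag supplies the Authorization header; submission.json is assumed
# to contain the CreateSubmissionRequest JSON body shown above.
CURL_CMD="curl -s -X POST -H \"Authorization: token=${TOKEN}\" -H \"Content-Type: application/json\" -d @submission.json ${SUBMIT_URL}"

echo "${CURL_CMD}"
```

Submitted this way, with the header present, the request is accepted.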

When I submit it to the Chronos REST endpoint /service/chronos/scheduler/iso8601:

{
  "schedule": "R10/2016-06-16T08:28:00Z/PT2H",
  "name": "sparkjavachronos",
  "container": {
    "type": "DOCKER",
    "image": "mesosphere/spark:1.0.0-1.6.1-2"
  },
  "cpus": "0.5",
  "mem": "1024",
  "command": "/opt/spark/dist/bin/spark-submit --class org.apache.spark.examples.SparkPi --master mesos://<dcos mesos-master>/service/spark/ --deploy-mode cluster --supervise --executor-memory 2g --total-executor-cores 1 https://<s3 location>/spark-examples-1.5.1-hadoop2.4.0.jar 100"
}

The Chronos job submission itself succeeds with the same Authorization header (Authorization: token=${token}).

But when Chronos executes the command, it fails with a response indicating the request is unauthorized. Is there a way to forward the token to the command?

In other words, in a DC/OS cluster with authorization tokens set up, how can the command that Chronos runs (which talks to the cluster manager) provide the token?
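To clarify what I mean by forwarding: conceptually something like the hypothetical Chronos job fragment below, where the token is injected into the job's environment and the command attaches it as the Authorization header (here by calling the REST endpoint directly with curl, since I don't know whether spark-submit can attach the header). The variable name DCOS_AUTH_TOKEN is made up, and I don't know if there is a supported way to keep such a token valid across scheduled runs.

```json
{
  "name": "sparkjavachronos",
  "schedule": "R10/2016-06-16T08:28:00Z/PT2H",
  "environmentVariables": [
    { "name": "DCOS_AUTH_TOKEN", "value": "<token obtained at submission time>" }
  ],
  "command": "curl -X POST -H \"Authorization: token=$DCOS_AUTH_TOKEN\" -H \"Content-Type: application/json\" -d @submission.json https://<dcos mesos master>/service/spark/v1/submissions/create"
}
```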

Venkatesh
