I am able to submit a Spark job on a Linux server using the console. But is there any API or framework that can be used to submit a Spark job to the Linux server programmatically?

-
What is your resource/cluster manager? – mrsrinivas Jan 03 '17 at 19:55
-
Spark jobs are running on a clustered Linux server – Vimal Dhaduk Jan 04 '17 at 05:17
-
Are you using YARN or Spark standalone mode? – mrsrinivas Jan 04 '17 at 05:18
-
I am using YARN – Vimal Dhaduk Jan 04 '17 at 05:19
2 Answers
0
You can use port 7077 to submit Spark jobs to your Spark cluster instead of using spark-submit:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("spark://master-machine:7077") // Spark standalone master URL
  .getOrCreate()
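A minimal sketch of what follows, assuming the session above: any action in the program then runs as a job on the cluster that the master URL points to (the range/count below is only a placeholder action):

// Any action triggers a job on the cluster the master URL points to.
val count = spark.range(0L, 1000000L).count()
println(s"rows counted on the cluster: $count")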

Javier de la Rosa
-
Executing a program with that statement automatically sends a Spark job to your cluster. Try it and let me know ;) – Javier de la Rosa Jan 04 '17 at 12:15
-
Javier, your code will tell Spark to run against the standalone master on the master-machine host; it will not submit anything – abiratsis Mar 05 '18 at 21:04
0
You can look into the Livy server. It is GA in the Hortonworks and Cloudera distributions of Apache Hadoop, and we have had good success with it; its documentation is good enough to get started with. Spark jobs start almost instantly when submitted via Livy because it keeps multiple SparkContexts running inside it.
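For illustration, a rough sketch of submitting a batch job through Livy's REST API (POST /batches, which listens on port 8998 by default), written in Scala against the standard Java 11 HTTP client; the host name, jar path, and class name below are placeholders for your own environment:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object LivySubmit {
  def main(args: Array[String]): Unit = {
    // Placeholder jar path and main class; replace with your own application.
    val payload =
      """{"file": "hdfs:///jobs/my-spark-app.jar", "className": "com.example.MySparkJob"}"""

    // POST the batch definition to the Livy server (default port 8998).
    val request = HttpRequest.newBuilder(URI.create("http://livy-host:8998/batches"))
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(payload))
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())

    // Livy replies with JSON containing the batch id and its current state.
    println(response.body())
  }
}

Livy responds with JSON describing the new batch, and the batch id from that response can then be used to poll GET /batches/{id} for the job's state.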

Yayati Sule