
I am running Spark on YARN. When my application fails, YARN restarts it automatically.

I want the application to run exactly once, whether it succeeds or fails.

Is there a configuration property or API for this?

I'm using Spark 1.5.


1 Answer


You have to set the spark.yarn.maxAppAttempts property to 1. Its default value is taken from yarn.resourcemanager.am.max-attempts, which itself defaults to 2.

Set the property via code:

// Limit YARN to a single application attempt so the job is never restarted
SparkConf conf = new SparkConf();
conf.set("spark.yarn.maxAppAttempts", "1");

Set when submitting the job via spark-submit:

--conf spark.yarn.maxAppAttempts=1
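
For reference, a full submit command might look like the sketch below; the master, deploy mode, class name, and jar name are placeholders for illustration, not taken from the question:

```shell
# Submit to YARN with retries disabled; YARN will make at most one attempt,
# so a failed driver is not relaunched.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  my-app.jar
```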

  • That works. I'd also like to know where the key "spark.yarn.maxAppAttempts" comes from; I never saw it in the source code or docs. – ulysses Jan 13 '17 at 02:11
  • Thanks, I found it documented at http://spark.apache.org/docs/1.5.0/running-on-yarn.html. – ulysses Jan 13 '17 at 08:28