
My Spark program first checks whether the input data path exists and, if it does not, exits safely. But after it exits, YARN retries the job once. So I guess some parameter controls the minimum run time of the job on a Spark cluster. Is there a parameter that controls the minimum run time of a Spark job, i.e. one that triggers a retry even when the task succeeds but runs for less than that time?
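Roughly, the check in my driver looks like this (a sketch; the class name is a placeholder and the input path is assumed to arrive as the first program argument):

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyProgram {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf().setAppName("MyProgram");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Check whether the input path exists before doing any work.
            FileSystem fs = FileSystem.get(sc.hadoopConfiguration());
            if (!fs.exists(new Path(args[0]))) {
                System.out.println("Input path does not exist, exiting safely");
                sc.stop();
                return; // normal return from main -> exit status 0
            }

            // ... real job logic would go here ...
            sc.stop();
        }
    }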

--------- after the first edit ---------

I set the number of retries to 1, so now I don't have to think about retries. The main method of my program now contains only one statement: System.out.println("MyProgram");. The log shows that everything is fine, but YARN still marks it as a failed job. I'm very confused.
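For reference, this is how I capped the attempts when submitting (spark.yarn.maxAppAttempts is the Spark-on-YARN setting; it cannot exceed yarn.resourcemanager.am.max-attempts on the cluster; the jar name and input path below are placeholders):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.maxAppAttempts=1 \
      --class MyProgram \
      myprogram.jar /path/to/input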

shaokai.li

1 Answer


No. The retry occurs if your job ends with an exit status other than zero.
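A minimal sketch of the distinction (the failure condition here is hypothetical):

    public class ExitStatusDemo {
        public static void main(String[] args) {
            boolean inputMissing = args.length == 0; // hypothetical failure condition
            if (inputMissing) {
                // Non-zero exit status: YARN treats the attempt as failed and may retry.
                System.exit(1);
            }
            // Returning normally yields exit status 0: YARN sees success, no retry.
            System.out.println("MyProgram");
        }
    }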

Elliott Frisch