The timeout cannot be configured to more than 36 hours, but the limit can be removed entirely for Spark commands. To run your Spark application from Analyze/Notebooks without this limit, do the following before starting the cluster:
Edit the cluster configuration and add the following setting under Hadoop Configuration Overrides:
yarn.resourcemanager.app.timeout.minutes=-1
Edit the cluster configuration and add the following setting under Spark Configuration Overrides:
spark.qubole.idle.timeout=-1
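
Once the cluster restarts, you can check from a notebook that the Spark-side override took effect. A minimal sketch, assuming a PySpark notebook (the "not set" string is just a fallback default for the lookup, not a Qubole value):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Read back the runtime config; this returns the override if it was applied.
    print(spark.conf.get("spark.qubole.idle.timeout", "not set"))

The YARN setting (yarn.resourcemanager.app.timeout.minutes) is applied on the ResourceManager side and is not visible through spark.conf.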
Please let me know if this helps. Also, if you are not running a streaming application and the data your Spark app processes is not very large, consider reducing the application's runtime through performance tuning instead. If tuning brings the runtime under 36 hours, removing the limit becomes unnecessary.
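
For reference, a few common tuning levers in a PySpark job. This is an illustrative sketch only; the app name, S3 path, column name, and partition count are hypothetical placeholders, and the right values depend on your cluster size and data:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("long-running-job")  # hypothetical app name
        # Size shuffle parallelism to the cluster; the default of 200
        # is rarely ideal for large jobs.
        .config("spark.sql.shuffle.partitions", "400")
        .getOrCreate()
    )

    # Push filters as early as possible so less data flows into shuffles.
    df = spark.read.parquet("s3://your-bucket/input/")  # placeholder path
    active = df.filter(df["status"] == "active")        # hypothetical column

    # Cache only data that is reused across multiple actions;
    # assumes the filtered set fits in cluster memory.
    active.cache()

Filtering early and right-sizing shuffle partitions are usually the cheapest wins; caching helps only when the same intermediate result feeds several downstream actions.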