Here is my issue: when I start the Spark shell, it consumes a lot of cluster resources and keeps them held up, thereby impacting other applications running in parallel.
Say, for example, I run some spark-shell commands and accidentally leave the shell open without closing the session. All of those resources stay held up, and other users won't have anything to work with until I close my session.
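For context, this is a minimal sketch of the kind of session I mean (the resource numbers here are just illustrative assumptions, not my actual settings):

```
# Launching spark-shell against YARN with statically allocated executors.
# With dynamic allocation disabled, these executors stay reserved for as
# long as the shell session is open, even when no job is actually running.
spark-shell \
  --master yarn \
  --num-executors 10 \
  --executor-memory 4g \
  --executor-cores 4
```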
How can I fix this issue from the YARN side?