How do I run more than one Spark Streaming job on a Dataproc cluster? I created multiple queues using capacity-scheduler.xml,
but with that approach I would need 12 queues to run 12 different streaming aggregate applications. Any idea?
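For context, the per-job-queue approach described above amounts to a capacity-scheduler.xml along these lines; this is a sketch with illustrative queue names and capacities, not the original poster's actual file:

    <configuration>
      <!-- Declare sibling queues under root; each streaming job gets its own. -->
      <property>
        <name>yarn.scheduler.capacity.root.queues</name>
        <value>stream1,stream2</value>
      </property>
      <!-- Capacities across sibling queues must sum to 100. -->
      <property>
        <name>yarn.scheduler.capacity.root.stream1.capacity</name>
        <value>50</value>
      </property>
      <property>
        <name>yarn.scheduler.capacity.root.stream2.capacity</name>
        <value>50</value>
      </property>
    </configuration>

With 12 jobs, this means 12 such queue definitions whose capacities must be kept summing to 100, which is the maintenance overhead the question is asking to avoid.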

– passionate
Got any solution? – Shrashti Jan 25 '19 at 10:36
1 Answer
The Dataproc 1.2 image enables the fair ordering policy within the capacity scheduler's default queue, which should do what you want without the overhead of per-job queues [1] [2].
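A minimal sketch of what that looks like in practice. The ordering-policy property and the capacity-scheduler: properties prefix are real knobs; the cluster name, bucket, and main class are hypothetical placeholders:

    # Create a cluster with the default queue in fair ordering mode.
    # This is already the default on Dataproc 1.2+; it is shown
    # explicitly here for clarity (or for older images).
    gcloud dataproc clusters create my-cluster \
        --image-version 1.2 \
        --properties 'capacity-scheduler:yarn.scheduler.capacity.root.default.ordering-policy=fair'

    # Submit each streaming app to the same default queue; fair ordering
    # shares the queue's resources among the concurrently running apps,
    # so no per-job queue is needed.
    gcloud dataproc jobs submit spark \
        --cluster my-cluster \
        --class com.example.StreamingAgg1 \
        --jars gs://my-bucket/streaming-agg.jar

Repeat the submit step for each of the 12 streaming applications; they all share the single default queue instead of requiring 12 separate queue definitions.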

– tix