
I can see the parallelism of my job in the Spark Details page, but I wonder how many Executors my job is actually running with.

Where can I see this?

1 Answer


If you follow the same methodology to find the Environment tab noted over here, you'll find an entry on that page for the number of executors used.

Depending on your environment, you may find that spark.dynamicAllocation.enabled is true, in which case you'll also see spark.dynamicAllocation.minExecutors and spark.dynamicAllocation.maxExecutors settings, which act as the bounds for your job. With dynamic allocation on, the actual number of executors fluctuates within that range based on workload, so the Environment tab gives you the bounds rather than a single fixed count.

If dynamic allocation is disabled, you'll instead see a spark.executor.instances entry, with the fixed executor count listed there.
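To make that rule concrete, here's a minimal sketch in plain Python (a hypothetical `executor_bounds` helper, not a Spark API) of how those settings combine. The config keys are the ones shown on the Environment tab; the fallback defaults in the comments are assumptions about Spark's defaults:

```python
def executor_bounds(conf):
    """Return the (min, max) executor count implied by a Spark conf dict.

    Hypothetical helper mirroring the logic described above.
    """
    if conf.get("spark.dynamicAllocation.enabled", "false") == "true":
        # Dynamic allocation: the executor count floats between these bounds
        # depending on workload (assumed defaults: min 0, max unbounded).
        lo = int(conf.get("spark.dynamicAllocation.minExecutors", 0))
        hi = int(conf.get("spark.dynamicAllocation.maxExecutors", 2**31 - 1))
        return lo, hi
    # Static allocation: a fixed number of executors for the whole job
    # (assumed default of 2 when spark.executor.instances is unset).
    n = int(conf.get("spark.executor.instances", 2))
    return n, n
```

So with dynamic allocation you read off a range, and without it you read off a single number.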

vanhooser