
I ran my PySpark code in a Hue notebook and noticed that the job was not distributed evenly: as can be seen, the nn4 and nn6 nodes are fully occupied while the others are mostly idle. I've checked the connections of all nodes and all are functioning. How do I fix this, or is this behaviour normal?
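For context, how many executors Spark requests (and how large they are) is controlled by a handful of configuration properties; with dynamic allocation, or with fewer requested executors than nodes, YARN may legitimately pack containers onto just a couple of hosts. A minimal sketch of the relevant settings, with purely illustrative values (not a recommendation for this cluster; `spark.dynamicAllocation` also requires the external shuffle service):

```
# spark-defaults.conf — illustrative values only
spark.executor.instances          6      # one executor per worker node, ignored if dynamic allocation is on
spark.executor.cores              2
spark.executor.memory             4g
spark.dynamicAllocation.enabled   false  # if true, executor count grows/shrinks with load
spark.shuffle.service.enabled     false  # must be true when dynamic allocation is enabled
```

If only a few executors are requested, it is normal for the resource manager to place them all on whichever nodes have free capacity, which would produce exactly this picture.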

My nodes

Daren Tan

0 Answers