I ran my PySpark code in a Hue Notebook and noticed that the job was not distributed evenly (as can be seen, the nn4 and nn6 nodes are fully occupied whereas the others are almost idle). I've checked the connections of all nodes and all are functioning. How do I fix this, or is this behaviour normal?
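For reference, here is a stripped-down sketch of the kind of job I am running, with the knobs I understand are supposed to influence task distribution (the input path, partition counts, and executor settings below are placeholders, not my actual values):

```python
from pyspark.sql import SparkSession

# Hypothetical session setup -- executor counts/cores are placeholders.
spark = (
    SparkSession.builder
    .appName("distribution-test")
    .config("spark.executor.instances", "6")  # roughly one executor per node
    .config("spark.executor.cores", "2")
    .getOrCreate()
)

# Placeholder input path, not my real data.
df = spark.read.parquet("hdfs:///path/to/input")

# A small number of input partitions can pin all tasks to the
# few nodes that host those HDFS blocks.
print("partitions:", df.rdd.getNumPartitions())

# Repartitioning shuffles the data so tasks spread across executors.
df = df.repartition(48)
df.count()  # triggers the job; distribution is visible in the Spark UI
```

My understanding is that even with settings like these, Spark may still schedule most tasks on a few nodes when the input has few partitions or when data locality favours the nodes holding the HDFS blocks, which is why I am unsure whether this is a misconfiguration on my side or expected behaviour.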