Assume we have a network with n nodes and an elected coordinator that sends commands to the nodes. Further assume the coordinator has very limited bandwidth (upload speed) but wants to send a large file (10 GB) to the nodes in O(n) time.
My idea for optimizing performance is to use erasure coding: split the large file into chunks and send one chunk per node, so that later the nodes communicate among themselves in O(n^2) time to retrieve the whole file. The upload bandwidth is then shared among the validators, so the leader does not have to upload a huge amount of data, which would otherwise saturate its bandwidth and limit the performance of consensus. Will this yield higher throughput, or am I just making a hole in the water (i.e., wasting the effort)?
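To make the idea concrete, here is a minimal, runnable sketch of the kind of (k, n) erasure code I have in mind: a toy Reed-Solomon-style code over GF(p) built from Lagrange interpolation. The prime, the function names, and the chunk layout are all illustrative assumptions of mine, not taken from any particular library:

```python
P = 2**31 - 1  # a Mersenne prime; all arithmetic happens in the field GF(P)

def _lagrange_eval(points, x):
    """Evaluate at x the unique polynomial passing through `points`, mod P."""
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

def encode(data, k, n):
    """Split `data` (ints < P, length divisible by k) into k data chunks and
    expand them into n coded chunks; chunk i is sent to node i+1."""
    size = len(data) // k
    chunks = [data[i * size:(i + 1) * size] for i in range(k)]
    # Per position, the k data values define a degree-(k-1) polynomial f with
    # f(i+1) = chunks[i][pos]; node x's chunk holds the evaluations f(x).
    return [[_lagrange_eval([(i + 1, chunks[i][pos]) for i in range(k)], x)
             for pos in range(size)]
            for x in range(1, n + 1)]

def decode(shares, k):
    """Reconstruct the original data from any k `(node_id, chunk)` pairs."""
    shares = shares[:k]
    size = len(shares[0][1])
    data = []
    for i in range(k):  # data chunk i is f evaluated at x = i + 1
        data += [_lagrange_eval([(x, ch[pos]) for x, ch in shares], i + 1)
                 for pos in range(size)]
    return data
```

This sketch is systematic (the first k coded chunks are the data chunks themselves), so the coordinator uploads only n chunks of size |file|/k each instead of n full copies.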
One more advantage of using erasure coding is fault tolerance: if some nodes are malicious/Byzantine and refuse to communicate and send their chunks, the non-faulty nodes will still be able to reconstruct the message from fewer chunks, since the code only needs any k of the n chunks.
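For instance, continuing the sketch above with k = 3 and n = 5, even if two nodes withhold their chunks, any three honest nodes can pool theirs and still reconstruct the file (the node IDs are just an example):

```python
data = list(range(12))            # toy "file" as a list of field elements
coded = encode(data, k=3, n=5)    # coordinator uploads one chunk per node

# nodes 1 and 3 are Byzantine and withhold their chunks; any k = 3 of the
# remaining (node_id, chunk) pairs are enough to reconstruct the file
survivors = [(2, coded[1]), (4, coded[3]), (5, coded[4])]
assert decode(survivors, k=3) == data
```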