I've been searching for this, and from what little I've found, modern routers try to balance internet bandwidth as fairly as possible. (I'm using an Asus AC66U myself.)
For example, if I have a 100 Mbit internet connection and two clients each want maximum bandwidth, the router tries to split it 50/50 between them. If three clients are pulling as much as they can, it would be divided roughly 33/33/33.
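For context on what I mean by a "fair" split: the behavior I'm describing matches what's usually called max-min fair allocation, where each client gets an equal share of the link unless it asks for less, and any leftover is redistributed to the others. This is just a sketch of that idea to illustrate the 50/50 and 33/33/33 cases, not how any particular router firmware actually implements it:

```python
def max_min_fair(capacity, demands):
    """Max-min fair allocation sketch: in each round, split the
    remaining capacity equally among unsatisfied clients; clients
    needing less than their share take only what they need, and
    the surplus is redistributed in the next round."""
    allocation = [0.0] * len(demands)
    remaining = float(capacity)
    unsatisfied = set(range(len(demands)))
    while unsatisfied and remaining > 1e-9:
        share = remaining / len(unsatisfied)
        for i in list(unsatisfied):
            grant = min(share, demands[i] - allocation[i])
            allocation[i] += grant
            remaining -= grant
            if allocation[i] >= demands[i] - 1e-9:
                unsatisfied.discard(i)
    return allocation

# Two greedy clients on a 100 Mbit link:
print(max_min_fair(100, [100, 100]))        # [50.0, 50.0]
# One client only wants 20 Mbit; the other two split the rest:
print(max_min_fair(100, [20, 100, 100]))    # ~[20, 40, 40]
```

In practice a router can't "allocate" bandwidth this directly; it approximates fairness by how it schedules and drops packets per flow, which is part of why the real-world split drifts from the theoretical one.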
But it feels like this fair distribution doesn't always hold in practice, so can someone help me understand the following:
How do routers perform this distribution in more detail, and which scenarios cause the theoretical fair split to fail?
Thanks!