Imagine the case of Twitter's Periscope or Meerkat. The paradigm of their servers is that they receive video from users and then stream it to a large number of users (say > 100K for the sake of this argument).
Now, here is my confusion. Let's assume a video has a bitrate of 2 Mb/s (because we're fancy and stream 1080p@30). Also assume there is only one guy broadcasting.
So:

- the broadcaster needs ~2 Mbps of outbound bandwidth (which is OK),
- the server needs ~2 Mbps of inbound bandwidth (which is awesome),
- however, the server's outbound bandwidth would need to be 2 Mbps * 100K = 200 Gbps (holy cow).
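Just to show where those numbers come from, here is the back-of-the-envelope math as a tiny Python sketch (the bitrate and viewer count are only the assumed values from above, not actual Periscope/Meerkat figures, and it assumes naive unicast fan-out with no CDN or multicast):

```python
# Assumed inputs from the example above (not real Periscope/Meerkat numbers)
bitrate_mbps = 2        # per-viewer stream bitrate, Mb/s
viewers = 100_000       # concurrent viewers of a single broadcast

# One broadcaster uploading one stream
ingress_mbps = bitrate_mbps

# Naive unicast: the server sends a separate copy to every viewer
egress_gbps = bitrate_mbps * viewers / 1_000

print(f"Server ingress: ~{ingress_mbps} Mb/s")
print(f"Server egress:  ~{egress_gbps:,.0f} Gb/s")   # -> ~200 Gb/s
```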
The question, which might be stupid to some, yet I still wonder about: the connections I have seen offered by companies like Comcast, AT&T, and Level 3 max out at 10 Gbps.
How do they manage such bandwidth in terms of outbound connection providers?
A 10 Gbps connection per server machine? Some plan I am not aware of? Or do they simply not need more than 10 Gbps at all?
Thanks!