I am streaming video from a remote server over HTTP. I captured the packets with Wireshark on the client, and I noticed that the client side of the connection stops sending packets from time to time, pausing for several seconds each time. The RTT is between 170 ms and 200 ms, the bandwidth is 20 Mbps, and the connection has a packet loss rate as high as 5.8%. I can also see that the window advertised by the server ramps up, slow-start style, from 14 KB to nearly 64 KB (Window size value=501, [Calculated window size value=64128], Window size scaling factor=128).
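For context, here is a quick back-of-envelope check of those numbers (just Python arithmetic over the values quoted above, nothing else assumed):

```python
# Back-of-envelope check of the numbers from the capture.

window_value = 501           # raw "Window size value" from the TCP header
scale_factor = 128           # negotiated window scaling factor
advertised_window = window_value * scale_factor
print(f"advertised window: {advertised_window} bytes (~{advertised_window / 1024:.0f} KB)")
# -> 64128 bytes, matching Wireshark's [Calculated window size value]

bandwidth_bps = 20e6         # 20 Mbps link
rtt_s = 0.2                  # ~200 ms RTT (upper end of what I measured)
bdp_bytes = bandwidth_bps / 8 * rtt_s
print(f"bandwidth-delay product: {bdp_bytes:.0f} bytes (~{bdp_bytes / 1024:.0f} KB)")
# -> ~488 KB; a ~64 KB window by itself already caps throughput
#    around 64128 B / 0.2 s ≈ 2.6 Mbps, well below the 20 Mbps link
```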
My confusion is: why does the client stop sending packets from time to time when the server's receive buffer for this connection is nowhere near full?
And how does packet loss affect this situation (a browser streaming video)?
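As a rough sanity check on the loss figure, I also tried the Mathis et al. approximation for loss-limited TCP throughput, MSS / (RTT * sqrt(p)). I am not sure how cleanly it applies to a single HTTP stream, and the MSS is my own assumption, so treat the result as order-of-magnitude only:

```python
import math

# Mathis et al. approximation: throughput <= MSS / (RTT * sqrt(p)).
# Order-of-magnitude estimate only.
mss_bytes = 1460             # typical Ethernet MSS (assumption, not from the capture)
rtt_s = 0.185                # midpoint of the 170-200 ms I measured
loss_rate = 0.058            # 5.8% loss seen in the capture

throughput_bps = 8 * mss_bytes / (rtt_s * math.sqrt(loss_rate))
print(f"loss-limited throughput: ~{throughput_bps / 1e6:.2f} Mbps")
# -> roughly 0.26 Mbps, far below both the 20 Mbps link and the 64 KB window limit
```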
I am thinking of the following possible explanation:
The browser streams the video over a single connection (HTTP reusing the same TCP connection). While the server is sending the response, the client's ACKs don't reach it in time, so the server stops and keeps waiting for them; at the same time, the client is waiting for the server's response packets, and of course for the server's ACKs too. After some time the client retransmits, everything starts working again, and I do notice in the capture that the first packet after each pause goes from client to server.
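To check that last observation programmatically, I put together a rough scapy sketch (the filename capture.pcap and the 2-second gap threshold are my own placeholders) that prints the first packet after each long pause and flags whether its sequence number was already seen with payload, i.e. crudely looks like a retransmission:

```python
from scapy.all import rdpcap, IP, TCP

PCAP = "capture.pcap"    # placeholder: my Wireshark capture exported as pcap
GAP_SECONDS = 2.0        # placeholder threshold; the pauses I see last several seconds

packets = [p for p in rdpcap(PCAP) if IP in p and TCP in p]
seen = set()             # (src, sport, seq) tuples already observed
prev_time = None

for pkt in packets:
    key = (pkt[IP].src, pkt[TCP].sport, pkt[TCP].seq)
    looks_like_retx = key in seen and len(bytes(pkt[TCP].payload)) > 0
    if prev_time is not None and float(pkt.time) - prev_time > GAP_SECONDS:
        print(f"pause of {float(pkt.time) - prev_time:.1f}s ended by "
              f"{pkt[IP].src}:{pkt[TCP].sport} -> {pkt[IP].dst}:{pkt[TCP].dport}, "
              f"seq={pkt[TCP].seq}, retransmission={looks_like_retx}")
    seen.add(key)
    prev_time = float(pkt.time)
```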
Is this understanding correct and does it make sense?