I'm trying to calculate the bandwidth-delay product between various hosts, and after reading this wiki article I'm confused.
From the article:
Residential ADSL2+: 20 Mbit/s (from DSLAM to residential modem), 50 ms RTT
B×D = 20×10^6 b/s × 50×10^-3 s = 10^6 b, or 1 Mb, or 125 kB.
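To make sure I'm reading the article's formula correctly, here is a minimal Python sketch of that example calculation (the function name bdp_bits is my own):

    def bdp_bits(bandwidth_bps, rtt_seconds):
        """Bandwidth-delay product: link rate (bit/s) times round-trip time (s), in bits."""
        return bandwidth_bps * rtt_seconds

    # The article's ADSL2+ example: 20 Mbit/s down, 50 ms RTT
    bits = bdp_bits(20e6, 50e-3)
    print(bits)             # 1000000.0 (1 Mb)
    print(bits / 8 / 1000)  # 125.0 (kB)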
One of the connections I am testing from, to a test host, is an ADSL2+ connection. It has a downstream sync rate of 11006 kbps, so I'm guesstimating that 10000 kbps is a reasonable theoretical maximum throughput. When pinging the test host from the ADSL2+ line I get an RTT of 29 ms. The test host is connected to the Internet with a 100 Mbps Ethernet connection.
Now here's the confusing part: performing a speed test against the server (it is running a copy of speedtest.net's mini speed test app), I get 9.23 Mbps for the downstream. According to that wiki article, 10,000,000 bps × 0.029 s = 290,000, which I read as 290 Kbps, and that is much less than my 9.23 Mbps.
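For what it's worth, here is the same multiplication with my own numbers plugged in, just as a sanity check on the arithmetic (the 10 Mbps figure is my estimate, not the sync rate):

    # My line: estimated 10 Mbit/s downstream, 29 ms RTT to the test host
    rate_bps = 10_000_000
    rtt_s = 0.029
    print(rate_bps * rtt_s)  # 290000.0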
Have I missed something obvious, or is the article wrong?