I want to measure bandwidth using C#. Here's what I did; comments and suggestions are welcome.
- Find the maximum UDP payload that avoids fragmentation (on my test bed, it's 1472 bytes)
- Create non-compressible data of 1472 bytes
- Send this data from the server to the client multiple times (in my test, 5000 packets); a minimal sketch of both sides follows this list
- The client starts a stopwatch when the first packet arrives
- When all the data has been sent, the server notifies the client that the transfer is complete
- The client then stops the stopwatch
- I calculate bandwidth as (total packets sent (5000) * MTU (1500 bytes)) / elapsed time. The 1500 bytes is the 1472-byte payload plus the 8-byte UDP header and the 20-byte IPv4 header.
- I noticed that some packets get lost: 20% loss at best, 40% at worst. I did not account for this when calculating the bandwidth. I suspect the client's network device suffers buffer overruns. Do I need to take this factor into account? (The calculation sketch after the list shows the figure both with and without the lost packets.)
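Here is a minimal sketch of how I'd wire up the two sides with UdpClient. The port, the address argument, and the 1-byte end marker are placeholders I picked for illustration, not part of any spec:

```csharp
using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;

class UdpBandwidthTest
{
    const int PayloadSize = 1472;  // largest payload on a 1500-byte-MTU link (1472 + 28 bytes of headers)
    const int PacketCount = 5000;
    const int Port = 9000;         // arbitrary test port

    static void Main(string[] args)
    {
        if (args.Length == 2 && args[0] == "send")
            RunSender(args[1]);    // e.g. "send 192.168.1.20"
        else
            RunReceiver();
    }

    static void RunSender(string receiverAddress)
    {
        // Random bytes are effectively non-compressible.
        var payload = new byte[PayloadSize];
        new Random().NextBytes(payload);

        using (var udp = new UdpClient())
        {
            udp.Connect(receiverAddress, Port);

            for (int i = 0; i < PacketCount; i++)
                udp.Send(payload, payload.Length);

            // A 1-byte datagram serves as the "all data sent" notification.
            // UDP may drop it too, so send several copies.
            var done = new byte[1];
            for (int i = 0; i < 5; i++)
                udp.Send(done, done.Length);
        }
    }

    static void RunReceiver()
    {
        using (var udp = new UdpClient(Port))
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            var stopwatch = new Stopwatch();
            int received = 0;

            while (true)
            {
                byte[] packet = udp.Receive(ref remote);  // blocks until a datagram arrives

                if (!stopwatch.IsRunning)
                    stopwatch.Start();                    // first packet starts the clock

                if (packet.Length == 1)
                    break;                                // end-of-transfer marker
                received++;
            }

            stopwatch.Stop();
            Console.WriteLine("Received {0}/{1} packets in {2:F3} s",
                              received, PacketCount, stopwatch.Elapsed.TotalSeconds);
        }
    }
}
```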
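For the last bullet, this is how I'm thinking about the numbers (the sent/received/elapsed values below are placeholders, not real measurements). Counting packets *sent* gives the load the sender offered; counting packets *received* gives the throughput that actually made it through, so reporting both seems safer than ignoring the loss:

```csharp
using System;

class BandwidthCalc
{
    // 1472-byte UDP payload + 8-byte UDP header + 20-byte IPv4 header = 1500 bytes per packet
    const int BytesPerPacket = 1472 + 8 + 20;

    static void Main()
    {
        // Placeholder numbers: 5000 packets sent, 4000 received (20% loss), 0.5 s elapsed.
        Report(sent: 5000, received: 4000, elapsedSeconds: 0.5);
    }

    static void Report(int sent, int received, double elapsedSeconds)
    {
        double offeredMbps  = sent     * (double)BytesPerPacket * 8 / elapsedSeconds / 1e6;
        double achievedMbps = received * (double)BytesPerPacket * 8 / elapsedSeconds / 1e6;
        double lossPercent  = 100.0 * (sent - received) / sent;

        Console.WriteLine("Offered:  {0:F2} Mbit/s", offeredMbps);
        Console.WriteLine("Achieved: {0:F2} Mbit/s ({1:F1}% loss)", achievedMbps, lossPercent);
    }
}
```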
If you have any suggestions or comments, feel free to share them.
Thanks.