Quality of Service (QoS) was designed to manage bandwidth usage, which implicitly assumes that applications compete for that (limited) resource. Is that really ever a concern for ANY application these days?
It also assumes that the QoS protocols and Internet Protocol options are implemented on both the client and server ends, and recognized and honored by every network element in between (e.g., all switches, routers, proxies, and NATs). Is that ever true anywhere other than, maybe, between two hosts on the same subnet, or within a highly managed enterprise network?
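For concreteness, by "Internet Protocol options" I mean things like the DSCP bits in the IP header. Here's a minimal sketch of how an application would set them on Linux via the IP_TOS socket option (the address and port are placeholders, and this only *requests* priority; every hop on the path still has to honor the marking, which is exactly what I'm questioning):

```python
import socket

# DSCP 46 ("Expedited Forwarding") sits in the upper six bits of the
# TOS byte, so the value passed to IP_TOS is 46 << 2 = 0xB8.
EF_TOS = 46 << 2

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)

# From here on, datagrams sent on this socket carry the EF marking --
# assuming nothing along the path strips or ignores it.
sock.sendto(b"latency-sensitive payload", ("192.0.2.1", 5004))
```

(On Windows, as I understand it, setting IP_TOS from user code is ignored and you're expected to go through the qWAVE API instead, which is part of why I'm asking about the APIs below.)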
And finally, has anyone ever used the QoS APIs AND identified an actual benefit? In other words, did it ever "save the day" and avert a problem that would otherwise surely have occurred?
thanks, bob