I am trying to measure the performance improvement that HTTP/2's header compression gives over HTTP/1.1. I have simulated a network with 300 ms latency and 50 KB/s bandwidth (I tried a few other combinations as well).
I ran several test variants, each sending a different number of requests to the server, varying from 1 to 100.
In each test, I load my page multiple times and measure the time between the request being sent and the first byte of the response being received (I use the Navigation Timing API for this). This time drops between the very first request and subsequent requests. However, a similar reduction is seen with HTTP/1.1 as well, so there is no visible gain over HTTP/1.1. There are considerable improvements in total page load time, but I have no way to tell whether they come from header compression or from multiplexing. That is why I think measuring the time between request sent and first byte received should isolate the effect of header compression, but I am not seeing it.
Please find my sample test results below.
What is the correct way to measure the performance improvement from header compression?
Thanks.