Say that I am accessing a website. Technically, different HTTP(S) GET requests are usually sent to different servers to download its resources (scripts, CSS, markup files, images, etc.). Using the browser's Network Console, we can analyse the durations and other characteristics of each of those requests and responses.
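For context, the same kinds of per-request durations are also exposed to page scripts through the standard Resource Timing API. Here is a TypeScript sketch of reading them (drop the `as` cast to run it as plain JavaScript in a page's console):

```ts
// A sketch of reading the same per-resource timings that the Network
// panel displays, via the standard Resource Timing API.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
for (const e of entries) {
  console.log(e.name, {
    dnsMs: e.domainLookupEnd - e.domainLookupStart, // DNS lookup
    connectMs: e.connectEnd - e.connectStart,       // TCP (+ TLS on HTTPS)
    ttfbMs: e.responseStart - e.requestStart,       // request sent -> first byte
  });
}
```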
I want to understand how browsers do this at a technical and system level.
As far as I can tell, it is not documented how these durations are calculated. Below are some of the resources I referred to:
I initially thought that browsers log timestamps before and after the relevant system calls. For example, to measure the duration of establishing a TCP connection (a code sketch of this assumption follows the list):
- The browser logs the start time (t1)
- Calls the `connect()` system call
- Logs the end time (t2)

TCP connection establishment duration = t2 - t1
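In other words, I imagined something like the following minimal sketch, with Node.js standing in for the browser (hypothetical illustration only, not actual browser code; `example.com` is just a placeholder host):

```ts
// Timestamp before and after the connection attempt, then take the
// difference. Real browsers use non-blocking sockets inside their own
// network stacks, so this is only an illustration of my assumption.
import * as net from "node:net";
import { performance } from "node:perf_hooks";

const t1 = performance.now();                 // log start time (t1)
const socket = net.connect({ host: "example.com", port: 80 }, () => {
  const t2 = performance.now();               // log end time (t2)
  console.log(`TCP connection establishment: ${(t2 - t1).toFixed(1)} ms`);
  socket.end();
});
socket.on("error", (err) => console.error("connect failed:", err.message));
```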
However, that does not match the timestamps I see in Wireshark: the capture shows the TCP connection being established before the Network Console's reported TCP connection start time.
I understand that different web browsers could implement this feature in different ways. Nevertheless, either a general or a browser-specific explanation would be highly appreciated.
P.S. I cannot ask this question on Network Engineering Stack Exchange, since questions about hosts/servers, applications, and protocols above OSI layer 4 are off-topic there.