I want to test my UDP server-client application. One of the things I want to measure is the delay between data being sent by the server and received by the client, and vice versa. The approach I came up with is this: the server sends a message to the client and notes the time. When the client receives the message, it sends the same message back to the server. The server receives the echoed message and notes the time again. The difference between the time the message was sent and the time the echo came back gives me the round-trip delay, which I use as an indication of the delay between data sent by the server and received by the client.
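Here is roughly what I am doing, as a minimal sketch (the address, port, timeout, and sample count below are just placeholders I picked for illustration, not my real values):

```python
# Echo-based round-trip timing: the server sends a probe and times how long
# the echo takes to come back; the client simply echoes whatever it receives.
import socket
import time

CLIENT_ADDR = ("127.0.0.1", 9999)   # assumed address the echoing client binds to


def client_echo():
    """Client side: echo every received datagram straight back to the sender."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(CLIENT_ADDR)
    while True:
        data, addr = sock.recvfrom(4096)
        sock.sendto(data, addr)


def server_probe(samples=10):
    """Server side: send a probe, wait for the echo, and measure the round trip."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)            # UDP is lossy; don't wait forever for an echo
    for i in range(samples):
        t_send = time.perf_counter()            # monotonic clock for interval timing
        sock.sendto(str(i).encode(), CLIENT_ADDR)
        try:
            sock.recvfrom(4096)
        except socket.timeout:
            print(f"probe {i}: lost")
            continue
        rtt = time.perf_counter() - t_send
        # One-way delay is roughly half the RTT, assuming a symmetric path.
        print(f"probe {i}: RTT = {rtt * 1000:.3f} ms, one-way ~ {rtt * 500:.3f} ms")
```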
Is this approach correct? I foresee a number of other delays being included in the measurement this way. What would be a possible way to calculate the delays more accurately?
Any help would be appreciated.