
I'm working on a simple program. To begin with, I will ping all the MAN hosts to verify that they are online, but I also want to implement some way to measure the latency between hosts. Is there any way to do that? Any tips?

Anyway, thank you

ARC Bueno
    Latency is no more than the delay between your ping request and when you receive the reply. It should be as simple as ordering your hosts by this delay. – Neil Jul 31 '17 at 12:33
  • Network latency changes all the time, and a measurement only reflects the latency at the time of that test. The next test could have a far different latency. Also, different protocols can have different latencies. For example, ping uses ICMP, which is low on the priority list, and its latency has nothing to do with the latency you may get for TCP or UDP. – Ron Maupin Jul 31 '17 at 14:19
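
To illustrate the point in the last comment, here is a minimal sketch that measures TCP connection setup time instead of ICMP ping; the host and port are placeholders, not from the original post:

import java.net.InetSocketAddress;
import java.net.Socket;

public class TcpLatency {
    public static void main(String[] args) throws Exception {
        // Placeholder target; replace with a reachable host and open port.
        String host = "example.com";
        int port = 80;
        long start = System.currentTimeMillis();
        try (Socket socket = new Socket()) {
            // Connect with a 5-second timeout; the elapsed time approximates
            // the TCP handshake latency, which can differ from ICMP ping.
            socket.connect(new InetSocketAddress(host, port), 5000);
        }
        long elapsed = System.currentTimeMillis() - start;
        System.out.println("TCP connect to " + host + ":" + port + " took " + elapsed + " ms");
    }
}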

1 Answer


You can keep a timestamp for when the ping is sent and another for when the pong packet is received, and simply compute the difference between the two.

This is, by definition, the latency (more precisely, the round-trip time).

You can repeat the process more than once to compute other metrics, such as jitter (see the sketch after the code below).

Something as simple as the following should serve the purpose.

int sentPacket = 0;
while (sentPacket < MAX_PACK_NUM) {
    // Timestamp in ms just before sending
    long msSend = System.currentTimeMillis();
    // ... construct the ping DatagramPacket and the response buffer here ...
    // Send the ping
    socket.send(ping);
    // Block until the pong arrives
    socket.receive(response);
    long msReceived = System.currentTimeMillis();
    // Print the packet number and the round-trip delay
    long latency = msReceived - msSend;
    System.out.println("Packet " + sentPacket + ": " + latency + " ms");
    ++sentPacket;
}
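
For the jitter mentioned above, a minimal sketch, using one common approximation: the mean absolute difference between consecutive latency samples collected by the loop (the class and method names are illustrative):

import java.util.Arrays;
import java.util.List;

public class Jitter {
    // Jitter approximated as the mean absolute difference between
    // consecutive round-trip latency samples.
    static long jitterMs(List<Long> latencies) {
        long sum = 0;
        for (int i = 1; i < latencies.size(); i++) {
            sum += Math.abs(latencies.get(i) - latencies.get(i - 1));
        }
        return latencies.size() > 1 ? sum / (latencies.size() - 1) : 0;
    }

    public static void main(String[] args) {
        // Example: samples of 20, 23, 19 and 25 ms give a jitter of 4 ms.
        System.out.println(jitterMs(Arrays.asList(20L, 23L, 19L, 25L)) + " ms");
    }
}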
Davide Spataro