I am learning socket programming and have a simple simulator with both the client and the server on the same machine, where I am trying to simulate a call collision. To count as a "collision", the response time between the client and the server must be less than 1 microsecond. I used tcpdump to capture the requests and responses exchanged between the client and server, and I tried adding timing code to at least synchronize the disconnection between the two, but the measured intervals are still more than 1 microsecond.
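
To illustrate the kind of synchronization I tried, here is a minimal sketch in Python (the port number is arbitrary, and a threading barrier stands in for my actual timing code):

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 50007   # arbitrary loopback port for this sketch
    ready = threading.Event()         # set once the server is listening
    barrier = threading.Barrier(2)    # both sides rendezvous here before closing

    def server():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()               # safe for the client to connect now
            conn, _ = srv.accept()
            barrier.wait()            # released together with the client
            conn.close()              # server-side FIN

    def client():
        ready.wait()                  # don't connect before the server listens
        cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        cli.connect((HOST, PORT))
        barrier.wait()                # released together with the server
        cli.close()                   # client-side FIN

    t = threading.Thread(target=server)
    t.start()
    client()
    t.join()

Even with both close() calls released from the same barrier, the FIN timestamps in the capture are still more than 1 microsecond apart.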

Any ideas?

mayor
  • Just out of curiosity, why are you trying to simulate a call collision? Can you have a bunch of threads, each with a client, hammering away at it? – Brad Werth Aug 14 '14 at 06:20
  • Idea? Don't bother. You've shown that you can't get an interval that short, so the condition you're trying to test isn't going to happen... – jwenting Aug 14 '14 at 06:25
  • Actually, I am just trying to see how rapid the disconnection is between the client [FIN] and the server [FIN, ACK]. If we look at the timing in microseconds, there is actually no collision, but if we look at it in seconds, there could be a collision. In my simulator, the criterion for a collision is that the measured interval is less than 1 microsecond. Hope this helps... thanks! – mayor Aug 14 '14 at 06:49
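
A capture restricted to FIN segments makes that criterion straightforward to check on the wire; this is a minimal tcpdump invocation along those lines (assuming Linux, where the loopback interface is lo; -tt prints absolute timestamps with microsecond resolution):

    sudo tcpdump -i lo -tt 'tcp[tcpflags] & tcp-fin != 0'

Comparing the timestamps of the client's [FIN] and the server's [FIN, ACK] in this output gives the interval the simulator is testing.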
