
I'm running into something I don't understand here.

I have a UDP-based application with an Automatic Repeat reQuest (ARQ) mechanism. The idea is to transmit real-time data with retransmission and reordering of received packets (a temporary buffer manages received and recovered packets).
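
To illustrate the reordering part, here is a minimal sketch of the receive side (not my actual code; the 4-byte sequence-number header, the port 5000, and the `deliver` callback are simplified placeholders):

```python
import socket
import struct

def deliver(payload: bytes) -> None:
    # Placeholder for the application-level consumer of in-order data.
    pass

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))

expected_seq = 0
reorder_buf = {}  # seq -> payload; holds packets that arrived out of order or were recovered

while True:
    data, addr = sock.recvfrom(2048)
    seq = struct.unpack("!I", data[:4])[0]   # 4-byte sequence number at the start of each packet
    reorder_buf[seq] = data[4:]

    # Flush everything that is now contiguous with the last delivered packet.
    while expected_seq in reorder_buf:
        deliver(reorder_buf.pop(expected_seq))
        expected_seq += 1
```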

To test the efficiency of this application, I have implemented a function that calculates the one-way packet delay (to investigate the influence of the retransmission mechanism on per-packet delay and on the average delay).
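
For reference, a one-way delay measurement along these lines can be sketched as follows (a simplified illustration, not my exact implementation; it assumes the sender embeds its timestamp in the packet header and that the two machines' clocks are synchronized, e.g. via NTP):

```python
import struct
import time

# Sender side: prepend sequence number and send timestamp to the payload.
def build_packet(seq: int, payload: bytes) -> bytes:
    return struct.pack("!Id", seq, time.time()) + payload

# Receiver side: one-way delay = local receive time - embedded send time.
# Only meaningful if the two hosts' clocks are synchronized (e.g. via NTP).
def one_way_delay(packet: bytes) -> float:
    seq, send_time = struct.unpack("!Id", packet[:12])
    return time.time() - send_time

# Average delay over an experiment is just the mean of the per-packet delays.
def average_delay(delays):
    return sum(delays) / len(delays)
```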

Now the problem: during experiments in an ad-hoc network where only two nodes are connected (transmitter and receiver), I've noticed that the average delay for packets transmitted using plain UDP (with the distance between receiver and transmitter at 60 m over 802.11g) also increases!

I am wondering: does the delay have to increase as a function of distance when using plain UDP? If so, what is the cause?

To put it another way: if I send 100 packets over UDP at 10 meters and get a delay of 5 ms, would I get, say, 17 ms at 60 m? Does this make sense?

  • It seems like you answered your own question? Because you tried it, and it did increase. – user253751 Oct 09 '20 at 12:19
  • @user253751 Can you please mention the major causes of increasing delay in this scenario? I mean, if we take distance as the major factor, the delay should not increase this much. And if we use UDP in an ad-hoc network, is there a chance of receiving packets out of order? I mean, the sender sends packets 1, 2, 3, 4 and the receiver gets 1, 3, 2, 4. – Med Nimou Oct 14 '20 at 09:25

0 Answers