I was able to complete the basic Veins tutorial.
In my simulation, 12 cars broadcast messages to each other. I want to compute the delay associated with every message. I am trying to achieve this in the following manner:
- Save the time when the transmission begins and send the packet
...
wsm->setDelayTime(simTime().dbl());
populateWSM(wsm);
sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));
...
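To give more context, the Tx step above sits in my application roughly like the sketch below. It is a simplified sketch, assuming the app is a subclass of Veins' DemoBaseApplLayer (which provides populateWSM(), sendDelayedDown() and computeAsynchronousSendingTime()) and that delayTime is a custom double field added to my own message type in its .msg file; MyVeinsApp and DelayTestMessage are placeholder names.

// Placeholder message definition (in a .msg file), roughly:
//   packet DelayTestMessage extends BaseFrame1609_4 { double delayTime; }
void MyVeinsApp::sendTimestampedMessage()
{
    auto* wsm = new DelayTestMessage();
    wsm->setDelayTime(simTime().dbl()); // remember when the packet is handed down
    populateWSM(wsm);                   // fill the standard WSM fields (broadcast by default)
    // schedule the transmission into the next service-channel interval
    sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));
}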
- At the Rx side, compute the delay and save it
...
delayVector.record(simTime().dbl()-wsm->getDelayTime());
...
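Again for context, the Rx step is roughly the sketch below, with the same placeholder names as above. It assumes onWSM() (the DemoBaseApplLayer hook for received frames) is overridden and that delayVector is a cOutVector member of the app (named e.g. via delayVector.setName("delay") in initialize()).

void MyVeinsApp::onWSM(veins::BaseFrame1609_4* frame)
{
    auto* wsm = check_and_cast<DelayTestMessage*>(frame);
    // one-way delay in seconds: arrival time minus the timestamp stored by the sender
    delayVector.record(simTime().dbl() - wsm->getDelayTime());
}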
In the picture below you can see the delay w.r.t. node[0]. Two things puzzle me:
- Why is the delay in the range of seconds? I would expect it to be in the range of milliseconds.
- Why does the delay increase with the simulation time?
Update
I have figured out that, since all 12 cars broadcast simultaneously, computeAsynchronousSendingTime(1, ChannelType::service) returns a bigger delay for each subsequent car. I can circumvent the issue by using sendDown(wsm) instead. However, in that case not all messages are delivered, since a car tries to receive packets while it is transmitting. So I would like to update the question: how do I simulate the most realistic scenario, with a reasonable delay and packet loss?
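For reference, the sendDown(wsm) variant mentioned above is just a one-line change in the Tx sketch (same placeholder names; shown only to make the trade-off concrete):

auto* wsm = new DelayTestMessage();
wsm->setDelayTime(simTime().dbl());
populateWSM(wsm);
sendDown(wsm); // hand the frame to the MAC immediately, so all 12 cars transmit at (nearly) the same time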