Problem: I am working on a QueueModel in OMNeT++ that involves two input sources generating packets for a server. My aim is to simulate a queueing system and compare the number of packets generated by the simulation against analytical results.
Configuration:
*.source1.arrivalRate = 1            # jobs per second
*.source2.arrivalRate = 1            # jobs per second
**.bufferSize = 50
*.server.serviceTimeSource1 = 0.1    # seconds per job for source1
*.server.serviceTimeSource2 = 1      # seconds per job for source2
sim-time-limit = 1000000s
With around 30 repetitions, the number of packets generated is 1,032,000, which is roughly 3% higher than my analytical results; that is an acceptable margin.
Issue: When I increase the arrivalRate from 1 to 10 for each source, the number of packets generated becomes 30-40% higher than the analytical results, which is not acceptable.
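For reference, the expected number of generated jobs per source is simply arrivalRate × sim-time-limit:

  arrivalRate = 1:   1 job/s  × 1,000,000 s = 1,000,000 jobs per source (1,032,000 is about 3.2% above this)
  arrivalRate = 10:  10 jobs/s × 1,000,000 s = 10,000,000 jobs per source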
Here is a snippet of my code for generating packets in Source1:
void Source1::initialize() {
    // Reset the job counter and schedule the first arrival.
    numJobsGenerated = 0;
    sendMessageEvent = new cMessage("sendEvent");
    // simTime() is 0 during initialize(), so the absolute time passed to
    // scheduleAt() equals the first exponential inter-arrival time.
    scheduleAt(exponential(1.0 / par("arrivalRate").doubleValue()), sendMessageEvent);
}

void Source1::handleMessage(cMessage *msg) {
    if (msg == sendMessageEvent) {
        cPacket *packet = new cPacket("job");
        packet->addPar("sourceName").setStringValue("source1");
        send(packet, "out");
        numJobsGenerated++;
        // Other code for logging and visualization
        // Schedule the next arrival; mean inter-arrival time is 1/arrivalRate seconds.
        scheduleAt(simTime() + exponential(1.0 / par("arrivalRate").doubleValue()), sendMessageEvent);
    }
}

void Source1::finish() {
    recordScalar("Source1 Jobs generated", numJobsGenerated);
}
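One variant I have been considering is only a sketch, not what I currently run: it assumes the NED parameter were declared as a volatile inter-arrival time, e.g. volatile double interArrivalTime @unit(s) = default(exponential(1s / arrivalRate)), so that every par() read draws a fresh sample and the C++ code contains no 1/rate arithmetic. The class name Source1Sketch and the parameter name interArrivalTime are placeholders:

// Sketch only: assumes a hypothetical volatile NED parameter
// "interArrivalTime" whose expression already contains the exponential
// distribution, so every par() read draws a new inter-arrival time.
#include <omnetpp.h>
using namespace omnetpp;

class Source1Sketch : public cSimpleModule
{
  private:
    cMessage *sendMessageEvent = nullptr;
    long numJobsGenerated = 0;

  public:
    virtual ~Source1Sketch() { cancelAndDelete(sendMessageEvent); }

  protected:
    virtual void initialize() override {
        sendMessageEvent = new cMessage("sendEvent");
        scheduleAt(simTime() + par("interArrivalTime").doubleValue(), sendMessageEvent);
    }

    virtual void handleMessage(cMessage *msg) override {
        if (msg == sendMessageEvent) {
            cPacket *packet = new cPacket("job");
            packet->addPar("sourceName").setStringValue("source1");
            send(packet, "out");
            numJobsGenerated++;
            // Re-reading the volatile parameter draws the next inter-arrival time.
            scheduleAt(simTime() + par("interArrivalTime").doubleValue(), sendMessageEvent);
        }
    }

    virtual void finish() override {
        recordScalar("Source1 Jobs generated", numJobsGenerated);
    }
};

Define_Module(Source1Sketch);

Under that assumption the behaviour should be the same as my current code (exponential inter-arrival times with mean 1/arrivalRate); only the place where the random number is drawn changes.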
Questions: Is there any obvious reason why increasing the arrivalRate would create such a large discrepancy? Are there any tweaks or fixes that can be applied to make the simulation more accurate? Any help would be greatly appreciated. Thank you!