I've been working on a homework sheet for a while now, and there is a massive discrepancy between what I think the asymptotic complexity is and what the runtime results suggest.
Below is a table of runtimes for the program.
| Input Size | Runtime (seconds) |
|------------|-------------------|
| 10000      | 0.040533803       |
| 20000      | 0.154712122       |
| 30000      | 0.330814060       |
| 40000      | 0.603440983       |
| 50000      | 0.969272780       |
| 60000      | 1.448454467       |
string = "";
newLetter = "a";
for (int i = 500; i < n; i++) {
string = string + newLetter;
}
return string
Why would there be a discrepancy between the complexity of the algorithm and the growth of the runtime, apart from me simply being wrong?
From the runtime results, it looks like the program has a time complexity of O(n²). Doubling the input size increases the runtime by roughly a factor of 4, which suggests a quadratic function.
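As a quick sanity check on my reading of the table, here is a tiny snippet that computes the doubling ratios (the numbers are copied straight from the table above):

```java
public class RatioCheck {
    public static void main(String[] args) {
        // Runtimes from the table, paired where the input size doubles.
        double[][] pairs = {
            {0.040533803, 0.154712122}, // 10000 -> 20000
            {0.154712122, 0.603440983}, // 20000 -> 40000
            {0.330814060, 1.448454467}, // 30000 -> 60000
        };
        for (double[] p : pairs) {
            System.out.printf("%.2f%n", p[1] / p[0]); // ~3.82, ~3.90, ~4.38
        }
    }
}
```

If the program really were O(n), I'd expect those ratios to hover around 2, not 4.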
But looking at the program itself, I'm 99.9% certain that the time complexity is actually O(n).
Is it possible that there's an extraneous reason for this discrepancy? Is there any possibility that the runtime results are indeed linear?
My best guess for this discrepancy is that each iteration of the loop makes the next one slower (which makes sense looking at the program, since I think Java has to copy every character already appended each time), but that would still be linear, no? It's not a nested for loop.
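To make my guess concrete, here is a minimal sketch of my mental model of what `string = string + newLetter` does on each pass (this is my assumption about the behavior, not something I've verified in the JDK): a fresh array is allocated and all existing characters are copied before the new one is appended.

```java
public class ConcatModel {
    // My assumed model of one concatenation: allocate a new array and
    // copy every existing character, then append the new letter.
    static char[] append(char[] old, char letter) {
        char[] combined = new char[old.length + 1];
        System.arraycopy(old, 0, combined, 0, old.length); // copies old.length chars
        combined[old.length] = letter;
        return combined;
    }

    public static void main(String[] args) {
        int n = 60000;
        char[] s = new char[0];
        long totalCopied = 0;
        for (int i = 0; i < n; i++) {
            totalCopied += s.length; // chars copied on this iteration
            s = append(s, 'a');
        }
        // 0 + 1 + ... + (n-1) chars copied in total
        System.out.println("total chars copied: " + totalCopied);
    }
}
```

If each pass really copies everything so far, the total is 0 + 1 + ... + (n-1) = n(n-1)/2 copies, which would match the quadratic timings, but I'm not certain that's what Java actually does here.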