I was having some fun in C with the time.h library, trying to measure how many clock ticks some basic functions consume, just to get an idea of how fast they actually are. I used the clock() function; in this case I was measuring printf().
Here is my program:
#include <stdio.h>
#include <time.h>

int main(void)
{
    const int LIMIT = 2000;
    const int LOOP = 20;
    int results[LOOP];

    for (int i = 0; i < LOOP; i++)
    {
        /* Record the tick count before and after the printf() loop. */
        clock_t time01 = clock();
        for (int j = 0; j < LIMIT; j++)
        {
            printf("a");
        }
        clock_t time02 = clock();
        results[i] = (int)(time02 - time01);
    }

    for (int i = 0; i < LOOP; i++)
    {
        printf("\nCLOCK TIME: %d.", results[i]);
    }

    getchar();
    return 0;
}
The program simply measures, 20 times over, how many clock ticks it takes to call printf("a") 2000 times.
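For reference, here is a minimal sketch of how I would turn one of those raw tick counts into seconds, using the standard CLOCKS_PER_SEC macro from time.h (the value 42 is just a made-up sample, not one of my actual results):

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Hypothetical tick count, standing in for one entry of results[]. */
    clock_t ticks = 42;

    /* The C standard defines CLOCKS_PER_SEC as the number of clock ticks
       per second, so this division yields elapsed processor time in seconds. */
    double seconds = (double)ticks / CLOCKS_PER_SEC;
    printf("%ld ticks = %f seconds\n", (long)ticks, seconds);
    return 0;
}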
The strange thing I don't understand is the result: most of the time, even in other tests, the values fall randomly into two groups:
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 31.
CLOCK TIME: 47.
CLOCK TIME: 31.
I don't understand exactly how the compiler handles that function. I suppose printf() scans the format string for % characters, but that alone wouldn't cause such a difference. It looks more like something is happening in memory... (?) Does anyone know what precisely happens when this code runs, or why the difference shown above appears? Or at least a link that would help me?
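For extra context, here is a small probe I put together (just a sketch, not a definitive test): it busy-waits until the value returned by clock() changes, which should reveal the smallest step clock() can actually report on a given system. I am assuming that step may be coarser than a single tick:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Spin until clock() ticks over once, then print the size of that step.
       The busy loop accrues processor time, so clock() must advance eventually. */
    clock_t start = clock();
    clock_t now = start;
    while (now == start)
    {
        now = clock();
    }
    printf("Smallest observable clock() step: %ld ticks\n", (long)(now - start));
    return 0;
}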
Thank you.