I am trying to figure out how program profiling works. I am using Valgrind. My first question is:
What does the cost of a function mean for Valgrind? Is it time?
From what I read, it seems that Valgrind runs the program on a virtual machine that is supposed to mirror a "generic computer". It then counts events occurring in this machine. But how does it compute the cost of a function from this data? And can intervals smaller than 1 millisecond be measured this way on a standard desktop PC?
Edit:
What exactly does the one-dimensional number "cost" mean in the output of Callgrind?