I have obtained the following results from executing a script:
real 0m1.027s
user 0m1.752s
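These look like the first two lines printed by the shell's `time` builtin. The exact command isn't shown above, so the invocation below is only my assumption of how the measurement was taken, with a placeholder script name:

    # Hypothetical invocation; "my_script.sh" stands in for the actual script,
    # which is not named in the question.
    time ./my_script.sh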
I understand that:
- Real time is the wall-clock time: the time I would have obtained if I had measured with a stopwatch from the start to the end of the execution.
- User time is the time the CPU spent executing only the script's own code (it does not include time spent in the kernel on system calls, for example); see the short sketch after this list.
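Here is a minimal sketch (my own illustration, not part of the original measurement) of how I understand these two numbers to relate for a simple single-threaded command:

    # Mostly waiting: wall-clock (real) time is about 2 s, but the CPU does
    # almost no work on our behalf, so user time stays close to zero.
    time sleep 2

    # Pure CPU work in user space: real and user should come out roughly equal,
    # because the process is computing for essentially the whole duration.
    time bash -c 'i=0; while [ "$i" -lt 500000 ]; do i=$((i+1)); done'

With that understanding, I would expect user time to be at most equal to real time.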
How is it possible to have user time > real time?