The time values are:
- Real time is the actual number of elapsed seconds. This includes time spent waiting as well as time spent working.
- User time is the amount of CPU-time spent in "user mode," that is, executing the actual code of your application.
- Sys time is the amount of time spent in "system mode," executing operating-system code directly on behalf of your application.
See: man time. There is considerably more information that can be presented. All of the figures relate to "this process," that is, the one being timed.
When you are evaluating "how fast" a program runs, there are several aspects to the overall problem which must be considered. First of all, programs are completely idle much of the time; they have nothing to do. ("Press 'OK' to continue," and you haven't pressed it yet.) Then, programs which are "doing something" are still forced to wait much of the time: they have started a disk read, for example, and are waiting for the data to arrive. And finally, when a program actually is using the CPU, it must share the CPU with other programs that are in a similar position. The entire system, furthermore, must share other resources such as memory, so processes may incur involuntary delays because of things like paging.
Most programs, when they are not idle, are I/O bound.
In other words, the main determinant of how fast those programs can get their jobs done is how fast they can initiate and complete I/O (input/output) operations. Very rarely do you find programs that are CPU bound.
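The I/O-bound versus CPU-bound distinction can be sketched by comparing wall-clock time against CPU time for the same stretch of work. The sketch below uses a sleep as a stand-in for an I/O wait (no actual disk read is performed) and a pure computation loop as the CPU-bound case:

```python
import time

def cpu_fraction(work):
    """Fraction of wall-clock time the callable actually used the CPU."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    work()
    wall1, cpu1 = time.perf_counter(), time.process_time()
    return (cpu1 - cpu0) / (wall1 - wall0)

# "I/O-bound" stand-in: sleeping, like waiting on a disk read to complete.
io_like = cpu_fraction(lambda: time.sleep(0.2))

# CPU-bound: pure computation, no waiting on anything.
def crunch():
    total = 0
    for i in range(2_000_000):
        total += i * i
    return total

cpu_like = cpu_fraction(crunch)

print(f"I/O-like CPU fraction:  {io_like:.2f}")   # typically near 0
print(f"CPU-bound CPU fraction: {cpu_like:.2f}")  # typically close to 1
```

An I/O-bound program spends most of its real time with a CPU fraction near zero, which is exactly why a faster CPU barely helps it.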
Hence the paradox: while it is fairly easy to design a motherboard with a fast CPU, or maybe two of them, "the speed of the CPU" is usually not what makes the difference. That motherboard might have a dog-slow I/O bus. Many of them do.