I was converting a few shell subprocesses to JavaScript in a Node.js project, and out of curiosity I wanted to see the difference between cat and echo with the time command (nothing to do with the project, just wanted to see how fast they were):
time echo "Hello World!"
Which is when I got the following output:
Hello World!
real 0m0.000s
user 0m0.000s
sys 0m0.000s
The cat command gave similar, though not quite zero, output:
time cat compile.js
import Build from "./src/build.js";
Build.main();
real 0m0.003s
user 0m0.003s
sys 0m0.000s
Some or all of the time fields say 0m0.000s. Does this mean the command runs so fast that it can't be measured with time? If not, what's happening?
CodePudding user response:
echo (as a shell built-in) probably takes less than 1 millisecond to execute.
Note that the rather large real time in the /bin/echo example below is due to the initial startup of /bin/echo, which I virtually never use. Subsequent runs are much faster because the executable is cached in memory and doesn't need to be loaded from disk again.
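One way to get a per-call estimate below time's resolution is to run the command many times and divide the total by the iteration count. A rough sketch (the count of 1000 is arbitrary):

```shell
# Time 1000 invocations of the builtin echo; dividing the total
# by 1000 gives an approximate per-call cost well below 1 ms.
time for i in $(seq 1000); do
    echo "Hello World!" > /dev/null
done
```

Output is discarded so the loop measures the command itself rather than terminal I/O.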
cat takes 3 milliseconds, virtually all of which is spent in user-level code, not system-level code. The real time is the same as the user time because, for example, the process is not interrupted to do something else while it is waiting for I/O to complete. On a busier system (or for a longer-running command), you may see the real time exceed the sum of the user and system times.
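To see real time diverge from CPU time, time a command that mostly waits rather than computes:

```shell
# sleep spends its time blocked on a timer, not on the CPU:
# real is about 1 second, while user and sys stay near zero.
time sleep 1
```

The gap between real and user + sys is exactly the time the process spent waiting instead of running.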
Using /bin/echo instead of the shell built-in can be instructive, because you get very different results for the first execution and subsequent executions:
bash-4.4$ time /bin/echo "Hello World!"
Hello World!
real 0m0.113s
user 0m0.001s
sys 0m0.003s
bash-4.4$ time /bin/echo "Hello World!"
Hello World!
real 0m0.002s
user 0m0.001s
sys 0m0.001s
I virtually never use /bin/echo, so this is probably the first time I've used it since the last time I rebooted my computer. Much of the 113 milliseconds is spent simply loading the executable into memory, where it gets cached, so it doesn't need to be reloaded for the second execution.
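To check which echo a timing actually exercises, you can ask the shell; in bash, type -a lists every match, with the builtin first:

```shell
# List all commands named "echo" that the shell knows about.
# In bash this typically prints the builtin first, then /bin/echo.
type -a echo
```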