I recently ran into a strange problem with the following code:
#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    long long newT = 0;
    long long oldT = 0;
    struct timeval t;

    while (1)
    {
        gettimeofday(&t, NULL);
        newT = (long long)t.tv_sec * 1000 * 1000 + t.tv_usec;
        printf("Time Period=%lld\n", newT - oldT);
        oldT = newT;
    }
    return 0;
}
Running this on Ubuntu and on a development board, I randomly see large spikes in the measured interval; see the sample output below. Could someone please analyze why? Thanks.
Time Period=3
Time Period=3
Time Period=15
Time Period=6
Time Period=366313
CodePudding user response:
I think I found the reason: the delay should come from the printf call itself.
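
One way to check this hypothesis is to keep printf out of the measured path: record the deltas into a pre-allocated array inside the loop and only print them after the loop finishes. Below is a minimal sketch of that idea, assuming the same gettimeofday API as the original code; the SAMPLES count is an arbitrary value chosen for illustration.

#include <stdio.h>
#include <sys/time.h>

#define SAMPLES 1000   /* hypothetical sample count, chosen for illustration */

int main(void)
{
    static long long delta[SAMPLES];   /* deltas in microseconds */
    struct timeval t;
    long long newT, oldT;
    int i;

    /* Take an initial reading so the first delta is meaningful. */
    gettimeofday(&t, NULL);
    oldT = (long long)t.tv_sec * 1000 * 1000 + t.tv_usec;

    /* Measurement loop: only a time read and a store, no printf. */
    for (i = 0; i < SAMPLES; i++)
    {
        gettimeofday(&t, NULL);
        newT = (long long)t.tv_sec * 1000 * 1000 + t.tv_usec;
        delta[i] = newT - oldT;
        oldT = newT;
    }

    /* Print the collected deltas afterwards, outside the measured path. */
    for (i = 0; i < SAMPLES; i++)
        printf("Time Period=%lld\n", delta[i]);

    return 0;
}

If the large values disappear with this version, the printf (terminal I/O) inside the loop was indeed what caused the spikes.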