I'm writing a program to test different sorting algorithms. However, when I try to time the code execution, the measurements only have millisecond precision (3 meaningful decimal places), as you can see in the portion of output I've pasted, and I don't know why.
code.c
// insertion sort test
clock_t t = clock();
insertion(arr, n);
t = clock() - t;
double time_taken = ((double)t) / CLOCKS_PER_SEC; // in seconds
printf("insertion() took %f seconds to execute\n", time_taken);
fprintf(fp, "%f,", time_taken);
fflush(fp);
data.csv
0.001000,0.000000,0.000000,0.001000,0.000000
0.002000,0.000000,0.000000,0.002000,0.000000
0.005000,0.000000,0.000000,0.005000,0.000000
0.008000,0.001000,0.000000,0.008000,0.000000
0.004000,0.000000,0.000000,0.021000,0.000000
0.011000,0.000000,0.000000,0.024000,0.001000
0.008000,0.016000,0.000000,0.025000,0.000000
0.031000,0.001000,0.001000,0.029000,0.000000
0.041000,0.001000,0.000000,0.033000,0.000000
0.060000,0.001000,0.001000,0.051000,0.001000
0.062000,0.001000,0.000000,0.059000,0.001000
0.074000,0.002000,0.001000,0.068000,0.000000
0.085000,0.002000,0.001000,0.080000,0.000000
I'm on Windows 11 with an AMD Ryzen 5 processor. Maybe it's caused by some hardware or OS configuration.
CodePudding user response:
Try running this:
#include <time.h>
#include <stdio.h>

int main(void) {
    printf("CLOCKS_PER_SEC: %ld\n", (long)CLOCKS_PER_SEC);
    return 0;
}
If CLOCKS_PER_SEC prints as 1000, then `clock()` on your platform advances only once per millisecond, which explains why the precision is limited to 3 decimal places: any duration shorter than 1 ms rounds to 0.000000 or 0.001000.