How to convert a Unix timestamp in nanoseconds to seconds without losing precision [C++17]


I need the Unix time in nanoseconds, but for my calculations I want it in seconds without losing precision. So I converted the integer variable to double and expected, for example, 128 divided by 10 to give 12.8. Instead I lose the fractional part and only get 12. What am I doing wrong, or what is my misunderstanding? This is what I tried:

#include <iostream>
#include <chrono>

using namespace std;

int main()
{
    
    int64_t a = std::chrono::duration_cast<std::chrono::nanoseconds>(
                    std::chrono::system_clock::now().time_since_epoch())
                    .count();

    double b = std::chrono::duration_cast<std::chrono::nanoseconds>(
                   std::chrono::system_clock::now().time_since_epoch())
                   .count() /
               (1000 * 1000 * 1000);

    cout << a << " ns \n";
    
    cout.precision(9);
    cout << fixed << b << " s" << endl;
    
    return 0;
}

Output

1668199112421581468 ns 
1668199112.000000000 s

Wanted: 1668199112.421581468 s

CodePudding user response:

count() returns an integral value, because std::chrono::nanoseconds::rep (the representation type) is an integral type with at least 64 bits. Dividing that integer by the integer 1000 * 1000 * 1000 is integer division, so the fractional part is thrown away before the result is ever converted to double.
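The same thing happens with the 128 / 10 example from the question. A minimal illustration (the variable names are just for demonstration):

#include <cstdint>
#include <iostream>

int main()
{
    int64_t n = 128;
    double truncated = n / 10;    // integer / integer: truncates to 12 before the assignment to double
    double exact     = n / 10.0;  // one floating-point operand gives floating-point division: 12.8
    std::cout << truncated << " vs " << exact << '\n';  // prints "12 vs 12.8"
    return 0;
}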

You can use std::chrono::duration<double>, which supports fractional seconds. There is no need to do the math yourself: duration_cast knows about std::chrono::duration::period and does the scaling for you.
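A minimal sketch of the fixed program (the names since_epoch, ns and s are just illustrative; the output formatting is kept the same as in the question):

#include <chrono>
#include <cstdint>
#include <iostream>

int main()
{
    auto since_epoch = std::chrono::system_clock::now().time_since_epoch();

    // Same integer nanosecond count as in the question.
    int64_t ns = std::chrono::duration_cast<std::chrono::nanoseconds>(since_epoch).count();

    // Cast to a duration whose representation is double and whose period is one
    // second (std::chrono::duration<double> defaults to std::ratio<1>); the cast
    // scales by the period ratio instead of truncating.
    double s = std::chrono::duration_cast<std::chrono::duration<double>>(since_epoch).count();

    std::cout << ns << " ns\n";
    std::cout.precision(9);
    std::cout << std::fixed << s << " s\n";
    return 0;
}

Keep in mind that a double carries only about 15-16 significant decimal digits, so the last couple of nanosecond digits of a current Unix timestamp may still be rounded in the printed value.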
