Weird behaviour when calculating the duration between two dates


I want to display the duration between two date-times, and I found a case where the result is not what I expect. I'm currently using date-fns, and I've also tried luxon; both give the same result.

Code snippet

import { intervalToDuration as intervalToDurationDateFns } from 'date-fns';
import { DateTime } from "luxon";

function intervalToDurationLuxon({ start, end }) {
  const startDate = DateTime.fromJSDate(start);
  const endDate = DateTime.fromJSDate(end);

  const i = startDate.until(endDate);

  return i.toDuration(['years', 'months', 'days', 'hours', 'minutes', 'seconds']).toObject();
}

const target = new Date(2023, 2, 1, 23, 59, 59, 999);

const beforeMiddleOfNight = new Date(2022, 8, 29, 23, 59, 59, 99);
const afterMiddleOfNight = new Date(2022, 8, 30, 0, 0, 0, 0);

console.log('date-fns')
console.log(intervalToDurationDateFns({ start: beforeMiddleOfNight, end: target }));
console.log(intervalToDurationDateFns({ start: afterMiddleOfNight, end: target }));

console.log('luxon')
console.log(intervalToDurationLuxon({ start: beforeMiddleOfNight, end: target }));
console.log(intervalToDurationLuxon({ start: afterMiddleOfNight, end: target }));

Actual output

date-fns 
{years: 0, months: 5, days: 1, hours: 0, minutes: 0, seconds: 0}
{years: 0, months: 5, days: 1, hours: 23, minutes: 59, seconds: 59}
luxon 
{years: 0, months: 5, days: 1, hours: 0, minutes: 0, seconds: 0.9}
{years: 0, months: 5, days: 1, hours: 23, minutes: 59, seconds: 59.999}

Expected output

date-fns 
{years: 0, months: 5, days: 2, hours: 0, minutes: 0, seconds: 0}
{years: 0, months: 5, days: 1, hours: 23, minutes: 59, seconds: 59}
luxon 
{years: 0, months: 5, days: 2, hours: 0, minutes: 0, seconds: 0.9}
{years: 0, months: 5, days: 1, hours: 23, minutes: 59, seconds: 59.999} 

I found that this happens only when the start date is the last day of a month (the 28th, 29th, 30th or 31st) and the end date falls between March 1st and March 28th. I've also noticed that the time of day (and the year) doesn't matter; the output is still wrong.

I don't understand why the days are calculated incorrectly in this scenario.

Could someone explain why this is happening?

Is there a better solution that also covers other scenarios I may have missed?

CodePudding user response:

This happens only when the start is the last day of a month (28, 29, 30 or 31) and the end is between 1st of March and March 28th.

The reason is that February has only 28 days, and counting in months is imprecise.

When you're at the end of a month (on the 28th, 29th, 30th or 31st) and you add a few months so that you arrive in February, you'll always end up at the end of February, on the 28th (in a non-leap year). Counted in months, all these intervals have the same duration, regardless of which day they started. The "remainder" is then computed as the number of days, hours, etc. you have to add to February 28th to arrive at the desired datetime in March.
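That clamping behavior can be sketched in plain JavaScript. The `addMonthsClamped` helper below is hypothetical, not the actual code of either library; it just mirrors how month addition clamps the day-of-month to the target month's length:

```javascript
// Add n calendar months, clamping the day-of-month to the target
// month's length -- mirroring how date-fns and Luxon advance months.
function addMonthsClamped(date, n) {
  const y = date.getFullYear();
  const m = date.getMonth() + n; // JS Date normalizes out-of-range months
  // Day 0 of month m+1 is the last day of month m.
  const lastDay = new Date(y, m + 1, 0).getDate();
  const day = Math.min(date.getDate(), lastDay);
  return new Date(y, m, day,
    date.getHours(), date.getMinutes(), date.getSeconds(), date.getMilliseconds());
}

// Sep 28, 29 and 30 all land on Feb 28 after +5 months:
console.log(addMonthsClamped(new Date(2022, 8, 28), 5)); // Feb 28 2023
console.log(addMonthsClamped(new Date(2022, 8, 29), 5)); // Feb 28 2023
console.log(addMonthsClamped(new Date(2022, 8, 30), 5)); // Feb 28 2023
```

So September 29th plus 5 months lands on February 28th, and from there only 1 day (plus the time remainder) is needed to reach March 1st 23:59:59 — which is exactly the `days: 1` you're seeing.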

This does not only happen for intervals ending in March; it applies to all months: if the end date's day of the month is smaller than the start date's, and the start date's day of the month is larger than the number of days in the month preceding the end date, the surplus days are ignored.
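That condition can be written as a small check in plain JavaScript (the `surplusDaysIgnored` name and helper are hypothetical, purely for illustration):

```javascript
// Does month-first counting "swallow" surplus days for this interval?
// True when the end's day-of-month is smaller than the start's, and the
// start's day-of-month exceeds the length of the month before the end.
function surplusDaysIgnored(start, end) {
  // Day 0 of the end's month is the last day of the preceding month.
  const daysInPrevMonth = new Date(end.getFullYear(), end.getMonth(), 0).getDate();
  return end.getDate() < start.getDate() && start.getDate() > daysInPrevMonth;
}

console.log(surplusDaysIgnored(new Date(2022, 8, 29), new Date(2023, 2, 1)));  // true: Feb 2023 has 28 days
console.log(surplusDaysIgnored(new Date(2022, 9, 31), new Date(2022, 11, 1))); // true: Nov has 30 days
console.log(surplusDaysIgnored(new Date(2022, 8, 15), new Date(2023, 2, 20))); // false
```

The second example shows the same effect outside of March: October 31st to December 1st clamps at November 30th, so the 31st's surplus day disappears.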

See also the documentation on Duration math.
