Why does Math.min() return -0 from [ 0, 0, -0]

Time:12-23

I know (-0 === +0) evaluates to true. I am curious to know why -0 < +0 seems to happen here.

When I run this code in the Stack Overflow snippet execution context, it logs 0.

const arr = [+0, 0, -0];
console.log(Math.min(...arr));

But when I run the same code in the browser console, it returns -0. Why is that? I have searched on Google but didn't find anything useful. This question might not have a practical use case; I just want to understand how JS calculates it.

const arr = [+0, 0, -0];
console.log(Math.min(...arr)); // -0
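As an aside, the two environments likely differ only in how they *display* the value, not in what Math.min returns: String(-0) is "0", so a console that stringifies its arguments prints 0, while browser devtools show the sign. A quick way to check what you actually got (Object.is and division both expose the sign of zero):

```javascript
const result = Math.min(0, 0, -0);

console.log(result === 0);          // true  (=== ignores the sign of zero)
console.log(Object.is(result, -0)); // true  (Object.is distinguishes -0 from +0)
console.log(1 / result);            // -Infinity (dividing by -0 exposes the sign)
console.log(String(result));        // "0"   (stringification hides the sign)
```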

CodePudding user response:

-0 is not less than 0 or +0: both -0 < 0 and -0 < +0 return false. You're mixing up the behavior of Math.min with the comparison of -0 against 0/+0.
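You can verify that no relational or equality operator distinguishes the two zeros, which means Math.min cannot be using < alone:

```javascript
// Every comparison operator treats -0 and +0 as the same value
// (IEEE 754 signed zeros compare equal):
console.log(-0 === +0); // true
console.log(-0 < +0);   // false
console.log(-0 > +0);   // false

// Yet Math.min still prefers -0, so the spec must special-case it:
console.log(Object.is(Math.min(+0, -0), -0)); // true
```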

The specification of Math.min is clear on this point:

b. If number is -0𝔽 and lowest is +0𝔽, set lowest to number.
