The docs for Compare(T? x, T? y) say:
Returns:
A signed integer that indicates the relative values of x and y
Less than zero – x is less than y.
Zero – x equals y.
Greater than zero – x is greater than y.
Consider this example:
    using System;
    using System.Collections.Generic;

    public class Program
    {
        public static void Main()
        {
            Console.WriteLine(Comparer<int?>.Default.Compare(null, 1)); // -1 ?huh?
            Console.WriteLine(Comparer<int?>.Default.Compare(-1, 1));   // -1
            Console.WriteLine(Comparer<int?>.Default.Compare(1, 1));    // 0
            Console.WriteLine(Comparer<int?>.Default.Compare(2, 1));    // 1
            Console.WriteLine((int?)null < 1);  // False
            Console.WriteLine((int?)null == 1); // False
        }
    }
So null is "less" than some value. I would have been less surprised if it returned a different magic number (e.g. -2), or thrown an exception, or whatever.
And null < 1 is false, as expected, which seems counter to the above.
Is there some reason for this, or is it just an API quirk to be aware of?
CodePudding user response:
It needs to have a defined comparison in terms of IComparable<T> / IComparer<T>, and it has elected that null is less than anything else. This is specified here. The only other option would have been to throw an exception, which most people would consider a bad thing here, as it would break any sorting with nulls in it.
The < operator is entirely separate, and has elected to return false when either operand is null. As non-intuitive as it seems: CompareTo and < do not need to agree precisely.
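A small sketch of the disagreement, relying only on standard lifted-operator semantics for nullable value types:

```csharp
using System;
using System.Collections.Generic;

public class Program
{
    public static void Main()
    {
        int? a = null;
        int b = 1;

        // Every lifted ordering/equality operator returns false
        // when either operand is null...
        Console.WriteLine(a < b);  // False
        Console.WriteLine(a > b);  // False
        Console.WriteLine(a <= b); // False
        Console.WriteLine(a >= b); // False
        Console.WriteLine(a == b); // False
        // ...except !=, which is the negation of ==
        Console.WriteLine(a != b); // True

        // The comparer, by contrast, imposes a total order with null first
        Console.WriteLine(Comparer<int?>.Default.Compare(a, b)); // -1
    }
}
```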
CodePudding user response:
This experiment will blow your mind (just like mine):
    int? a = null;
    int b = 0;

    if (a < b)
        Console.WriteLine("null is smaller than 0");
    else if (a == b)
        Console.WriteLine("null is equal to 0");
    else
        Console.WriteLine("null is larger than 0");
Output: null is larger than 0
This does not mean that null is larger than 0; it just means it is NOT smaller (and NOT equal), so the else branch runs :-)
But now, look at those values in the watch-window:
    a     null    int?
    b     0       int
    a>b   false   bool
    a<b   false   bool
    a==b  false   bool
So, apparently, comparing null with an actual value gives bogus-looking results: every lifted comparison simply returns false.
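If you want a null-aware comparison in expression form, the static Nullable.Compare method follows the same convention as Comparer<int?>.Default (null first) rather than the lifted operators; a quick sketch:

```csharp
using System;

public class Program
{
    public static void Main()
    {
        // Nullable.Compare treats null as less than any actual value,
        // matching Comparer<int?>.Default rather than the < operator.
        Console.WriteLine(Nullable.Compare<int>(null, 0));    // -1
        Console.WriteLine(Nullable.Compare<int>(0, null));    // 1
        Console.WriteLine(Nullable.Compare<int>(null, null)); // 0
    }
}
```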