So I was answering this question just now, and in it I found something strange that I could not find an explanation for.
If we have a generic function with an optional parameter whose type uses the generic, something like this:
function ok<T>(thing?: T) {
}
Then when we try to narrow thing inside the function to just T by explicitly checking that it is not undefined:
thing !== undefined ? thing : 0;
// ^? T & ({} | null)
thing is narrowed to T & ({} | null). However, when we use typeof instead, it is narrowed to the expected T:
typeof thing !== "undefined" ? thing : 0;
// ^? T
Both checks behave the same way with if statements.
Here's a minimal reproducible example with all the things I mentioned:
function ok<T>(thing?: T) {
  thing !== undefined ? thing : 0;
  // ^? T & ({} | null)
  if (thing !== undefined) {
    thing;
    // ^? T & ({} | null)
  }
  typeof thing !== "undefined" ? thing : 0;
  // ^? T
  if (typeof thing !== "undefined") {
    thing;
    // ^? T
  }
}
Why is this the case? Do the two conditions actually have different behavior that I didn't know about? Is this a bug? I'm looking for an explanation for this behavior.
CodePudding user response:
This is the result of improved intersection reduction, union compatibility, and narrowing, as implemented in microsoft/TypeScript#49119 and released in TypeScript 4.8.
Certain type guards will now narrow formerly un-narrowable (is that a word?) types by intersecting them with a filtered version of {} | null | undefined, a union type equivalent to the unknown type, because the empty object type {} accepts all values except null and undefined (see How to understand relations between types any, unknown, {} and between them and other types? for more info).
Specifically, the PR mentions:
- In control flow analysis of equality comparisons with null or undefined, generic types are intersected with {}, {} | null, or {} | undefined in the false branch.
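Here's a minimal sketch of that rule in action (assuming TypeScript 4.8 or later; demo is just an illustrative name, and the annotated types are what I'd expect the hovers to show):
function demo<T>(value: T) {
  if (value !== undefined) {
    value;
    // ^? T & ({} | null) -- only undefined has been filtered out
  }
  if (value !== null) {
    value;
    // ^? T & ({} | undefined) -- only null has been filtered out
  }
  if (value != null) {
    // the loose check rules out null and undefined at once,
    // so the intersection is with plain {}
    value;
  }
}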
In your case, the value thing started off as type T | undefined, and once you eliminate undefined from that with an equality check, you get T & ({} | null), which more accurately represents the fact that thing cannot be undefined than T alone would. While inference would usually prevent this for your exact example, nothing stops someone from writing ok<string | undefined>(Math.random() < 0.5 ? "abc" : undefined), and so the type T would be string | undefined whereas T & ({} | null) is just string (not string & {}; see the PR for details about why and how this is different).
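To make that concrete, here's a sketch of that hypothetical call (it re-declares the question's ok just for illustration):
function ok<T>(thing?: T) {
  if (thing !== undefined) {
    thing;
    // ^? T & ({} | null) -- cannot be undefined, even when T itself includes undefined
  }
}

// T is explicitly instantiated with a type that already contains undefined:
ok<string | undefined>(Math.random() < 0.5 ? "abc" : undefined);
// With T = string | undefined:
//   plain T would still admit undefined, but
//   T & ({} | null) = (string | undefined) & ({} | null) = string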
On the other hand, this logic was not implemented for the conceptually equivalent typeof thing !== "undefined" check. The only typeof check I see being affected there is typeof xxx === "object". Whether they would consider the difference in narrowing between x === undefined and typeof x === "undefined" to be a bug, a design limitation, intentional, or a missing feature isn't documented there, but TypeScript is definitely behaving as designed. Maybe it would be worthwhile to open a new issue or at least comment about it on GitHub? Not sure.
CodePudding user response:
Very interesting questions.
I think the reason is something like this:
- according to this, when using an unconstrained generic, T could match both null and {} (see the sketch after this list)
- this will lead you to have a type similar to T | null | {}
- from this, it makes sense that doing T !== undefined ends up with something similar to T & ({} | null)
- but doing a typeof check gives different results
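A quick sketch of the first point, using a hypothetical identity function id: an unconstrained T accepts nullish instantiations, so the compiler cannot assume T excludes them.
function id<T>(value: T): T {
  return value;
}

id<null>(null);           // T = null is allowed
id<undefined>(undefined); // T = undefined is allowed
id<{}>({ a: 1 });         // T = {} is allowed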
Also, you can check two different things (both are shown in the sketch after this list):
- Add a constraint to the generic, something like function ok<T extends string>. This will lead to consistent results.
- Or use != instead of !==, which will turn thing into NonNullable<T>, as can be seen here.
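Here's a sketch of both checks (the names okConstrained and okLoose are just for illustration, and the annotated types are what I'd expect the hovers to show on TypeScript 4.8 or later):
// 1. Constrained generic: T can no longer include null or undefined,
//    so the equality and typeof checks narrow consistently.
function okConstrained<T extends string>(thing?: T) {
  if (thing !== undefined) {
    thing;
    // ^? T
  }
  if (typeof thing !== "undefined") {
    thing;
    // ^? T
  }
}

// 2. Loose != comparison: rules out both null and undefined at once.
function okLoose<T>(thing?: T) {
  if (thing != undefined) {
    thing;
    // ^? NonNullable<T> (i.e. T & {})
  }
}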
It seems like a long, long time ago {} was the top type for generics, and changing it now would break a lot of existing code.