I've just used Haskell as a calculator, expecting to get some cryptic error message about the ambiguity of the numeric literal's type. But I got none.
For example, in ghci:
> :t 1
1 :: Num p => p
> print 1
1
It seems crazy that I expected code like print 1 to break xD. But here ghci needs to use some implementation of the Show class. Also, the 1 has to be represented in some way at runtime. What representation is used, and why doesn't ghci ask me to specify the type?
A similar example:
> 7 * 15 + 2
107
The expression has the general type Num a => a, but it is evaluated, and for that some concrete type needs to be chosen. Is it Integer? This behaviour of course makes sense, but how does it work?
Typing e.g. print $ 5 ^ 5 ^ 5
(whose value is outside the Int range) prints the correct number, so I guess it is Integer rather than Int.
I haven't found any general Num a => Show a
instance; rather, the implementations for types like Int are specific to them.
I've tested with GHC versions 8.8.4, 7.10.3 and 9.0.2. Thanks in advance!
CodePudding user response:
Haskell has a type-defaulting mechanism to avoid having to specify the type of every numeric literal you type. The default type for Num is Integer, while for Fractional (literals with a decimal point) it is Double. You can change these defaults using a top-level default declaration. This article explains the defaulting rules.
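A minimal sketch of what a top-level default declaration looks like (the module and its contents are illustrative, not from the question):

```haskell
module Main where

-- Replace the standard default list (Integer, Double) with (Int, Double):
-- ambiguous Num constraints now resolve to Int, Fractional ones to Double.
default (Int, Double)

main :: IO ()
main = do
  print 1    -- the literal defaults to Int because of the declaration above
  print 1.5  -- a Fractional literal skips Int and defaults to Double
```

Without the default declaration, both prints would behave the same here, but the first literal would be an Integer instead of an Int.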
CodePudding user response:
The expression has the general type Num a => a, but it is evaluated, and for that some type needs to be chosen. Is it Integer? This behaviour of course makes sense, but how does it work?
Haskell works with type defaulting, and the default for a Num a => a type is indeed Integer. Since Integer is also a member of the Show typeclass, print $ 5 ^ 5 ^ 5 will thus use the Integer instance of Show to print the value.
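A small sketch to make the difference observable: the defaulted type Integer is arbitrary-precision, so 5 ^ 5 ^ 5 is computed exactly, while the same expression forced to Int wraps around machine-word arithmetic.

```haskell
main :: IO ()
main = do
  let big = 5 ^ 5 ^ 5 :: Integer   -- the type defaulting picks; exact result
  print (length (show big))        -- the exact value has 2185 decimal digits
  print (5 ^ 5 ^ 5 :: Int)        -- fixed-width: overflows and wraps around
```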
CodePudding user response:
The default default is (Integer, Double). More info in the Report. You may also want to read about ExtendedDefaultRules; notably, this is on by default in ghci.
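A sketch of the default default in action, assuming no default declaration is in scope: an ambiguous constraint resolves to the first type in the list (Integer, Double) that satisfies it.

```haskell
main :: IO ()
main = do
  print (7 * 15 + 2)  -- only Num is required, so Integer is chosen
  print (1 / 3)       -- Fractional rules out Integer, so Double is chosen
```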