If I have an Algebraic Data Type like below
data BinaryTree a = Leaf
                  | Node a (BinaryTree a) (BinaryTree a)
                  deriving (Eq, Ord)
Here Leaf represents the empty tree, a is the value stored in a node, and the other two fields are the node's left and right subtrees.
Is there a way I can specify that the type argument a should have a Show instance?
I was trying to give my own implementation of Show for BinaryTree, and I started out simple like:
instance Show (BinaryTree a) where
    show Leaf = "x"
    show (Node node left right) = show node ++ "\n" ++ show left ++ " " ++ show right
But show node doesn't compile: No instance for (Show a) arising from a use of ‘show’.
CodePudding user response:
You can only use show node if node has a type that is a member of the Show typeclass. You should therefore add a type constraint to the head of the instance declaration:
-- ↓ type constraint
instance Show a => Show (BinaryTree a) where
    show Leaf = "x"
    show (Node node left right) = show node ++ "\n" ++ show left ++ " " ++ show right
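Putting it together, here is a minimal, self-contained sketch of the constrained instance, with a small example tree added for illustration (the names example and main are assumptions, not from the question):

```haskell
-- Leaf is the empty tree; Node holds a value and two subtrees.
data BinaryTree a = Leaf
                  | Node a (BinaryTree a) (BinaryTree a)
                  deriving (Eq, Ord)

-- The "Show a =>" constraint lets us call show on the stored value.
instance Show a => Show (BinaryTree a) where
  show Leaf = "x"
  show (Node node left right) =
    show node ++ "\n" ++ show left ++ " " ++ show right

-- A small example tree (hypothetical, for demonstration only).
example :: BinaryTree Int
example = Node 1 (Node 2 Leaf Leaf) Leaf

main :: IO ()
main = putStrLn (show example)
```

With the constraint in place, show example works for any element type that itself has a Show instance, e.g. BinaryTree Int or BinaryTree String, while BinaryTree (Int -> Int) would still be rejected at compile time.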