
This is how Sum is currently defined.

newtype Sum a = Sum {getSum :: a}

Does anyone have examples of how it's useful outside the numerical context? If it isn't, then why don't we constrain the a to Num a while defining the newtype? Why would anyone want to define a Sum "hello" at all?

sa___
    The `DatatypeContexts` feature was considered a "misfeature". It makes implicit what is better made explicit: that the `a` has a certain constraint on it: https://stackoverflow.com/a/7438716/67579 – Willem Van Onsem Dec 25 '19 at 13:21
    https://www.youtube.com/watch?v=hIZxTQP1ifo&feature=youtu.be&t=1261 – ekim boran Dec 25 '19 at 13:26
  • @WillemVanOnsem, my understanding is that it didn't actually make anything implicit at all. You still have to write all the same constraints you would have had to without a datatype context, and you are forced to write them in places where they would otherwise not be necessary. It made more work for you with no gain. – luqui Dec 26 '19 at 02:58
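luqui's point above can be sketched without the extension at all: even if the datatype carried a Num constraint, any function that does arithmetic on the wrapped value would still have to state that constraint itself. The `double` function below is a hypothetical example, not from the original post:

```haskell
import Data.Monoid (Sum(..))

-- A datatype context would not remove this constraint; a function
-- that does arithmetic on the wrapped value must declare Num a
-- regardless of how Sum itself is defined.
double :: Num a => Sum a -> Sum a
double (Sum x) = Sum (x + x)

main :: IO ()
main = print (getSum (double (Sum (21 :: Int))))
```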

1 Answer


As a raw datatype definition that you've quoted, Sum a isn't actually useful - it just trivially wraps another type. And this wouldn't become any more useful with a Num a constraint, even if that actually did anything in practice.

What Sum is in practice used for is to make Semigroup and Monoid instances for numeric types, based on addition (and avoid privileging this instance over other possible ones by attaching it directly to the underlying type):

instance (Num a) => Semigroup (Sum a) where
    Sum x <> Sum y = Sum (x + y)

and here of course the constraint is necessary. But these instances are the only reason the type even exists, and there's no gain in including a constraint in the type definition.
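To make this concrete, here is a small sketch of how these wrapper instances get used in practice: foldMap selects which Monoid instance to combine with by wrapper type. The Product comparison is added here for contrast and isn't part of the original answer:

```haskell
import Data.Monoid (Sum(..), Product(..))

-- Sum and Product give two different Monoid instances for the same
-- numeric type; foldMap picks the one named by the wrapper.
total :: Int
total = getSum (foldMap Sum [1, 2, 3, 4])

combined :: Int
combined = getProduct (foldMap Product [1, 2, 3, 4])

main :: IO ()
main = print (total, combined)  -- (10,24)
```

Without the newtypes, attaching either instance directly to Int would privilege one of these two equally valid monoids over the other.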

Robin Zigmond