
I struggled for hours to get something like the following right:

to_xyz_regular :: (RealFrac t) => (RegularSpd t) -> (t, t, t)
to_xyz_regular spd@(RegularSpd lambda_min lambda_max spectrum delta inverse_delta) =  
    let
        l = length cie_x - 1
        il = 1.0 / fromIntegral l
        samp i = sample spd $ lambda_min + inverse_delta * fromIntegral i
        integrate curve = (foldr (+) 0 $
                           map (\i -> curve!!i * samp i) [0..l]) * il
    in (integrate cie_x, integrate cie_y, integrate cie_z)

(this is a conversion routine from color SPDs / spectra to XYZ colors).

The values cie_XXX were defined as follows:

cie_start = 360.0
cie_end = 830.0
cie_x = [0.0001299000, 0.0001458470, 0.0001638021, 0.0001840037,
         ....]
cie_y = ....
cie_z = ....

but then giving me roughly

Could not deduce (t ~ Double)
from the context (RealFrac t)
....

the problem being that the cie_XXX values were stored as Double, whereas I really would have liked them to be polymorphic.

The solution I finally found was to add type signatures for those values:

cie_start :: (RealFrac t) => t
cie_end :: (RealFrac t) => t
cie_x :: (RealFrac t) => [t]
cie_y :: (RealFrac t) => [t]
cie_z :: (RealFrac t) => [t]
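
With those signatures in place, the same value can be instantiated at different concrete types. A minimal self-contained sketch (using a truncated two-element stand-in list, not the real CIE table):

```haskell
-- hypothetical stand-in for the real cie_x table, with the
-- polymorphic signature from above
cie_x :: RealFrac t => [t]
cie_x = [0.0001299000, 0.0001458470]

-- the same list consumed at two different concrete types
asDouble :: [Double]
asDouble = cie_x

asFloat :: [Float]
asFloat = cie_x

main :: IO ()
main = print (length asDouble, length asFloat)
```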

Now my question is: Is this the proper way to have polymorphic values/literals in Haskell?


It was a bit tough for me to find a solution via web search, because books, tutorials, etc. mention type signatures only for functions with parameters. Are values just parameterless functions?

Sebastian Mach
    If you want to leave the signatures off, you could disable [the monomorphism restriction](http://www.haskell.org/haskellwiki/Monomorphism_restriction). – aavogt Apr 11 '13 at 05:34
    "Are values just parameterless functions?" No, [not everything is a function in Haskell.](http://web.archive.org/web/20111211020433/http://conal.net/blog/posts/everything-is-a-function-in-haskell) But, under the hood, any value with a type of the form `C => A` is a function from `C_Dict -> A`; [the monomorphism restriction exists because this might unexpectedly reduce sharing/increase computation](http://stackoverflow.com/a/12979513/237428). (Also, sorry about the Wayback Machine link above, but Conal Elliott's site seems to be down at the moment.) – Antal Spector-Zabusky Apr 11 '13 at 16:30
  • Very insightful comments, thanks. – Sebastian Mach Apr 13 '13 at 07:38

1 Answer


The way you have chosen is the proper way to have polymorphic types for numeric literals in Haskell.

The reason Double was selected as the type of the cie_XXX values is Haskell's defaulting mechanism (which is described in greater detail here). Defaulting is invoked when an otherwise polymorphic type (one containing a forall, implicit or explicit) is required to be monomorphic by the monomorphism restriction, or when the type is ambiguous (it contains a type variable that appears only to the left of the =>, e.g. a in (Read a, Show a) => String), provided the type is constrained to be an instance of a Prelude- or standard-library-defined subclass of Num. Defaulting then tries to unify the type with each entry of the default (...) declaration, starting with the left-most entry, until unification succeeds. The default default (...) is

default (Integer, Double)

If for some reason Float is the desired default for real numbers,

default (Integer, Float)

could be defined. If Float is the desired default for all numbers,

default (Float)

could be defined.
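
One observable way to watch a default declaration take effect (a small sketch of my own, not from the original post): floatDigits reports the mantissa width of whatever floating type defaulting picks, so switching the default from Double to Float changes the result.

```haskell
-- replace the default default with one that prefers Float
default (Integer, Float)

main :: IO ()
main = print (floatDigits 1.0)
-- the literal 1.0 here has an ambiguous type (RealFloat a => a, with
-- the variable appearing only in the constraint), so defaulting kicks
-- in: Integer fails (not a RealFloat instance), Float succeeds.
-- With the usual default (Integer, Double), this would report 53
-- (Double's mantissa width) instead of Float's 24.
```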

In the case of the original scenario,

cie_start :: Fractional a => a

is the most general type for cie_start (as well as cie_end). This type, when inferred, would violate the monomorphism restriction. However, in the presence of defaulting, a few types are first tried in the place of the type variable a. Given that the default default (...) is default (Integer, Double), Integer and Double are tried, in that order.

cie_start :: Fractional Integer => Integer

results in a constraint violation (Integer is not an instance of Fractional).

cie_start :: Fractional Double => Double

succeeds, and is simplified to

cie_start :: Double

To make the original scenario an error instead of silently defaulting to Double, disable defaulting entirely by adding

default ()

at the top level of the module where cie_XXX are defined.
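
As a sketch of what that looks like (a hypothetical module, not code from the question): with default () in force, an unannotated fractional binding is rejected, and an explicit signature is the way out.

```haskell
-- disable defaulting entirely for this module
default ()

-- without a signature, this binding would now be a type error: the
-- monomorphism restriction demands a monomorphic type and there is
-- no default left to pick:
--   cie_start = 360.0
-- an explicit signature resolves it:
cie_start :: Double
cie_start = 360.0

main :: IO ()
main = print cie_start
```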

To make the scenario not require an explicit type signature and instead infer the most general type, add

{-# LANGUAGE NoMonomorphismRestriction #-}

to the top of the module where cie_XXX are defined (see Monomorphism restriction for more details about why this most general type is not used by default).
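
A minimal sketch of that variant (assuming GHC): with the restriction disabled, the unannotated binding keeps its most general inferred type and can be used at several concrete types.

```haskell
{-# LANGUAGE NoMonomorphismRestriction #-}

-- with the restriction disabled, the inferred type is the most
-- general one, roughly cie_start :: Fractional a => a
cie_start = 360.0

main :: IO ()
main = do
  print (cie_start :: Double)  -- the same binding usable at Double ...
  print (cie_start :: Float)   -- ... and at Float
```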

Overall, the method you have chosen to resolve this issue (adding an explicit type signature) is the best solution. It is considered a best practice to have type signatures on all top level definitions.

ScootyPuff
  • In your introduction (2nd para) you say that defaulting is invoked by ambiguous types (which is true) but in the context of this question you might point out at the same time that it is also invoked by the monomorphism restriction. You can probably make the case the monomorphism restriction is a special case of ambiguous types but it's not entirely obvious... – drquicksilver Apr 11 '13 at 19:14
  • I have reworded it some and added additional details. I am having trouble wording it well, though. Let me know if it is unclear (or incorrect). – ScootyPuff Apr 11 '13 at 19:54