Haskell distinguishes negative zero:
ghci> (isNegativeZero (0 :: Float), isNegativeZero (-0 :: Float))
(False,True)
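For context, the difference is just the IEEE 754 sign bit, which can be made visible with castFloatToWord32 from GHC.Float (available since base 4.10):

ghci> import GHC.Float (castFloatToWord32)
ghci> (castFloatToWord32 0, castFloatToWord32 (-0))
(0,2147483648)

(2147483648 is 0x80000000, i.e. only the sign bit set.)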
JSON can also distinguish them, since both "0" and "-0" are syntactically valid.
But Aeson throws away the sign bit:
ghci> isNegativeZero <$> eitherDecode "-0"
Right False
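My guess is that the sign is already gone in aeson's intermediate representation: JSON numbers are parsed into a Scientific (the Number constructor of Value), and a Scientific's coefficient is an Integer, which has no negative zero:

ghci> decode "-0" :: Maybe Value
Just (Number 0.0)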
Why is the sign discarded, and how can I decode a JSON document while distinguishing non-negative and negative zero?