While playing around with double string formatting, I noticed that, at least in my locale, the strings produced by a default DecimalFormat object are different from the ones generated by Double.toString() for special values like Infinity and NaN. E.g. from this snippet:
for (double i : new double[]{ 1/0.0, -1/0.0, 0/0.0 }) {
    System.out.println(Double.toString(i));
    System.out.println(new DecimalFormat().format(i));
}
I get this output:
Infinity
∞
-Infinity
-∞
NaN
�
This not only causes issues with existing code that is watching out for NaN and Infinity, but the last symbol is also unreadable with the fonts on my development system. As far as I can tell by checking its numeric value, that last symbol is U+FFFD, the Unicode replacement character, which probably means that it has not actually been set to anything meaningful.
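For reference, this is roughly how I checked the numeric value (the class and variable names are just for illustration):

public class NaNSymbolCheck {
    public static void main(String[] args) {
        // Format NaN with the default DecimalFormat and print the code point
        // of the resulting symbol; on my system this reports U+FFFD.
        String nan = new java.text.DecimalFormat().format(0 / 0.0);
        System.out.printf("U+%04X%n", (int) nan.charAt(0));
    }
}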
Since for some of my applications I prefer the 7-bit ASCII representations ("Be liberal in what you accept, be conservative in what you produce"), is there some way to force the strings produced by DecimalFormat to match those produced by Double.toString()?
One way might be to provide a DecimalFormatSymbols of my own, but it seems that I'd have to set all related fields myself, which sounds somewhat fragile. Is there a way to avoid that?
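For concreteness, this is roughly the workaround I am thinking of; setInfinity and setNaN are existing DecimalFormatSymbols methods, but hand-maintaining this kind of mapping is exactly what I'd rather avoid:

import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;

public class AsciiSpecials {
    public static void main(String[] args) {
        // Start from the current locale's symbols, then override only the
        // two special values so they match Double.toString().
        DecimalFormatSymbols symbols = DecimalFormatSymbols.getInstance();
        symbols.setInfinity("Infinity");
        symbols.setNaN("NaN");

        DecimalFormat format = new DecimalFormat();
        format.setDecimalFormatSymbols(symbols);

        for (double d : new double[]{ 1 / 0.0, -1 / 0.0, 0 / 0.0 }) {
            System.out.println(format.format(d)); // Infinity, -Infinity, NaN
        }
    }
}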