Tangentially related to this question, what exactly is happening here with the number formatting?
In[1]:= InputForm @ 3.12987*10^-270
Out[1]= 3.12987`*^-270
In[2]:= InputForm @ 3.12987*10^-271
Out[2]= 3.1298700000000003`*^-271
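My guess (and it is only a guess) is that InputForm prints the shortest decimal string that round-trips to the stored binary double, and that the *10^-271 version picks up an extra rounding in the multiplication, so it lands on a different double than the plain literal parses to. That hypothesis can be checked at the bit level:

RealDigits[3.12987*10^-270, 2] === RealDigits[3.12987*^-270, 2]  (* expect True *)
RealDigits[3.12987*10^-271, 2] === RealDigits[3.12987*^-271, 2]  (* expect False if the extra-rounding guess holds *)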
If you use *10.^ as the multiplier, the transition happens where you would naively expect it to:
In[3]:= InputForm @ 3.12987*10.^-16
Out[3]= 3.12987`*^-16
In[4]:= InputForm @ 3.12987*10.^-17
Out[4]= 3.1298700000000004`*^-17
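The same bit-level comparison applies here; presumably the -17 product ends up on a neighbouring double rather than the one the plain literal parses to:

RealDigits[3.12987*10.^-16, 2] === RealDigits[3.12987*^-16, 2]  (* expect True *)
RealDigits[3.12987*10.^-17, 2] === RealDigits[3.12987*^-17, 2]  (* expect False *)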
whereas *^ takes the transition much further, although here it is the machine precision that starts flaking out:
In[5]:= InputForm @ 3.12987*^-308
Out[5]= 3.12987`*^-308
In[6]:= InputForm @ 3.12987*^-309
Out[6]= 3.12987`15.954589770191008*^-309
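This transition sits right at the bottom of the machine-number range, so I would guess the literal gets promoted to an arbitrary-precision number carrying MachinePrecision digits, i.e. Log[10, 2^53] ≈ 15.9546, which is exactly the precision tag in Out[6]:

$MinMachineNumber                (* ≈ 2.22507*^-308 on IEEE double hardware *)
MachineNumberQ[3.12987*^-308]    (* expect True *)
MachineNumberQ[3.12987*^-309]    (* expect False: below the machine range *)
N[MachinePrecision]              (* 15.9546, matching the tag above *)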
The significand (the part before the *^) starts breaking up only much later:
In[7]:= InputForm @ 3.12987*^-595
Out[7]= 3.12987`15.954589770191005*^-595
In[8]:= InputForm @ 3.12987*^-596
Out[8]= 3.1298699999999999999999999999999999999999`15.954589770191005*^-596
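The precision tag is identical on both sides of this second transition, so the run of trailing 9s looks like a change in how the decimal digits are generated from the stored binary significand rather than a change in precision; both can be inspected directly:

Precision /@ {3.12987*^-595, 3.12987*^-596}  (* expect both ≈ 15.9546 *)
RealDigits[3.12987*^-596]                    (* the decimal digits actually carried *)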
I assume these transitions relate to the format in which Mathematica internally stores its numbers, but does anyone know, or care to hazard an educated guess at, how?
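For reference, the documented range constants (values are implementation dependent; the ones below are typical for IEEE double hardware). The machine-real pair lines up with the -308/-309 transition above, but the -595/-596 one does not obviously correspond to any of them:

{$MinMachineNumber, $MaxMachineNumber}  (* machine-real range, ≈ 2.22507*^-308 and 1.79769*^308 *)
{$MinNumber, $MaxNumber}                (* overall arbitrary-precision range *)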