I'm wondering whether Excel VBA stores Doubles with different levels of precision in the 32-bit and 64-bit editions of the product.
The reason I'm asking is that I'm looking at a stochastic model, and I've noticed very small differences in output when using seeded random numbers (the observed difference would be explained by a trigger threshold being crossed differently for roughly 1 in 200 million numbers).
The code base is identical, so I wasn't sure whether there's additional precision in 64-bit that might be causing a delta in the probability threshold conditions, with a vanishingly small rate of occurrence compared with the 32-bit run.
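
To illustrate the kind of condition I mean, here's a minimal sketch using VBA's built-in Rnd (the seed, threshold value, and names are made up for illustration; this isn't my actual model):

    Sub ThresholdSketch()
        ' Illustrative only: a seeded run where a tiny precision delta
        ' in the threshold comparison could flip the odd draw.
        Dim threshold As Double
        Dim draw As Double
        Dim hits As Long
        Dim i As Long

        threshold = 0.123456789      ' hypothetical trigger probability
        Rnd -1                       ' negative argument resets the generator...
        Randomize 42                 ' ...so Randomize with a fixed seed repeats the sequence

        For i = 1 To 1000000
            draw = CDbl(Rnd())       ' Rnd itself returns a Single, widened to Double here
            If draw < threshold Then hits = hits + 1   ' the comparison I suspect is flipping
        Next i

        Debug.Print "Hits below threshold: "; hits
    End Sub

A difference of one hit between the 32-bit and 64-bit runs over a couple of hundred million draws is the scale of discrepancy I'm seeing.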
Looking at the VBA documentation, I couldn't see any specific call-outs on precision, but I'm aware that with that documentation, just because something isn't written down doesn't mean it isn't happening!