I have a 64-bit long that I want to round down to the nearest 10,000, so I am doing a simple:
long myLong = 123456789;
long rounded = (myLong / 10000) * 10000; //rounded = 123450000
This appears to do what I expect, but since I'm not 100% sure how integer division works internally, I'm slightly concerned there may be situations where this doesn't behave as expected.
Will this still work at very large numbers / edge cases?
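For example, here's how I'd probe the extremes (Java shown, assuming a signed 64-bit long with truncating division):

long max = Long.MAX_VALUE;                // 9223372036854775807
long roundedMax = (max / 10000) * 10000;  // 9223372036854770000, still fits in a long

long neg = -123456789;
long roundedNeg = (neg / 10000) * 10000;  // -123450000; truncation goes toward zero, is that still "down"?

Since the division truncates first, the result should never exceed the original in magnitude, so I don't see how it could overflow, but I'd like to confirm that reasoning (and whether negative inputs count as "rounding down" here).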