In my program, I have a long string of digits that I'm converting back and forth between different bases. But when I convert back to decimal, the result gets displayed in exponential notation. I'm working in Scratch, but as long as any suggested algorithm is readable, I should be able to translate it.
Essentially, I just need a way to go from something like 1.0e13 to 10000000000000. Any ideas?
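To illustrate the kind of answer I could work with, here's a rough sketch in Python (the function name and structure are just my own illustration, not anything Scratch provides) that expands a scientific-notation string back into its full digit string by shifting the decimal point, without ever storing the value as a float:

```python
def expand_scientific(s: str) -> str:
    """Expand a scientific-notation string like '1.0e13'
    into its full digit string, using only string operations."""
    mantissa, _, exp_part = s.lower().partition("e")
    exponent = int(exp_part) if exp_part else 0

    # Separate the sign, then split the mantissa into its digits.
    sign = "-" if mantissa.startswith("-") else ""
    mantissa = mantissa.lstrip("+-")
    int_part, _, frac_part = mantissa.partition(".")
    digits = int_part + frac_part

    # The decimal point sits after len(int_part) digits; shifting it
    # right by `exponent` gives its final position in `digits`.
    point = len(int_part) + exponent

    if point <= 0:
        # Point lands before the first digit: pad with leading zeros.
        return sign + "0." + "0" * (-point) + digits
    if point >= len(digits):
        # Point lands past the last digit: pad with trailing zeros.
        return sign + digits + "0" * (point - len(digits))
    return sign + digits[:point] + "." + digits[point:]

print(expand_scientific("1.0e13"))  # prints 10000000000000
```

Something along those lines should map onto Scratch's letter-of, length-of, and join blocks, if I understand the problem right.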