Any physically realized computer is made with real hardware. The hardware is what it is - it is made out of metal and plastic and glass arranged in a certain way. There is specialized circuitry dedicated to performing operations like addition, subtraction, and so on.
The amount of memory used to store some number can be determined at runtime. Perhaps you need four bits, perhaps 64, perhaps you need to represent numbers with a thousand or a million digits. That's fine; provided you add enough memory to your system, you can represent those things. However, the hardware that works with numbers - the metal and plastic and glass and circuitry - is not expandable in the same way. You cannot simply add more transistors or take them away to make addition work on smaller or larger numbers. You have to pick a largest size the hardware fundamentally works in; it can be any particular number, but it does have to be a particular number. Once chosen, that is it.
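If it helps to see the distinction concretely, here is a minimal sketch in Python. The exact byte counts are CPython implementation details, and the 64-bit mask is just a stand-in for a fixed-width hardware register, not how any real chip is programmed:

```python
import sys

# A number stored in memory can grow at runtime: Python's int is a
# software bignum that simply uses more bytes as the value gets bigger.
small = 2 ** 8
huge = 2 ** 100_000
print(sys.getsizeof(small))  # a few dozen bytes
print(sys.getsizeof(huge))   # roughly 13 kilobytes; memory, not circuitry, is the limit

# A hardware adder, by contrast, has a fixed width. Simulating a
# 64-bit register: every result is truncated to that chosen size.
MASK_64 = (1 << 64) - 1
a = (1 << 64) - 1             # largest 64-bit unsigned value
print((a + 1) & MASK_64)      # 0: the result wraps because there is no 65th bit
```

The first half is what "add more memory" buys you; the second half is the "once chosen, that is it" part.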
That explains why some number is chosen. One has to be. But then, why not choose a bigger number? I'd imagine there is a practical concern here, and that the maximum hardware-understood unsigned integer size has typically tracked the amount of addressable memory. When 4GB of memory was more than programs ever needed, there was no inherent need for numbers larger than 2^32. With 64-bit unsigned integers we can now address some 17 billion gigabytes (about 16 exabytes); if that ever becomes insufficient for the virtual memory address space (if trends in modern application design continue, maybe we'll get there!), I think we could all reasonably assume that new 128-bit architectures would become the norm.
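To put rough numbers on that, here is a quick back-of-the-envelope check, assuming byte-addressable memory (one address per byte):

```python
# How many bytes can an N-bit unsigned integer address?
for bits in (32, 64, 128):
    addressable = 2 ** bits
    print(f"{bits}-bit addresses: {addressable:.3e} bytes "
          f"(~{addressable / 2**30:.3e} GiB)")

# 32-bit addresses:  4.295e+09 bytes (~4.000e+00 GiB)   the old 4 GB ceiling
# 64-bit addresses:  1.845e+19 bytes (~1.718e+10 GiB)   about 17 billion GiB
# 128-bit addresses: 3.403e+38 bytes (~3.169e+29 GiB)
```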
Another reason this is not more of an issue is that general-purpose business and commercial applications typically do not need such large numbers, so the people who use computers for those applications, who may not be especially technical, already have their needs met. Technical users who might want larger numbers, on the other hand, already know enough to work around the hardware limit in software, so the pressure is not as strong (nor, probably, is the economic case: how much of the economy depends on JavaScript showing the next ad or an ACH transfer landing on schedule, compared with the part that depends on fundamental research running smoothly? The long-term benefits of research are real, but they do not show up in this quarter's profit and loss).