How can I (or the computer) tell whether a binary number is a signed or unsigned integer? E.g. the bit pattern 1000 0001 can be interpreted either as -127 (as a signed, two's-complement integer) or as 129 (as an unsigned integer).
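For instance, in C the same bit pattern prints as two different values depending only on the declared type. Here is a minimal sketch of what I mean (assuming a two's-complement representation, which essentially all current hardware uses and C23 mandates):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t u = 0x81;       /* bit pattern 1000 0001 */
    int8_t  s = (int8_t)u;  /* same bits, reinterpreted as signed
                               (implementation-defined before C23, but
                               -127 on two's-complement machines) */

    printf("as unsigned: %u\n", (unsigned)u);  /* prints 129  */
    printf("as signed:   %d\n", (int)s);       /* prints -127 */
    return 0;
}
```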
One advantage of using unsigned integers in languages like C (as I understand it) is that they can represent larger positive values with the same number of bits, since no bit is reserved for the sign; an 8-bit unsigned integer covers 0 to 255, while a signed one covers -128 to 127. However, it seems to me that something, somewhere, must keep track of whether the first bit represents a sign or is just part of the magnitude of the number.
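To illustrate: the stored bits carry no sign flag, yet the compiler clearly treats the two types differently, because the same bit pattern compares differently against zero (again a sketch, assuming two's complement):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t u = 0x81;          /* 1000 0001 read as unsigned: 129  */
    int8_t  s = (int8_t)0x81;  /* 1000 0001 read as signed:  -127  */

    /* Identical bits, but the declared type determines which
       comparison the compiler performs: */
    printf("u > 0 ? %s\n", u > 0 ? "yes" : "no");  /* yes */
    printf("s > 0 ? %s\n", s > 0 ? "yes" : "no");  /* no  */
    return 0;
}
```

So where does that information live, if not in the number itself?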