I was learning about one's complement and how one's complement uses the most significant bit as a sign indicator.
If you interpret the binary pattern 110 as a 3-bit one's complement number, you get -1. But, obviously, 110 interpreted as an unsigned number is 6 in decimal. This is where I got really confused.
How does the computer know whether you intend 110 to mean -1 or 6? Thanks.
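To make the confusion concrete, here's a small Python sketch (my own illustration, not from any textbook) showing the two interpretations of the same bit pattern:

```python
bits = "110"  # the same 3-bit pattern

# Interpretation 1: unsigned binary
unsigned = int(bits, 2)  # 1*4 + 1*2 + 0*1 = 6

# Interpretation 2: one's complement (MSB is the sign bit)
if bits[0] == "0":
    # sign bit 0: value reads the same as unsigned
    ones_complement = int(bits, 2)
else:
    # sign bit 1: negative; flip every bit to get the magnitude
    flipped = "".join("1" if b == "0" else "0" for b in bits)
    ones_complement = -int(flipped, 2)  # 110 -> 001 -> -(1) = -1

print(unsigned)          # 6
print(ones_complement)   # -1
```

So the bits themselves carry no type information; the result depends entirely on which interpretation rule you apply.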