C is designed to work as closely as possible to the way a computer works. This means that numbers (and every other data type, for that matter) are represented in memory in a very strict way. For example, on most modern platforms an integer is represented by 32 bits, which allows for 2^32 possible values.
Depending on your computer, signed 32-bit integers range from -2147483648 to 2147483647, whereas unsigned integers range from 0 to 4294967295 (see https://msdn.microsoft.com/en-us/library/7fh3a000.aspx or http://www.cplusplus.com/reference/climits/).
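You don't have to memorize these numbers; the standard header `<limits.h>` defines them for your platform. A minimal sketch that prints the actual limits on your own machine (the exact output depends on your compiler and architecture):

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* These constants come from <limits.h> and describe the real
       range of int / unsigned int on the current platform. */
    printf("INT_MIN  = %d\n", INT_MIN);
    printf("INT_MAX  = %d\n", INT_MAX);
    printf("UINT_MAX = %u\n", UINT_MAX);
    return 0;
}
```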
The computer adds numbers mechanically and doesn't check whether the addition "went over" the limit. Instead, the value simply wraps around from the maximum possible value to the minimum possible value (or vice versa).
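A short example of this wrap-around, using unsigned arithmetic, which the C standard defines to wrap modulo 2^32 on a 32-bit `unsigned int` (signed overflow is technically undefined behavior in C, though many platforms wrap the same way in practice):

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* Adding 1 to the largest unsigned int wraps around to 0,
       because the result is reduced modulo 2^32 here. */
    unsigned int u = UINT_MAX;
    printf("%u + 1 = %u\n", u, u + 1u);  /* prints: 4294967295 + 1 = 0 */
    return 0;
}
```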