Can (true) integer division ever over- or underflow, assuming the denominator is not 0?
Since the result's magnitude either stays the same or shrinks (in integer division the smallest non-zero absolute value a denominator can have is 1, so the quotient can never be larger in magnitude than the numerator), I would assume not.
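To make that intuition concrete, here is a small sketch of what I mean (the numerator and denominators are arbitrary values picked just for illustration):

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int numerator = 100;                        /* arbitrary example value */
    int denominators[] = {1, 2, 3, -1, -7, 50}; /* arbitrary non-zero denominators */

    /* For each denominator, the quotient's magnitude never exceeds the
     * numerator's magnitude, since the smallest non-zero |denominator| is 1. */
    for (size_t i = 0; i < sizeof denominators / sizeof denominators[0]; i++) {
        int q = numerator / denominators[i];
        printf("%d / %d = %d  (|q| <= |numerator|: %s)\n",
               numerator, denominators[i], q,
               abs(q) <= abs(numerator) ? "yes" : "no");
    }
    return 0;
}
```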
I'm asking mainly in the context of the C/C++ standards, and I'm also interested in how various modern CPU architectures might handle integer division differently when it comes to defined/undefined behavior.