Generally, UB is regarded as something to be avoided, and the current C standard itself lists quite a few examples in Annex J.
However, there are cases where I can see no harm in exploiting UB other than sacrificing portability.
Consider the following definition:
#include &lt;limits.h&gt;
int a = INT_MAX + 1;
Evaluating this expression invokes UB. However, if my program is intended to run on, say, a 32-bit CPU that uses modular arithmetic on two's complement values, I'm inclined to believe that I can predict the outcome.
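To make the expectation concrete, here is a minimal sketch of what I have in mind. The wrap to INT_MIN is my assumption about the platform, not anything the standard promises; as I understand it, a compiler is even allowed to assume the overflow never happens. The unsigned variant is included for contrast, since its wraparound actually is guaranteed:

#include &lt;limits.h&gt;
#include &lt;stdio.h&gt;

int main(void)
{
    int a = INT_MAX;                  /* 2147483647 with a 32-bit int */
    int b = a + 1;                    /* UB: signed overflow; on the platform
                                         described above, I expect a wrap to
                                         INT_MIN, but nothing guarantees it */

    /* For comparison: unsigned arithmetic is defined to wrap modulo 2^32 */
    unsigned int u = (unsigned int)INT_MAX + 1u;   /* well-defined: 2147483648 */

    printf("b = %d\nu = %u\n", b, u);
    return 0;
}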
In my opinion, UB is sometimes just the C standard's way of telling me: "I hope you know what you're doing, because we can't make any guarantees on what will happen."
Hence my question: is it ever safe to rely on machine-dependent behavior that the C standard considers UB, or should UB really be avoided, no matter the circumstances?