You have to understand the history. C was designed as a machine-independent system implementation language, which meant it had to perform as well as possible on commonly used operations like integer arithmetic, and it ran on a much wider range of processors than you'd expect to encounter nowadays. Processors could use ones' complement, twos' complement, or sign-magnitude representation for negative integers. They could be designed to halt execution and raise some sort of signal on integer overflow, or to ignore it entirely. Machine-addressable units of memory ranged from one bit to 60 bits. Given that variety, mandating any specific behavior would have killed performance on processors that didn't match it, so they left it "undefined".
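That legacy is still visible today: signed integer overflow remains undefined in C. Here's a minimal sketch (illustrative only; the actual outcome depends entirely on the compiler and target):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int n = INT_MAX;
        /* Undefined behavior: a wrapping twos' complement machine might
           print INT_MIN, a trapping machine might raise a signal, and a
           modern optimizer is free to assume overflow never happens and
           transform the surrounding code accordingly. */
        n = n + 1;
        printf("%d\n", n);
        return 0;
    }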
There were other sorts of incompletely specified behavior. "Implementation-defined" usually referred to fairly minor differences, such as how wide an "int" was; the implementation had to pick a behavior and document its choice.
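For example (assuming a hosted implementation, so <limits.h> and <stdio.h> are available), you can ask the implementation what it chose:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* Implementation-defined: the standard only guarantees that int
           covers at least -32767..32767; the actual width varies by
           platform, and each implementation documents what it provides. */
        printf("sizeof(int) = %zu bytes, INT_MAX = %d\n",
               sizeof(int), INT_MAX);
        return 0;
    }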
"Unspecified" behavior usually referred to cases where there were a few obvious choices, such as the order of evaluation of function arguments (sketched below). Whether or not it was a good idea, C generally labeled the more complex potential incompatibilities as undefined. Personally, I'd like to see less "undefined behavior", with "implementation-defined" or "unspecified" substituted wherever possible.
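Here's a minimal sketch of that evaluation-order case (the helper names are mine, invented for illustration):

    #include <stdio.h>

    static int side(const char *name) {
        printf("%s argument evaluated\n", name);
        return 0;
    }

    static void f(int a, int b) { (void)a; (void)b; }

    int main(void) {
        /* Unspecified behavior: a conforming compiler may evaluate
           side("first") before or after side("second"); both output
           orders are legal, and the program is otherwise well-defined. */
        f(side("first"), side("second"));
        return 0;
    }

Note the difference from the overflow example above: here every possible outcome is a sane one the compiler could have picked, which is exactly what separates "unspecified" from "undefined".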