Huh? The difference between any two unsigned chars has a range of -UCHAR_MAX to UCHAR_MAX, thus the int return type. (I suppose it could have been short, but there are implementations where sizeof (short) is 1, the same as sizeof (char).) Tell me how you could arrive at any other behavior from either the BSD or Linux man pages. You can't. They're equivalent definitions.
The fundamental issue is people not understanding (or at least not applying their knowledge of) implicit conversions in C, and I'm entirely unsurprised that MySQL ran afoul of the rules. I had to emulate the MySQL password code in an asynchronous client library several years ago. I took one look at MySQL's code and my head spun. It's classic I-know-enough-C-to-be-dangerous. I translated to paper the algorithm that it was attempting, poorly, to implement, and then promptly purged my mind of the actual code so it wouldn't infect my own implementation.