Well, that, in my not-so-humble opinion, is something they got right.
As a general rule, if Y has knowledge about X and one needs attribute a of X, one should query Y for X.a: a single point of definition (SPOD).
Here, Y=the compiler, X=char and a=sizeof().
There is really no point in relying on sizeof(char) equaling 1. Demonstrating one's macho-ness, knowing all these little details about the language, to oneself and one's peers? Optimization? Any decent compiler will fold sizeof(char) away at compile time. Saving typing?
Hardwiring sizeof(char)==1 violates the SPOD principle.
In general, relying on this fact implicitly is even worse: it reduces the maintainability of the code. Sooner or later the char in my function is going to be changed, say to wchar_t. 'malloc((size_t) num)' is likely to be overlooked by me or, somewhat less likely, by my peers, leading to spurious problems. 'malloc(num * sizeof(char))' is likely to be caught on the first pass. Thus, using sizeof(char) makes my code more maintainable.