In this case, the semantics have _not_ changed. You just can't pretend, from the user side, that nasal demons are doing predictable things... The semantics have never included the observed behavior of random faulty programs, and never will. In C, when you write a faulty construct that has undefined behavior, there is no way to predict what will really happen on a particular system. Even when you know both the exact compiler version and the exact glibc version: any unrelated change in the same file could trigger different optimizations and change how your program fails, or even mask the failure completely.
The standard preconditions users have to observe _are_ part of the semantics, and neither those nor the correct behavior of glibc has changed as long as you do observe them, so this particular memcpy optimization is in no way a major change. Actual preconditions are often relaxed in a given implementation, but unless the relaxation is documented in an additional standard, the way they are relaxed will never be the same between two implementations, or between two versions of the same one, so nobody can claim to reliably take advantage of undocumented relaxed preconditions.
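For the record, the precondition at issue is presumably the well-known one: the C standard (C99 7.21.2.1) leaves the behavior of memcpy undefined when the source and destination overlap, while memmove is defined exactly for that case. A minimal sketch (the buffer contents and sizes are made up for illustration):

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[16] = "abcdefgh";

    /* Undefined behavior: the regions overlap.  A forward-copying
     * memcpy often happened to "work"; a backward-copying or
     * vectorized one may produce garbage.  Both outcomes are allowed,
     * and neither is a bug in glibc. */
    /* memcpy(buf + 2, buf, 8); */

    /* Defined behavior: memmove handles overlapping regions, which is
     * what such code should have used all along. */
    memmove(buf + 2, buf, 8);

    printf("%s\n", buf);   /* prints "ababcdefgh" */
    return 0;
}
```

A program that made the first call and "worked" for years was never correct; it merely never got caught.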
If that particular memcpy change were considered a major change, _every_ glibc change would have to be considered a major change.
In other words, when a language has defined, from the beginning of time, that attempting certain operations results in undefined behavior (and has been consistent about that definition ever since), and when a system does not provide further guarantees, then it does not matter what the observed behavior is with version X of the compiler, version Y of the libc, and processor Z with die revision T -- changing any of X, Y, Z, or T, or even seemingly unrelated parts of the faulty program, can make it explode violently, and by Murphy's law eventually will. It will still do so even if you blame the glibc developers for your own mistakes.
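To make the "changing X changes how you fail" point concrete, here is a hedged illustration (my own example, not from this thread): signed integer overflow is undefined in C, so a compiler is entitled to assume it never happens, and merely changing the optimization level changes what the faulty code appears to do.

```c
#include <limits.h>
#include <stdio.h>

/* Intended as an overflow check, but it relies on undefined behavior:
 * when x == INT_MAX, evaluating x + 1 overflows a signed int. */
static int will_overflow(int x)
{
    return x + 1 < x;
}

int main(void)
{
    /* At -O0 this typically prints 1 on two's-complement hardware; at
     * -O2 many compilers fold `x + 1 < x` to 0 and print 0.  Both are
     * permitted, because the program's behavior is undefined. */
    printf("%d\n", will_overflow(INT_MAX));
    return 0;
}
```

No compiler flag "broke" this program; it was broken the day it was written.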
Every C programmer should know the distinction between implementation-defined behavior, undefined behavior, and unspecified behavior -- anyone who doesn't should program in another language... You'd also better have some notion of how compilers, sometimes in ways involving their associated libraries, take advantage of explicitly undefined and unspecified behaviors to optimize. Asking them to stop doing that would be hugely ridiculous, on the same level as asking hardware designers to stop simplifying boolean equations by exploiting "don't care" outputs, or asking compilers to stop automatically factoring out redundant computations.
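For reference, a minimal sketch of the three categories, with examples of my own choosing (any of the usual ones would do):

```c
#include <stdio.h>

static int f(void) { return printf("f "); }
static int g(void) { return printf("g "); }

int main(void)
{
    /* Implementation-defined: the implementation must choose a
     * behavior and document it, e.g. the result of right-shifting a
     * negative signed value (C99 6.5.7). */
    printf("%d\n", -8 >> 1);      /* -4 on most platforms, but documented */

    /* Unspecified: the implementation chooses among a set of allowed
     * behaviors and need not document, or even stick to, its choice,
     * e.g. the evaluation order of the operands below.  This line may
     * print "f g (4)" or "g f (4)". */
    printf("(%d)\n", f() + g());

    /* Undefined: the standard imposes no requirements at all, e.g.
     * out-of-bounds access, signed overflow, overlapping memcpy. */
    int a[4] = {1, 2, 3, 4};
    /* int oops = a[4];   -- anything may happen; left disabled */
    (void)a;

    return 0;
}
```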
If you know the difference but just don't like that C has undefined behavior, or that C compilers and the associated system stacks target efficient code, sometimes by taking advantage of explicitly undefined behavior, well, that is not going to change anytime soon. So in that case you don't have any choice either: use another language.