This bug made me think about other scenarios where optimizations can change the meaning of the code:
The program dereferenced the pointer before checking it for NULL. Since dereferencing a null pointer is undefined behavior, the compiler "inferred" that the pointer could never be null at that point, and optimized the later check away.
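A minimal sketch of the pattern (the names here are illustrative, not the actual code from the bug): because the dereference comes first, the compiler may assume the pointer is non-null and delete the check that follows.

```c
#include <stddef.h>

struct dev { int flags; };

/* Sketch of the dangerous pattern: the dereference on the first line
 * lets the compiler assume `d` is non-null, so under optimization the
 * NULL check below may be removed entirely. */
int get_flags(struct dev *d) {
    int flags = d->flags;   /* dereference happens first */
    if (d == NULL)          /* compiler may delete this check */
        return -1;
    return flags;
}
```

With a valid pointer the function behaves identically either way; the danger is that the NULL path the author wrote as a safety net silently disappears.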
That's valid logic, but some code intentionally relies on undefined behavior. For example, OpenSSL tries to gather extra entropy by reading an uninitialized buffer. Reading uninitialized memory is itself undefined behavior, so if a compiler optimization is ever introduced that assumes uninitialized variables are never read, it could silently discard that code, and the entropy along with it.
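An illustrative sketch of the idea (this is not OpenSSL's actual code): mixing the contents of an uninitialized stack buffer into an entropy pool. Because `junk` is never written, reading it is undefined behavior, and an optimizer is free to skip the copy or substitute any value it likes.

```c
#include <string.h>

/* Hypothetical example: copy "random" stack garbage into a pool.
 * `junk` is deliberately left uninitialized, so the memcpy reads
 * indeterminate bytes -- a compiler exploiting this UB could
 * legally replace the copy with nothing at all. */
size_t add_stack_junk(unsigned char *pool, size_t len) {
    unsigned char junk[64];           /* deliberately uninitialized */
    size_t n = len < sizeof junk ? len : sizeof junk;
    memcpy(pool, junk, n);            /* UB: source is indeterminate */
    return n;                         /* bytes "mixed" into the pool */
}
```

The return value is well-defined even though the bytes copied are not, which is exactly why such code can pass tests for years and then break under a newer compiler.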