Why do most languages have the integer overflow problem at all? Integer overflow can be detected at runtime, and the language can do something intelligent, like throw an exception. Even C as standardized doesn't let you overflow a signed integer: it's undefined behavior, but wrap-around semantics are assumed so often in practice that optimizing on the no-overflow assumption breaks many programs.