In summary, the advisory states:
Application developers and vendors of large codebases that cannot be audited for use of the defective length checks are urged to avoiding [sic] the use of gcc versions 4.2 and later.
This advisory has disappointed a number of GCC developers, who feel that their project has been singled out in an unfair way. But the core issue is one that C programmers should be aware of, so a closer look is called for.
To understand this issue, consider the following code fragment:
char buffer[BUFLEN];
char *buffer_end = buffer + BUFLEN;
/* ... */
unsigned int len;

if (buffer + len >= buffer_end)
    die_a_gory_death("len is out of range\n");
Here, the programmer is trying to ensure that len (which might come from an untrusted source) fits within the range of buffer. There is a problem, though, in that if len is very large, the addition could cause an overflow, yielding a pointer value which is less than buffer. So a more diligent programmer might check for that case by changing the code to read:
if (buffer + len >= buffer_end || buffer + len < buffer)
    loud_screaming_panic("len is out of range\n");
This code should catch all cases, ensuring that len is within range. There is only one little problem: recent versions of GCC will optimize out the second test (returning the if statement to the first form shown above), making overflows possible again. So any code which relies upon this kind of test may, in fact, become vulnerable to a buffer overflow attack.
This behavior is allowed by the C standard, which states that, in a correct program, pointer addition will not yield a pointer value outside of the same object. So the compiler can assume that the test for overflow is always false; the test may thus be eliminated from the expression. It turns out that GCC is not alone in taking advantage of this fact: some research by GCC developers turned up other compilers (including PathScale, xlC, LLVM, TI Code Composer Studio, and Microsoft Visual C++ 2005) which perform the same optimization. So it seems that the GCC developers have a legitimate reason to be upset: CERT would appear to be telling people to avoid their compiler in favor of others which do exactly the same thing.
The right solution to the problem, of course, is to write code which complies with the C standard. In this case, rather than doing pointer comparisons, the programmer should simply write something like:
if (len >= BUFLEN)
    launch_photon_torpedoes("buffer overflow attempt thwarted\n");
There can be no doubt, though, that incorrectly-written code exists. So the addition of this optimization to GCC 4.2 may cause that bad code to open up a vulnerability which was not there before. Given that, one might question whether the optimization is worth it. In response to a statement (from CERT) that, in the interest of security, overflow tests should not be optimized away, Florian Weimer said:
Joe Buck added:
It is clear that the GCC developers see their incentives as strongly pushing toward more aggressive optimization. That kind of optimization often must assume that programs are written correctly; otherwise the compiler is unable to remove code which, in a correctly-written (standard-compliant) program, is unnecessary. So the removal of pointer overflow checks seems unlikely to go away, though it appears that some new warnings will be added to alert programmers to potentially buggy code. The compiler may not stop programmers from shooting themselves in the foot, but it can often warn them that it is about to happen.
Created: April 11, 2008    Updated: January 7, 2009
Description: From the Gentoo advisory: am-utils creates temporary files insecurely allowing local users to overwrite arbitrary files via a symlink attack.

Created: April 15, 2008    Updated: June 18, 2009
Description: From the CVE entry: libpng 1.0.6 through 1.0.32, 1.2.0 through 1.2.26, and 1.4.0beta01 through 1.4.0beta19 allows context-dependent attackers to cause a denial of service (crash) and possibly execute arbitrary code via a PNG file with zero length "unknown" chunks, which trigger an access of uninitialized memory.

Package(s): opera    CVE #(s): CVE-2008-1761 CVE-2008-1762 CVE-2008-1764
Created: April 15, 2008    Updated: April 16, 2008
Description: From the CVE entries: Opera before 9.27 allows remote attackers to cause a denial of service (crash) and possibly execute arbitrary code via a crafted newsfeed source, which triggers an invalid memory access. Opera before 9.27 allows remote attackers to cause a denial of service (crash) and possibly execute arbitrary code via a crafted scaled image pattern in an HTML CANVAS element, which triggers a memory corruption. Unspecified vulnerability in Opera for Windows before 9.27 has unknown impact and attack vectors related to "keyboard handling of password inputs."

Created: April 15, 2008    Updated: July 30, 2009
Description: From the CVE entry: Integer signedness error in the zlib extension module in Python 2.5.2 and earlier allows remote attackers to execute arbitrary code via a negative signed integer, which triggers insufficient memory allocation and a buffer overflow.

Created: April 11, 2008    Updated: May 9, 2008
Description: From the Debian advisory: Sebastian Krahmer discovered that an integer overflow in rsync's code for handling extended attributes may lead to arbitrary code execution.

Created: April 15, 2008    Updated: March 25, 2009
Description: From the Ubuntu advisory: It was discovered that Squid did not perform proper bounds checking when processing cache update replies. A remote authenticated user may be able to trigger an assertion error and cause a denial of service. This vulnerability is due to an incorrect fix for CVE-2007-6239.
Page editor: Jake Edge
Copyright © 2008, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds