Security in the 20-teens
Posted Feb 3, 2010 8:46 UTC (Wed) by paulj (subscriber, #341) In reply to: Security in the 20-teens by nix
Parent article: Security in the 20-teens
Note that we don't know how hard this bar would be. There are things like
'clumping' of expertise, such that in any specialised area of a technical
field the people working in it tend to be drawn from a much smaller group than
the set of all people qualified in the field. I.e. the set of people who *write*
compiler A is far from independent of those who author compiler B. Hence
your assumption that the attacker would have to *hack* into the other
compiler is unsafe. Rather, they could simply transition from working on A to B,
either as part of their normal career progression or at least seemingly so.
Next, as dwheeler also notes in his paper, it may be hard to obtain another
unsubverted compiler. Indeed, looking carefully at his work, it seems his proofs
specifically require one compiler-compiler that can be absolutely trusted to
compile the general compiler correctly, as the starting point of the process. (I
thought at first that perhaps a set was sufficient, such that you didn't have
to know which compiler was trustable, as long as you could be confident at
least one compiler was.) See the first sentence of 8.3 in his thesis, and the
multiple discussions of the role of a trusted compiler in the DDC process.
So this still seems to boil down to "you have to write (or verify all the source
of) your compiler in order to really be able to trust it".
I'm not pooh-poohing the work per se, just saying this good work is slightly
marred by the overly grand claim made in its title.
Posted Feb 3, 2010 12:33 UTC (Wed) by paulj (subscriber, #341)
There's nothing to stop the author of a compiler subverting its binaries such
that it *generally* infects all binaries it touches, such that those binaries then
infect all other binaries they touch (e.g. by hooking open), and this infection
could also introduce system-binary-specific attacks as/when it detected it
was running as part of those programmes.

Thinking in terms of a compiler specifically looking for login is ignoring the huge
advances made in virus design since Thompson wrote his.

I.e. in this discussion we're assuming DDC means you need to subvert 2
compilers. However that's not the case, nor is it even supported by the
thesis being discussed.

Anyway.
Posted Feb 4, 2010 22:39 UTC (Thu) by dwheeler (guest, #1216)
An author can do that, but such an author risks instantaneous detection. The more general the triggers and payloads, the more programs that include corrupted code... and thus the more opportunities for detection.
For example, if compiling "hello world" causes a corrupted executable to be emitted, then you can actually detect it via inspection of the generated executable. Even if the system shrouds this, examining the bits at rest would expose this ruse.
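As a toy illustration of that kind of at-rest inspection (the byte strings and names below are invented for illustration, not taken from the dissertation): if the same trivial program is built with two independent toolchains, a trojan that corrupts every output shows up as an otherwise unexplained divergence between the two executables:

```python
import hashlib

# Hypothetical sketch: these byte strings stand in for two builds of
# "hello world" made with independent toolchains.
build_a = b"\x7fELF...hello-built-with-toolchain-A"
build_b = b"\x7fELF...hello-built-with-toolchain-B"

def digest(blob: bytes) -> str:
    # Hash the executable's bytes at rest, on a separate machine,
    # so a subverted running system cannot shroud the comparison.
    return hashlib.sha256(blob).hexdigest()

if digest(build_a) != digest(build_b):
    # A difference is only a starting point: honest compilers also
    # produce different bytes, so the next step is inspection of the
    # diverging regions, not an immediate alarm.
    print("outputs differ; inspect both binaries")
```

Of course, differing hashes alone prove nothing; the point is only that a trigger broad enough to fire on "hello world" leaves far more material for an inspector to stumble over.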
Besides, as I talk about in the dissertation, the "compiler" you use does NOT need to simply include a compiler as it's usually considered. You can include the OS, run-time, and compiler as part of the compiler under test. You need the source code for them, but there are systems where this is available :-).
I have an old SGI IRIX machine that I hope to someday use as a test on a Linux distro with glibc and gcc. In this case, I have high confidence that the IRIX is as-delivered. I can feed it the source code, and produce a set of executables such as OS kernel, C run-time, and compiler as traditionally understood. If I show that they are bit-for-bit identical, then either (1) the SGI IRIX system executable suite when used as a compiler has attacks that work the same way against the Linux distro written many years later, or (2) the Linux distro is clean.
I talk about expanding the scope of the term "compiler" in the dissertation.
> I.e. in this discussion we're assuming DDC means you need to subvert 2 compilers. However that's not the case, nor is it even supported by the thesis being discussed.
Sure it is, and the thesis proves it. However, be aware that I very carefully define the term "compiler". In the dissertation, a compiler is ANY process that produces an executable; it may or may not do other things. For example, a compiler may or may not include the OS kernel, runtime, etc. Anything NOT included in the compiler-under-test is, by definition, not tested. If you want to be sure that (for example) the OS kernel doesn't subvert the compilation process, then you include it as part of the compiler-under-test during the DDC process.
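A toy model of that DDC comparison may help (all names and the string "binaries" here are invented for illustration; the real process in the dissertation runs actual compilers and diffs the resulting executables bit-for-bit):

```python
CC_SOURCE = "cc-source-v1"  # hypothetical source of the compiler under test

def run(binary: str, source: str) -> str:
    """'Execute' a compiler binary on some source, yielding a new binary.

    Binaries are modelled as behaviour-describing strings, not machine
    code. A Thompson-style trojan reinserts itself whenever it
    recognises the source of its own compiler.
    """
    if binary == "trojaned-cc" and source == CC_SOURCE:
        return "trojaned-cc"            # self-perpetuating subversion
    return "clean-build-of-" + source   # honest compilation

def ddc(trusted: str, suspect: str, source: str) -> bool:
    """Return True if `suspect` passes the DDC comparison."""
    stage1 = run(trusted, source)   # trusted compiler builds the source
    stage2 = run(stage1, source)    # stage1 rebuilds the same source
    regen = run(suspect, source)    # suspect compiler self-regenerates
    return stage2 == regen          # bit-for-bit comparison

print(ddc("trusted-cc", "trojaned-cc", CC_SOURCE))                  # False
print(ddc("trusted-cc", "clean-build-of-cc-source-v1", CC_SOURCE))  # True
```

Only one trusted compiler is needed, and anything not folded into the "compiler under test" (kernel, runtime, linker) is, as the comment above says, simply outside the scope of the check.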
Posted Feb 5, 2010 19:19 UTC (Fri) by paulj (subscriber, #341)
If yes, I bet it's using MD5 at best. Hashes seem to have quite limited lifetimes.
If no, how can you know the system today is as it was before? If you say "cause
it's been sitting in my garage", then how can I repeat your result? Perhaps you
will offer a compiler verification service, but then we're still back to Thompson's
point, surely?