Worth a read: Paul Karger and Roger Schell have released a new paper
entitled "Thirty Years Later: Lessons from the Multics
Security Evaluation." It includes an analysis of the security of the
Multics operating system, written by the same two authors and published in
1974, along with a new foreword describing how things have changed in the
meantime. Their assessment of the current state of computer security is
not encouraging:
The unpleasant conclusion is that although few, if any,
fundamentally new vulnerabilities are evident today, today's
products generally do not even include many of the Multics security
techniques, let alone the enhancement identified as essential.
That essential enhancement is the creation of a verifiable "security kernel"
around which the rest of the system could be built. In 2002, very few
systems built around such kernels exist, and the authors are not
enthusiastic about those that do:
...the ring 0 supervisor of Multics of 1973 occupied about 628K
bytes of executable code and read-only data. This was considered
to be a very large system. By comparison, the size of the SELinux
module with the example policy code and read-only data has been
estimated to be 1767K bytes. This means that just the example
security policy of SELinux is more than 2.5 times bigger than the
entire 1973 Multics kernel and that doesn't count the size of the
Linux kernel itself. Given that complexity is the biggest single
enemy of security, this suggests that the complexity of SELinux
needs to be seriously examined.
Or, to put things in more general terms:
Given the understanding of system vulnerabilities that existed nearly
thirty years ago, today's "security enhanced" or "trusted" systems would
not be considered suitable for processing even in the benign closed
environment.
So how do we make things better? The paper does not provide a whole lot of
new suggestions. The authors talk some about the tools that are used; for
example, Multics was mostly free of buffer overflow vulnerabilities, thanks
to the use of PL/I as the implementation language. PL/I required an
explicit declaration of the length of all strings.
The net result is that a PL/I programmer would have to work very
hard to program a buffer overflow error, while a C programmer has
to work very hard to avoid programming a buffer overflow error.
Beyond that, one gets the sense that the authors feel they said what needed
to be said thirty years ago, and they are still waiting for the message to
get across. Their prediction:
It is unthinkable that another thirty years will go by without one
of two occurrences: either there will be horrific cyber disasters
that will deprive society of much of the value computers can
provide, or the available technology will be delivered, and
hopefully enhanced, in products that provide effective security.
The authors hope for the latter scenario; so do we.