LCA: Bruce Schneier on the two sides of security

By Jonathan Corbet
January 30, 2008
The conference portion of linux.conf.au opened on Wednesday morning with a keynote by Bruce Schneier. LCA is a sold-out event; in fact, there are rather more attendees than can fit into the hall where the keynotes are held. Thus the room was packed, with the second-class citizens - those with yellow badges who put off registration until late - watching a remote feed in a separate room. Those folks may have had a more distant experience, but it was almost certainly a cooler one too.

Bruce's key point is that we need to rethink how we try to achieve security, though it took a while to explain just why that is. Security, he says, has two components:

  • The feeling of security: that which helps us to sleep well at night.

  • The reality of security: whether we are, in fact, secure.

These two aspects of the problem are entirely separate from each other, but they both have to be addressed if our security goals are to be achieved.

Security is always a set of tradeoffs, ones which we all make every day. As an example, consider that, in all likelihood, nobody in the audience was wearing a bulletproof vest. It's not that the vests do not work; instead, nobody feels that the cost of wearing a bulletproof vest is justified given the risk. On a bigger scale, the answer to the question of how to prevent more 9/11-like attacks is clear: ban all aircraft. In fact, that was done in the US for a few days after those attacks, but, in the longer term, that is not a tradeoff that people are willing to make.

So the fundamental question for any security tradeoff is: is it worth it? As it happens, we are quite bad at making that decision. We tend to respond to feelings rather than reality. Spectacular risks drive us more than everyday risks. We fear the strange over the familiar and the personified (think Osama bin Laden) over the anonymous. Involuntary risks are seen as being bigger than those entered into voluntarily. In the end, evolution has equipped us quite well for making tradeoffs in the small communities we lived in many, many thousands of years ago. We are less well equipped for the world we live in now.

Since we respond to feelings more than reality, there are strong economic incentives for solutions which address feelings. The result is snake-oil products and security theater. Sometimes people notice that they are being sold bad security (later Bruce mentioned a US survey indicating that the Transportation Security Administration is now less trusted than the taxation agency), but, all too often, they don't. They have a poor understanding of the risks and the costs involved, and there are plenty of people with strong interests in confusing the issue.

The security market is a lemons market: one where buyers and sellers have asymmetric access to information. Economic research shows that, in such markets, the bad products tend to drive the good ones out. There is no easy way to evaluate the work which has gone into the creation of a truly secure product, so buyers respond to other, less reliable signals: things like price, sales claims, or the pronouncements of the Gartner Group. These signals are sloppy and prone to manipulation. When security is outsourced to outside agencies - governments, say - the problem gets even worse.
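
The lemons dynamic is easy to demonstrate with a toy model. The sketch below (not from the talk; all numbers are invented for illustration) follows Akerlof's classic setup: buyers cannot observe quality, so they bid on the average quality still for sale, and each round the better products are withdrawn.

    # Toy "market for lemons" -- illustrative numbers only.
    # Buyers cannot see quality, so they offer a price based on the average
    # worth of what is still for sale; sellers whose product is worth more
    # than the offer withdraw it, dragging the average down further.
    sellers = [90, 70, 50, 30, 10]   # what each product is worth to its seller
    premium = 1.2                    # buyers value any product 20% above that

    while sellers:
        offer = premium * sum(sellers) / len(sellers)
        remaining = [worth for worth in sellers if worth <= offer]
        if remaining == sellers:     # nobody withdraws: equilibrium reached
            break
        sellers = remaining

    print("still on the market:", sellers)   # [10] -- only the lemon is left

Note that every trade here would have been mutually beneficial - buyers value every product above its seller's price - yet the information asymmetry alone drives all but the worst product off the market.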

In the business world, information eventually brings some order to a lemons market. As businesses learn about what really works, access to information evens out - though there is always a problem with very rare, high-cost events where information is not available. In the individual world, though, it is much harder, because fear plays a much bigger role.

The fact of the matter is that fear is wired deeply into how we work - it is a result of a very old part of our brain. As humans, we have the ability to override our fears when reason indicates that we should, but it is a hard thing to do. The default state is that fear rules. So this is Bruce's core point: the feelings matter. All that security theater out there is not entirely stupid; any security solution must address the fears that people feel. We must address both aspects of security.

The problem is where the feeling of security and the reality of security diverge from each other. If only feelings are addressed, security has not really been achieved. If only the reality of security is addressed, people feel insecure and may make bad decisions. Either way, the full problem has not been solved. Addressing this all-too-common problem is hard, though; Bruce knows of no better way than the spreading of good information.

Your editor's perspective follows - nothing from this point on was said during the talk. It seems that he has a point here. Consider some common situations in the free software world:

  • A large number of security updates from a distributor may be an indication that the reality of security is being achieved: problems are being found and fixed before they are exploited. But all those updates can undermine the feeling of security. The seemingly endless stream of Wireshark updates is a case in point; most of these problems are found through proactive auditing by the developers and have never been exploited by the Bad Guys. But the feeling of insecurity associated with Wireshark can be strong. This feeling can push users toward other software which, while not having that long history of security updates, is actually less secure.

  • A system running SELinux may, in fact, be highly secure. But many administrators still turn it off. SELinux does not make them feel secure because they do not understand it, and they fear (rightly or wrongly) that it will interfere with the proper operation of the system. But, by turning it off, they undoubtedly expose themselves to a number of attacks which SELinux would block.

We should hear Bruce's point and think a bit more about how we can ensure that free software creates the feeling of security - but a feeling which is backed up by real security. It's a hard problem, one which lacks technical solutions. But we'll find ourselves less secure than we would otherwise be if we do not address that side of the issue.



Excellent stuff!

Posted Jan 31, 2008 11:40 UTC (Thu) by pr1268 (subscriber, #24648) [Link]

Thank you, Jon, for this article. This article (and the ten-year timeline part 4, above) certainly contributes to my being a satisfied subscriber. :-)

My own thoughts, comments, and epiphanies below:

  • I openly admit to having been loath to run Wireshark due to its well-publicized patch rate. I was unaware that this is actually the result of proactive security (instead of reactive). Assuming this is the case, the Wireshark developers should serve as a model for all application developers working on security-sensitive software.
  • My own experience with SELinux (way back in the FC2 days) was that controlling SELinux on a system was akin to running a nuclear power plant from the control room. So many "dials", "knobs" and "switches" to control, and accidentally throwing one wrong switch could scram the whole system.
  • The Transportation Security Administration is the most egregious example of a purely reactionary, horribly-implemented sorry excuse for an ineffective government bureaucracy I've ever seen. Ironically, I'm not sure that the American public even gets any feeling of security from the TSA. I personally don't perceive that I'm any safer (in reality) due to the TSA. But, oh well, taxpayers are willing to fund the feeling of security, and even I can't blame others for these feelings after witnessing the events of 9/11.

Just my $0.02 and change...

Excellent stuff!

Posted Jan 31, 2008 11:48 UTC (Thu) by nix (subscriber, #2304) [Link]

Wireshark uses privilege separation now, so problems in the packet dissectors will only
compromise the low-privilege account used to do the packet dissection. :)
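
For those unfamiliar with the technique, the general pattern nix describes looks something like
the sketch below. This is not Wireshark's code (Wireshark is written in C); it is just a
minimal Python illustration of fork-and-drop-privileges, with a hypothetical parse_packet()
standing in for the dissectors. It must be started as root for the setuid calls to succeed.

    # A minimal privilege-separation sketch (not Wireshark's actual code):
    # fork a child, drop to an unprivileged user there, and only then let
    # it touch untrusted input. A bug in parse_packet() can then compromise
    # only the "nobody" account rather than root.
    import os
    import pwd

    def parse_packet(raw):           # hypothetical stand-in for a dissector
        print("dissecting %d bytes" % len(raw))

    def dissect_unprivileged(raw):
        pid = os.fork()
        if pid == 0:                 # child: shed privileges before parsing
            nobody = pwd.getpwnam("nobody")
            os.setgroups([])         # drop supplementary groups as well
            os.setgid(nobody.pw_gid) # group first, while we can still switch
            os.setuid(nobody.pw_uid) # after this there is no way back to root
            parse_packet(raw)
            os._exit(0)
        os.waitpid(pid, 0)           # parent keeps its privileges for capture

    dissect_unprivileged(b"\x00\x01\x02")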

LCA: Bruce Schneier on the two sides of security

Posted Jan 31, 2008 14:04 UTC (Thu) by ikm (subscriber, #493) [Link]

My solution to the problem: watch less TV.

Mitigation strategy

Posted Feb 2, 2008 0:45 UTC (Sat) by man_ls (guest, #15091) [Link]

That will protect you from paying attention to what Schneier calls movie-plot threats (to an extent, as long as you don't go to the movies). But what I gather from his speech is that we are wired to be like this; unfortunately you cannot change this fact with any strategy.

Maybe by watching less TV you can mitigate how "exploitable" this built-in fearful inner self is. TV tends to take advantage of our innermost fears, so maybe frequent watchers are more sensitized to security theater. But I tend to think that we actually are wired the way Schneier says: people in other ages (even before there was TV, kids) were as fearful and exploitable as we are now.

Mitigation strategy

Posted Feb 2, 2008 4:48 UTC (Sat) by ikm (subscriber, #493) [Link]

People are just dependent on other people -- is this news, really?

Mitigation strategy

Posted Feb 2, 2008 12:19 UTC (Sat) by man_ls (guest, #15091) [Link]

Not sure what you mean, but this has nothing to do with being dependent. What is really news is that we are so bad at evaluating danger. Schneier has posted innumerable examples; let me give you one more: many people are afraid of elevators, and certainly every time an elevator hits a small bump we all shudder. However, the rational risk is close to null: how many people do you know who have died in an elevator? The annual death toll is ludicrously small.

If we were completely rational we would shudder every time we got into a car -- a lot of people die every day in one of those. Our physical ways of evaluating danger are fit for a bunch of monkeys wandering in a savannah, but so outdated for today's world that they are funny. (In fact there is a whole sector of the economy based on this fact: amusement parks, where you feel fear for fun.)

You can thus imagine how well suited people are to evaluate computer dangers: very badly. Extensive training and experience are required just to perform rational assessments.

Mitigation strategy

Posted Feb 3, 2008 2:58 UTC (Sun) by ikm (subscriber, #493) [Link]

If we're talking about fear here, we're always talking about dependency, because fear is
always a result of being dependent. If we're talking about the inability to perform
assessments correctly, it is probably because of ignorance, a pure lack of information, or
disinformation, which can itself be a result of being dependent (on someone who does all the
assessments instead of you), or of some other conditions.

I would note that, anyway, I think our physical ways of evaluating danger are still much better
than just a bunch of worldwide statistical crap, because instead of believing in some generic
and unconditional statistical facts, we can take into account many things which are specific
to each situation. E.g., the fact that the driver is sober or drunk makes much difference,
don't you think? So maybe instead of thinking about the annual death rate, you should see how
good the driver is, what kind of shape the car's in, and so on? The fear of elevators can
indeed exist just because of the inability to assess their state (what's there under the hood
anyway? do you know how this crap works? is it really safe? have you ever seen the internals?
many questions. here we're totally dependent on the good will of the people who maintain the
elevator, hence we have fears).

Mitigation strategy

Posted Feb 3, 2008 12:37 UTC (Sun) by man_ls (guest, #15091) [Link]

> If we're talking about fear here, we're always talking about dependency, because fear is always a result of being dependent.
I'm not sure I follow you here. If a pack of wolves suddenly appears behind me, how is the fear I feel a result of being dependent? I'm dependent on what exactly, on the wolves? On me? On some other people appearing and saving me? If I just hear wolves howling and the hairs on the back of my head suddenly all stand up, where is the dependency? Or when I find a snake in the grass and my palms get all sweaty? When lightning strikes beside my tree? I'm just trying to understand your statement, honestly.

It is precisely for these kinds of fears that we are very well equipped. For the rest, not so much. You argue that a bunch of statistical numbers are not meaningful, and for some perils you are right: a careful assessment is better than a generic one. But e.g. with elevators we are not talking about a high risk or a low risk; statistics tell us that casualties due to cabin falls are zero, or so close to zero that they are not meaningful. 6 passenger deaths per year in the US, mostly due to falls into an open shaft and entanglement of clothes in the door. We don't depend on the internals or the people who maintain them; we shouldn't even worry about cabin falls. In short: they are safe devices, in the same league as escalators. When the cabin bumps on its way there should be no reason to be fearful, and yet we cannot stop our hearts from racing.

In contrast, with cars we all know that regardless of the condition of driver and car we are dependent on the good will of all other drivers. Even if everything else is in perfect condition, if a drunk driver invades your lane or doesn't stop at a red light you are done. Here statistics and anecdotal evidence tell us that we are in peril every minute we spend in a car. How come we feel cozy and secure in our vehicles? Once more, bad judgment.

Mitigation strategy

Posted Feb 3, 2008 18:16 UTC (Sun) by ikm (subscriber, #493) [Link]

> I'm not sure I follow you here. If a pack of wolves suddenly appears behind me, how is the
fear I feel a result of being dependent? I'm dependent on what exactly, on the wolves? On me?
On some other people appearing and saving me? If I just hear wolves howling and the hairs on
the back of my head suddenly all stand up, where is the dependency? Or when I find a snake in
the grass and my palms get all sweaty? When lightning strikes beside my tree? I'm just trying
to understand your statement, honestly.

You feel fear because you can't do much about the situation. You're dependent on something
else which would resolve the situation. You have to hope that the wolves aren't after you,
that you're not stepping on a snake, or that the lightning isn't striking at you. You're
dependent on the whimsical mercies of chance. You're dependent just because you don't seem
to be able to resolve the situation yourself.

> But e.g. with elevators we are not talking about a high risk or a low risk; statistics tell
us that casualties due to cabin falls are zero, or so close to zero that they are not
meaningful.

Who cares about what they say? I've met many people who were saying many different things. Why
would I want to believe? Let me have my own statistics and draw my own conclusions. What I
know is that falling from great heights is dangerous and can be lethal, that metals are very
tough, that the elevator's engines are very powerful and can easily tear me apart -- that's
what I KNOW. Don't you think it kinda contradicts what these statistics of yours say? Why
would I want to believe them then?! You can say that nuclear power plants are safe, and I
would never agree -- just because they inherently contain sources of danger, no matter how
perfectly confined they are. Same with elevators.

While I personally don't have any fear of elevators, I assert that the line of thought I
presented is totally legitimate and has its merits, and I also think this is the way any
living being makes its assessments.

> In contrast, with cars we all know that regardless of the condition of driver and car we are
dependent on the good will of all other drivers. Even if everything else is in perfect
condition, if a drunk driver invades your lane or doesn't stop at a red light you are done.

I would disagree here. A good driver is not just someone who knows how to turn left and right
and how to tell red from green, but a person who actually knows his stuff, sees problems
coming, sees when other people don't behave right on the road, anticipates everything, and
doesn't get into trouble as a result.

What you push here is that we should trust somebody who is presumably much more clever than
we are. What I push is that, first of all, we should trust ourselves, and that if somebody
wants to earn our trust, he should indeed earn it first. I personally see no problem in how
people do their assessments -- they might not always be right, but they are doing the right
thing. If they are wrong, it's probably a lack of information -- but you can't just shove this
information down their throats and expect them to accept it. Most probably, it will be rejected
as being too different from what they know already -- and that's the right thing for them to do.

Mitigation strategy

Posted Feb 3, 2008 22:25 UTC (Sun) by man_ls (guest, #15091) [Link]

You point at a good strategy to make our irrational fears go away, or at least keep them under control: not knowing how something works can make us fearful. Therefore, learning how things work can take us a long way toward controlling our fears. And that is exactly what engineers have been doing since before recorded history: learning how things work and then controlling them. That is how people learned to build boats and entered the sea; how they built huge temples which defied our sense of stability; and even how they built those megaliths which still amaze us.

The point is, even if you know the Bernoulli effect by heart, even if you understand the principles of aeronautics and have compiled flight crash statistics yourself, you may not be able to stop your palms from sweating the first time your plane lifts into the air. Or even the hundredth time. Still, it's not that bad for a grassland monkey! :D

Mitigation strategy

Posted Feb 4, 2008 0:40 UTC (Mon) by ikm (subscriber, #493) [Link]

All fears are very rational in the end, so if your palms still sweat after the 100th time, try
another approach at understanding what you're actually afraid of :)

Mitigation strategy

Posted Feb 7, 2008 17:09 UTC (Thu) by dkite (guest, #4577) [Link]

I wouldn't like to fly with a pilot who doesn't fear what he does. In 
other words, a mistake can kill. The fact that this fear is real and 
vivid and acted upon makes air travel as safe as it is.

People act foolishly when afraid because they don't know what to do. On 
the other hand, people are regularly hurt or killed at their workplace 
because they didn't know that they should be afraid.

How many times have people in this readership been afraid to apply a 
change to a working system? The fear moves you to double check, get other 
input, set up a test system, whatever.

Personally, when I fear things that I encounter regularly, I find out 
what to do. Not to allay fear, but to know how to act safely.

Derek (who in his work is regularly in situations that could kill him)

LCA: Bruce Schneier on the two sides of security

Posted Feb 1, 2008 12:24 UTC (Fri) by kleptog (subscriber, #1183) [Link]

The thought occurred to me that perhaps this is one of the things MS did get right. By having
their Patch <day of the week> they provide the feeling that it's all planned, everything is
under control. I wonder what would have happened if the Wireshark guys had announced they were
doing a proactive audit and that a new release would happen every first day of the month with
all the issues found in the last month.

Now, the free software community is too large to coordinate anything like that. But imagine if
a distributor decided that all non-critical security updates would happen only on Wednesdays:
would people "feel" safer due to it being planned, even though you're sacrificing a little
security (a few days' delay)?

LCA: Bruce Schneier on the two sides of security

Posted Feb 1, 2008 21:33 UTC (Fri) by jamesh (guest, #1159) [Link]

If you are going to delay disclosure of vulnerabilities, then you need to make sure you aren't
leaking information about those vulnerabilities before that date.

If the project uses CVS or Subversion, then there is no reason that the bad guys wouldn't be
watching the commits.  The contents of the commits may be enough for such a person to deduce
the vulnerability and be able to exploit it in the window the developers have provided (in
addition to the time it takes for people to patch their systems).

So you really want to delay exposure of the commits until the point where you disclose the
vulnerabilities.  With a public CVS/Subversion server, that probably means not committing the
work until that point, which is not particularly helpful if you have multiple vulnerabilities
to track.

If you really do want to batch up the security vulnerabilities, perhaps one of the distributed
VCS systems would be appropriate.  The ability to perform disconnected development also means
that it is possible to keep a line of development private but disclose it at a later date with
full history, which is what is wanted here.
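
To make that concrete, here is a rough sketch of the workflow using git as the distributed
VCS (the repository path, branch, and remote names below are hypothetical, and git is only
one of the systems that would fit):

    # Keep security fixes in a private clone, then publish them -- full
    # history included -- on the coordinated disclosure date.
    import subprocess

    def git(*args):
        subprocess.run(["git", *args], cwd="project-private", check=True)

    # Before disclosure: commits exist only in the private clone.
    git("checkout", "-b", "security-fixes")
    git("commit", "-a", "-m", "Fix dissector buffer overflow")  # one per issue

    # On disclosure day: push the branch and its history to the public server.
    git("push", "public-server", "security-fixes:master")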

SELinux: Tradeoff and "felt" vs. "real" security

Posted Feb 1, 2008 16:27 UTC (Fri) by thias (guest, #425) [Link]

Hi Jonathan,

There are at least two different angles on SELinux in the context of Bruce's LCA keynote:
the nature of security as a tradeoff, and the difference between "felt" and "real" security.

If you consider security as a tradeoff, then the fact that SELinux is rather infrequently
deployed is at least a hint that SELinux is a bad tradeoff: most people (and I mean
professional sysadmins) tend to think that the added complexity of SELinux is likely not a
source of more security - quite the converse.  If you (as a sysadmin) do not understand how
things work and why, you will make bad decisions, and that will make your "real" security
worse.

That does not mean that SELinux is "not secure" - if you are in need of a bulletproof vest,
then please use it!  You have to learn all the necessary stuff about SELinux and you have to
deploy it in a thought-out manner, and it will increase your security (considerably!).  But
for most security needs, the tradeoff is bad.

You state that "a system running SELinux may, in fact, be highly secure".  I would like to
stress the "may": you just need a small error in your ACLs (which is easily made and not so
easy to detect) or in one of the many SELinux knobs to play with, and your security turns
from "real" to purely "felt".  And while "felt" security is relevant, as Bruce points out,
"felt" without "real" is a real problem :)


regards, thias

PS: I'm a "first day" subscriber and a quite happy one!  Since the topics of Bruce's keynote
touch my professional habitat, I just felt the need to comment for the first time :)  Please
keep up the good work at LWN.

Lemons vs silver bullets ?

Posted Feb 3, 2008 13:50 UTC (Sun) by anchorsystems (subscriber, #40101) [Link]

There is a good blog post at Financial Cryptography that goes into
more detail than Bruce regarding the "lemons" market in security:

https://financialcryptography.com/mt/archives/000896.html

LCA: Bruce Schneier on the two sides of security

Posted Feb 13, 2008 13:39 UTC (Wed) by ekj (subscriber, #1524) [Link]

SELinux is a bad example.

I, like most sysadmins I know, have been turning it off. Not out of any irrational fear, as you suggest, but rather precisely for the reason one should, according to the Schneier you quote: for many people it just plain isn't worth it.

I assume I'm some uncertain amount safer when I have it turned on; it's hard to say precisely how much, but it'll certainly have some positive effect and prevent some types of attack from succeeding.

But I -KNOW- from personal first-hand experience that:

  • It is complex. Complexity is -bad- for security.
  • I don't understand it. Not even after having spent probably a week specifically trying to understand it. Possibly I'm just stupid, but that's the way it is.
  • It takes a lot of time to configure it correctly for any non-trivial setting.
  • Having it turned on causes a lot of headaches with stuff that otherwise "just works".
Put differently: the COST of running with SELinux is known and HIGH. The benefit is unknown, but assumed moderately positive. Not positive enough to justify turning it on, though.

Being more secure does not help if the added work is MORE than the gain in security. I don't use SELinux for the same reason I wouldn't support banning all airplanes; both would probably improve security, but the cost is too high.

Copyright © 2008, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds