
Gilmore on the "computer health certificate" plan

From:  John Gilmore <gnu-AT-toad.com>
To:  cryptography-AT-metzdowd.com, Ray Dillinger <bear-AT-sonic.net>, gnu-AT-toad.com
Subject:  Re: Computer "health certificate" plan: Charney of DoJ/MS
Date:  Thu, 07 Oct 2010 03:16:28 -0700
Message-ID:  <201010071016.o97AGV1J003361@new.toad.com>

  http://www.bbc.co.uk/news/technology-11483008

BBC reports that Microsoft's idea seems to be that if your computer
doesn't present a valid "health certificate" to your ISP, then your
ISP wouldn't let it be on the net, or would throttle it down to a tiny
bandwidth.  The Health Certificate would, of course, be provided by
Intel and Microsoft, but only from machines with Treacherous Computing
hardware, after examining everything in your computer to make sure
Intel and Microsoft approve of it all.  (This is the same DRM
procedure they've been pushing for a decade -- the system would
cryptographically "attest" to arbitrary information about what's
running in your machine, using proprietary hardware and software you
have no control over and no ability to inspect, and the outsiders
would decide not to deal with you if they didn't like your
attestation.  The only change is that they've revised their goal from
"record companies won't sell you a song if you won't attest" to
"nobody will give you an Internet connection if you won't attest".)
Homebrew computers and Linux machines need not apply.  They don't
explain how this would actually be implemented -- in Ethernet
switches?  In DSL routers or NAT boxes?  In ISP servers?  They're not
quite sure whether the health certificate should *identify* your
device, but they're leaning in that direction.  But they're quite sure
that it all needs doing, by voluntary means or government coercion,
and that the resulting info about your "device health" should be
widely shared with governments, corporations, etc.

This proposal comes from Microsoft VP Scott Charney, well known to
many of us as the former Chief of the Computer Crime and Intellectual
Property Section in the Criminal Division of the U.S. Department of
Justice, or as he puts it, "the leading federal prosecutor for
computer crimes from 1991 to 1999".  He joined Microsoft in 2002 and
is running their "Treacherous Computing" effort as well as several
other things.

The vision that Charney is driving is described in six papers
here (one of which is the one the BBC is covering):

  https://www.microsoft.com/mscorp/twc/endtoendtrust/vision/

He's pushing the "Public Health Model" because public health
bureaucracies have huge, largely unchecked powers to apply force to
people they disfavor.  Along those lines, he converts the public
health departments' most draconian measure, used only in extreme
circumstances - quarantine - into the standard procedure for his New
Internet: quarantine EVERY device -- unless and until it "proves" that
it should evade the quarantine.

In his "Establishing End to End Trust" paper (another of the six), he
lays out the computer security problem and decides that defense isn't
enough; authentication, identification, and widespread auditing are
the next step in solving it.  He concludes:

  As we become increasingly dependent on the Internet for all our
  daily activities, can we maintain a globally connected, anonymous,
  untraceable Internet and be dependent on devices that run arbitrary
  code of unknown provenance?  If the answer to that is "no," then we
  need to create a more authenticated and audited Internet environment
  -- one in which people have the information they need to make good
  trust choices.

He makes halfhearted attempts to address privacy and anonymity issues,
but ultimately decides that those decisions will be made somewhere
else (not by the user or consumer, of course).  His analysis
completely ignores the incentives of monopoly hardware and software
providers; of corrupt governments such as our own; of even honest
governments or citizens desiring to act secretly or without
attribution; of advertisers; of the copyright mafia; of others
actively hostile to consumer and civil freedom; and of freedom-
supporting communities such as the free software movement.  It ignores
DRM, abuse of shrink-wrap contracts, copyright maximalization,
censorship, and other trends in consumer abuse.  It's designed by a
career cop/bureaucrat/copyright-enforcer and implemented by a
monopolist - hardly viewpoints friendly to freedom.

I'd recommend merely ignoring his ideas until they sink like a stone.
But it looks like Intel and Microsoft are actively sneaking up on the
free Internet and the free 10% of the computer market by building in
these techniques and seeking partnerships with governments, ISPs,
telcos, oligopolists, etc to force their use.  So some sort of active
opposition seems appropriate.

Perhaps Linux systems should routinely delete all the
manufacturer-provided device attestation and identification keys from
every Treacherous Computing device they ever boot on.  (This won't
affect keys that the *user* stores in their TPM if they want to.)  If
a significant part of the Internet is physically incapable of
attesting to the monopolists, ISPs will never be able to require such
attestation.  I've certainly deleted those keys on my own PCs that
came with such crap -- so far, no downside.  Let's keep it that way.
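
(A concrete sketch, for illustration only: on a Linux system with a
TPM 1.2 chip, one route to "deleting the keys" is clearing TPM
ownership through the TrouSerS TSS library -- which discards the
owner-controlled key hierarchy, though not the burned-in Endorsement
Key.  The message above doesn't say which method was used, and many
BIOSes expose the same "clear" operation directly.)

  /* Illustrative: clear TPM 1.2 ownership using the well-known
   * (all-zero) owner secret.  Build with: cc clear.c -ltspi */
  #include <stdio.h>
  #include <tss/tspi.h>

  int main(void)
  {
      TSS_HCONTEXT ctx;
      TSS_HTPM     tpm;
      TSS_HPOLICY  pol;
      BYTE         zeros[20] = { 0 };   /* the "well-known" secret */

      if (Tspi_Context_Create(&ctx) != TSS_SUCCESS ||
          Tspi_Context_Connect(ctx, NULL) != TSS_SUCCESS ||
          Tspi_Context_GetTpmObject(ctx, &tpm) != TSS_SUCCESS)
          return 1;

      /* Present the owner secret; a real owner password goes here. */
      Tspi_GetPolicyObject(tpm, TSS_POLICY_USAGE, &pol);
      Tspi_Policy_SetSecret(pol, TSS_SECRET_MODE_SHA1,
                            sizeof zeros, zeros);

      /* FALSE = ordinary, owner-authorized (not forced) clear. */
      if (Tspi_TPM_ClearOwner(tpm, FALSE) != TSS_SUCCESS) {
          fprintf(stderr, "clear failed: wrong secret or TPM not owned\n");
          Tspi_Context_Close(ctx);
          return 1;
      }
      puts("TPM ownership cleared");
      Tspi_Context_Close(ctx);
      return 0;
  }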

Security measures should report to the system owner -- not to the ISP
or the manufacturer.  The owner of the machine should determine which
software it's appropriate for it to run.  This whole idea of
collectivist "approval" of your computing environment gives me the
willies.  In their model, you'd be perfectly free to write a new piece
of software, sort of the way you are perfectly free to design and
build a new house.  First you spend tens of thousands of dollars on a
government-licensed architect and a similarly licensed structural
engineer.  Then you submit your plans to a bureaucrat, and wait.  And
wait.  And they demand changes.  And you negotiate, but they really
don't care what you want; you NEED their approval.  So you wait some
more, then give in to whatever they want.  Don't forget to use union
labor to build your software -- it'll be required.  And any bureaucrat
can come by after an alcoholic lunch to "inspect" your software -- and
if you don't properly kiss their ass and/or bribe them, their "red
tag" will physically keep your software from being usable on every
computer.  Periodically politicians will write bizarre new
requirements into the rules (e.g. you can't use PVC pipe because that
would put local plumbers out of work; or you can't use portable
languages because then your software might run on competing platforms),
and you'll just have to follow orders.  At least that's how the
Planning Department and Building Inspection Department work here in
San Francisco.  I don't see why a software monopoly enforced from the
top would work any differently.  Writing software for any Apple platform
except the Mac is already like that.

	John Gilmore





Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 15:35 UTC (Fri) by fuhchee (guest, #40059) [Link] (7 responses)

Perhaps a scheme similar to email antispam DNSBLs could be usefully applied to public shunning of likely-infected machines, just as it is being considered to allow shunning of suspected junk DNS domains.
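
(Mechanically, a DNSBL check is just a DNS lookup: the IPv4 octets are reversed under the blocklist zone, and any A record returned means "listed". A minimal sketch, using the conventional Spamhaus zone and the standard always-listed test address 127.0.0.2:)

  #include <stdio.h>
  #include <netdb.h>

  int main(void)
  {
      /* 127.0.0.2 reversed under the zone gives the query name. */
      char name[256];
      snprintf(name, sizeof name, "2.0.0.127.zen.spamhaus.org");

      struct addrinfo *res = NULL;
      int rc = getaddrinfo(name, NULL, NULL, &res);

      /* Any successful resolution means the address is listed. */
      printf("%s -> %s\n", name, rc == 0 ? "listed" : "not listed");
      if (res)
          freeaddrinfo(res);
      return 0;
  }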

There is another, simpler way

Posted Oct 8, 2010 16:24 UTC (Fri) by marcH (subscriber, #57642) [Link] (6 responses)

Gilmore's essay is entertaining, but it could indeed be more convincing if it suggested simpler alternatives -- like, for instance, forcing ISPs to monitor activity on port 25 and disconnect spammers based on such simple evidence.
Or like this, happening already: http://tinyurl.com/38dsuxv

Simpler still

Posted Oct 8, 2010 16:57 UTC (Fri) by mgb (guest, #3226) [Link]

Simply ban the crummy software responsible for 99% of infections, attacks, copyright infractions, stolen credit card numbers, and lost data.

It's called "Windows".

There is another, simpler way

Posted Oct 8, 2010 19:12 UTC (Fri) by freemars (subscriber, #4235) [Link] (1 responses)

Or blocking ports 25 and 80 outbound by default and turning them on by written-on-paper-and-signed request.

There is another, simpler way

Posted Oct 8, 2010 23:27 UTC (Fri) by marcH (subscriber, #57642) [Link]

A number of consumer ISPs are already blocking port 25 by default (except to their own mail server, for the few left who do not use some webmail)

Please, do not contribute to redirection-hell

Posted Oct 13, 2010 9:47 UTC (Wed) by dgm (subscriber, #49227) [Link] (2 responses)

Completely off-topic:

It's better if you do not use URL shorteners unless you really need to.

Some people, like me, like to see where a link will land before clicking it, and to avoid clicking if it is not appropriate. There are many other concerns about shorteners, including security and privacy, so it is better to avoid them if at all possible.

Please, keep them for when they make sense, Tweeter for example.

Thanks.

Please, do not contribute to redirection-hell

Posted Oct 13, 2010 9:51 UTC (Wed) by dgm (subscriber, #49227) [Link] (1 responses)

I mean Twitter, of course.

return -ENOCOFFEE;

please, think of the future

Posted Oct 13, 2010 19:47 UTC (Wed) by dmarti (subscriber, #11625) [Link]

And when a domain squatter picks up your favorite URL shortener domain, or someone wins it in a lawsuit, you will become a web spammer retroactively.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 16:00 UTC (Fri) by davide.del.vento (guest, #59196) [Link] (4 responses)

> Perhaps Linux systems should routinely delete all the
> manufacturer-provided device attestation and identification keys
> from every Treacherous Computing device they ever boot on.

What is he talking about?? I thought this was a (dumb) proposal from Wintel, not a reality...

> I've certainly deleted those keys on my own PCs that
> came with such crap -- so far, no downside.
> Let's keep it that way.

Even more puzzled :-(

Middleboxes

Posted Oct 8, 2010 16:38 UTC (Fri) by dmarti (subscriber, #11625) [Link] (3 responses)

So every vendor of access points and NAT devices is going to have to get a certificate for their products as well? If you have a middlebox in your house, that's what's "on the Internet" as far as your ISP knows.

The whole proposal doesn't seem very well thought out -- it reads more like security as a way to make people buy stuff.

(What the world really needs is a free-of-charge program to run on Approved Proprietary OSs that will make a trivial modification to their system software, just enough to make it _not_ pass a Treacherous Computing check.)

Middleboxes

Posted Oct 8, 2010 19:43 UTC (Fri) by deepfire (guest, #26138) [Link]

You'd be astonished as to how many people will use that utility.

My guess is something around 0.1% at best.

The reality is that people won't know and won't care, because TPTB will smooth the path for the regular Joes, and they won't even need to notice.

Grass roots don't work. If they do, it means either complete incompetence on the part of TPTB, or that these grass-roots are entirely tamed.

Middleboxes

Posted Oct 11, 2010 3:04 UTC (Mon) by jjs (guest, #10315) [Link] (1 responses)

Of course it's well thought out. You just have to understand who is doing the thinking.

Basic requirements (from their standpoint)

1. Must be running the latest version of Windows. If not, automatically download & charge the credit card (thus a requirement for a support contract).

2. Must be running the latest MS antivirus. If not, automatically download & charge the credit card.

3. Must be running the latest MS software. If any older versions are found, automatically download & charge the credit card.

4. All software must be MS approved. Only MS software is approved. If any non-MS software is found, automatically delete and replace with the latest MS software. Automatically charge the credit card.

4a. If any software is found that is not MS software, it obviously is a virus. Automatically remove & charge the credit card for "maintenance."

See how easy it is? Simply provide MS the right to charge your credit card whenever they want, and they'll be happy.

Middleboxes

Posted Oct 11, 2010 5:19 UTC (Mon) by nicooo (guest, #69134) [Link]

> You just have to understand who is doing the thinking.

Based on those requirements, Steve Jobs.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 16:34 UTC (Fri) by nevyn (guest, #33129) [Link] (30 responses)

In their model, you'd be perfectly free to write a new piece of software, sort of the way you are perfectly free to design and build a new house.

This is a terrible argument to make, and not at all true. You are free to build your own house, and many people do it. You do have to obey rules though, and yes it's often for the protection of other people. There used to be no rules for building houses or driving a car or... but that doesn't work above a certain scale.

In fact with housing it's even worse, because most of those rules are there because mortgage and insurance companies don't want to lend money to, or insure, people who buy a death trap. Without those rules and assurances nobody would lend or insure, and thus almost nobody could buy.

A similar thing has happened with email already. 12+ years ago I got a domain and used it for personal email all the time, and I would get 1-2 personal emails a week ... now almost nobody uses it, and 10-20 spam emails a day get through SpamAssassin. So I now also have a gmail account, which gets no spam (and a little personal email), and facebook, which gets no spam but some personal stuff. Both have significant rules.

Yes, this idea from MS sounds like a pretty bad idea (shockingly), but the argument that we'll let anyone do anything on the internet is old history and pretending we should is insane.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 18:28 UTC (Fri) by brouhaha (subscriber, #1698) [Link] (21 responses)

You do have to obey rules though, and yes it's often for the protection of other people.

When you build your own house, it isn't that difficult to demonstrate to the building inspector that it meets code, and get it approved. While there are a lot of restrictions you have to follow to meet code, they are well-known, and ultimately not terribly onerous.

Using the building code for public safety as an analogy for some form of software certification breaks down because:

  1. If it is easy (or even possible) to prove that your software meets the "building code", then that building code isn't doing anything useful to promote public (internet) safety.
  2. If the "building code" actually does do anything that could guarantee that your software can't compromise public safety, then it will be impossible to prove that your software complies with the building code.
Of course, this works both ways. If the "building code" is vendor-neutral, then it can't actually provide any useful guarantee of public safety, because Microsoft software products wouldn't be able to meet it any more than custom or open-source software would.

Obviously, then, the "building code" for software would really only serve as a set of hoops the software vendor has to jump through, to set an artificial entry barrier to the software market, and exclude small developers and open-source software. It won't provide any end-user benefit whatsoever.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 19:37 UTC (Fri) by halfpast (guest, #70519) [Link] (20 responses)

Actually, I think the building code analogy works pretty well.

> Using the building code for public safety as an analogy for some form of software certification breaks down because:

> If it is easy (or even possible) to prove that your software meets the "building code", then that building code isn't doing anything useful to promote public (internet) safety.

But you said that it "isn't that difficult" to meet the building code, and that the restrictions are well-known. Wouldn't they also be well-known if there was a computing equivalent?

> If the "building code" actually does do anything that could guarantee that your software can't compromise public safety, then it will be impossible to prove that your software complies with the building code.

Building codes don't guarantee against anything. That's not their purpose. Building codes can't stop an arsonist from setting your house on fire. They are there to significantly reduce the chances of your house being damaged or destroyed.

> Of course, this works both ways. If the "building code" is vendor-neutral, then it can't actually provide any useful guarantee of public safety, because Microsoft software products wouldn't be able to meet it any more than custom or open-source software would.

I'm confused. This implies that hardware and software made by different vendors can't talk to each other across a network, which is clearly not the case. In fact, you could easily argue that networking protocols are a form of building code, with the goal of interoperability.

> Obviously, then, the "building code" for software would really only serve as a set of hoops the software vendor has to jump through, to set an artificial entry barrier to the software market, and exclude small developers and open-source software. It won't provide any end-user benefit whatsoever.

Implementing a regular building code involves jumping through hoops created by an artificial entry barrier. But it's useful. The supposed end-user benefit would be a more secure environment. I don't even know what to think about your claim of "excluding small developers and open-source software".

I'm tired of seeing arguments that go like this: "But it's possible that they could screw up the implementation, so they shouldn't bother!" or "Hmm, sounds like a good idea, but [Microsoft | Oracle | Google] proposed it, and they probably have some ulterior motive, so it shouldn't be done!". The stated idea is interesting and merits investigation.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 20:10 UTC (Fri) by brouhaha (subscriber, #1698) [Link] (3 responses)

Wouldn't they also be well-known if there was a computing equivalent?

Possibly well-known, but they wouldn't actually accomplish anything. It is basically impossible to prove anything interesting or useful about the behavior of any piece of non-trivial software.

Building codes don't guarantee against anything.

Not guarantee in the sense that someone will pay you money if it fails. But the whole point of the building code is that if you follow it, the risk that your building will fail in certain known ways becomes negligible. That's what I'm claiming can't be done for software, in any meaningful sense.

This implies that hardware and software made by different vendors can't talk to each other across a network, which is clearly not the case. In fact, you could easily argue that networking protocols are a form of building code, with the goal of interoperability.

You can't prove that two pieces of hardware and software made by different vendors can talk to each other across a network, other than by demonstrating it some finite number of times. This doesn't prove that it will always work.

If you wanted a software "building code" that would ensure that the software couldn't participate in a botnet, and you demonstrated 37 times (or 37 billion times) that you'd connected it to the network and it didn't participate in a botnet, that isn't going to convince anyone.

This isn't because of any special characteristic of software. It's because software has so many orders of magnitude more "parts" than a house. If a house was as complex as Microsoft Windows (hundreds of millions of "parts" that have to be assembled correctly), it wouldn't be possible to have a building code that had any practical value.

We can't even prove whether a program will halt when given a particular input, much less anything actually useful about the behavior of a program.

Implementing a regular building code involves jumping through hoops created by an artificial entry barrier.

It's not an artificial entry barrier. It has precise, documented objectives that are not biased in favor of any particular person or business, and that don't prevent you from building a house while allowing a company with deep pockets to do so.

Having a software "building code" would either prevent you from writing your own software (if the "software building code" was actually of any use), because you couldn't afford to get your software certified to meet the code, or would be so lax as to let you build and certify your own software but provide no useful benefit.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 20:32 UTC (Fri) by deepfire (guest, #26138) [Link]

> This isn't because of any special characteristic of software. It's because software has so many orders of magnitude more "parts" than a house. If a house was as complex as Microsoft Windows (hundreds of millions of "parts" that have to be assembled correctly), it wouldn't be possible to have a building code that had any practical value.

I like how Dijkstra puts it, in EWD 1036:

> From a bit to a few hundred megabytes, from a microsecond to a half an hour of computing confronts us with completely baffling ratio of 10^9! The programmer is in the unique position that his is the only discipline and profession in which such a gigantic ratio, which totally baffles our imagination, has to be bridged by a single technology. He has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.
> Compared to that number of semantic levels, the average mathematical theory is almost flat.
> By evoking the need for deep conceptual hierarchies, the automatic computer confronts us with a radically new intellectual challenge that has no precedent in our history.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 18:26 UTC (Tue) by halfpast (guest, #70519) [Link] (1 responses)

> [A computing equivalent would be] possibly well-known, but they wouldn't actually accomplish anything. It is basically impossible to prove anything interesting or useful about the behavior of any piece of non-trivial software.

Who says you have to prove anything in order for it to be useful? I don't have to prove that my lunch isn't poisoned before I attempt to eat it, and eating is useful. I just have to operate on the reasonable assumption that no one is trying to poison me today. Most every program written has not (and probably cannot) "prove(n) anything interesting or useful" about itself. That doesn't mean one shouldn't try to mitigate any undesirable effects, even if you can't completely stop those effects.

> But the whole point of the building code is that if you follow it, the risk that your building will fail in certain known ways becomes negligible. That's what I'm claiming can't be done for software, in any meaningful sense.

Meaningful to you? I see you would like things to be proven to you, but the behavior of complex systems can't be proven to operate under a restricted set of behaviors, at least in reality. Here's an easy refutation to your assertion that "computing building codes" can't mitigate risk in a meaningful sense. Security best practices are a form of voluntary "computing building codes". Are you saying that not using a firewall, allowing remote root logins, and turning off the script that disallows multiple failed login attempts would not increase the vulnerability of that machine in "any meaningful sense"?

> You can't prove that two pieces of hardware and software made by different vendors can talk to each other across a network, other than by demonstrating it some finite number of times.

I also can't prove that the sun will rise tomorrow. You wrote this line with the inductive assumption that, since your last message was posted to the thread, this one would too.

> If you wanted a software "building code" that would ensure that the software couldn't participate in a botnet, and you demonstrated 37 times (or 37 billion times) that you'd connected it to the network and it didn't participate in a botnet, that isn't going to convince anyone.

"We can't secure our machines against all attacks, so let's not try to secure our machines against any attacks."

> [Implementing a regular building code is] not an artificial entry barrier. It has precise, documented objectives that are not biased in favor of any particular person or business, and that don't prevent you from building a house while allowing a company with deep pockets to do so.

Of course it's an artificial entry barrier. You don't need to follow a code to build a house, per se. But once the gov't finds out you didn't follow that code, you can be fined, forced to follow the code, and be charged for the costs of making the changes. There doesn't have to be a code to build a house -- hence, it's an artificial entry barrier.

> Having a software "building code" would either prevent you from writing your own software (if the "software building code" was actually of any use), because you couldn't afford to get your software certified to meet the code, or would be so lax as to let you build and certify your own software but provide no useful benefit.

I have news for you, the compiler is software that builds code. Unless you're writing machine language by hand, your code is at the whim of a piece of software.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 21:39 UTC (Tue) by brouhaha (subscriber, #1698) [Link]

I don't have to prove that my lunch isn't poisoned before I attempt to eat it, and eating is useful. I just have to operate on the reasonable assumption that no one is trying to poison me today.

Today we operate on the assumption that programs are written in a reasonable manner, and yet you can see what the result of that assumption is.

However, even when programmers try really hard to write secure programs, they still fail. Having a "building code" with a few dozen checklist items is not going to result in programs that are secure. The "building code" will just give a false sense of security to people buying programs that are approved.

but the behavior of complex systems can't be proven to operate under a restricted set of behaviors

Thank you. You've made my point for me.

The real-world building code actually does have useful results, because a building is a much simpler system than a typical software program, and thus it is easier to make assertions about the behavior of the building.

"We can't secure our machines against all attacks, so let's not try to secure our machines against any attacks."

What would you put into a software "building code" that would make it secure against even a small number of attacks?

Furthermore, it is questionable that anything is gained by preventing some small number of attacks. It is an "asymmetric warfare" problem. You can put a lot of effort into preventing one class of attacks, and that will just make the attackers shift to doing a different class of attack. This is unlike a real-world building code, where a rule that makes the building robust in the face of gravity does not cause gravity to try to work differently.

I have news for you, the compiler is software that builds code. Unless you're writing machine language by hand, your code is at the whim of a piece of software.

It's not news to me. Once again, you've made my point for me.

And it's worse than that. Modern x86 processors have an utterly insane level of complexity, and we can't realistically be sure that they work correctly either. The famous Pentium division bug, and the more recent "F00F" bug are evidence of that. As far as we know, there aren't any x86 processor bugs that have provided exploits other than denial of service, but that doesn't mean that there aren't any.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 20:14 UTC (Fri) by deepfire (guest, #26138) [Link] (15 responses)

> But you said that it "isn't that difficult" to meet the building code, and
> that the restrictions are well-known. Wouldn't they also be well-known if
> there was a computing equivalent?

Do tell us.

The poster you respond to alludes (well, I think he does) to the fact that even such a seemingly trivial property of a program, namely whether it terminates or not, is not computable.

If such a trivial property of a program cannot be formally established, what can be said about any kind of security guarantees?

Right, that you cannot establish them at all. Why?

Cannot be done automatically, cannot be done manually.

Basically, factor in 1. code size, and 2. rate of changes. These two alone defeat any sensible attempt to obtain any _meaningful_ level of manual quality assurance.

Please, understand that anything short of a formal proof is a tool in the hands of manipulators -- you cannot twist facts, you need to involve uncertain matters for that.

halting problem not the issue

Posted Oct 9, 2010 11:06 UTC (Sat) by tialaramex (subscriber, #21167) [Link] (6 responses)

To be fair, the halting problem is theoretical; it is an objection to some real-world ideas, but not to this one. An algorithm which, for any program, either says "It halts" or "I don't know" is quite practicable, and that resembles the building code, which is not an exhaustive list of safe building practices but instead a restrictive list of allowed practices which are thought to be safe. So "I don't know" is enough reason to exclude things.

And we already act like this: a modern Linux system typically forbids self-modifying code, because it's more likely to be a security problem. You can re-enable it, trivially, but by default your program code is immutable, because that turns out to be a sensible restriction.
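
(A minimal sketch of both that default and the trivial re-enabling, assuming Linux on x86-64; the patch bytes are illustrative:)

  #include <stdio.h>
  #include <string.h>
  #include <stdint.h>
  #include <unistd.h>
  #include <sys/mman.h>

  static int answer(void) { return 1; }

  int main(void)
  {
      /* Code pages are mapped read+execute by default, so writing to
       * answer() without the mprotect() below would segfault. */
      long page = sysconf(_SC_PAGESIZE);
      uintptr_t start = (uintptr_t)answer & ~((uintptr_t)page - 1);

      if (mprotect((void *)start, (size_t)page,
                   PROT_READ | PROT_WRITE | PROT_EXEC) != 0) {
          perror("mprotect");   /* e.g. denied by SELinux or PaX */
          return 1;
      }

      /* x86-64 machine code for "mov eax, 42; ret". */
      unsigned char patch[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };
      memcpy((void *)(uintptr_t)answer, patch, sizeof patch);

      int (* volatile fn)(void) = answer;   /* defeat inlining */
      printf("%d\n", fn());                 /* prints 42 */
      return 0;
  }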

It is possible that the social fix for this problem will be annoying for the technically capable, like most LWN readers. In my country, for example, the problem of DIYers killing or injuring not only themselves but visitors and subsequent owners of the same building through shoddy electrical work led to rules prohibiting you from even replacing a wall socket without getting it checked by an approved electrician. So even though I know perfectly well how to do this work safely, legally I must either get it checked or put myself through the classes to become a qualified electrician and sign my own paperwork. But on the upside, I can feel more certain that e.g. a decorative metal socket in a house I visit isn't accidentally live.

halting problem not the issue

Posted Oct 10, 2010 0:46 UTC (Sun) by nix (subscriber, #2304) [Link] (5 responses)

So "I don't know" is enough reason to exclude things.

But (pace Rice's theorem), any prover you can come up with is going to say 'I don't know' about anything larger than a tiny toy program. That's not much use. You certainly can't automatically prove useful things about significant programs using automated systems unless those programs have been written with such proof in mind.

halting problem not the issue

Posted Oct 10, 2010 1:52 UTC (Sun) by brouhaha (subscriber, #1698) [Link]

It's not just a matter of automatic proof systems; you can't *manually* prove any useful characteristics of a non-trivial program either.

halting problem not the issue

Posted Oct 10, 2010 10:19 UTC (Sun) by tialaramex (subscriber, #21167) [Link] (3 responses)

I think you think your objection is stronger than it actually is.

Yes, such a rule would inevitably require that people write programs which are suitable for analysis. This would be a continuation of a long trend -- once upon a time a program to concatenate output streams that relied on the memory layout of the specific machine it ran on, modified its own loop to terminate, and stored intermediate state in the output file, replacing it at the last minute, might have been called "a clever hack" but today we recognise it as unmaintainable nonsense, and nobody cares if it is 1% faster and saves 400 bytes of RAM.

The general claim that "nothing useful" can be analysed is overstated; mathematicians like Rice found that nothing _interesting_ to them is subject to such proofs. But the set they consider "trivial" because it is theoretically uninteresting includes lots of everyday work.

I have a dozen scripts on this machine that process log files. They all loop over the content of the file, reading a line at a time and do a bunch of things which we know (intuitively, and no doubt one of my old theory lecturers has proved this in theory) terminate for each line (e.g. adding something to a hash table, finding something in the hash table, incrementing an integer). When all the lines are read, they print summary results and exit.

These programs clearly terminate. To Rice they are "trivial"; to a business they are "vital".
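
(For instance, a minimal C version of such a script -- one bounded pass over finite input, so termination is evident from the loop structure alone:)

  #include <stdio.h>
  #include <string.h>

  int main(void)
  {
      char line[4096];
      unsigned long lines = 0, errors = 0;

      /* fgets() returns NULL at end of file, so the loop body runs at
       * most once per line of a finite input: it must terminate. */
      while (fgets(line, sizeof line, stdin) != NULL) {
          lines++;
          if (strstr(line, "ERROR") != NULL)
              errors++;
      }
      printf("%lu lines, %lu errors\n", lines, errors);
      return 0;
  }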

I am somewhat being the Devil's advocate here, but mostly I object to attempts to leverage a theoretical result to argue against a practical restriction that could, and may one day have to, be implemented. So long as people believe that this is all theoretically impossible they won't lift a finger to help. In _theory_ W^X makes no difference -- a Turing machine is a Turing machine -- but in practice the difference was noticeable. A flood of "smash buffer, execute code" exploits slowed to a trickle.

halting problem not the issue

Posted Oct 10, 2010 10:30 UTC (Sun) by brouhaha (subscriber, #1698) [Link] (1 responses)

Ignoring for the moment that you probably can't prove that your scripts produce the correct result, how would you prove that they are secure, i.e., that there isn't some malformed input that could cause the script to perform undesired actions (similar to SQL injection attacks)?

If you can't prove that your simple scripts are secure, how will you be able to prove any useful theorem about the security of any non-trivial program? (Note that to prove that a script is secure, you have to prove that the script interpreter is secure. If you only prove that the script is secure based on the documented semantics of the language it is written in, then in practice you don't know that the script running on the interpreter is secure.)

If you can't prove any useful theorem about the security of any non-trivial program, then how is this software "building code" actually going to be useful?

halting problem not the issue

Posted Oct 10, 2010 19:20 UTC (Sun) by deepfire (guest, #26138) [Link]

> If you can't prove any useful theorem about the security of any
> non-trivial program, then how is this software "building code" actually
> going to be useful?

The point is, it's actually going to be _harmful_, because it will be used as a tool in the hands of bureaucrats.

halting problem not the issue

Posted Oct 10, 2010 11:15 UTC (Sun) by nix (subscriber, #2304) [Link]

The problem is, though, that the state of the art is such that the only stuff you can prove secure (or haltable) is sufficiently trivial that the compiler could compute most of it itself and just tag this function KNOWN_SECURE or something. It's often easy to prove single functions secure in isolation. It's as soon as you start chaining them together (especially when conditionals, loops, and function calls interact) that the thing becomes hideous. This is doubly so if the function calls manipulate global state.

Now, let's be honest here. How many programs do you know that are pure filters? How many of these eschew conditionals, loops, and function calls, allowing analysis without horrible exponential explosions that have to be cut down with heuristic axes (losing proof power)?

(It is true that a lot of security-exposed stuff, like libpng, seems like a filter of sorts... only, oops, you forgot the memory allocator! Now almost nothing outside certain simple embedded applications is a filter. Even 'cat' is not.)

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 21:32 UTC (Sat) by nevyn (guest, #33129) [Link] (4 responses)

the fact that even such a seemingly trivial property of a program, namely whether it terminates or not, is not computable.
If such a trivial property of a program cannot be formally established, what can be said about any kind of security guarantees?

This is not a trivial property, and building codes do not provide anything like it. So yes, you can certainly say meaningful security-related things like "uses a string library" (for C), or more generally "uses certified/verified crypto" or "uses SHA256+ checksums" (indeed, there are current government standards which say the latter two things). SELinux and/or firewall rules could also be thought of as analogous to a building code.

My main point was that trying to argue to govt. people that "computers shouldn't have anything analogous to building codes" is stupid. Gilmore may well be under the impression that Libertarianism hasn't repeatedly failed, but it's still an exercise in futility to tell non-Libertarian govts. that they have to be Libertarian in specific niche XYZ.

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 22:08 UTC (Sat) by brouhaha (subscriber, #1698) [Link] (1 responses)

A building code that says that a load-bearing member has to be of at least a certain size has (in combination with the other rules) a known effect on the safety of the building.

Showing that a program "uses a string library" or "uses SHA256+ checksums" does not demonstrate anything whatsoever about the security of the program.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 12:24 UTC (Tue) by sorpigal (guest, #36106) [Link]

That no such security is demonstrated is no impediment if the nontechnical lawmakers have been told that it is. I have to sign off that my software contains no security vulnerabilities, which I always do despite knowing that I cannot prove it. Someone feels better because I claimed it, and to be fair I do my best that they shall not soon have a need to doubt me. The rule is stupid and useless but still it exists.

I think we're all looking at this backwards. This isn't an attack on non-Windows computing, this is an attack on Windows (and by Microsoft, ironically). I am perfectly at ease with the idea that ISPs check the OS and version of their customers before allowing a connection. If the "secureness" check starts with "Is it Windows?" and assumes security when this is found to be false, then we're in good shape. Of course this isn't Microsoft's intent (they want palladium all over again, an internet where only systems signed by MS can connect to anything and each byte of content can be audited and revoked at will by MS) but with a few gentle nudges we could use it to render the majority of Windows computers useless (that is, non-networkable) which can only be a good thing for their users.

Gilmore on the "computer health certificate" plan

Posted Oct 15, 2010 15:22 UTC (Fri) by dvdeug (guest, #10998) [Link]

You're adding on a big bureaucracy to verify what? Again, a house built to building codes is reasonably likely to stand against a reasonable, known set of natural dangers. A building built to code won't fall in a small earthquake. There are no natural dangers that software has to deal with; it has to deal with humans. Great, you've required SHA256+ checksums -- you've dealt with one attack vector out of ten thousand. You cannot now say that a program written to code won't fall to a human attacker, or that you've improved its security significantly. The value of the bureaucracy has to at least equal its costs, and the evidence that it would is shaky.

Gilmore on the "computer health certificate" plan

Posted Oct 16, 2010 1:39 UTC (Sat) by dvdeug (guest, #10998) [Link]

I forgot about a great example of this. There is an Ada Conformity Assessment Test Suite through which every Ada compiler can be certified as correctly supporting the Ada standard. It is of course not perfect, but it covers every feature in the standard, and GNAT (and, I assume, every other Ada compiler) runs the test suite as part of regular compiler testing. Still, there are virtually no certified compilers, because the cost of the bureaucracy is more than the value of the certification. It would require Ada Core Technologies to undergo a new certification for every platform they supported every time a new bug-fix release of GCC came out. Trying to keep the compiler certified would actually keep more buggy versions of the compiler on the market, as any bugfix would need to be recertified.

I think it's noteworthy that few other language communities even have the concept of a certification test; there is no official body of tests to verify that your C compiler compiles C correctly. This of course calls into question any source-code verification that a particular program compiled with that C compiler is secure. And then there's the C library; as a recent problem with FTP programs shows, a problem with the C library can be a problem with your FTP program. So you're going to have to certify the FTP program compiled with a certain C compiler running with a certain exact version of the C library on an exact version of the kernel, all of which are going to have new bugs found, even security holes, that go unfixed because it's too expensive to recertify the new versions.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 18:46 UTC (Tue) by halfpast (guest, #70519) [Link] (2 responses)

> Please, understand that anything short of a formal proof is a tool in the hands of manipulators -- you cannot twist facts, you need to involve uncertain matters for that.

I hate to be the first one in your life to tell you this, but there are very few things that are certain in reality.

Let's assume for a moment that you have created a complex model of a program that couldn't possibly do anything that it wasn't designed to do (in other words, it's perfectly secure). Now you implement that program on an actual machine. Unless you designed the operating system, your program is insecure because your data isn't guaranteed to be safe. Even if you designed the perfect operating system, unless you designed the hardware, you can't be guaranteed that it operates perfectly within spec -- go look at the errata list for your processor. Now meditate on the fact that hackers can (and do) leverage hardware flaws. Even if you designed the hardware and it was fabricated perfectly, with no errors, you still have to deal with the fundamental laws of the universe. Natural radiation can flip bits in memory, heat can flip bits in memory; I can go on.

Nothing is guaranteed. Furthermore, no one ever said that a "computing building code" would establish any kind of security guarantees. Using strncpy instead of strcpy does obtain a (more) meaningful level of manual quality assurance. Can I write you a proof? No, but real-world experience supports that assertion.

Gilmore on the "computer health certificate" plan

Posted Oct 17, 2010 8:54 UTC (Sun) by dark (guest, #8483) [Link] (1 responses)

Bad example. strncpy isn't specified the way the 'n' would lead you to expect. It's no improvement over strcpy, it just has a different failure mode that's just as bad. Instead of buffer overflows you'll have unterminated strings all over the place.

The correct replacement for strcpy is memcpy. Rationale: in order to safely use strcpy you have to know the length of the source string, and if you know the length of the source string then memcpy is more efficient.
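
(A minimal illustration of both halves of that argument; the buffer and string sizes are arbitrary:)

  #include <stdio.h>
  #include <string.h>

  int main(void)
  {
      const char *src = "this is too long";
      char dst[8];

      /* strncpy() fills exactly sizeof dst bytes and adds NO '\0' when
       * src is longer: dst is left as an unterminated string. */
      strncpy(dst, src, sizeof dst);

      /* The memcpy() replacement: measure, truncate explicitly, and
       * terminate by hand. */
      size_t n = strlen(src);
      if (n >= sizeof dst)
          n = sizeof dst - 1;
      memcpy(dst, src, n);
      dst[n] = '\0';
      printf("%s\n", dst);   /* prints "this is" -- truncated, terminated */
      return 0;
  }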

In another sense your example is illustrative: I'm sure that a "software building code" will mandate the use of strncpy anyway, and getting that changed will require years of lobbying.

Gilmore on the "computer health certificate" plan

Posted Oct 17, 2010 10:12 UTC (Sun) by paulj (subscriber, #341) [Link]

Anyone who isn't using a sane string abstracting wrapper around C strings these days, at least in new code, is Doing It Wrong. ;)

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 5:58 UTC (Sat) by drag (guest, #31333) [Link] (5 responses)

> This is a terrible argument to make, and not at all true. You are free to build your own house, and many people do it. You do have to obey rules though, and yes it's often for the protection of other people. There used to be no rules for building houses or driving a car or... but that doesn't work above a certain scale.

Laws should only exist in society when they serve to protect the liberty of the members of the society and to resolve potential interpersonal conflicts before they get out of hand.

That is, the freedom to do what you want ends when you begin to inflict violence and harm on other people or violate their property.

That's all there really is to it, and not much should be done beyond that.

Now beyond that lots of people voluntarily adhere to rules governed by contracts for many positive reasons. One of those things is for interoperability. The 'WiFi Alliance', for example, is a corporation that is used to help make sure that manufacturers of different stripes make compatible products.

This sort of 'health check' may actually have been a useful feature, since people who lack the technical ability to understand computer technology could have a third party help monitor their machines without giving it unregulated access to their computers. Unfortunately, when you get things like DRM involved you start invoking unjust and ill-conceived rules and regulations set out by our (or at least my) government, which folks like Microsoft leverage to break interoperability. That is, making things worse... not better. Anti-features and broken-by-design type things that take too much out of the hands of the property owner.

Potentially decent idea turned shitty...

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 1:24 UTC (Sun) by marcH (subscriber, #57642) [Link] (4 responses)

> Laws should only exist in society when they serve to protect the liberty of the members of the society and to resolve potential interpersonal conflicts before they get out of hand.
> That is, the freedom to do what you want ends when you begin to inflict violence and harm on other people or violate their property.
> That's all there really is to it, and not much should be done beyond that.

In almost every society on Earth laws go well beyond that. Even in the US. That's because the vast majority of people want to actually live together, not just next to each other. This is admittedly annoying for the few who would prefer to be left alone (and not pay any taxes, among other things).

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 4:18 UTC (Sun) by drag (guest, #31333) [Link] (3 responses)

I said the way things should be. Not the way they are. :)

Most governments end up engaging in social engineering, trying to manipulate everybody else into outcomes that the 'elites' tend to favor for whatever reason. The natural trend is always toward larger and more pervasive government with more and more complex rules as time goes by. Eventually you end up with a meltdown, and then the cycle restarts. People, historically, don't tend to do much until they start getting hungry (as in lack of food) and otherwise will tolerate almost anything. So it usually has to get really bad before any significant change in the direction of politics sets in.

It's also an unfortunate reality that most of us live in places where it's vastly easier to get new laws passed than it is to correct old mistakes. Human nature, I suppose.

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 11:17 UTC (Sun) by nix (subscriber, #2304) [Link] (2 responses)

The thing is, your 'should' disagrees with that of most of the rest of the world. Classical liberals disagree with you. In the US, the libertarians never exceed a few percent of the vote, and outside the US they don't even exist as a political platform.

Whatever global politics 'should' be, I don't think it should be something that almost nobody wants.

And this is *way* off-topic for LWN.

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 21:55 UTC (Sun) by drag (guest, #31333) [Link] (1 responses)

Whether or not everybody agrees with me is, frankly, irrelevant. (Not to mention you're wrong on multiple levels.)

If I followed your logic I would be a moron to be anything other than a happy Windows user.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 9:54 UTC (Tue) by marcH (subscriber, #57642) [Link]

> Whether or not everybody agrees with me is, frankly, irrelevant.

For better or for worse, "majority" is a concept central to democracy. By design, any extremely minor opinion is irrelevant to democracies. For better or for worse.

> If I followed your logic I would be a moron to be anything other than a happy Windows user.

To be thorough do not forget to also make a car analogy. It is typically useful to explain politics.

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 11:32 UTC (Sat) by Lennie (subscriber, #49641) [Link]

"...but the argument that we'll let anyone do anything on the internet is old history and pretending we should is insane."

This sounds to me like it's the wrong way around; maybe you didn't mean to put it like that, I don't know.

In many universities and at some ISPs (at least in Europe), if an IP address is spewing out crap, they quarantine it and allow it access to just the few sites needed to fix it.

Judging by the comments I see about this system, it works well.

I really doubt the concept from Microsoft would work anyway, they messed up their TPM-enabled projects many times already.

Even if it did work, it would probably break more than it fixes anyway.

Gilmore on the "computer health certificate" plan

Posted Oct 12, 2010 12:05 UTC (Tue) by sorpigal (guest, #36106) [Link]

In fact with housing it's even worse, because most of those rules are there because mortgage and insurance companies don't want to lend money to, or insure, people who buy a death trap. Without those rules and assurances nobody would lend or insure, and thus almost nobody could buy.

Maybe this is the intent but--as with all things--the reality is somewhat less reasonable. Anecdote: Some friends of mine built a new house. When getting it inspected they were informed that they had to replace the faucets because the style they had chosen was not allowed by the county government. There is no why.

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 18:17 UTC (Fri) by djtm (guest, #70528) [Link]

The funny thing is that non-Windows PCs are usually not a problem, so it wouldn't make sense to extend the measures to them in the first place...

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 21:01 UTC (Fri) by xorbe (guest, #3165) [Link] (1 responses)

What we need is nationalized computer health care for all! You'll have to pay a monthly premium even if you don't have a computer or internet access.

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 0:41 UTC (Sun) by nix (subscriber, #2304) [Link]

Yeah, the way nationalized healthcare systems require you to pay a monthly premium even if you're not alive!

Gilmore on the "computer health certificate" plan

Posted Oct 8, 2010 21:38 UTC (Fri) by rfunk (subscriber, #4054) [Link] (3 responses)

It's amazing how many otherwise-insightful rants in this industry are ruined by being overly-infused with Libertarian politics.

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 1:42 UTC (Sat) by jd (guest, #26381) [Link] (1 responses)

Agreed, but I'd extend it to any political view, though I have to admit the Libertarians have been... louder than some. In cases where the issue is restated using some hot-button issue of the day, how is it any different from any other flamebait? It actually detracts from the issue, diverting attention away from the real problem at hand to one that is only a problem for those of a different political persuasion.

In other cases, the problem at hand is only a problem because of the political persuasion of the poster.

Aside from the fact that such comments are neither polite nor respectful, they're usually neither informative nor constructive.

Now, I have to admit that this idea that Microsoft wants to have computers pass a health check is something of a joke, if not a positive insult to the intelligence of non-Microsoft users, but snide comments about the US health care debate won't help.

There are a few things that might help. The Government has an Internet Czar post that has been vacant for a number of years. Having someone credible in that position who would actually use it to educate people on why such a scheme might be a bad idea would help, as would offering a compromise (NIST is certifying for Common Criteria anyway; it might as well certify for Internet Health too) that Microsoft and Intel are guaranteed to reject.

(Compromises as a means of killing a bad idea are not a new idea.)

Gilmore on the "computer health certificate" plan

Posted Oct 9, 2010 16:14 UTC (Sat) by drag (guest, #31333) [Link]

I expect that an 'internet czar' would be spending much more time trying to figure out how to implement the massive increase in the size and scope of the three-letter government agencies' ability to 'wiretap' internet activity without warrants.

Which is something that the Obama administration has been pushing very hard for in the past few months. Generally speaking, 'czars' are there to carry out the policy of the administration, and reducing the security of our communications is a high priority for this one.

Worrying about discrediting some notion from some private-sector security researcher is probably not going to be high on the list of things they care about.

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 20:07 UTC (Sun) by smoogen (subscriber, #97) [Link]

As an age old Libertarian told me:

"Everyone wants to be a libertarian until they have to deal with the real world." Not that he didn't go on tilting at windmills as he called it.

Libertarianism, like Communism, looks good on paper as an ideal... but fails because humans aren't built for it. We are too individualistic to ever allow a pure collective to 'prosper', and too communal to ever allow libertarianism to bloom.

Computers being built by us are probably too tainted also :).

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 7:14 UTC (Sun) by ummmwhat2 (guest, #57575) [Link] (1 responses)

This is a disaster even for disinterested users who are satisfied with their Microsoft products.

One of Microsoft's biggest problems right now is that most users are content with Windows XP; thus they aren't making enough money on Windows 7. Under the current system, discontinuing bug support means that users with older systems operate at their own risk, but under the new system, they wouldn't be able to connect at all. Obligatory car analogy: Toyota drops support for their 2003 models, and suddenly filling stations are forbidden to sell you any gas.

Gilmore on the "computer health certificate" plan

Posted Oct 10, 2010 11:18 UTC (Sun) by nix (subscriber, #2304) [Link]

No, this is more like 'suddenly you find your car halts if you try to drive it out of your garage'. If these restrictions are baked into routers (fat chance, they don't have CPU power to spare, but anyway) you couldn't even get metaphorical petrol and fill your metaphorical tank up by hand.

Quick tip for writing to a non-US audience

Posted Oct 17, 2010 9:02 UTC (Sun) by rwmj (subscriber, #5474) [Link] (1 responses)

Don't make ridiculous claims about Public Health services. You obviously know nothing about the subject and you come across (to people who live outside the US) as an idiot for making such claims.

Quick tip for writing to a non-US audience

Posted Oct 21, 2010 20:51 UTC (Thu) by nix (subscriber, #2304) [Link]

Actually the NHS *does* have huge powers (sectioning, for instance, which can be used to keep people institutionalized essentially indefinitely). However, there are checks and balances on how they apply these powers, and the people who will apply them are medical professionals, not monsters. There are cases where they are used wrongly, but they are sufficiently uncommon that they make the news when they happen (there was a fairly nasty case mentioned by Private Eye a few months ago).

However, *someone* in society *must* have such huge powers, because there really are some mental disorders which make people a danger to others (unchecked paranoid schizophrenia, for instance). If it's not to be the health people who have those powers, who does?

The important thing is not that the powers exist: it's that there is an appeal procedure.

