
Geer: Trends in cyber security

Dan Geer has posted the transcript of his "Trends in cyber security" talk presented to the US National Reconnaissance Office in early November. "The trendline in the number of critical monocultures seems to be rising and many of these are embedded systems both without a remote management interface and long lived. That combination -- long lived and not reachable -- is the trend that must be reversed. Whether to insist that embedded devices self destruct at some age or that remote management of them be a condition of deployment is the question. In either case, the Internet of Things and the appearance of microcontrollers in seemingly every computing device should raise hackles on every neck."

Geer: Trends in cyber security

Posted Dec 2, 2013 21:01 UTC (Mon) by dvdeug (subscriber, #10998) [Link]

The idea to open source Windows XP and any other software that's no longer being updated with security patches is provocative, but I don't really see where it would help. You can find a lot of people out there running obsolete versions of the kernel, of Apache, of various other open source programs. I suspect a substantial number of Windows XP installations are simple refusals to upgrade, which would be equally true if it was open sourced.

Maybe XP would have enough audience to get decent patches if open sourced, but you'd also be exposing all that code to black hats, and for some time--maybe the remaining length of XP's usefulness--the black hats, having the easier job, would have the lead.

Geer: Trends in cyber security

Posted Dec 2, 2013 21:03 UTC (Mon) by droundy (subscriber, #4559) [Link]

I wonder if his suggestion to mandate the open-sourcing of unsupported software is intended rather as a stick to force Microsoft to resume providing security updates?

Geer: Trends in cyber security

Posted Dec 3, 2013 12:14 UTC (Tue) by jonnor (guest, #76768) [Link]

Upgrading Windows to a newer major version is a big undertaking (Linux is almost as bad), so I can understand why users are hesitant to do that. But if minor security patches for XP were introduced (by an open source community, or MS) I don't think there would be as much resistance to installing those.

Geer: Trends in cyber security

Posted Dec 3, 2013 12:38 UTC (Tue) by anselm (subscriber, #2796) [Link]

»Minor security patches« to Windows XP are one thing. Another thing is that XP has some really annoying shortcomings that basically hurt the rest of the world for as long as XP will be around.

For example, XP (probably by design) does not support the TLS SNI extension (even though all modern web browsers do, as does pretty much every OS of importance under the sun, including Windows from Vista on). If you're a web site operator, it would be really nice to be able to stipulate that clients can handle SNI, because this would allow you to use name-based HTTPS sites and cut down on your requirements for IPv4 addresses. But as long as XP remains a viable client-side operating system, that won't fly. And retrofitting that sort of functionality may be beyond what a »minor security patch« can achieve.
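
For illustration, here is a minimal sketch (Python, with example.org purely as a placeholder host) of the client side of SNI: the intended host name travels in the TLS ClientHello, which is what lets one IP address serve several name-based HTTPS sites with different certificates.

# Sketch: how a client supplies SNI. "example.org" is a placeholder.
import socket
import ssl

context = ssl.create_default_context()

with socket.create_connection(("example.org", 443)) as sock:
    # server_hostname is sent as the SNI extension in the ClientHello;
    # a client without SNI support (the situation described above for XP)
    # omits it, so the server cannot tell which name-based site was
    # requested and must fall back to one default certificate per IP.
    with context.wrap_socket(sock, server_hostname="example.org") as tls:
        print(tls.getpeercert()["subject"])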

Geer: Trends in cyber security

Posted Dec 3, 2013 13:30 UTC (Tue) by tao (subscriber, #17563) [Link]

What's so big about updating Linux to a new major version?

aptitude dist-upgrade

And unless you've made some local modifications that break things, you're done (well, after waiting a long time for the upgrades to download and install).

Geer: Trends in cyber security

Posted Dec 3, 2013 13:40 UTC (Tue) by Wol (guest, #4433) [Link]

I've currently got "emerge -uDN world" running on my system :-)

Another issue with upgrading XP, though, is system requirements. I don't have any keys for Vista, 7 or 8, and even if I did I couldn't upgrade my XP system - its RAM is maxed out at 768MB.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 4, 2013 11:44 UTC (Wed) by job (guest, #670) [Link]

Testing your applications, basically. Testing and taking care of the ensuing breakage.

Recently someone had a handful of people working overtime to chase down something that was triggered by a change in one of the default ulimits.

These things happen all the time.
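
As a hypothetical illustration of the kind of check that shortens such chases: log the limits a process actually inherits at startup, so a changed distribution default shows up in the logs rather than as mysterious breakage. A minimal Python sketch (the 4096 threshold is an arbitrary illustrative value):

# Sketch: record the resource limits a service starts with, so a change in
# a distribution's default ulimits is visible. 4096 is an arbitrary example.
import resource

CHECKS = {
    "open files (RLIMIT_NOFILE)": resource.RLIMIT_NOFILE,
    "processes (RLIMIT_NPROC)": resource.RLIMIT_NPROC,
    "stack size (RLIMIT_STACK)": resource.RLIMIT_STACK,
}

for name, limit in CHECKS.items():
    soft, hard = resource.getrlimit(limit)
    print(f"{name}: soft={soft} hard={hard}")

soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
if soft != resource.RLIM_INFINITY and soft < 4096:
    print("warning: open-file limit below 4096; a default may have changed")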

Geer: Trends in cyber security

Posted Dec 3, 2013 0:21 UTC (Tue) by Aliasundercover (subscriber, #69009) [Link]

Update Today, Update Tomorrow, Update Forever is the answer provided the question is how to enforce planned obsolescence and remote control.

Even where free software updates come from honest sources not seeking power over users they remain a pest. When does computer security let us treat our digital machines as, well, machines that just work while we ignore them knowing they ain't broke so we need not fix them enduring unwanted changes and quality risk?

Could it be the native difficulty of the problem is only part of why we don't have a solution?

Geer: Trends in cyber security

Posted Dec 3, 2013 0:53 UTC (Tue) by khim (subscriber, #9252) [Link]

When does computer security let us treat our digital machines as, well, machines that just work while we ignore them knowing they ain't broke so we need not fix them enduring unwanted changes and quality risk?

When they are simple enough. I'm not sure where the borderline lies, but I think right now, today, we can create completely bug-free programs of about 100K in size - and even then it takes many, many man-years of work.

We know that because there are pieces of silicon which were reviewed by a lot of developers, which are supposed to be absolutely unbreakable, which are very simple… and which were cracked repeatedly. Think of the Wii's boot1: it's 17K in size and by now it's in the desired state (where you can treat it as a small program you don't need to fix… indeed you cannot fix it). And it only took a few years to iron out all the bugs! Note that in all these years the hardware side was 100% stable and the software side was basically unmodified (higher-level components were modified, of course, but you cannot modify boot1). More or less the same story with the Xbox 360 and PS3.

Thus yes, we could reach this state—but only for tiny and very simple pieces.

Basically that means that everything is crackable and will remain crackable for the foreseeable future—the only exceptions are some "last-resort" protection schemes in critical places (like, hopefully, nuclear plants). But even in these critical pieces of modern infrastructure, everything above these lowest-level "last-resort" schemes is basically hopeless.

Even high-level CPUs have tons of problems and must be updated from time to time! And if you cannot trust the CPU then what hope is there for the rest of the system?

If you want something as extremely complex as your router or mobile phone (I'm not talking about a smartphone here, just a simple dumbphone) to reach that state… I think you'll need to wait for decades, perhaps centuries… and they will be obsolete long before that!

Geer: Trends in cyber security

Posted Dec 3, 2013 10:57 UTC (Tue) by Wol (guest, #4433) [Link]

Then let's add that thing I rail against ...

MATHS IS NOT REALITY !!!

Prove your program correct, by all means (not forgetting that, like most mathematical problems, it is (a) hard, and (b) might not even have an answer).

Then remember that just because your program is correct and is certain to run as required in the imaginary world that is maths, that does not mean that your mathematical space corresponds with the real world and your program will actually do (in the real world) what you programmed it to do in your mathematical space.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 12:08 UTC (Tue) by khim (subscriber, #9252) [Link]

Then let's add that thing I rail against ...

MATHS IS NOT REALITY !!!

Sure, and this is why software is not math. Why do you want to raise that point here?

Remember that just because your program is correct and is certain to run as required in the imaginary world that is maths, that does not mean that your mathematical space corresponds with the real world and your program will actually do (in the real world) what you programmed it to do in your mathematical space.

Indeed. The infamous memcmp flaw does not exist in a naive, simple model of the Xbox 360's CPU, yet it does exist in the real world, and if you take a more precise model (a Verilog one, for example), you can investigate it there. I've even observed errors in some obscure CPUs which cannot be reproduced on the Verilog model because it's not precise enough! Even though the digital world deals with zeros and ones, it also deals with timings, and these are not zeros and ones; thus in the end software is not math, it's just approximated by math much better than many other things are.
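
To make the timing point concrete, here is an illustrative sketch in Python (not the actual boot-loader code) of why a comparison that bails out at the first mismatching byte leaks information through its run time, and what a constant-time comparison looks like:

# Sketch of the timing issue behind an early-exit comparison. The early
# return means the run time depends on how many leading bytes of the guess
# are correct, which an attacker can measure to recover a secret value one
# byte at a time.
import hmac

def leaky_compare(expected: bytes, guess: bytes) -> bool:
    if len(expected) != len(guess):
        return False
    for a, b in zip(expected, guess):
        if a != b:
            return False  # bails out early: timing reveals the match length
    return True

def constant_time_compare(expected: bytes, guess: bytes) -> bool:
    # hmac.compare_digest looks at every byte regardless of where the first
    # mismatch occurs, so its run time carries (almost) no information.
    return hmac.compare_digest(expected, guess)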

Geer: Trends in cyber security

Posted Dec 3, 2013 13:34 UTC (Tue) by Wol (guest, #4433) [Link]

How do you define software? Software+hardware is not maths, but if you define software as being just code (of any sort) then it fits the description of maths. This is the big problem with people in America patenting software - it IS just maths. Let's face it, a program, supplied on a CD, is simply one huge number!

And how do we execute a program? We just feed the huge number that is a program, into the device (now a minor part of a cpu) called an Arithmetic Logic Unit (ie a maths machine) and we get another huge number out.

Source code is converted into object code by running a mathematical algorithm (called a compiler) over it ...

etc etc etc. And if software wasn't maths, you couldn't prove it correct ... you need mathematical tools to do that!

I know the difference between maths and science is a subject of great philosophical argument (I've had a few on Groklaw :-) but to me the difference is simple. Maths is using logic to reason about the world. Science is observing the world to see if our logic mirrors reality. And by that definition software is most definitely maths.

Of course, my definition is complicated by a whole bunch of "misnaming"s, because Computer Science isn't, and Theoretical Physics is maths, and stuff like that. But anything where you can use LOGIC (which is itself maths!) must be maths.

Oh - and that memcmp flaw - reading your link it appears to rely on observing timing differences, which is science, which isn't surprising because it includes *hardware* in the mix. The flaw has nothing to do with the software - the instructions - the maths - but everything to do with how long the hardware takes to carry out the instructions.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 15:06 UTC (Tue) by khim (subscriber, #9252) [Link]

Let's face it, a program, supplied on a CD, is simply one huge number!

But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

How do you define software?

Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

And how do we execute a program? We just feed the huge number that is a program, into the device (now a minor part of a cpu) called an Arithmetic Logic Unit (ie a maths machine) and we get another huge number out.

Right, but the end result is not important, the process is. From a purely mathematical standpoint merge sort and heap sort may be quite similar, but on real hardware they have significantly different properties and thus are used in different pieces of programs. When you write programs you are driven by the capabilities and limitations of hardware in most cases (unless you are writing purely mathematical constructs for purely mathematical “hardware” like a Turing machine).

And if software wasn't maths, you couldn't prove it correct ... you need mathematical tools to do that!

You can prove certain things about chemical compounds, too. Why is chemistry not math, then?

I know the difference between maths and science is a subject of great philosophical argument (I've had a few on Groklaw :-) but to me the difference is simple. Maths is using logic to reason about the world. Science is observing the world to see if our logic mirrors reality.

Well… Ok, let's go with this definition.

And by that definition software is most definitely maths.

A very, very small part of software is math by that definition. Think about sorting again. If you talk about the purely mathematical properties of merge sort and/or heap sort then it may be considered math. But then you start talking about properties of the CPU, caches, memory and so on—and at this point you've left the realm of math and are firmly in the realm of science. Guess where a typical requirement to produce certain results in 16ms puts the software?

Reading what you wrote about the memcmp flaw rather more closely, it is obviously a hardware issue :-)

No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:12 UTC (Tue) by mathstuf (subscriber, #69389) [Link]

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Which is why *patents* shouldn't apply. Copyrights are sufficient. Of course, the copyrights are typically associated with the specific *expression* (or interpretation) of a number which represents the work in some *specified encoding* (PNG, JPG, MP3, vorbis, VP8, etc.), not the number itself.

An artist can make a painting with the numbers 1 through 10 on it and copyright it, but that doesn't mean that the numbers themselves are copyrighted.

I also think that the Knuth quote "Be careful about using the following code -- I've only proven that it works, I haven't tested it." has a kernel of truth rooted in the difference of code in-and-of-itself (which I'd call math) and when the real world starts beating on it with a large club.

> prove certain things about chemical compounds

I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments). It can help *explain* things, but I would be interested in research in purely computational chemistry making previously unknown hypotheses later shown correct through experiment.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:29 UTC (Tue) by dlang (subscriber, #313) [Link]

> I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments). It can help *explain* things, but I would be interested in research in purely computational chemistry making previously unknown hypotheses later shown correct through experiment.

I've actually seen articles this week on exactly this topic. One I remember was on computing drug interactions.

It's still new enough that major projects are newsworthy, but it seems to be getting there.

Geer: Trends in cyber security

Posted Dec 3, 2013 21:31 UTC (Tue) by khim (subscriber, #9252) [Link]

An artist can make a painting with the numbers 1 through 10 on it and copyright it, but that doesn't mean that the numbers themselves are copyrighted.

Indeed. Numbers cannot be patented, copyrighted or trademarked (Intel was unable to trademark 486, which is why 80486 was followed by Pentium, not 80586).

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.
Which is why *patents* shouldn't apply. Copyrights are sufficient.

Really? Since when can copyright be applied to “math”? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “contents of a CD is just a number” theory in court.

I also think that the Knuth quote "Be careful about using the following code -- I've only proven that it works, I haven't tested it." has a kernel of truth rooted in the difference of code in-and-of-itself (which I'd call math) and when the real world starts beating on it with a large club.

Indeed. And that is exactly why software is valuable and why it's not math: because the “real world was beating on it with a large club” and it adjusted. Think about the VP8 criticism: apparently some people forgot that codecs are supposed to be used to compress information obtained from the real world and present it later to real-world people, and instead concentrated too much on one single model which gave them numbers. That's all well and good, but you should never forget when you are writing a program that you are dealing with the real world and not with “just math”—otherwise you'll end up with code which was “proven but not tested” (best case scenario).

I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments).

Your information is outdated by about two decades. Nowadays computers are used quite extensively to save on experiments. And indeed it's quite effective: it finds enzymes which work in a certain way pretty well. The only thing it cannot do is prove that there are no bad side effects—but this is similar to the memcmp flaw mentioned before: it does not exist in the simplified world of a mathematical model of the real world, but it does exist in reality.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:37 UTC (Tue) by Wol (guest, #4433) [Link]

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Correct! Which is why copyright, not patent, is an appropriate protection, if any. Oh, and if it's stored as a huge number it itself isn't a novel, song or movie anyway. Once again, we get into philosophy, here it's semantics or meaning. That huge number is meaningless without a human to interpret it :-)

> Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

Except that you are misreading it. I guess English is not your native language, but the crucial word is "a list of INSTRUCTIONS". I can give instructions till I'm blue in the face, but without something (hardware) to carry out orders nothing will happen. If I give you a list of instructions (a recipe) it won't make a cake magically appear in your kitchen. If I give you directions (again taken from your wikipedia quote) *nothing* will happen unless *you* (hardware) carry them out.

So no, software can NOT drive ANY hardware. What it can do (when fed through an ALU) is make the *computer* drive the hardware for you.

> You can prove certain things about chemical compounds, too. Why is chemistry not math, then?

Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect. (And yes, the use of air rather than oxygen complicates the maths, but reality still doesn't do what the maths says! :-)

> But then you start talking about properties of the CPU, caches, memory and so on—and at this point you've left the realm of math and are firmly in the realm of science.

And at this point you have left the realm of software and are firmly in the realm of hardware! (Which is why you are, I would agree, in the realm of science :-)

> No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.

It's a flaw in the spec, and thus is a programming issue, sure. But programming is not software. The fact is, the *hardware* takes a different time to execute the instructions, depending on the result. The software has no concept of time.

And doesn't this "bug" have absolutely no effect on the result? As I understand it, you feed stuff into the algorithm, and observe how long it takes to calculate the answer. The software ALWAYS produces the SAME result (which is another proof it's maths!). The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.

Going back to Wikipedia, software is "a list of instructions". Without hardware/wetware, software can do absolutely nothing - it is an abstract concept. ALL software is maths. Once you put a computer in to the mix - any computer - you have left the realm of software.

You are clearly including the presence of a computer in your definition of software. I think pretty much all computer scientists would tell you you are wrong. Sorry. (And this is exactly the confusion American snakes^wlawyers are using to try and patent software.)

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 18:31 UTC (Tue) by malor (guest, #2973) [Link]

See, you've got this fundamental confusion going on here: you're mixing up hardware and software.

Software *is* math. If you prove a piece of software correct, it will always do what it is supposed to do -- *if* the hardware is correct. The fact that some hardware can potentially run a given software program incorrectly does not make it something other than math.

Software exists only in the abstract, and as such, it can be modeled perfectly as a mathematical process. Chemistry math is always inexact, because you cannot model every individual molecule. All you can do is make abstractions: approximation is the best result possible.

But software isn't like that. 2 + 2 is always 4, even if your handheld calculator claims it's 3.999998. A loop counting from 1 to 100 is a hard truth in exactly the same way that 2+2=4 is, even if the computer running it glitches out, and stops at 98.

You can argue that computer hardware is imperfect, because it is. You can argue that programmers and programming are imperfect, because they are. But you cannot (correctly) argue that software is not math.

You are, in essence, making the argument that two plus two is not four, because calculators are unreliable.

Geer: Trends in cyber security

Posted Dec 3, 2013 22:07 UTC (Tue) by khim (subscriber, #9252) [Link]

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Correct! Which is why copyright, not patent, is an appropriate protection, if any.

Really? Since when can you apply copyright to math? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “contents of a CD is just a number” theory in court.

You are clearly including the presence of a computer in your definition of software.

Well, sure. Software is a set of instructions for hardware, nothing more, nothing less. Hardware is its evil twin (well, maybe the other way around, but anyway: software without corresponding hardware is pretty pointless).

I think pretty much all computer scientists would tell you you are wrong.

Nope. I know quite a few computer scientists who will agree with me. Indeed, Knuth's famous maxim “Be careful about using the following code—I've only proven that it works, I haven't tested it” is an important part of software development. If the CPU has an error then it's the software's job to mitigate said error; if software already exists and we want to develop a new piece of hardware then we need to deal with the software's expectations (or change the software). Software and hardware are intrinsically tied—one is useless without the other.

Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect.

Well, software is developed in the exact same fashion: until you actually run the thing on real hardware you won't know exactly how it works. Sometimes it works as expected, sometimes it's too slow, and sometimes it does not work because you've forgotten about some important property of the hardware (for example, if you are switching from x86 to ARM and thus are not prepared to deal with memory coherency issues).

The software has no concept of time.

If your software has no concept of time then I agree that that software is probably math. Of course this automatically excludes all the OSes, compilers, games, codecs and other interesting pieces of software, and moves the discussion into the realm of “how many angels can dance on the head of a pin?” questions.

And doesn't this "bug" have absolutely no effect on the result?

It does have an effect, because the goal of the software was the protection of a secret key—and it failed to do that.

The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.

If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore, because how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?

Geer: Trends in cyber security

Posted Dec 4, 2013 17:27 UTC (Wed) by Wol (guest, #4433) [Link]

> If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore, because how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?

Because as soon as you add hardware to the mix IT'S NOT SOFTWARE!

As for "how do you check?", I guess you must be a youngster. Us oldsters did it the hard way. WE DIDN'T have hardware to hand, and had to prove it on paper. Seriously. That's what we did!

My first boss, in the very job he later took me on in, had to write a program without any hardware to write it on. When the computer finally arrived, six months later, the office typists typed it in and it ran. Flawlessly.

May I please refer you back to that very wikipedia article you yourself referenced - software is A LIST OF INSTRUCTIONS. And, without a computer (you DO know the first computers were wetware, not hardware?) that list of instructions is totally useless.

Look at what you wrote! "If you separate the software from the hardware"!!! You're redefining meanings to suit your current needs. You can't do that!!! You're defining software to include hardware and now you're implying that they CAN be separated. If they can (as I would argue, as language itself implies that they can) then your (re)definition doesn't work.

I'm quite happy with a requirement that a PROGRAM needs hardware to be useful. I'm quite happy that programming is not maths (it's "doing maths", which isn't the same thing at all :-). But the software itself is just maths. Because the human readable source code, the binary, the executable stored as pits on a CD, the executable as stored as magnetic poles on a hard drive or capacitive charges on an SSD, IS THE SAME THING AS FAR AS MATHS IS CONCERNED.

Software is maths. Hardware is reality. A program needs both of them to work.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 13:37 UTC (Tue) by Wol (guest, #4433) [Link]

Reading what you wrote about the memcmp flaw rather more closely, it is obviously a hardware issue :-)

Whether the flaw appears depends on what hardware you run it on!

Oh - another definition of software - you can (time taken to do it is discounted) run it on wetware. You can clearly run this memcmp code on wetware :-) and the timing issues (SCIENCE!) will clearly be very different. But if you run it on wetware you can prove it correct! ie it's maths!

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 7:33 UTC (Tue) by NAR (subscriber, #1313) [Link]

"Time to market is king" and "release early, release often". These philosophies govern proprietary and open source development, none of them is that keen on quality. If we're lucky, the not ready feature is not included. If we're not, then all bets are off.

Geer: Trends in cyber security

Posted Dec 3, 2013 18:30 UTC (Tue) by NightMonkey (subscriber, #23051) [Link]

"When does computer security let us treat our digital machines as, well, machines that just work while we ignore them knowing they ain't broke so we need not fix them enduring unwanted changes and quality risk?"

Where are these perfect machines you speak of? :)

Geer: Trends in cyber security

Posted Dec 3, 2013 19:11 UTC (Tue) by Aliasundercover (subscriber, #69009) [Link]

> Where are these perfect machines you speak of? :)

Your refrigerator, stove, furnace, washing machine and air conditioner come close enough. They don't last forever but they don't die for lack of updates either.

Geer: Trends in cyber security

Posted Dec 3, 2013 19:39 UTC (Tue) by NightMonkey (subscriber, #23051) [Link]

Heh, I think you've been a victim of Maytag's old "lonely repairman" marketing.

http://www.youtube.com/watch?v=t88WJmZDMBY
http://www.youtube.com/watch?v=Sb3fSXiz1XM
http://www.youtube.com/watch?v=-4j8RoesBrg

I think folks have been led to believe that we have created perfection somewhere. We have not. Entropy exists. Also, the environment that the machines are in matters. You perform more maintenance on a truck in the desert than you do on a truck in a garage in Kentucky. And is anyone actively trying to destroy your washing machine 24/7 365? I hope not! :)

A simple computer that plays tic-tac-toe and doesn't live in a TCP/IP network doesn't need updates. The fact is, computers as we use them are very complex, surprisingly brittle constructs, but lots and lots of marketing has convinced people that they are actually robust.

Geer: Trends in cyber security

Posted Dec 3, 2013 20:09 UTC (Tue) by raven667 (subscriber, #5198) [Link]

> A simple computer that plays tic-tac-toe and doesn't live in a TCP/IP network doesn't need updates. The fact is, computers as we use them are very complex, surprisingly brittle constructs, but lots and lots of marketing has convinced people that they are actually robust.

This is a great distillation of the truth of it. I wonder if we really haven't crossed a threshold, or several, where we are getting diminishing returns for all the added complexity. How many of the benefits of modern technology could be implemented using extremely simple electronics and software such that the complexity is manageable? Right now we have layers upon layers upon layers of independent computers and firmware all meshing together in a way that reminds me more of Vernor Vinge's "A Deepness in the Sky". Could you even identify all of the computers in a modern laptop?

Geer: Trends in cyber security

Posted Dec 3, 2013 20:50 UTC (Tue) by Wol (guest, #4433) [Link]

Actually, I think quite often technology has a NEGATIVE return!

Managers have a habit of taking a broken process and computerising it in an attempt to make things better.

Sorry, computerising a broken process can't magically fix it. And often, once it's fixed, you don't need to computerise it!

Cheers,
Wol

Tailpipe emission standards

Posted Dec 3, 2013 15:21 UTC (Tue) by rriggs (subscriber, #11598) [Link]

I see maintaining secure systems as being like tailpipe emission standards on cars. If your car needs a tune-up to reduce noxious emissions, you cannot get your license plates renewed until it meets standards.

Should the feds scan systems and issue warnings and escalating fines for people with insecure, unpatched or compromised systems?

Tailpipe emission standards

Posted Dec 3, 2013 15:53 UTC (Tue) by mpr22 (subscriber, #60784) [Link]

No, because the Feds should not be given a trivial cover under which to compromise people's systems.

Tailpipe emission standards

Posted Dec 9, 2013 1:29 UTC (Mon) by nix (subscriber, #2304) [Link]

Why would the Feds *want* to compromise someone's emission control systems? Some bizarre pro-pollution bias?

Tailpipe emission standards

Posted Dec 9, 2013 1:33 UTC (Mon) by mpr22 (subscriber, #60784) [Link]

I was referring to the unpacking of the analogy back into the computer space, rather than directly to the analogy. (Of course, the reason to compromise the ECS is so that you can have them fail an emissions spot-check and have their vehicle gone over by a particularly "observant" policeman.)

Tailpipe emission standards

Posted Dec 3, 2013 17:17 UTC (Tue) by mathstuf (subscriber, #69389) [Link]

I'm sure there would be exemptions (similar to how vehicles older than 1993 or so (early '90s at least; my '89 was exempt) just don't get emission stamps). The earliest you could probably push back to is Vista, and anything older is "insecure, but we can't mandate anything since it's just too old". This would probably actually result in companies sticking with legacy systems even longer to avoid the forced upgrade train once they use newer software.

Tailpipe emission standards

Posted Dec 3, 2013 17:33 UTC (Tue) by dlang (subscriber, #313) [Link]

one problem with requirements like this is that they can end up preventing progress by locking in existing solutions to problems and preventing new things from being tried.

If you have to prove that your OS is secure before connecting to the Internet, you cannot develop a new OS, especially as a hobbyist.

The vehicle emissions example is a really good example of how things can go wrong. I live in California, which has the strictest emissions rules around, and there are cars that produce less pollution than cars sold in California but are not allowed to be sold here because they aren't "equipped properly": the manufacturers came up with different solutions to the problems than the state regulators did.

you really don't want this sort of checklist auditing to be able to control everyone's computers.

Tailpipe emission standards

Posted Dec 5, 2013 11:54 UTC (Thu) by nye (guest, #51576) [Link]

>The vehicle emissions example is a really good example of how things can go wrong. I live in California, which has the strictest emissions rules around, and there are cars that produce less pollution than cars sold in California but are not allowed to be sold here because they aren't "equipped properly": the manufacturers came up with different solutions to the problems than the state regulators did.

Not that this changes your point, but I'm just wondering: do you mean that the regulations specify a particular technology, rather than actually measuring emissions? Or do you mean that they measure a particular set of substances and the cars in question fail on one particular part of the test despite being better overall?

Tailpipe emission standards

Posted Dec 5, 2013 15:20 UTC (Thu) by raven667 (subscriber, #5198) [Link]

The US standards from the early '90s started requiring particular technologies, to cut down on gases that the high-temp, high-compression, lean-burning, 50 mpg cars of that era were producing, which is inflexible, as engineers found several ways to solve these problems that they weren't allowed to use. One big thing is to put more gas into the engine at startup so that it comes out in the exhaust and burns in the catalytic converter to warm it up, which is terrible for fuel economy. That's how we ended up going from Geo Metros to GMC Suburbans.

Tailpipe emission standards

Posted Dec 5, 2013 16:53 UTC (Thu) by mathstuf (subscriber, #69389) [Link]

> started requiring particular technologies

And that, IMNSHO, is the (main) problem: legislating solutions rather than results :( .

Tailpipe emission standards

Posted Dec 5, 2013 20:33 UTC (Thu) by dlang (subscriber, #313) [Link]

> And that, IMNSHO, is the (main) problem: legislating solutions rather than results :( .

and what makes you think that lawyers and politicians are going to do any better of a job legislating how computers should be secured than how to build cars?

that's the real problem with calls to require that only 'qualified' or 'good' people connect to the Internet.

Tailpipe emission standards

Posted Dec 5, 2013 22:16 UTC (Thu) by mathstuf (subscriber, #69389) [Link]

I…agree? Legislating "how" (the solution) is usually a bad path. What you want is to expect results from things while also keeping an eye on the methods to make sure that the best reason for that path is better than "the ends justify the means". I think I would impose HIGH fines (proportional to company size and amount of data) for security leaks by companies. Ramp them up if the company isn't disclosing breaches in reasonable timeframes[1]. The problem is that fines are too low for companies to justify security because it's not *their* data and PR is such an ephemeral thing for those too big to fail.

[1]Apparently JP Morgan lost ~465,000 (pre-paid) CC numbers in July and it's only public[2] now because they couldn't "rule out the possibility that some card holders' personal data may have been accessed" instead of being proactive and saying "we've had a breach and your number may have been leaked" in, say, August.
[2]http://arstechnica.com/security/2013/12/hack-on-jpmorgan-...

Geer: Trends in cyber security

Posted Dec 3, 2013 17:46 UTC (Tue) by k3ninho (subscriber, #50375) [Link]

>The Internet is an empire. The Internet was built by academics, researchers, and hackers -- meaning that it embodies the liberal cum libertarian cultural interpretation of "American values," namely that it is open, non-hierarchial, self organizing, and leaves essentially no opportunities for governance beyond a few rules of how to keep two parties in communication over the wire. Anywhere the Internet appears, it brings those values with it. Other cultures, other governments, know that these are our strengths and that we are dependent upon them, hence as they adopt the Internet they become dependent on those strengths and thus on our values. A greater challenge to sovereignty does not exist, which is why the Internet will either be dramatically balkanized or it will morph into an organ of world government. In either case, the Internet will never again be as free as it is this morning.

I know that the talk was given to a USA government body, but this sounds crazy to me. I'm European and see the internet as a mutually-beneficial grouping of self-governed parties, knowing that peering data for the good of all results in better internet for everyone using it -- we might have many differences and past history but the many opportunities for collaboration and trade which the internet brings our way, these benefits make us all richer in a number of ways. The internet might be diverse but we are stronger together; there may be some able to peer less data because of their lower network capabilities, so we share the transit according to our trunk's capabilities to each according to their torrenting hordes.

So bias remains bias, especially when you're touting the notion of the internet as a dumb network with end-to-end-ness which somehow shares libertarian values, rather than the values of the people setting up their end-to-end communications and interactions. The internet (and the wild west) may have been a way for people to get out from under their oppressive governments or slave masters, but no man was ever an island. The claim is even less believable under Reed's law as a model of the power of internet interactions, and after the revelations from the Snowden files.

K3n.

Geer: Trends in cyber security

Posted Dec 4, 2013 14:54 UTC (Wed) by Otus (subscriber, #67685) [Link]

Yeah, the talk is dripping with some very strong US-type libertarianism. Also:

> Put concretely, the central expression of a free society is a free market

Wait, what? Isn't the central expression of a free society free speech?

Geer: Trends in cyber security

Posted Dec 4, 2013 18:19 UTC (Wed) by Wol (guest, #4433) [Link]

To which I would add, the American market is rigged. The "Buy American" act. Defence contracts subsidising civil industry (the aircraft market is a classic).

Even today, the patent system lets American companies get patents on stuff that is unpatentable elsewhere (obvious, or prior art) and then use them to exclude foreign competitors. (Ancient) case in point - how many Americans know that Edison's first lightbulb patent (rejected) - in which he claimed to have invented the lightbulb - actually POSTDATED his visit to a FACTORY MAKING THE THINGS COMMERCIALLY?!!!!

Or look at copyrights - it's better today, but it wasn't that long ago that foreigners couldn't (in practice) get enforceable copyrights.

And as for a free market - well the Americans don't have enough history to tell them otherwise, but in England (NOT Britain), we had a pretty free market in the 1200s. And don't tell me that was a free society!!! King John, Magna Carta, all that, driven precisely because we did NOT have a free society!!!

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 4, 2013 18:33 UTC (Wed) by nybble41 (subscriber, #55106) [Link]

>> The Internet was built by academics, researchers, and hackers -- meaning that it embodies the liberal cum libertarian cultural interpretation of "American values," namely that it is open, non-hierarchial, self organizing, and leaves essentially no opportunities for governance beyond a few rules of how to keep two parties in communication over the wire.

> I know that the talk was given to a USA government body, but this sounds crazy to me. I'm European and see the internet as a mutually-beneficial grouping of self-governed parties, knowing that peering data for the good of all results in better internet for everyone using it -- we might have many differences and past history but the many opportunities for collaboration and trade which the internet brings our way, these benefits make us all richer in a number of ways.

You may think it's "crazy", but you're really saying exactly the same thing where it counts. Now, the part about the Internet being "an empire"--that's crazy, and contrary to everything they said about it corresponding to libertarian values. It's a non-authoritarian culture, which is about as far from imperialism as you can get. It actually enhances sovereignty, rather than undermining it--but sovereignty of the individual, not the state. The threat to the freedom of the Internet, whether through balkanization or world government, is from how the authoritarians will react to this threat to their "sovereignty", meaning their supposed ownership and control over those otherwise sovereign individuals unfortunate enough to fall under their jurisdiction.

Geer: Trends in cyber security

Posted Dec 9, 2013 1:33 UTC (Mon) by nix (subscriber, #2304) [Link]

It actually enhances sovereignty, rather than undermining it--but sovereignty of the individual, not the state.
The Chinese government might disagree with you. It costs them a crazy amount of manpower, but they get a very good idea of the pulse of the people while keeping them pretty thoroughly censored.

Geer: Trends in cyber security

Posted Dec 4, 2013 13:47 UTC (Wed) by jengelh (subscriber, #33263) [Link]

>Whether to insist that embedded devices self destruct at some age

Well, we already have "planned obsolescence", so we might just as well save the extra bucks a self-destruct would cost. Especially since certain gadgets (most of the time their batteries) already go up in flames all by themselves. ;)

