Geer: Trends in cyber security
Posted Dec 3, 2013 0:21 UTC (Tue) by Aliasundercover (guest, #69009)
Parent article: Geer: Trends in cyber security
Even where free software updates come from honest sources not seeking power over users, they remain a pest. When does computer security let us treat our digital machines as, well, machines that just work while we ignore them, knowing they ain't broke so we need not fix them and endure unwanted changes and quality risk?
Could it be the native difficulty of the problem is only part of why we don't have a solution?
Posted Dec 3, 2013 0:53 UTC (Tue) by khim (subscriber, #9252)
When they are simple enough. I'm not sure where the borderline lies, but I think that right now, today, we can create completely bug-free programs of about 100K in size - and even then it takes many, many man-years of work.

We know that because there are pieces of silicon which were reviewed by a lot of developers, which are supposed to be absolutely unbreakable, which are very simple… and which were cracked repeatedly. Think of the Wii's boot1: it's 17K in size and by now it's in the desired state (you can treat it as a small program which you don't need to fix… indeed you cannot fix it). And it only took a few years to iron out all the bugs! Note that in all these years the hardware side was 100% stable and the software side was basically unmodified (higher-level components were modified, of course, but you cannot modify boot1). More or less the same story with the Xbox 360 and PS3.

Thus yes, we could reach this state—but only for tiny and very simple pieces. Basically that means that everything is crackable and will remain crackable for the foreseeable future—the only exceptions are some "last-resort" protection schemes in critical places (like, hopefully, nuclear plants). But even in these critical pieces of modern infrastructure, everything above these lowest-level "last-resort" schemes is basically hopeless. Even high-end CPUs have tons of problems and must be updated from time to time! And if you cannot trust the CPU, then what hope is there for the rest of the system?

If you want to see something as extremely complex as your router or mobile phone (I'm not talking about a smartphone here, just a simple dumbphone) reach that state… I think you'll need to wait for decades, perhaps centuries… and they will be obsolete long before that!
Posted Dec 3, 2013 10:57 UTC (Tue) by Wol (subscriber, #4433)
Then let's add that thing I rail against ...
MATHS IS NOT REALITY !!!
Prove your program correct, by all means (not forgetting that, like most mathematical problems, it is (a) hard, and (b) might not even have an answer).
Then remember that just because your program is correct, and is certain to run as required in the imaginary world that is maths, that does not mean your mathematical space corresponds to the real world, or that your program will actually do in the real world what you programmed it to do in your mathematical space.
Cheers,
Wol
Posted Dec 3, 2013 12:08 UTC (Tue) by khim (subscriber, #9252)
Sure, and this is why software is not math. Why do you want to raise that point here?

Indeed. The infamous memcmp flaw does not exist in a naive, simple model of the Xbox 360's CPU, yet it does exist in the real world, and if you take a more precise model (the Verilog one, for example), you can investigate it there. I've even observed errors in some obscure CPUs which cannot be reproduced on the Verilog model because it's not precise enough! Even if the digital world deals with zeros and ones, it also deals with timings, and these are not zeros and ones; thus in the end software is not math, it's just approximated by math much better than many other things are.
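(A minimal sketch of the class of flaw being discussed - this is my illustration, not khim's code and not the actual Xbox 360 ROM, and the function names are made up for the example. The early-exit version's running time reveals how many leading bytes matched, so an attacker who can measure response times can recover a secret value byte by byte; the second version's timing does not depend on the data.)

    #include <cstddef>
    #include <cstdint>

    // Leaky: running time depends on where the first mismatch is.
    bool equal_early_exit(const uint8_t* a, const uint8_t* b, size_t n) {
        for (size_t i = 0; i < n; ++i)
            if (a[i] != b[i]) return false;   // early exit leaks the mismatch position
        return true;
    }

    // Data-independent: always touches every byte; differences are OR-ed together.
    bool equal_const_time(const uint8_t* a, const uint8_t* b, size_t n) {
        uint8_t diff = 0;
        for (size_t i = 0; i < n; ++i)
            diff |= static_cast<uint8_t>(a[i] ^ b[i]);
        return diff == 0;
    }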
Posted Dec 3, 2013 13:34 UTC (Tue) by Wol (subscriber, #4433)
Let's face it, a program, supplied on a CD, is simply one huge number!
How do you define software?
And how do we execute a program? We just feed the huge number that is a program into the device (now a minor part of a CPU) called an Arithmetic Logic Unit (i.e. a maths machine) and we get another huge number out.
Source code is converted into object code by running a mathematical algorithm (called a compiler) over it ...
etc etc etc. And if software wasn't maths, you couldn't prove it correct ... you need mathematical tools to do that!
I know the difference between maths and science is a subject of great philosophical argument (I've had a few on Groklaw :-) but to me the difference is simple. Maths is using logic to reason about the world. Science is observing the world to see if our logic mirrors reality. And by that definition software is most definitely maths.
Of course, my definition is complicated by a whole bunch of "misnaming"s, because Computer Science isn't, and Theoretical Physics is maths, and stuff like that. But anything where you can use LOGIC (which is itself maths!) must be maths.
Oh - and that memcmp flaw - reading your link it appears to rely on observing timing differences, which is science, which isn't surprising because it includes *hardware* in the mix. The flaw has nothing to do with the software - the instructions - the maths - but everything to do with how long the hardware takes to carry out the instructions.
Cheers,
Wol
Posted Dec 3, 2013 15:06 UTC (Tue) by khim (subscriber, #9252)
But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Wikipedia contains a pretty good definition: "Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations." As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

Right, but the end result is not important, the process is. From a purely mathematical standpoint merge sort and heap sort may be quite similar, but on real hardware they have significantly different properties and thus are used in different pieces of programs. When you write programs you are driven by the capabilities and limitations of the hardware in most cases (unless you are writing purely mathematical constructs for purely mathematical “hardware” like a Turing machine).

You can prove certain things about chemical compounds, too. Why is chemistry not math, then?

Well… OK, let's go with this definition. A very, very small part of software is math by that definition. Think about sorting again. If you talk about the purely mathematical properties of merge sort and/or heap sort then it may be considered math. But then you start talking about the properties of the CPU, caches, memory and so on—and at this point you've left the realm of math and are firmly in the realm of science. Guess where the typical requirement to produce certain results in 16 ms puts the software?

No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.
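(A small illustration of the merge-sort-versus-heapsort point, mine rather than khim's. It assumes that std::stable_sort is a merge sort and make_heap/sort_heap is a heapsort, which is true of common standard-library implementations but not guaranteed by the standard. Both are O(n log n) "on paper", yet on a large array the heapsort's scattered memory accesses usually make it noticeably slower - a property of the caches, not of the mathematics.)

    #include <algorithm>
    #include <chrono>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Times one sorting routine on a private copy of the input.
    static double time_ms(void (*sorter)(std::vector<int>&), std::vector<int> data) {
        auto start = std::chrono::steady_clock::now();
        sorter(data);
        auto stop = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(stop - start).count();
    }

    int main() {
        std::vector<int> input(10000000);
        std::mt19937 rng(42);
        for (int& v : input) v = static_cast<int>(rng());

        double merge_ms = time_ms(
            [](std::vector<int>& v) { std::stable_sort(v.begin(), v.end()); }, input);
        double heap_ms = time_ms(
            [](std::vector<int>& v) {
                std::make_heap(v.begin(), v.end());
                std::sort_heap(v.begin(), v.end());   // heapsort: poor cache locality
            }, input);

        std::printf("merge-style sort: %.1f ms\nheapsort:         %.1f ms\n",
                    merge_ms, heap_ms);
        return 0;
    }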
Posted Dec 3, 2013 17:12 UTC (Tue) by mathstuf (subscriber, #69389)
Which is why *patents* shouldn't apply. Copyrights are sufficient. Of course, the copyrights are typically associated with the specific *expression* (or interpretation) of a number which represents the work in some *specified encoding* (PNG, JPG, MP3, vorbis, VP8, etc.), not the number itself.
An artist can make a painting with the numbers 1 through 10 on it and copyright it, but that doesn't mean that the numbers themselves are copyrighted.
I also think that the Knuth quote "Be careful about using the following code -- I've only proven that it works, I haven't tested it." has a kernel of truth, rooted in the difference between code in and of itself (which I'd call math) and what happens when the real world starts beating on it with a large club.
> prove certain things about chemical compounds
I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments). It can help *explain* things, but I would be interested in research in purely computational chemistry making previously unknown hypotheses later shown correct through experiment.
Posted Dec 3, 2013 17:29 UTC (Tue) by dlang (guest, #313)
I've actually seen articles this week on exactly this topic. One I remember was on computing drug interactions.
It's still new enough that major projects are newsworthy, but it seems to be getting there.
Posted Dec 3, 2013 21:31 UTC (Tue) by khim (subscriber, #9252)
Indeed. Numbers cannot be patented, copyrighted or trademarked (Intel was unable to trademark 486, which is why the 80486 was followed by the Pentium, not the 80586).

Really? Since when can copyright be applied to “math”? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “the contents of a CD is just a number” theory in court.

Indeed. And that is exactly why software is valuable and why it's not math: because “the real world was beating on it with a large club” and it adjusted. Think about the VP8 criticism: apparently some people forgot that codecs are supposed to compress information obtained from the real world and present it later to real-world people, and instead concentrated too much on one single model which gave them numbers. That's all good and well, but you should never forget, when you are writing a program, that you are dealing with the real world and not with “just math”—otherwise you'll end up with code which was “proven but not tested” (best case scenario).

Your information is outdated by about two decades. Nowadays computers are used quite extensively to save on experiments. And indeed it's quite effective: it finds enzymes which work in a certain way pretty well. The only thing it cannot do is prove that there are no bad side effects—but this is similar to the memcmp flaw mentioned before: it does not exist in the simplified world of a mathematical model of the real world, but it does exist in reality.
Posted Dec 3, 2013 17:37 UTC (Tue) by Wol (subscriber, #4433)
Correct! Which is why copyright, not patent, is an appropriate protection, if any. Oh, and if it's stored as a huge number, it itself isn't a novel, song or movie anyway. Once again we get into philosophy; here it's semantics, or meaning. That huge number is meaningless without a human to interpret it :-)
> Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.
Except that you are misreading it. I guess English is not your native language, but the crucial phrase is "a list of INSTRUCTIONS". I can give instructions till I'm blue in the face, but without something (hardware) to carry out those orders nothing will happen. If I give you a list of instructions (a recipe) it won't make a cake magically appear in your kitchen. If I give you directions (again taken from your wikipedia quote) *nothing* will happen unless *you* (hardware) carry them out.
So no, software can NOT drive ANY hardware. What it can do (when fed through an ALU) is make the *computer* drive the hardware for you.
> You can prove certain things about chemical compounds, too. Why chemistry is not a math, then?
Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect. (And yes, the use of air rather than oxygen complicates the maths, but reality still doesn't do what the maths says! :-)
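(For reference, the idealised reaction Wol is alluding to - the textbook stoichiometry, not what a real engine does; real combustion in air also yields CO, NOx, unburned hydrocarbons and soot.)

    2\,\mathrm{C_8H_{18}} + 25\,\mathrm{O_2} \;\longrightarrow\; 16\,\mathrm{CO_2} + 18\,\mathrm{H_2O} \quad (+\ \text{heat})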
> But then you start talking about properties of CPU, caches, memory and so on—and at this point you've left realm of math and are firmly in realm of science.
And at this point you have left the realm of software and are firmly in the realm of hardware! (Which is why you are, I would agree, in the realm of science :-)
> No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.
It's a flaw in the spec, and thus is a programming issue, sure. But programming is not software. The fact is, the *hardware* takes a different time to execute the instructions, depending on the result. The software has no concept of time.
And doesn't this "bug" have absolutely no effect on the result? As I understand it, you feed stuff into the algorithm, and observe how long it takes to calculate the answer. The software ALWAYS produces the SAME result (which is another proof it's maths!). The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.
Going back to Wikipedia, software is "a list of instructions". Without hardware/wetware, software can do absolutely nothing - it is an abstract concept. ALL software is maths. Once you put a computer in to the mix - any computer - you have left the realm of software.
You are clearly including the presence of a computer in your definition of software. I think pretty much all computer scientists would tell you you are wrong. Sorry. (And this is exactly the confusion American snakes^wlawyers are using to try and patent software.)
Cheers,
Wol
Posted Dec 3, 2013 18:31 UTC (Tue) by malor (guest, #2973)
Software *is* math. If you prove a piece of software correct, it will always do what it is supposed to do -- *if* the hardware is correct. The fact that some hardware can potentially run a given software program incorrectly does not make it something other than math.
Software exists only in the abstract, and as such, it can be modeled perfectly as a mathematical process. Chemistry math is always inexact, because you cannot model every individual molecule. All you can do is make abstractions: approximation is the best result possible.
But software isn't like that. 2 + 2 is always 4, even if your handheld calculator claims it's 3.999998. A loop counting from 1 to 100 is a hard truth in exactly the same way that 2+2=4 is, even if the computer running it glitches out, and stops at 98.
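(A sketch of what making that "hard truth" formal looks like - my illustration using standard Hoare-logic reasoning, not something malor wrote. The claim about the counting loop can be stated and proved independently of any physical machine:)

    \{\, i = 1 \,\}\;\; \mathtt{while}\ (i < 100)\ \mathtt{do}\ i := i + 1 \;\; \{\, i = 100 \,\}

    Invariant: 1 \le i \le 100 holds before and after every iteration.
    Variant: 100 - i \ge 0 strictly decreases, so the loop terminates.
    At exit: \lnot(i < 100) together with the invariant gives i = 100.

That proof is a fact about the abstract program; whether a particular calculator or glitchy computer honours it is a separate, empirical question.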
You can argue that computer hardware is imperfect, because it is. You can argue that programmers and programming are imperfect, because they are. But you cannot (correctly) argue that software is not math.
You are, in essence, making the argument that two plus two is not four, because calculators are unreliable.
Posted Dec 3, 2013 22:07 UTC (Tue) by khim (subscriber, #9252)
Really? Since when can you apply copyright to math? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “the contents of a CD is just a number” theory in court.

Well, sure. Software is a set of instructions for hardware, nothing more, nothing less. Hardware is its evil twin (well, maybe the other way around, but anyway: software without corresponding hardware is pretty pointless).

Nope. I know quite a few computer scientists who will agree with me. Indeed the famous Knuth maxim “Be careful about using the following code—I've only proven that it works, I haven't tested it” is an important part of software development. If the CPU has an error then it's the software's job to mitigate said error; if the software already exists and we want to develop a new piece of hardware then we need to deal with the software's expectations (or change the software). Software and hardware are intrinsically tied—one is useless without the other.

Well, software is developed in the exact same fashion: till you actually run the thing on real hardware you won't know exactly how it works. Sometimes it works as expected, sometimes it's too slow, and sometimes it does not work because you've forgotten about some important property of the hardware (for example, if you are switching from x86 to ARM and thus are not prepared to deal with memory coherency issues).

If your software has no concept of time then I agree that that software is probably math. Of course this automatically excludes all the OSes, compilers, games, codecs and other interesting pieces of software, and moves the discussion into the realm of “how many angels can dance on the head of a pin?” questions.

Yes, because the goal of the software was the protection of the secret key—and it failed to do that.

If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore, because how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?
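(A sketch of the x86-to-ARM surprise mentioned above - my example, not khim's. It is the classic message-passing pattern: with relaxed atomics the consumer may observe ready == true while data still reads 0 on a weakly ordered CPU such as ARM, while x86's stronger hardware ordering usually hides the bug. The release/acquire pairing written here makes it correct everywhere.)

    #include <atomic>
    #include <cassert>
    #include <thread>

    int data = 0;
    std::atomic<bool> ready{false};

    void producer() {
        data = 42;                                       // plain store
        ready.store(true, std::memory_order_release);    // publishes data
        // ready.store(true, std::memory_order_relaxed); // buggy variant: no ordering guarantee
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) {} // pairs with the release store
        assert(data == 42);                               // guaranteed only with release/acquire
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
        return 0;
    }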
Posted Dec 4, 2013 17:27 UTC (Wed) by Wol (subscriber, #4433)
Because as soon as you add hardware to the mix IT'S NOT SOFTWARE!
As for "how do you check?", I guess you must be a youngster. Us oldsters did it the hard way. WE DIDN'T have hardware to hand, and had to prove it on paper. Seriously. That's what we did!
My first boss, in the very job he later took me on in, had to write a program without any hardware to write it on. When the computer finally arrived, six months later, the office typists typed it in and it ran. Flawlessly.
May I please refer you back to that very wikipedia article you yourself referenced - software is A LIST OF INSTRUCTIONS. And, without a computer (you DO know the first computers were wetware, not hardware?) that list of instructions is totally useless.
Look at what you wrote! "If you separate the software from the hardware"!!! You're redefining meanings to suit your current needs. You can't do that!!! You're defining software to include hardware and now you're implying that they CAN be separated. If they can (as I would argue, as language itself implies that they can) then your (re)definition doesn't work.
I'm quite happy with a requirement that a PROGRAM needs hardware to be useful. I'm quite happy that programming is not maths (it's "doing maths", which isn't the same thing at all :-). But the software itself is just maths. Because the human-readable source code, the binary, the executable stored as pits on a CD, the executable stored as magnetic poles on a hard drive or capacitive charges on an SSD, IS THE SAME THING AS FAR AS MATHS IS CONCERNED.
Software is maths. Hardware is reality. A program needs both of them to work.
Cheers,
Wol
Posted Dec 3, 2013 13:37 UTC (Tue) by Wol (subscriber, #4433)
Whether the flaw appears depends on what hardware you run it on!
Oh - another definition of software - you can (time taken to do it is discounted) run it on wetware. You can clearly run this memcmp code on wetware :-) and the timing issues (SCIENCE!) will clearly be very different. But if you run it on wetware you can prove it correct! ie it's maths!
Cheers,
Wol
Posted Dec 3, 2013 7:33 UTC (Tue) by NAR (subscriber, #1313)
"Time to market is king" and "release early, release often". These philosophies govern proprietary and open-source development, and neither of them is that keen on quality. If we're lucky, the not-ready feature is not included. If we're not, then all bets are off.
Posted Dec 3, 2013 18:30 UTC (Tue) by NightMonkey (subscriber, #23051)
Where are these perfect machines you speak of? :)
Posted Dec 3, 2013 19:11 UTC (Tue) by Aliasundercover (guest, #69009)
Your refrigerator, stove, furnace, washing machine and air conditioner come close enough. They don't last forever but they don't die for lack of updates either.
Posted Dec 3, 2013 19:39 UTC (Tue) by NightMonkey (subscriber, #23051)
http://www.youtube.com/watch?v=t88WJmZDMBY
I think folks have been led to believe that we have created perfection somewhere. We have not. Entropy exists. Also, the environment that the machines are in matters. You perform more maintenance on a truck in the desert than you do on a truck in a garage in Kentucky. And is anyone actively trying to destroy your washing machine 24/7 365? I hope not! :)
A simple computer that plays tic-tac-toe and doesn't live on a TCP/IP network doesn't need updates. The fact is, computers as we use them are very complex, surprisingly brittle constructs, but lots and lots of marketing has convinced people that they are actually robust.
Posted Dec 3, 2013 20:09 UTC (Tue) by raven667 (subscriber, #5198)
This is a great distillation of the truth of it. I wonder if we really haven't crossed a threshold, or several, where we are getting diminishing returns for all the added complexity. How many of the benefits of modern technology could be implemented using extremely simple electronics and software, such that the complexity is manageable? Right now we have layers upon layers upon layers of independent computers and firmware all meshing together in a way that reminds me more of Vernor Vinge's "A Deepness in the Sky". Could you even identify all of the computers in a modern laptop?
Posted Dec 3, 2013 20:50 UTC (Tue) by Wol (subscriber, #4433)
Managers have a habit of taking a broken process and computerising it in an attempt to make things better.
Sorry, computerising a broken process can't magically fix it. And often, once it's fixed, you don't need to computerise it!
Cheers,
Wol
