Geer: Trends in cyber security

Posted Dec 3, 2013 10:57 UTC (Tue) by Wol (subscriber, #4433)
In reply to: Geer: Trends in cyber security by khim
Parent article: Geer: Trends in cyber security

Then let's add that thing I rail against ...

MATHS IS NOT REALITY !!!

Prove your program correct, by all means (not forgetting that, like most mathematical problems, it is (a) hard, and (b) might not even have an answer).

Then remember that just because your program is correct and certain to run as required in the imaginary world that is maths, that does not mean that your mathematical space corresponds to the real world, or that your program will actually do (in the real world) what you programmed it to do in your mathematical space.

Cheers,
Wol



Geer: Trends in cyber security

Posted Dec 3, 2013 12:08 UTC (Tue) by khim (subscriber, #9252) (10 responses)

> Then let's add that thing I rail against ...
>
> MATHS IS NOT REALITY !!!

Sure, and this is why software is not math. Why do you want to raise that point here?

> Remember that just because your program is correct and certain to run as required in the imaginary world that is maths, that does not mean that your mathematical space corresponds to the real world, or that your program will actually do (in the real world) what you programmed it to do in your mathematical space.

Indeed. The infamous memcmp flaw does not exist in a naive, simple model of the Xbox 360's CPU, yet it does exist in the real world, and if you take a more precise model (a Verilog one, for example), you can investigate it there. I've even observed errors in some obscure CPUs which cannot be reproduced on a Verilog model, because it's not precise enough! Even if the digital world deals with zeros and ones, it also deals with timings, and these are not zeros and ones. Thus, in the end, software is not math; it's just approximated by math much better than many other things are.
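
As a concrete illustration (a generic sketch, not the actual Xbox 360 code): that class of flaw comes from early-exit comparisons like this one, whose running time reveals how many leading bytes matched:

    #include <stddef.h>

    /* A typical early-exit comparison: it returns as soon as the first
     * mismatching byte is found, so the time it takes is proportional
     * to the length of the matching prefix - which is exactly what a
     * timing attacker measures to recover a secret byte by byte. */
    int leaky_compare(const unsigned char *a, const unsigned char *b, size_t n)
    {
        for (size_t i = 0; i < n; i++) {
            if (a[i] != b[i])
                return (a[i] < b[i]) ? -1 : 1;
        }
        return 0;
    }

The function's result is always correct; only the time taken leaks information, which is the sense in which the flaw is invisible in a model without timings.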

Geer: Trends in cyber security

Posted Dec 3, 2013 13:34 UTC (Tue) by Wol (subscriber, #4433) (8 responses)

How do you define software? Software+hardware is not maths, but if you define software as being just code (of any sort) then it fits the description of maths. This is the big problem with people in America patenting software - it IS just maths. Let's face it, a program, supplied on a CD, is simply one huge number!

And how do we execute a program? We just feed the huge number that is a program into the device (now a minor part of a CPU) called an Arithmetic Logic Unit (i.e. a maths machine) and we get another huge number out.

Source code is converted into object code by running a mathematical algorithm (called a compiler) over it ...

etc etc etc. And if software wasn't maths, you couldn't prove it correct ... you need mathematical tools to do that!

I know the difference between maths and science is a subject of great philosophical argument (I've had a few on Groklaw :-) but to me the difference is simple. Maths is using logic to reason about the world. Science is observing the world to see if our logic mirrors reality. And by that definition software is most definitely maths.

Of course, my definition is complicated by a whole bunch of "misnaming"s, because Computer Science isn't, and Theoretical Physics is maths, and stuff like that. But anything where you can use LOGIC (which is itself maths!) must be maths.

Oh - and that memcmp flaw - reading your link it appears to rely on observing timing differences, which is science, which isn't surprising because it includes *hardware* in the mix. The flaw has nothing to do with the software - the instructions - the maths - but everything to do with how long the hardware takes to carry out the instructions.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 15:06 UTC (Tue) by khim (subscriber, #9252) (7 responses)

> Let's face it, a program, supplied on a CD, is simply one huge number!

But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

> How do you define software?

Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

> And how do we execute a program? We just feed the huge number that is a program into the device (now a minor part of a CPU) called an Arithmetic Logic Unit (i.e. a maths machine) and we get another huge number out.

Right, but the end result is not important; the process is. From a purely mathematical standpoint merge sort and heap sort may be quite similar, but on real hardware they have significantly different properties and are thus used in different pieces of programs. When you write programs you are driven by the capabilities and limitations of the hardware in most cases (unless you are writing purely mathematical constructs for purely mathematical “hardware” like a Turing machine).
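
To make the sorting example concrete, here is a rough sketch (mine; the array size, data and timing method are arbitrary choices) that sorts the same random data with textbook versions of both algorithms. Both are O(n log n) on paper, yet heap sort's scattered access pattern is typically far less cache-friendly than merge sort's sequential passes, which shows up in the wall-clock times:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define N (1 << 22)   /* 4M ints: big enough to spill out of cache */

    static void sift_down(int *a, size_t root, size_t end)  /* end inclusive */
    {
        while (2 * root + 1 <= end) {
            size_t child = 2 * root + 1;
            if (child + 1 <= end && a[child] < a[child + 1])
                child++;                       /* pick the larger child */
            if (a[root] >= a[child])
                return;
            int t = a[root]; a[root] = a[child]; a[child] = t;
            root = child;                      /* jumps around memory: cache-hostile */
        }
    }

    static void heap_sort(int *a, size_t n)
    {
        for (size_t i = n / 2; i-- > 0; )      /* heapify */
            sift_down(a, i, n - 1);
        for (size_t end = n - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;
            sift_down(a, 0, end - 1);
        }
    }

    static void merge_sort(int *a, int *tmp, size_t n)
    {
        if (n < 2)
            return;
        size_t mid = n / 2, i = 0, j = mid, k = 0;
        merge_sort(a, tmp, mid);
        merge_sort(a + mid, tmp, n - mid);
        while (i < mid && j < n)               /* sequential scans: cache-friendly */
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j < n)   tmp[k++] = a[j++];
        memcpy(a, tmp, n * sizeof *a);
    }

    int main(void)
    {
        int *a = malloc(N * sizeof *a), *b = malloc(N * sizeof *b);
        int *tmp = malloc(N * sizeof *tmp);
        if (!a || !b || !tmp)
            return 1;
        for (size_t i = 0; i < N; i++)
            a[i] = b[i] = rand();

        clock_t t0 = clock();
        heap_sort(a, N);
        clock_t t1 = clock();
        merge_sort(b, tmp, N);
        clock_t t2 = clock();

        printf("heap sort:  %.3fs\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("merge sort: %.3fs\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
        free(a); free(b); free(tmp);
        return 0;
    }

No purely mathematical property of the two algorithms predicts the gap; it comes from the cache hierarchy of the machine you run it on.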

> And if software wasn't maths, you couldn't prove it correct ... you need mathematical tools to do that!

You can prove certain things about chemical compounds, too. Why is chemistry not math, then?

> I know the difference between maths and science is a subject of great philosophical argument (I've had a few on Groklaw :-) but to me the difference is simple. Maths is using logic to reason about the world. Science is observing the world to see if our logic mirrors reality.

Well… OK, let's go with this definition.

> And by that definition software is most definitely maths.

A very, very small part of software is math by that definition. Think about sorting again. If you talk about the purely mathematical properties of merge sort and/or heap sort, then it may be considered math. But then you start talking about the properties of the CPU, the caches, the memory and so on - and at this point you've left the realm of math and are firmly in the realm of science. Guess where the typical requirement to produce certain results in 16ms puts the software?

> Reading what you wrote about the memcmp flaw rather more closely, it is obviously a hardware issue :-)

No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:12 UTC (Tue) by mathstuf (subscriber, #69389) (2 responses)

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Which is why *patents* shouldn't apply. Copyrights are sufficient. Of course, the copyrights are typically associated with the specific *expression* (or interpretation) of a number which represents the work in some *specified encoding* (PNG, JPG, MP3, vorbis, VP8, etc.), not the number itself.

An artist can make a painting with the numbers 1 through 10 on it and copyright it, but that doesn't mean that the numbers themselves are copyrighted.

I also think that the Knuth quote "Be careful about using the following code -- I've only proven that it works, I haven't tested it." has a kernel of truth rooted in the difference between code in and of itself (which I'd call math) and when the real world starts beating on it with a large club.

> prove certain things about chemical compounds

I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments). It can help *explain* things, but I would be interested in research in purely computational chemistry making previously unknown hypotheses later shown correct through experiment.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:29 UTC (Tue) by dlang (guest, #313)

> I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments). It can help *explain* things, but I would be interested in research in purely computational chemistry making previously unknown hypotheses later shown correct through experiment.

I've actually seen articles this week on exactly this topic. One I remember was on computing drug interactions.

It's still new enough that major projects are newsworthy, but it seems to be getting there.

Geer: Trends in cyber security

Posted Dec 3, 2013 21:31 UTC (Tue) by khim (subscriber, #9252)

> An artist can make a painting with the numbers 1 through 10 on it and copyright it, but that doesn't mean that the numbers themselves are copyrighted.

Indeed. Numbers cannot be patented, copyrighted or trademarked (Intel was unable to trademark 486, which is why the 80486 was followed by the Pentium, not the 80586).

> > But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.
> Which is why *patents* shouldn't apply. Copyrights are sufficient.

Really? Since when can copyright be applied to “math”? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “the contents of a CD is just a number” theory in court.

> I also think that the Knuth quote "Be careful about using the following code -- I've only proven that it works, I haven't tested it." has a kernel of truth rooted in the difference between code in and of itself (which I'd call math) and when the real world starts beating on it with a large club.

Indeed. And that is exactly why software is valuable and why it's not math: because the “real world was beating on it with a large club” and it adjusted. Think about the VP8 criticism: apparently some people forgot that codecs are supposed to be used to compress information obtained from the real world and present it later to real-world people, and instead concentrated too much on one single model which gave them numbers. That's all well and good, but when you are writing a program you should never forget that you are dealing with the real world and not with “just math” - otherwise you'll end up with code which was “proven but not tested” (best-case scenario).

> I don't think that computational chemistry has gotten to the point of proving things in medicine or even organic chemistry (which is still done using statistics gathered from scientific experiments).

Your information is outdated by about two decades. Nowadays computers are used quite extensively to save on experiments. And indeed it's quite effective: it finds enzymes which work in a certain way pretty well. The only thing it cannot do is prove that there are no bad side effects - but this is similar to the memcmp flaw mentioned before: it does not exist in the simplified world of a mathematical model of the real world, but it does exist in reality.

Geer: Trends in cyber security

Posted Dec 3, 2013 17:37 UTC (Tue) by Wol (subscriber, #4433) (3 responses)

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Correct! Which is why copyright, not patent, is an appropriate protection, if any. Oh, and if it's stored as a huge number it itself isn't a novel, song or movie anyway. Once again we get into philosophy - here it's semantics, or meaning. That huge number is meaningless without a human to interpret it :-)

> Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

Except that you are misreading it. I guess English is not your native language, but the crucial word is INSTRUCTIONS. I can give instructions till I'm blue in the face, but without something (hardware) to carry out the orders, nothing will happen. If I give you a list of instructions (a recipe), it won't make a cake magically appear in your kitchen. If I give you directions (again, taken from your Wikipedia quote), *nothing* will happen unless *you* (hardware) carry them out.

So no, software can NOT drive ANY hardware. What it can do (when fed through an ALU) is make the *computer* drive the hardware for you.

> You can prove certain things about chemical compounds, too. Why is chemistry not math, then?

Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect. (And yes, the use of air rather than oxygen complicates the maths, but reality still doesn't do what the maths says! :-)

> But then you start talking about the properties of the CPU, the caches, the memory and so on - and at this point you've left the realm of math and are firmly in the realm of science.

And at this point you have left the realm of software and are firmly in the realm of hardware! (Which is why you are, I would agree, in the realm of science :-)

> No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.

It's a flaw in the spec, and thus is a programming issue, sure. But programming is not software. The fact is, the *hardware* takes a different time to execute the instructions, depending on the result. The software has no concept of time.

And doesn't this "bug" have absolutely no effect on the result? As I understand it, you feed stuff into the algorithm, and observe how long it takes to calculate the answer. The software ALWAYS produces the SAME result (which is another proof it's maths!). The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.

Going back to Wikipedia, software is "a list of instructions". Without hardware/wetware, software can do absolutely nothing - it is an abstract concept. ALL software is maths. Once you put a computer in to the mix - any computer - you have left the realm of software.

You are clearly including the presence of a computer in your definition of software. I think pretty much all computer scientists would tell you you are wrong. Sorry. (And this is exactly the confusion American snakes^wlawyers are using to try and patent software.)

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 18:31 UTC (Tue) by malor (guest, #2973)

See, you've got this fundamental confusion going on here: you're mixing up hardware and software.

Software *is* math. If you prove a piece of software correct, it will always do what it is supposed to do -- *if* the hardware is correct. The fact that some hardware can potentially run a given software program incorrectly does not make it something other than math.

Software exists only in the abstract, and as such, it can be modeled perfectly as a mathematical process. Chemistry math is always inexact, because you cannot model every individual molecule. All you can do is make abstractions: approximation is the best result possible.

But software isn't like that. 2 + 2 is always 4, even if your handheld calculator claims it's 3.999998. A loop counting from 1 to 100 is a hard truth in exactly the same way that 2+2=4 is, even if the computer running it glitches out, and stops at 98.
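
In the abstract realm malor describes, such a fact can indeed be proven rather than tested - for instance, in the Lean proof assistant (my illustration, not part of the original comment):

    example : 2 + 2 = 4 := rfl

The proof checker accepts this by pure symbolic computation; no measurement of any physical machine is involved.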

You can argue that computer hardware is imperfect, because it is. You can argue that programmers and programming are imperfect, because they are. But you cannot (correctly) argue that software is not math.

You are, in essence, making the argument that two plus two is not four, because calculators are unreliable.

Geer: Trends in cyber security

Posted Dec 3, 2013 22:07 UTC (Tue) by khim (subscriber, #9252) (1 response)

> > But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.
> Correct! Which is why copyright, not patent, is an appropriate protection, if any.

Really? Since when can you apply copyright to “math”? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else, too (i.e. not math). I wish you luck if you try to push that silly “the contents of a CD is just a number” theory in court.

> You are clearly including the presence of a computer in your definition of software.

Well, sure. Software is a set of instructions for hardware, nothing more, nothing less. Hardware is its evil twin (well, maybe the other way around, but anyway: software without corresponding hardware is pretty pointless).

> I think pretty much all computer scientists would tell you you are wrong.

Nope. I know quite a few computer scientists who will agree with me. Indeed, Knuth's famous “Be careful about using the following code - I've only proven that it works, I haven't tested it” maxim is an important part of software development. If a CPU has an error then it's the software's job to mitigate said error; if software already exists and we want to develop a new piece of hardware then we need to deal with the software's expectations (or change the software). Software and hardware are intrinsically tied - one is useless without the other.

> Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect.

Well, software is developed in exactly the same fashion: until you actually run the thing on real hardware you won't know exactly how it works. Sometimes it works as expected, sometimes it's too slow, and sometimes it does not work because you've forgotten about some important property of the hardware (for example, if you are switching from x86 to ARM and thus are not prepared to deal with memory-ordering issues).
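
A minimal sketch of the kind of trap meant here (my own illustration, using C11 atomics): hand a value from one thread to another through a flag. With a plain, unordered flag this tends to "work" on x86, whose hardware keeps stores in order, but can fail under ARM's weaker memory model; the explicit release/acquire pair below is what makes it correct everywhere:

    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdio.h>

    int payload;          /* data handed from one thread to the other */
    atomic_int ready;     /* flag saying the payload is valid */

    static void *producer(void *arg)
    {
        (void)arg;
        payload = 42;
        /* Release ordering guarantees the payload store becomes visible
         * before the flag store.  A plain non-atomic flag often "works"
         * on x86 anyway, but on ARM the two stores may become visible
         * in the opposite order. */
        atomic_store_explicit(&ready, 1, memory_order_release);
        return NULL;
    }

    static void *consumer(void *arg)
    {
        (void)arg;
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ;             /* spin until the producer publishes */
        printf("payload = %d\n", payload);   /* prints 42, on any CPU */
        return NULL;
    }

    int main(void)
    {
        pthread_t p, c;
        pthread_create(&c, NULL, consumer, NULL);
        pthread_create(&p, NULL, producer, NULL);
        pthread_join(p, NULL);
        pthread_join(c, NULL);
        return 0;
    }

The "mathematical" meaning of the program is identical either way; whether it works depends on which hardware's ordering rules you assumed.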

> The software has no concept of time.

If your software has no concept of time then I agree that that software is probably math. Of course, this automatically excludes all the OSes, compilers, games, codecs and other interesting pieces of software, and moves the discussion into the realm of “how many angels can dance on the head of a pin?” questions.

> And doesn't this "bug" have absolutely no effect on the result?

Yes, because the goal of the software was the protection of the secret key - and it failed to do that.

> The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.

If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore, because how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?
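
Incidentally, the usual software-side fix for that flaw - a comparison whose running time does not depend on where the inputs first differ - is short; this is a generic sketch of the technique, not the actual Xbox 360 patch:

    #include <stddef.h>

    /* Constant-time equality check: every byte is examined no matter
     * where the first mismatch occurs, so the running time reveals
     * nothing about how long the matching prefix was. */
    int ct_compare(const unsigned char *a, const unsigned char *b, size_t n)
    {
        unsigned char diff = 0;
        for (size_t i = 0; i < n; i++)
            diff |= a[i] ^ b[i];
        return diff;      /* 0 means equal, nonzero means different */
    }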

Geer: Trends in cyber security

Posted Dec 4, 2013 17:27 UTC (Wed) by Wol (subscriber, #4433)

> If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore, because how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?

Because as soon as you add hardware to the mix IT'S NOT SOFTWARE!

As for "how do you check?", I guess you must be a youngster. Us oldsters did it the hard way. WE DIDN'T have hardware to hand, and had to prove it on paper. Seriously. That's what we did!

My first boss, in the very job he later took me on in, had to write a program without any hardware to write it on. When the computer finally arrived, six months later, the office typists typed it in and it ran. Flawlessly.

May I please refer you back to that very wikipedia article you yourself referenced - software is A LIST OF INSTRUCTIONS. And, without a computer (you DO know the first computers were wetware, not hardware?) that list of instructions is totally useless.

Look at what you wrote! "If you separate the software from the hardware"!!! You're redefining meanings to suit your current needs. You can't do that!!! You're defining software to include hardware and now you're implying that they CAN be separated. If they can (as I would argue, as language itself implies that they can) then your (re)definition doesn't work.

I'm quite happy with a requirement that a PROGRAM needs hardware to be useful. I'm quite happy that programming is not maths (it's "doing maths", which isn't the same thing at all :-). But the software itself is just maths. Because the human readable source code, the binary, the executable stored as pits on a CD, the executable as stored as magnetic poles on a hard drive or capacitive charges on an SSD, IS THE SAME THING AS FAR AS MATHS IS CONCERNED.

Software is maths. Hardware is reality. A program needs both of them to work.

Cheers,
Wol

Geer: Trends in cyber security

Posted Dec 3, 2013 13:37 UTC (Tue) by Wol (subscriber, #4433)

Reading what you wrote about the memcmp flaw rather more closely, it is obviously a hardware issue :-)

Whether the flaw appears depends on what hardware you run it on!

Oh - another definition of software - you can run it on wetware (discounting the time taken to do it). You can clearly run this memcmp code on wetware :-) and the timing issues (SCIENCE!) will clearly be very different. But if you run it on wetware you can prove it correct! I.e. it's maths!

Cheers,
Wol

