Geer: Trends in cyber security

Posted Dec 3, 2013 17:37 UTC (Tue) by Wol (subscriber, #4433)
In reply to: Geer: Trends in cyber security by khim
Parent article: Geer: Trends in cyber security

> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

Correct! Which is why copyright, not patent, is the appropriate protection, if any. Oh, and if it's stored as a huge number it isn't itself a novel, song or movie anyway. Once again we get into philosophy - here it's semantics, or meaning. That huge number is meaningless without a human to interpret it :-)
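
To make the "one huge number" point concrete, here's a minimal sketch (hypothetical C, any file name would do) that prints an arbitrary file - novel, song or executable, it makes no difference - as a single enormous base-16 number:

    #include <stdio.h>

    /* Print the bytes of a file as one huge hexadecimal number.
     * "somefile" is a placeholder; substitute any file you like. */
    int main(void)
    {
        FILE *f = fopen("somefile", "rb");
        int c;

        if (!f) {
            perror("somefile");
            return 1;
        }
        printf("0x");
        while ((c = fgetc(f)) != EOF)
            printf("%02x", c);
        putchar('\n');
        fclose(f);
        return 0;
    }

The number comes out the same whatever the file "means" - the meaning only exists once something (or someone) interprets it.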

> Wikipedia contains a pretty good definition: Computer software, or just software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. As you can see, the primary property of software is not the fact that it contains zeros and ones (many other things are composed of zeros and ones when stored on a CD), but the fact that it can drive certain hardware.

Except that you are misreading it. I guess English is not your native language, but the crucial phrase is "a set of INSTRUCTIONS". I can give instructions till I'm blue in the face, but without something (hardware) to carry out those orders nothing will happen. If I give you a list of instructions (a recipe) it won't make a cake magically appear in your kitchen. If I give you directions (again taken from your Wikipedia quote) *nothing* will happen unless *you* (the hardware) carry them out.

So no, software can NOT drive ANY hardware. What it can do (when fed through an ALU) is make the *computer* drive the hardware for you.

> You can prove certain things about chemical compounds, too. Why chemistry is not a math, then?

Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect. (And yes, the use of air rather than oxygen complicates the maths, but reality still doesn't do what the maths says! :-)
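
For reference, the stoichiometry being alluded to - complete combustion of iso-octane - balances as

    2 C8H18 + 25 O2 -> 16 CO2 + 18 H2O

while a real engine also produces carbon monoxide, NOx and unburnt hydrocarbons: exactly the gap between what the maths predicts and what you observe.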

> But then you start talking about properties of the CPU, caches, memory and so on - and at this point you've left the realm of math and are firmly in the realm of science.

And at this point you have left the realm of software and are firmly in the realm of hardware! (Which is why you are, I would agree, in the realm of science :-)

> No, it's a software issue. The software was written with the wrong model of the hardware in mind, thus it had a flaw and it was possible to exploit said flaw.

It's a flaw in the spec, and thus a programming issue, sure. But programming is not software. The fact is, the *hardware* takes a different amount of time to execute the instructions, depending on the result. The software has no concept of time.

And doesn't this "bug" have absolutely no effect on the result? As I understand it, you feed stuff into the algorithm, and observe how long it takes to calculate the answer. The software ALWAYS produces the SAME result (which is another proof it's maths!). The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.
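
To make the timing point concrete, here's a minimal sketch (hypothetical C, not taken from any real crypto library) of the kind of comparison at issue. Both functions compute exactly the same result - the maths is identical - but the first one's running time depends on where the first mismatch is, which is precisely what the attacker measures:

    #include <stdio.h>
    #include <stddef.h>

    /* Naive comparison: returns as soon as a byte differs, so the
     * running time leaks how many leading bytes of the guess were
     * right - even though the result is always correct. */
    int naive_equal(const unsigned char *a, const unsigned char *b, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            if (a[i] != b[i])
                return 0;       /* early exit: time depends on the data */
        return 1;
    }

    /* Constant-time variant: the same mathematical function, but the
     * loop always runs to the end, so the timing reveals nothing. */
    int ct_equal(const unsigned char *a, const unsigned char *b, size_t n)
    {
        unsigned char diff = 0;
        for (size_t i = 0; i < n; i++)
            diff |= a[i] ^ b[i];
        return diff == 0;
    }

    int main(void)
    {
        unsigned char secret[4] = { 0xde, 0xad, 0xbe, 0xef };
        unsigned char guess[4]  = { 0xde, 0xad, 0x00, 0x00 };

        printf("naive: %d, constant-time: %d\n",
               naive_equal(secret, guess, 4), ct_equal(secret, guess, 4));
        return 0;
    }

Both return the same answer for the same inputs; the only difference is how long the hardware spends getting there.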

Going back to Wikipedia, software is "a set of instructions". Without hardware/wetware, software can do absolutely nothing - it is an abstract concept. ALL software is maths. Once you put a computer into the mix - any computer - you have left the realm of software.

You are clearly including the presence of a computer in your definition of software. I think pretty much all computer scientists would tell you you are wrong. Sorry. (And this is exactly the confusion American snakes^wlawyers are using to try and patent software.)

Cheers,
Wol



Geer: Trends in cyber security

Posted Dec 3, 2013 18:31 UTC (Tue) by malor (guest, #2973)

See, you've got this fundamental confusion going on here: you're mixing up hardware and software.

Software *is* math. If you prove a piece of software correct, it will always do what it is supposed to do -- *if* the hardware is correct. The fact that some hardware can potentially run a given software program incorrectly does not make it something other than math.

Software exists only in the abstract, and as such, it can be modeled perfectly as a mathematical process. Chemistry math is always inexact, because you cannot model every individual molecule. All you can do is make abstractions: approximation is the best result possible.

But software isn't like that. 2 + 2 is always 4, even if your handheld calculator claims it's 3.999998. A loop counting from 1 to 100 is a hard truth in exactly the same way that 2+2=4 is, even if the computer running it glitches out, and stops at 98.

You can argue that computer hardware is imperfect, because it is. You can argue that programmers and programming are imperfect, because they are. But you cannot (correctly) argue that software is not math.

You are, in essence, making the argument that two plus two is not four, because calculators are unreliable.

Geer: Trends in cyber security

Posted Dec 3, 2013 22:07 UTC (Tue) by khim (subscriber, #9252)

>> But by that logic any novel, any song, and any movie stored on a CD is “simply one huge number”, too.

> Correct! Which is why copyright, not patent, is the appropriate protection, if any.

Really? Since when can you apply copyright to math? Either it's “simply one huge number” (i.e. math) and cannot be patented, copyrighted or trademarked, or it's not just a number but something else too (i.e. not math). I'll wish you luck if you try to push that silly “the contents of a CD is just a number” theory in court.

> You are clearly including the presence of a computer in your definition of software.

Well, sure. Software is a set of instructions for hardware, nothing more, nothing less. Hardware is its evil twin (or maybe the other way around, but either way: software without corresponding hardware is pretty pointless).

> I think pretty much all computer scientists would tell you you are wrong.

Nope. I know quite a few computer scientists who would agree with me. Indeed, Knuth's famous maxim "Beware of bugs in the above code; I have only proved it correct, not tried it" is an important part of software development. If the CPU has an error then it's the software's job to mitigate said error; if the software already exists and we want to develop a new piece of hardware then we need to deal with the software's expectations (or change the software). Software and hardware are intrinsically tied: one is useless without the other.

> Can you prove a chemical compound is true? Sod's law, but you've picked on a chemist here :-) And yes, it's easy to use maths to prove what chemicals SHOULD do (chemists do it all the time), but guess what! When you use SCIENCE to OBSERVE, they don't do what the maths says they should! Take for example the internal combustion engine. Iso-octane plus oxygen gives carbon dioxide and water (plus heat and motion). Except it doesn't. This is science - using maths to calculate what *should* happen, then observing to see what *does* happen. And it never does exactly what you expect.

Well, software is developed in exactly the same fashion: until you actually run the thing on real hardware you won't know exactly how it works. Sometimes it works as expected, sometimes it's too slow, and sometimes it does not work at all because you've forgotten about some important property of the hardware (for example, if you are switching from x86 to ARM and are not prepared to deal with memory coherency issues).
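
A minimal sketch of that x86-to-ARM trap (hypothetical C, not from any real code base): the classic publish-a-flag idiom. With plain stores it often appears to work on x86, whose stronger memory ordering happens to keep the two stores in order, and then fails intermittently on ARM; C11 release/acquire ordering, as written below, makes it correct on both:

    #include <stdio.h>
    #include <pthread.h>
    #include <stdatomic.h>

    static int data;            /* payload being handed off */
    static atomic_int ready;    /* publication flag */

    static void *writer(void *arg)
    {
        (void)arg;
        data = 42;
        /* release: everything written before this store is visible
         * to a thread that observes ready == 1 with acquire */
        atomic_store_explicit(&ready, 1, memory_order_release);
        return NULL;
    }

    static void *reader(void *arg)
    {
        (void)arg;
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ;                            /* spin until published */
        printf("data = %d\n", data);     /* guaranteed to print 42 */
        return NULL;
    }

    int main(void)
    {
        pthread_t w, r;
        pthread_create(&w, NULL, writer, NULL);
        pthread_create(&r, NULL, reader, NULL);
        pthread_join(w, NULL);
        pthread_join(r, NULL);
        return 0;
    }

Drop the release/acquire (use plain int stores and loads) and the program is still "the same maths" - but on ARM the reader may print 0.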

> The software has no concept of time.

If your software has no concept of time then I agree that that software is probably math. Of course this automatically excludes all the OSes, compilers, games, codecs and other interesting pieces of software, and moves the discussion into the realm of “how many angels can dance on the head of a pin?” questions.

> And doesn't this "bug" have absolutely no effect on the result?

Yes, because the goal of the software was to protect the secret key, and it failed to do that.

> The attacker is interested in the time the hardware takes to produce it, therefore the attack is against the hardware, and as I say it may be a *programming* issue, but it's not a *software* issue - the software is always correct.

If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore: how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?

Geer: Trends in cyber security

Posted Dec 4, 2013 17:27 UTC (Wed) by Wol (subscriber, #4433)

> If you separate the software from the hardware then you get a quite useless set of zeros and ones, sorry. It's not even software anymore: how can you check whether something is software or not if you don't have the list of instructions accepted by the hardware on hand?

Because as soon as you add hardware to the mix IT'S NOT SOFTWARE!

As for "how do you check?", I guess you must be a youngster. We oldsters did it the hard way. We DIDN'T have hardware to hand, and had to prove it on paper. Seriously. That's what we did!

My first boss, in the very job where he later took me on, had to write a program without any hardware to run it on. When the computer finally arrived, six months later, the office typists typed it in and it ran. Flawlessly.

May I please refer you back to that very Wikipedia article you yourself referenced - software is A SET OF INSTRUCTIONS. And, without a computer (you DO know the first computers were wetware, not hardware?), that set of instructions is totally useless.

Look at what you wrote! "If you separate the software from the hardware"!!! You're redefining meanings to suit your current needs. You can't do that!!! You're defining software to include hardware and now you're implying that they CAN be separated. If they can (as I would argue, as language itself implies that they can) then your (re)definition doesn't work.

I'm quite happy with a requirement that a PROGRAM needs hardware to be useful. I'm quite happy that programming is not maths (it's "doing maths", which isn't the same thing at all :-). But the software itself is just maths. Because the human-readable source code, the binary, the executable stored as pits on a CD, the executable stored as magnetic poles on a hard drive or capacitive charges on an SSD, ARE ALL THE SAME THING AS FAR AS MATHS IS CONCERNED.

Software is maths. Hardware is reality. A program needs both of them to work.

Cheers,
Wol

