
Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 23, 2020 7:17 UTC (Thu) by scientes (guest, #83068)
In reply to: Garrett: Linux kernel lockdown, integrity, and confidentiality by NYKevin
Parent article: Garrett: Linux kernel lockdown, integrity, and confidentiality

I appreciate your effort. Besides the property rights mentioned above, I am mainly talking about how Linux is impossible to make secure because of its massive attack surface. The main way to show this is to show what it is NOT, which is software written from the beginning to be proven correct:

https://microkerneldude.wordpress.com/2016/06/16/verified...


Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 23, 2020 10:44 UTC (Thu) by smoogen (subscriber, #97) [Link] (5 responses)

When I sat in a class on proven software in the 1980s, the professor, who was all for it, wanted to stress that it isn't perfect unless you map out the entire universe in your proof. All it takes to destroy most proofs is some change in the environment, be it running on a different CPU, on different hardware, or even with a different orientation of the system. [The example given was that software which had been proven correct would fail, in a way outside of the proof, if the computer was turned 90 degrees during the day. It worked fine at night in that orientation. The reason turned out to be a microwave dish in another lab which affected various components inside the computer just slightly enough to make them not react properly.]

In the end, you can't make anything absolutely secure and generally usable. You can make it resilient and robust, but if people or the environment can interact with the system at all, there will eventually be a way to make it insecure.

Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 23, 2020 11:16 UTC (Thu) by Wol (subscriber, #4433) [Link]

This is the difference between Mathematics and Science.

Your program proof will prove that your program is correct and consistent.

You then need to test it, to make sure (as best you can!) that theory and reality agree.

Mathematical proofs prove that things are *consistent*. Scientific proofs prove that reality disagrees with mathematics.
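
One small, concrete illustration of that gap (just a sketch, nothing kernel-specific): the textbook binary search is easy to prove correct over unbounded integers, yet a literal translation using the naive midpoint overflowed fixed-width ints in real library code (famously in the JDK's binarySearch) once arrays grew past about a billion elements.

    #include <stdio.h>

    /*
     * Binary search over a sorted int array, returning the index of key
     * or -1.  Over unbounded ("mathematical") integers the algorithm is
     * easy to prove correct; with 32-bit ints the naive midpoint
     * (lo + hi) / 2 overflows once lo + hi exceeds INT_MAX, so a proof
     * built on the idealized model quietly stops describing the machine.
     */
    static int bsearch_int(const int *a, int n, int key)
    {
        int lo = 0, hi = n - 1;

        while (lo <= hi) {
            /* int mid = (lo + hi) / 2;   correct in the model, can overflow in C */
            int mid = lo + (hi - lo) / 2; /* safe for all valid lo and hi */

            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid - 1;
            else
                return mid;
        }
        return -1;
    }

    int main(void)
    {
        int a[] = { 1, 3, 5, 7, 9 };
        printf("%d\n", bsearch_int(a, 5, 7));   /* prints 3 */
        return 0;
    }

The proved algorithm and the broken implementation differ by a single line; that is exactly the kind of environmental assumption that testing against reality is meant to catch.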

Cheers,
Wol

Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 24, 2020 12:52 UTC (Fri) by jhhaller (guest, #56103) [Link] (3 responses)

The one I remember was a UK company (INMOS?) that was going to build a provably correct microprocessor. Unfortunately, they didn't account for silicon defects in actual implementations of the microprocessor. That's not to say formally verifying the design is bad, just that it won't guarantee a particular chip will always work the way the specification says it should.

Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 24, 2020 17:49 UTC (Fri) by zlynx (guest, #2285) [Link] (2 responses)

Yep. Reminds me of a security story I heard a long time ago, about pen-testing a very fancy biometric lock system. An iris scanner, if I recall correctly.

One attacker bypassed the whole thing by inducing a signal into the wire in the wall that triggered the unlock relay. So much for security scanners.

And then there was the smart card whose secret keys could be extracted by observing its outputs while applying a hair dryer to the card.

Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 24, 2020 18:00 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

You can trick a LOT of electronic locks using a simple magnet to trigger the motor relay.

Garrett: Linux kernel lockdown, integrity, and confidentiality

Posted Apr 24, 2020 18:09 UTC (Fri) by kmweber (guest, #114635) [Link]

And to a large extent, it matters just what you're trying to protect against.

In college, I had a summer job that involved (among other things) cleaning and performing routine maintenance on large injection molding machines at an automobile factory. This required physically entering the machines. The molds used weighed on the order of forty tons; consequently, the hydraulic presses generated several thousand tons of force. You did NOT want to be stuck inside that thing if it were accidentally powered up.

Consequently, as is standard practice in such environments, we used a lockout/tagout process that involved placing padlocks to immobilize the switches and valves for the power sources, and then placing the keys for those padlocks in a lock box. The lock box was secured with an individual padlock for each person on the team, and each person kept their own key on them. That way, it could only be opened (and thus the padlocks on the switches could only be unlocked) once everyone involved had come out of the machine and removed their personal padlock.

Of course, the lock box was made of fairly brittle plastic, and the padlocks themselves were of the sort that you could cheaply purchase at any hardware store. So it wouldn't have been difficult at all for someone with ill intent to bypass the whole system--all you'd need is a hammer or bolt cutters. But that was fine, because it wasn't intended to protect against malice. It was there to prevent accidents, and it did so quite effectively.

