While organizations are generally good about installing network firewalls, we have largely forgotten the basic purpose of a firewall: to segment an organization's data and provide access control between the pieces of it that matter. Most organizations don't want to spend the time and money it takes to actually understand their data flows well enough to segment them.
SCADA systems connected in any fashion to the internet are the perfect example of this.
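To make the segmentation point concrete, here is a minimal sketch of what internal access control could look like with Linux iptables on a router between segments. The addresses, interface, and the single permitted host are all hypothetical; the idea is simply default-deny between segments with one explicit exception.

```shell
# Sketch: keep an assumed SCADA segment (10.10.0.0/24) walled off from the
# rest of the network. Default-deny forwarding, then allow only one
# hypothetical historian host (10.10.5.20) to reach PLCs on Modbus/TCP (502).
iptables -P FORWARD DROP
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A FORWARD -s 10.10.5.20 -d 10.10.0.0/24 -p tcp --dport 502 -j ACCEPT
```

Everything else between the segments is dropped, which is exactly the kind of rule set you can only write once you actually understand your data flow.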
Many organizations really should have separate infrastructure for their business-critical operations and another set of systems for their internet-accessible operations, including two separate machines on people's desktops. At a minimum, they would use terminal servers with access controls to restrict the flow of data into and out of the company environment.
This is never going to happen. There are too many efficiencies gained by having everything easily accessible, and it is so much simpler not to segment, that only rare and critical environments are ever treated this way.
Personally, what I think has some merit is sandboxing technologies like SELinux, which aim to firewall applications from one another. The goal, which may not have been reached yet, is that the kind of data-driven exploit so common these days would not have enough access, once it got onto a system, to steal important data or leverage local exploits to grant itself further access. Of course, once an attacker can reach the local kernel well enough to exploit one of its bugs, the game is over, and any security hardening or structure in place is totally meaningless.
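As a sketch of what this application-level firewalling looks like in practice, SELinux ships a sandbox utility (from policycoreutils) that runs a command in a confined domain. Assuming that utility and its sandbox_x_t domain are installed, opening an untrusted document might look like:

```shell
# Run a PDF viewer inside an SELinux sandbox domain with its own temporary
# home directory and X server. If the document carries a data-driven exploit,
# the compromised viewer process is confined by policy and cannot read the
# user's real files or open arbitrary network connections.
sandbox -X -t sandbox_x_t evince untrusted.pdf
```

The viewer here is just an example; the point is that the mandatory policy, not the application, decides what the process may touch.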
I don't know whether sandboxing applications will raise the bar enough. It is probably a safe bet that we are in a world where the attackers will just press the "override security" button like in some '80s hacker movie, and while there may be islands of hardness out there, it will never be the norm.