Anti-virus to protect against anti-virus vulnerabilities
Posted Apr 12, 2006 18:41 UTC (Wed)
by cventers (guest, #31465)
[Link] (29 responses)
The thing is that we have the technology to stop that nonsense. PaX and
As for viruses themselves, I think the tried and true UNIX security model
I think the biggest challenge we're facing in this area, right now, boils
It's really too bad that mechanisms like the setuid bit aren't a little
But the biggest thing I see here is also something inherently possible in
Since I almost never download / compile / install applications manually,
Posted Apr 12, 2006 18:51 UTC (Wed)
by micampe (guest, #4384)
[Link] (18 responses)
I think the biggest challenge we're facing in this area, right now, boils down to usability. If it were easier to obtain a Linux distribution under which all common applications would behave naturally running without special privileges, the temptation of logging in as 'root' would be gone. When you remove this temptation, what is a virus to do -- it certainly can't modify anything important to the system.
Posted Apr 12, 2006 19:07 UTC (Wed)
by cventers (guest, #31465)
[Link] (9 responses)
Also, a normal desktop user doesn't give things +x, typically because
If viral / malware code cannot as much as run, what damage can it do?
Posted Apr 12, 2006 20:46 UTC (Wed)
by smoogen (subscriber, #97)
[Link] (8 responses)
Today's virus/malware targets are the following:
Download its toolkit (spam relay, keylogger, search and send equiv).
Most email, internet browsers, and desktop environments give all the tools to do this. Heck, all you need to do is put a ~/Desktop/freeporn.desktop on the user's desktop for most of this to be self-fulfilling.
Several helpful desktops will un-zip/un-tar packages for you and you can just execute them with user interaction (social engineering)... or you can try to execute various exploits in many plugins, browsers, email clients, etc. to do the same.
The spam relay, keyboard logger and search for gnucash docs and send do not require root access. (The keyboard logger may only log the one user but that can be devastating enough).
The only reason you see the comparative lack of malware is not the security model but the fact that less than 10% of useful targets run desktop Linux.
Posted Apr 12, 2006 22:46 UTC (Wed)
by man_ls (guest, #15091)
[Link] (7 responses)
Finally, the security model is actually better than Windows', and this can slow down the expansion of a virus. If this brings it below the percolation threshold, then the virus will not propagate very much. Similarly, if the added difficulty makes the activity unprofitable, then phishers and other scammers will not target the platform.
Posted Apr 13, 2006 8:52 UTC (Thu)
by NAR (subscriber, #1313)
[Link] (6 responses)
Hm, what was it, three security-related stable kernel releases in two days? Not exactly the sign of better designed and implemented systems...
Then there is diversity
There might be diversity in distributions, but the applications are the same everywhere. An OpenSSH bug affects nearly everyone. A glibc bug also. But you're right - as long as relatively few people are using (a specific variant of) Linux, it won't be worth the work to create malware for Linux. But I'm afraid as soon as Linux reaches "world domination", it will become the primary target of malware authors and there will be Linux viruses and worms.
Posted Apr 13, 2006 11:30 UTC (Thu)
by man_ls (guest, #15091)
[Link]
But it's a good point I could have added to my little list. The kernel development process is so open and flexible that bugs are closed as soon as they are found, and there is a special branch for that; if this does not suit them, distributions can (and do) patch their own branches; if it was really serious, we users might patch and install our own kernels.
Contrast this with proprietary development, where flaws are not found so easily; when they are, they are not solved so fast; and often nobody knows what is being solved. The problem with this approach is that sometimes it seems better to just wait for the regular upgrade cycle and hope no one notices: wrong!
A centralized patch management strategy is another issue: distributions tend to aggregate lots of software, and any security issues will generally be solved by them. Proprietary systems also aggregate lots of software, but administrators (or users) have to go hunting to know about and get the relevant patches for every tiny piece. I would guess most of them give up.
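As a rough illustration (a Debian-style system is assumed here; yum, urpmi or emerge play the same role elsewhere), keeping the whole machine patched is one command run as root:

apt-get update && apt-get upgrade

One command covers every packaged application on the box, which is exactly what the hunt-per-vendor model cannot offer.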
Promiscuity could be mentioned too. GNU/Linux users often rely on a few sources to provide them with software, sometimes just the distribution itself (witnessed by the aphorism "if it's not in Debian it doesn't exist"). Proprietary software users, in contrast, are more likely to download and install software from random sites; not to speak about software from the darknet, which is a completely unreliable source.
Yes, in the end there will be Linux worms but they need not be so devastating as those found now. We have learnt, we will be prepared.
Posted Apr 13, 2006 13:32 UTC (Thu)
by hazelsct (guest, #3659)
[Link]
But this is off topic for XSS.
Posted Apr 13, 2006 15:48 UTC (Thu)
by ljt (guest, #33337)
[Link] (1 responses)
But OSS software *is* already dominating the (internet) world: apache, bind, ntpd, sendmail, openssh, ... OSS is in fact _the_ target to take over the internet, it's all over there and there is even the source to get your proof of concept working!
Yet worms tend to appear in the win32 world. I wonder why...
Posted Apr 13, 2006 21:15 UTC (Thu)
by hppnq (guest, #14462)
[Link]
(Where's David Wheeler when you need him?)
Posted Apr 13, 2006 20:06 UTC (Thu)
by cventers (guest, #31465)
[Link] (1 responses)
And when that line of defense crumbles, we have technology like PaX
And when that line of defense crumbles, we have the UNIX security model.
Compare this to Windows which is such a breeding ground for malware that
Malware authors may very well target Linux, but they're going to have a
Posted Apr 14, 2006 9:01 UTC (Fri)
by NAR (subscriber, #1313)
[Link]
It's one thing that the bugs are getting fixed. It's a completely different story whether the users actually install the patches.
we have technology like PaX [...] we have the UNIX security model.
Most of the spyware/adware I've seen on Windows was installed by the user himself. PaX or the UNIX security model does not defend against this kind of threat - the line of defense is that there's a lot less software for Linux and the installation of non-trusted software is a lot more complicated (wget, untar, configure, make) than running an .exe. I'm afraid part of reaching "world domination" on the desktop is to remove these defenses.
Posted Apr 12, 2006 19:21 UTC (Wed)
by bockman (guest, #3650)
[Link] (1 responses)
Posted Apr 14, 2006 0:06 UTC (Fri)
by drag (guest, #31333)
[Link]
I feel that very soon the major desktop environments (well, Gnome and KDE) should start looking very closely at applications that are meant to handle files directly off the internet.
The browser, any AIM client with file-sharing capabilities, the IRC client, the email client, and probably the 'unzip'-style tools.
I think it would be great to have a little GUI item for a 'security center'. You'd have sensible defaults for these sorts of applications for something like AppArmor. Something that you can activate, is tested, and is supported by the desktop makers. And also a way to easily enable something like ClamAV to scan files being copied to the home directory or tmp.
Or something like that. The capabilities for this are certainly present now... it's just that you need some way for normal users to use this stuff. People have been talking about MAC stuff for a while, saying it is an example of Linux's superiority, but so far it's fairly unusable for protecting users' data.
Keeping everybody from running as root is only a partial solution. It makes it much more difficult for malware makers to make stuff that will hide (i.e. a kernel-module rootkit), but it still leaves users vulnerable to data theft, keyloggers, and other such nasties.
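To make that concrete, a default profile for such an internet-facing application could look roughly like the sketch below. This is only an illustration: the program path, abstraction names and directory layout are made up, not something any desktop ships today.

#include <tunables/global>

/usr/bin/example-browser {
  #include <abstractions/base>

  /usr/bin/example-browser mr,       # map and read its own binary
  @{HOME}/** r,                      # read the user's files
  @{HOME}/Downloads/** rw,           # but only write under Downloads
  @{HOME}/.example-browser/** rw,    # ...and its own configuration
  # anything not listed above, including execute access in $HOME, is refused
}

A 'security center' would mostly be a friendly switch for turning profiles like this on and off.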
Posted Apr 13, 2006 2:08 UTC (Thu)
by Ross (guest, #4065)
[Link] (2 responses)
Of course the hole in that plan is that there are so many routes for local root vulnerabilities -- those aren't always found and patched quickly like remote root vulnerabilities.
Posted Apr 14, 2006 1:18 UTC (Fri)
by drag (guest, #31333)
[Link] (1 responses)
Denying users root access protects the system from the users, and the users from each other.
In a server environment this is very good and the ideal situation to have, but a typical PC is a single-user environment. The most important items to a user are in their home directory.
The inconvenience of having to re-install an operating system is nothing compared to getting your bank account emptied because somebody pulled the password for your online banking out of a Firefox session.
So if your goal is building a desktop operating system then your priorities, in terms of security, differ from a multiuser or server environment.
So your priority shifts from "limit the damage a user can do" to "protect that user's files and secrets at all costs". That single user's information is the priority.
Those files are the most important part of the system. If they get violated, what does it matter that the attacker never got root? The attacker has everything they wanted without even thinking about becoming root. Root is immaterial.
In addition, the tendency for distros like Ubuntu to ship with 'sudo' enabled by default pretty much defeats the whole user separation thing anyway. Even if there were no exploitable local user hole (which is very unlikely), it is still pretty easy to lift the password from the user if you control their home directory environment.
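A sketch of how little it takes, assuming only that the attacker can write to your dotfiles (every path and filename here is invented for the illustration):

mkdir -p ~/.hidden
cat > ~/.hidden/sudo <<'EOF'
#!/bin/sh
printf '[sudo] password for %s: ' "$USER"
stty -echo; read -r pw; stty echo; echo
echo "$pw" >> /tmp/.lifted             # the "lifted" password
echo "$pw" | /usr/bin/sudo -S "$@"     # then behave normally
EOF
chmod +x ~/.hidden/sudo
echo 'PATH=$HOME/.hidden:$PATH' >> ~/.bashrc

The next time the user types 'sudo anything', the fake wrapper sees the password first.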
Posted Apr 16, 2006 1:05 UTC (Sun)
by Ross (guest, #4065)
[Link]
Posted Apr 13, 2006 14:55 UTC (Thu)
by djao (guest, #4263)
[Link] (2 responses)
Posted Apr 14, 2006 1:25 UTC (Fri)
by drag (guest, #31333)
[Link] (1 responses)
Passwords for email or websites get stored there. A person can modify your path statement so that you launch a trojaned application or whatnot to steal other information.
All sorts of stuff like that.
So if your goal is to protect your information, it doesn't matter if you have clean backups. If the attacker (or the attacker's automated virus or worm) is able to steal confidential information, then you're screwed anyway. The damage has been done.
Personally I have an encfs directory that I keep secret stuff in, but most users aren't going to know how to use that sort of thing. It would be nice to have that sort of thing built into the desktop environment.
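For what it's worth, the manual version is only a few commands (directory names are illustrative; encfs asks for a passphrase on first use):

$ encfs ~/.secret.raw ~/secret       # mount an encrypted view of ~/.secret.raw
$ mv bank-details.txt ~/secret/      # plaintext exists only while mounted
$ fusermount -u ~/secret             # unmount; only ciphertext stays on disk

Wrapping exactly this in a desktop 'private folder' dialog is the missing piece.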
Posted Apr 19, 2006 1:16 UTC (Wed)
by GreyWizard (guest, #1026)
[Link]
Posted Apr 12, 2006 18:56 UTC (Wed)
by NAR (subscriber, #1313)
[Link] (1 responses)
Except deleting that .tex file in the user's home directory that contains her just-finished thesis, of which she hasn't made a backup yet. Or it can open port 6284 and listen for a magic command that would make the computer take part in a DDoS attack. I believe that on a workstation/home system the most important files are not in /usr, but in /home, so while the traditional UNIX system can protect the former, it's not enough.
Posted Apr 13, 2006 8:11 UTC (Thu)
by trochej (guest, #35052)
[Link]
But it won't touch the sources in my home directory or my daughter's homework in her home directory. And since you said it already: "she hasn't made a backup yet", which means she did make backups, right? Every day, of course, right? Because malware is not the only danger to your data. So, she did, didn't she? And while she's out looking for that backup, I log in to my undamaged home and do some coding.
Posted Apr 12, 2006 21:26 UTC (Wed)
by sholdowa (guest, #34811)
[Link]
The Unix security model has proved itself to be more robust than most over the last 40 or so years, especially when the base model is used in conjunction with ACLs. Let alone the addition of SELinux (:
Posted Apr 13, 2006 3:17 UTC (Thu)
by piman (guest, #8957)
[Link] (6 responses)
I don't remember a time when this has ever been true. You can always run the appropriate interpreter/runtime linker for the program to execute it, even without +x set.
Posted Apr 13, 2006 11:17 UTC (Thu)
by zotz (guest, #26117)
[Link] (4 responses)
How do you see a virus taking advantage of this fact? How is it going to get executed to run the interpreter and invoke itself without being +x? (Granted, I may be a little dense this early in the morning.)
all the best,
drew
Posted Apr 13, 2006 15:54 UTC (Thu)
by bronson (subscriber, #4806)
[Link] (3 responses)
$ echo "echo Howdy" >> /tmp/tt
Running a binary executable without +x:
$ chmod a-x /bin/echo
Trivial. The +x bit is just for convenience. I'm really surprised that there are still people that think it adds any sort of security whatsoever.
Posted Apr 13, 2006 16:05 UTC (Thu)
by bronson (subscriber, #4806)
[Link] (1 responses)
Posted Apr 13, 2006 19:58 UTC (Thu)
by cventers (guest, #31465)
[Link]
It doesn't matter that you can beg the linker to load it for you -- to do
Posted Apr 13, 2006 21:49 UTC (Thu)
by man_ls (guest, #15091)
[Link]
The issue here is precisely about the convenience that +x permissions represent: a user clicking on a file attached to a mail message. On Linux desktops any random file that you download needs to have its permissions raised before it can run; if you can do it, then you (hopefully) know enough to be careful.
Posted Apr 13, 2006 21:39 UTC (Thu)
by cventers (guest, #31465)
[Link]
Suppose that the linker didn't allow you to load up stuff that wasn't +x.
In fact -- someone else could write and compile this, and convince you to
But if that in and of itself doesn't have +x, you can't accidentally run
Posted Apr 12, 2006 19:33 UTC (Wed)
by felixfix (subscriber, #242)
[Link]
Seems to me that spam, viruses, phishing, etc. depend to a great extent on botnets, legal and otherwise, and that these botnets can't be all that hard to find, and to shut down or hijack, if one has the resources. Since the US's NSA, for instance, is widely reputed to monitor the net for scare words, it seems highly unlikely that they are not aware of these botnets and the traffic that controls them.
I propose that the reason we still have so much spam, zombies, and botnets is because the NSA and its ilk want to keep them available "in case of emergency" such as diverting them to DDoS their enemies in case of, well, who knows what ... This also means that if anyone actually comes up with a good way of clobbering botnets or protecting against them, that would not please the NSA, and they might try to sabotage such efforts.
I hope no one takes me seriously, especially the NSA folks.
Posted Apr 12, 2006 21:31 UTC (Wed)
by ccchips (subscriber, #3222)
[Link] (2 responses)
Posted Apr 14, 2006 5:06 UTC (Fri)
by AnswerGuy (guest, #1256)
[Link]
As for those organizations which can better afford to purchase and
If they have to "send around a specialist" to do so then they have bigger
The gateways don't represent an undue cost ... and doing both might not be
"Patching" ClamAV across your systems in your enterprise should amount to running a script to scp/ssh to each of the hosts or something equally trivial. Modern Linux systems in an enterprise should be
So the "sending around a specialist" should, be more like: drop the updated packages into the magic directory on the company master YUM/APT repository and let all the systems pick them up and apply them according to your existing deployment policies. If it's any worse than that then you don't have an effect patch/update management system and should really solve the strategic infrastructural issue which will then solve such little problems as this in due course.
JimD
Posted Apr 16, 2006 1:12 UTC (Sun)
by Ross (guest, #4065)
[Link]
Posted Apr 13, 2006 7:40 UTC (Thu)
by gvy (guest, #11981)
[Link]
This time, running software to cover Windows bugs might make *NIX systems more vulnerable. Ouch.
The virus 'problem' is one that amuses me greatly.
SSP offer great minimal-impact ways in which we can defend ourselves from
many exploits.
comes into play here. If you don't give an application or script +x, it
doesn't run.
down to usability. If it were easier to obtain a Linux distribution under
which all common applications would behave naturally running without
special privileges, the temptation of logging in as 'root' would be gone.
When you remove this temptation, what is a virus to do -- it certainly
can't modify anything important to the system.
better. Imagine if you will a more descriptive setuid profile, such that
an installed setuid binary can assume *some* privileged access, but
absolutely no more than it needs. Linux and its extensions (and I'm sure
other OSS operating systems) are gaining ground in terms of how well you
can controll access. We (as developers) need to be able to take advantage
of these extensions in a robust and portable way.
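Some of that ground is already visible. As a rough sketch (this needs a kernel and libcap new enough to support file capabilities, and the daemon name is made up), a binary can be handed exactly one privilege instead of full setuid root:

$ sudo setcap cap_net_bind_service=+ep /usr/local/bin/exampled
$ getcap /usr/local/bin/exampled
/usr/local/bin/exampled = cap_net_bind_service+ep

The daemon can now bind port 80 as an ordinary user, and it gains nothing else.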
the OSS model (and difficult to impossible in other models). When I want
a piece of software, I use "emerge". This makes it possible for the
application sources in question to be signed / approved by a trusted
party.
I have no more to worry about than how much trust I really have for my
trusted party.
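The same check can be done by hand for any tarball, assuming upstream publishes a detached signature and you already trust the signing key (file names here are purely illustrative):

$ gpg --verify foo-1.0.tar.gz.asc foo-1.0.tar.gz
gpg: Good signature from "Example Upstream <releases@example.org>"

A package manager can simply automate that step for everything it installs.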
Anti-virus to protect against anti-virus vulnerabilities
On the systems you're referring to, there is nothing, I repeat, absolutely nothing as important as the user's home, so the "it doesn't run as root" myth is utterly useless and dangerous for users. The illusion of safety is more dangerous than the real threat.
Perhaps I should have been more specific in what I was getting at. It is
of course possible to do localized damage. The important point from a
malware perspective, I think, is that a virus without special
administrator permissions cannot do very much to hide.
they don't know how to. (A more experienced user might, but then you have
to consider that these people are more sophisticated in choosing what
gets +x and what doesn't.)
Anti-virus to protect against anti-virus vulnerabilities
Execute its toolkit (set the +x bit in some cases or just get the higher program to do so).
Maybe hide itself (get root, etc.).
Useful targets
The only reason you see the comparative lack of malware is not the security model but the fact that less than 10% of useful targets run desktop Linux.
I can think of many other reasons; most have been discussed here at one time or another. First, the systems themselves tend to be better designed, so this can require attacks to be more specialized. Then there is diversity -- the Windows monoculture makes it all too easy to know your way around your target machine. Another one is the security culture -- Unix and Linux users are better informed, and they tend to know their environments better. Whether this is because they are more technically inclined to begin with or they learn more using Linux is an open issue. Badly designed software is much more common on Windows too; and privacy is less valued, so your private data is more likely to be lying around.
First, the systems themselves tend to be better designed
Useful targets
Reliable sources
Hm, what was it, three security-related stable kernel releases in two days? Not exactly the sign of better designed and implemented systems...
Hmmm, I would say this concerns the development process, not the design of the systems. A better design would translate into things like defense in depth: the difficulty of exploiting a kernel-related bug is higher, since you normally require a local account or even superuser privileges.
An OpenSSH bug affects nearly everyone.
You have a point, it's true that OpenSSH is widely used, but even it is not ubiquitous, as not everyone uses SSH to administer their systems; some people use webmin, some (as I do now) just use their machines locally. Often different branches (as in Apache httpd 1.3 and 2.0) will have different "bugsets".
Architecture diversity too
There's also diversity of architectures, for those who will take advantage of it. My office desktop is an Alpha, and my home server/firewall is an ARM Netwinder, both of which are totally immune to all x86 ELF binaries and buffer overflow attacks. It's really not expensive to do this, though running Debian helps. :-)
"But I'm afraid as soon as Linux reaches "world dominiation", it will become the primary target of malware authors and there will be Linux viruses and worms. "Useful targets
I think there are a lot more Windows systems out there than you imagine. ;-)
Useful targets
I'm not really worried about viruses and worms, because I think the open source model works for getting bugs found and fixed fast.
(which should be standard) to get in the way of allowing a vulnerability
to be a vector for an arbitrary code exploit.
huge corporations like Sony took advantage of totally stupid design
decisions (such as AUTORUN) to install crap into people's kernels without
permission.
huge challenge in their face when they do. And if they do manage to find
ways to squeeze on in, I'm a million percent confident the OSS community
will close those holes in seconds, because having every line of source
code for your system means that there are orders of magnitude more people
ready and willing to do so.
I think the open source model works for getting bugs found and fixed fast.
Useful targets
Yes. Sometimes I think that desktop distributions should start shipping with network-exposed apps (browser, e-mail, newsreader, chatter) started by default within sandboxes, or at least under a different account which can read but not write the user's files.
This may make for some inconvenience, but it would make it more difficult to cause serious damage (at the minimum, the virus writers would be forced to include some privilege escalation exploit in their "poison pill").
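A low-tech version of the "different account" idea can be scripted today (the account name is invented, it assumes an X server that understands server-interpreted access control, and whether the sandbox account can read your files at all depends on your home directory permissions):

$ sudo useradd -m websandbox
$ xhost +si:localuser:websandbox     # let that account use your X display
$ sudo -u websandbox -H firefox

It is clumsy enough that it would have to be packaged and polished by the distributions before ordinary users would touch it.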
Well, those sandboxes are what things like AppArmor from Novell and SELinux from Red Hat are designed to accomplish.
The point is, the malware can trash the user's home directory, but it can't trash the system. On Windows people are regularly reinstalling the whole system to recover from these issues.
Ya sure..
Well I don't know how many people I've heard blame their children for malware on their home desktop or laptop. In that situation, unless you are sharing an account with your child/spouse/etc., you do get a measure of protection.
Even if the threat of linux viruses attacking home directories en masse becomes real some day, you still have the major advantage that home directories are a lot easier to back up and restore than entire systems. That gives the minority of us who possess backup-conscientious attitudes a real fighting chance compared with the alternative scenario.
Viruses attacking home directories
But your home directory contains secret information, right?
You understate your case.
A personal computer with reasonable bandwidth has valuable resources that are easier for an attacker to exploit than secret information. Cleaning out a bank account is more complex and risky than installing a spam relay and selling it to the highest bidder, for instance. Defending a computer is a challenging problem that has yet to be effectively addressed by any technology. Still, enumerating viruses is a particularly bad (though lucrative) approach.
what is a virus to do -- it certainly can't modify anything important to the system.
Anti-virus to protect against anti-virus vulnerabilities
> Except deleting that .tex file in the user's home directory that contains
> her just-finished thesis, of which she hasn't made a backup yet.
setuid/gid has always done exactly what you want it to do. All you have to do is properly set up the privileges of the user/group that you are changing to. Create your own custom sets and there's nothing you can't do. root isn't the only user to setuid to.
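For example (the account and paths are invented for the sketch), a helper that only needs to write one spool directory can be made setuid to a dedicated system account rather than to root:

$ sudo useradd --system spooler
$ sudo chown spooler /usr/libexec/example-helper /var/spool/example
$ sudo chmod u+s /usr/libexec/example-helper

The helper then runs with the 'spooler' identity and nothing more; root never enters the picture.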
> If you don't give an application or script +x, it doesn't run.
"I don't remember a time when this has ever been true. You can always run the appropriate interpreter/runtime linker for the program to execute it, even without +x set."Anti-virus to protect against anti-virus vulnerabilities
---
http://www.ourmedia.org/node/187924
Bahamian Nonsense
Running a shell script without +x:
$ . /tmp/tt
Howdy
$ /bin/echo hi # use fullpath to avoid bash builtin
bash: /bin/echo: Permission denied
$ /lib/ld-2.3.6.so /bin/echo hi
hi
"I'm really surprised..." Sorry, zotz, that comment is not aimed at you. It's aimed at whatever is giving people the idea that ONLY files marked +x can be executed. A lot of people have this potentially dangerous misconception.Anti-virus to protect against anti-virus vulnerabilities
The security comes from the fact that unlike the dominant desktop
operating system (Microsoft Windows), merely *clicking* on something (or
typing its path directly) will not invoke it as an executable.
so requires explicit user action, ie, they must know what they are doing.
We are not talking about theoretical mechanisms to execute random code using generic shell commands; if you are already running code you can just chmod the script file and run it. But when you want to dupe users into running the malicious code, imagine they receive the following message: "to see dancing pigs just download the attached file on your desktop, start a console and type at the prompt '/lib/ld-2.3.6.so ~/Desktop/dancing.pigs'". Not practical.
Change permissions before it runs
To put a finer point on it, it doesn't get run *accidentally*.
If you went to the trouble, you could still write and compile your own
program that would read an ELF, set up the text, etc and run.
download it.
it. The only way to do so is to be very explicit (ie, hand it to the
linker, give it +x, or some other non-"whoops I just opened a viral
attachment" method).
Conspiracy theory for fun
I post this in honour of two good friends who can always come up with the wildest conspiracy theories, yet in such a manner that you have to stop for a moment and think why it is not quite so plausible ... I hope I do them justice :-)
I can assure you, there are probably places that can afford putting something on a gateway more than they can afford sending a support specialist around to apply updates...
I loved the sarcasm to which you are responding
Personally I loved the hint of sarcasm in the original posting.
deploy these little embedded systems gateways than to deploy patches
to their existing systems ...
systems management problems than the gateways will ever be able to address.
such a bad idea (assuming that the gateways in question actually perform
their core jobs ... VPN/routing ... satisfactorily as well). Belts *and*
suspenders. It's only effective with a bit of diversity (two pairs of
suspenders or two belts doesn't add any appreciable benefit ... you can
still "get caught with your pants down" as it were).
configured with some sort of patch/update management agent, whether that's
YUM and apt-get, or something like Opsware, or OpenCountry or whatever.
I would hope that even places which run virus scanners apply security updates to their systems. Virus scanners don't prevent all security problems, and because they are based on signatures you have windows of exposure between the time new malware is in the wild and when it is blocked -- windows which would be covered if the malware was taking advantage of a vulnerability which had been patched long ago, if the system had that patch applied.
not clamav only
The recent fun around a library used in a wide variety of Symantec products shed rather a lot of light on the nature of such "security" products...