
Trouble at Linux Mint — and beyond

By Jonathan Corbet
February 24, 2016
When the Linux Mint project announced that, for a while on February 20, its web site had been changed to point to a backdoored version of its distribution, the open-source community took notice. Everything we have done is based on the ability to obtain and install software from the net; this incident was a reminder that this act is not necessarily as safe as we would like to think. We would be well advised to think for a bit on the implications of this attack and how we might prevent similar attacks in the future.

It would appear that the attackers were able to take advantage of a WordPress vulnerability on the Linux Mint site to obtain a shell; from there, they were able to change the web site's download page to point to their special version of the Linux Mint 17.3 Cinnamon edition. It also appears that the Linux Mint site was put back on the net without being fully secured; the attackers managed to compromise the site again on the 21st, restoring the link to the corrupted download. Anybody who downloaded this distribution anywhere near those two days will want to have a close look at what they got.

The Linux Mint developers have taken a certain amount of grief for this episode, and for their approach to security in general. They do not bother with security advisories, so their users have no way to know if they are affected by any specific vulnerability or whether Linux Mint has made a fixed package available. Putting the web site back online without having fully secured it mirrors a less-than-thorough approach to security in general. These are charges that anybody considering using Linux Mint should think hard about. Putting somebody's software onto your system places the source in a position of great trust; one has to hope that they are able to live up to that trust.

It could be argued that we are approaching the end of the era of amateur distributions. Taking an existing distribution, replacing the artwork, adding some special new packages, and creating a web site is a fair amount of work. Making a truly cohesive product out of that distribution and keeping the whole thing secure is quite a bit more work. It's not that hard to believe that only the largest and best-funded projects will be able to sustain that effort over time, especially when faced with an increasingly hostile and criminal net.

There is just one little problem with that view: it's not entirely clear that the larger, better-funded distributions are truly doing a better job with security. It probably is true that they are better able to defend their infrastructure against attacks, have hardware security modules to sign their packages, etc. But a distribution is a large collection of software, and few distributors can be said to be doing a good job of keeping all of that software secure.

So, for example, we have recently seen this article on insecure WordPress packages in Debian, and this posting on WebKit security in almost all distributions. Both are heavily used pieces of software that are directly exposed to attackers on the net; one would hope that distributors would be focused on keeping them secure. But that has not happened; the projects and companies involved simply have not found the resources to stay on top of the security of these packages.

It is not hard to see how this could be a widespread problem. When users evaluate distributions, the range of available packages tends to be an important criterion. Distributors thus have an incentive to include as many packages as they can — far more than they can support at a high level. The one exception might be the enterprise distributions which, one would hope, would be more conservative in the packages they choose to provide for their customers. But such distributions tend to ship old software (which can have problems of its own) and are often accompanied by add-on repositories filling in the gaps — and possibly introducing security problems of their own.

The situation is seemingly getting murkier rather than better. Some projects try to get users to install their software directly rather than use the distribution's packages. That might lead to better support for that one package, but it adds another moving part to the mix and shorts out all of the mechanisms put in place to get security updates to users. Language-specific packages are often more easily installed from a repository like CPAN or PyPI, but these organizations, too, do not issue security advisories and almost certainly do not have the resources in place to ensure that they are not distributing packages with known vulnerabilities. Many complex applications support some form of plugins and host repositories for them; the attention to security there is mixed at best. Projects like Docker host repositories of images for download. Public hosting sites deliver a lot of software, but are in no position to guarantee the security of that software. And so on.

Combine all this with a net full of bad actors who are intent on installing malware onto users' systems, and the stage is set for a lot of unhappiness. Indeed, it is surprising that there have not been more incidents than we have seen so far. There can be no doubt that we will see more of them in the future.

As a community, we take a certain amount of pride in the security of our software. But, regardless of whether that pride is truly justified, we are all too quick to grab software off some server on the net and run it on our systems — and to encourage others to do the same. As a community, we are going to have to learn to do a better job of keeping our infrastructure secure, of not serving insecure software to others, and of critically judging the security of providers before accepting software from them. It is not fun to feel the need to distrust software given to the community by our peers, but the alternatives may prove to be even less fun than that.

Index entries for this article
Security/Distribution security



Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 7:02 UTC (Thu) by isido (subscriber, #6976) [Link] (11 responses)

A very good article! I don't know what the way forward would be to fix the current semi-broken (security-wise) distribution model - this would seem to be a very hard problem.

One solution might be to move a bit toward a BSD-like model, with the OS core maintained by the distribution and upstream relied on, more or less, to keep the software secure. I don't know if this would actually be a good solution - it would move the responsibility from the distribution maintainers to users and upstream.

Also, I'm not sure if rolling distributions à la Arch are stable enough for all purposes.

One thing I'm fairly certain of, though - in a couple of years, dockerized applications with outdated dependencies will provide a nice new attack vector into our systems.

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 7:21 UTC (Thu) by mjthayer (guest, #39183) [Link]

> One solution might be to move a bit toward a BSD-like model, with the OS core maintained by the distribution and upstream relied on, more or less, to keep the software secure. I don't know if this would actually be a good solution - it would move the responsibility from the distribution maintainers to users and upstream.

I am obviously biased here, as I work on a project which "[tries] to get users to install their software directly", but linking to upstream's download infrastructure, and spending the effort that used to go into maintaining packages on evaluating upstream's security instead, might be a solution.

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 10:31 UTC (Thu) by aleXXX (subscriber, #2742) [Link]

> One solution might be to move a bit toward a BSD-like model, with the OS core maintained by the distribution and upstream relied on, more or less, to keep the software secure. I don't know if this would actually be a good solution - it would move the responsibility from the distribution maintainers to users and upstream.

There are only a handful of BSDs; maybe it would indeed be better if the number of Linux distributions shrank instead of growing, so the work would be, at least potentially, a bit more concentrated and not spread too thin.

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 12:56 UTC (Thu) by pabs (subscriber, #43278) [Link] (8 responses)

Do upstreams in general care more about security than the distros?

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 14:00 UTC (Thu) by jospoortvliet (guest, #33164) [Link] (2 responses)

In general? No idea. Depends on the upstream, I suppose. I'd expect projects building web apps to care a fair bit, especially if there's a business behind it - customer pressure and all that. I know we at ownCloud do a decent job... No idea about others, to be honest.

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 16:25 UTC (Thu) by misc (subscriber, #73730) [Link]

Caring about security can mean different things. Is "publishing an announcement" caring, or is "backporting the fix to a stable release" caring? One of the main attractions of using a distribution is the coordination, and if the model for security is "upgrade to the latest version", it might not work that well in practice (mostly because some upstreams are also platforms for plugins and/or have unexpressed dependencies on the protocols used with other components).

There is a reason why the Gentoo-style upgrade model is not that widely used in the industry, even if CI/CD and new tools that permit reverting (such as docker/rkt, ostree, snappy, or just immutable servers) might change that.

Businesses depending on a web application might also have an incentive not to offer a free version for too long, since some sell an enterprise version supported for a longer time (even if that's at least better than the alternative of the open-core model that newer companies try, such as Docker with Docker Datacenter, or BeeGFS with its strange open-source license).

I have been told that the LF has created a checklist regarding security, which might be a starting point for evaluating an upstream for suitability. Having dealt with various upstreams when reporting CVEs, I must say that, while my sample is not big enough for any definite conclusion, I was not impressed by how most of them handle security.

Few people know how to get a CVE, few seem to care about getting one before publishing the patch (or even after), and one vendor even took two months to say "there is no security problem" - a stance that was reversed once I posted publicly to get feedback from others (and that's why "responsible disclosure" folks can't have nice things).

At the same time, dealing with distribution security teams was a much nicer experience, mostly because they deal with more security issues and so likely have more experience handling them. Maybe some kind of federation of upstream project security could work: a single point of contact for various upstreams, able to decide whether a security bug is a security bug or not, able to contact the right person using the right medium, following a published process, etc.

And upstreams would have to follow some charter to be part of the group.

That's maybe something that could convince me (as a packager, ex-distributor, and current sysadmin) that upstream would provide the support I would find acceptable.

Of course, a better understanding of the constraints of their downstreams (and of the downstreams' users, because distros do not have rules just for the fun of having them) could help a lot. But since there is always a new set of upstreams, each starting out thinking "we know better", that's a never-ending task.

Trouble at Linux Mint — and beyond

Posted Feb 27, 2016 5:20 UTC (Sat) by pabs (subscriber, #43278) [Link]

How do you track security issues (CVEs and otherwise) in the ownCloud VMs that are published?

How do users of those VMs get (for example) glibc security fixes?

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 16:23 UTC (Thu) by khim (subscriber, #9252) [Link] (4 responses)

Of course not. But then, the distribution model is broken beyond repair anyway. There is constant bickering about bundled libraries, but the whole discussion is so bizarre it's not even funny.

Sure, if some popular library is updated and a project which uses it does not issue a new version, then you are vulnerable. But are you protected if said project is changed to use the system-provided library, which does get updated? No, of course not: if a project does not care about security enough to update the library it depends on, then what chance is there that it would fix bugs in its own code? Or do you really believe that overworked distribution packagers could actually find time to do a real review of the code of the software they are packaging?

Distributions were born to solve one simple problem: slow internet gave no way to easily download large packages and small HDDs needed shared libraries to save space. That's it. We should stop pretending that they are solving any other problems - and the problems of slow Internet access and small HDDs are largely behind us, too.

Distributions in their current form are part of problem, not a solution.

Trouble at Linux Mint — and beyond

Posted Feb 27, 2016 5:22 UTC (Sat) by pabs (subscriber, #43278) [Link] (1 responses)

Are you manually downloading, compiling and installing upstream projects like Linux, glibc, PHP and ownCloud instead of using a distribution?

Trouble at Linux Mint — and beyond

Posted Mar 1, 2016 11:04 UTC (Tue) by nix (subscriber, #2304) [Link]

I do! But unlike khim, I recognize that I'm a bit of a lunatic, and that what everyone else, particularly people looking for security, should do is use a distro to take that trouble off their hands and avoid wasting time. (My network-exposed and sensitive boxes run Debian stable, though with glibc replaced to get my stack-protector patches. I'm *so* looking forward to not having to do that last bit.)

I don't do this because of security -- that's a ridiculous stance. I do this because I'm a control freak and hate changes in my working environment and don't want anything to change unless I changed it, and because I want to know the interrelationships and build trickeries of every package I run. I'd now say "because it's been useful in my job so often", which is true, only that doesn't explain why I started doing it fifteen years before I worked for a Linux distributor. The correct explanation is "because it's a habit and a really silly hobby", kind of like stamp collecting with packages. It is not a security benefit.

Trouble at Linux Mint — and beyond

Posted Feb 27, 2016 14:52 UTC (Sat) by lsl (subscriber, #86508) [Link]

> Distributions were born to solve one simple problem: slow internet gave no way to easily download large packages and small HDDs needed shared libraries to save space. That's it.

Nope. What pabs said, really. Distributions do a lot of work so you don't have to.

There are maybe a dozen packages where my needs are special enough that I'm not getting them from my distro, and I track upstream directly instead.

Some of them are (intended to be) temporary and I'm working towards being able to just use the distro packages. Why? Because tracking those projects takes time. I can only afford that for a small set of packages I care deeply about. Doing it for all the software on even my personal machines is simply not realistic - that'd be a full-time job.

This is what distros provide to you.

Trouble at Linux Mint — and beyond

Posted Feb 28, 2016 23:59 UTC (Sun) by thoeme (subscriber, #2871) [Link]

> Distributions were born to solve one simple problem: slow internet gave no way to easily download large packages and small HDDs needed shared libraries to save space.

That's way too short-sighted: my distribution gives me a fully consistent Linux experience on my workstation, and I very seldom leave this comfort zone, for the simple reason that *they* know better than I do what to fix and what to leave alone. If there are problems, I report them and wait for the fix from my distro, trying to work around the problem temporarily (like pulling the problematic hardware) instead of spending hours trying to duct-tape it myself... which seldom works anyway.

So in that light, the distribution *is* my upstream and I care less about the project itself.

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 13:24 UTC (Thu) by gnu (guest, #65) [Link] (8 responses)

I think users should be educated to verify whatever they download. Package managers make the task of verifying packages easy, but users should also verify the downloaded ISOs. Debian shows the way by signing its ISOs with a GPG key and has a page on how to verify them:

https://www.debian.org/CD/verify

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 13:33 UTC (Thu) by gnu (guest, #65) [Link] (7 responses)

.. and it appears that they had the GPG signatures published as well. Doh! So it is a question of the "usability" of the verification, then!

https://mirrors.kernel.org/linuxmint/stable/17.3/

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 13:55 UTC (Thu) by gnu (guest, #65) [Link] (4 responses)

.. and for reference, the Tor project has a great page explaining how to verify signatures.

https://www.torproject.org/docs/verifying-signatures.html.en

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 22:48 UTC (Thu) by giraffedata (guest, #1954) [Link] (3 responses)

And those instructions are far too complex to expect more than a few people to follow them. The problem of hijacked web pages would have to get a lot worse than it is before users would find that worthwhile.

Could the web browser take care of all this? It already has a web of trust for use in authenticating websites; how hard would it be for the web browser to validate a signed checksum and check the checksum when it downloads a file?

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 22:55 UTC (Thu) by pizza (subscriber, #46) [Link] (1 responses)

> Could the web browser take care of all this? It already has a web of trust for use in authenticating websites; how hard would it be for the web browser to validate a signed checksum and check the checksum when it downloads a file?

....This ends up devolving into a PKI problem..

(The signature could also be compromised, as could the public key used to check the signature, as could the fingerprint of the public key, and so forth...)

Trouble at Linux Mint — and beyond

Posted Feb 26, 2016 0:05 UTC (Fri) by giraffedata (guest, #1954) [Link]

Is it a bigger PKI problem than with web browsers authenticating websites or with users manually verifying signatures per the referenced instructions?

Trouble at Linux Mint — and beyond

Posted Feb 27, 2016 0:01 UTC (Sat) by mcatanzaro (subscriber, #93033) [Link]

Kind of; the tech is mostly in place, it's just not widely used yet. If you use HSTS on your server (there's no excuse for not doing so, but few sites do), and your users use a browser that checks certificate-transparency logs (Chrome only right now; Firefox was working on it, last I heard), and you have some automated notification when rogue certificates for your domain appear in the log (I haven't heard anything about this; it's probably the missing link), then there's no way this could happen (unless the user has never visited your site before, or not for so long that the HSTS policy has expired).

I expect most of the above will be widely deployed in the next five years or so, and we'll all be safer for it. In the meantime....
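To make the expiry caveat concrete, here is a hypothetical sketch of the policy a client has to extract from a `Strict-Transport-Security` response header (directive names per RFC 6797; the example values are made up):

```python
# Minimal sketch of parsing an HSTS header into a policy, assuming the
# directive syntax from RFC 6797. A real client would also persist the
# policy and refuse plain-HTTP connections while max-age has not expired.
def parse_hsts(header_value):
    """Parse a Strict-Transport-Security header value into a policy dict."""
    policy = {"max_age": None, "include_subdomains": False}
    for directive in header_value.split(";"):
        directive = directive.strip().lower()
        if directive.startswith("max-age="):
            # max-age is given in seconds, optionally quoted.
            policy["max_age"] = int(directive.split("=", 1)[1].strip('"'))
        elif directive == "includesubdomains":
            policy["include_subdomains"] = True
    return policy


# A one-year policy covering subdomains:
policy = parse_hsts("max-age=31536000; includeSubDomains")
```

The policy protects only returning visitors, and only until `max-age` seconds have elapsed since the last visit; a first-time visitor (or a preload list) is needed to close that gap.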

Trouble at Linux Mint — and beyond

Posted Feb 25, 2016 13:58 UTC (Thu) by jospoortvliet (guest, #33164) [Link] (1 responses)

Sadly, if the shipped software is full of known and well-described security holes, verifying the packages isn't going to help you much... This is where the blog post linked in the article comes in.

The problem is not on the users' side but, in my opinion, in the collaboration between distributions and upstreams. There are too many distributions, and collaborating with most of them is too hard (tools, policies, etc.) to be feasible for upstream. The likes of WordPress don't even try - they tell their users to use zip files. We've decided to do the same, seeing how important fixes didn't make it to users, though we still build packages as well and are still looking for a better solution.

Thing is, I think it would be good to have distributions ship end-user software, rather than forcing users to get it directly from upstream. The model is, in principle, better, in my opinion. But then the software needs to be up to date and secure - and it frequently isn't, at least in so-called 'stable' distributions, as of today. The model is probably OK for non-web-facing software written in C or C++, but with web apps written in PHP, Perl, Go, and friends, all with their often crazy needs, it seems hard to get it right.

Trouble at Linux Mint — and beyond

Posted Feb 27, 2016 0:04 UTC (Sat) by mcatanzaro (subscriber, #93033) [Link]

Trouble at Linux Mint — and beyond

Posted Mar 7, 2016 15:40 UTC (Mon) by dvainsencher (guest, #4143) [Link]

Let's admit it: current software is made of bales of hay. We don't deserve to speak of security holes before we upgrade at least to fired mud, because *it is all holes*. From languages riddled with behavior defined to be undefined, and compilers that consider that an excuse to propagate any undefinedness they find, through applications that often have no security strategy at all and many-mega-LOC kernels and OSes full of hidden state, to distributions whose CMS vulnerabilities become distribution vulnerabilities...

We collectively have some ideas, but are light-years away from actually using them systematically. We suffer from a lack of norms: not enough people are saying "it's cool that your app does that, but if it doesn't stand up to a bug in non-critical parts, we are not helping anyone install it anywhere".

Who is going to be the first to make it as easy to deploy a website made of 50k total LOC (kernel to content, say, based on a microkernel like seL4) as it is to deploy the current typical behemoth? Because we have a chance of eliminating security bugs from a 50k-LOC website whose kernel has been verified. If you think we have a chance of eliminating security bugs from anything resembling the current status quo, I'd like to hear your argument.


Copyright © 2016, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds