
Justicz: Remote Code Execution in apt/apt-get


Posted Jan 24, 2019 2:16 UTC (Thu) by dfsmith (guest, #20302)
In reply to: Justicz: Remote Code Execution in apt/apt-get by flussence
Parent article: Justicz: Remote Code Execution in apt/apt-get

Because apt also supports file/CDROM/S3/tor/other methods of fetching.* The "overly complicated dialect" looks closer to a dotless SMTP than to HTTP to me (though it uses the HTTP status codes), and the bug itself was similar to an SQL injection attack or escaping exploit, where a linefeed failed to terminate the remotely generated response. The Trojan package was ingeniously downloaded as Release.gpg, with the redirect sent to that file.

Using wget/curl would still expose you to this kind of oversight, which was in the layer between transport and status-back-to-apt, and not related to "Expected-anything".**

* In fact, according to the article, the https fetcher is a wrapper around libcurl. See also apt-cache search apt-transport.

** Corrections gladly accepted: I read the article and I think I understand what was happening, but I am not familiar with apt on a code basis.
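The injection described above can be sketched with a toy model of a newline-delimited "Key: Value" worker protocol. Everything here (the `build_message` helpers, the field names, the URLs) is invented for illustration and is not apt's actual code:

```python
def build_message(status, fields):
    """Serialize a status message with no escaping (the flaw)."""
    lines = [status]
    for key, value in fields.items():
        lines.append(f"{key}: {value}")
    return "\n".join(lines) + "\n\n"   # blank line ends the message


def build_message_safe(status, fields):
    """Refuse values containing the protocol's framing characters."""
    for key, value in fields.items():
        if "\n" in value or "\r" in value:
            raise ValueError(f"control character in field {key!r}")
    return build_message(status, fields)


# A redirect target controlled by a hostile server, with an embedded
# newline that smuggles an extra field into the response:
evil_location = "http://mirror.example/pkg.deb\nFake-Hash: sha256:deadbeef"
msg = build_message("103 Redirect", {"New-URI": evil_location})
# msg now contains a "Fake-Hash" line the parent process never asked for.
```

With the unescaped builder, the injected line is indistinguishable from a legitimate field once the message is re-parsed; the checking variant turns the same hostile input into a hard error instead.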



Justicz: Remote Code Execution in apt/apt-get

Posted Jan 24, 2019 9:11 UTC (Thu) by juliank (guest, #45896)

https used to use curl, but it was switched quite some time ago to use the same code as http. Our requirements are simple, really:

* We need to do pipelining. pdiffs without pipelining are too slow
* Servers mess up pipelining, responding out of order. We need to be able to fix this up (by detecting the hash we downloaded and swapping download items)

apt sends the expected hashes to the method, even though it does not really have to. The method sends the calculated hashes back. For .deb files, the hash is used directly; other files pass through the store method and thus cannot be manipulated as in this exploit. We'll do the same for .deb files shortly, I just gotta finish fixing up the test suite regressions.
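The out-of-order fix-up described above can be sketched as follows; `assign_responses`, the file names, and the data are hypothetical, and real apt tracks far more state than this:

```python
import hashlib


def assign_responses(pending, bodies):
    """Match pipelined response bodies to pending download items by
    content hash rather than by arrival order.

    pending: {filename: expected sha256 hex digest}
    bodies:  raw response bodies in (possibly wrong) arrival order
    """
    by_hash = {hashlib.sha256(b).hexdigest(): b for b in bodies}
    assigned = {}
    for name, expected in pending.items():
        if expected in by_hash:
            assigned[name] = by_hash[expected]  # swap to the right item
    return assigned


packages, sources = b"Packages data", b"Sources data"
pending = {
    "Packages.xz": hashlib.sha256(packages).hexdigest(),
    "Sources.xz": hashlib.sha256(sources).hexdigest(),
}
# The server answered the two pipelined requests in the wrong order,
# but the hashes still route each body to the right item:
result = assign_responses(pending, [sources, packages])
```

The design point is that the expected hash doubles as a routing key, so a confused server degrades into a re-request rather than a mis-filed download.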

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 25, 2019 17:01 UTC (Fri) by flussence (guest, #85566)

Okay, I concede “make pipelining actually work” is a pretty convincing argument for the complexity… better that than the alternative (http2c, which has a null security track record).

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 25, 2019 0:51 UTC (Fri) by ThinkRob (guest, #64513)

Not only is the existence of such an apparently complicated mechanism defensible, but I don't think the intermediate protocol itself can really be blamed at all [1].

The root cause here is a failure to adhere to a tried-and-true security axiom: never directly intermingle user input with trusted data. In this case, the intermediate apt<->worker protocol would have been fine IF the worker process didn't take things from a (semi-|un-)trusted remote server and just drop them directly into its output with no escaping/validation/normalization/etc. This is exactly the same category of bug as SQL injection vulnerabilities, XSS vulnerabilities, directory traversal attacks, etc. Building a path that contains user input can be done safely, but it requires a fair amount of thought. (Mitigations for this sort of thing are platform/language specific... but are doable in all!)

---
[1] A binary protocol for the apt<->worker communication would have made things harder, but I'm not sure it would have made them provably impossible. What with various character sets, encodings, etc. it's not unheard of to tamper with a non-text backend using text inputs... IIS has the scars to prove that!
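One mitigation in the spirit of this comment (a sketch, not apt's actual fix; `encode_field` is a made-up helper) is to encode untrusted values before they are embedded in a newline-delimited field, so they can never break the framing:

```python
import urllib.parse


def encode_field(value):
    """Percent-encode a value destined for a line-based protocol field.

    Common URL characters stay readable; CR, LF and '%' itself are
    escaped, so the value cannot introduce a new protocol line."""
    return urllib.parse.quote(value, safe=":/?&=")


hostile = "http://mirror.example/x.deb\nInjected: field"
safe = encode_field(hostile)
# The newline is now the literal three characters "%0A"; the receiving
# side decodes it only after the message frame has been parsed.
```

This is the parameterized-query idea transplanted to a text protocol: the untrusted bytes travel as data, and the framing characters remain under the sender's exclusive control.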

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 25, 2019 20:03 UTC (Fri) by wx (guest, #103979)

Sorry, but that's nonsense. This is not a mere worker process with a few mistakes here and there.

The established design paradigm for package managers is to have an unprivileged sandbox process that interacts with external entities such as HTTP servers. If anything goes wrong in the sandbox it cannot affect the privileged package manager process. Originally, this was done to mitigate things like buffer overflows in the HTTP implementation but, done properly, the mitigation scope is much wider.

At a glance, APT appears to follow this paradigm with its transport method processes, but in reality it does not: irrespective of the protocol issues and no matter how you try to frame it, it is completely inexcusable for the transport method to do anything other than transport. Under no circumstances should it have been involved in verification (hash computation) at all. The very definition of a sandbox is that you do not trust what's done in there, ever.

In fact, the only plausible reason to design and implement things the way it was done in APT is to intentionally enable the type of exploit we're seeing here. Let's face it, this was a backdoor, and it was the second such crypto-bypassing backdoor to become public in APT over the last two years.

The Ubuntu bug on the subject already calls for an external audit of APT. Unfortunately, that approach is futile at this stage. Auditing for accidental bugs is hard, auditing for intentional vulnerabilities that arise from the subtle interaction of an improper design and disguised exploitable functionality is orders of magnitude harder.

The only appropriate response to this incident is to revoke commit access of whoever introduced the code leading to this, throw out the entire implementation, and start over from scratch with a clean design. If you do this transparently with an audit of the new design (before implementing anything!), apply modern techniques to harden the sandbox (seccomp), and have the implementation audited it might be a net win for the project.

PS: pdiffs are a red herring. They add complexity for no tangible benefit (most DDs and virtually all users that know about the possibility disable them because they are slower). Just get rid of them.

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 26, 2019 0:30 UTC (Sat) by pabs (subscriber, #43278)

pdiffs and debdelta have a lot of value for slow connections and low bandwidth quotas. Fast and unlimited quota Internet access is unevenly distributed and Debian needs to support those who do not have that luxury.

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 26, 2019 8:39 UTC (Sat) by juliank (guest, #45896)

> The established design paradigm for package managers is to have an unprivileged sandbox process that interacts with external entities

LOL. APT was written in the early 90s; I started sandboxing methods in 2014. I'm not sure what other package managers are doing, but I doubt most are doing that. I have known, and it was publicly advertised, that there is still one leak in the sandbox: the hash verification, and this one is incredibly hard to fix.

> PS: pdiffs are a red herring. They add complexity for no tangible benefit (most DDs and virtually all users that know about the possibility disable them because they are slower). Just get rid of them.

Pdiffs provide substantial performance improvements at least for 10 Mbit/s connections ever since we fixed the bottleneck. I do not think that they perform worse on 50 Mbit/s connections, but they might perform worse on 1 Gbit/s connections.
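For context on this trade-off: a pdiff is an ed-style script applied to the previous index file, so a few kilobytes of diff can stand in for re-downloading a multi-megabyte Packages file. Below is a toy applier for the `a`/`c`/`d` commands, assuming the script lists edits in the reverse order that `diff -e` emits (so earlier edits do not shift later line numbers); this is an illustration, not apt's implementation, and real pdiff handling also hash-verifies every intermediate step:

```python
import re


def apply_ed_script(old_lines, script):
    """Apply a minimal ed-style diff: commands like '2c', '4,5d', '1a',
    where 'a' and 'c' payload lines are terminated by a lone '.'."""
    lines = list(old_lines)
    cmds = script.splitlines()
    i = 0
    while i < len(cmds):
        m = re.match(r"(\d+)(?:,(\d+))?([acd])$", cmds[i])
        start, end, op = int(m.group(1)), int(m.group(2) or m.group(1)), m.group(3)
        i += 1
        payload = []
        if op in "ac":                  # collect lines up to the lone "."
            while cmds[i] != ".":
                payload.append(cmds[i])
                i += 1
            i += 1                      # skip the terminating "."
        if op == "c":                   # change a range of lines
            lines[start - 1:end] = payload
        elif op == "d":                 # delete a range of lines
            del lines[start - 1:end]
        else:                           # "a": append after a line
            lines[start:start] = payload
    return lines


old = ["Package: foo", "Version: 1.0", "Package: bar"]
new = apply_ed_script(old, "2c\nVersion: 1.1\n.")
# new == ["Package: foo", "Version: 1.1", "Package: bar"]
```

On a slow link the saving is dominated by the index size; the complexity juliank defends buys exactly this: shipping the one changed line instead of the whole file.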

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 26, 2019 8:44 UTC (Sat) by juliank (guest, #45896)

> The only appropriate response to this incident is to revoke commit access of whoever introduced the code leading to this

Good news! The people who originally wrote apt have not been involved in it for a decade or so.

> throw out the entire implementation, and start over from scratch with a clean design

Now, that does seem infeasible. I tried, but I did not get anywhere.

> If you do this transparently with an audit of the new design (before implementing anything!)

Now that's absurd. I mean, the design would basically be the same, it's just the implementation that would be a lot cleaner. And we'd start off with a zero-trust model instead of evolving to it.

> apply modern techniques to harden the sandbox (seccomp), and have the implementation audited it might be a net win for the project.

Seccomp is basically impossible to use. APT has seccomp sandboxing, but it's turned off by default, as it just crashes randomly due to different NSS modules performing syscalls you do not expect and funny stuff like that.

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 26, 2019 14:10 UTC (Sat) by ms-tg (subscriber, #89231)

> In fact, the only plausible reason to design and implement things the way it was done in APT is to intentionally enable the type of exploit we're seeing here. Let's face it, this was a backdoor and it was the second such crypto bypassing backdoor that's become public in APT over the last two years.

So, before CoC culture existed, there was a culture of assuming good intentions in others unless proven otherwise. Especially in cases such as this where, as others have pointed out, the age of the code in question makes malicious intentionality almost completely absurd.

But today, in the age of CoC — does the above unfortunately too-common drive-by assertion of malicious intent violate any putative community norm? If not, should it?

(These are genuine questions, not rhetorical)

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 26, 2019 20:00 UTC (Sat) by wx (guest, #103979)

> So, before CoC culture existed, there was an existing culture of assuming good intentions in others unless proven otherwise.

How many more cases of DDs breaking essential crypto do you need until you'll accept that evidence as proof?

> Especially in cases such as this where, as others have pointed out, the age of the code in question makes malicious intentionality almost completely absurd.

You are grossly underestimating the threat posed by this. Debian is used in many places, including highly sensitive government networks. If it takes ten years to establish a covert channel into those networks, there are still plenty of actors that will gladly do it. And mind you, the whole "little mistakes here and there that eventually add up" strategy is exactly how you want to operate in that sort of setting, because you can then play the "Oh, I assure you it was just an honest mistake!" game later.

It's a rather unfortunate situation since neither party can prove they are right. Maybe it was an honest mistake. I don't know. If I were running any systems on Debian (luckily I'm not) I'd simply have to assume this was a backdoor and migrate away from Debian.

This is bad for the project. Thus my suggestion to stop shrugging this off as "bugs happen" and take it as an opportunity to rethink the whole design from scratch. And this time after surveying what other package managers are doing and under the assumption that there are plenty of malicious adversaries knowing that Debian is no longer a niche OS.

> But today, in the age of CoC — does the above unfortunately too-common drive-by assertion of malicious intent violate any putative community norm? If not, should it?

It depends on what you want to achieve. If you want to establish safe spaces where no critical thinking can challenge your perfect world then you need to put in place a CoC and use it to get rid of critics.

If you want to create secure internet-facing critical infrastructure you always need to assume the worst and be thankful for anyone who dares to point out that the emperor is in fact very naked.

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 27, 2019 1:59 UTC (Sun) by ms-tg (subscriber, #89231)

Can you explain exactly why the assertions of seriousness and the plausible suggestion for a security redesign necessitate, or even justify, the implication of malicious intent without evidence?

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 27, 2019 2:36 UTC (Sun) by mgb (guest, #3226)

It would be negligent - and for some criminally negligent - not to seriously consider the possibility of malicious intent.

Justicz: Remote Code Execution in apt/apt-get

Posted Jan 27, 2019 6:29 UTC (Sun) by h2 (guest, #27965)

So wx, which package manager/distro do you find safe enough for your worries? Obviously nothing that uses rpm, given the far too close ties of the for-profit Red Hat corporation to state actors. I assume not pacman, since your standards would seem to preclude that a priori. So which package manager exactly are we talking about here? Slackware's? Portage? I'm curious.


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds