The future for general-purpose computing
There can be no doubt that general-purpose computing has been a boon to the world. The ability to run different kinds of programs from various sources, whether bought from companies, written from scratch, or, well, built from source, is something that we take for granted on many, if not most, of the computing devices that we own. But that model is increasingly disappearing from many kinds of devices, including personal computers, as a recent kerfuffle in the Apple world helps to demonstrate.
In mid-November, macOS users suddenly started having difficulty launching applications on their systems. Launching an application could take minutes, and the timing was suspiciously aligned with the release of macOS 11 "Big Sur" that same day. It turned out that Apple's Online Certificate Status Protocol (OCSP) servers were overwhelmed or otherwise non-functional, which led to the problems.
OCSP is used as part of the process of verifying notarized applications on macOS; those applications are signed by the developer's key. Apple signs the developer's public key, which is contained in a certificate similar to those used by TLS, but the system needs to check to ensure that the key has not been revoked. This check is performed at installation time and then each time the application is run.
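The chain of trust just described can be sketched in a few lines. This is an illustrative model only, not Apple's implementation: real notarization uses X.509 certificates and CMS signatures, while here HMAC stands in for public-key signatures and all keys and names are invented.

```python
# Sketch of the notarization trust chain: Apple certifies the developer's
# key, the developer signs the application, and launch-time verification
# checks both links plus a revocation list (the OCSP question).
import hashlib
import hmac

APPLE_KEY = b"apple-root-secret"        # hypothetical Apple root of trust
DEV_KEY = b"developer-signing-secret"   # hypothetical developer key

def sign(key: bytes, data: bytes) -> bytes:
    """Stand-in for a public-key signature operation."""
    return hmac.new(key, data, hashlib.sha256).digest()

# Apple "certifies" the developer key; the developer signs the binary.
dev_cert = sign(APPLE_KEY, DEV_KEY)
app = b"\xcf\xfa\xed\xfe ...binary contents..."
app_sig = sign(DEV_KEY, app)

revoked = set()   # in the real system, learned from OCSP responses

def may_launch(app: bytes, app_sig: bytes,
               dev_key: bytes, dev_cert: bytes) -> bool:
    if not hmac.compare_digest(sign(APPLE_KEY, dev_key), dev_cert):
        return False   # developer key was never certified by Apple
    if not hmac.compare_digest(sign(dev_key, app), app_sig):
        return False   # binary has been tampered with
    # The revocation check is what requires phoning home:
    return hashlib.sha256(dev_cert).hexdigest() not in revoked

print(may_launch(app, app_sig, DEV_KEY, dev_cert))   # True
revoked.add(hashlib.sha256(dev_cert).hexdigest())
print(may_launch(app, app_sig, DEV_KEY, dev_cert))   # False
```

The last two lines show the lever this gives Apple: revoking one certificate instantly blocks every application signed with it, with no change to the binaries themselves.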
Normally, if the OCSP servers are not available, because they are down or the system is not connected to the internet, the connection will fail, which is treated as a "soft failure" so the certificate is considered valid. That way, the applications open immediately. During the outage, though, the servers were up but not responding correctly, so the applications would not launch until the connection timed out. That raised the visibility of the OCSP checking, which had already been going on in macOS for some time.
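That soft-fail policy can be sketched as a client that treats any connection failure as a "good" answer. This is an assumed model of the behavior described above, not Apple's code; the host, port, and wire format are placeholders.

```python
# Soft-fail OCSP lookup: an unreachable responder is treated as
# "certificate is fine", but a responder that accepts connections and
# then never answers stalls the caller until the timeout expires.
import socket

def ocsp_check(host: str, port: int, timeout: float = 2.0) -> str:
    """Ask a hypothetical OCSP responder about a certificate.

    Returns "good" or "revoked"; a real client would send a DER-encoded
    request and parse a signed response rather than reading one byte.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.settimeout(timeout)
            reply = s.recv(1)
            return "revoked" if reply == b"R" else "good"
    except OSError:            # refused, unreachable, or timed out
        return "good"          # soft failure: presume validity
```

Note that the timeout branch still returns "good" eventually; the caller is simply blocked until the timeout fires. That is exactly the behavior users saw during the outage: applications did launch, but only after a long stall.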
The failure led to a rather over-the-top blog post by Jeffrey Paul that pointed out some major privacy flaws with OCSP, especially in relation to the checking that macOS Gatekeeper does to ensure that applications have valid signatures before running them. Every time an internet-connected macOS system starts an application, an OCSP query with a whole treasure trove of private information is sent to Apple. Obviously, the servers know what date and time the request was made and the IP address from which it was made; the latter greatly narrows down the geographic location of the system in question. There is also a hash sent for the certificate being queried, which Paul inaccurately called the "application hash". All of that gives Apple a bunch of data that folks may not really want to provide to the company. Worse, the OCSP queries are made over unencrypted HTTP, so anyone able to see the traffic (e.g. ISPs, government spy agencies, WiFi hotspot providers) also gets a look at which applications the user is running, when they are running them, and where.
Paul's analysis was somewhat flawed (as pointed out by Jacopo Jannone and probably others) in that the hash being sent is for the developer certificate, not the application itself. In many cases, that may amount to the same thing because the developers only ship a single application, but the OCSP check is not directly sending a hash that uniquely identifies the application. The information it does send is still pretty useful to Apple, its content-delivery network (CDN) partner Akamai, and any "man in the middle" that can see the traffic. There are also the US and other governments to consider, which can (and do) regularly request records of this sort, without needing a warrant.
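Jannone's point follows from what an OCSP request actually contains. Per RFC 6960, a request identifies a certificate by its issuer and serial number; nothing in it hashes the application binary. The sketch below builds the equivalent of that CertID structure as a plain dictionary (real requests are DER-encoded ASN.1, and the field values here are invented):

```python
# CertID, the heart of an OCSP request (RFC 6960): it names a
# certificate, not an application.
import hashlib

def ocsp_cert_id(issuer_name_der: bytes, issuer_key_der: bytes,
                 serial: int) -> dict:
    return {
        "hashAlgorithm": "sha1",   # SHA-1 remains the common CertID choice
        "issuerNameHash": hashlib.sha1(issuer_name_der).hexdigest(),
        "issuerKeyHash": hashlib.sha1(issuer_key_der).hexdigest(),
        "serialNumber": serial,
    }

# Two different apps signed with the same Developer ID certificate
# generate identical launch-time queries, so the query alone cannot
# distinguish them -- but it does identify the developer.
q = ocsp_cert_id(b"(issuer DN, DER)", b"(issuer SPKI, DER)", 0x5A3C)
```

A single-application developer is, of course, the degenerate case where identifying the certificate identifies the application anyway.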
In some sense, the privacy implications are not all that different from those of web browsers, which also use OCSP to determine if the TLS certificates for HTTPS sites have been revoked. They use OCSP unencrypted as well, since there is something of a chicken-and-egg problem with regard to determining certificate validity over HTTPS; OCSP stapling helps get around that problem to some extent. There is also a big difference, though, in that all of the macOS application OCSP checks go to Apple servers, while the checks for web-site-certificate validity go to various certificate authorities (CAs) rather than a central location.
In response to the uproar from the OCSP server failures, Paul's post, Twitter threads like this one from Jeff Johnson, and more, Apple published a statement about its security checks. As part of that, the company committed to stop collecting IP addresses from the OCSP checks, to create a new encrypted protocol for doing the checks, and to provide a way for users to opt out of the checks.
In general, checking the integrity of programs before running them is a good thing, of course, but the devil is in the details. Some strongly believe that Apple is making the right decisions to secure its users and their computers, but there is a cost to that security. Apple effectively gets to decide which programs can be installed and run on macOS systems, at least for most users. If a developer crosses the company in some way, their certificate can be revoked. If the developer does not want to ask Apple's permission (in the form of a developer certificate), their applications cannot be installed and run at all.
There are, evidently, settings and mechanisms to get around these checks now, which is something of a double-edged sword. On one hand, it restores control of the system to the owner of the hardware, but on the other, it opens said owner up to a number of unpleasant risks. On the gripping hand, perhaps, is the concern that those mechanisms may be disappearing over time. Do triple-edged swords exist?
For example, a tool called Little Snitch has been used to bypass the OCSP checking, but it no longer works for that purpose in Big Sur. Apple has exempted some of its applications from being routed through the frameworks that third-party firewall programs (like Little Snitch) must use. That allows those programs to evade the firewalls and even bypass VPNs. That seems like a good way to wrest control from the owners of the hardware, in truth.
Potentially worse still is the lockdown that may be coming on the new Apple Arm-based hardware. Every other Arm-based Apple device (e.g. iPhone, iPad) is locked down such that only Apple-approved operating systems can be installed on it. The new Arm-based Macs will only run Big Sur or later, so, given the firewall and VPN exemptions described above, their owners effectively cannot fully control their network traffic. If Apple stays the course with lockdowns for Arm-based hardware, it is a little hard to see what, exactly, buyers of that hardware are "owning".
In a Twitter thread, Ruby on Rails creator David Heinemeier Hansson summed the problem up nicely:
We need to remain vigilant, and resist these power grabs masquerading purely as benevolent security measures. Yes, there are security benefits. No, we don’t trust Apple to dictate whether our computers should be allowed to run a piece of software. We already lost that on iOS.
Free software is a way out of that future, for sure, but it requires being able to install and run software of the owner's choice on their devices. That may also be less secure in some scenarios, but it is clearly more free (as in freedom). That makes sense for those who are technically savvy, but sadly may leave our less knowledgeable friends, neighbors, and family behind. Protecting those folks, without forcing them to buy into a walled garden of one sort or another, is a difficult nut to crack.
Index entries for this article
Security | Integrity management
Security | Privacy
Posted Dec 10, 2020 0:24 UTC (Thu) by dskoll (subscriber, #1630)
Apple has been playing shenanigans like this for years. I'm always very sad when I go to an open-source/free software meeting and see lots of people lugging around Macbooks.
I'm convinced general-purpose computing has a future... but not with Apple products.
Posted Dec 10, 2020 1:08 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) (2 responses)
For now.
Posted Dec 11, 2020 20:32 UTC (Fri) by cpitrat (subscriber, #116459) (1 response)
Posted Dec 14, 2020 21:12 UTC (Mon) by areilly (subscriber, #87829)
Posted Dec 10, 2020 10:41 UTC (Thu) by ale2018 (guest, #128727) (2 responses)
AppArmor doesn't entail privacy-breaking network exchanges, and can be disabled rather easily. Yet it testifies to a whim of super-control, apparently directed against malware, which reaches the point of breaking long-established contracts such as that of the execve man page. That's the same whim that arguably justifies Apple's choice, isn't it?
Posted Dec 10, 2020 11:15 UTC (Thu) by smurf (subscriber, #17840) (1 response)
In fact the opposite appears to be true: a signature with the magic pixie dust in it allows them to bypass net filters and VPNs. It's fairly easy to paint a scenario where that would be dangerous for the continued health of the user.
Posted Dec 15, 2020 23:00 UTC (Tue) by jafd (subscriber, #129642)
It's still backwards in a lot of ways. What if the network is airtight except for the tunnel made by a VPN? What if I set the routing table so that only select hosts (including Apple's infra) are available via the tunnel, and 0.0.0.0 is in the airtight network? Is it going to try bypassing the VPN, or will it use my routes? The notion of the system software poking around and trying to weasel its way out doesn't sit well with me. It is likely to be a very bug-prone and too-clever-by-half design. I predict that security researchers are going to be able to poke holes in it.
Posted Dec 10, 2020 11:42 UTC (Thu) by halla (subscriber, #14185)
Posted Dec 10, 2020 17:54 UTC (Thu) by smitty_one_each (subscriber, #28989) (7 responses)
Posted Dec 11, 2020 13:32 UTC (Fri) by jerojasro (guest, #98169) (6 responses)
The invisible hand of the
Posted Dec 11, 2020 14:20 UTC (Fri) by smitty_one_each (subscriber, #28989) (3 responses)
Posted Dec 11, 2020 17:17 UTC (Fri) by dskoll (subscriber, #1630) (2 responses)
So you don't think there's any role at all for government regulation? Throw out anti-trust laws? Let corporations do whatever they want to stifle competition with no legal barriers?
Posted Dec 11, 2020 17:21 UTC (Fri) by smitty_one_each (subscriber, #28989) (1 response)
Posted Dec 11, 2020 19:05 UTC (Fri) by jerojasro (guest, #98169)
And do we have enough regulation? Or are we in a situation where we are wishing that Big Tech were less prone to treating customers like serfs?
Posted Dec 11, 2020 21:40 UTC (Fri) by ecree (guest, #95790) (1 response)
No regulation that's ever likely to make it through the lobbyist-infested swamp of government will prevent users who want this kind of B&D hardware from buying it. The only way is to stop users from wanting it. (No, I don't know how to achieve that either.)
Posted Dec 16, 2020 9:49 UTC (Wed) by cortana (subscriber, #24596)
With the popularity of group chats in iMessage, Apple has embraced and extended, and peer pressure from users is taking care of the extinguishment phase.
Posted Dec 10, 2020 18:29 UTC (Thu) by mcatanzaro (subscriber, #93033) (5 responses)
Most browsers stopped doing this a decade ago when Adam Langley famously pointed out that it is pointless, see https://www.imperialviolet.org/2012/02/05/crlsets.html. I believe Firefox is the only major browser that still does this useless check.
Posted Dec 12, 2020 2:30 UTC (Sat) by IELLC_LWN (guest, #125891)
Posted Dec 12, 2020 7:39 UTC (Sat) by NYKevin (subscriber, #129325) (3 responses)
Posted Dec 12, 2020 14:02 UTC (Sat) by mcatanzaro (subscriber, #93033) (2 responses)
Posted Dec 12, 2020 19:37 UTC (Sat) by NYKevin (subscriber, #129325) (1 response)
Posted Dec 12, 2020 19:58 UTC (Sat) by mcatanzaro (subscriber, #93033)
Posted Dec 17, 2020 3:41 UTC (Thu) by brunowolff (guest, #71160)
Posted Dec 17, 2020 6:26 UTC (Thu) by lysse (guest, #3190)
In isolation, that's true. But when you have a stream of such checks emanating from a single IP address, and you can already uniquely identify some of the applications, in many cases you'll be able to take a pretty good guess at many of the rest of them once you know their developers. So I'm not persuaded that Paul's concerns are as flawed as they've been painted here.
This is not (yet) true. You can just create a self-signed certificate and use it to sign binaries.
If only there were some... invisible hand... that could somehow inject competition into the situation.
market government regulation?
singular is to plural.
Capitalism is buyer/marketplace/seller.
Sellers seek to conquer the marketplace and the buyer.
You need "enough" regulation to keep buyer/marketplace/seller in equilibrium.
Mozilla is in the process of deploying CRLite, which will eventually replace OCSP for most certificates: Introducing CRLite: All of the Web PKI’s revocations, compressed. Enable it today with security.pki.crlite_mode = 2.