
Don't Panic about "going dark"

By Jake Edge
February 3, 2016

The rising use of encrypted communication channels, coupled with more system-wide encryption on devices like phones, has been increasingly fingered by law enforcement and other government agencies as an impediment to doing their jobs. This is the so-called "going dark" problem that posits that the ability to thwart terrorism and other crimes is slowly being reduced because these organizations can't see inside the encrypted data—even if they have obtained the legal authority to do so. A recent report [PDF] from a panel of security experts frames the debate a bit differently, as its title ("Don't Panic") might indicate.

The report comes from "a diverse group of security and policy experts from academia, civil society, and the U.S. intelligence community" under the auspices of the Berkman Center for Internet & Society at Harvard University. The group was convened by Matt Olsen, Bruce Schneier, and Jonathan Zittrain. The latter two are reasonably well-known in our communities; Olsen is former director of the US National Counterterrorism Center (NCTC) and former general counsel to the US National Security Agency (NSA). The report and its conclusions were endorsed by the private sector participants (including Olsen), but the government officials "are precluded from signing on because of their employment"; they were simply thanked for participating in the discussions over the last year.

The "going dark" argument has been used to bolster the idea of encryption backdoors that can "only" be used by properly authorized law enforcement or national security agencies. The idea is as appealing as it is impossible, but debunking it is not really the thrust of the report. Instead, it looks at the reality of the computing landscape and concludes, rather ironically, that various factors will increase, not decrease, the ability to do surveillance.

The "findings" section (pages five and six of the PDF) is kind of eye-opening. For one thing, end-to-end encryption is "unlikely to be adopted ubiquitously by companies, because the majority of businesses that provide communications services rely on access to user data for revenue streams and product functionality", it states. In addition, the "Internet of Things" (IoT) will provide many more avenues for surveillance through sensors, audio, video, and the like.

Beyond that, metadata (e.g. email headers, phone and SMS call records, or location information) is typically not encrypted and is unlikely to be encrypted anytime soon. "This information provides an enormous amount of surveillance data that was unavailable before these systems became widespread." The report points to fragmentation in the software ecosystem as another reason not to panic about going dark. In fact, the report questions the metaphor itself:

Although we were not able to unanimously agree upon the scope of the problem or the policy solution that would strike the best balance, we take the warnings of the FBI and others at face value: conducting certain types of surveillance has, to some extent, become more difficult in light of technological changes. Nevertheless, we question whether the “going dark” metaphor accurately describes the state of affairs. Are we really headed to a future in which our ability to effectively surveil criminals and bad actors is impossible? We think not.

From a security and privacy standpoint, the findings are worrisome, as the report acknowledges. The hope that more and better-integrated encryption will lead to better privacy seems to be dashed by the realities of the market. Companies will still want to track users and their data so they can use (or abuse) that information—or sell it to others. New devices will roll out with insufficient security and privacy safeguards that will spill our secrets left and right. While the report may provide some solace for the agencies that are concerned about going dark, it can only be viewed with some sadness by privacy and security advocates.

The moderately lengthy report provides some background on the debate, including its roots in the "crypto wars" of the 1990s (and earlier). It also gives more detail on the bullet points in the findings section. It builds a fairly strong case that companies and fragmentation in the market will lead to less encryption or, at least, less encryption where the users hold the only keys.

The IoT will likely bring a whole new range of surveillance options. There are several examples given of existing devices (automobile assistance systems with in-car microphones, smart TVs, wireless cameras, even the "OK Google" feature in the Chrome browser) that have the potential to be used for surveillance. Court orders could be used to force those companies to arrange for law enforcement to use these products for surveillance. While the report doesn't directly address it, there is the risk that organizations or individuals could exploit security vulnerabilities in the devices to do the same—with no court order required.

There are also three "individual statements from signatories" attached as Appendix A (page 19), which offer some additional perspectives. Susan Landau focused on the "business case" for encryption:

At a time when nation-state espionage is heavily aimed at business communications and these communications are often found on personal devices, national security dictates that they be secured. And that means policy facilitating the ubiquitous use of uncompromised strong encryption is in our national security interest

Schneier pointed out that there are multiple uses for encryption, from protecting credit card numbers to helping dissidents avoid arrest to journalists communicating with sources, all of which are worth protecting—and protecting well.

Adding backdoors will only exacerbate the risks. As technologists, we can’t build an access system that only works for people of a certain citizenship, or with a particular morality, or only in the presence of a specified legal document. If the FBI can eavesdrop on your text messages or get at your computer’s hard drive, so can other governments. So can criminals. So can terrorists. This is not theoretical; again and again, backdoor accesses built for one purpose have been surreptitiously used for another.

[...] We’re not being asked to choose between security and privacy. We’re being asked to choose between less security and more security.

He also reprises another common theme in his writing:

Of course, criminals and terrorists have used, are using, and will use encryption to hide their planning from the authorities, just as they will use many aspects of society’s capabilities and infrastructure: cars, restaurants, telecommunications. In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless because the honest so outnumber the dishonest. Compare this with the tactic of secretly poisoning all the food at a restaurant. Yes, we might get lucky and poison a terrorist before he strikes, but we’ll harm all the innocent customers in the process. Weakening encryption for everyone is harmful in exactly the same way.

In his statement, Zittrain reiterates the avenues that are being opened up by new technology:

As data collection volume and methods proliferate, the number of human and technical weaknesses within the system will increase to the point that it will overwhelmingly likely be a net positive for the intelligence community. Consider all those IoT devices with their sensors and poorly updated firmware. We’re hardly going dark when — fittingly, given the metaphor — our light bulbs have motion detectors and an open port. The label is “going dark” only because the security state is losing something that it fleetingly had access to, not because it is all of a sudden lacking in vectors for useful information.

As can be seen, the report is a bit bleak, at least for privacy advocates, but it does paint a realistic picture of where we are today—and where we are likely to be in the near future. There are few, if any, who are arguing that there are no circumstances that should allow government access to private data. But there is a balance to be struck and, at least rhetorically, politicians generally seem to want to magically legislate around the realities of encryption, instead of recognizing the limits of their power. As this report shows, the sky is not falling: law enforcement can get most of what it needs without endangering the real, important uses of encryption. Hopefully the politicians are listening.


Index entries for this article
Security: Encryption



Don't Panic about "going dark"

Posted Feb 4, 2016 14:56 UTC (Thu) by flussence (guest, #85566) (3 responses)

> [...] increasingly fingered by law enforcement and other government agencies as an impediment to doing their jobs.

As if they did before.

At least encryption would give them plausible deniability for the utter uselessness of their international panopticon when SHTF.

Unknown acronym

Posted Feb 24, 2016 23:28 UTC (Wed) by Max.Hyre (subscriber, #1054) (2 responses)

SHTE?

The only instances I find in Wikipedia seem to be in a foreign language (i.e.: not English. :-) Ditto for DuckDuckGo.

Unknown acronym

Posted Feb 25, 2016 5:44 UTC (Thu) by mathstuf (subscriber, #69389) (1 response)

I think it's "stuff hits the fan". Or some variant ;) .

Unknown acronym

Posted Feb 25, 2016 8:13 UTC (Thu) by lgeorget (guest, #99972)

Indeed: https://www.allacronyms.com/_internet_slang/SHTF/Shit_Hit...

For non-native English speakers, I recommend having a terminal open at all times and the BSD program wtf installed (http://www.freshports.org/games/wtf/). It's a simple translator for acronyms. I don't know if it's a new trend, but nowadays on forums people are expected to be fluent not only in English but also in acronyms. Of course, you can have your own acronyms database.
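The idea is easy to sketch in a few lines of shell. This is not the real wtf(6) implementation, just a minimal stand-in, and the acronyms.db file here is a hypothetical local database with one "ACRONYM expansion" entry per line:

```shell
# Build a tiny example database (hypothetical file, not the system one).
printf 'SHTF shit hits the fan\nIANAL I am not a lawyer\n' > acronyms.db

# Look up an acronym: case-insensitive match on the first word of each
# line, then print everything after it.
wtf_lookup() {
    grep -i "^$1 " acronyms.db | cut -d' ' -f2-
}

wtf_lookup SHTF   # -> shit hits the fan
```

The real program works much the same way: it scans one or more plain-text databases for the acronym and prints the expansion, so pointing it at your own file is trivial.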

By the way, it'll also be useful if you have to skim through GCC documentation and source code.


Copyright © 2016, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds