GUADEC: Danny O'Brien on privacy, encryption, and the desktop
Journalist and digital rights activist Danny O'Brien came to GUADEC to try to educate GNOME hackers about
the threats facing journalists, their computers, and their online
communication from governments and organized crime. But free software can
help, so he wanted to outline the features that he thinks could be added to
desktops to help secure them and protect the privacy of all users, not just
journalists. Part of his job as internet advocacy coordinator for the Committee to Protect Journalists (CPJ) is to talk
to internet developers and "persuade them to think about how
journalists in repressive regimes are affected
" by the choices those
developers make.
![Danny O'Brien](https://static.lwn.net/images/2010/guadec-dobrien-sm.jpg)
O'Brien has written for multiple publications including Wired UK and
the Need To Know email newsletter that he founded, which ceased
publication in 2007. He has also worked for the Electronic Frontier
Foundation (EFF) as activist coordinator and most recently its
international outreach coordinator. He is now with CPJ, which is
an organization that seeks to protect journalists from various threats,
both physical and in the online world. "They know the levers of
power to get people out of trouble, or to stop them from getting into
it
", he said.
He started out by explaining that journalists do not understand recursion, as he found out when he tried to unpack GUADEC (GNOME Users' and Developers' European Conference) for his boss. The use of an acronym as the first word of an unpacked acronym was problematic enough, but when he tried to explain that GNOME is (or was) the GNU Network Object Model Environment, he sensed he was getting in a bit too deep. Then he had to try to explain "GNU's Not Unix".
The problems that many in the online and free software worlds have been
concerned about for years are finally becoming mainstream, he
said. "Powerful forces are trying to stop the spread of information
online
", and that message is finally starting to get out. He put up
the recent xkcd comic ("It's the
world's tiniest open-source violin
") as an example of one place
where those concerns are starting to get some mainstream attention.
He pointed to a number of different attacks against the computers of journalists, generally from governments, but sometimes also from organized crime syndicates. It's not just repressive regimes that target journalists, he said, noting reports on the CPJ website regarding Japanese journalists who have been subjected to governmental pressure and mistreatment.
One of the more insidious attacks against journalists' computers was an email sent to foreign journalists based in Shanghai and Beijing from a fictional editor for The Straits Times. The email was a credible request for assistance in contacting people on a list contained in a PDF attachment—a PDF with a zero-day exploit that installed spyware on the computer. It was not just the foreign correspondents who were targeted, however, as the email was also sent to the native Chinese assistants of the correspondents, which is a list that would be difficult to generate—unless a large intelligence agency was involved.
Another common tactic used by governments to intimidate and spy on journalists is to raid the offices of a television/radio station or publication because the organization supposedly owes back taxes. All of the computer equipment is then seized for evidence. A variation of that scheme was recently used in Kyrgyzstan where a television station was raided due to alleged software "piracy" and all of the computers were confiscated. Whether tax or copyright violation charges are ever filed is irrelevant because the government is really after the information stored on the computers.
Free software hackers have more of an interest in these kinds of problems
"than just not [being] the ones affected
". There are things
that free software already does fairly well because those hackers
"have an interest in creating secure systems
", but there's more that could
be done. It makes sense for it to be the free software community that
fixes these problems, because it is "not beholden to big
interests
", O'Brien said.
So what is the "low hanging fruit
"? Encryption is one area
that is relatively well covered, at least for the web, with TLS. It
provides security for publishers, readers, and commenters alike that is
protected from even "state-sized interceptors
". It makes
simple censorship more difficult. The well-known Great Firewall of China looks for keywords, while the lesser-known Great English
Firewall matches URLs to a list of child pornography sites; each of those
censorship methods is blocked by encrypting web traffic.
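A toy sketch can make the point concrete (this is an illustration, not anything O'Brien demonstrated; the XOR "cipher" is a deliberately simple stand-in for real TLS record encryption, and the keyword is hypothetical). A keyword filter works only when it can read the bytes on the wire:

```python
import hashlib
from itertools import count

BLOCKED_KEYWORD = b"censored-topic"  # hypothetical entry on a filter list

def censor_sees(payload: bytes) -> bool:
    # A keyword filter in the Great-Firewall style: scan the wire bytes.
    return BLOCKED_KEYWORD in payload

def keystream(key: bytes, n: int) -> bytes:
    # Deterministic toy keystream standing in for a real TLS cipher.
    out = b""
    for i in count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR stream cipher; encrypting twice with the same key decrypts.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

request = b"GET /search?q=censored-topic HTTP/1.1\r\nHost: example.org\r\n\r\n"
ciphertext = toy_encrypt(b"session-key", request)

print(censor_sees(request))     # True: plaintext HTTP is trivially filterable
print(censor_sees(ciphertext))  # the keyword is no longer visible on the wire
```

Once the payload is ciphertext, the filter can no longer match keywords or URLs; it must either block the connection entirely or let it through, which is exactly why blanket encryption raises the cost of simple censorship.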
But there are all sorts of internet protocols that are plaintext. "Since
we don't use telnet any more, why should our code?
" He was
disappointed that the Telepathy communication framework doesn't ship with
Off-the-Record (OTR)
encryption support because it makes his job harder when recommending tools
to journalists.
He mentioned some Russian journalists that he had talked to who don't talk
on the telephone because they believe it to be bugged. They also only use
Gmail over HTTPS, "which is fine if you trust Google
", but
they switched to using Yahoo Messenger "because they heard good
things about it
"—unfortunately Messenger isn't encrypted.
O'Brien said that the reason they didn't know that it "is less secure
is because their desktop isn't telling them
".
SSL certificates are another area of concern. Certificates can be forged
by governments or other entities and then used in targeted attacks to
intercept encrypted communications. The journalists that O'Brien deals
with are the "canaries in a coal mine
" for these kinds of
problems. It is a "challenge for user experience
" to
alert the user to things like changed certificates, but there are also
technical barriers as the libraries often don't return that kind of status
to the applications.
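One approach a desktop "advocate" could take is trust-on-first-use pinning: remember each host's certificate fingerprint and flag any change. The sketch below is a minimal illustration of that idea (the host name and the fake DER bytes are placeholders, not a real API):

```python
import hashlib

def fingerprint(der_cert: bytes) -> str:
    # SHA-256 over the DER-encoded certificate, as shown in browser dialogs.
    return hashlib.sha256(der_cert).hexdigest()

def check_pin(host: str, der_cert: bytes, pins: dict) -> str:
    """Trust-on-first-use: remember the first certificate seen for a host
    and complain loudly if a later connection presents a different one."""
    fp = fingerprint(der_cert)
    known = pins.get(host)
    if known is None:
        pins[host] = fp
        return "first-use"       # nothing to compare against yet
    return "ok" if known == fp else "CHANGED"  # a change may mean interception

pins = {}
cert_v1 = b"fake-der-bytes-v1"   # stand-ins for real DER certificates
cert_v2 = b"fake-der-bytes-v2"
print(check_pin("mail.example", cert_v1, pins))  # first-use
print(check_pin("mail.example", cert_v1, pins))  # ok
print(check_pin("mail.example", cert_v2, pins))  # CHANGED
```

In Python, the DER bytes for a live connection are available from `ssl.SSLSocket.getpeercert(binary_form=True)`; the hard part, as the article notes, is that many higher-level libraries never expose that information to the application, let alone to the desktop.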
He would like to see desktops have some sort of "advocate
" for
user security that would check and report on privacy and security issues
with the software being used. User privacy and security are
"pervasive concerns that should live on the desktop
", O'Brien
said. The desktop is becoming more intertwined with the web so it would be
very beneficial to have some kind of
active monitoring that is "sitting there checking that the systems
are secure
".
When someone wants to communicate with multiple friends, why does the data
have to be sent to a central server, he asked. He would like to see the
desktop become a "first-class player on the internet
" by
communicating in a decentralized, peer-to-peer fashion.
The organizations
that know they don't want people to have privacy recognize that the desktop
is the gatekeeper. A person's desktop is their "heart of
trust
", he said. "We have a responsibility to take the freedom
that we take for granted and give it to people whose only privacy is their
desktop
".
O'Brien came to GUADEC because he believes that the project can help solve
the problems in the privacy and security areas. GNOME has the "user
experience chops
" to make
these kinds of changes, while continuing to produce a usable desktop.
While he is particularly focused on journalists, the changes he advocates
would be useful to many, but making them usable too will be a big challenge.
| Index entries for this article | |
|---|---|
| Security | Encryption |
| Security | Privacy |
| Security | Secure Sockets Layer (SSL)/Certificates |
| Conference | GUADEC/2010 |
Posted Aug 4, 2010 15:57 UTC (Wed)
by JoeBuck (subscriber, #2330)
[Link] (6 responses)
Posted Aug 4, 2010 16:16 UTC (Wed)
by zlynx (guest, #2285)
[Link] (3 responses)
Those Canadian agencies are so secret that I've never even heard of them. They must be good, eh?
Posted Aug 4, 2010 16:30 UTC (Wed)
by ofeeley (guest, #36105)
[Link] (1 responses)
Posted Aug 6, 2010 16:45 UTC (Fri)
by ofeeley (guest, #36105)
[Link]
Posted Aug 5, 2010 5:43 UTC (Thu)
by JoeBuck (subscriber, #2330)
[Link]
Posted Aug 4, 2010 20:27 UTC (Wed)
by klbrun (subscriber, #45083)
[Link] (1 responses)
Posted Aug 7, 2010 13:18 UTC (Sat)
by Trou.fr (subscriber, #26289)
[Link]
Posted Aug 4, 2010 16:15 UTC (Wed)
by gmaxwell (guest, #30048)
[Link] (9 responses)
This is especially relevant for the developers of protocols. Any protocol can be designed to transparently use and mandate encryption. The work required to develop this is small because there are already many libraries that implement the hard parts, and the computation required is irrelevant for most protocols. And yet we often don't bother: "if the user needs crypto, they can tunnel it". But that's rubbish; the user doesn't understand all the risks they face, and even if they do, it's unlikely that everyone they communicate with shares the same concerns.
Even without authentication, which can't be provided without at least a small imposition on the user, automatically keyed encryption provides enormous protection: it forces any attacker into an active attack, which is much more easily detected, prosecuted, or avoided and much harder to implement. Simple unauthenticated encryption also greatly frustrates panopticon-style monitor-everything surveillance.
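The "automatically keyed encryption" the comment describes is essentially an unauthenticated ephemeral Diffie-Hellman exchange. A minimal sketch (the group here is a toy 64-bit prime so the numbers stay readable; real deployments use vetted 2048-bit-plus groups, and without authentication an active man-in-the-middle can still sit between the endpoints, which is exactly the trade-off being argued):

```python
import secrets
import hashlib

# Toy group: the largest 64-bit prime, generator 2. NOT secure at this
# size -- real deployments use vetted 2048-bit-plus groups, but the math
# is identical.
P = 2**64 - 59
G = 2

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1   # fresh ephemeral secret exponent
    return priv, pow(G, priv, P)          # only the public value hits the wire

def shared_key(priv: int, their_pub: int) -> bytes:
    # Hash the shared group element down to a symmetric key.
    return hashlib.sha256(pow(their_pub, priv, P).to_bytes(8, "big")).digest()

# Each endpoint generates a keypair per session, swaps public values, and
# derives the same key -- no pre-shared secret and no user interaction.
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)
```

A passive wiretap sees only `a_pub` and `b_pub`, which is why this forces the attacker the comment mentions into the riskier, more detectable active role.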
If you're worried that encryption without authentication may give the user a false sense of security, then simply _don't tell the user that they have encryption_. Protection is still valuable even if the user doesn't know about it; it discourages the creation of unlawful eavesdropping infrastructure by reducing its value, so, used widely enough, it even protects people who don't use it.
The OTR protocol is a great example of this mindset (ignoring the bug with multiple logins): painless, transparent, always-on encryption plus optional authentication that is very easy to use.
HTTP security as implemented in browsers today is an example of a failed attempt: no security at all for the vast majority of connections, because getting security is costly and annoying. Worse, in spite of compromising basic privacy for the sake of always getting authentication, it's still completely vulnerable to an active attacker, because users usually begin on an HTTP page and won't notice the lack of encryption.
As developers I think we have an ethical obligation to bake these kinds of mandatory security measures into our applications and protocols. Asking the user to take the cost of using these security measures and convincing all their friends to use them is little better than not having them at all.
In the past the Telepathy developers have responded to calls for OTR support in a shameful manner: Mocking people that ask for it as paranoids and pushing some cumbersome certificate based alternative which doesn't provide the right security properties and doesn't provide them in a way which will be useful for anyone.
I hope Danny's presentation made some progress in convincing people of the error of their ways.
Posted Aug 4, 2010 16:40 UTC (Wed)
by farnz (subscriber, #17727)
[Link] (4 responses)
Thing is that at a deeper level, we have IPSec. It has issues in a NATted world that need fixing (whether by enhancing NAT traversal of IPSec, or by going to IPv6 and removing NAT while we're at it), but it should in theory let all protocols that don't care use encryption by default, whether it's part of the protocol or not. Authentication is harder to solve (due to the inherent need for some sort of out-of-band proof of identity).
And once you've solved the encryption problem once (by having all data that goes over IP encrypted in IPSec, even if it's then encrypted again inside the protocol for authentication purposes, a la HTTPS), there's no need for protocol designers to care.
Posted Aug 4, 2010 17:16 UTC (Wed)
by gmaxwell (guest, #30048)
[Link] (3 responses)
Even on the IPv6 internet it isn't there and doesn't appear to be forthcoming. Hell, path MTU discovery is absolutely mandatory with IPv6 because routers can't fragment, and yet many sites which should know better (e.g. ISC) have been blocking the "needs fragment" ICMP packets it depends on. IPSEC doesn't even have a chance.
I support the notion of host-to-host IPSEC, but we live in the real world and need to deal with real issues, and today IPSEC is a non-solution. And because it wasn't made mandatory with IPv6, it will likely always be vulnerable to downgrade attacks: simply block IPSEC and it will be disabled (either automatically and without the user's knowledge, or manually), and you can even plausibly claim that an honest misconfiguration caused the blocking.
Proposing that we solve this by using IPSEC is like proposing we boil the oceans. Effectively there is no party in control of IP, just a loose collective of voices at the IETF, and getting principled ethical action out of a consortium of many interests approaches impossible as the number of participants increases. Especially when some of those voices are, in fact, very interested in preserving an environment where surveillance is easy and cheap.
But for applications there are single people and small groups with the power to make the right choices. They ought to stand up and make them, rather than waiting for the IP folks to solve it for them.
Even in a future world where IPSEC is viable, application-level security still provides value, because IPSEC's location in the network stack precludes it from providing the same security properties. You mention authentication, which is a big one, but there are additional properties like OTR's anti-non-repudiation (deniability), which requires specialized interaction between authentication and crypto. And there are many other cases where transport security isn't enough, like email, where the material should be stored in a secure form. So effort spent securing our applications will not be wasted.
Posted Aug 4, 2010 20:30 UTC (Wed)
by farnz (subscriber, #17727)
[Link] (2 responses)
On the other hand, the vast majority of protocol and application designers just don't care. "Boiling the oceans" by making IPSec work everywhere isn't easy - but it's a one-off big cost. Trying to round off every grain of sand by getting both protocol developers (who don't always see why it's important) and application developers (who often see it as extra work for no gain) to handle encryption in the protocol is similarly hard - but it's a lot of small costs, and we keep paying them day-in, day-out as we try to fix everything.
Further, the nature of IPSec is that it gets implemented once per OS, and then it's not an issue for any application that uses that OS - indeed, I can blithely write encryption-unaware code using the old BSD sockets API like I've always done, and benefit. If I have to get encryption right, not only is that extra effort that I'm going to fight back against (naturally - like most programmers, I'm lazy), but it also opens up lots of ways for me to get it wrong, whether we're talking design flaws like WEP's, or implementation failures like the Debian OpenSSL flaw, which rendered keys effectively 16 bits long.
At least with IPSec, people with more clue than I'll ever have have checked the design for mistakes, and another group of very clueful people will implement it, and fix the flaws found in the wild. And note that I'm not wedded to IPSec; if there was (say) a library that I could just link against and get all the nice security properties IPSec offers, that'd do just as well.
Posted Aug 4, 2010 21:42 UTC (Wed)
by gmaxwell (guest, #30048)
[Link] (1 responses)
Host-to-host IPSEC is currently going nowhere, as has been the case for the past decade. Even with working support in the OS (and I don't believe that _any_ major operating system has working IPSEC OE out of the box, at least not without manually configuring certs for everything you want to talk to!), getting it enabled and getting the network to stop breaking it is a major challenge. I don't see how you can claim that something which is effectively vaporware has "nice security properties".
At least with the application-oriented model we can make progress by winning one application at a time. Either way we have to win over people who don't care, but winning over developers gives incremental progress all along the way, while trying to win over people running networks and people configuring hosts doesn't get you much until it's done everywhere.
There are libraries that give you a sockets-like interface to TLS; it's simple enough that many things support it optionally. To be immune to downgrade attacks, it needs to be made mandatory.
Posted Aug 5, 2010 10:52 UTC (Thu)
by farnz (subscriber, #17727)
[Link]
But the people responsible for end-host software (Linux distro developers, for example) also ought to care about their users' privacy. Practically speaking, I've yet to come across a library that's a transparent replacement for BSD sockets (as in LD_PRELOAD or equivalent) so that I don't have to care about encryption - it just works. Instead, I have to remember to not use the libraries I've used for years, because if my quick hack becomes important, it might matter.
At least with IPSec OE (which still needs work to fix, hence not ruling out other libc/kernel level routes), it doesn't matter if I forget to put in the SSL layer - it still gets encrypted. And if I need more complex solutions (authentication, repudiability etc), I can still put it in in the application layer.
In the end, whether you try and tackle application and protocol developers one at a time, or distro developers, you're facing an uphill struggle - not least because (by and large), people don't see encryption as important. I believe that we're better off putting the effort into making all communications encrypted by default using some form of opportunistic encryption (not necessarily IPSec - an automatic SSL layer that just happens without application intervention would work, too, as would any other form of OE that doesn't require application developer support).
And, of course, you are claiming that genuine vapourware (as in doesn't exist at all) has nice security properties. I am claiming that current deployments of IPSec have nice security properties, and I don't see how OE (the vapourware) breaks them.
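The "automatic SSL layer that just happens without application intervention" mentioned above could look roughly like the sketch below (an illustration of the opportunistic-encryption idea under discussion, not an existing library): try TLS first, and fall back to plaintext if the peer doesn't speak it. As both commenters note, skipping certificate verification stops passive taps but not an active attacker who simply blocks the TLS attempt.

```python
import socket
import ssl

def opportunistic_connect(host: str, port: int, timeout: float = 5.0):
    """Try TLS first and silently fall back to plaintext, so an application
    written against plain sockets gets encryption whenever the peer offers
    it. Returns (socket, encrypted). No authentication is attempted."""
    sock = socket.create_connection((host, port), timeout=timeout)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE   # opportunistic: encrypt, don't verify
    try:
        return ctx.wrap_socket(sock, server_hostname=host), True
    except OSError:
        # The peer (or a downgrade attacker) didn't speak TLS; reconnect
        # in the clear rather than failing the application.
        sock.close()
        return socket.create_connection((host, port), timeout=timeout), False
```

The fallback branch is exactly the downgrade hole discussed in this thread: an opportunistic layer protects against dragnet monitoring but has to surface (or refuse) the plaintext case before it can resist an active adversary.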
Posted Aug 5, 2010 19:12 UTC (Thu)
by BenHutchings (subscriber, #37955)
[Link] (3 responses)
Check the date on that entry. I really don't think it was intended as mocking anyone.
Posted Aug 5, 2010 19:43 UTC (Thu)
by gmaxwell (guest, #30048)
[Link] (2 responses)
I probably should have mentioned that this dispute regarding OTR in Telepathy vs the decision to promote certificate based XMPP-only security had been going on for a year prior to that "joke" post across several mailing lists and bug trackers.
To see the dramaz for yourself, google around or start at http://bugs.freedesktop.org/show_bug.cgi?id=16891
Posted Aug 6, 2010 0:49 UTC (Fri)
by pabs (subscriber, #43278)
[Link]
Posted Aug 19, 2010 10:18 UTC (Thu)
by robot101 (subscriber, #3479)
[Link]
(And, let's be clear: the exact details of the XMPP encryption are fairly irrelevant to this discussion. Although it is "certificate based", this is just a technical convenience that allows us to use existing TLS libraries. A certificate is just a key in a certain format; it does not impose any requirement on how those certificates are signed or verified. So they don't need CAs or governments to sign them or whatever; you can still do SSH/OTR-style "leap-of-faith" and manual verification of fingerprints/identities if you wish, which is how we always planned to present it in the UI.)
However, we had a discussion with some EFF members who explained to us the problem with our approach, which is nothing to do with any technical details or any type of encryption being "more bettar" than another - it's simply that by not supporting the use of OTR in Telepathy, we were reducing the existing privacy of the currently deployed OTR users in the cases they were talking to Telepathy users. This reasoning was explained to us clearly and in a level-headed manner, and we were inclined to agree.
We therefore adjusted our plans to allow both the XMPP and OTR style encryption to be presented through a similar Telepathy API (and also therefore, a similar UI in the client application, presenting a hopefully better-integrated and smoother experience for the user whichever technology is in use). We currently do not have many spare resources to work on an OTR implementation of this API, although we would be very happy to support somebody who was interested in working on it, and would be happy to add support in the Empathy UI if this API was implemented.
(speaking as a co-founder of the Telepathy project, although I contribute in more of a hand-wavy direction-setting way now :D)
Posted Aug 4, 2010 16:37 UTC (Wed)
by pj (subscriber, #4506)
[Link] (2 responses)
Posted Aug 7, 2010 1:04 UTC (Sat)
by dannyobrien (subscriber, #25583)
[Link]
Writing a doorman monitor for users, which would look for and complain about outgoing traffic, would be lower-hanging fruit.
Posted Aug 4, 2010 22:56 UTC (Wed)
by gdt (subscriber, #6284)
[Link]
Another good example of programmers not understanding the use of the OS in repressive circumstances is their misuse of flags. Flags indicate an allegiance to a cause -- usually a nation. Using flags as an icon for the choice of language is fraught with difficulties. As there are circumstances where indicating the wrong allegiance can be fatal, having flags anywhere in the user interface is stupid. One package I used displayed the USA flag to mean "English language in use". Obviously that programmer had never considered that some countries that are fine for English-language speakers to visit are antagonistic to those with an allegiance to the USA (a good current example would be Iran).
Posted Aug 5, 2010 4:15 UTC (Thu)
by drag (guest, #31333)
[Link] (3 responses)
In the USA, the standards of evidence the government needs to legally justify spying on you differ for content you surrender to Google versus content you manage yourself. This is something you need to keep in mind if you're engaging in sensitive correspondence that may be of political interest to somebody in government.
The rules are complicated and difficult to understand. For example, if you have not edited a document in 180 days, it's easier for the government to force Google to surrender it than if it's something you've worked with recently.
-------------------------------
Another thing to be aware of:
President Obama is also pushing hard to expand the ability for the FBI and other government agencies to gain unfettered access to metadata on your communications and reduce judicial oversight.
The FBI has for years used what are called "National Security Letters" (NSLs).
These letters, when sent to corporations, were designed as a way to gain access to very small and very limited amounts of information about a 'subject of interest'. At the time, the FBI needed a certain amount of facts to justify that a person was probably a spy or a terrorist or something like that in order to legally write the letter.
With the Patriot Act this was expanded greatly by removing the level of justification. Now there is no requirement for facts or proof; it depends entirely on a judgment call by the FBI to issue this sort of 'limited judgeless warrant' for information.
However, the amount and types of information were still very specific (who owns the account, how long they have owned it, what their address is, etc.). (The FBI actually went beyond that and got its hands slapped by George Bush's legal counsel.)
However, this is something that Obama is very recently trying to "fix". The way he is trying to fix it is by greatly expanding the type of information the FBI can request.
Obama is pushing for _any_ transactional data for any electronic communications to be eligible for request via NSLs, and for NSLs to be usable against anybody in the course of a terror investigation, even if they are not themselves suspected of being a terrorist or even directly involved.
So basically... any website you visited, web searches you've done, information on who you've been emailing, information on everybody that visited a suspected website, cell phone GPS tracking data, etc etc.
So they are saying that while they need a judge to help them get access to your email, they should be able to request the headers of your email with no justification.
On top of this, there are gag orders involved with NSLs.
For example:
If the government gets a warrant to wiretap your phones, it is legally required to reveal the wiretap to you after it's over.
With NSLs, however, if the FBI sends a request to Google for your metadata, it's illegal for Google to tell you about it. That makes judicial oversight and journalistic investigation impossible; nobody can practically challenge an NSL because nobody is allowed to know about it.
So... yeah. It's going to be very difficult in the future to trust any corporation with your data if you're involved in anything politically disruptive or unpopular with whatever president is in power. They will be able to use the FBI to require corporations to spy on you with utmost secrecy. Anybody who knows the history of the FBI knows that much of the time it's used as a political tool by whoever is in charge of the executive branch at the time.
Posted Aug 5, 2010 7:52 UTC (Thu)
by gmaxwell (guest, #30048)
[Link] (2 responses)
It seems some crazy people have applied Bush-administration-style thinking to the wiretap laws and decided that it's possible for you to waive your right to privacy in the provider's TOS. No NSLs required; as far as I can tell, their legal reasoning would even allow them to sell your private data to your enemies or competitors for monetary gain.
Additionally, many "cloud computing" services may not enjoy Stored Communications Act protection: http://www.georgetownlawjournal.com/issues/pdf/98-4/Robis...
Posted Aug 5, 2010 9:14 UTC (Thu)
by drag (guest, #31333)
[Link]
But this is not a government agency; it's just one business sharing information with another. Different rules. They cannot use the threat of force to compel corporations to cooperate like the government can, but they can probably legally get information that would be troublesome for the government to obtain.
Think about the difference between a P.I. vs a Cop.
I guess it's time we take a closer look at those EULAs for our ISPs.
---------------------------
on a side note:
If anybody tells you to just cooperate with the government and not to worry because they won't throw you in jail, they are lying to you. Often you have no choice but to cooperate, since, as with all the other amendments in the Bill of Rights, the government has been working very hard to eliminate the positive effects of the 5th amendment, but regardless: get a lawyer first.
----------------------------
For people who are curious about this phenomenon, The Washington Post has done some very good investigative journalism documenting the extent to which the government has gone to monitor and surveil its own citizens. The vast majority of the money is getting funneled into private contractors... If you want to get rich quick: join the FBI for a couple of years, drop out, then create your own private investigative firm and get a contract with the government. Very lucrative.
Posted Aug 6, 2010 21:23 UTC (Fri)
by spender (guest, #23067)
[Link]
I don't mean to devalue your point when applied to the existence of private companies who may be engaging in the same or similar behavior, I'm just speaking to this specific instance.
-Brad
Posted Aug 5, 2010 20:51 UTC (Thu)
by Kwi (subscriber, #59584)
[Link]
Interesting article, although with one minor error: in "the lesser-known Great English Firewall matches URLs to a list of child pornography sites", it should of course (at the very least) be "alleged child pornography sites". These secret extrajudicial censorship lists contain few child pornography sites, if any. Instead, they are used to filter whatever their sponsoring organizations (e.g. the Internet Watch Foundation or Save the Children Denmark) deem immoral, be it old album covers, perfectly legal porn sites, political websites critical of censorship, and (apparently) the occasional Australian dentist and Dutch transport company.
Posted Aug 12, 2010 16:36 UTC (Thu)
by bhaskar (guest, #69531)
[Link]
One thing to remember is that proprietary providers of encryption may have been subverted. One example is the Swiss firm Crypto AG, which sold crypto gear to all comers back in the 1980s. The NSA managed to install a back door, which allowed them to read Libyan encrypted communications. Right now Saudi Arabia and the UAE are planning to ban the BlackBerry because they don't have access to encrypted messages, but for all we know the NSA might already have its hooks into RIM.
The main equivalent would be CSEC. But of course post-9/11 there's been a flowering of establishments apparently devoted to spying on everything and everyone.
FINTRAC is one of the newish Canadian spy agencies. It recently got into trouble with the privacy commissioner for having too little governance and accountability w.r.t. privacy.
Sometimes security is locked in a trade-off with convenience. But often it's not: In most cases you can have some security (if not maximal security) without any impact on convenience at all.
Due to the pervasiveness of NAT (as you mention) and layer-4 firewalling, opportunistic user-to-user IPSEC isn't just a dream, it's pure fantasy. It also has fairly high per-packet overhead, while connection-oriented security protocols like TLS have very little overhead past the initial setup.
Mocking people that ask for it as paranoids...
Funnily enough, an app like that is exactly what I ended up discussing with a few people after the keynote. I think the tough problem is having a heuristic that can say "this is (probably) encrypted". Perhaps better to see it as a developer utility (a sort of netlint) rather than a user warning system.
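One common heuristic for "this is (probably) encrypted" is byte entropy, since ciphertext is statistically close to random. A minimal sketch of the idea (the 7.2 threshold is an illustrative assumption, not a tuned value):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    # Bits per byte; uniformly random (or encrypted) data approaches 8.0.
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def probably_encrypted(payload: bytes, threshold: float = 7.2) -> bool:
    # High entropy also matches compressed data, which is one reason this
    # can only ever say "probably" -- and why a netlint-style developer
    # tool is a better fit than a hard user-facing warning.
    return shannon_entropy(payload) > threshold

http = b"GET /index.html HTTP/1.1\r\nHost: example.org\r\n\r\n" * 20
print(probably_encrypted(http))              # False: plaintext protocol chatter
print(probably_encrypted(os.urandom(4096)))  # random data reads as encrypted
```

The compressed-vs-encrypted ambiguity is exactly why the comment frames this as a lint for developers rather than a warning users must interpret.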
http://taosecurity.blogspot.com/2010/08/project-vigilant-...
http://www.reddit.com/r/IAmA/comments/cx2t8/iama_voluntee...
(check out the Forbes journalist asking on reddit for input; that's quality)