A team of researchers led by Dan Boneh at Stanford University undertook a study of the "private browsing" feature offered by most web browsers, and found numerous exploitable holes in the wall of protection the features are supposed to maintain. The results, presented at the USENIX Security conference in August, spawned a variety of reactions from browser makers, from defensive posturing to bug reports. In the meantime, the paper provides a practical method to increase private browsing's privacy through a new Firefox extension.
The study examined the private browsing features offered by the four most popular browsers (Internet Explorer, Firefox, Chrome, and Safari) with regard to two distinct privacy models. The first is termed the "local attacker," although it need not be a hostile party; browser makers advertise private browsing features as a way to keep family members from learning about birthday present shopping and secret vacation plans. A private browsing failure with regard to the local attacker would allow the attacker to learn what happened during a private session after the session had been terminated, specifically by finding information written to disk somewhere by the browser, as opposed to network-based approaches such as examining proxy caches.
The second model is the "web attacker," namely a hostile site. A web attack flaw would allow the attacker to discover that a visitor in a private browsing session is the same person as a visitor in a previous non-private session, or vice versa. In either case, the attack is limited to information exposed by the browser during the browsing session; simply recording the visitor's IP address, or relying on the user to do something self-identifying (logging in, for example), does not count.
In looking at the ways that a browser can "leak" information during a private browsing session, the team classified all of the possible state changes into four categories. The first category, changes initiated by the site without user intervention (such as saving cookies, recording browsing history, and caching data), was actively guarded against by all four browsers. It was in the other three categories that the browsers both disagreed with each other and were internally inconsistent: changes initiated by the site but requiring user action (such as saving a password or generating an SSL client certificate), changes initiated by the user (such as adding a bookmark or downloading a file), and changes that are not user-specific (such as installing an update or refreshing a block-list).
The team performed tests tracking disk writes during private browsing sessions. It also performed an audit of the Firefox source code, and found several leaks where data generated during a private browsing session was later accessible from a public browsing session. Some problems were common to all four browsers, such as bookmarks added during a private session (none of the browsers offers a separate "private bookmarks" feature; all add bookmarks to the global list) and the automatic recording of file downloads, which persists even if the downloaded files themselves are removed. Some browsers allow the user to delete individual entries from the Downloads window's log manually, but only if the user remembers to do so.
Other problems were specific to particular browsers. Internet Explorer, for example, can be tricked into requesting page content over SMB simply through use of the \\servername\resource.ext naming convention, which bypasses IE's private browsing altogether and sends Windows hostname and username information to the server in SMB protocol messages. IE, Safari, and Chrome also all permanently save user-approved self-signed SSL certificates encountered during private browsing.
Firefox was not immune to private browsing leaks either. The team singled out the "register protocol handler" function, which a site can use to register a custom protocol with the browser and trigger a desired action whenever a matching link is clicked (think torrent:// links, for example). Those protocol handlers live in the document object model (DOM) of the browser and can be detected by a remote site, which could leak information between public and private sessions.
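The registration mechanism itself is a standard browser API, navigator.registerProtocolHandler(). The sketch below uses hypothetical site URLs and protocol names; the substitution function is an illustrative stand-in for what the browser does internally when a registered link is clicked, not actual browser code.

```javascript
// In the browser, a site registers itself as the handler for a custom
// protocol roughly like this (hypothetical URL and scheme):
//
//   navigator.registerProtocolHandler("web+torrent",
//       "https://tracker.example.com/open?uri=%s", "Example tracker");
//
// When the user later clicks a matching link, the browser substitutes
// the link's URI into the handler template's %s placeholder. A minimal
// sketch of that substitution step:
function resolveHandler(template, uri) {
  return template.replace("%s", encodeURIComponent(uri));
}

console.log(resolveHandler(
  "https://tracker.example.com/open?uri=%s",
  "web+torrent:example"));
// → https://tracker.example.com/open?uri=web%2Btorrent%3Aexample
```

Because the registration persists in browser state, a handler registered during a public session remains visible in a later private session (and vice versa), which is exactly the cross-session signal the paper warns about.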
During the in-depth look at Firefox, the team found five different files in a user's profile that were written to during private browsing, including security certificate settings (the cert8.db file), site-specific permissions such as cookie- and pop-up-blocking (in the permissions.sqlite file), download action preferences (in the mimeTypes.rdf file), automatically-discovered search engine add-ons (in the search.sqlite and search.json files), and plugin registrations (in the pluginreg.dat file).
Beyond the individual problems with each browser, however, the team found add-ons to be a far more serious source of private browsing data leaks. Plugins can set and read their own cookies through mechanisms entirely outside the browser's control. Extensions are worse, in most cases outright ignoring whether or not the browser is in private browsing mode.
Not all browsers support extensions, of course, but all had problems. Chrome disables all extensions by default while in its private browsing mode, though this setting can be changed on a per-extension basis. IE also defaults to disabling extensions while browsing privately, although this is a single preference that can only be toggled for all extensions at once. Safari does not have a public, supported extension API at all, but unsupported extensions continue to run unaltered while private browsing is enabled.
Firefox's extension behavior is the most problematic, starting with the fact that extensions remain enabled when private browsing is turned on. The paper examined the 40 most popular Firefox extensions in depth. Eight were binary extensions, which constitute a serious security threat in their own right because they run with the same read/write privileges as the current user. Of the remaining 32, 16 wrote no data to disk at all when browsing, but only one — Tab Mix Plus — actually checked the privacy mode of the current session through Firefox's nsIPrivateBrowsingService API.
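The check that a well-behaved extension performs amounts to asking that service whether private browsing is active before persisting anything. A minimal sketch follows; the guard function and the stubbed service objects are hypothetical, while the XPCOM lookup shown in the comment is the real Firefox API of that era.

```javascript
// Inside Firefox, an extension obtains the service via XPCOM:
//
//   var pbs = Components.classes["@mozilla.org/privatebrowsing;1"]
//       .getService(Components.interfaces.nsIPrivateBrowsingService);
//
// and then gates its disk writes on the service's
// privateBrowsingEnabled attribute. A hypothetical guard:
function shouldWriteToDisk(pbs) {
  return !pbs.privateBrowsingEnabled;
}

// Stub objects standing in for the real XPCOM service:
console.log(shouldWriteToDisk({ privateBrowsingEnabled: true }));   // false
console.log(shouldWriteToDisk({ privateBrowsingEnabled: false }));  // true
```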
The study's authors indicate that they have begun a similar in-depth examination of the most popular Chrome extensions. Thus far, they have encountered several that can execute arbitrary binary code, and several more that report user information to remote Google Analytics servers, leading them to expect to uncover more privacy violations.
The paper concludes with a discussion of various approaches that the browsers could take to improve the privacy guarantees of private browsing sessions, with particular emphasis on extensions. The ideas include having each extension voluntarily suspend writing information during a private session, having the browser block all extensions from writing data during a private session, and having the browser revert changes made by extensions.
The authors seem to regard the first option as the easiest to implement, and built their own extension named ExtensionBlocker that implements it. ExtensionBlocker works by querying the manifest file of each active Firefox extension, and, during a private session, disabling those that do not include the suggested <privateModeCompatible/> XML tag. Thus far, ExtensionBlocker does not seem to have been released to the public.
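The manifest check can be sketched as follows. The tag name comes from the paper, but since ExtensionBlocker's source has not been released, the parsing details here are an assumption rather than the actual implementation.

```javascript
// Hypothetical reimplementation of ExtensionBlocker's test: given the
// text of an extension's manifest file, decide whether the author has
// declared the extension safe to leave enabled in a private session.
function isPrivateModeCompatible(manifestXml) {
  // The paper's suggested marker is an empty <privateModeCompatible/>
  // element; allow optional whitespace before the self-closing slash.
  return /<privateModeCompatible\s*\/>/.test(manifestXml);
}

console.log(isPrivateModeCompatible(
  "<manifest><privateModeCompatible/></manifest>"));  // true
console.log(isPrivateModeCompatible("<manifest/>"));  // false
```

Extensions whose manifests fail the check would be disabled for the duration of the private session and re-enabled afterward.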
Naturally, for ExtensionBlocker to be useful, other extension authors would have to start including the <privateModeCompatible/> tag; so far, none have adopted it. The authors do recommend several other privacy-protecting extensions, such as Torbutton, Doppelganger, and Bugnosis. In addition, they suggest that a system similar to the W3C's Platform for Privacy Preferences (P3P) could be used to classify sites as safe for private-mode browsing.
Shortly after the paper was published online, ZDNet Asia solicited
feedback from various browser vendors. Opera, surprisingly, was the
harshest in its criticism, reportedly accusing the researchers of
"simply incorrect" assumptions about the security goals of
private browsing. Opera was not included in the tests, but it did recently
introduce its own private browsing feature, which it (like the others) advertises as offering protection for gift-shoppers, shared-computer users, and "testing websites" for "cookie and session-related aspects" of browsing.
Google responded by saying that Chrome's private browsing mode helps you "limit" the information saved on disk but that it makes clear that the mode "does not remove all records."
For its part, Mozilla responded by saying that some of the issues addressed in the paper have already been fixed in the Firefox 4 series, and that others have been filed as new bugs to be worked on.
All details aside, perhaps the biggest take-aways for users are that the browser makers do not agree on what "private browsing" mode actually entails, and that none of them makes strong guarantees. As always, existing tools like Tor, Privoxy, and NoScript offer the dedicated user a way to significantly improve the anonymity and security of a browsing session, albeit at the cost of reduced functionality on certain sites.
Finally, everyone concerned about his or her browsing privacy would do well to remember that private browsing modes offer no protection against certain other types of detection. The Electronic Frontier Foundation's Panopticlick tool, for example, uses combinations of remotely-accessible, non-private data (including the list of installed plugins, fonts, OS version information, and more) to assemble what could be a unique fingerprint for each browser — regardless of what nsIPrivateBrowsingService might report.