
Google's differential privacy library

Google has announced the release of a new library for applications using differential privacy techniques. "Differentially-private data analysis is a principled approach that enables organizations to learn from the majority of their data while simultaneously ensuring that those results do not allow any individual's data to be distinguished or re-identified. This type of analysis can be implemented in a wide variety of ways and for many different purposes. For example, if you are a health researcher, you may want to compare the average amount of time patients remain admitted across various hospitals in order to determine if there are differences in care. Differential privacy is a high-assurance, analytic means of ensuring that use cases like this are addressed in a privacy-preserving manner."
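
For readers wondering what such an analysis looks like in practice, here is a minimal sketch, in Python rather than the library's C++, of the standard Laplace-mechanism approach applied to the hospital example from the quote. The function, data, and parameter names are hypothetical and this is not the library's actual API; the real library does considerably more (budget accounting, many aggregation types, and so on).

    # A minimal sketch of the idea, not the API of Google's library:
    # an epsilon-differentially-private mean computed with the Laplace
    # mechanism. All names and data here are hypothetical.
    import numpy as np

    def dp_bounded_mean(values, lower, upper, epsilon, rng=None):
        """Return an epsilon-DP estimate of the mean of `values`.

        Values are clamped to [lower, upper], so adding or removing one
        record changes the sum by at most max(|lower|, |upper|) and the
        count by at most 1. The budget is split between the two queries.
        """
        rng = rng or np.random.default_rng()
        clamped = np.clip(np.asarray(values, dtype=float), lower, upper)
        half_eps = epsilon / 2.0
        sum_sensitivity = max(abs(lower), abs(upper))
        noisy_sum = clamped.sum() + rng.laplace(scale=sum_sensitivity / half_eps)
        noisy_count = len(clamped) + rng.laplace(scale=1.0 / half_eps)
        # Crude guard against a non-positive noisy count; post-processing
        # like this does not weaken the differential-privacy guarantee.
        return noisy_sum / max(noisy_count, 1.0)

    # Hypothetical lengths of stay, in days, at one hospital.
    stays = [3, 5, 2, 8, 4, 6, 1, 7, 3, 5]
    print(sum(stays) / len(stays), dp_bounded_mean(stays, lower=0, upper=30, epsilon=1.0))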


Google's differential privacy library

Posted Sep 5, 2019 20:04 UTC (Thu) by ncultra (✭ supporter ✭, #121511) [Link] (4 responses)

Calling it a "privacy" library is Orwellian. It may be more appropriately called an anonymization library. But I see the vast benefit of doing this meta-analysis using correct methods.

Google's differential privacy library

Posted Sep 6, 2019 8:14 UTC (Fri) by mattdm (subscriber, #18) [Link] (3 responses)

It's a "differential privacy" library, not a differential "privacy library". This is a term of art; see e.g. https://www.microsoft.com/en-us/research/publication/diff...

Google's differential privacy library

Posted Sep 8, 2019 19:12 UTC (Sun) by scientes (guest, #83068) [Link] (2 responses)

Redefining inflation doesn't make it go away. Similarly, redefining privacy *does* make it go away, in that you don't have privacy anymore, because the Orwellian tricks are hidden behind obscure and deliberately hard-to-understand "academic" theories (which are thinly veiled political policies).

Google's differential privacy library

Posted Sep 8, 2019 19:15 UTC (Sun) by scientes (guest, #83068) [Link]

Like, have you noticed the cost of a hamburger at McDonald's lately?

Google's differential privacy library

Posted Sep 10, 2019 13:44 UTC (Tue) by kleptog (subscriber, #1183) [Link]

It's not redefining privacy; it's actually trying to provide a rigorous definition of what privacy is.

I can publish a list that contains all the PIN codes of all your cards, but that doesn't violate your privacy. If, however, I publish a link between a specific PIN code and the street you live in, does that violate your privacy?

If I train a ML model on a whole lot of private data, can I publish the resulting model? What criteria would you use to decide?

An IP address in a log is only an issue if you combine it with a database that maps IP addresses to people. If such a database exists, does it matter who has access?

As long as you consider privacy a binary issue, you cannot have any sensible discussion about it. It's a really important area that needs a lot of discussion and research so we can collectively decide what we actually want and what trade-offs we find acceptable. And there are lots of trade-offs being made right now without a good discussion about what is actually being traded.
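
To make the "not binary" point concrete, here is a small illustrative sketch (all numbers made up, no particular library's API) of how the Laplace mechanism's privacy parameter epsilon trades accuracy against privacy: a smaller budget means more noise on a published count, and choosing epsilon is exactly the kind of trade-off being discussed.

    import numpy as np

    rng = np.random.default_rng(42)
    sensitivity = 1.0   # one person can change a simple count by at most 1

    # For each privacy budget, look at how much Laplace noise a published count
    # would typically carry: smaller epsilon = stronger privacy = more noise.
    for epsilon in (0.01, 0.1, 1.0, 10.0):
        noise = rng.laplace(scale=sensitivity / epsilon, size=1000)
        err95 = np.percentile(np.abs(noise), 95)
        print(f"epsilon={epsilon:5}: 95% of noisy counts land within +/-{err95:.1f} of the true value")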


Copyright © 2019, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds