
Image "Cloaking" for Personal Privacy

Image "Cloaking" for Personal Privacy

[Security] Posted Jul 22, 2020 22:43 UTC (Wed) by jake

The SAND Lab at the University of Chicago has announced Fawkes, a BSD-licensed privacy-protection tool available on GitHub. "At a high level, Fawkes takes your personal images, and makes tiny, pixel-level changes to them that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would, sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, "cloaked" images will teach the model a highly distorted version of what makes you look like you. The cloak effect is not easily detectable, and will not cause errors in model training. However, when someone tries to identify you using an unaltered image of you (e.g. a photo taken in public), they will fail."
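The core idea is that the perturbation must stay within a tight per-pixel budget so the photo looks unchanged to a person. The sketch below illustrates only that bounded-perturbation idea with random noise; Fawkes itself computes targeted, feature-space perturbations against face-recognition feature extractors, not random noise, and the `cloak` function and `epsilon` budget here are hypothetical illustrations:

```python
import numpy as np

def cloak(image, epsilon=3):
    """Apply a small, bounded pixel-level change to an image.

    Illustrative only: this adds random noise clipped to +/- epsilon
    per channel, so the result is visually indistinguishable from the
    original. Fawkes' real cloaks are optimized adversarial
    perturbations, not random noise.
    """
    rng = np.random.default_rng(0)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    cloaked = np.clip(image.astype(int) + noise, 0, 255)
    return cloaked.astype(np.uint8)

# Example: cloak a synthetic 64x64 RGB image and verify the
# perturbation never exceeds the epsilon budget.
img = np.full((64, 64, 3), 128, dtype=np.uint8)
out = cloak(img)
assert np.abs(out.astype(int) - img.astype(int)).max() <= 3
```

Because every pixel moves by at most epsilon, the cloaked photo is safe to share anywhere an ordinary photo would be; the damage only appears downstream, when the image is used as training data.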


Copyright © 2020, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds