Security quotes of the week
So that's the real ethical question involved here: do you go to Apple and
get your $50-200,000, knowing that Apple will give you credit for the bug,
let you talk about it at the next conference, and seems to care enough to
try to fix these things quickly...
— "saurik"
...or do you sell your bug to a group that resells it to some government
which then uses it to try to spy on people like Ahmed Mansoor, "an
internationally recognized human rights defender, based in the United Arab
Emirates (UAE), and recipient of the Martin Ennals Award (sometimes
referred to as a 'Nobel Prize for human rights')"?
(Thanks to Paul Wise.)
To address this, Google developed a machine-learning algorithm for
clustering mobile apps with similar capabilities. Our approach uses deep
learning of vector embeddings to identify peer groups of apps with similar
functionality, using app metadata, such as text descriptions, and user
metrics, such as installs. Then peer groups are used to identify anomalous,
potentially harmful signals related to privacy and security, from each
app's requested permissions and its observed behaviors. The correlation
between different peer groups and their security signals helps different
teams at Google decide which apps to promote and determine which apps
deserve a more careful look by our security and privacy experts. We also
use the result to help app developers improve the privacy and security of
their apps.
— Martin Pelikan, Giles Hogben, and Ulfar Erlingsson
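The peer-group idea quoted above can be sketched in miniature. This toy
example is purely illustrative, not Google's system: the app names,
embedding vectors, and permission sets are all invented, and a real
deployment would derive embeddings via deep learning over app metadata
rather than hard-coding them. It clusters apps by embedding distance and
flags permissions that an app's peers rarely request:

```python
# Toy peer-group anomaly detection (illustrative only; all data invented).
from math import sqrt

# Hypothetical apps: name -> (description embedding, requested permissions)
APPS = {
    "flashlight_a": ([1.0, 0.0], {"CAMERA"}),
    "flashlight_b": ([0.9, 0.1], {"CAMERA"}),
    "flashlight_c": ([1.0, 0.1], {"CAMERA", "READ_CONTACTS", "LOCATION"}),
    "notes_a":      ([0.0, 1.0], {"STORAGE"}),
    "notes_b":      ([0.1, 0.9], {"STORAGE"}),
}

def distance(u, v):
    """Euclidean distance between two embedding vectors."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def peer_group(name, radius=0.5):
    """Apps whose embedding lies within `radius` of this app's embedding."""
    vec = APPS[name][0]
    return [n for n, (v, _) in APPS.items() if distance(vec, v) <= radius]

def anomalous_permissions(name):
    """Permissions this app requests that most of its peers do not."""
    peers = [p for p in peer_group(name) if p != name]
    flagged = set()
    for perm in APPS[name][1]:
        holders = sum(1 for p in peers if perm in APPS[p][1])
        if peers and holders < len(peers) / 2:
            flagged.add(perm)
    return flagged
```

Here `anomalous_permissions("flashlight_c")` returns `{"READ_CONTACTS",
"LOCATION"}`: the other flashlight apps in its peer group request only
`CAMERA`, so the extra permissions stand out as signals worth a closer
look, which is the intuition behind the quoted approach.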