Scanning "private" content
Scanning "private" content
Posted Aug 16, 2021 23:04 UTC (Mon) by rodgerd (guest, #58896)
In reply to: Scanning "private" content by NYKevin
Parent article: Scanning "private" content
Pretty much. OneDrive, Google Drive, and Dropbox are almost certainly all doing this, and I'd be surprised if Slack, Teams, and the like aren't doing it as well.
WhatsApp have made a big deal about *not* doing this, but they provide an (IMO) even creepier feature for law enforcement: using metadata analysis to de-anonymise the social graph of their users.
(Even creepier overreach: the T&Cs for some smart TVs - which is effectively any TV you can buy now - specify that you agree to the TV capturing screenshots and sending them back to base.)
> The only reason this should be controversial is because of the possibility that it later expands to include more stuff.)
There are a few more things to it than that, in my opinion:
1. Not everyone wants US government organisations setting legal policy for their devices, and CSAM scanning is pretty much that.
2. How are false positives handled? We've seen woo like polygraphs misrepresented (i.e. lied about) in courts, along with other forensic pseudo-science like ballistics work. The possibility of very serious legal trouble from a misrepresented or misunderstood hash collision (see the sketch after this list) is not a comfortable thought.
3. Once it is well-understood that this sort of scanning is available, pressure to expand is inevitable, with the same leverage: you provide this facility, or you're out of our market.
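To illustrate why collisions matter here: perceptual-hash matching is a threshold test on similar bit patterns, not an exact comparison, so unrelated images can land within the match distance. A minimal, hypothetical Python sketch follows; the hash values, the threshold, and the Hamming-distance test are all assumptions for illustration, not any vendor's actual algorithm.

    # Hypothetical illustration: a perceptual-hash "match" is a threshold
    # test, not an exact comparison, so unrelated images can collide.
    # All values below are made up for the example.

    def hamming_distance(a: int, b: int) -> int:
        """Number of differing bits between two 64-bit perceptual hashes."""
        return bin(a ^ b).count("1")

    # Chosen by the operator: a looser threshold catches more re-encoded
    # copies of known images, but also more innocent look-alikes.
    MATCH_THRESHOLD = 10

    known_database_hash = 0x9F3B_72C1_04D8_A56E   # entry in a hash database (made up)
    innocent_photo_hash = 0x9F3B_72C1_04D8_25EE   # unrelated user photo (made up)

    distance = hamming_distance(known_database_hash, innocent_photo_hash)
    if distance <= MATCH_THRESHOLD:
        print(f"flagged as a match (distance {distance}) -- a false positive")
    else:
        print(f"no match (distance {distance})")

Under those assumptions, two unrelated images whose hashes differ in only a couple of bits get flagged, and everything downstream (human review, reporting, prosecution) then rests on how honestly that "match" is described.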