
Scanning "private" content

Posted Aug 16, 2021 20:03 UTC (Mon) by NYKevin (subscriber, #129325)
In reply to: Scanning "private" content by dskoll
Parent article: Scanning "private" content

> Huh, interesting. So what about people who already own iPhones? Can they get a refund because Apple has fundamentally changed the terms of the agreement? (I doubt it, sadly.)

Doubtful, but they could disable iCloud photo uploading, at which point (as far as I can tell) they would no longer be subject to this scanning. At least until Apple changes the policy again.

(If you think that Apple *won't* change their policy again, then this whole controversy is a complete nothingburger, because they were very likely already doing some sort of CSAM scanning on the server side anyway. That's basically standard practice in the industry, barring E2EE products that are incapable of it. The only reason this should be controversial is the possibility that it later expands to include more stuff.)
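To make "server-side scanning" concrete: at its simplest, it is a membership test of uploaded files against a list of known-bad digests. Here is a minimal Python sketch assuming an exact-hash blocklist; real services use perceptual hashes such as PhotoDNA precisely so that re-encoded or resized copies still match, and the digest below is just the SHA-256 of "test".

    import hashlib

    # Hypothetical blocklist of known-bad digests. Real systems use
    # perceptual hashes (PhotoDNA, NeuralHash), not plain SHA-256.
    BLOCKLIST = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def scan_upload(data: bytes) -> bool:
        """Return True if the uploaded bytes match a known-bad digest."""
        return hashlib.sha256(data).hexdigest() in BLOCKLIST

    # Server-side hook: check each file as it lands on the service.
    if scan_upload(b"test"):
        print("flag for human review")

The Apple controversy is about *where* that check runs (on the device rather than on the server), not about whether it happens at all.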



Scanning "private" content

Posted Aug 16, 2021 23:04 UTC (Mon) by rodgerd (guest, #58896) [Link] (1 response)

> they were very likely already doing some sort of CSAM scanning on the server side anyway. That's basically standard practice in the industry, barring E2EE products that are incapable of it.

Pretty much. OneDrive, Google Drive, and Dropbox are almost certainly all doing this already, and I'd be surprised if Slack, Teams, and the like aren't as well.

WhatsApp have made a big deal about *not* doing this, but they provide an even creepier (IMO) feature for law enforcement: using metadata analysis to de-anonymise the social graph of their users.
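For a sense of what metadata-only analysis buys law enforcement, here is a toy Python sketch; the record format and the names in it are made up, but the point stands: the social graph falls out of who-messaged-whom, without ever reading a message body.

    from collections import Counter

    # (sender, recipient, unix_timestamp) -- message *content* is never
    # inspected; these fields alone reconstruct the social graph.
    records = [
        ("alice", "bob", 1629072000),
        ("alice", "bob", 1629075600),
        ("bob", "carol", 1629079200),
    ]

    edges = Counter((s, r) for s, r, _ in records)
    for (sender, recipient), n in edges.most_common():
        print(f"{sender} -> {recipient}: {n} messages")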

(An even creepier overreach: the T&Cs for some smart TVs - which these days means any TV you can buy - specify that you agree to the TV capturing screenshots and sending them back to base.)

> The only reason this should be controversial is the possibility that it later expands to include more stuff.)

There are a few more things to it than that, in my opinion:

1. Not everyone wants US government orgs setting legal policy for their devices, and CSAM scanning - built around hash lists curated by US organisations - is pretty much that.

2. How are false positives handled? We've seen woo like polygraphs misrepresented (i.e. lied about) in courts, along with other kinds of forensic pseudo-science like ballistics work. The possibility of very serious legal trouble from a misrepresented or misunderstood hash collision is not a comfortable thought (see the sketch after this list).

3. Once it is well understood that this sort of scanning is available, pressure to expand it is inevitable, backed by the same leverage: you provide this facility, or you're out of our market.
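On point 2, here is a toy Python illustration of why hash collisions are a real worry. This is an 8x8 "average hash", vastly simpler than Apple's NeuralHash, but it shows the structural issue: a perceptual hash deliberately discards almost everything about the image, so visibly different images can hash identically.

    def average_hash(pixels: list[list[int]]) -> int:
        """Hash an 8x8 grayscale image: one bit per pixel, set if above the mean."""
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        bits = 0
        for p in flat:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    # A smooth left-to-right gradient and a hard two-tone split: different
    # images, but both reduce to "dark left half, bright right half".
    gradient = [[col * 32 for col in range(8)] for _ in range(8)]
    two_tone = [[200 if col >= 4 else 10 for col in range(8)] for _ in range(8)]

    # Prints 0: the hashes are identical, a full collision.
    print(hamming(average_hash(gradient), average_hash(two_tone)))

Real systems use much larger hashes and accept matches within a small Hamming distance, but the trade-off is the same: the looser the match, the more unrelated images collide.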

Scanning "private" content

Posted Aug 17, 2021 0:45 UTC (Tue) by rodgerd (guest, #58896) [Link]

To follow up on my own comment, this is what Google Drive looks for and enforces: https://support.google.com/docs/answer/148505#zippy=%2Cmi...

