
Scanning "private" content

By Jake Edge
August 11, 2021

Child pornography and other types of sexual abuse of children are unquestionably heinous crimes; those who participate in them should be caught and severely punished. But some recent efforts to combat these scourges have gone a good ways down the path toward a kind of AI-driven digital panopticon that will invade the privacy of everyone in order to try to catch people who are violating laws prohibiting those activities. It is thus no surprise that privacy advocates are up in arms about an Apple plan to scan iPhone messages and an EU measure to allow companies to scan private messages, both looking for "child sexual abuse material" (CSAM). As with many things of this nature, there are concerns about the collateral damage that these efforts will cause—not to mention the slippery slope that is being created.

iPhone scanning

Apple's move to scan iPhone data has received more press. It will check for image hashes that match known CSAM; the database of hashes will be provided by the National Center for Missing and Exploited Children (NCMEC). It will also scan photos that are sent or received in its messaging app to try to detect sexually explicit photos to or from children's phones. Both of those scans will be done on the user's phone, which will effectively break the end-to-end encryption that Apple has touted for its messaging app over the years.

Intercepted messages that seem to be of a sexual nature, or photos that include nudity, will result in a variety of interventions, such as blurring the photo or warning about the content of the message. Those warnings will also indicate that the user's parents will be informed; the feature is only targeted at phones that are designated as being for a child—at least for now. The general photo scanning using the NCMEC hashes has a number of safeguards to try to prevent false positives; according to Apple, it "ensures less than a one in one trillion chance per year of incorrectly flagging a given account". Hash matches are reported to Apple, but encrypted as "safety vouchers" that can only be opened after some number of matching images are found:

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
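
The details of Apple's scheme (threshold secret sharing combined with private set intersection) are beyond a short example, but the threshold gate itself is easy to sketch. The following Python fragment is purely illustrative: the hash values, the threshold, and the function names are hypothetical placeholders, not Apple's implementation.

    # Hypothetical sketch of threshold-gated flagging; not Apple's actual protocol.
    KNOWN_HASHES = {"placeholder-hash-1", "placeholder-hash-2"}  # stand-ins for NCMEC-provided hashes
    THRESHOLD = 30  # Apple has not published the real threshold value

    def needs_manual_review(photo_hashes):
        """Only a match count above the threshold makes the account eligible
        for human review; below it, the "safety vouchers" stay unreadable."""
        matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
        return matches > THRESHOLD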

The Siri voice-assistant and the iPhone Search feature are also being updated to check for CSAM-related queries, routing requests for help reporting abuse to the appropriate resources, while blocking CSAM-oriented searches:

Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

The Electronic Frontier Foundation (EFF) is, unsurprisingly, disappointed with Apple's plan:

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

The EFF post goes on to point to recent laws passed in some countries that could use the Apple backdoor to screen for other types of content (e.g. homosexual, satirical, or protest content). Apple could be coerced or forced into extending the CSAM scanning well beyond that fairly limited scope. In fact, this kind of expansion has already happened to a certain extent:

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.

For its part, Apple has released a FAQ that says it will refuse any demands by governments to expand the photo scanning beyond CSAM material. There is, of course, no technical way to ensure that does not happen. Apple has bowed to government pressure in the past, making some leery of the company's assurances. As Nadim Kobeissi put it:

Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure.

What happens when local regulation mandates that messages be scanned for homosexuality?

It is interesting to note that only a few years ago, Apple itself was making arguments against backdoors with many of the same points that the EFF and many other organizations and individuals have made. As Jonathan Mayer pointed out: "Just 5 years ago, Apple swore in court filings that if it built a capability to access encrypted data, that capability would be used far beyond its original context."

EU scanning

Meanwhile in the EU, where data privacy is supposed to reign supreme, the "ePrivacy derogation" is potentially even more problematic. It allows communication-service providers to "monitor interpersonal communications on a voluntary basis with the purpose of detecting and reporting material depicting sexual abuse of minors or attempts to groom children". It is, of course, not a huge leap from "voluntary" to "mandatory". As might be guessed, the scanning will not be done directly by humans—problematic in its own right—but by computers:

The scanning of private conversations will be conducted through automated content recognition tools, powered by artificial intelligence, but under human oversight. Service providers will also be able to use anti-grooming technologies, following consultation with data protection authorities.

The EU General Data Protection Regulation (GDPR) is a sweeping framework for protecting personal data, but since the start of 2021 it no longer covers messaging services. That kind of communication falls under the ePrivacy directive instead, thus the change allowing scanning is a derogation to it. Patrick Breyer, member of the EU Parliament, has criticized the derogation on a number of grounds. He lists a number of different problems with it, including:

  • All of your chat conversations and emails will be automatically searched for suspicious content. Nothing remains confidential or secret. There is no requirement of a court order or an initial suspicion for searching your messages. It occurs always and automatically.
  • If an algorithm classifies the content of a message as suspicious, your private or intimate photos may be viewed by staff and contractors of international corporations and police authorities. Also your private nude photos may be looked at by people not known to you, in whose hands your photos are not safe.

    [...]

  • You can falsely be reported and investigated for allegedly disseminating child sexual exploitation material. Messaging and chat control algorithms are known to flag completely legal vacation photos of children on a beach, for example. According to Swiss federal police authorities, 86% of all machine-generated reports turn out to be without merit. 40% of all criminal investigation procedures initiated in Germany for “child pornography” target minors.

As Breyer pointed out, there is already proposed legislation to make the scanning mandatory, which would break end-to-end encryption: "Previously secure end-to-end encrypted messenger services such as Whatsapp or Signal would be forced to install a backdoor."

"Safety" vs. privacy

Both of these plans seem well-intentioned, but they are also incredibly dangerous to privacy. The cry of "protect the children" is a potent one—rightly so—but there also need to be checks and balances or the risks to both children and adults are far too high. Various opponents (who were derided as "the screeching voices of the minority" by the NCMEC in a memo to Apple employees) have noted that these kinds of measures can actually harm the victims of these crimes. In addition, they presuppose that everyone is guilty, without the need for warrants or the like, and turn over personal data to companies and other organizations before law enforcement is even in the picture.

As with many problems in the world today, the sexual abuse of children seems an insurmountable one, which makes almost any measure that looks likely to help quite attractive. But throwing out the privacy of our communications is not a sensible—or even particularly effective—approach. These systems are likely to be swamped with reports of completely unrelated activity or of behavior (e.g. minors "sexting" with each other) that is better handled in other ways. In particular, Breyer has suggestions for ways to protect children more effectively:

The right way to address the problem is police work and strengthening law enforcement capacities, including (online) undercover investigations, with enough resources and expertise to infiltrate the networks where child sexual abuse material is distributed. We need better staffed and more specialized police and judicial authorities, strengthening of resources for investigation and prosecution, prevention, support, training and funding support services for victims.

There have long been attacks against encryption in various forms, going back to (at least) the crypto wars in the 1990s. To those of us who lived through those times, all of this looks an awful lot like a step back toward the days of the Clipper chip, with its legally mandated crypto backdoor, and other efforts of that sort. Legislators and well-meaning organizations are seemingly unable to recognize that a backdoor is always an avenue for privacy abuse of various kinds. If it requires screeching to try to make that point—again—so be it.


Index entries for this article
Security/Privacy



Scanning "private" content

Posted Aug 11, 2021 20:51 UTC (Wed) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

And don't forget, with an iPhone you're not in control. You can't modify the OS or even install applications that are not blessed by Apple.

How long do you think we'll have to wait until this scanning feature becomes mandatory for all applications?

Scanning "private" content

Posted Aug 16, 2021 13:08 UTC (Mon) by scientes (guest, #83068) [Link] (1 responses)

Android already has a feature that completely blocks out non-Google Play apps.

Scanning "private" content

Posted Aug 18, 2021 11:32 UTC (Wed) by Jandar (subscriber, #85683) [Link]

> Android already has a feature that completely blocks out non-Google Play apps.

Can you elaborate? On my non-rooted Android phone I often install apps from F-Droid.

Scanning "private" content

Posted Aug 11, 2021 21:25 UTC (Wed) by pebolle (guest, #35204) [Link] (9 responses)

> [X] and other types of [X] are unquestionably heinous crimes; those who participate in them should be caught and severely punished.

Note "unquestionably", "heinous crimes", "should be caught" and "severely punished".

X being one of the four horsemen of the information apocalypse. Say: terrorists, pedophiles, drug dealers, and money launderers. Bruce Schneier suggested: terrorists, drug dealers, kidnappers, and child pornographers. Any other suggestions?

Anyhow: why bother continuing after that sentence? For abstractions like privacy, due process and rule of law?

Scanning "private" content

Posted Aug 12, 2021 7:12 UTC (Thu) by smurf (subscriber, #17840) [Link] (8 responses)

That.

But even if this idea should stay limited to CSAM, which it most likely won't be if history is any indication, and even if due process and whatnot is followed when my phone sics the police on the content of my phone, which is unlikely to happen either, there are problems galore here.

For one, define "child". The idea that everybody a day shy of their 18th-or-whichever birthday should have no rights to their own body, or to decide what they and/or somebody else are allowed to do with it, is certifiable, but all too real in some parts of the world. Thus "abuse" can be, and sometimes is, a catch-all for basically anything, given that consent is presumed to be impossible.

Similarly, define "sex", given that some puritan busybodies get off on their righteous indignation when they see a perfectly innocent picture of a nude child at the beach, let alone a teen on their way through puberty.

Also, possession is a crime. Nice. So, somebody sends me a disgusting pic via Whatsapp? It stays in my media folder pretty much forever, with no visible clue that it's not mine. That brainless scanner won't know either.

Next, how is that thing even trained? Will it be confronted with a ton of innocent, or not-so-innocent-but-legal-in-this-context, control images? The set of "bad" images it's going to be trained on cannot be made public for obvious reasons, but it's Apple's job to come up with a way of verifying the accuracy of this.

In fact, did Apple even *think* of the implications of unleashing this on their users? or is it just a placate-the-legal-beagles move?

Scanning "private" content

Posted Aug 12, 2021 8:05 UTC (Thu) by taladar (subscriber, #68407) [Link] (4 responses)

There is also the issue of verification. Say some government entity sends the service provider who does the scanning a set of hashes. How do they know these are hashes of CSAM? They might also be hashes of pictures of political activists protesting that government, or pictures of embarrassing programs that the government wants to keep secret by finding the journalists who get their hands on them before they have a chance to publish.

To me this focus on pictures and videos seems rather telling in any case.

Shouldn't the primary focus be to prevent the children from being hurt? Wouldn't that be best achieved by monitoring parents, priests, youth group leaders, sports coaches and similar people who actually have regular access to children and have historically been the main group of perpetrators? Wouldn't it make more sense to limit their right to privacy before limiting everyone's (not that I think limiting anyone's right to privacy is a good idea)?

On the other hand if your primary goal is surveillance and children are just the excuse then it makes much more sense to focus on the digital realm over the physical environment of the children.

Scanning "private" content

Posted Aug 12, 2021 21:53 UTC (Thu) by gmaxwell (guest, #30048) [Link] (1 responses)

They put a tremendous amount of engineering into it to cryptographically shield themselves and their list providers from accountability, too. I find this extremely concerning.

The obvious construction for such a system would simply deliver to the DRM-locked-down devices a list of hashes to check against and self-report. But that construction would make it possible for people to perform a web crawl and check it against the list, which would have a fighting chance of exposing instances of inappropriately included popular images. It would make it much harder to secretly use the system to target particular religions, ethnicities, or political ideologies...

But, instead, they use a complex and computationally expensive private set intersection to make sure that the users can't learn *anything* about the list they're being checked against except an upper bound on its size.
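
To make the contrast concrete, the "obvious construction" described above might look like the sketch below. It is a simplification under stated assumptions: the file names are made up, and it uses an exact SHA-256 hash where a real system would use a perceptual hash, but it shows why a published list is auditable by outsiders.

    import hashlib

    def file_hash(path):
        # Exact cryptographic hash for simplicity; a deployed system would use
        # a perceptual hash so that re-encoded copies still match.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def audit(published_hashes, crawled_images):
        """Check a crawl of popular, clearly legal public images against the
        published list; any hit would expose an inappropriately included entry."""
        return [p for p in crawled_images if file_hash(p) in published_hashes]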

Scanning "private" content

Posted Aug 16, 2021 11:02 UTC (Mon) by immibis (subscriber, #105511) [Link]

One could more charitably guess that they don't want to send the set to the device because the people who enjoy this kind of content could then simply Google the hashes to find more of it.

Scanning "private" content

Posted Aug 12, 2021 22:55 UTC (Thu) by tialaramex (subscriber, #21167) [Link] (1 responses)

In fact Apple trust "a database of known CSAM image hashes provided by NCMEC and other child safety organizations".

While NCMEC gets a name check, "other child safety organizations" is entirely for Apple to define. Nobody _even at Apple_ has an insight into what the actual images are, and if they allow some unnamed "child safety organization" to submit one or more hashes that organization gets indirect access to Apple's devices and users.

The "verification" step is also concerning. These are perceptual hashes, meaning they recognise images that are similar to a target in a way determined by an algorithm, likely a proprietary algorithm. So, unavoidably there can be false positives and you must actually verify matches. Apple of course doesn't have the original images, and certainly doesn't want to hire people to look at probably illegal images, so instead this is done by sending the matching image off to the same organizations which generated the hashes... The effect is that it may be possible to provide Apple with hashes that match most photographs of a dissident's face, and then you'll be sent any new photographs of that dissident for "verification". You can confidently inform Apple that these were false positives, not CSAM, and needn't be blocked, and of course Apple have no way to determine whether you kept a copy of the images or acted on what you saw...
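
To illustrate what "perceptual" means here (Apple's NeuralHash is proprietary and far more sophisticated; this is the classic "average hash" technique, shown only to make the false-positive point concrete), a minimal sketch using Pillow:

    # Average hash (aHash): similar-looking images tend to produce hashes
    # within a small Hamming distance, which is both the feature and the
    # source of false positives.
    from PIL import Image

    def average_hash(path, size=8):
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        return int("".join("1" if p > avg else "0" for p in pixels), 2)

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # A "match" is typically declared below some distance threshold, e.g.:
    # hamming(average_hash("original.jpg"), average_hash("recompressed.jpg")) <= 5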

It's probably more reasonable to think of Apple and other proponents of these systems as useful idiots rather than part of a conspiracy. They feel they ought to do something, and this is something, so they feel they ought to do this. They believe that because they have good intentions, the arguments against this system are null - they know what they're doing is good. The problem is that they in fact cannot know whether what they're doing is good, the system intentionally blinds them and that ought to be a red flag.

Even the wider campaigns on this topic are most likely from useful idiots rather than deliberate enablers. Their idea is that if we can stop copies of an image from existing, the thing depicted in the image is erased too; this is an obviously false belief about the universe, but it's actually really commonly wished for. In many countries some such CSAM databases include images of events that never took place, or of children who never existed. They're of no value for prosecuting a hypothetical perpetrator of sexual assault because the offence never actually occurred - it'd be like showing video of a popular "death match" first-person multi-player video game as evidence for a murder charge - but whereas you won't get anywhere with that, prosecuting possession of a copy of these images really means somebody goes to jail to "protect" the imaginary children from imaginary abuse. Like prosecuting vagrancy, this seems like an obviously bad idea but is really popular.

Scanning "private" content

Posted Aug 13, 2021 1:20 UTC (Fri) by NYKevin (subscriber, #129325) [Link]

> Nobody _even at Apple_ has an insight into what the actual images are,

To be fair, under US law, it is illegal for Apple to possess copies of the actual images. I share your concerns about transparency here, but there are legal limits to what Apple can do.

> They feel they ought to do something, and this is something, so they feel they ought to do this.

I think you're short one level of abstraction. It's not necessarily the case that Apple themselves believe this. They may know perfectly well that this is a Bad Idea, but nevertheless promote it to take the wind out of the sails of some of the anti-encryption arguments. They may believe (rightly or wrongly) that the perfect ("nobody ever looks at the user's data") is the enemy of the good ("the user's data is only looked at under very narrow circumstances, and is otherwise E2E encrypted and beyond reach"). But of course, we won't know if that is actually true unless and until Apple introduces E2E encryption for iCloud (which as I understand it, is currently not a thing). Nevertheless, the system does seem to be very conveniently designed in such a way that it won't break if E2E encryption is introduced later (because it happens on the client, and not on Apple's servers).

Bear in mind, if Apple doesn't do *something*, then sooner or later one of the big governments is going to simply outlaw E2E encrypted user products altogether. They may see this as the least-worst alternative to that. Of course, that doesn't mean they're right, just that they probably do have a fairly comprehensive understanding of what they are actually doing and how it plays in the broader political context.

Scanning "private" content

Posted Aug 15, 2021 11:04 UTC (Sun) by k8to (guest, #15413) [Link] (2 responses)

These problems aren't theoretical either. "Protecting minors" has been used systematically by law enforcement to attack, harass, and incarcerate gay men for totally innocuous shit all over the USA.

I know people who were threatened with jail for, you know, horrible things like crossing state lines to see a minor. You know, an 18-year-old driving 30 minutes to hang out with a 17-year-old who considered themselves boyfriends.

There will always be an out group who can be attacked via stuff like this.

Scanning "private" content

Posted Aug 16, 2021 13:11 UTC (Mon) by scientes (guest, #83068) [Link] (1 responses)

In these cases the problem is not bad law but bad people (and I am not talking about the criminals).

Scanning "private" content

Posted Aug 16, 2021 15:49 UTC (Mon) by k8to (guest, #15413) [Link]

Technology that opens the door to this type of abuse is a problem. The laws that criminalize totally reasonable behavior are a problem. And yes, the law enforcement full of predators is multiple problems.

Scanning "private" content

Posted Aug 11, 2021 21:28 UTC (Wed) by philipstorry (subscriber, #45926) [Link] (10 responses)

I get the feeling that we're going to discover a whole heap of edge cases from this.

For example, most sexual predators are part of the community or even extended family. What if they're adding photos from social media to their photo sets? Suddenly someone sends an old picture of their kid at the beach to someone, and BANG! Hash match, and you've got a problem. In some justice systems this will be handled well, in others it will be handled terribly.

Can you imagine people being told "We know you're abusing that child, Apple tells us so. Take this guilty plea and it'll be easier for you." Because I really wouldn't bet against that happening.

This has the potential to ruin lives in new and awful ways.

Scanning "private" content

Posted Aug 12, 2021 9:08 UTC (Thu) by nim-nim (subscriber, #34454) [Link] (9 responses)

The less democratic a state, the less it cares about side effects and treating everyone fairly.

An authoritarian state will much prefer the ability to harshly repress a minority of opponents over the ability to detect and police what it considers petty crimes. And, in fact, being able to repress at will relies on the existence of a large number of unprosecuted offenses (real or false positives). This way you have ready-to-be-opened cases against a large part of the population.

So, depending on your objectives, lack of fairness and a huge number of false positives are not a bug but a feature.

Finally, the nice thing about algorithms is that they do not have a conscience and won't protest overreach. You only need to care about the subset of people involved in prosecutions. With traditional manual data collection, any of the army of data-collecting bureaucrats can turn whistleblower.

Scanning "private" content

Posted Aug 12, 2021 13:30 UTC (Thu) by ldearquer (guest, #137451) [Link] (8 responses)

>> The less democratic a state, the less it cares about side effects and treating everyone fairly.

Similar phrasing could be used for a careless state, one which applies populist measures welcomed by a majority regardless of the harm caused to smaller groups, or to groups without direct democratic representation (as children are).

No one asked children if they were ok with all the technology changes of the last decades, considering they could arguably make them more vulnerable.

Privacy is a good thing, but it is not the absolute most important asset of humanity. Maybe it is in the top 10, but not in the top 5.

I would give away my online privacy any day *if* that would reduce child abuse.

Note this is a generic take on the problem, not a defense of this specific way of doing things.

Scanning "private" content

Posted Aug 12, 2021 14:35 UTC (Thu) by nim-nim (subscriber, #34454) [Link] (7 responses)

But can a careless state be considered democratic? The sole advantage of democracies over other regimes is that they are supposed to care about everyone equally (one head = one vote).

It seems to me that assigning unequal political weights to citizens (caring less about some than others, breaking the one head = one vote rule) is a fast path to something else.

Scanning "private" content

Posted Aug 12, 2021 17:18 UTC (Thu) by ldearquer (guest, #137451) [Link] (6 responses)

I hate to disagree, but that is a naive vision of democracy.

A state is democratic as long as the government is elected by the people or their representatives. That's it. Greeks already realized long ago how things could still go wrong.

Also note how the one head=one vote still excludes children. Not that it is easy to get around (direct children votes could be a disaster), but IMO this has an effect on politics.

Democracy

Posted Aug 12, 2021 19:33 UTC (Thu) by tialaramex (subscriber, #21167) [Link] (5 responses)

There's a long history of expanding the franchise with people incorrectly predicting drastic consequences if this is attempted and then nothing interesting happening.

Just 200 years ago, England and Wales had a completely baroque electoral system, where in one place the Member might be elected by more or less anybody who had an opinion (conventionally not by women but it wasn't technically illegal, records do not indicate any women _trying_ to vote under this system) and in others only by members of a group actually _paid_ by the current Member (which is scarcely any sort of democracy at all), each district could have its own rules and most of them did.

Great Reform shook that up, introducing a baseline that would allow a typical middle aged skilled worker or professional of the era to meet requirements to register as a voter -- but it didn't result in a great upheaval, and over the rest of that century if anything the aristocracy actually expanded their control, because it turns out rich merchants can buy 100 votes in a district where only 150 people can vote anyway, but greatly expanding the franchise makes this game unaffordable for them - whereas if people vote for you, Sir Whatever just because they always voted for you, or your father (also Sir Whatever) before you, that's not difficult to maintain when more of them qualify to vote.

100 years ago, a lot of women could vote, though far more men could (all men over 21); the introduction of widespread voting by women was seen as potentially very disruptive. In fact, the women proved no more spirited than the men, and more or less the same people (overwhelmingly men) were elected. Even when women were given the exact same rights (all women over 21 too) it made no major difference.

In the middle of the 20th century the UK got around to abolishing plural voting (yes, right up until 1948 "one man, one vote" literally wasn't the rule although plural voting had been somewhat neutered after Great Reform) and only in 1969 did they lower the age to 18. The newly enfranchised teenagers did not in fact tear down grown-up government, and things still continued more or less as before.

Among obvious groups that still ought to be enfranchised: Convicted criminals from prison -- at least all those convicted of crimes unrelated to the functioning of democracy, and frankly probably those too unless you're bad at prisons and can't keep them from tampering with the vote from inside a prison; Children -- certainly all teenagers and there's no particular reason not to enfranchise any child that seems to actually have a preference, their preferences certainly can't be less _informed_ than those of adult voters so why not?

Overall, given that the only function of democracy is to avoid the alternative (violent transitions of power) why shouldn't we let toddlers help if they want to?

Democracy

Posted Aug 13, 2021 22:25 UTC (Fri) by jkingweb (subscriber, #113039) [Link] (2 responses)

> Among obvious groups that still ought to be enfranchised: Convicted criminals from prison -- at least all those convicted of crimes unrelated to the functioning of democracy, and frankly probably those too unless you're bad at prisons and can't keep them from tampering with the vote from inside a prison

Here in Canada this is happily already the case after the Supreme Court deemed the restrictions of the time unjustified (though not unjustifiable: the government simply failed to make their case).

> Children -- certainly all teenagers and there's no particular reason not to enfranchise any child that seems to actually have a preference, their preferences certainly can't be less _informed_ than those of adult voters so why not?

Ensuring that a toddler can vote secretly, in safety, and free from coercion by their guardians all at the same time would probably be a challenge, but I agree 18 years is definitely not a magically appropriate age.

Democracy

Posted Aug 14, 2021 15:14 UTC (Sat) by BirAdam (guest, #132170) [Link] (1 responses)

I could easily be convinced that 18 is too young for a person to be trusted to vote. I could equally easily be convinced that 14 is old enough. This depends entirely upon the person in question and his/her relative life experience, intelligence, and sense of caution. When rules are made for a large society, any type of individual consideration is removed due to expediency.

Democracy is not good in itself. Many dictators and horrible people have been elected to office. In the current USA, citizens are no longer guaranteed trial by jury, or even their right to life. Selling loose cigarettes? Death penalty. Your father was a US citizen but also Muslim? Death penalty (Abdul Rahman and Nawar al-Awlaki).

Part of my dislike for Apple’s move here is based upon these exact considerations. Even in supposedly free countries, freedoms are frequently done away with in the interest of stopping something horrible. Today, this is child exploitation which is far from well defined. Tomorrow, it will be the even more weakly defined “terrorism,” where terrorist is truly just whomever the state deems bad.

Democracy

Posted Aug 14, 2021 16:58 UTC (Sat) by mpr22 (subscriber, #60784) [Link]

> In the current USA, citizens are no longer guaranteed trial by jury, or even their right to life.

This phrasing suggests the existence of some prior state of the union in which they were.

And, well.

Neither the Federal government nor any state in the union has ever seriously considered forbidding the use of lethal weapons by law enforcement officers.

Democracy

Posted Aug 16, 2021 11:09 UTC (Mon) by immibis (subscriber, #105511) [Link] (1 responses)

> There's a long history of expanding the franchise with people incorrectly predicting drastic consequences if this is attempted and then nothing interesting happening.

Indeed, there is a long history of *people doing anything at all* with people incorrectly predicting drastic consequences if this is attempted and then nothing interesting happening.

It is entirely *possible* that Apple stops at child porn, and only ever detects child porn, and the hashes are good enough that false positives are rare and random.

Democracy

Posted Aug 16, 2021 12:33 UTC (Mon) by smurf (subscriber, #17840) [Link]

Possible, yes. Highly unlikely, also yes.

Scanning "private" content

Posted Aug 11, 2021 21:39 UTC (Wed) by dskoll (subscriber, #1630) [Link] (12 responses)

It seems to me this violates the prohibition against unreasonable search. Even if it's Apple doing the searching, it's effectively been deputized by the government to do that.

Anyway, welcome to dystopia. As a technologist, I'm really starting to think that (despite his being a mass murderer) the Unabomber might have had some good points in his manifesto. Technology is looking more and more negative.

Scanning "private" content

Posted Aug 12, 2021 21:40 UTC (Thu) by gmaxwell (guest, #30048) [Link] (11 responses)

Because the scanning is done by Apple of its own free will and for its own commercial benefit, and not at the behest of the government, they've not been deputized. They gain the ability to do so via the user's consensual interaction with them. The courts have been unequivocal -- if the government mandated or even incentivized this scanning, it would require a warrant.

The reason that the fact that they're using a (effectively) government provided list doesn't cause a problem is because they have a human manually review the content (which is why Apple includes that step-- without it the 'search' wouldn't happen until the NCMEC opened the files, which would require a warrant).

Scanning "private" content

Posted Aug 13, 2021 12:55 UTC (Fri) by dskoll (subscriber, #1630) [Link] (9 responses)

OK, but if someone is charged based on what Apple finds, wouldn't Apple's evidence be inadmissible since it was obtained without a warrant? The United States has the Fruit of the Poisonous Tree doctrine, so I can certainly envision a situation in which Apple makes it easier for someone to dodge possession of child pornography charges simply because of the tainted evidence.

I'm not a lawyer, but would be very interested to hear what lawyers have to say about this.

Scanning "private" content

Posted Aug 13, 2021 16:59 UTC (Fri) by NYKevin (subscriber, #129325) [Link] (8 responses)

Nope, they're going to rely on the old standby, the third-party doctrine.

Historically, the rule was that you didn't have any legally-defensible privacy interest whatsoever in any information which you voluntarily disclosed to someone else. In Carpenter v. United States, the Supreme Court said hey, wait a minute, that's going to allow all sorts of mass surveillance of people's location data, so maybe we should make an exception for pervasive monitoring that can't be reasonably avoided. But iCloud is definitely not going to fall into that exception, because you have the option of not using it (perhaps surprisingly, the Supreme Court was prepared to recognize that most people don't realistically have the option of *not* owning a cell phone, but they are not going to extend that to iCloud specifically).

Scanning "private" content

Posted Aug 13, 2021 21:15 UTC (Fri) by dskoll (subscriber, #1630) [Link] (5 responses)

Huh, interesting. So what about people who already own iPhones? Can they get a refund because Apple has fundamentally changed the terms of the agreement? (I doubt it, sadly.)

What if Google also did this? Realistically, if you're going to own a cell phone, it's either going to be iOS or Android. Remaining devices have a minuscule market share.

Scanning "private" content

Posted Aug 15, 2021 13:13 UTC (Sun) by k8to (guest, #15413) [Link] (1 responses)

Google does this. At least the camera app pushes images to the cloud by default (backup features or some shit) and they definitely scan that data for CP.

I uninstall all the Google image-related apps for this reason, but who knows when the mechanisms will change, so for the most part I don't keep any images on my phone. Who knows when the overreach will change or when something will be falsely flagged.

Scanning "private" content

Posted Aug 15, 2021 13:16 UTC (Sun) by k8to (guest, #15413) [Link]

Err, I meant the Photos app, not Camera. There are too many similar-sounding apps.

Scanning "private" content

Posted Aug 16, 2021 20:03 UTC (Mon) by NYKevin (subscriber, #129325) [Link] (2 responses)

> Huh, interesting. So what about people who already own iPhones? Can they get a refund because Apple has fundamentally changed the terms of the agreement? (I doubt it, sadly.)

Doubtful, but they could disable iCloud photo uploading, at which point (as far as I can tell) they would no longer be subject to this scanning. At least until Apple changes the policy again.

(If you think that Apple *won't* change their policy again, then this whole controversy is a complete nothingburger, because they were very likely already doing some sort of CSAM scanning on the server side anyway. That's basically standard practice in the industry, barring E2EE products that are incapable of it. The only reason this should be controversial is because of the possibility that it later expands to include more stuff.)

Scanning "private" content

Posted Aug 16, 2021 23:04 UTC (Mon) by rodgerd (guest, #58896) [Link] (1 responses)

> they were very likely already doing some sort of CSAM scanning on the server side anyway. That's basically standard practice in the industry, barring E2EE products that are incapable of it.

Pretty much. OneDrive, Google Drive, Dropbox will almost certainly all be doing this, and I'd be surprised if Slack, Teams, and so on likewise don't.

WhatsApp have made a big deal about *not* doing this, but they provide an even (IMO) creepier feature for law enforcement, which is using metadata analysis to de-anonymise the social graph of their users.

(Even creepier overreach is that the T&Cs for some smart TVs - which is any TV you can buy now - specify that you agree to the TV screen capping and sending the screen caps back to base)

> The only reason this should be controversial is because of the possibility that it later expands to include more stuff.)

There's a few more things to it than that, in my opinion:

1. Not everyone wants US government orgs setting legal policy for their devices. CSAM is pretty much that.

2. How are false positives handled? We've seen woo like polygraphs misrepresented (i.e. lied about) in courts, along with other types of forensic pseudo-science like ballistics work. The possibility of very serious legal trouble from a misrepresented or misunderstood application of hash collisions is not a comfortable thought.

3. Once it is well-understood that this sort of scanning is available, pressure to expand is inevitable, with the same leverage: you provide this facility, or you're out of our market.

Scanning "private" content

Posted Aug 17, 2021 0:45 UTC (Tue) by rodgerd (guest, #58896) [Link]

To follow up on my own comment, this is what Google Drive looks for and enforces: https://support.google.com/docs/answer/148505#zippy=%2Cmi...

Scanning "private" content

Posted Aug 15, 2021 16:46 UTC (Sun) by giraffedata (guest, #1954) [Link] (1 responses)

In addition, the Fruit of the Poisonous Tree rule in the US applies only to searches by the government.

If I suspect my neighbor, so I break into his house and find a bloody shirt that ties the neighbor to a murder and I take that shirt to the prosecutors, they can use that in court.

Here's why: My neighbor already has protection against me doing this -- I can go to jail for it. He really doesn't have any protection against the government doing it; letting people in his situation get away with a crime seems to be the only effective incentive to the government not to do it.

If prosecutors asked me to break in (or even strongly suggested or encouraged it), that would be different.

Scanning "private" content

Posted Aug 16, 2021 11:07 UTC (Mon) by immibis (subscriber, #105511) [Link]

> If prosecutors asked me to break in (or even strongly suggested or encouraged it), that would be different.

Many commenters are speculating that Apple made this decision because the alternative was for the government to demand an end to encryption altogether.

Scanning "private" content

Posted Aug 13, 2021 14:01 UTC (Fri) by mathstuf (subscriber, #69389) [Link]

> The reason that the fact that they're using a (effectively) government provided list doesn't cause a problem is because they have a human manually review the content (which is why Apple includes that step-- without it the 'search' wouldn't happen until the NCMEC opened the files, which would require a warrant).

According to this post[1], 18 U.S.C. § 2258A is clear that the *only* legal way to transmit suspected CSAM is to NCMEC. Not the FBI, not the local police, and certainly not Apple. Apple sending themselves content which has a "1 in a trillion" (not that I believe they have the numbers to back up such a claim, but let's go with their PR here) chance of *not* being CSAM is blatantly illegal here.

[1] https://www.hackerfactor.com/blog/index.php?/archives/929...

Scanning "private" content

Posted Aug 12, 2021 0:52 UTC (Thu) by jkingweb (subscriber, #113039) [Link] (3 responses)

Software installed on my property which acts as an agent for someone else and could land me in jail? Sounds like malware to me...

We should not tolerate software that acts contrary to the interest of the owner.

Scanning "private" content

Posted Aug 12, 2021 20:36 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (1 responses)

> We should not tolerate software that acts contrary to the interest of the owner.

That's why we need free software.

Scanning "private" content

Posted Aug 13, 2021 22:26 UTC (Fri) by jkingweb (subscriber, #113039) [Link]

Moreover, it's why we need open hardware, which is becoming less common with the passage of time, not more.

Scanning "private" content

Posted Aug 12, 2021 21:44 UTC (Thu) by pizza (subscriber, #46) [Link]

> We should not tolerate software that acts contrary to the interest of the owner.

I'd argue that the software of which you speak is very much acting in the interest of the owner.

The user is not the owner.

Meanwhile, most folks don't actually care at all about "software" in and of itself -- because the software is useless without a service running on someone else's computers.

Scanning "private" content

Posted Aug 12, 2021 0:53 UTC (Thu) by azure (guest, #112903) [Link] (2 responses)

Ever since this came out, I've been suspecting this is Apple's pushback against anti-monopoly interventions. If users can sideload whatever apps they like, won't they be able to bypass whatever scanning Apple puts in place?

Scanning "private" content

Posted Aug 12, 2021 1:18 UTC (Thu) by wahern (subscriber, #37304) [Link] (1 responses)

Similarly but maybe slightly more plausible is that Apple was facing down an imminent demand by government(s) to weaken iPhone security. Apple unilaterally proffers this as a compromise. (Not singularly, though; Apple has had to acquiesce in other areas, like encrypted backups.)

It pushes people who exchange child porn onto other, less secure platforms, making it easier for the government to prosecute marginally more people. (Ditto for some other criminal behaviors if they trust Apple less.) Far more importantly it disarms the government of the cudgel of child porn in its political campaign to weaken iPhone security. And for the same reasons child porn is such a great cudgel--i.e. "I've got nothing to hide" is more persuasive when the topic is child porn--Apple likely considers the cost/benefit to its credibility acceptable, especially presuming the alternative Apple faced was more direct and substantial weakening of iPhone security (e.g. key escrow) that lacked inherent subject matter limitations.

Scanning "private" content

Posted Aug 12, 2021 20:44 UTC (Thu) by LtWorf (subscriber, #124958) [Link]

On a false positive (say, of my kid at the beach) an Apple employee will view the content. I don't think it's anywhere near acceptable for this to happen.

And for all I know, a person seeking a job to look at pics of kids might have an interest in choosing the profession…

In my opinion they are starting with this, since most people see it as an acceptable measure to protect children, but will soon after move to look for copyrighted content.

Scanning "private" content

Posted Aug 12, 2021 6:56 UTC (Thu) by fenncruz (subscriber, #81417) [Link]

Have Apple just created a way to DoS Apple users? A malicious user sends an Apple user enough abuse images to trigger the threshold. The account gets locked until a human gets through their backlog of cases (is reviewing child abuse images really a job many people want or would stay in for a long time?). The account is locked for maybe days or weeks until the appeal is heard and decided on.

Cloud = big brother

Posted Aug 12, 2021 8:12 UTC (Thu) by nim-nim (subscriber, #34454) [Link] (11 responses)

Just a few more steps and global Big Brother implementation will be complete.

It will surprise no one except Americans, with their blind faith in private companies, that the end result of handing data keys to an oligopoly of cloud giants is, first, the processing of that data for the giants' own needs, and second, the extension of that processing to all kinds of public and private third-party demands.

The data trove is just too tempting and there are too few people needing to be leaned on to breach all safeguards.

It has been a long time in the making: refusing to break up companies that were overreaching, helping them marginalize independent data clients (Firefox), own the data pipes (https hardening), and take over the software market (open source vs. free software, i.e. letting the corporations with the most money take over dev projects and open-core them).

Unfortunately at this stage nothing short of a huge scandal or the USA losing a major war can reverse the process. Delegating surveillance to a few giants is just too convenient for the US government and it will shield them from any serious challenge.

Cloud = big brother

Posted Aug 12, 2021 11:05 UTC (Thu) by alfille (subscriber, #1631) [Link] (5 responses)

I'm not sure I understand nim-nim's point entirely. Trust in big companies is not exclusive to the USA, nor are other vectors of intrusive monitoring (governments, cults, schools) any better. In fact the priority of personal freedom is being eroded globally.

Cloud = big brother

Posted Aug 12, 2021 13:07 UTC (Thu) by nim-nim (subscriber, #34454) [Link] (4 responses)

The USA is weird inasmuch as it is ultra-sensitive to anything done by the state and ultra-tolerant of anything done by private companies.

In other parts of the world the same rules apply to both private and public sectors.

Cloud = big brother

Posted Aug 13, 2021 2:52 UTC (Fri) by alfille (subscriber, #1631) [Link] (3 responses)

There is certainly some truth to the thesis that government is more distrusted in US culture than in Europe. The counter to that is that being subjected to a corporate choice is usually not compulsory; you can vote with your wallet and move to another platform. Changing government is more difficult.

And the motives of a corporation are clearer -- profit. What is the motive for a government? Staying in power, control, and a complex mix of competing lobbying groups.

At least that is the view of distrustful Americans.

Cloud = big brother

Posted Aug 13, 2021 11:39 UTC (Fri) by kleptog (subscriber, #1183) [Link] (2 responses)

> Changing government is more difficult.

Is it though? I have three countries within two hours travel which I can move to if I wanted to. Getting away from local or provincial government is even easier. Getting away from Google, Amazon, Apple, etc OTOH...

For both government and business trust needs to be earned. And frankly I trust my government way more than Google, though I can understand this doesn't apply to everyone. And the US political system is particularly... opaque.

Cloud = big brother

Posted Aug 15, 2021 13:22 UTC (Sun) by k8to (guest, #15413) [Link]

It's a false narrative for sure, though I think accurately described. Government can be influenced by your votes, organizing, and lobbying. Corporations there's no guarantee you have any influence at all.

Cloud = big brother

Posted Aug 17, 2021 16:45 UTC (Tue) by NYKevin (subscriber, #129325) [Link]

> I have three countries within two hours travel which I can move to if I wanted to.

Ha. Americans have that at the state level, in some parts of the country, but if you dislike what the federal government is doing, your nearest options are Mexico or Canada. Because the US is simply huge, at least one of those options is guaranteed to be unreasonably far away.

Also, we don't have a Schengen-like-arrangement with either of them, so you'd have to go through the whole immigration process, which can take months or years.

Cloud = big brother

Posted Aug 12, 2021 16:27 UTC (Thu) by marcH (subscriber, #57642) [Link] (4 responses)

> It will surprise no one except Americans with their blind faith in private companies ...

I agree many Americans distrust "government" more than private monopolies, which seems indeed naive / ideological. Democratic governments tend to have at least Hanlon's razor on their side, whereas private monopolies are much more efficient at screwing us. However:

- It does not follow that Americans have a blind faith in private companies! Think: "lesser of two evils".
- In this particular case it's really the combination of governments _and_ monopolies at work.

Cloud = big brother

Posted Aug 14, 2021 15:18 UTC (Sat) by BirAdam (guest, #132170) [Link] (3 responses)

A lesser evil is still evil.

Cloud = big brother

Posted Aug 16, 2021 9:16 UTC (Mon) by marcH (subscriber, #57642) [Link] (2 responses)

What many people miss is the possibility of these two evils cancelling each other out, more specifically a very carefully balanced regulation trimming the worst excesses of the private sector. This includes, of course, anti-trust regulations to ensure some competition, but not only that. Also known for a very long time as "checks and balances".

"Carefully balanced", numbers, science and complexity are unfortunately all dead; simple people watching cable news and social media demand yes/no answers.

Is government evil? Yes / No.
Is Big Tech/Pharma/Oil/... evil? Yes / No.

Afraid even the words "less" and "more" are gone.

Etc.

Cloud = big brother

Posted Aug 16, 2021 9:46 UTC (Mon) by Wol (subscriber, #4433) [Link] (1 responses)

> "Carefully balanced", numbers, science and complexity are unfortunately all dead; simple people watching cable news and social media demand yes/no answers.

I think you mean simple JOURNALISTS. Many people probably would like more detail, unfortunately the gutter press has learnt that a lot of people enjoy watching journalists and politicians fighting, and as always, the bad drives out the good - decent investigative journalism has died ...

Cheers,
Wol

Cloud = big brother

Posted Aug 17, 2021 16:21 UTC (Tue) by marcH (subscriber, #57642) [Link]

> I think you mean simple JOURNALISTS.

I don't and I respectfully disagree. I think journalism is one of the markets where there is pretty decent competition and where crap wins because that's what most people "consume" preferably, not because of a lack of hard working (but underpaid) journalists trying to do the Right Thing. Because crap is emotional, "infotaining" or free or all of the above. Even when we understand that "if it's free, we're the product" it's still hard to resist and easy to fall for it.

e2e encryption

Posted Aug 12, 2021 11:15 UTC (Thu) by zdzichu (subscriber, #17118) [Link] (1 responses)

> both of those scans will be done on the user's phone, which will effectively break the end-to-end encryption

If the scan is done on the user's phone, it *does not* break end-to-end encryption, as the phone is one of the "ends".

On the other hand, Apple collecting CP, even for "review purposes", has dubious legal status.

e2e encryption

Posted Aug 12, 2021 21:54 UTC (Thu) by gmaxwell (guest, #30048) [Link]

That's a little like arguing that e2e encryption wouldn't be broken by requiring everyone only use the encryption key "password". :) It's technically true, but it undermines the purpose of the encryption.

Scanning "private" content

Posted Aug 12, 2021 11:37 UTC (Thu) by sam.thursfield (subscriber, #94496) [Link] (1 responses)

Seems like a good time to run an encrypted email service based in somewhere not subject to EU law, like Switzerland for example.

Scanning "private" content

Posted Aug 12, 2021 12:14 UTC (Thu) by smurf (subscriber, #17840) [Link]

How does that help when the scanner is embedded in your phone's OS?

Scanning "private" content

Posted Aug 12, 2021 11:37 UTC (Thu) by kleptog (subscriber, #1183) [Link]

> According to Swiss federal police authorities, 86% of all machine-generated reports turn out to be without merit

All issues aside, that's actually way better than I expected. Any fraud analyst would be quite happy with that kind of accuracy given the volume of data involved.

Scanning "private" content

Posted Aug 12, 2021 18:23 UTC (Thu) by tchernobog (guest, #73595) [Link] (2 responses)

So, everything I do as a law-abiding citizen in the EU can be viewed by strangers, and I can be flagged by an AI for disseminating child pornography because my 4-year-old goes to the beach naked. And all my communication software will need to be adapted to permit this, including my personal mail server (since I don't use GMail).

Whereas, a criminal *really* disseminating child pornography will use secure end to end encryption since they don't care about "the law".

Doesn't anybody else in the EU parliament see the problem with this?

Scanning "private" content

Posted Aug 13, 2021 9:12 UTC (Fri) by james (subscriber, #1325) [Link]

  1. Most criminals, like the rest of us, are not particularly technically proficient. Catching the incompetent ones is worthwhile -- ideally, because it frees up resources to concentrate on the competent ones.
  2. EU governments are quite capable of coming up with "secure" end-to-end communications for criminals to use where the police can evade the encryption: see Encrochat. It's actually quite difficult to protect against that.
  3. One of the problems with secret communications between criminals is that it is only as strong as its weakest link, which is likely to be the least technical member: the ones who will copy "interesting" images onto their phones.

Scanning "private" content

Posted Aug 17, 2021 16:52 UTC (Tue) by NYKevin (subscriber, #129325) [Link]

> because my 4 years old goes to the beach naked.

There has been some confusion on this point, so to be clear: The phrase "visually similar," in this context, refers to things like cropping, adding/removing a watermark, greyscale/color, hue/saturation, etc., not the https://xkcd.com/1425/ problem. If you take a brand-new photo, regardless of the subject matter, this technology is not intended* to find that photo.

* False positives exist.

Scanning "private" content

Posted Aug 12, 2021 19:19 UTC (Thu) by albertgasset (subscriber, #74366) [Link]

> The EU General Data Protection Regulation (GDPR) is a sweeping framework for protecting personal data, but since the start of 2021 it no longer covers messaging services. That kind of communication falls under the ePrivacy directive instead, thus the change allowing scanning is a derogation to it.

I think this is not accurate. The linked "ePrivacy Regulation" directive is a proposal that was meant to be approved at the same time as the GDPR but it is still being discussed. The directive that has been (partially) derogated is the "Privacy and Electronic Communications Directive" (2002/58/EC) which is also known as the "ePrivacy Directive". But the GDPR (2016/679) still applies to messaging services:

> (12) This Regulation provides for a temporary derogation from Articles 5(1) and 6(1) of Directive 2002/58/EC, which protect the confidentiality of communications and traffic data. The voluntary use by providers of technologies for the processing of personal and other data to the extent necessary to detect online child sexual abuse on their services and report it and to remove online child sexual abuse material from their services falls within the scope of the derogation provided for by this Regulation provided that such use complies with the conditions set out in this Regulation and is therefore subject to the safeguards and conditions set out in Regulation (EU) 2016/679.

See: https://www.europarl.europa.eu/doceo/document/TA-9-2021-0...

Scanning "private" content

Posted Aug 13, 2021 13:40 UTC (Fri) by flussence (guest, #85566) [Link]

While I agree with the premise that horrible people use iPhones (the ultra-wealthy have plenty of hush money), I question why they went to these lengths to work with a government that practically gloats about how it abducts children by the thousands in broad daylight and disappears them into battery farm conditions. Maybe Apple's long game is hoping to catch a few of the officials responsible for that? That's my optimistic interpretation anyway.

that seem to be of a sexual nature,

Posted Aug 16, 2021 13:05 UTC (Mon) by scientes (guest, #83068) [Link]

> that seem to be of a sexual nature,

The "I know it when I see it" horseshit.

Scanning "private" content

Posted Aug 16, 2021 21:00 UTC (Mon) by scientes (guest, #83068) [Link]

The EU is nothing more than a giant cabal, like the defence pacts that led to WW1. A bunch of incompetent bureaucrats telling people what to do, and Coronavirus demonstrated that the EU doesn't exist.


Copyright © 2021, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds