After criticism, Apple to only seek abuse images flagged in multiple nations (reuters.com)
263 points by ldayley on Aug 13, 2021 | 240 comments



And... Apple misses the point of the criticism completely.

This problem is the capability, not what it's used for. Any such capability will be abused by new use cases be it terrorism, drug trafficking, human trafficking or whatever. Plus there will inevitably be unauthorized access.

The only way to prevent all this is for the system not to exist.

I don't buy into theories that Apple is being pressured or coerced on any of this. I believe it's far more likely this is just tone-deaf but well-intentioned incompetence. It's classic "why won't anyone think of the children?" and we've seen it time and time again with encryption backdoors and similar.

The big question for me is how and why Tim Cook and Apple's board signed off on a plan to do huge damage to Apple's reputation and user trust. If they didn't know, it's a problem. If they knew and didn't realize the inevitable backlash, well that's a different problem.


> This problem is the capability, not what it's used for. Any such capability will be abused by new use cases be it terrorism, drug trafficking, human trafficking or whatever. Plus there will inevitably be unauthorized access.

Apple even says it themselves[1]:

> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

[1] https://www.apple.com/child-safety/


I can't wait until /child-safety 404s or redirects to /safety and there's a wall of marketing blurb (possibly only in Chinese at first) that explains how 'national security' concerns are reported to the CCP.

This has totally pushed me over the edge, though I'll admit I was oblivious to begin with. My plan is to replace the MacBooks with a Thinkpad P15 gen 2 running Ubuntu and replace the iPhone with something running Ubuntu Touch (Volla Phone, Fairphone, OnePlus One). Screw not having control.


That’s a smart plan. Apple didn’t misunderstand the criticism, they just have ulterior motives. It’s the only rational explanation.


“It’s the only rational explanation.”

Or, perhaps they feel really strongly about child exploitation?


Then they would not allow pay-to-play in-app purchases


As another commenter stated, if they actually cared about this issue, they would have done what all the other cloud providers have been doing for many years, and what everyone opposed to this wants them to do - just scan the cloud. That's all they're effectively doing right now, only with the scanning happening on a person's private device.


If they did, they would have done this much sooner.


Same here, except I'll be getting rid of MS and Google. That means CalyxOS on my Pixel 2 and some version of Linux on my private ThinkPad. Once I find time, that is, which honestly can take a while. I think the last time I used my private laptop was over a month ago. The phone is different, but again, time constraints. It will happen, though.


I'm in the same situation in terms of tech debt against my personal infrastructure. I have plans to migrate my personal (currently Ubuntu) server to use Proxmox on top of selling the current machines & buying new non-Apple ones for this. The evening and weekend time is tight.


There is another option - don’t put anything sensitive on your phone and basically treat it as if an adversary already has root access. This is pretty much what I’ve been doing for a long time. All the sensitive information (e.g. banking) is on a Linux desktop, and there are no logins or apps on the iPhone or work laptop (hn is not sensitive ;) ).


> hn is not sensitive

In literally 30s of looking at your comments, it's clear that you work in HR and are an amateur pilot.

Given the number of comments, with slightly more time your place of residence and political leaning would probably become apparent too.


Missed 100% ;)


> don’t put anything sensitive on your phone and basically treat it as an adversary already has root access

Why am I paying 100s of $$$ to own a device that is adversarial?


For fun. Why do people pay to run with the bulls or to own a tiger? We pay because we love the thrill that comes from a pocket adversary.


I started a thread on what specific actions we can take to avoid Apple and other big tech in future. While it did not gain much traction, I am hoping there are at least 1,000 people in the world who think like me and over time we will be able to influence more people.

https://news.ycombinator.com/item?id=28157281


I might be reading between the lines here, but you know that ThinkPads are made in China by a Chinese company, Lenovo, right?


Dell XPS 13's are better than MBP anyway.


I also note that the /child-safety page seems only accessible via outside search or a direct link. There is no corresponding press article. This thing is just floating in the air somehow…


check out the Framework Laptop - https://www.youtube.com/watch?v=0rkTgPt3M4k


From protecting children to global IP DMCA takedowns.

We all know this is gonna be DMCA 2.0.


Yep.

And you know who that gun is aimed at first?

Apple employees.

Even if you trust Apple and NCMEC not to add out-of-scope hashes that some government or law enforcement or intelligence agency “asks” them to, does anybody who’s ever worked for Apple have any doubt at all that they’d use this to check employees’ personal devices for Apple IP? Especially if a big, spectacular leak hits the media? Apple’s IP enforcement goons are legendary. And not in a good way, in most people’s opinion. Particularly Gizmodo’s…


Apple employees can be subjected to all kinds of employer-installed management tools and builds; that’s nowhere near the same level of concern.


I'm pretty sure all cloud storage providers already have a good idea of who has what pirated content.


Moreover, things could become retroactively bad. If tomorrow multiple countries decided to change the minimum age of adult performers from 18 to 19, a bunch of pics legal today would become illegal tomorrow, leading to wide dragnets enabling law enforcement to go through everything you have.


I just have to wonder who green-lit this idea to begin with. Every single other company is riding a post-COVID tidal wave of sales. As WFH becomes a reality for so many workers, Apple products are positioned to really take center stage for a massive segment of consumers...

And then this. Apple intentionally injects uncertainty and controversy? What were they thinking?? Sure, it's just phones right now, but I'm sure Mac users are wondering about workstations and laptops. The damage control being spun right now is absolutely overwhelming.

I'm also surprised to see no other players like MS or Google rushing to take advantage of the outrage. Even players like Purism seem to be ignoring the event.


> even players like Purism seem to be ignoring the event.

Purism addressed it yesterday[1]. Here's the HN thread[2].

[1] https://puri.sm/posts/internet-of-snitches/

[2] https://news.ycombinator.com/item?id=28158939


>>I just have to wonder who green-lit this idea to begin with.

I suspect, with no inside knowledge, that there's someone high up in Apple who feels very, very passionately about child abuse/these sorts of images due to some personal connection, and they are championing the idea internally. And tbh not the worst bugbear to have, but obviously this isn't the way to address it.

As for why the big players aren't jumping on this - it's a tricky line to walk, you don't ever want to be seen as pro child abuse.


Having read the coverage and the documents that Apple has been putting out to manage the … what, crisis? Backlash? … I’ve been thinking about the design of the system and how I would want CSAM to be detected, if different than this.

Honestly, I’m pretty impressed by what this approach accomplishes. Your comment says this obviously isn’t the way to do it— what’s better than this?

In a legal context where detecting CSAM is a strong requirement, what’s preferable to this approach?


>>>In a legal context where detecting CSAM is a strong requirement, what’s preferable to this approach?

It's not a strong requirement though? Only if the company sees it (unencrypted) do they need to report it; detecting it on-device is wholly different. It would honestly be better if they adjusted their TOS and started scanning the files on upload (server-side). They have the encryption keys already.

Apple is planning to install on all (recent) iPhones software that will let them scan for any image similar (using perceptual hashing) to some set of images. Right now, that's CSAM, and only on upload to iCloud. But what do you want to bet China is salivating at the idea of including images of Tank Man and other "seditious" content? And maybe checking images anytime they're added to the phone, or sent to someone else, not just up to iCloud?

It completely ruins the idea that Apple has your privacy/security in mind.
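The "perceptual hashing" mentioned above is worth unpacking: unlike a cryptographic hash, it's designed so that visually similar images produce similar (or identical) hashes. Here's a toy average-hash ("aHash") sketch; everything in it (the fake 8x8 "image", the names) is invented for illustration, and Apple's NeuralHash is a far more complex neural-network variant of the same idea:

```python
# Toy illustration of perceptual hashing (average hash / "aHash").
# NeuralHash is far more sophisticated; this only shows the core idea:
# similar images produce hashes within a small Hamming distance.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the average?
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A fake "image" and a slightly brightened copy of it.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
tweaked = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(tweaked)
print(hamming(h1, h2))  # small distance: the edited image still "matches"
```

The point of the "Tank Man" worry is exactly this property: whoever controls the hash database can match any image they like, and cropping or re-encoding it won't help.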


Even if scanning for the content is not explicitly required by law, what happens when a pedophile ring hoarding thousands of images of CSAM on iCloud is busted, and the news gets out? The article at [1] makes it sound like Apple choosing not to scan any of its users' videos in iCloud was seen as evidence of Apple lagging behind the status quo of companies like Facebook that were proactively reporting CSAM.

According to that article, in the last year Apple submitted only 265 reports to the NCMEC while Facebook submitted 20 million. Would law enforcement believe they'd be missing out on catching abusers after seeing this disparity?

If a company is found to allow criminals to store CSAM on their servers for extended periods of time, the law is going to want to know why they let it pass, irrespective of the extent the company chose to scan for it. Apple probably doesn't want to deal with that fallout, so maybe they figured that being proactive about scanning for CSAM in a way that could enable the use of E2EE wouldn't hurt, and that pushing the privacy narrative would satisfy enough people - which it didn't.

[1] https://www.nytimes.com/2021/08/05/technology/apple-iphones-...


>>>If a company is found to allow criminals to store CSAM on their servers

I'd bet dollars to doughnuts that AWS, GCP, Azure, et al. have terabytes of CSAM stored within their data centers - because users have uploaded encrypted files and the companies (rightly) don't have the keys. I'll also bet there are many WD hard drives full of CSAM, many Linux servers hosting it, many nginx or apache installs serving it up, etc. Should we mandate that all hard drives scan files as they're written to see if it's CSAM? Or maybe nginx should alert law enforcement anytime a CSAM image is served.

Apple should have actually let their users have E2EE or given up on that and just scanned stuff server-side.


How much of an impact do you think this will have on the non tech savvy user though? Most people already don't care about ad tracking, location tracking, etc. I think your standard user will just shrug this off and go "eh, I have nothing to hide and it's good for the kids"


I don’t like that I have to resort to this, but I was going to mention the proles and point to Wikipedia[1], except that Wikipedia deleted the article about that topic. Yet I can still see the article’s blurb in a search engine’s cache! Everything is so meta nowadays…

[1] https://en.m.wikipedia.org/wiki/Proles_(Nineteen_Eighty-Four...


"Apple is scanning every image you upload" will certainly have a measurable effect on non-tech users, even if the details are more nuanced.


Which is stupid because it’s also true for all other major cloud providers.


I won’t be surprised if there’s a class action launched soon.


It will come to the Mac eventually though, right? Macs can upload photos to iCloud.


Sooner or later the copyright enforcement companies will find a judge who's willing to rule that since Apple has the capability, they have to use it, even though this system is aimed at still images only, no sound or video. Then Apple will end up fighting it for years, if they're lucky and it doesn't get a quick trip up to a Supreme Court that's drastically Hollywood-friendly. The legal difference between mandating that a company create a brand new scanning capability and "just applying" one they already have is significant.


Apple doesn't have that capability though. There are countless ways to get copyrighted material onto your phone that are perfectly legal. Apple would have to be able to decide whether you legally own your files, and with how the IP system currently works, that's impossible. For CSAM this isn't a problem, because we assume there is no legitimate way to have CSAM files on your phone.


> we assume that there is no legitimate way to have CSAM files on your phone

WhatsApp has "Save pictures to camera roll" on by default. Someone can send you a bunch of CSAM via WhatsApp and now you are in trouble.


Or take a photo of CSAM on display on another device or printout using your iPhone camera, which is often accessible from the lock screen without a PIN.


Yep there sure is something fishy about this whole thing. Talk about shooting yourself in the foot...


normalize security for children and then get paid to dmca 2.0 ip owners content.


> I don't buy into theories that Apple is being pressured or coerced on any of this.

Of course they were.

It’s well known that Apple chose not to introduce E2E-encrypted iCloud backups back in 2017/18 or so, due to FBI complaints.[1]

This is clearly Apple’s play to be able to introduce that again and tell law enforcement “Look, we ‘thought of the children’, if you want further access to our customers data, you’re going to need to come up with a better justification than that.”

If Apple pull that off, adopting client side image scanning with this quite impressive privacy preserving system behind it, and then e2e encrypt everything they upload to iCloud, that’d arguably be a very big win for Apple customers privacy.

Whether that’s an acceptable trade-off for having a device I purchased run code I didn’t ask for and don’t want, to monitor whether or not I’m a paedophile, possibly snitching on me over false positives or bad-faith additions of non-CSAM hashes into the database, is still a good question.

1: https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...


> This problem is the capability

> The only way to prevent all this is for the system not to exist.

I've been having trouble following this argument over the last week. Isn't it clear that the capability already exists? Whether or not Apple goes through with its CSAM plan, the capability is evidently there.

In other words, since Apple is a closed system, the capability was there before they announced the CSAM plans. Their announcement has changed nothing about capability other than reminding people that Apple has privileged access to the data on iPhones.

I guess the question is, if Apple does the CSAM program does that make them more likely to cave to government pressure to search for other things? And to do so without telling users?


Once a certain piece of technology exists, courts can order that it be used a certain way other than the way it was intended.

Neither courts nor legislative bodies are capable of explicitly requiring the invention of technology that doesn't yet exist, although in some cases they could make it an implicit requirement of a law.

At least in the US most lawmakers are varying shades of technologically illiterate and don't have a grasp of what is possible or not possible. If technology doesn't exist yet, you can say to them "this is impossible", and with enough of a bribe, they will believe it.

Apple, by building this technology, has put it on the map and announced that it is both possible and ready.


It's a very ill-considered argument, because Apple obviously already has the capability of pushing literally any conceivable software update to iPhones. Apple (and thus iPhones) already have the capability to do anything you can conceive of, regardless of how terrible. For instance, iPhones have the capability of activating the cameras at all times, detecting when they capture compromising or embarrassing video, and uploading those videos to a public website along with your name and contact information. The only thing preventing Apple from doing that is their own policies.

If you think it's bad that Apple can push software updates to iPhones, then by all means make that argument. But "the only way to prevent bad changes to this system down the road is for this first version of the system to never exist" is a very poor argument, given that the only "system" that needs to exist is the ability to push software updates, and that system existed long ago.


That isn't the argument at all. What Apple can't have happen without major reputational damage is to get caught doing it without telling its customers.

Thus this is seen as a marketing backdoor to the actual backdoor. This is seen as a ploy that later gives Apple the plausible deniability argument that the government is making them do it (other forms of content) even though they created the means (on device scanning) for the government to exploit.

Why limit it to content that will be uploaded to iCloud? Why limit iMessage scanning to kids? Why not also let the government know? Why limit it to child porn since the system works for any image? Why limit it to 30 hits?


Imagine the reputational damage to Apple if say in 2 years it was discovered that they had been scanning all photos on your device and that the matches had been going to the government and that they didn't tell you. Their privacy marketing angle would be completely toast. Imagine if they did this to phones outside the US.

Any sane marketing person would try to find a way to sugarcoat it first.


> But "the only way to prevent bad changes to this system down the road is for this first version of the system to never exist" is a very poor argument, given that the only "system" that needs to exist is the ability to push software updates, and that system existed long ago.

Apple relied on that argument 5 years ago successfully.[1]

[1] https://en.wikipedia.org/wiki/FBI–Apple_encryption_dispute


There is a qualitative difference between tweaking an existing code base (expanding the scope of this detector) and coercing a company to dedicate teams of tens to hundreds of engineers over years to create a brand new code base from scratch (writing this detector in the first place.)


There shouldn’t be a single line of code, or single binary blob, on my device that can compute these photo DNA hashes.

Apple could always go ahead and add that functionality in an update, but then there would be a big backlash, and users would have the opportunity to not update or to switch providers.


> Apple could always go ahead and add that functionality in an update, but then there would be a big backlash and the opportunity to not update or switch providers.

Isn't that exactly what is happening now?


It is exactly. And I have my opportunity to start migrating to Android before the big update.


Wouldn't users have the ability to not update or switch providers if Apple expands the scope of its search beyond CSAM?

I guess I don't see the huge difference between the surveillance code existing on the phone but not used for objectionable purposes versus the line of code sitting in a different branch in git and not deployed on actual phones (yet).

I'm completely against this move by the way -- I'm not trying to defend it. But I want to be able to argue effectively against it.


This is like renting a house and the owner saying "we installed a camera in your house, but we'll only look at the video, if someone screams 'help'"

Then someone says "what if they change it so it sends the video if someone says 'do you want some pot?'"

And your argument would be "wouldn't you just move if they did that?" - this argument is pointless, because it's not the "help" part that bothersome, it's the always recording camera in a black box, for which you never know what's recording and what's it sending to apple. Let's say hypothetically they do this, and modify the database to trigger on the "tank man" photo, and give that photo such weight, that it triggers manual review by the CCP observer (remember, apple data for chinese users is hosted in china) - you'll never know the image got flagged, you'll never even know that it was sent anywhere, but your social score will go down and you might "vanish" during the night. Imagine some new wikileaks happening... 15 journalists around the world get the data, NSA/CIA/FBI has hacked one of them and found some photos of USA doing something bad.... add those photos to the database, have apple scan all the phones, you find the other 14 journalists, and possible even the leaker who took the original photo. If this is expended to macbooks (which it probably will), even worse, because people keep more sensitive stuff on ther computers than their phones.

And all this for what? Child molesters and rapists (the worst of the pedos) create new photos and videos, that are in none of the databases, so you never catch them. Their direct sponsors (paying customers) get the photos first, and again, before they're in the database. So all you've caught is someone browsing 4chan.


> And all this for what? Child molesters and rapists (the worst of the pedos) create new photos and videos, that are in none of the databases, so you never catch them. Their direct sponsors (paying customers) get the photos first, and again, before they're in the database.

I hadn’t considered that. This would seem to create a market incentive to abuse more children and create a greater variety of new abusive content. Has that been considered at all?


No, because they’re not scanning for CSAM, they’re scanning for hashes. That hash database needs to be updateable for it to be useful at all. A significant part of the policy protections are implemented off device, so those can’t be trusted.
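To make that concrete: stripped of the cryptography, the on-device logic reduces to membership tests against an opaque list plus a reporting threshold. This is a deliberately simplified sketch (Apple's actual protocol wraps the matching in private set intersection and threshold secret sharing; all names and values here besides the 30-match threshold are invented):

```python
# Simplified sketch of on-device hash matching with a report threshold.
# Apple's real design hides the matching behind private set intersection
# and threshold secret sharing; this plain version just shows why the
# database contents, not the scanning code, determine what gets flagged.

REPORT_THRESHOLD = 30  # Apple's announced match threshold

def scan(photo_hashes, blocked_hashes, threshold=REPORT_THRESHOLD):
    """Count matches against an opaque database; flag past the threshold."""
    matches = [h for h in photo_hashes if h in blocked_hashes]
    return len(matches) >= threshold, matches

# The device has no way to tell whether the opaque database holds CSAM
# hashes or anything else a provider (or a government) chose to add.
db = {f"hash-{i}" for i in range(100)}           # hypothetical entries
photos = [f"hash-{i}" for i in range(40)] + ["benign"] * 60

flagged, hits = scan(photos, db)
print(flagged, len(hits))  # True 40
```

Updating what gets flagged is a one-line change to `db`, which is the parent's point: the policy protections live in the database and the off-device review, not in the code.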


The only explanation that makes sense to me is that they are doing huge damage to their reputation and trust for bigger reasons. Perhaps there are international or national security issues they are told to comply with that are not quite in the public view yet.


Occam's razor says they aren't playing 11th dimensional chess and that they just fucked up. It happens.


"Build this. Don't explain why" is not chess. It would be a perfectly logical order from the power structures Snowden exposed.

eta: It makes way more sense to me than Apple actually thinking this was a good idea.


> It would be a perfectly logical order from the power structures Snowden exposed.

Snowden exposed that Apple was actively part[1] of the PRISM data collection and surveillance program targeting Americans.

Apple also voluntarily gives up customer's data when requested by the US government for about 120,000 individuals a year[2]. They also hand over the data for over 31,000 users/accounts in response to FISA and NSL requests[2] in a six month period.

> eta: It makes way more sense to me than Apple actually thinking this was a good idea.

Based on the company's own messaging, they simply seem proud of the project and find it ambitious[3]. They're also excited about the project's expansion and evolution[3].

[1] https://en.wikipedia.org/wiki/PRISM_(surveillance_program)#/...

[2] https://www.apple.com/legal/transparency/us.html

[3] https://www.apple.com/child-safety/


They do seem proud, but one of these things makes less sense than the others, given Apple's prior reasoning.

We don't know what goes on in the secret courts these days, but we have been assured watchdogs are in place and certain programs are no longer in use [1]. This is all speculation based on past behavior, but I assume new reasoning has been constructed that passes the watchdogs' interpretation of the law but requires new hoops for three-letter agencies to jump through to get the mountains of data they so yearn for.

[1] https://reason.com/2015/11/29/the-nsa-will-stop-collecting-y...

Now it kind of sounds like I'm playing 5D chess, but since we don't know what takes place under secret Executive Orders...


Maybe they want to stop giving up customer data, and doing this on the phone is the only way to do E2EE backups - can't hand over what you can't decrypt.


Seems like a reach to me. The government doesn't just care about CSAM. They care about terrorism, human and drug trafficking, gangs, organized crime, fraud, etc.

That might make sense if CSAM detection is just the start, and they plan on detecting all those other things, as well.


"Don't tell anyone" seems like it would be step 3 of the instructions there.


That would arouse suspicion as soon as someone (perhaps someone inside the company) figured out what was going on. Here, a key person introduces the idea, says it's "for the children," and people are baffled but the masses will forget about it, even come to accept the logic, soon enough.


They fucked up? No they didn't fuck up.

They got what they wanted. It did sting a bit, but it was calculated.

They won't suffer any massive losses over this, and they have the moral high ground among non-tech people, who mistakenly believe that they're doing something to protect children.


Why? "A huge fuck up while trying to be seen as champion of protecting children" perfectly explains the situation


> The only way to prevent all this is for the system not to exist.

I'll go one step further. Apple should have implemented a system that makes these kind of backdoors impossible. I don't know how or if such a system is possible, but given Apple's track record, they are in a position to attempt it.


>Apple should have implemented a system that makes these kind of backdoors impossible.

Signed updates (which they already have) that are not designed to spy on their customers and bog-standard E2E would achieve that.

But then they couldn't even scan the stuff on icloud.


You are completely correct and I find this turn of events hilarious. They are onto the bargaining stage of grief.


Unfortunately, I don't think this has been anything more than a small blip to Apple's reputation. If I were not on Hacker News (like 99+% of Apple users), I would either not have heard of this move or not have given it a second thought.


Bad press on tech sites is the worst thing for a tech company's reputation. They're the people other people go to when deciding what to buy.

When Uncle Bob asks the family computer engineer about the pros and cons of a given platform and hears that one of them scans your device and a false positive could get you arrested for child pornography, Uncle Bob may develop an aversion to that platform.


It’s on Reuters and many other news outlets. The press are still doing their job for the moment at least.


Even the Apple fanatics on MacRumors are pissed. These are the people who’d steadfastly defend Apple against anything.


I think it dominating Reddit for last week might be bringing a lot of attention too.

Kind of surprising how tone deaf Apple's response has been despite this.


> The big question for me is how and why Tim Cook and Apple's board signed off on a plan to do huge damage to Apple's reputation and user trust. If they didn't know, it's a problem. If they knew and didn't realize the inevitable backlash, well that's a different problem.

Comments like this make me wonder how in touch HN is with the average person because I genuinely don't believe most people care or even know that this is happening. I may be wrong but Apple could've pretended this backlash doesn't exist and did nothing about it and they would lose nothing for it.


I agree regarding the average person, but I’ve forwarded the EFF article to friends who are surgeons, electricians, firefighters, etc., who wouldn’t know what HN is, and the reaction has been a universal “wtf, Apple is scanning all my photos?”. I guess the question is how big the angry minority is.

I think a lot of people blindly trust Apple as a reputable corporate entity and this definitely shakes that trust.


It was on mainstream news here in Australia.


Their promise is completely, childishly unrealistic; the person who said it should feel ashamed. Does anybody believe they will refuse to use this in <country> if the government demands it, when it’s a few button clicks of work, when market access is at risk, and with Apple’s track record? Or that two countries can’t collaborate to save face while doing this? This promise is so unrealistic that it’s just a lie.


I agree.

"Make it so we are the only ones on that list, or we ban iphones in China".


> The big question for me is how and why Tim Cook and Apple's board signed off on a plan

"Do us a favor or the AG launches an investigation into your business practices."


You mean those investigations into app store practices. ;)


> …Apple's reputation and user trust

This is the time to make a donation to a FOSS project related to the Linux desktop. Off the top of my head, there is Debian, Ubuntu, Mint, GNOME, Plasma, the FSF, Wine, and other projects that could use contributions of code or cash. It adds up, and could help one of them hire an extra full-timer if enough people set up recurring donations.


I wonder if "we won't use it for anything else" is really any less credible than other vendors who simply aren't talking about any scanning they may or may not be doing.

We know TV makers scan what you watch, even from other inputs, for ad targeting. This sort of capability is already widespread and basically a commodity - who else is using it that hasn't told us? And can we do anything about it short of only using hardware and services where trusted parties have verified the source?


Because most people don't care. All of HN could stop buying Apple and it'll make a 0.000001% difference to their profit, if even that.


Perhaps the result of internal stakeholders trying to hold on to any vestige of a massive project they've put a lot of effort into. I doubt they want to see it scrapped, even after admitting to themselves it was a huge error.


They didn't miss the point. They get the point.

Their point is that they don't care about that other point.


Sooner, rather than later, I think adding this to all devices will be mandated, by law. Apple can intend to do only good things with this, but governments can bend and twist things very easily so that a beneficial idea becomes dystopian. Regardless of intention, Apple still has to answer to government, not the other way around.


People need to stop buying Apple devices, immediately. They do not give a shit about us.


This article misses the point completely. It was never about what they were looking for, it was that they were looking at all. The quiet part is, once they get acceptance of the snooping using the bogeyman of the day, it will eventually encompass other behavior up to and including political dissent etc.


There is also the issue that closed-door adjudication of heinous crimes is the perfect tool to put away dissidents. A very convenient excuse to avoid proper checks and balances: "No, you cannot verify, because you aren't allowed to look at the illegal material that proves their guilt."


Is this actually true? If I was arrested for an image on my phone, would no one in the judge/jury team be allowed to see the image that was flagged? I understand it’s allegedly contraband, but if I were a judge I don’t think I’d take an algorithm’s word for it.


> it’s allegedly contraband, but if I were a judge I don’t think I’d take an algorithm’s word for it.

The prosecution can hire more expert witnesses with excessive credentials than you can, and they will explain how there's a one in one trillion possibility that the system is wrong, and that the defendant is assuredly a monster.

Juries eat that up when it comes to bogus DNA, bite mark, fingerprint, or other forensic evidence claims. Most people think computers can't be wrong or biased, and people's perceptions of what can be deemed reasonable doubts or not seem to shift when computers are involved, or when smart, credentialed people tell them their reasonable doubts aren't reasonable at all because of that one in a trillion chance of the computers being wrong.


I agree. Further, is there any doubt that these “CSAM”² hash databases aren’t already shared between countries? Because if there was doubt, there shouldn’t be, because agencies do share across borders.

² - reason for the scare quotes is because I have first hand knowledge of non-CSAM content being in NCMEC’s database (most likely via data entry errors, but I can’t be for sure).


Images of bonsai trees?


> Images of bonsai trees?

Ha! Come to think of it, I think that was one example. The main examples that came to mind (it’s been almost 8 years since I was involved) when discussing this last week were essentially extremely common photos (like the stock Windows XP background, among others).


And, just introducing this idea can make the concept real, even if Apple cancels this plan tomorrow. Country X could introduce a law saying “all companies that sell phones here must scan and report for these hashes” and based on recent trends, the companies will just roll over and say Uncle.


This is by far the biggest misstep of Tim Cook’s tenure at Apple thus far.

If they don’t kill this program soon, it’s going to overshadow the entire upcoming iPhone event, and will follow Apple around like a dark cloud for years.

I can see the headlines now: “New iPhone launches amid massive new privacy concerns.”

Anytime someone praises Apple for privacy, anywhere on the internet, there will be a tidal wave of people bringing up this program in rebuttal. From people who would have previously defended Apple to the grave!

I cannot fathom how on earth anybody thought this was a good idea. It’s like taking decades and billions of dollars worth of hard won reputation for privacy and throwing it in the garbage at the worst possible moment.


The folks who choose iPhone in part because of a perception of relative privacy and respect for the user (like myself) will think twice.

My two concerns are that Android as an ecosystem is almost certainly still worse, and that the vast majority of users will not care.

I’m tempted to jailbreak my devices going forward, although I guess the folks at Apple would say that makes me a pedophile.

Edit: seeking recommendations for Android phones with strong performance and reasonable privacy protections. Ideally one that can be used without a Google account.


I'm switching to a Pixel 3 or 4 running CalyxOS, GrapheneOS, or LineageOS. I'm eager to have my choice of OS on my phone, build it myself, etc.

https://calyxos.org/get/

https://grapheneos.org/faq#device-support

https://wiki.lineageos.org/devices/


Thanks, that seems to be the right answer.

I just recently migrated everything to Apple + iCloud, but now it appears to be time to switch to synology+pixel/graphene+Firefox

Apparently obsidian.md can be end-to-end encrypted through their service, for anyone else planning to ditch iCloud.


I'm very nostalgic for Blackberry again. I feel like on a platform whose whole point was to prevent anyone who wasn't the corporate IT department from being able to access it, this local code execution would have been a massive non-starter.


I put LineageOS on an S5 a while back (with gApps, but there are non-Google alternatives) and it went pretty smoothly. Works with several more modern phones.


Android devices are much more open than iOS devices. There are quite a few alternative OSs with much more privacy than Google and Apple

Unlike Apple, Android devices are not vendor locked into just one OS.


I would never ever touch anything made by Apple from now on. I was never a fan, but was considering getting an ipad for drawing etc. After this, Never.


You can use something like CalyxOS with a pixel 5


This I think is a bit of HN myopia.

I've talked around and nobody I know is even aware of this, when I brought it up, they didn't care about the issue.

It's not news in the commons, it's not on CNN, Fox or MSNBC right now as a headline.

It's one of those tertiary concerns that frustrates some groups, but most people are not aware and would be only marginally bothered, and a good 2/3 of people really don't care. I suspect most people would have a difficult time with the 'slippery slope' issue and would accept the 'it's for the children' framing at face value.

It is what it is, but it's worth understanding how regular people think about these issues.


That's true. However, Apple is part of the same tech bubble as HN users. There are Apple employees reading this thread right now. Most regular people don't care about privacy either and just follow soundbites in the mainstream media e.g. "WhatsApp is spying on you!!!". But as long as HN/Apple think that they do then all is good.


This is a potentially sensitive and conflicting topic. I don't think people can take a strong view on this in real life, for multiple reasons, one of them being that pedophilia is a disorder that is not super uncommon.

The fact that the WSJ did a story on this, with a lengthy interview with an Apple SVP, means this is already a mainstream story.


Just wait until the September event. Apple’s media attention quadruples during that time.

“Apple creates authoritarian government wet dream” will be the main narrative coming out of the iPhone event. Not the new phone. And investors will not be happy.


I doubt that. I mentioned it to a non-tech person today. They hadn’t seen the news, and their first instinct was to empathize with the Apple executives having to confront that their products are enabling pedophiles.


Hearing the number of CSAM reports Apple made (265) vs. Facebook (10 million) changed my perspective. I don't like it, but they are going to have to start scanning iCloud photos sometime.


I don’t think anybody has a problem with iCloud scanning (don’t they do it already?). Apple owns their servers, so I don’t care if they scan them.

It’s the on-device scanning that is the massive overreach. It’s like being forced to allow a government employee to live in your home and watch your behavior 24/7 for anything they don’t like.

“Oh but John is only looking for specific bad behavior like drug use, you don’t have to worry. He would never report you for anything else, we promise! If you have nothing to hide, you shouldn’t be upset that John is living with you now!”


> I don’t think anybody has a problem with iCloud scanning

The large number of people here who want iCloud to be E2E have a problem with it.

> (don’t they do it already?).

Apparently not for files and photos. They may well do it for email.


Yeah - I can hear Tim Cook quoting those numbers as he explains this on stage.


Unfortunately, it seems the majority of the population are assuredly already rather authoritarian-leaning.


I wouldn't necessarily say authoritarian-leaning. More like "just this once"-leaning, without realizing there's no such thing.


If they kill the program, headlines will read: “Apple refuses to protect sexually abused children” or “Apple sides with protecting child pornographers.” I anecdotally see no backlash outside of HN and the tech/libertarian-focused parts of Reddit. To the casual iPhone buyer making up their billions in revenue, turning back on the program wouldn’t be good optics. Obviously my statement is purely conjecture, I’ll admit that.


I don't think they have to compromise on the CP protection.

They can just go along with what other cloud storage providers have been doing all along... that is, "we won't snoop into your phone, because that's yours. but if you upload photos to iCloud, onto OUR servers, then we reserve the right to scan for images in CP database.."

And I think most people will be perfectly fine with that idea.

Just like you sign away some rights and accept risk when you decide to store items in physical storage facility, same thing will happen when you use iCloud.


There's surprisingly high backlash on /r/apple


You’re not seeing the backlash yet because normies don’t pay attention to Apple outside of the September iPhone event (when coverage breaks into the mainstream for a few days).

Look at Google trends data for the search term “Apple.” It spikes every September by like 70% for a week.

I can guarantee you, every average joe learning about the new iPhone from mainstream media, is going to hear a sound byte about this fiasco as well.

I’ll be surprised if this program isn’t dead within 2 months.


> It’s like taking decades and billions of dollars worth of hard won reputation for privacy and throwing it in the garbage at the worst possible moment.

How gullible Apple users are to think that their closed-source ecosystem was ever about privacy, but I still don't think they'll learn. It is a cult and will continue being a cult.


So they're still retaining the ability to look into users' devices? I have nothing against them looking at the abuse material; I have a problem with them having power over my device, which could later be piggy-backed on by state actors (or others) for their own purposes.

Nobody is against them trying to prevent child sexual abuse, pretty sure we all agree that fighting that is important, but doing so by creating what is essentially a back door of sorts into my devices isn't the right approach to doing so.

It sounds to me that this is still allowing them that access, so this changes absolutely nothing.


Yeah, all they need to do is follow suit with what everybody else is doing and scan once images are uploaded onto iCloud.

Scan every single file. I don't care. Because once in iCloud, files are sitting on Apple's server and hard drives. I don't have much expectation that those files are 100% private.

They're completely missing the point.


Don’t use iCloud photos


This is as unhelpful as saying "don't have CSAM on your phone." As has been reiterated many times, the problem is not the capability it is claimed to start off with, it is how the capability will be evolved, abused, and exploited as time goes on.


I mean, don’t do that either, but the venn diagram circles of those two statements are not identical.


Don’t use Apple products. The scanning is on device.


You need to read what they are doing. It's on-device scanning. You have no privacy or choice. May as well have a Facebook Phone.


You need to read what they are doing. Yes it is on device But it is only done to photos uploaded to the cloud.


If they only intended to scan iCloud photos, then they would’ve done it on the server (since they have encryption keys) instead of installing spyware on everybody’s iPhone. If they had done that then there would’ve been little to no outrage since people wouldn’t need to worry as they could just turn off iCloud.

What do you think will happen when e.g. the Chinese government demand that they scan all photos for hashes provided by the government, upload them regardless of whether iCloud is enabled, and let a Chinese company handle the reviews? If they couldn’t stop the CCP from gaining full control of iCloud in China then they most definitely can’t prevent this either.. and other authoritarian governments will copy the CCP and make similar demands.


Or they are trying to get rid of having encryption keys…


Makes zero sense. Installing spyware and compromising everybody’s iPhone does a million times more harm than whatever governments can achieve with complete access to people’s iCloud account. At least before we could protect our privacy by turning off all iCloud sync, but now your only option is to switch away from iOS.


They put out a new paper [0] describing the security threat model they were working with, and this paragraph on page 9 stood out to me:

The perceptual CSAM hash database is included, in an encrypted form, as part of the signed operating system. It is never downloaded or updated separately over the Internet or through any other mechanism. This claim is subject to code inspection by security researchers like all other iOS device-side security claims.

Could someone tell me how that inspection works? Are there researchers who are given the source code?

[0]: https://www.apple.com/child-safety/pdf/Security_Threat_Model...


That paper is being discussed at https://news.ycombinator.com/item?id=28173134.


Let's talk about a nanny state called Finland for a moment

> the Finnish National Bureau of Investigation (NBI), had compiled a secret blacklist of websites that it deemed to contain child pornography and sent it to Finnish Internet service providers.

> Analyzing the address list, Nikki also noted that the first three Google search results for "gay porn" are censored. Electronic Frontier Finland (EFFI) have noticed that the blacklist includes non-pornographic websites also, including a Windows advice forum, a computer repair service and the Internet Initiative Japan server nn.iij4u.or.jp that, among others, hosts websites for a violin factory, a doll store and a hearing aid manufacturer.

from https://en.wikipedia.org/wiki/Lapsiporno.info


This does not address the issue of perceptual hash collisions and false positives at all, nor does it address on-device scanning, nor Apple's own statement that[1]:

> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

People don't want their property spying on them and reporting them to the police. They don't want people looking at their photos or thumbnails. It's patronizing, invasive and embarrassing to have your privacy violated like that.

[1] https://www.apple.com/child-safety/
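For anyone unfamiliar with why perceptual hash collisions are a concern: unlike cryptographic hashes, perceptual hashes are designed so that similar images produce similar (or identical) outputs. Here's a toy sketch (a naive average-hash over raw pixel values; real systems like NeuralHash are far more sophisticated, and everything here is illustrative), showing why small edits don't change the hash, and why distinct inputs can still land near each other:

```python
# Toy "perceptual hash" over a tiny grayscale image (a list of pixel
# values 0-255): threshold each pixel against the image mean, yielding
# a bit vector. Similar images -> identical or nearby bit vectors.
def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p >= mean else 0 for p in pixels)

def hamming(a, b):
    # Number of differing bits; small distance = "perceptually close".
    return sum(x != y for x, y in zip(a, b))

img             = [10, 200, 30, 180, 90, 250, 5, 120]
slightly_edited = [12, 198, 33, 179, 88, 251, 7, 118]  # tiny perturbation
unrelated       = [200, 10, 180, 30, 250, 90, 120, 5]

h = average_hash(img)
print(hamming(h, average_hash(slightly_edited)))  # 0: the edit survives hashing
print(hamming(h, average_hash(unrelated)))        # 8: maximally different here
```

The same tolerance that lets the hash survive crops and re-encodes is exactly what makes adversarially crafted collisions (and accidental false positives) possible.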


You may be interested in this:

Security Threat Model Review of the Apple Child Safety Features [pdf] (apple.com) https://news.ycombinator.com/item?id=28173134


It was not that long ago that Apple wrote this letter https://www.apple.com/customer-letter/.

> Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.


> For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

How ironic of you, Apple


Apple these days is the type of company to make a mistake, think they are never wrong, blame anyone but themselves, and then change a little bit, only for it to go over like a lead balloon. It reminds me of the Apple Silicon dev kit: no refunds; then OK, refunds but you have to spend them within a certain time; then fine, you can use them for a longer period and for anything.

They are probably completely flabbergasted that people are upset about this so they make this change after a week, completely missing the point.


They've been like that since the early days of the company, it's core to their culture, which was built by Steve Jobs. Apple has always had an elitist snobbery at their center, such that if you don't get a thing then you're wrong. That's what Think Different is all about, that's a hyper arrogant proclamation that could only be spewed out of an elitist never-wrong orifice, it's borderline assholic in nature. Part of the cult of Apple has always been that attitude, the Apple fans eat it up, they love the idea that buying Apple products made them feel different (or it used to, before Apple became mainstream), made them feel separate from the masses. Steve Jobs points that out in the D5 interview in 2007 with Gates, how Apple needs the inferior taste of the masses to prime their superiority off of (which is another way of saying: they're wrong, we're right).

Go back to their response over the iPhone 4 reception problems ("you're holding it wrong"), same asshole culture then as now, and that culture has been in place for decades:

https://www.engadget.com/2010-06-24-apple-responds-over-ipho...


Eh, I would go even deeper: the original Macintosh keyboard didn't have arrow keys because, according to Jobs, you shouldn't need them. People complained, and Apple held out for a long time before giving up and adding the goddamn arrow keys. That's always been very emblematic of Apple for me.

(Hard to find a good citation link on a phone, but the story's out there.)


I mean, I don’t consider myself an elitist, but as a software engineer who just moved from doing dev work on a Mac to a Windows laptop, I’m so frustrated by how bad the experience is in comparison. I love the new job, but even these $2000 Dell laptops feel junky next to a similarly priced MacBook Pro. I’m constantly having problems with my audio devices (a necessity during the pandemic), the built-in microphone is terrible, and the webcam on the laptop is on the BOTTOM of the screen, which just makes me look super fat because reasons, I guess. It’s like how a fish doesn’t realize it’s in water until it flops up on the shore.


What happened to it being USA only?

This sounds like one of those errors where they double down on it and call it a "solution".

Sounds like they're misunderstanding people's concerns pretty badly. Sorry if we miscommunicated. Kill this "feature" yesterday. Thanks Tim!


This is just business. Governments around the world are thinking of breaking up the app store, and Apple is saying they'll help governments surveil on their citizens inside their walled garden. Would be a shame if users can escape from the surveillance by choosing alternates outside of the walled garden, wouldn't it?

Expect an App Store rule soon - all photos apps must scan for CSAM.

In a year, expect Signal to be banned from all app stores, just like how it's already banned from some countries.


My impression was that it's currently US only because that's the only country so far where they've arranged a process to report their users to the cops. No reason they wouldn't expand it as they make arrangements with other countries.

They've really bought into a ton of complexity at all levels by doing this instead of just scanning stuff on their own servers, which would at least have been a clear dividing line. Privacy is really all about control, and on-device scanning dangerously blurs that line.


USA-only referred to which accounts would be scanned. Whereas it's theoretically a good thing to require the CSAM hashes to be verified by organisations in multiple countries (rather than a single US government-tied organisation that dismissed privacy concerns as "the screeching voices of the minority").

I'm hoping for something along the lines of "iOS will reject hashes that haven't been signed by independent organisations in US, Russia, China, and India" to make it very difficult to push through anything except actual CSAM. Won't be much of a guarantee if it's just Apple saying "we promise we're only using hashes that have been checked by Australia and the US".
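The "signed by independent organisations in multiple jurisdictions" idea boils down to deploying only the intersection of the databases, so no single government can unilaterally add a hash. A rough sketch of that policy (organisation names and hash values here are entirely made up):

```python
# Hypothetical sketch: only deploy hashes vouched for by EVERY
# participating child-safety organisation. A hash submitted by just
# one jurisdiction never reaches devices.
def deployable_hashes(databases):
    """databases: dict mapping organisation name -> set of image hashes.
    Returns the hashes present in all databases (empty if no databases)."""
    sets = list(databases.values())
    if not sets:
        return set()
    common = set(sets[0])
    for s in sets[1:]:
        common &= s  # keep only hashes also present in this database
    return common

dbs = {
    "ORG_US": {"h1", "h2", "h3", "h4"},
    "ORG_UK": {"h2", "h3", "h4", "h5"},
    "ORG_DE": {"h3", "h4", "h6"},
}
print(sorted(deployable_hashes(dbs)))  # ['h3', 'h4'] — present in all three
```

Of course, as others note in this thread, the scheme only helps if the organisations are genuinely independent; allied governments sharing one list trivially defeats the intersection.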


This entire project has been a public relations catastrophe not only for Apple, but also for the NCMEC.

Almost no one held the National Center for Missing & Exploited Children in low regard prior to last week. Why would we? It's one of the best, most noble causes that have ever existed.

Then, Marita Rodriguez, an executive director at NCMEC, reassured employees in an internal memo that was unfortunately leaked:[1]

"I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

Our voices will be louder."

The "screeching voices of the minority" phrase got picked up and (perhaps unfairly) has become a kind of rallying cry, exposing the arrogance of the NCMEC.

I say "unfairly" because I do believe that their hearts were in the right place with this initiative, although they are desperately naive when it comes to the principles of privacy. "Let's save the children - how can we stop this toxicity of child porn - let's set something up with the cloud providers!"

But the damage has been done. Now the only tenable outcome is for Apple to simply cancel the project.

I, for one, was looking forward to upgrading from an Android to an iPhone 12 Pro Max later this year. That plan is now on hold! I doubt it will affect their sales too badly, at least at first. Maybe it will after the Chinese government starts using the back door to round up dissidents, or some Middle Eastern governments use it to crack down on homosexuality, or... the mind boggles.

[1] https://www.howtogeek.com/746588/apple-discusses-screeching-...


This affair has me wondering about NCMEC and if it has publicly accountable oversight.


I'm not sure if this addresses the main point. Apple promised privacy, as long as you're not violating laws. The most recent move changed this: now you're a suspect by default, your photos will be scanned even on your local device, and you won't know what the criteria are to flag you, since they're hidden from your knowledge and you can't defend yourself against that hidden criteria.

So what's gotten better by this? In the kindest interpretation, you'll be safe from Fascististan's hashes. But what if Fascististan has been bribed by China or the US? And so on.

A line which should never have been crossed was crossed.


I expect any powerful nation could “share” with a desperate ally or otherwise leverage them into adopting the same database, creating a false overlap.


Yeah. "Dear debtor countries, the attached file contains hashes of new files deemed illegal for the harmonious well-being of peoples for this month; please enter them into your databases."


How does that stop a FISA subpoena with a gag order? The system exists that any secret government order can exploit.


A FISA subpoena or an NSL can only compel Apple to give certain types of information that Apple already has. They can’t compel Apple to gather information they don’t already have.



Turns out that evil bastards also have no sense of beauty or proper data formatting.


That can’t be true because the old pen/trap orders required installation of a device to record the calls.


A wiretap order is a different kind of order and goes through the district court, not FISA. The legal standard for it is also much higher than for other subpoenas by statute.


FISA is from the 1970s and covered wiretapping.

https://en.m.wikipedia.org/wiki/Foreign_Intelligence_Surveil...


You have far more faith in a system of secret courts and gag orders than I do.


Yep. FBI Lawyer, Kevin Clinesmith gets to lie to the FISA court, fabricate emails to get 2-hop warrants and then gets zero days in prison and as of this month is allowed to practice law again. It's an absolute joke.


Hahaha an NSL can do more than that.


These stories have been top of HN for more than a week, indicating fascination. Yet non-technical mainstream articles make up the bulk of it, and Apple's just-released, detailed (and convincing) threat model PDF is getting buried. An unfortunate recipe that'll solidify misunderstandings and keep the conversation much more emotional than it should be.

Security Threat Model Review of the Apple Child Safety Features [pdf] (apple.com) https://news.ycombinator.com/item?id=28173134


> solidify misunderstandings

The key issue here is the very concept of "Apple turns your iPhone into a snitch", and I haven't seen any misunderstanding around that.


They can publish the dev design doc, the test plan, even the fricking source code, that doesn't change the basic fact which is local device scanning.

There is no misunderstanding of that fact by anyone with sufficient technical know-how. There is also no misunderstanding of what comes next once this Pandora box is opened.


Since hearing about this whole affair, one potential scenario popped up in my mind...

Currently, bad actors are randomly sharing illegal content in an unsolicited way to people who did not ask for it (1).

Let's imagine this happens to you. Suddenly, you have illegal content on your device you didn't ask for.

Maybe you are busy when the message with the content is received and you don't even know about it, or you think it's random spam from someone you don't know and just ignore it.

Then, the content scanner picks it up and notifies the authorities.

What will happen in this scenario?

(1) https://www.news4jax.com/news/2018/02/05/beware-child-porn-m...


Well, to be factually accurate, in this case with Apple, nothing will happen. Apple only reports when it matches 30 instances of this content. They have said they plan to reduce it from 30 as they improve their algorithm.
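From the user's perspective the threshold behaves like a simple counter, though Apple's published design uses threshold secret sharing so the server learns nothing at all until the threshold is crossed. A minimal sketch of the reporting rule, assuming the publicized threshold of 30 (the function name and structure are illustrative, not Apple's actual code):

```python
# Illustrative sketch of a match-count reporting threshold.
# The real system reveals match vouchers cryptographically only once
# the threshold is exceeded; this counter is a simplification.
REPORT_THRESHOLD = 30

def should_report(match_count, threshold=REPORT_THRESHOLD):
    """Return True once an account has accumulated `threshold` matches."""
    return match_count >= threshold

print(should_report(29))  # False: one match short, nothing is revealed
print(should_report(30))  # True: account flagged for human review
```

Which also illustrates the parent comment's point: a threshold is only a speed bump if an attacker can send you 30+ images unsolicited.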


> Well, to be factually accurate, in this case with Apple, nothing will happen.

To be factually accurate, the commenter didn’t mention a number. Sending 30+ images of anything to someone’s number is trivial.


True


I hate Apple, they're ruining computers.

1. Devaluing general-purpose computing.

2. Disposable devices with soldered-in batteries and no upgradeability.

3. Removing headphone sockets so they can nickel-and-dime with irritating dongles and ridiculously expensive wireless headphones.

4. Disallowing personal backups using your own key to incentivize the use of expensive icloud storage.

5. App-store corruption.

6. Dodging tax in my country.

7. A million examples of shitty hardware design and then blaming the customer somehow, eg "you're holding it wrong", dust-attractant garbage keyboards, overheating gpus, etc etc

What a shitty, penny pinching piece of shit company. The most expensive devices and they still want to squeeze their customers more.

Literally everything they do trades privacy/usability for money, and the cherry on the shit sundae is always insulting their customers intelligence by somehow claiming it's for their own good.

And now we get probably the very worst thing they've done:

8. Legitimizing on-device scanning and adversarial devices.

I dearly wish they would go out of business because they're ruining technology, something I've loved my whole life.

I never even bought one of their bullshit locked-down garbage disposable devices and they're still ruining things for me with their influence on the market.

Fuck you, Apple. Fuck you in every way. You're the worst hardware manufacturer and the worst software manufacturer.

Go to hell.


With this much bad PR (and Apple hates bad PR), why is Apple pretty much holding its position?


Because this is likely not about making money; it's about expanding surveillance. Once this becomes accepted they can ratchet it up bit by bit. Each time, the outrage will be less and less.

For example imagine creating a new tax for 1% of purchases. People would be outraged. Now imagine raising sales tax instead by 1% (ex 7% to 8%.) Sure some people would be upset, but it would be a smaller number than the "new" tax. The reason is because people are already used to paying a sales tax. What is "normal" is more easily accepted.

If Apple can get past the initial outrage (like when sales tax was implemented in many countries in history) they can increase the surveillance once scanning local files becomes the "new normal".


The reason Apple actually cares about the feature though is probably to maintain sales in China (which will require this kind of surveillance soon).


I love how people will jump through hoops to land on China even when discussing something that was conceptualized and first (only, as of now) implemented in the US.


You can't have a massive new fake war - the War on Domestic Terrorism - without ideally having new ways to track and surveil anyone (everyone) that is going against the machine politically. They need new capabilities and they're absolutely going to put them into place, asap. Did you see their reaction to the populist Trump circus? The globalists viewed it as a mortal threat. Everything they built since WW2 was in jeopardy as far as they were concerned. Their ability to launch more forever wars, endless foreign adventurism, obsessive (psychotic) superpower global meddling, all of it under threat from rising populism (represented by both Trump and Sanders) and a growing rejection of the entirely failed decades-long approach by the globalists that has led to the US being buried under nearly $30 trillion in debt and numerous catastrophic wars.

Just look at what these monsters are saying publicly on MSNBC, CNN, et al. They're telling everybody what they're planning to do, they're putting it right out there. Look at their framing: we're going to target the domestic population; they're de facto saying that. They couldn't be more open about the authoritarian nightmare they have in mind. They perceive their system as being under risk from populist revolt against them and their agenda, and they're launching a counter-offensive (the empire is striking back), it's under way right now. Bet on it.

It's quite simple: the culture of bureaucrats that have dominated Washington DC and its policies in the post-WW2 era acquired a position of power the likes of which has rarely been seen in all of human history. Now, ask yourself: do you really think they're ready to give that power up? That they're ready to stop treating the rest of the planet like it's their toy, to do with as they please? Then ask yourself: what do you think they're prepared to do to stave off risks to that power? The risk that they might lose their precious. And there you go, you have your answer. They're willing to do many of the same things other authoritarians have been willing to do. The US isn't Soviet Russia, that's not the suggestion; it doesn't need to be to start putting political opponents (fringe rabble rousers, whether libertarian or socialist - see: Clinton vs Sanders; that's globalist establishment vs domestic socialist) into prison and surveilling everyone in a desperate attempt to retain their power. That kind of power is an intense drug, they're addicts of the worst kind, and you can tell by how they behave in regards to the rest of the world, how they treat the rest of the world.

To ensure they keep their power, to keep their globalist forever war machine (and everything that goes with it), they need to put the boot on rising domestic risks to that power.


They put themselves into a position where backing out is also extremely costly.

a) it means they have to publicly admit that what they did was a bad idea, which will fuel another news cycle

b) backing out of it will get them criticized for "protecting pedophiles", maybe even the NCMEC (who privately praised them in the "screeching voices of the minority" memo) will now publicly criticize and shame them to get what it wants

c) now that they've put this idea on the table, there will be even more government pressure to mandate/implement it. Even if they back out of the implementation, just by bringing this proposal up, Apple may have just destroyed not just the privacy of their users, but of everyone.


The thing that strikes me as most suspicious is their insistence on keeping the client-side scanning feature while wrapping it in all sorts of supposed restrictions so it's essentially server-side scanning, just implemented on the client device for some reason.

If the problem is "We need to scan stuff on iCloud for CSAM" then "Let's build a client-side scanning agent and distribute it to hundreds of millions of devices and figure out a way to protect the hashes from the end user and then figure out a way to lock it down so it can't be used to scan other things even if we wanted to or somebody ordered us to" is a singularly (and suspiciously) inelegant solution to the problem.

Just scan the files on your own servers, like everybody else does. Anything else is rightly going to make people suspicious that you've got some ulterior motive.
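To illustrate the asymmetry the parent describes: the server-side version really is a few lines. A toy sketch, using a plain SHA-256 set lookup with a hypothetical hash list (real systems use perceptual hashes like PhotoDNA or NeuralHash so that re-encoded copies still match, which is exactly what makes the client-side variant so hard to lock down):

```python
import hashlib

# Hypothetical known-bad hash set (illustrative only; NCMEC distributes
# perceptual hashes, not plain cryptographic digests like these).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_upload(file_bytes: bytes) -> bool:
    """Server-side check: hash the uploaded file and look it up."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Nothing here needs to be hidden from the user, blinded, thresholded, or audited on-device, because the check never leaves infrastructure the provider already controls.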


I assume their goal is to turn on E2E for what gets stored on their servers.

Right now, they have to hand over anyone’s photo library if they are ordered to do so.

If they turn on E2E that will no longer be possible, and this will be strictly better and harder to abuse.

That doesn’t mean this mechanism isn’t offensive. It just is better from a privacy point of view for them to do this and enable E2E.


Notice how they haven't done that though. It would be trivial to do. Just change all of the private keys to ones supplied by the clients. But they haven't done that. Why would you expect them to do that when they are taking steps away from individual privacy? Leaving a hypothetical option on the table that costs them nothing to do means they actively do not want to do it. So why would they do it?
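The structural change being asked for (client-held keys) is conceptually small. A minimal sketch of the key-custody idea, with a toy XOR cipher standing in for a real AEAD scheme like AES-GCM (do not use XOR like this in real crypto); all class and method names here are hypothetical:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real authenticated cipher; illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class Client:
    def __init__(self):
        # Key is generated on-device and never uploaded.
        self.key = secrets.token_bytes(32)

    def encrypt(self, photo: bytes) -> bytes:
        return xor_cipher(photo, self.key)

    def decrypt(self, blob: bytes) -> bytes:
        return xor_cipher(blob, self.key)

class Server:
    """Stores only opaque ciphertext; cannot satisfy a content demand."""
    def __init__(self):
        self.store = {}

    def put(self, name: str, blob: bytes) -> None:
        self.store[name] = blob

    def get(self, name: str) -> bytes:
        return self.store[name]

client, server = Client(), Server()
server.put("photo1", client.encrypt(b"cat picture"))
```

The server ends up holding bytes it cannot read; only the client can recover the plaintext. The hard parts in practice are key backup and multi-device sync, not the encryption itself.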


They haven’t shipped this yet.


It’s quite simple: they would like to store photos without being able to see them, only knowing they’re not CP. If they scan on the server, they need to be able to see all photos, and eventually hand them over to any requesting authority.

So the reasonable approach is to make clients that only upload photos that are not CP. The problem is that the implementation of that feels creepier than they anticipated.


Potentially something like a national security letter and they can’t reveal the real reason behind it. I agree this feature is off-brand for Apple and came out of the blue.


What would happen if Apple reveal the national security letter? What’s the government going to do, arrest Tim Cook?


Is that consistent with the amount of time this feature took to implement?


Maybe this is the warrant canary

Let's push that idea so allies in the government can go looking and void it.


Maybe they're telling you to run.


Their handlers have them on a tight leash.


They're convinced that it's worth the benefit to abused children to let the tech community vent fears about the world at them for a while. I don't really blame them. My views don't align with the mob here, so it would be a waste of my time to discuss them while y'all are upset and angry about this; best just to hold my position, remain silent or reserved, and wait for the pitchfork mob to find another target or exhaust itself. I'm honestly surprised Apple even commented at all.


> I'm honestly surprised Apple even commented at all.

I think they commented because they expected a brief storm of outrage that quickly dies down, but instead it has become a growing wave that is spilling into mainstream media, with a mostly negative reaction, and threatening to completely destroy their "Privacy. That's iPhone" mantra that has been the core of their marketing campaign this year.


I wonder what the other countries will be and whether they will be disclosed?

My bet is on Canada, UK, Australia, and New Zealand.

https://en.wikipedia.org/wiki/Five_Eyes


Translating from corporatese: "Apple hasn't given up the idea and is looking for a better excuse to push their surveillance solution."


I think the most reasonable solution to this problem is to not purchase Apple products for a few years, given how they have responded to criticism and promoted the characterization of privacy advocates as "screeching voices". I was thinking of upgrading my iPhone, given it's getting pretty old, but I can't really see myself doing that now.


It should be obvious by now that Apple has been coerced into building this scanning system, and the CSAM scanning is just the front for a much more broad "national security" backdoor. Apple's PR team has surely been shitting bricks for the entire week--this is the worst PR debacle they've had since the iPhone 4 Antennagate in 2010.


Completely agree. A friend of mine asked what was happening, and I just said "it's a big deal, biggest thing in a long time". Antennagate is a good comparison -- though this seems bigger. Impacts of Antennagate long term seemed minimal.


This is way bigger than any RF engineering mishap.

I’m going to actively evangelize alternatives to Apple devices and that’s coming from someone who has been doing Apple evangelizing since OSX Panther days. Tens of thousands of dollars of bought devices and services not including all the people I convinced to make the switch over the years.

I’m sure I’m not the only one in this regard.


Sadly this will likely have little to no impact on Apple long term either. I too agree this is coming from the government. They have found their loophole of using "private industry" to do all the dirty stuff they cannot directly do.


Given that privacy has been Apple's key selling point in their ad campaign, and now they've completely subverted that and mainstream media is picking it up and people are noticing, I suspect this will impact them far more than they expected.

While Apple deserves all the fallout they're getting and more, I'm disappointed that the NCMEC that pushed for this isn't also receiving more scrutiny and criticism. Multiple people have now pointed out that their database contains false positives, which is absolutely terrifying. Completely legal, harmless, no-nudity-no-humans pictures can get your life ruined. The truth coming out later doesn't matter when your home gets raided and it slips out that you were caught sharing multiple images that matched hashes from the NCMEC child porn database.


To everyone looking for alternatives: there are quite a few strong ones, on a spectrum of complexity/privacy/convenience.

Android probably is not much different. But, and this is a big one, Android devices are not tied to Google's stock OS and are way more open than Apple's. This is because of Apple's tight grip on their hardware ecosystem and lobbying against right to repair. Also, the abundance of Android phone makers means you get features at a variety of price points.

The OS options are LineageOS (my daily driver), CalyxOS (with more Google compatibility via microG), and LineageOS for microG.

Further from Android, there are Ubuntu Touch (very active), postmarketOS with multiple frontends just like on the Linux desktop, Manjaro's mobile editions, etc.

Hardware-wise, there are the PinePhone and Purism's Librem 5. The more people pay for these, the more money there is for the competition.

Apple is already sitting on a staggering $200 billion. $200 billion.


That's just a smokescreen hiding the real problem, which is allowing monitoring at all, regardless of where, when, by whom, and for what reason.

I would also like some real numbers on child abuse cases compared to drug/domestic/gang/organized/etc. crime; in other words, how much each impacts society. Want to bet that, worldwide, far more ordinary citizens are abused by the police every day than old men are caught wanking to a kid picture? When does the scanning of cops' iPhones to catch the "bad apples" start?

I'm very serious. Child abuse is a cancer and must be stopped, but if they're employing all this technology for the one and not the other, well, dear Apple, I want to see some numbers that would justify the choice.

And don't reply "even saving only one child is worth... yadda yadda yadda." You know what I mean.


Conveniently, the other participating nations are all members of Five Eyes. Probably.


I have a 4" iPhone SE. Love it, but Apple has continued to push me away these last four years. I no longer use any other Apple products except for this 4-year-old phone. What alternatives are there for small phone fans? Sony used to make an Xperia Compact that was nice, but I think they discontinued it.


I think there's an out of tree kernel for the SE. You could give PostmarketOS a shot if you have a lot of patience.


It's a policy decision, while the technical one is still there, unchanged.


Still not good enough. That’s like an invading army saying they have a “kinder, gentler machine gun hand.”


> The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.[1]

[1]: https://www.apple.com/customer-letter/

Chilling indeed, Tim, chilling indeed.


lol, so first it's "at least 30 images", then it's "multiple nations" (how many? which pairs are allowed?)

Soon your pictures will be subject to a 3-week review period to decide whether they're fit to be scanned.


Wrong answer, Apple.


[deleted]


That paper is being discussed at https://news.ycombinator.com/item?id=28173134.


They're willing to kill the platform over this? What's going over there on the west coast?


They know they can essentially wait out the public outcry, plus they can even buffer things a little by calling their critics "confused" and introducing measures that don't address anything but sound reasonable to laypersons, in an effort to distort the conversation.

The platform is not in danger. But if it were, then yes, I think they'd still risk it. It's either that or getting shut out of markets gradually by law enforcement and governments.


> It's either that or getting shut out of markets gradually by law enforcement and governments.

How come that's not happening to desktop Linux then?


Desktop Linux in no way compares to iOS market share, and even if it did, it'd require a different type of attack. Apple is vulnerable precisely because it is the one single vendor in complete control of anything related to the iPhone.

Personally I do think it's likely we'll see a successful anti-Linux smear campaign and/or laws in the next few years, but I have to admit that's a pretty pessimistic take.


I wonder if anyone would've really cared (or at least cared this much) if they had just nixed any plans for end-to-end encryption of iCloud as a whole and done the scanning server-side. I probably wouldn't have started looking at other options, personally.

It's somewhat funny to me that switching to a Mac (and eventually to multiple Macs) is what got me thinking I could probably be fine just using Linux for my day-to-day computing that doesn't involve gaming (I have a Windows desktop for that), and now this has me thinking "yeah, I can pretty much make the switch and realistically not feel like I'm missing out on too much anyway."


Trust lost last week. This doesn’t matter.


Of course it matters: if Apple does a 180° reversal I'll applaud and buy more Apple products. I want companies to be able to acknowledge mistakes unequivocally and fix them.

If Apple says "we heard the backlash, we're sorry, we'll never do it again", I'll be more of a supporter than I had been before.

Until they do, trust lost.


I wouldn't say "this doesn't matter", I'd say "this makes it worse" because they're reaffirming that they want to push forward with this despite the criticism.


I’m saying it doesn’t matter to me because I’ve moved out of their ecosystem. I’m done.


We don't want it pushed back, we want it dead in the water.


My problem with all of this is not some abstract slippery slope of "what happens if China starts insisting that Tank Man be added to the hash list" or whatever.

It's the practical scenarios where people in the United States have previously been indicted and their lives ruined over "child pornography" or "statutory rape" that stretches the definition to the breaking point. Young men just barely 18 years old have gone to jail because they have taken pictures of their technically under-aged girlfriends. Often their real crime is that they've simply upset someone rich and powerful, or that they were guilty of being black and dating a blonde white girl. I wish I was being facetious, but this happens all the time.

More importantly, in this digital age, many stupid teenagers upload nude pictures of their under-aged girlfriends to the Internet. Sometimes as revenge for the girl breaking up with them, sometimes because they "hacked" a girl's account and downloaded their selfies, or they legitimately have the photos and just wanted those sweet internet karma points.

Imageboards like 4chan are full of what is technically CP, even though nobody was directly harmed in its creation. Other stupid teenagers download these photos, as do older paedophiles that trawl these websites looking for CP. Some of them will share it with other paedophiles, internationally even.

So what happens if one of these paedophiles gets arrested for a serious child sex crime? Their PCs and their phones will be searched, and the pictures will end up on the CP hash database, of course. Even with this "multiple nations" requirement, that just requires two arrests for the hash to be added. Nothing really changed, there's just a delay.

Now what will happen to the stupid teenagers that downloaded the same photo? Teenagers that aren't child molesters! Teenagers that aren't paedophiles!

Apple will probably say: No worries, we filter out the phones belonging to under-aged children from the results.

... until they turn 18.

Get it? This is a legal landmine in the pocket of every stupid kid. YOUR stupid kid, that looks at porn on the internet, just like every other stupid kid.

This is just one scenario where this could lead to a false positive that totally and irrevocably destroys someone's life. I can think of several more, and dozens once you start adding the political pressure from various other nations.

There is just no way to make this kind of dragnet surveillance safe. Even a tiny false positive rate is unacceptable when there are hundreds of millions of people playing this distopian lottery.


This is not a risk unique to Apple’s program to scan for known CSAM. Every cloud provider is susceptible.

Furthermore, I am fairly sure NCMEC’s due diligence process involves confirming the depicted individual was abused.


NCMEC and similar groups call all child pornography CSAM.[1] Including 17 year olds sexting each other. And several people said NCMEC's database includes legal images.[2][3] What made you fairly sure NCMEC's process was more rigorous?

[1] https://www.missingkids.org/theissues/csam

[2] https://www.hackerfactor.com/blog/index.php?/archives/929-On...

[3] https://news.ycombinator.com/item?id=28069844


This is a highly informative comment. Thanks!


This perfectly illustrates the problem: the content being scanned for can change arbitrarily and on a whim. Gun, meet foot.


China will easily get multiple countries to flag content as problematic. Values are not reflected in UN votes, either.


In before China starts diplomatically leaning on "multiple nations" to flag dissident imagery for review.


We could probably increase the vaccination rate if we snooped on peoples' phones to find out if they are participating in vaccine-hesitant behavior, and then rate-limited traffic to websites and apps that peddle this dangerous misinformation.

How many people have to lose their lives in the service of some pedantic idea of "privacy"? It's a computer looking at it, it's not even a human person.

I think tech companies need a Hippocratic oath similar to "first, do no harm". Apple should not be engaging in misinformation trafficking, and should at the very least be working to minimize harm by preventing people from falling victim to dangerous, unsubstantiated, and un-fact-checked information. This is especially important when our elected officials use the considerable power that has been gifted to them by the people to put people's lives in danger by spreading dangerous misinformation.

What role did Apple's inaction on this have in the pandemic? In the January 6th insurrectionists' attempt to overturn a validated, secure, and duly certified democratic election? What role did Apple's inaction play in the attempted kidnapping of the governor of Michigan?

By refusing to help, they are partially responsible for these things. It's time for us to demand that they do their fair share of helping. Inaction is itself an action.

--

This is sarcasm, of course. For how long?


If they cared about privacy so much, why wouldn't they just alert you and block you from uploading in the first place?

By becoming such a hot topic, I think this will push the activity further underground.

I bet most people didn't even know that other cloud companies already scanned for CP.


Suppose we trust what they're saying: what proportion of child abuse photos is actually flagged in "multiple nations"?

How many children is this going to protect, to weigh against the loss of privacy across the whole userbase [in US] [for now]?


Am I missing something, or won't that just let more pedos get away without giving users ANY more privacy?


Multiple nations being Five Eyes nations? Is that supposed to make us trust them more??


Multiple nations? USA + Canada? China + Hong Kong? Russia + Belarus? In practice, this statement means nothing good.

It does mean that Apple is able and willing to manage country-specific blacklists.


NSA agent 1: So we need Apple to scan for this image on all phones, but they won’t do it because only we want it.

GCHQ agent 2: No worries, we’ve got you covered; we’ll add it to our list as well.


Just out of curiosity, would it be possible to fight this back by creating a huge number of false positives without involving actual CSAM content?
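In principle, yes: perceptual hashes throw away most of the image by design, so visually distinct inputs can collide. A toy average-hash sketch illustrating the principle on tiny grayscale "images" (NeuralHash is a neural network rather than a mean threshold, but it is likewise a many-to-one mapping):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, above/below the mean.

    Real average hashing first downscales the image to a small grid
    (e.g. 8x8) of grayscale values; here we take that grid as given.
    """
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

# Two different pixel grids with the same above/below-the-mean pattern
# collide, even though their actual values differ substantially.
image_a = [10, 200, 10, 200]
image_b = [40, 250, 60, 220]
assert average_hash(image_a) == average_hash(image_b)
```

Whether such collisions could be mass-produced against Apple's specific system is a separate question, but the lossiness that makes perceptual matching robust to re-encoding is the same property an attacker would exploit.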


What is a reasonably “Good enough” alternative for an iPhone that does not tie into Google services?


A backdoor is a backdoor.


Start with x and settle on x/2.



