And... Apple misses the point of the criticism completely.
The problem is the capability, not what it's used for. Any such capability will be abused for new use cases, be it terrorism, drug trafficking, human trafficking or whatever. Plus there will inevitably be unauthorized access.
The only way to prevent all this is for the system not to exist.
I don't buy into theories that Apple is being pressured or coerced on any of this. I believe it's far more likely this is just tone-deaf but well-intentioned incompetence. It's classic "why won't anyone think of the children?" and we've seen it time and time again with encryption backdoors and similar.
The big question for me is how and why Tim Cook and Apple's board signed off on a plan to do huge damage to Apple's reputation and user trust. If they didn't know, it's a problem. If they knew and didn't realize the inevitable backlash, well that's a different problem.
> The problem is the capability, not what it's used for. Any such capability will be abused for new use cases, be it terrorism, drug trafficking, human trafficking or whatever. Plus there will inevitably be unauthorized access.
Apple even says it themselves[1]:
> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.
I can't wait until /child-safety 404s or redirects to /safety and there's a wall of marketing blurb (possibly only in Chinese at first) that explains how 'national security' concerns are reported to the CCP.
This has totally pushed me over the edge, though I'll admit I was oblivious to begin with. My plan is to replace the MacBooks with a Thinkpad P15 gen 2 running Ubuntu and replace the iPhone with something running Ubuntu Touch (Volla Phone, Fairphone, OnePlus One). Screw not having control.
As another commenter stated, if they actually cared about this issue, they would have done what all the other cloud providers have been doing for many years, and what everyone opposed to this wants them to do - just scan the cloud. That's effectively all they're doing right now, only with the scanning happening on a person's private device.
Same here, except that I'll be getting rid of MS and Google. That means CalyxOS on my Pixel 2 and some version of Linux on my private ThinkPad. Once I find time, that is, which to be honest can take a while. I think the last time I used my private laptop was over a month ago. Phone is different, but again, time constraints. It will happen though.
I'm in the same situation in terms of tech debt against my personal infrastructure. I have plans to migrate my personal (currently Ubuntu) server to use Proxmox on top of selling the current machines & buying new non-Apple ones for this. The evening and weekend time is tight.
There is another option - don't put anything sensitive on your phone and basically treat it as if an adversary already has root access. This is pretty much what I've been doing for a long time. All the sensitive information (e.g. banking) is on a Linux desktop and there are no logins or apps on the iPhone or work laptop (HN is not sensitive ;) ).
I started a thread on what specific actions we can take to avoid Apple and other big tech in future. While it did not gain much traction, I am hoping there are at least 1,000 people in the world who think like me and over time we will be able to influence more people.
I also note that the /child-safety page seems to be accessible only via outside search or direct links. There is no corresponding press article. This thing is just floating in the air somehow…
Even if you trust Apple and NCMEC not to add out-of-scope hashes that some government or law enforcement or intelligence agency “asks” them to, does anybody who's ever worked for Apple have any doubt at all that they'd use this to check employees' personal devices for Apple IP? Especially if a big, spectacular leak hits the media? Apple's IP enforcement goons are legendary. And not in a good way, in most people's opinion. Particularly Gizmodo's…
Moreover, things could become retroactively bad. If tomorrow multiple countries decide to change the minimum age of adult performers from 18 to 19, a bunch of pics legal today could become illegal tomorrow, leading to wide dragnets enabling law enforcement to go through everything you have.
I just have to wonder who green-lit this idea to begin with. Every single other company is riding a post-COVID tidal wave of sales. As WFH becomes a reality for so many workers, Apple products are positioned to really take center stage for a massive segment of consumers...

And then this. Apple intentionally injects uncertainty and controversy? What were they thinking?? Sure, it's just phones right now, but I'm sure Mac users are wondering about workstations and laptops. The damage control being spun right now is absolutely overwhelming.

I'm also surprised to see no other players like MS or Google rushing to take advantage of the outrage. Even players like Purism seem to be ignoring the event.
>>I just have to wonder who green-lit this idea to begin with.
I suspect, with no inside knowledge, that there's someone high up in Apple who feels very, very passionately about child abuse/these sorts of images due to some personal connection, and they are championing the idea internally. And tbh not the worst bugbear to have, but obviously this isn't the way to address it.
As for why the big players aren't jumping on this - it's a tricky line to walk, you don't ever want to be seen as pro child abuse.
Having read the coverage and the documents that Apple has been putting out to manage the … what, crisis? Backlash? … I’ve been thinking about the design of the system and how I would want CSAM to be detected, if different than this.
Honestly, I’m pretty impressed by what this approach accomplishes. Your comment says this obviously isn’t the way to do it— what’s better than this?
In a legal context where detecting CSAM is a strong requirement, what’s preferable to this approach?
>>>In a legal context where detecting CSAM is a strong requirement, what’s preferable to this approach?
It's not a strong requirement though? Only if the company sees it (unencrypted) do they need to report it; detecting it on-device is wholly different. It would honestly be better if they adjusted their TOS and started scanning the files on upload (server side). They have the encryption keys already.
Apple is planning to install on all (recent) iPhones software that will let them scan for any image similar (using perceptual hashing) to some set of images. Right now, that's CSAM, and only on upload to iCloud. But what do you want to bet China is salivating at the idea of including images of Tank Man and other "seditious" content? And maybe checking images anytime they're added to the phone, or sent to someone else, not just up to iCloud?
It completely ruins the idea that Apple has your privacy/security in mind.
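To make the matching mechanism concrete: Apple's NeuralHash is a learned perceptual hash, which is much more sophisticated than anything shown here, but the core idea can be illustrated with a toy "average hash" scheme. Everything below is an illustrative sketch, not Apple's algorithm:

```python
# Toy illustration of perceptual hashing (average hash). This is NOT
# NeuralHash; the point is only the matching idea: visually similar
# images produce hashes that are close in Hamming distance, so small
# edits (brightness, re-encoding) still match a database of known hashes.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    where each bit records whether that pixel is above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(candidate, blocklist, threshold=4):
    """A candidate 'matches' if its hash is within `threshold` bits of any
    hash in the blocklist -- this tolerance is what makes the scheme
    perceptual rather than exact."""
    h = average_hash(candidate)
    return any(hamming(h, bad) <= threshold for bad in blocklist)

# A fake gradient "image" and a slightly brightened copy of it:
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
brighter = [[min(255, p + 10) for p in row] for row in img]

db = {average_hash(img)}      # the "known bad" hash database
print(matches(brighter, db))  # small edits still match
```

The tolerance is what cuts both ways: it's what lets the system catch re-encoded copies, and also what makes deliberate collisions and false positives possible.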
Even if scanning for the content is not explicitly required by law, what happens when a pedophile ring hoarding thousands of images of CSAM on iCloud is busted, and the news gets out? The article at [1] makes it sound like Apple choosing not to scan any of its users' videos in iCloud was seen as evidence of Apple lagging behind the status quo of companies like Facebook that were proactively reporting CSAM.
According to that article, in the last year Apple submitted only 265 reports to the NCMEC while Facebook submitted 20 million. Would law enforcement believe they'd be missing out on catching abusers after seeing this disparity?
If a company is found to allow criminals to store CSAM on their servers for extended periods of time, the law is going to want to know why they let it pass, irrespective of the extent the company chose to scan for it. Apple probably doesn't want to deal with that fallout, so maybe they figured that being proactive about scanning for CSAM in a way that could enable the use of E2EE wouldn't hurt, and that pushing the privacy narrative would satisfy enough people - which it didn't.
>>>If a company is found to allow criminals to store CSAM on their servers
I'd bet dollars to doughnuts that AWS, GCP, Azure, et al. have terabytes of CSAM stored within their data centers - because users have uploaded encrypted files and the companies (rightly) don't have the keys. I'll also bet there are many WD hard drives full of CSAM, many Linux servers hosting it, many nginx or apache installs serving it up, etc. Should we mandate that all hard drives scan files as they're written to see if it's CSAM? Or maybe nginx should alert law enforcement anytime a CSAM image is served.
Apple should have actually let their users have E2EE or given up on that and just scanned stuff server-side.
How much of an impact do you think this will have on the non tech savvy user though? Most people already don't care about ad tracking, location tracking, etc. I think your standard user will just shrug this off and go "eh, I have nothing to hide and it's good for the kids"
I don’t like that I have to resort to this, but I was going to mention Proles and point to Wikipedia, except that Wikipedia deleted the article about that topic. Yet I can see the article’s blurb in a search engine’s cache! Everything is so meta nowadays…
Sooner or later the copyright enforcement companies will find a judge who's willing to rule that since Apple has the capability, they have to use it, even though this system is aimed at still images only, no sound or video. Then Apple will end up fighting it for years if they're lucky and it doesn't get a quick trip up to a Supreme Court that's drastically Hollywood-friendly. The legal difference between mandating that a company create a brand-new scanning capability and "just applying" one they already have is significant.
Apple doesn't have that capability though. There are countless ways to get copyrighted material onto your phone that are perfectly legal. Apple would have to be able to decide whether you legally own your files. With how the IP system currently works, that's impossible. For CSAM this isn't a problem, because we assume that there is no legitimate way to have CSAM files on your phone.
Or take a photo of CSAM on display on another device or printout using your iPhone camera, which is often accessible from the lock screen without a PIN.
> I don't buy into theories that Apple is being pressured or coerced on any of this.
Of course they were.
It’s well known that Apple chose not to introduce E2E encrypted iCloud backups back in 2017/18 or so, due to FBI complaints.[1]
This is clearly Apple’s play to be able to introduce that again and tell law enforcement “Look, we ‘thought of the children’, if you want further access to our customers data, you’re going to need to come up with a better justification than that.”
If Apple pulls that off, adopting client-side image scanning with this quite impressive privacy-preserving system behind it and then E2E encrypting everything they upload to iCloud, that’d arguably be a very big win for Apple customers’ privacy.
Whether that’s an acceptable trade-off for having a device I purchased run code I didn’t ask for and don’t want, monitoring whether or not I’m a paedophile and possibly snitching on me over false positives or bad-faith additions of non-CSAM hashes to the database, is still a good question.
> The only way to prevent all this is for the system not to exist.
I've been having trouble following this argument over the last week. Isn't it clear that the capability already exists? Whether or not Apple goes through with its CSAM plan, the capability is evidently there.
In other words, since Apple is a closed system, the capability was there before they announced the CSAM plans. Their announcement has changed nothing about capability other than reminding people that Apple has privileged access to the data on iPhones.
I guess the question is, if Apple does the CSAM program does that make them more likely to cave to government pressure to search for other things? And to do so without telling users?
Once a certain piece of technology exists, courts can order that it be used a certain way other than the way it was intended.
Neither courts nor legislative bodies are capable of explicitly requiring the invention of technology that doesn't yet exist, although in some cases they could make it an implicit requirement of a law.
At least in the US most lawmakers are varying shades of technologically illiterate and don't have a grasp of what is possible or not possible. If technology doesn't exist yet, you can say to them "this is impossible", and with enough of a bribe, they will believe it.
Apple, by building this technology, has put it on the map and announced that it is both possible and ready.
It's a very ill-considered argument, because Apple obviously already has the capability of pushing literally any conceivable software update to iPhones. Apple (and thus iPhones) already have the capability to do anything you can conceive of, regardless of how terrible. For instance, iPhones have the capability of activating the cameras at all times, detecting when they capture compromising or embarrassing video, and uploading those videos to a public website along with your name and contact information. The only thing preventing Apple from doing that is their own policies.
If you think it's bad that Apple can push software updates to iPhones, then by all means make that argument. But "the only way to prevent bad changes to this system down the road is for this first version of the system to never exist" is a very poor argument, given that the only "system" that needs to exist is the ability to push software updates, and that system existed long ago.
That isn't the argument at all. What Apple can't have happen without major reputational damage is get caught doing it without telling its customers.
Thus this is seen as a marketing backdoor to the actual backdoor. This is seen as a ploy that later gives Apple the plausible deniability argument that the government is making them do it (other forms of content) even though they created the means (on device scanning) for the government to exploit.
Why limit it to content that will be uploaded to iCloud? Why limit iMessage scanning to kids? Why not also let the government know? Why limit it to child porn since the system works for any image? Why limit it to 30 hits?
Imagine the reputational damage to Apple if say in 2 years it was discovered that they had been scanning all photos on your device and that the matches had been going to the government and that they didn't tell you. Their privacy marketing angle would be completely toast. Imagine if they did this to phones outside the US.
Any sane marketing person would try to find a way to sugarcoat it first.
> But "the only way to prevent bad changes to this system down the road is for this first version of the system to never exist" is a very poor argument, given that the only "system" that needs to exist is the ability to push software updates, and that system existed long ago.
Apple relied on that argument 5 years ago successfully.[1]
There is a qualitative difference between tweaking an existing code base (expanding the scope of this detector) and coercing a company to dedicate teams of tens to hundreds of engineers over years to create a brand new code base from scratch (writing this detector in the first place.)
There shouldn’t be a single line of code, or single binary blob, on my device that can compute these photo DNA hashes.
Apple could always go ahead and add that functionality in an update, but then there would be a big backlash and the opportunity to not update or switch providers.
> Apple could always go ahead and add that functionality in an update, but then there would be a big backlash and the opportunity to not update or switch providers.
Wouldn't users have the ability to not update or switch providers if Apple expands the scope of its search beyond CSAM?
I guess I don't see the huge difference between the surveillance code existing on the phone but not used for objectionable purposes versus the line of code sitting in a different branch in git and not deployed on actual phones (yet).
I'm completely against this move by the way -- I'm not trying to defend it. But I want to be able to argue effectively against it.
This is like renting a house and the owner saying "we installed a camera in your house, but we'll only look at the video, if someone screams 'help'"
Then someone says "what if they change it so it sends the video if someone says 'do you want some pot?'"
And your argument would be "wouldn't you just move if they did that?" - this argument is pointless, because it's not the "help" part that's bothersome, it's the always-recording camera in a black box, for which you never know what it's recording and what it's sending to Apple.

Let's say hypothetically they do this, and modify the database to trigger on the "tank man" photo, and give that photo such weight that it triggers manual review by the CCP observer (remember, Apple data for Chinese users is hosted in China) - you'll never know the image got flagged, you'll never even know that it was sent anywhere, but your social score will go down and you might "vanish" during the night.

Imagine some new WikiLeaks happening... 15 journalists around the world get the data, the NSA/CIA/FBI has hacked one of them and found some photos of the USA doing something bad... add those photos to the database, have Apple scan all the phones, and you find the other 14 journalists, and possibly even the leaker who took the original photo. If this is expanded to MacBooks (which it probably will be), even worse, because people keep more sensitive stuff on their computers than on their phones.
And all this for what? Child molesters and rapists (the worst of the pedos) create new photos and videos, that are in none of the databases, so you never catch them. Their direct sponsors (paying customers) get the photos first, and again, before they're in the database. So all you've caught is someone browsing 4chan.
> And all this for what? Child molesters and rapists (the worst of the pedos) create new photos and videos, that are in none of the databases, so you never catch them. Their direct sponsors (paying customers) get the photos first, and again, before they're in the database.
I hadn’t considered that. This would seem to create a market incentive to abuse more children and create a greater variety of new abusive content. Has that been considered at all?
No, because they’re not scanning for CSAM, they’re scanning for hashes. That hash database needs to be updateable for it to be useful at all. A significant part of the policy protections are implemented off device, so those can’t be trusted.
The only explanation that makes sense to me is that they are doing huge damage to their reputation and trust for bigger reasons. Perhaps there are international or national security issues they are told to comply with that are not quite in the public view yet.
> It would be a perfectly logical order from the power structures Snowden exposed.
Snowden exposed that Apple was actively part[1] of the PRISM data collection and surveillance program targeting Americans.
Apple also voluntarily gives up customers' data when requested by the US government, for about 120,000 individuals a year[2]. They also hand over the data for over 31,000 users/accounts in response to FISA and NSL requests in a six-month period[2].
> eta: It makes way more sense to me than Apple actually thinking this was a good idea.
Based on the company's own messaging, they simply seem proud of the project and find it ambitious[3]. They're also excited about the project's expansion and evolution[3].
They do seem proud, but one of these things makes less sense than the others, given Apple's prior reasoning.
We don't know what goes on in the secret courts these days but we have been assured watchdogs are in place and certain programs are no longer in use [1]. This is all speculation based on past behavior, but I assume new reasoning has been constructed that passes watchdog's interpretation of the law but requires new hoops for three-letter agencies to jump through to get the mountains of data they so yearn for.
Maybe they want to stop giving up customer data, and doing this on the phone is the only way to do E2EE backups - can't hand over what you can't decrypt.
Seems like a reach to me. The government doesn't just care about CSAM. They care about terrorism, human and drug trafficking, gangs, organized crime, fraud, etc.
That might make sense if CSAM detection is just the start, and they plan on detecting all those other things, as well.
That would arouse suspicion as soon as someone (perhaps someone inside the company) figured out what was going on. Here, a key person introduces the idea, says it's "for the children," and people are baffled but the masses will forget about it, even come to accept the logic, soon enough.
They got what they wanted. It did sting a bit, but it was calculated.
They won't suffer any massive losses over that, they have the moral high ground among non tech people, who mistakenly believe that they're doing something to protect children.
> The only way to prevent all this is for the system not to exist.
I'll go one step further. Apple should have implemented a system that makes these kinds of backdoors impossible. I don't know how or if such a system is possible, but given Apple's track record, they are in a position to attempt it.
Unfortunately, I don't think this has been anything more than a small blip to Apple's reputation. If I were not on Hacker News (like 99+% of Apple users), I would have either not heard of this or not given it a second thought.
Bad press on tech sites is the worst thing for a tech company's reputation. They're the people other people go to when deciding what to buy.
When Uncle Bob asks the family computer engineer about the pros and cons of a given platform and hears that one of them scans your device and a false positive could get you arrested for child pornography, Uncle Bob may develop an aversion to that platform.
> The big question for me is how and why Tim Cook and Apple's board signed off on a plan to do huge damage to Apple's reputation and user trust. If they didn't know, it's a problem. If they knew and didn't realize the inevitable backlash, well that's a different problem.
Comments like this make me wonder how in touch HN is with the average person because I genuinely don't believe most people care or even know that this is happening. I may be wrong but Apple could've pretended this backlash doesn't exist and did nothing about it and they would lose nothing for it.
I agree regarding the average person, but I’ve forwarded the EFF article to friends who are surgeons, electricians, firefighters, etc., who wouldn’t know what HN is, and the reaction has been a universal “wtf, Apple is scanning all my photos?”. I guess the question is how big the angry minority is.
I think a lot of people blindly trust Apple as a reputable corporate entity and this definitely shakes that trust.
Their promise is completely, childishly unrealistic; the person who said it should feel ashamed. Does anybody believe they will refuse to use this in <country> if the government demands it, when it’s a few button clicks of work to do, when market access is at risk, and with Apple’s track record? Or that two countries can’t collaborate to save face while doing this? This promise is so unrealistic that it’s just a lie.
This is the time to make a donation to a FOSS project related to the Linux desktop. Off the top of my head, there are Debian, Ubuntu, Mint, GNOME, Plasma, the FSF, Wine, and other projects that could use contributions of code or cash. It adds up, and could help one of them hire an extra full-timer if enough people set up recurring donations.
I wonder if "we won't use it for anything else" is really any less credible than other vendors who simply aren't talking about any scanning they may or may not be doing.
We know TV makers scan what you watch, even from other inputs, for ad targeting. This sort of capability is already widespread and basically a commodity - who else is using it that hasn't told us? And can we do anything about it short of only using hardware and services where trusted parties have verified the source?
Perhaps the result of internal stakeholders trying to hold on to any vestige of a massive project they've put a lot of effort into. I doubt they want to see it scrapped, even after admitting to themselves it was a huge error.
Sooner, rather than later, I think adding this to all devices will be mandated, by law. Apple can intend to do only good things with this, but governments can bend and twist things very easily so that a beneficial idea becomes dystopian. Regardless of intention, Apple still has to answer to government, not the other way around.
This article misses the point completely. It was never about what they were looking for, it was that they were looking at all. The quiet part is, once they get acceptance of the snooping using the bogeyman of the day, it will eventually encompass other behavior up to and including political dissent etc.
There is also the issue that closed-door arbitration of heinous crimes is the perfect tool to put away dissidents. A very convenient excuse to avoid proper checks and balances: "No, you cannot verify, because you aren't allowed to look at the illegal material that proves their guilt."
Is this actually true? If I was arrested for an image on my phone, would no one in the judge/jury team be allowed to see the image that was flagged? I understand it’s allegedly contraband, but if I were a judge I don’t think I’d take an algorithm’s word for it.
> it’s allegedly contraband, but if I were a judge I don’t think I’d take an algorithm’s word for it.
The prosecution can hire more expert witnesses with excessive credentials than you can, and they will explain how there's a one in one trillion possibility that the system is wrong, and that the defendant is assuredly a monster.
Juries eat that up when it comes to bogus DNA, bite mark, fingerprint, or other forensic evidence claims. Most people think computers can't be wrong or biased, and people's perceptions of what can be deemed reasonable doubts or not seem to shift when computers are involved, or when smart, credentialed people tell them their reasonable doubts aren't reasonable at all because of that one in a trillion chance of the computers being wrong.
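Some base-rate arithmetic shows why headline error rates like "one in a trillion" deserve scrutiny. The sketch below uses illustrative assumptions (account count, alternative error rates), not Apple's published figures, apart from the one-in-a-trillion per-account-per-year claim itself:

```python
# Back-of-the-envelope false-positive math. All inputs are assumptions
# for illustration; only the 1e-12 figure echoes Apple's headline claim.

accounts = 1_000_000_000      # ~1 billion iCloud accounts (assumption)
p_false_flag = 1e-12          # claimed false-flag rate per account per year

expected_false_flags = accounts * p_false_flag
print(expected_false_flags)   # 0.001 -> roughly one false flag per 1000 years

# But if the real-world error rate were off by a few orders of magnitude,
# say one in a million per account per year, the picture changes entirely:
p_realistic = 1e-6
print(accounts * p_realistic)  # 1000.0 falsely flagged accounts per year
```

The point isn't which rate is correct; it's that a jury hearing "one in a trillion" has no way to check the assumptions (hash collision rate, threshold, database quality) that the number depends on.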
I agree. Further, is there any doubt that these “CSAM”² hash databases are already shared between countries? If there was doubt, there shouldn’t be: agencies do share across borders.
² - the reason for the scare quotes is that I have first-hand knowledge of non-CSAM content being in NCMEC’s database (most likely via data entry errors, but I can’t be sure).
Ha! Come to think of it, I think that was one example. The main examples that came to memory (it’s been almost 8 years since I was involved) when discussing this last week were essentially extremely common photos (like the stock Windows XP background, among others).
And, just introducing this idea can make the concept real, even if Apple cancels this plan tomorrow. Country X could introduce a law saying “all companies that sell phones here must scan and report for these hashes” and based on recent trends, the companies will just roll over and say Uncle.
This is by far the biggest misstep of Tim Cook’s tenure at Apple thus far.
If they don’t kill this program soon, it’s going to overshadow the entire upcoming iPhone event, and will follow Apple around like a dark cloud for years.
I can see the headlines now: “New iPhone launches amid massive new privacy concerns.”
Anytime someone praises Apple for privacy, anywhere on the internet, there will be a tidal wave of people bringing up this program in rebuttal. From people who would have previously defended Apple to the grave!
I cannot fathom how on earth anybody thought this was a good idea. It’s like taking decades and billions of dollars worth of hard won reputation for privacy and throwing it in the garbage at the worst possible moment.
The folks who choose iPhone in part because of a perception of relative privacy and respect for the user (like myself) will think twice.
My two concerns are that Android as an ecosystem is almost certainly still worse, and that the vast majority of users will not care.
I’m tempted to jailbreak my devices going forward, although I guess the folks at Apple would say that makes me a pedophile.
Edit: seeking recommendations for Android phones with strong performance and reasonable privacy protections. Ideally one that can be used without a Google account.
I'm very nostalgic for Blackberry again. I feel like on a platform where the whole point was to prevent anyone who wasn't the corporate IT department from being able to access it, this local code execution would have been a massive non starter.
I put LineageOS on an S5 a while back (with gApps, but there are non-Google alternatives) and it went pretty smoothly. Works with several more modern phones.
I would never ever touch anything made by Apple from now on. I was never a fan, but was considering getting an ipad for drawing etc. After this, Never.
I've talked around and nobody I know is even aware of this, when I brought it up, they didn't care about the issue.
It's not news in the commons, it's not on CNN, Fox or MSNBC right now as a headline.
It's one of those tertiary concerns that frustrates some groups, but most people are not aware and would be only marginally bothered, and a good 2/3 of people really don't care. I suspect most people would have a difficult time with the 'slippery slope' issue and would accept the 'it's for the children' framing at face value.
It is what it is, but it's worth understanding how regular people think about these issues.
That's true. However, Apple is part of the same tech bubble as HN users. There are Apple employees reading this thread right now. Most regular people don't care about privacy either and just follow soundbites in the mainstream media e.g. "WhatsApp is spying on you!!!". But as long as HN/Apple think that they do then all is good.
This is a potentially sensitive and conflicting topic. I don't think people can take a strong view on this in real life, for multiple reasons, one of them being that pedophilia is a disorder that is not super uncommon.
The fact that the WSJ did a story on this, with a lengthy interview with an Apple SVP, means this is already a mainstream story.
Just wait until the September event. Apple’s media attention quadruples during that time.
“Apple creates authoritarian government wet dream” will be the main narrative coming out of the iPhone event. Not the new phone. And investors will not be happy.
I doubt that. I mentioned it to a non-tech person today. They hadn’t seen the news, and their first instinct was to empathize with the Apple executives having to confront that their products are enabling pedophiles.
Hearing the number of CSAM reports Apple made (265) vs Facebook (10 million) changed my perspective. I don't like it, but they are going to have to start scanning iCloud photos sometime.
I don’t think anybody has a problem with iCloud scanning (don’t they do it already?). Apple owns their servers, so I don’t care if they scan them.
It’s the on-device scanning that is the massive overreach. It’s like being forced to allow a government employee to live in your home and watch your behavior 24/7 for anything they don’t like.
“Oh but John is only looking for specific bad behavior like drug use, you don’t have to worry. He would never report you for anything else, we promise! If you have nothing to hide, you shouldn’t be upset that John is living with you now!”
If they kill the program, headlines will read: “Apple refuses to protect sexually abused children” or “Apple sides with protecting child pornographers.”
I anecdotally see no backlash outside of HN and the tech/libertarian-focused parts of Reddit. To the casual iPhone buyer making up their billions in revenue, turning back on the program wouldn’t be good optics. Obviously my statement is purely conjecture, I’ll admit that.
I don't think they have to compromise on the CP protection.
They can just go along with what other cloud storage providers have been doing all along... that is, "we won't snoop into your phone, because that's yours. but if you upload photos to iCloud, onto OUR servers, then we reserve the right to scan for images in CP database.."
And I think most people will be perfectly fine with that idea.
Just like you sign away some rights and accept risk when you decide to store items in a physical storage facility, the same thing will happen when you use iCloud.
You’re not seeing the backlash yet because normies don’t pay attention to Apple outside of the September iPhone event (when coverage breaks into the mainstream for a few days).
Look at Google trends data for the search term “Apple.” It spikes every September by like 70% for a week.
I can guarantee you, every average Joe learning about the new iPhone from mainstream media is going to hear a sound bite about this fiasco as well.
I’ll be surprised if this program isn’t dead within 2 months.
> It’s like taking decades and billions of dollars worth of hard won reputation for privacy and throwing it in the garbage at the worst possible moment.
How gullible Apple users are to think that their closed-source ecosystem was ever about privacy, but I still don't think they'll learn. It is a cult and will continue being a cult.
So they're still retaining the ability to look on users devices? I have nothing against them looking at the abuse material, I have a problem with them having power over my device, which could be piggy-backed on by state actors (or others) for their own purposes later.
Nobody is against them trying to prevent child sexual abuse, pretty sure we all agree that fighting that is important, but doing so by creating what is essentially a back door of sorts into my devices isn't the right approach to doing so.
It sounds to me that this is still allowing them that access, so this changes absolutely nothing.
Yeah, all they need to do is follow suit with what everybody else is doing and scan once images are uploaded onto iCloud.
Scan every single file. I don't care. Because once in iCloud, files are sitting on Apple's server and hard drives. I don't have much expectation that those files are 100% private.
This is as unhelpful as saying "don't have CSAM on your phone." As has been reiterated many times, the problem is not the capability it is claimed to start off with, it is how the capability will be evolved, abused, and exploited as time goes on.
If they only intended to scan iCloud photos, then they would’ve done it on the server (since they have encryption keys) instead of installing spyware on everybody’s iPhone. If they had done that then there would’ve been little to no outrage since people wouldn’t need to worry as they could just turn off iCloud.
What do you think will happen when e.g. the Chinese government demand that they scan all photos for hashes provided by the government, upload them regardless of whether iCloud is enabled, and let a Chinese company handle the reviews? If they couldn’t stop the CCP from gaining full control of iCloud in China then they most definitely can’t prevent this either.. and other authoritarian governments will copy the CCP and make similar demands.
Makes zero sense. Installing spyware and compromising everybody’s iPhone does a million times more harm than whatever governments can achieve with complete access to people’s iCloud account. At least before we could protect our privacy by turning off all iCloud sync, but now your only option is to switch away from iOS.
They put out a new paper [0] describing the security threat model they were working with, and this paragraph on page 9 stood out to me:
> The perceptual CSAM hash database is included, in an encrypted form, as part of the signed operating system. It is never downloaded or updated separately over the Internet or through any other mechanism. This claim is subject to code inspection by security researchers like all other iOS device-side security claims.
Could someone tell me how that inspection works? Are there researchers who are given the source code?
Let's talk about a nanny state called Finland for a moment
> the Finnish National Bureau of Investigation (NBI), had compiled a secret blacklist of websites that it deemed to contain child pornography and sent it to Finnish Internet service providers.
> Analyzing the address list, Nikki also noted that the first three Google search results for "gay porn" are censored. Electronic Frontier Finland (EFFI) have noticed that the blacklist includes non-pornographic websites also, including a Windows advice forum, a computer repair service and the Internet Initiative Japan server nn.iij4u.or.jp that, among others, hosts websites for a violin factory, a doll store and a hearing aid manufacturer.
This does not address the issue of perceptual hash collisions and false positives at all, nor does it address the issue of on device scanning nor their claim that, according to Apple[1]:
> This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.
People don't want their property spying on them and reporting them to the police. They don't want people looking at their photos or thumbnails. It's patronizing, invasive and embarrassing to have your privacy violated like that.
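The collision worry upthread is concrete, and a toy example makes it tangible. Below is a minimal sketch of a crude "average hash" (far simpler than Apple's NeuralHash or any production perceptual hash, and invented here purely for illustration) showing how two visually unrelated inputs can produce identical hashes:

```python
# A toy 'perceptual hash': threshold each pixel against the image's mean,
# then compare hashes bitwise. Real systems are far more sophisticated,
# but share the same structural failure mode: distinct images can land
# within the match threshold.

def average_hash(pixels):
    """Hash a flat list of grayscale pixel values into a bit tuple."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two clearly different 'images' whose hashes nonetheless collide,
# because only each pixel's relationship to the mean survives hashing.
img_a = [10, 200, 10, 200]   # high-contrast stripes
img_b = [90, 110, 90, 110]   # nearly uniform gray, nothing like img_a

assert average_hash(img_a) == average_hash(img_b)          # a collision
assert hamming(average_hash(img_a), average_hash(img_b)) == 0
```

Production hashes are deliberately tolerant of "near" matches (crops, recompression, recoloring), so the false-positive question is about probability at scale, not possibility.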
> Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
> For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
Apple these days is the type of company to make a mistake, think they are never wrong, blame anyone but themselves and then change a little bit, only for it to go over like a dead balloon. It reminds me of the Apple Silicon dev kit. No refunds, ok refunds but you have to spend them in a certain time, ok fine you can use them for a longer period and for anything.
They are probably completely flabbergasted that people are upset about this so they make this change after a week, completely missing the point.
They've been like that since the early days of the company, it's core to their culture, which was built by Steve Jobs. Apple has always had an elitist snobbery at their center, such that if you don't get a thing then you're wrong. That's what Think Different is all about, that's a hyper arrogant proclamation that could only be spewed out of an elitist never-wrong orifice, it's borderline assholic in nature. Part of the cult of Apple has always been that attitude, the Apple fans eat it up, they love the idea that buying Apple products made them feel different (or it used to, before Apple became mainstream), made them feel separate from the masses. Steve Jobs points that out in the D5 interview in 2007 with Gates, how Apple needs the inferior taste of the masses to prime their superiority off of (which is another way of saying: they're wrong, we're right).
Go back to their response over the iPhone 4 reception problems ("you're holding it wrong"), same asshole culture then as now, and that culture has been in place for decades:
Eh, I would go even deeper: the original Macintosh keyboard didn't have arrow keys because, according to Jobs, you shouldn't need them. People complained, and Apple held out for a long time before giving up and adding the goddamn arrow keys. That's always been very emblematic of Apple for me.
(Hard to find a good citation link on a phone, but the story's out there.)
I mean, I don't consider myself an elitist, but as a software engineer who just moved from doing dev work on a Mac to a Windows laptop, I'm so frustrated by how bad the experience is compared to the Mac. I love the new job, but even these $2000 Dell laptops feel junky compared to a similarly priced MacBook Pro. I'm constantly having problems with my audio devices (a necessity during the pandemic), the built-in microphone is terrible, and the webcam on the laptop is on the BOTTOM of the screen, which just makes me look super fat because reasons, I guess. It's like how a fish doesn't realize it's in water until it flops up on the shore.
This is just business. Governments around the world are thinking of breaking up the app store, and Apple is saying they'll help governments surveil their citizens inside their walled garden. Would be a shame if users could escape the surveillance by choosing alternatives outside the walled garden, wouldn't it?
Expect an App Store rule soon - all photos apps must scan for CSAM.
In a year, expect Signal to be banned from all app stores, just like how it's already banned from some countries.
My impression was that it's currently US only because that's the only country so far where they've arranged a process to report their users to the cops. No reason they wouldn't expand it as they make arrangements with other countries.
They've really bought into a ton of complexity at all levels by doing this instead of just scanning stuff on their own servers, which would at least have been a clear dividing line. Privacy is really all about control, and on-device scanning dangerously blurs that line.
USA-only referred to which accounts would be scanned. Whereas it's theoretically a good thing to require the CSAM hashes to be verified by organisations in multiple countries (rather than a single US government-tied organisation that dismissed privacy concerns as "the screeching voices of the minority").
I'm hoping for something along the lines of "iOS will reject hashes that haven't been signed by independent organisations in US, Russia, China, and India" to make it very difficult to push through anything except actual CSAM. Won't be much of a guarantee if it's just Apple saying "we promise we're only using hashes that have been checked by Australia and the US".
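The "hashes signed by independent organisations in multiple countries" idea amounts to only accepting hashes that appear in the intersection of several jurisdictions' databases. A minimal sketch of that quorum rule (names, data shapes, and the `accepted_hashes` function are all invented for illustration, not Apple's actual design):

```python
# Hypothetical quorum rule: a device only accepts a hash if it appears in
# the databases of at least `quorum` organisations from different
# countries, making it harder for any one government to smuggle in
# non-CSAM hashes unilaterally.

def accepted_hashes(databases, quorum=2):
    """databases: mapping of country name -> set of hash strings."""
    counts = {}
    for country, hashes in databases.items():
        for h in hashes:
            counts[h] = counts.get(h, 0) + 1
    return {h for h, c in counts.items() if c >= quorum}

dbs = {
    "US":    {"aaa", "bbb", "ccc"},
    "UK":    {"bbb", "ccc", "ddd"},
    "India": {"ccc", "eee"},
}

print(sorted(accepted_hashes(dbs, quorum=2)))  # ['bbb', 'ccc']
```

Note the limitation the parent comment implies: if the participating organisations collude, or all defer to the same upstream source, the intersection guarantees nothing.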
This entire project has been a public relations catastrophe not only for Apple, but also for the NCMEC.
Almost no one held the National Center for Missing & Exploited Children in low regard prior to last week. Why would we? It's one of the best, most noble causes that have ever existed.
Then, Marita Rodriguez, an executive director at NCMEC, reassured employees in an internal memo that was unfortunately leaked:[1]
"I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Our voices will be louder."
The "screeching voices of the minority" phrase got picked up and (perhaps unfairly) has become a kind of rallying cry, exposing the arrogance of the NCMEC.
I say "unfairly" because I do believe that their hearts were in the right place with this initiative, although they are desperately naive when it comes to the principles of privacy. "Let's save the children - how can we stop this toxicity of child porn - let's set something up with the cloud providers!"
But the damage has been done. Now the only tenable outcome is for Apple to simply cancel the project.
I, for one, was looking forward to upgrading from an Android to an iPhone 12 Pro Max later this year. That plan is now on hold! I doubt it will affect their sales too badly, at least at first. Maybe it will after the Chinese government starts using the back door to round up dissidents, or some Middle Eastern governments use it to crack down on homosexuality, or... the mind boggles.
I'm not sure this addresses the main point. Apple promised privacy as long as you're not violating laws. This most recent move changed that: now you're a suspect by default, your photos will be scanned even on your local device, and you won't know the criteria used to flag you, since they're hidden from you, and you can't defend yourself against hidden criteria.
So what's gotten better by this? In the kindest interpretation, you'll be safe from Fascististan's hashes. But what if Fascististan has been bribed by China or the US? And so on.
A line which should never have been crossed was crossed.
Yeah. "Dear debtor countries, the attached file contains hashes of new files deemed illegal for the harmonious well-being of peoples this month; please enter them into your databases."
A FISA subpoena or an NSL can only compel Apple to give certain types of information that Apple already has. They can’t compel Apple to gather information they don’t already have.
A wiretap order is a different kind of order and goes through the district court, not FISA. The legal standard for it is also much higher than for other subpoenas by statute.
Yep. FBI Lawyer, Kevin Clinesmith gets to lie to the FISA court, fabricate emails to get 2-hop warrants and then gets zero days in prison and as of this month is allowed to practice law again. It's an absolute joke.
These stories have been top of HN for more than a week, indicating fascination. Yet non-technical mainstream articles make up the bulk of it, and Apple's just-released, detailed (and convincing) threat model PDF is getting buried. An unfortunate recipe that'll solidify misunderstandings and keep the conversation much more emotional than it should be.
They can publish the dev design doc, the test plan, even the fricking source code, that doesn't change the basic fact which is local device scanning.
There is no misunderstanding of that fact by anyone with sufficient technical know-how. There is also no misunderstanding of what comes next once this Pandora box is opened.
Since hearing about this whole affair, one potential scenario popped up in my mind...
Currently, bad actors are randomly sharing illegal content in an unsolicited way to people who did not ask for it (1).
Let's imagine this happens to you. Suddenly, you have illegal content on your device you didn't ask for.
Maybe you are busy when the message with the content is received and you don't even know about it, or you think it's random spam from someone you don't know and just ignore it.
Then, the content scanner picks it up and notifies the authorities.
Well, to be factually accurate, in this case with Apple, nothing will happen. Apple only reports when it matches 30 instances of this content. They have said they plan to reduce it from 30 as they improve their algorithm.
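The threshold behaviour described above is simple, but worth making explicit: nothing is reported until the number of matches crosses a limit (30, per Apple's statements, and reportedly subject to future reduction). A sketch, with the function name and shape invented for illustration:

```python
# Illustrative only: a report is triggered only once the running match
# count reaches the threshold. One unsolicited image therefore does not
# trigger a report under the stated policy -- but lowering the threshold
# later shrinks that margin.

REPORT_THRESHOLD = 30

def should_report(match_count, threshold=REPORT_THRESHOLD):
    return match_count >= threshold

assert not should_report(1)    # a single unsolicited image: no report
assert not should_report(29)   # just under the line: still no report
assert should_report(30)       # threshold reached: account is flagged
```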
2. Disposable devices with soldered-in batteries and no upgradeability.
3. Removing headphone sockets so they can nickel-and-dime with irritating dongles and ridiculously expensive wireless headphones.
4. Disallowing personal backups using your own key to incentivize the use of expensive icloud storage.
5. App-store corruption.
6. Dodging tax in my country.
7. A million examples of shitty hardware design and then somehow blaming the customer, e.g. "you're holding it wrong", dust-attractant garbage keyboards, overheating GPUs, etc.
What a shitty, penny-pinching piece of shit company. The most expensive devices, and they still want to squeeze their customers more.
Literally everything they do trades privacy/usability for money, and the cherry on the shit sundae is always insulting their customers' intelligence by somehow claiming it's for their own good.
And now we get probably the very worst thing they've done:
8. Legitimizing on-device scanning and adversarial devices.
I dearly wish they would go out of business because they're ruining technology, something I've loved my whole life.
I never even bought one of their bullshit locked-down garbage disposable devices and they're still ruining things for me with their influence on the market.
Fuck you, Apple. Fuck you in every way. You're the worst hardware manufacturer and the worst software manufacturer.
Because this is likely not about making money; it's about expanding surveillance. Once this becomes accepted, they can ratchet it up bit by bit. Each time, the outrage will be less and less.
For example, imagine creating a new tax of 1% on purchases. People would be outraged. Now imagine raising the sales tax by 1% instead (e.g., 7% to 8%). Sure, some people would be upset, but fewer than with the "new" tax, because people are already used to paying a sales tax. What is "normal" is more easily accepted.
If Apple can get past the initial outrage (like when sales tax was implemented in many countries in history) they can increase the surveillance once scanning local files becomes the "new normal".
I love how people will jump through hoops to land on China even when discussing something that was conceptualized and first (only, as of now) implemented in the US.
You can't have a massive new fake war - the War on Domestic Terrorism - without ideally having new ways to track and surveil anyone (everyone) that is going against the machine politically. They need new capabilities and they're absolutely going to put them into place, asap. Did you see their reaction to the populist Trump circus? The globalists viewed it as a mortal threat. Everything they built since WW2 was in jeopardy as far as they were concerned. Their ability to launch more forever wars, endless foreign adventurism, obsessive (psychotic) superpower global meddling, all of it under threat from rising populism (represented by both Trump and Sanders) and a growing rejection of the entirely failed decades-long approach by the globalists that has led to the US being buried under nearly $30 trillion in debt and numerous catastrophic wars. Just look at what these monsters are saying publicly on MSNBC, CNN, et al. They're telling everybody what they're planning to do, they're putting it right out there. Look at their framing: we're going to target the domestic population; they're de facto saying that. They couldn't be more open about the authoritarian nightmare they have in mind. They perceive their system as being under risk from populist revolt against them and their agenda, and they're launching a counteroffensive (the empire is striking back), it's under way right now. Bet on it.
It's quite simple: the culture of bureaucrats that have dominated Washington DC and its policies in the post WW2 era, they acquired a position of power the likes of which has rarely been seen in all of human history. Now, ask yourself: do you really think they're ready to give that power up? They're ready to stop treating the rest of the planet like it's their toy, to do with as they please. Then ask yourself: what do you think they're prepared to do to stave off risks to that power? The risk that they might lose their precious. And there you go, you have your answer. They're willing to do many of the same things other authoritarians have been willing to do. The US isn't Soviet Russia, that's not the suggestion; it doesn't need to be to start putting political opponents (fringe rabble rousers; whether libertarian or socialist - see: Clinton vs Sanders; that's globalist establishment vs domestic socialist) into prison and surveiling everyone in a desperate attempt to retain their power. That kind of power is an intense drug, they're addicts of the worst kind, you can tell by how they behave in regards to the rest of the world, how they treat the rest of the world.
To ensure they keep their power, to keep their globalist forever war machine (and everything that goes with it), they need to put the boot on rising domestic risks to that power.
They put themselves into a position where backing out is also extremely costly.
a) it means they have to publicly admit that what they did was a bad idea, which will fuel another news cycle
b) backing out of it will get them criticized for "protecting pedophiles", maybe even the NCMEC (who privately praised them in the "screeching voices of the minority" memo) will now publicly criticize and shame them to get what it wants
c) now that they've put this idea on the table, there will be even more government pressure to mandate/implement it. Even if they back out of the implementation, just by bringing this proposal up, Apple may have just destroyed not just the privacy of their users, but of everyone.
The thing that strikes me as most suspicious is their insistence on keeping the client-side scanning feature while wrapping it in all sorts of supposed restrictions so it's essentially server-side scanning, just implemented on the client device for some reason.
If the problem is "We need to scan stuff on iCloud for CSAM" then "Let's build a client-side scanning agent and distribute it to hundreds of millions of devices and figure out a way to protect the hashes from the end user and then figure out a way to lock it down so it can't be used to scan other things even if we wanted to or somebody ordered us to" is a singularly (and suspiciously) inelegant solution to the problem.
Just scan the files on your own servers, like everybody else does. Anything else is rightly going to make people suspicious that you've got some ulterior motive.
Notice how they haven't done that though. It would be trivial to do. Just change all of the private keys to ones supplied by the clients. But they haven't done that. Why would you expect them to do that when they are taking steps away from individual privacy? Leaving a hypothetical option on the table that costs them nothing to do means they actively do not want to do it. So why would they do it?
It’s quite simple, they would like to store photos without being able to see them, only knowing they’re not CP. If they scan on server, they need to be able to see all photos, and eventually give them to any requesting authority.
So the reasonable approach is to make clients that only upload photos that are not CP. The problem is that the implementation of that feels creepier than they anticipated.
Potentially something like a national security letter and they can’t reveal the real reason behind it. I agree this feature is off-brand for Apple and came out of the blue.
They're convinced that it's worth the benefit to abused children to let the tech community vent fears about the world at them for a while. I don't really blame them. My views don't align with the mob here, so it would be a waste of my time to discuss them while y'all are upset and angry about this; best just to hold my position, remain silent or reserved, and wait for the pitchfork mob to find another target or exhaust itself. I'm honestly surprised Apple even commented at all.
> I'm honestly surprised Apple even commented at all.
I think they commented because they expected a brief storm of outrage that quickly dies down, but instead it has become a growing wave that is spilling into mainstream media, with a mostly negative reaction, and threatening to completely destroy their "Privacy. That's iPhone" mantra that has been the core of their marketing campaign this year.
I think the most reasonable solution to this problem is to not purchase Apple products for a few years given how they have responded to criticism and promoted the characterization of privacy advocates as "screeching voices". I was thinking of upgrading my iPhone given it's getting pretty old now but I can't really see myself doing that now.
It should be obvious by now that Apple has been coerced into building this scanning system, and the CSAM scanning is just the front for a much more broad "national security" backdoor. Apple's PR team has surely been shitting bricks for the entire week--this is the worst PR debacle they've had since the iPhone 4 Antennagate in 2010.
Completely agree. A friend of mine asked what was happening, and I just said "it's a big deal, biggest thing in a long time". Antennagate is a good comparison -- though this seems bigger. Impacts of Antennagate long term seemed minimal.
This is way bigger than any RF engineering mishap.
I’m going to actively evangelize alternatives to Apple devices and that’s coming from someone who has been doing Apple evangelizing since OSX Panther days. Tens of thousands of dollars of bought devices and services not including all the people I convinced to make the switch over the years.
Sadly this will likely be little to no impact on Apple long term either. I too agree this is coming from the government. They have found their loophole of using "private industry" to do all the dirty stuff they cannot directly do.
Given that privacy has been the key selling point in Apple's ad campaign, and now they've completely subverted that and mainstream media is picking it up and people are noticing, I suspect this will impact them far more than they expected.
While Apple deserves all the fallout they're getting and more, I'm disappointed that the NCMEC that pushed for this isn't also receiving more scrutiny and criticism. Multiple people have now pointed out that their database contains false positives, which is absolutely terrifying. Completely legal, harmless, no-nudity-no-humans pictures can get your life ruined. The truth coming out later doesn't matter when your home gets raided and it slips out that you were caught sharing multiple images that matched hashes from the NCMEC child porn database.
To everyone looking for alternatives: there are quite a few strong ones, on a spectrum of complexity/privacy/convenience.
Android probably is not much different. But, and this is a big one, Android devices are not tied to the host Google OS and are way, way more open than Apple's. This is because of Apple's tight grip on their hardware ecosystem and lobbying against right to repair. Also, the abundance of Android phone makers means you get features at a variety of price points.
The OS options are LineageOS (my daily driver), CalyxOS (with more Google support), and LineageOS for microG.
More distinct OSes are Ubuntu Touch (very active), postmarketOS with multiple frontends just like on the Linux desktop, Manjaro Mobile, etc.
And hardware-wise, there are the PinePhone and Purism. The more people pay for these, the more money there is for competition.
Apple is already sitting on a staggering $200 billion. $200 billion.
That's just smoke and mirrors, hiding the real problem, which is allowing the monitoring at all: not where, when, by whom, or for what reason.
And I would also like to see some real numbers about child abuse cases compared to drug/domestic/gang/organized/etc. crimes. In other words, how much they impact society.
Wanna bet that, worldwide, there are a lot more ordinary citizens abused by the police every day than old men wanking to a kid picture? When will the scanning of cops' iPhones come, to catch the "bad apples"?
I'm very serious. Child abuse is a cancer and must be stopped, but if they're employing all that technology for this and not for that, well, dear Apple, I want to see some numbers that would justify the choice.
And don't reply "even saving only one child is worth ... yadda yadda yadda" You know what I mean.
I have a 4" iPhone SE. Love it, but Apple has continued to push me away these last four years. I no longer use any other Apple products except for this 4-year-old phone. What alternatives are there for small phone fans? Sony used to make an Xperia Compact that was nice, but I think they discontinued it.
> The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.[1]
They know they can essentially wait out the public outcry, plus they can even buffer things a little bit by calling their critics "confused" and introduce measures that don't address anything but sound reasonable to lay persons in an effort to distort the conversation.
The platform is not in danger. But if it were, then yes, I think they'd still risk it. It's either that or getting shut out of markets gradually by law enforcement and governments.
Desktop Linux in no way compares to iOS market share, and even if it did, it'd require a different type of attack. Apple is vulnerable precisely because it is the one single vendor in complete control of anything related to the iPhone.
Personally I do think it's likely we'll see a successful anti-Linux smear campaign and/or laws in the next few years, but I have to admit that's a pretty pessimistic take.
I wonder if anyone would've really cared (or at least cared this much) if they just nixed the possible plans to do end to end on the entire iCloud and just did the scanning server-side. I probably wouldn't have started looking at other options, personally.
It's somewhat funny to me that switching to a Mac, and eventually to multiple Macs, is what had me thinking that I could probably be fine just using Linux for my day-to-day computing that doesn't involve gaming (I've a Windows desktop for that), and now this has me thinking "yeah, I can pretty much make the switch and realistically not feel like I'm missing out on too much anyway."
Of course it matters: if Apple does a 180° reversal I'll applaud and buy more Apple products. I want companies to be able to acknowledge mistakes unequivocally and fix them.
If Apple says "we heard the backlash, we're sorry, we'll never do it again", I'll be more of a supporter than I had been before.
I wouldn't say "this doesn't matter", I'd say "this makes it worse" because they're reaffirming that they want to push forward with this despite the criticism.
My problem with all of this is not some abstract slippery slope of "what happens if China starts insisting that Tank Man be added to the hash list" or whatever.
It's the practical scenarios where people in the United States have previously been indicted and their lives ruined over "child pornography" or "statutory rape" that stretches the definition to the breaking point. Young men just barely 18 years old have gone to jail because they have taken pictures of their technically under-aged girlfriends. Often their real crime is that they've simply upset someone rich and powerful, or that they were guilty of being black and dating a blonde white girl. I wish I was being facetious, but this happens all the time.
More importantly, in this digital age, many stupid teenagers upload nude pictures of their under-aged girlfriends to the Internet. Sometimes as revenge for the girl breaking up with them, sometimes because they "hacked" a girl's account and downloaded their selfies, or they legitimately have the photos and just wanted those sweet internet karma points.
Imageboards like 4chan are full of what is technically CP, even though nobody was directly harmed in its creation. Other stupid teenagers download these photos, as do older paedophiles that trawl these websites looking for CP. Some of them will share it with other paedophiles, internationally even.
So what happens if one of these paedophiles gets arrested for a serious child sex crime? Their PCs and their phones will be searched, and the pictures will end up on the CP hash database, of course. Even with this "multiple nations" requirement, that just requires two arrests for the hash to be added. Nothing really changed, there's just a delay.
Now what will happen to the stupid teenagers that downloaded the same photo? Teenagers that aren't child molesters! Teenagers that aren't paedophiles!
Apple will probably say: No worries, we filter out the phones belonging to under-aged children from the results.
... until they turn 18.
Get it? This is a legal landmine in the pocket of every stupid kid. YOUR stupid kid, who looks at porn on the internet, just like every other stupid kid.
This is just one scenario where this could lead to a false positive that totally and irrevocably destroys someone's life. I can think of several more, and dozens once you start adding the political pressure from various other nations.
There is just no way to make this kind of dragnet surveillance safe. Even a tiny false positive rate is unacceptable when there are hundreds of millions of people playing this dystopian lottery.
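The scale problem in that last sentence is easy to illustrate with back-of-the-envelope arithmetic. All numbers below are made up for illustration; nothing here reflects Apple's actual user counts, library sizes, or match rates:

```python
# Hypothetical illustration: even a minuscule per-photo false positive
# rate produces a large absolute number of wrongly flagged photos when
# multiplied across a population-scale scanning system.

def expected_false_positives(users: int, photos_per_user: int,
                             false_positive_rate: float) -> float:
    """Expected number of photos wrongly flagged across all users."""
    return users * photos_per_user * false_positive_rate

# Assume (hypothetically) 1 billion devices, 1,000 scanned photos each,
# and a one-in-a-million per-photo false positive rate.
flagged = expected_false_positives(1_000_000_000, 1_000, 1e-6)
print(f"{flagged:,.0f} innocent photos flagged")  # 1,000,000
```

Under those assumed numbers, a match rate most people would call "tiny" still flags a million innocent photos, which is the lottery the comment describes.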
NCMEC and similar groups call all child pornography CSAM.[1] Including 17 year olds sexting each other. And several people said NCMEC's database includes legal images.[2][3] What made you fairly sure NCMEC's process was more rigorous?
We could probably increase the vaccination rate if we snooped on people's phones to find out if they are participating in vaccine-hesitant behavior, and then rate-limited traffic to websites and apps that peddle this dangerous misinformation.
How many people have to lose their lives in the service of some pedantic idea of "privacy"? It's a computer looking at it, it's not even a human person.
I think tech companies need a Hippocratic oath similar to "first, do no harm". Apple should not be engaging in misinformation trafficking, and should at the very least work to minimize harm by preventing people from falling victim to dangerous, unsubstantiated, and un-fact-checked information. This is especially important when our elected officials use the considerable power gifted to them by the people to endanger people's lives by spreading dangerous misinformation.
What role did Apple's inaction on this play in the pandemic? In the January 6th insurrectionists' attempt to overturn a validated, secure, and duly certified democratic election? What role did Apple's inaction play in the attempted kidnapping of the governor of Michigan?
By refusing to help, they are partially responsible for these things. It's time for us to demand that they do their fair share of helping. Inaction is itself an action.