An open letter against Apple's new privacy-invasive client-side content scanning (github.com/nadimkobeissi)
1082 points by yewenjie on Aug 7, 2021 | 430 comments



Honestly I'm glad to see a non-insignificant amount of people in tech take this seriously, especially when the goal Apple announces appears to be for the greater good. It can be hard to stand on the side that doesn't immediately appear to be correct. We have already lost so many freedoms for 'national security' and other such blanket terminology.

Just be warned, there will be those that unfairly try to cast this as helping the distribution of CP. Expect a headline tomorrow: "Edward Snowden et al Supports Child Porn" - or some other hot take.

A few other things:

1. Vote with your feet - Put your money in the pockets of the people aligned to your values.

2. Actively support projects you align with - If you use some open source hardware/software, try to help out (financially or time).

3. Make others aware - Reach out to those around you and inform them; only then can they make an informed choice themselves.


> a non-insignificant amount of people in tech take this seriously

We're all here to make ourselves feel good saying we Took A Stand.

In reality, four weeks from now, do you think anybody will still be talking about it?

I made this same mistake. I was pretty convinced that people were taking Copilot seriously, and that there were possibly going to be ramifications for Microsoft. I wasn't particularly looking forward to it, but I rated it as plausible.

One stonewall and a few weeks later, here we are. Hardly a word anymore about it.

I think if there's going to be any real, long-term change, we have to find a way to combat the real threat: life goes on.

(Made me smile to phrase it like that, actually. That wasn't the combat I had in mind...)

It's true, though. Ask anyone in a month whether they take this seriously. They'll say yes. And what's it worth?

It's all we can do, of course. But that's just another way of saying we're powerless.

Like it or not, we aren't in control. And unless we can figure a way to get some measure of control, then what are we doing here, really?

Your three bullet points are wonderful, and I agree entirely with them. And it won't change a damn thing.

I'm not sure what to do about it.


> Like it or not, we aren't in control.

The typical Apple customer is forking over thousands of dollars. They have substantial control. This isn't like a government where you need a plurality of voters to agree before anything starts to happen.

Apple is running software on their phone that will either (a) do nothing or (b) call the police on them. The situation is questionable even for a normal consumer, let alone a nervous one. Maybe most people will weigh up the situation and keep buying Apple. But they are making a choice, they aren't powerless. And some will start asking how important smartphones really are.


My question is, who has funded this feature all the way from PoC to product? How did Apple calculate the ROI on this feature?

It’s not like end users are tripping over themselves to have all their photos scanned. Does Apple just have gobs of money to burn on things like this that will not increase its bottom line by even one penny?

This is probably funded from somewhere, and I’d like to know who is paying for it.


Probably, as other commenters have said, a mix of behind-the-scenes political pressure to give the feds access to phone data for people they’re interested in, and an effort to get governments to back down on antitrust and the other areas where they’re realizing they can milk tech companies for more money.


> And some will start asking how important smartphones really are.

I'm having to buy a smartphone to be able to wash my clothes.


>I'm not sure what to do about it

For starters, please give up the defeatist attitude. They deserve the frontlash, even if it serves no purpose, as you describe it.

Apple have placed 'Privacy' at the core of their messaging in order to sell their high-end products, on top of the already excessive pearl-clutching and moral panic within the ecosystem, e.g. sanitising and censoring language and apps. This is scope creep via thought-terminating clichés like 'think of the children' and assorted boogeymen, and it will not be enough of an explanation when Apple IDs start getting locked out for capturing entirely innocent moments of their children's lives, or intimate holiday snaps, etc. Worse, someone can weaponise it by sending you content that will besmirch your good name, or send you away for a stint in prison.

Nonetheless, they are only a couple of moves away from either a lawsuit/collective action or a pushback from consumers, especially once the algorithm (wrongly) labels someone a paedo, porn baron or worse, placing them in a Kafkaesque nightmare of having to explain themselves.

https://www.apple.com/privacy/docs/A_Day_in_the_Life_of_Your...


They’ll scan content if you have it set to go to iCloud, so as to avoid being an accomplice.

Turn off sync to iCloud, make local backups only.

Who is to say politicians who started threatening tech companies publicly haven’t made threats behind closed doors about Apple maybe being on the hook as an accomplice for distribution?

My company only exists because our CEO had input on an executive order years ago. We hardly “made it” in the free market.

The headlines never tell the full story. Media colludes with politics to generate “the right” sentiment. The spec and implementation details aren’t being discussed, just classic speculation on privacy and overreach.

So much for this site having a higher level of discourse and its efforts to dissuade repetitive and knee-jerk commentary, though.

We’d have nicer things if we discussed how things work instead of what corporate media wants us to discuss.


What we should really be hammering on I think is the vendor lock-in, walled garden, and monopolistic aspects.

To me this CP scanning nonsense is Apple waving a carrot in front of the Establishment to get the Anti-Trust heat off, while selling everyone else on further degrading privacy through a slam dunk deliverable to satisfy the short-game players, but that's just me.


If the public isn’t going to push back against establishment politics looking like they did 100 years ago, good luck.

Like waves and particles: it starts with individuals changing their behavior.

But we’d all rather sit around debating ephemeral abstraction (a lot like religion), reconfirming that math and the physical universe still work as discovered years ago, because we’re all still “rich enough” that the bottom falling out hasn’t reached us yet; political problems are for the poor.


>The headlines never tell the full story.

The links below encapsulate some of the reasons why Apple cannot be continually trusted with customer data, or viewed as a paragon of privacy/security. There have been numerous examples, past and present, e.g. The Fappening, T2 chip flaws, atrocious leaks of customer data, Pegasus, etc.

Some people with technical prowess, although not deluded, buy these devices with limited desire to constantly tweak or circumvent security features, especially when these devices claim to have them baked in and promise not to intrude.

>So much for this site having a higher level of discourse and it’s efforts to dissuade repetitive and knee jerk commentary though.

I am invested in this ecosystem, so the topic has a certain resonance. I resent your implication that my response to the original author was knee-jerk commentary. Furthermore, I responded to their nonchalance with a view to furnish a counterpoint.

https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_phot...

https://www.wired.com/story/apple-t2-chip-unfixable-flaw-jai...

https://www.seattletimes.com/business/technology/apple-pays-...

https://www.theguardian.com/technology/2021/jul/21/why-apple...


Can’t speak for everybody of course, but I am currently planning the migration away from the Apple ecosystem and I’m heavily invested (watch, phone, MacBook, Mac Pro, Apple TV).

I’ve also been recommending to friends and family to move to droids with microG.

The largest concern I have for them is that they’re stuck between the devil and the deep blue sea.

Most people aren’t going to run a droid without google, or Linux on the desktop.


> I am currently planning the migration away from the Apple ecosystem

Me too. I told my wife today that I'll be looking at a feature phone as I'm not sure I can be bothered with jumping through all the hoops required to de-Google an Android phone. I remember a time before mobile phones, I was just fine without one - smartphones aren't that good, just convenient.


It's good that you're principled enough to do that, but you guys will be rounding errors in Apple's revenue. Most people don't even know about this, and of those that do, most won't care. Even of those who do know and do care, most aren't motivated enough to change their habits beyond writing a few angry words on a forum Apple is never going to read anyway. Those that do change their buying habits, like yourselves, are by far the minority. And while you might have some influence over what tech your friends, family and significant other buy, let's be completely honest: it's not going to be enough to sway them into using an unbranded phone over a feature they themselves likely don't care much about, even if they happily assume the role of outrage around you, because we are generally social animals.

This is how companies consistently get away with pissing off their customers. It's the same reason politicians get away with pissing off their electorate. Even in a crowded space, people seldom get outraged enough to change their buying habits. But in an industry where the options are already limited (Android or iOS) and where people only buy products, at most, once every two years...? Sorry, but literally nobody outside your social circle will notice if you stop buying Apple products.

I get this is a depressing response, and I'm not suggesting that you shouldn't follow through with your threat. But I hear this threat posted unironically a lot, and it seems to me that people think they have the power to effect change simply by moaning on a few message boards or buying a competitor's products for a couple of years (then that competitor does something to earn our outrage and we're back to the first company again). But real change takes effort. A lot of effort. More effort than most are willing to commit. Hence why the GP rightfully commented that in a month's time people will have moved on to their next outrage.


The best you can do is raise tech awareness, take Apple interviews and cite this as the reason for not moving forward. Facebook faced constraints not just after an avalanche of bad press but also as tech workers (employees and applicants) pushed back and worked to improve things. The biggest asset in the information economy is people; the employees can enact change. See the many examples of how Google's direction has been influenced on military contracts and the like.


Facebook hasn't changed -- if anything, they're bigger now and sucking up more personal data than ever. Your Google example is an interesting one though it's worth noting it was the employees who objected. However it's not inconceivable that a public outcry could cause Apple employees to rethink and then challenge their management's decision.


I completely agree, I have managed to influence parts of my family and a few friends to shift to better products (like Signal) but I simply don't like having all of my communications scanned by anyone, whether it be by disinterested (in me particularly) humans or computers. The often unspoken (ha) part of free speech is the bit about it being my choice to whom and when I share my thoughts. That means a lot to me. Apple, Google, Twitter and Facebook will go on without me but they will only find out about me second hand, and that's fine by me.


What you are saying isn't that depressing; I think it makes a lot of sense. But perceived abuses do take time before people eventually take action that can actually affect companies'/states' bottom lines.

I can imagine at some point a 0day targeting iOS devices such that, when they connect to a public wifi network with a compromised router, it distributes CP malware/a botnet onto their device, and then onto every other device that connects to that one in the future (muddying the waters).

Similar actions could also be taken against other companies' and state entities' systems.

And these actions don't need to rely on most of the population to take them.

The more these "kill chain" like systems become automated and cheap to deploy, the more incentive individuals will have to pursue these types of attacks.


This is a wonderful dose of realism, mixed with slightly too much cynicism.


I think the problem with your response is that you aren't proposing anything better. It just sounds like you're saying "give in, it doesn't make a difference anyway".

That capitulating, "crabs in a barrel" attitude is why things are the way they are. The majority just sits around waiting for some miracle to happen: for a leader to show up, galvanize the masses to change their ways and save the day. At the same time you don't believe that will ever happen, so you stay complacent, inactive and immobile (physically and mentally).

Stop preaching futility and do something positive. Try and be the positive change you want to see. Just because you've given up doesn't mean you have to try and convince somebody else to give up. You're part of the problem.


> I think the problem with your response is that you aren't proposing anything better.

That's because I honestly can't think of anything better. However that doesn't make my response invalid.

> The majority just sits around waiting for something to happen and for some miracle to happen; for a leader to show up, galvanize the masses to change their ways and save the day. At the same time you don't believe that will ever happen, so you stay complacent, non-active and immobile (physically and mentally).

I think the issue is more that most people are too busy with their own lives to give a crap about a theoretical problem that doesn't visibly and directly affect them. We've had leaders before and it didn't change people's attitudes. If Snowden couldn't influence people's behaviours, then what chance do a few random posts on a niche message board have?

> Stop preaching futility and do something positive. Try and be the positive change you want to see. Just because you've given up doesn't mean you have to try and convince somebody else to give up. You're part of the problem.

That's a bit harsh. I'm just as free to post my views as you are to post yours. And if you can't be arsed to get off your seat and campaign for real change, then you're in no position to delegate that responsibility onto me.

Also I think you misunderstand the point of my post. I'm not trying to talk people out of action. In fact I've explicitly stated otherwise when I said "It's good that you're principled enough" and "I'm not suggesting that you shouldn't follow through with your threat". I'm just expressing my expectations about the futility of it all. A futile action can still give ourselves comfort that we've done the right thing even if we know it will make no wider change: like how I always vote in each election despite my constituency being so overwhelmingly in favour of the opposition that my vote is literally meaningless.

If switching manufacturer brings you comfort that you've done the right thing, then that's reason enough to change. But the pragmatic truth of the matter is that it'll take more than that to influence a company as large and successful as Apple.


I think your proclamation of it all being futile is a bit premature and can be disheartening for those that do want to act. It may even play a part in negative change, leading to some people dropping their original intention because you successfully convinced them that nothing can possibly change.

Yet, I'm not convinced that you can reasonably know this. So if we can all agree that what's happening here with Apple is a bad thing, perhaps it's for the best to refrain from posting pessimistic takes?

All actions have consequences, even posting to a message board. I think it is wise to formulate an intended consequence in mind before you act.


The irony here is while arguing that we should fight for our freedoms you are suggesting I shouldn't exercise one of my freedoms.

I do get your point, honestly I do, but you can't have it both ways.


I do want to emphasize that this wasn't a "you can't do that" type of thing, though. You absolutely have the freedom to do it.

My comment was meant more in a "Do you really want to, given that this might have an effect that we both deem negative?" kind of way.


Couldn't say it any better.


Same. I’m strongly considering donating to CalyxOS as they are a nonprofit.

The biggest reason I dumped Android years ago (the last Android phone I had was a Galaxy Nexus or Nexus One) was the lengthy process to reflash a phone with something like AOSP or (at the time) CyanogenMod. Each model was different. Ultimately I decided I didn't have time for that, and battery life was atrocious. Like, I had batteries stashed all over the place.

I figure something like CalyxOS seems to have good feedback and a good reputation here. And a donation helps them and allows me to bypass hours of tinkering on XDA forums.

Ultimately I think I’m locked into iOS for now, as my phone is dual SIM and the eSIM is a company-provided line. So I will either have to revert to carrying two devices again or just maintain the Calyx/Pixel as my bailout device in case my job suddenly vanishes (like the current device I have in safekeeping).


Pixel phone running GrapheneOS, you'd have to jump through hoops to get google back on the thing.


I’ll give it a look, thanks.


I made the switch a couple of months ago and it's harder than you think. There is a lot of convenience you take for granted in the smartphone age.


Do you remember a time when you had to make a plan with a friend to meet at a specific place at a specific time and if one of you wasn't there or had a last minute problem there was probably no way to contact the other person? That was annoying, true.

I also refused to get a mobile phone for years, much to the chagrin of my girlfriend(s). It meant I used to come home to an answering machine full of messages and feel much more loved than I ever have with the immediate drip feed of messages I get now.

I wonder, what will I really miss that a book and a dumbphone won't be able to replace in my life? Perhaps the maps.


What did you switch to?


A Nokia dumbphone


With at least a billion baseband exploits. Smartphones are bad, but switching back to feature phones is arguably worse in these regards.


The thing is, people who are young enough to have never lived in a time without mobile phones will have no recourse.


Quite possibly true. You reminded me of this, which did give me a chuckle https://www.youtube.com/watch?v=updE5LVe6tg

Watching it again I wondered, would they even know what a dial tone is? (or was:)


Are you in North America? Which feature phone would you suggest? The Nokia 3310 was likely the only phone I was considering, but it’s ancient and 3G and doesn’t seem very future proof.


The Nokia 3310 does not support 3G. It’s a 2G (GSM or TDMA) device.


The 2017 edition exists in 2G, 3G and even 4G versions.


Oh, I’d forgotten that they made a new phone with the same model number in 2017.

Even Nokia’s website seems confused about it because it claims the 2017 3310 is a 2G (“GSM: 900/1800 MHz“) device. That can’t be true of even a low-end 2017 device. It wouldn’t even work on many modern networks without at least having 3G!


Nokia 8110 4G


I suspect it might be more effective to go to an Apple store and attempt to return your iPhone and iPad (though probably they are outside of return windows).

I’m seriously considering doing that. Make it painful for the local staff by demanding a refund. Don’t take no for an answer for quite some time. Explain that they just broke the product with a remote update that you had no choice in, so that’s why you’re only returning it now. When they ask how it broke, explain that they started scanning all your content, thereby claiming it as their own. If they want to own the phone, then they need to buy it back. Etc.


> Explain that they just broke the product with a remote update that you had no choice in...

Except when you turned on the phone, you agreed to automatic updates. Contract law isn't on your side.


No, I didn't agree to automatic updates by turning it on. In fact, I don't have automatic updates turned on.

What happened is that I updated based on what Apple claimed the update did, but they were lying. Contract law would seem to be on my side in such a case.


You can run an Android device without ever signing into anything Google. F-droid and Aurora store, APK direct downloads. It is possible. I tried GrapheneOS, but am concerned about long-term development, hardware compromises. Their Element/Matrix threads can get very toxic.


My girlfriend manages with Pop!_OS. At this point it's more user friendly than Windows.


A typical defeatist neo-luddism stance. You are part of the problem.

Instead of spreading your nihilism, you could be a force for good. We are not powerless. We can make a difference.

Contact your elected officials. Start a movement. Create a direct competitor to Apple. I know it’s tough, but we all just need to come together and stop the negativity.

Excuses or results. You choose.

Sent from my iPhone


Contacting your local representative is the right move. Bear in mind this change is specifically to comply with a law mandating the scanning of photos stored in the cloud; that’s why, apart from the opt-in for children’s iMessages, it only applies to photos you upload to iCloud storage.

The way to stop companies complying with the law is to get the law changed.


> I was pretty convinced that people were taking Copilot seriously

I think people used Copilot and realised it was a bit of a nothing burger.

> find a way to combat the real threat: life goes on.

That's the curse of the news cycle. But actions that do live on are monetary ones. When Apple miss sales targets, perhaps they will take note.

> Your three bullet points are wonderful, and I agree entirely with them. And it won't change a damn thing.

I think the last one is the most important. If each of us here can multiply awareness of the problem, we can start to make a real difference. Tech people have a massive influence over what devices ordinary people buy; after all, who else will they go to for tech advice?

> I'm not sure what to do about it.

It's a tough problem, but what I do know is that doing nothing will 100% change nothing.


I am cancelling my apple one subscription as we speak.


I completed my move to the Pinephone and formatted my iPhone over this. I'm probably not alone.


I would love to read some first-hand experiences of the Pinephone or Librem phones on here.


I've used one for a year now, it's a UMPC running Linux that can make phone calls. What do you want to know?


Indeed. The PinePhone is my current plan for ditching my iPhone. I had planned to upgrade to the new 13 this fall, so thank goodness this happened now. Apple just saved me a lot of money!


> Like it or not, we aren't in control.

Yes you are. Leave the default and don’t enable this for your children’s iPhones, and the iMessage change won’t affect you. Switch off iCloud photo storage and none of your photos will be scanned. You have complete control over this.


I can't have a phone or a computer that snoops on me to this degree. So for me I definitely won't change or forget in a week. Unless they solve it, there's no more Mac or iPhone for me.

(This may actually be a Good Thing)


Well, you can't save your friends, but you can save yourself: https://nixos.org/


You can do two things:

- Fight: you will either join the establishment or have your life ruined, but you won't change things.

- Adapt: accept that things are the way they are and make the best of the situation.

In either case you can still get caught by the machine and get ground up; it has always been like that, and as long as we are human it will always be like that. Any change that will happen is generational: 20 or 40 years from now, the issues that are important to you now will be addressed by people who floated up to the top and are in a position to make changes.


I still hope there is a chance that Apple may change course after they realise they may not have thought this through very well.

Especially the argument that this can be weaponised really easily worries me, and it looks like they overlooked it.

For example: some joker sends you a WhatsApp message with a lewd picture; WhatsApp by default saves all pictures to your photo roll => you are now on the naughty list.

I really hope they come up with a good answer to that one (or just abandon this unholy plan).


> In reality, four weeks from now, do you think anybody will still be talking about it?

Nope, and it won’t have the slightest impact on Apple’s revenue or earnings either.


I actually view this as not a good sign. Many are still in the very, very early stage of shock: what Apple is doing does not align with their perceived values, so they deduct points from Apple in their own mental model. They write this letter because they think Apple is still mostly good and hope it could have a change of heart.

What would happen, as some others have pointed out, is that people will forget about it. Apple will pump out a decent Mac product line, along with very good iPhone hardware, in a few months' time, which will add points back to their mental model.

"May be Apple is really doing it in good faith"

"They aren't so bad, let's wait and see."

Apple's marketing and PR have changed in the past 5-6 years. My guess is their KPIs have changed. And they will (successfully) whitewash any doubt you have. And the 2022 Q1 results (that is, the iPhone 13 launch quarter) will be record-breaking again.

And that is not even including competition. I mean, for Pete's sake, are Microsoft or Google even trying to compete? (I give credit to the Surface Laptop and Pixel teams, but it's still far from enough.)


It might be hard to understand, but iOS is a black box. Based on what they add and say, we still don't know what exactly is happening. All we have is trust. We can speculate with "what ifs", but the same "ifs" apply already.

Once we take other big platforms into account, Apple is actually trying to respect privacy. Other platforms just scan everything you put in the cloud, but Apple tries to limit the information they acquire by scanning it on device. And based on their specification, they scan locally only what is going into the cloud. It took me a while to realize, but this is an improvement over what has been happening in the industry for years already.
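
To make that gating concrete, here is a minimal sketch of the flow as described above. All names here are hypothetical stand-ins, not Apple's APIs; the real system uses the NeuralHash perceptual hash matched against a blinded database, with matches only becoming readable server-side past a threshold, not a plain set lookup:

```python
import hashlib

# Hypothetical stand-in for the hash database shipped to the device.
known_hashes = {hashlib.sha256(b"example known image").hexdigest()}

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a cryptographic hash is NOT equivalent.
    return hashlib.sha256(photo_bytes).hexdigest()

def prepare_upload(photo_bytes: bytes, icloud_photos_enabled: bool):
    """Return (photo, voucher) only if the photo is bound for iCloud, else None."""
    if not icloud_photos_enabled:
        return None  # photo stays local; no scan happens at all
    matched = photo_hash(photo_bytes) in known_hashes
    # In the real design the voucher is cryptographically blinded and only
    # becomes readable server-side after a threshold of matches.
    return photo_bytes, {"matched": matched}

print(prepare_upload(b"my holiday photo", icloud_photos_enabled=False))  # None
print(prepare_upload(b"my holiday photo", icloud_photos_enabled=True))
```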


IIRC, Apple was already scanning iCloud server-side for CSAM [0] so I'm not sure if that excuse holds.

[0] https://www.macobserver.com/analysis/apple-scans-uploaded-co...


They did not scan everything, only suspected ones; this expands the scope to everyone by default. But this is still an improvement. We might see a follow-up in the form of added E2EE services, which was not possible before.


So are they only scanning things you specifically upload to iCloud, i.e. if you disable iCloud and photo syncing you are fine?

Or does that include ancillary data like images in iMessage chats (even if you have iCloud data basically shut off)?

I read the scans were done on device but prior to upload. Just unclear what that really means to me.


This is an interesting point. I think you might be right.


> I mean for pete sake is Microsoft or Google even trying to compete?

Forget the evil monopolies. Consider GNU/Linux phones, the Librem 5 and the PinePhone. They may not be ready for the average user, but the HN audience can take advantage of them and help their development.


If open phones actually do get market traction (and I'm hoping they do), then I'm pretty sure that Apple/Google/etc will attempt to get laws passed banning them.

There are all sorts of avenues that could be taken for such laws, unfortunately. :/


Which is why supporting the Electronic Frontier Foundation is important.


I would love a phone based on the ideas from Plan9.


Would you? How often do you come into contact with the OS when developing for a phone, let alone when just using one?

Phones are products for people who use them. What sells phones is not what OS is underneath layers and layers of code, but user experience. If we are to even hope for alternatives, this is where the focus needs to be.

If the OS underneath mattered, the Nokia Communicator would have been a runaway success. It wasn't. It was a useless brick of a computer that could do almost nothing compared to even the first generations of smartphones from Apple and Google.


>How often do you come into contact with the OS when developing for a phone, let alone use a phone

That's exactly the point.

Maybe you don't know what Plan9 is, but think of it like this:

The phone is just a terminal. If I want to do business, I connect my terminal to my workplace servers; every application, all data and settings are there, and the calculation-heavy stuff and backups are done on the server. When I'm finished, I connect to my private Plan9 cluster. The phone is just a bit more than an intelligent terminal.

The difference is that with a Plan9 phone you would ~never have anything to do with software ON the phone, let alone have to worry about "syncing" to the cloud to make backups, updating apps, or encrypting the data in case of loss.

It's a bit like cloud gaming or virtual desktops with thin clients, but much, much more integrated.
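
A rough sketch of that thin-terminal idea, using SSH as a crude stand-in for Plan9's network-transparent 9P mounts (the host names are hypothetical; this is only an illustration of "nothing runs or persists on the phone"):

```python
import subprocess

WORK_HOST = "work.example.com"      # hypothetical workplace server
HOME_HOST = "plan9.home.example"    # hypothetical private cluster

def run_remote(host: str, command: str) -> str:
    """Run a command on the remote host; nothing executes or persists locally."""
    result = subprocess.run(
        ["ssh", host, command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    # "Connect my terminal to my workplace servers": the app, its data and
    # settings all live remotely; the phone only displays the result.
    print(run_remote(WORK_HOST, "cat ~/notes/todo.txt"))
```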


Slightly problematic when your phone loses service, but I get where you are coming from. It would be nice for that to be viable; we’re probably decades away from having good enough network connectivity.


True, true, it needs some caching and CPU capability, like maps when you're offline... stuff like that. But with 5G... well, we will see ;)


Ah yes, 5G. The bag of promises telco execs have dangled in front of large customers for nearly a decade that will deliver on all their needs.

If they could just figure out how to deliver on it ;-)


>If they could just figure out how to deliver on it

Install an antenna? You know that before 5G there were 4, 3, 2, 1 and 0.5G?


Well, no.

Most of the attraction of 5G isn't really anything you as a consumer will ever see. Like more advanced spectrum management and more advanced core network components that allow what you can think of as "virtualization with quality of service guarantees" for multiple tenants. On top of that 5G also makes use of higher frequency spectrum which has implications both for use and the design of mobile networks.

To really make life interesting, 5G also aims to enable private 5G RANs. For instance, industry in some countries is heavily involved in building its own 5G networks for its own use in geographically limited areas (made possible by allocating higher-frequency spectrum for these uses).

The customers that 5G targets aren't so much consumers as government and industry. For instance, removing the need to build dedicated infrastructure for emergency, defense, law enforcement, manufacturing etc. would represent cost savings.

Part of the challenge of 5G is that regulatory authorities don't understand how 5G differs from what they are used to, so many places in the world (with eager lobbying from the telco industry) one will try to keep new players out of the market. (Of course, they might understand, but there may be "motivating factors" to cling to the status quo where spectrum is mostly allocated in large chunks at significant cost and with bureaucracy that ensures smaller players are kept out)


>Most of the attraction of 5G isn't really anything you as a consumer will ever see.

I definitely see the speed difference; that was the point here, remember?


How long have you used 5G?


I know both Plan 9 and what it looks like when someone falls in love with an idea to the degree where they stop thinking about the implications.


Having Unix and trying to implement Plan9 on top of it?


Depends. Apple is on my "call first" list now on my CV, along with chem and bio weapons and payday-loan-type companies.

And I won't be buying the iPad I was thinking about.


Fair point, and you are probably right. However, this is getting them a ton of bad press from pretty major players in the privacy world (Edward Snowden, for example). There's a high chance of a lot of people abandoning Apple products (I'll no longer purchase new Apple products and am already switching over to de-Googled Android).

Privacy is obviously just a marketing play by Apple but this time, they can't hide it and it might actually hurt their bottom line.


I actually expect most to support Apple’s approach here. But it is early.


> especially when the goal Apple announces appears to be for the greater good.

Not surprising at all.

In Russia, _every_ measure to limit internet freedom was introduced under the pretence of fighting child pornography and extremism.

Then, of course, it was used to block the websites of political opponents.


No need to look at Russia only; what about Germany, one of those "bastions of free speech"?

Also, wasn't the rationale for the crypto ban the same? Either terrorism or child porn? If you support E2E, you're effectively supporting child porn?

https://gigaom.com/2009/06/16/germany-to-vote-on-block-list-...


Does anyone really think of Germany as "bastion of free speech"?

It is one of the few European countries that still has (actively used) anti-blasphemy laws[1] and laws against insulting foreign leaders[2].

[1] https://friendlyatheist.patheos.com/2016/02/27/in-rare-move-...

[2] https://www.theatlantic.com/international/archive/2016/04/ge...


The second one is outdated - not only did the prosecutors drop the case against him for insufficient evidence, his situation led Germany to subsequently repeal the law against insulting foreign leaders. He received plenty of support, both from the public and from the public broadcaster on which he recited the controversial poem.


Fun fact, it's also the only place that I know of where you can get fined for calling a German a Nazi.


In many countries (including the US) there are circumstances where defamation law can create civil or criminal liability for such a claim, but yeah, usually not merely for stating it without certain other things also being true about the situation.

Conversely, there are cases in Germany where calling someone a Nazi would not lead to a fine. One very clear example:

Person A: [Unambiguously asserts a sincere adherence to Nazi ideology]

Person B: Did you hear that, everyone? Person A is a Nazi.


I'm not talking about defamation. I can understand why you would infer that. The fine I'm talking about is about "feeling insulted". There doesn't even have to be a third person present. Say someone treats you in a racist way and you call him a Nazi; a judge can order you to pay 400 euros (or more, depending on how damaging he thinks you may have been to his feelings) for hurting his feelings. No need to have a witness prove it either. It's a purely subjective judgement.

Calling someone a Nazi to shut them down in public actually seems to work well to shut down public discourse, judging from what I'm observing in Germany right now. I don't think those defamation lawsuits go anywhere (even though they should).


Do you have any proof of that actually happening? Genuine question, because fining someone over pure hearsay seems insane.


I was assaulted once and said this can only happen in a fascist country. The assaulter filed a complaint saying I called him a Nazi, and I was fined for doing so, despite the fact that I never used those words. The law literally talks about this as something "hurting your honour". This law is ridiculous in its own right. It's not really related to the word Nazi at all, which is why I called it a fun fact. It doesn't matter that I didn't say it. What matters is that the judge thought that I could have said it.

Attached is a case where someone appealed a judgement at the constitutional court of Germany because he was fined for calling a social worker in a correctional facility a "Trulla"[0]. Think of it as getting fined by the court for contempt for calling a random person a "Karen" in the US. He did say it, but it hardly constitutes an insult to someone's honour. The court case argued that the way this law is currently used directly contradicts the right to freedom of speech. Obviously they lost, because it would have set a precedent rendering this whole charade meaningless.

The way this law works is that what constitutes an insult to your honour is left undefined enough that it can be used whenever someone threatens authority, or if authority doesn't like you, and you can be fined.

[0] https://www.kostenlose-urteile.de/BVerfG_1-BvR-224919_Erfolg...


It was promotion of suicide before it was extremism and CP.


Not only in Russia. It's sad to see a country (in central Europe) which puts a guy in jail for 8 years for raping children for more than 8 years, justifying such invasive software as a defence against CP.


> 1. Vote with your feet - Put your money in the pockets of the people aligned to your values.

The sad reality right now is that we're actively suggesting that people stop buying smartphones, and I think that means any such argument will fall on deaf ears. Yes, projects like the PinePhone exist, but they aren't ready for the general public.

As much as I agree with your comment, the sad reality is that I have to pick between Apple and Google. Of those two companies I still trust Apple significantly more.


That's not the only choice at all.

Pixel phones + GrapheneOS/CalyxOS. Various phones + LineageOS. Various devices + SailfishOS. Librem 5. PinePhone. Fairphone. /e/. F(x)Tec Pro1.

You're greatly underestimating the ability of people to adapt when sufficiently motivated, especially when so many devices can be bought fully set up and most people use only a few apps anyway.


There are lots of DIY electric car kits out there too, but suggesting one as a viable alternative daily driver to mass-market consumers seems a bit unrealistic.

Apple was nice in theory because it provided out-of-the-box privacy and security that laypeople could just go buy.


I think people need to wise up extremely quickly to what we are facing here. I cannot overstate the importance of being extremely sceptical of the motives of tech giants at this point in time.


I think the biggest challenge is a lack of consumer choice. Part of this is the complete failure of the FTC and the SEC in the 21st century. Maybe the PinePhone will be compelling some day in the future, but it isn't now. I don't want to be on tech support for my entire family trying to get some toy phone to work. We've built phones that are generally simple enough for a random grandma to do the couple of things they need to do, but the problem is that to them FB, etc. IS the internet. That is where they see the pictures of their grandkids; they don't care about anything else.

I think there is a #4 that could be on the list. People within Apple could try to push back and protest and walk out, or use any other means to try to make this fight go viral. However, the media and population writ large will push back with "... for the children." I've always thought that these problems are more solvable with whistleblower awards plus witness protection packages for the major enablers. There are lots of people in these rings, or close to someone who is, but they are dependent on those same bad people for their necessities. Also, some of these rings are crime-group adjacent, and witnesses would need protection.

From what I'm reading, it seems like turning off iCloud (maybe just for photos?) will turn off this scanning. It is unclear to me what server-side scanning Apple was/wasn't doing on photos uploaded to iCloud previously/currently. The one thing that occurred to me is that this almost seems like a CYA, Section 230 protection in disguise. There have been more discussions about Big Tech and 230, and this is one way to say "Look, we are compliant on our platform. Don't remove our protections or break us up, we are your friend!"


> 1. Vote with your feet - Put your money in the pockets of the people aligned to your values.

This is impossible. Just like modern democratic voting, you don’t get to vote on an individual policy. You vote on a bundle and that bundle almost certainly includes policies (or in this case “features”) that you don’t agree with.


So? If a corporation (or political party) does a shockingly egregious thing that doesn't align with your values then you absolutely should switch to an alternative that supports your values.


He’s saying that none of the options is ideal, so there’s nothing to “switch” to.


What if all of them do at least one egregious thing? Especially when "all" means just Apple and Google?


The thing that gets me about #1 is that if I prefer iOS to Android, there is nothing that can replace iCloud. Google Photos and other apps cannot upload seamlessly in the background as iCloud can. I cannot “restore my iPhone” from a Google Drive backup. Apple uses private APIs for their own software to enable this, so now I must either use iCloud and accept these terms, or make iTunes backups regularly + deal with the inconvenient workarounds required to let things like Google Photos sync everything.

Given that there are two realistic choices in mobile OS these days, both bound to hardware (and update schedules to a combination of mfg + carrier whim), it’s not a great position to be in as a consumer. And yet, what is the alternative? PinePhone?


> 1. Vote with your feet - Put your money in the pockets of the people aligned to your values.

As much as I dislike this invasion of privacy, I still trust Apple's products, ecosystem & stewardship over the alternatives (for use by whole family).

Although this is the first time in recent memory where not having a viable alternative to consider is irksome, as I don't see where the negative pushback will come from to prevent further user-hostile invasions like this. I can only hope it ends here and it's not the start of a slippery slope towards government surveillance.


What scares me is that now that this capability is out there, a court could compel Apple to go a lot farther.


The UK porn filter is already seeing work to extend it to fighting extremism on the Internet.


The problem with 1 is that only a tiny number of people understand or care about the issue. It won't make a difference to Apple's bottom line. Then, when other companies start doing it, you eventually become tech-excluded.

This needs to be dealt with at legislative level.


"vote with your fit" is as ineffective as taking less showers. It is a non-solution (what helps is growing less food in a desert, aligning government with the needs of most people)


With regards to #1, that is what I did last year when I chose to buy Apple, since SafetyNet meant I would be completely hosed trying to buy a phone that works with privacy-centred custom ROMs while also using important apps. Now that Apple is completely breaking user trust, where do you possibly think people can 'vote with their feet'? Isn't this a libertarian fantasy, other than simply not using smartphones altogether?


Apple has a right to protect themselves from being an accomplice to distribution of child porn.

Maybe you freedom fighters should have heeded the warnings over the decades of creeping corporate control and stopped buying their products?

Oooh but that dopamine high from new stuff is so addictive!

Now you’re finding yourself in an overturned boat, adrift at sea, no life jacket, and you’re ready to take on monopoly?

The anti-politics bloc is incensed that their hands-off approach to politics enabled others to wield politics against them.

Good luck.


non-insignificant -> significant


I used a double negative on purpose there :)


> 1. Vote with your feet - Put your money in the pockets of the people aligned to your values.

Too bad many people have been trapped within the Apple bubble and are unable to jump ship because they have no idea how the rest works or how to get their data over there, and don't have the time to do all this.

This is why Apple IS a monopolist and this is how it ends up being a problem.


That would be fine except an informed choice depends on accurate information, and this letter starts off saying things that are patently false.

“ Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac. ”

No it doesn’t: photos just saved on the phone are never scanned by either of the systems Apple is implementing, and shared photos are only scanned in two very specific situations.

Also, they avoid pointing out that the iMessage aspect of this is opt-in only, so it only applies to children, and only if parents choose to enable it; it's off by default.

You can avoid all scanning entirely by not opting children in to the iMessage scan (which is irrelevant to you if you don’t have children anyway) and not using iCloud Photo Library. Photos you keep on your phone or share using other services will not be scanned. Anyone just reading this letter wouldn't know any of that. How is this promoting informed choice?

Bear in mind the iCloud Photo Library scan is being implemented in order to meet a legal obligation. Apple is required by law to scan photos uploaded to their cloud service.

So really there are two issues here. One is the question of parental authority over optionally enabling the scan for their children’s use of iMessage. That's a legitimate concern over the privacy of children, no question.

The other is whether this is an appropriate way for Apple to comply with US law requiring scanning of uploaded images, and whether that law is appropriate. Again, entirely legitimate concerns that this letter obfuscates completely.


Probably the main issue is that every technology or regulation that starts like that ("you don't need to worry if you're not doing anything wrong") is inevitably expanded into repressive activities by governments around the world.


Oh absolutely, the laws that started all of this are terribly thought through: all a child pornographer has to do to avoid these controls is not use iCloud Photo Library. That's it; just do that and they can have as much child porn on their phones as they like, and share it however they like using iMessage or anything else.


> all a child pornographer has to do to avoid these controls is not use iCloud Photo Library

Yes, so as it is today, this system isn't even useful for its stated purpose. Apple are not so stupid that they don't know that. Either it will not always be as it is today, or the stated original purpose isn't the ultimate purpose.

After all, the problem you state has a simple and obvious solution: just scan everything on the device.

The only saving grace is that a full database of copyright material won't fit on a current device, so as long as you avoid offending people in power in your jurisdiction you're probably fine for a few years.


Or maybe they are just doing what they said they are doing, for the reasons they gave: to comply with the law, and that's that. But apparently that's not even a possibility you can conceive of.


Related but slightly off-topic: am I the only one who thinks more technology is not the answer to catching crooks? Can't the police do good old-fashioned police work to catch people doing these things? Why does EVERYONE need to be surveilled for the 0.01% (or less?) who don't behave properly? To further this point: why do we need cameras on every street, facial recognition systems and 3-letter orgs storing huge data silos and maintaining lists, etc. etc.? One thought: is it because over 10, 20, 30+ years the police have been de-funded everywhere and become useless for difficult cases, and/or militarised to deal with peaceful protesters instead? Genuine questions... I just don't think this surveillance nightmare is the answer, and the police could still catch crooks before the internet, so what's the problem?


>One thought: is it because over 10, 20, 30+ years the police have been de-funded everywhere

Police funding has shot up in America. It's everything else that's been cut.

>what’s the problem.

There's no problem. It's just a drawn out power grab with a weak pretext.


>There's no problem. It's just a drawn out power grab with a weak pretext.

In 2018 tech companies reported over 45 million images of CSAM, which was double the amount that was reported the year before.[0] The next year the number of reports went up to 60 million.[1] I wouldn't expect everyone to agree on the proper response to the rapid growth of child-abuse imagery online, but I don't think the problem itself should be dismissed.

[0] https://www.nytimes.com/interactive/2019/09/28/us/child-sex-... or https://web.archive.org/web/https://www.nytimes.com/interact...

[1] https://www.nytimes.com/2020/02/19/podcasts/the-daily/child-...


There is literally no logic to what they are doing. The CP criminals will just work around it, and the other billion or so of us are stuck with Apple acting like they're the police, waiting for the government to send them the next "CSAM" database of political-dissent hashes, or pretty soon just scanning for anything governments don't like on the device that YOU OWN. This is a joke, a very dangerous one. This will do nothing but put a slight dent in CP arrests and deliver another huge body blow to democracy.


I couldn't find anything in the NYT article describing whether these companies' efforts are leading to less child porn being created. Is that true? Everything I read and searched for wanted to make it sound like child porn and sexual abuse of children are skyrocketing; nothing concrete about the number of victims per capita per year in the USA or the like. And, of course, it made it sound like this effort is deeply underfunded and needs tons more money behind it because it's exploding as an issue and your next-door neighbor is clearly raping children... It felt a bit sensationalist.


Yeah, that article definitely reads like a thinly-veiled scare piece.


> Police funding has shot up in America. It's everything else thats been cut.

What else in America has been cut? I'd be interested in that very long list (since you're saying that everything else is seeing cuts; the understood meaning being that a very large number of major items are seeing their budgets slashed). The spending data I see year in year out, or across decades, is showing the opposite.

Healthcare spending has skyrocketed over the last 20-30 years, including healthcare spending by the government (eg Medicaid, Medicare). They're not slashing Medicaid or Medicare, those programs have far outrun inflation and are drowning the US budget.

Social Security hasn't been slashed. Even the leading Republicans no longer dare talk seriously about cutting Social Security (30-40 years ago they commonly did). Trump could hardly run away faster from that conversation, it's the third rail of US politics, absolutely nobody dares.

Education spending has not been slashed. US teachers are among the best paid in the world and Americans spend more per capita on education at all levels than just about any other nation (while getting mediocre results for their epic investment, as with healthcare spending).

Defense spending of course continues to rise.

The US welfare state has continued to perpetually expand. US social welfare spending is larger as a share of GDP than it is in Canada; and it's closing in on Britain (the US will catch Britain on social welfare spending as a % of GDP this decade). The US social safety net has gotten larger, not smaller, over the decades. Programs like housing first didn't even exist 30 years ago; food security programs like SNAP continue to get bigger over the decades, they're not vanishing.

US Government spending has continued to soar year by year. Typically 5-8% annual spending increases are normal (just look at the radical spending increases during the Bush and Obama years, or any of the recent budgets). Total government spending (Federal + State + Local) has continued to climb, it has not been slashed or reduced. Total US government spending is taking more out of the US economy than it ever has outside of WW2 - you have to go back to WW2 level spending to find something comparable.

Total US government spending has increased by roughly 225% over the past two decades (more than triple the rate of inflation over that time). The soaring spending shows no sign of letting up.

The major US Government agencies - such as NASA, NSA, FBI, DHS, VA, DoJ, etc - have not had their budgets slashed over the last few decades, they keep climbing year after year.

The only big one I can think of is infrastructure spending, which has not kept up with inflation because both sides have refused to raise gasoline taxes.

What kind of results do we have to show for the massive increase in government spending? Are our streets now lined with gold? Things are better than they have ever been, is that right? Is our quality of life equivalent to Switzerland, Norway, Denmark, Sweden? Because we now have their per capita government spending levels. The government systems of the US are spending the equivalent of 45% of the economy each year.


>Healthcare spending has skyrocketed over the last 20-30 years

Aging populations with for profit healthcare will do that.

>US teachers are among the best paid in the world

It's still a comparatively low paying, thankless profession and consistently fails to attract talent because of low pay. Most Engineers, for example, have starting pay similar to a teacher's pay with 15 years of experience. Both jobs require the same level of training and degree (Bachelor's).

>total US government spending is taking more out of the US economy than it ever has,

Government spending is still economic activity. It doesn't 'take money out' of the economy, it contributes to it. It pays workers like any other employer that can then spend that money in the local economies. If your argument is 'government spending is bad' then that's a terrible argument.


We hire, and train, terrible cops in America.

They are basically revenue collectors.

I would like to see all cops under federal jurisdiction. Let the FBI train them.

I know in my county of Marin we have way too many just looking for any slight moving violation.

I have felt for a while that we also need complete bans on certain kinds of tech.

With the exception of always on cop cameras.


Exactly, there's a reason the FBI takes over real criminal cases when they occur - the police are just there to settle petty, inconsequential local disputes.


Inconsequential to you. Consequential to those people involved. Jesus Christ, dude, not everything revolves only around you or your concerns.


American police have never been de-funded, and if you really think "good old-fashioned" police work ever actually worked for most people, you are misinformed.

Without even discussing policing across racial and poverty lines, one only needs to look at police failing to catch serial killers, rapists, and even just house burglars.

The situation was much worse 10, 20, 30+ years ago. Nowadays police have some of their work done for them by technology and they are at least able to catch a few more criminals. Technology has also helped prevent false arrests in some cases. Apple's content scanning is problematic, for sure, but technology in general has been great for community safety.


I don't know why this is downvoted, you are absolutely right.

Prosecutions based, essentially, on community suspicion are what led to countless black men being wrongfully convicted of the rape of white women.

Police embracing cutting-edge science like DNA sequencing is what allows unreliable, antiquated evidence like (gasp) eyewitness testimony to be given its proper weight.

Perhaps people consider DNA evidence to be "good old-fashioned policing" nowadays, but it was within people's lifetimes that it was as new as quantum computing is today.

The sooner the "good old fashioned policing" meme dies the better.


I don't think this thought-crime scanning on our devices is going to fix any of that. Cops don't need to be in my phone unless they have reason to suspect me and have a warrant to search it. Otherwise this is just a corporate proxy for the cops in warrantless searches of what we're doing day to day on our own devices.


Thanks for sharing your knowledge, yes I am fairly uninformed about police effectiveness which is why I wanted to find out more.


> "Can’t the police do good old fashioned police work to catch people doing these things?"

I'm a detective that works exclusively on online child sexual offences.

The short answer to this is "no", although the question doesn't make much sense to me. Policing has always been near the forefront of technology.

Perhaps you could expand more on what "good old fashioned police work" means, in this context?


Not the OP, but by good old fashioned police work, I assume non-dragnet methods, where everybody's device isn't scanned in an automated way. So instead of sifting through a massive collection of automatically collected data, taken from a vast majority of innocent people, you'd deal with explicit reports of CSAE. You'd then be able to get a warrant to obtain ISP (and other) records, cross-reference and proceed from there. If there's reasonable suspicion, you'd get the suspect's address and go talk to them in person.

Before we started trying to push government-sanctioned and unwanted spyware engines on private devices, I imagine the process looked something like that. Is this incorrect?


The system is basically what you describe except the explicit report is precisely what Apple send to NCMEC.

By the time it gets to the police, there will be an identified crime.

This has been the case for many years. I don't have the numbers to hand but I believe NCMEC receives on the order of 100 million referrals a year.

EDIT: it's 20 million according to https://www.missingkids.org/ourwork/ncmecdata

https://www.missingkids.org/footer/media/keyfacts - around 99% are from tech company referrals, 1% from members of the public


But we've already established there is no public oversight over the contents of the NCMEC database and that there cannot ever be, by design. Furthermore, it's known to contain hashes of non-CSAE images simply because they were found in a CSAE-related context.

So how can this system guarantee civil freedom? How can it be guaranteed that it won't be exploited by the small number of people in power to actually inspect it and manipulate it?


Have we established that? Certainly not a reality I recognize. The processes involved in CSAM databases like NCMEC/ICSE are many years old. If it leads to widespread civil rights abuses, where are they?

Google Drive has 1bn users. Google scans content for CSAM already. Shouldn't we be seeing these negative side effects already?

Proponents of these systems can point to thousands upon thousands of actual "wins" (such as identifying teachers, police officers, sports coaches, judges, child minders etc who are pedophiles) and detractors cannot provide actual evidence of their theoretical disadvantages.

No system is perfect, no system "guarantees civil freedom", this is not a fair test. The actual evidence suggests automated scanning for CSAM is a net win for society.


“ detractors cannot provide actual evidence of their theoretical disadvantages.”

This is the concern. The system is hidden and you have no way to challenge it. You are simply "trusting" that this system is working well.

Who watches the watchers?


That’s the major concern I have: take as a given that NCMEC is on the side of the angels here, what happens when some government demands that Apple help identify anyone who has an image shared from a protest, leak, an underground political movement, etc.? The database is private by necessity, so there’s no way to audit for updates.

Now, currently this is only applied to iCloud photos which can already be scanned server side, making this seem like a likely step towards end to end encryption of iCloud photos but not a major change from a privacy perspective. What seems like more of a change would be if it extended to non-cloud photos or the iMessage nude detection scanner since those aren’t currently monitored, and in the latter case false positives become a major consideration if it tries to handle new content of this class as well.


> If it leads to widespread civil rights abuses, where are they?

This is disingenuous, since up to this point these databases weren't used in systems residing on end user devices, scanning their local files.

The disadvantages aren't merely "theoretical"; they just haven't materialized yet since the future does not yet exist. To ignore these obvious faults by hand-waving them as theoretical is to blatantly ignore valid concerns raised by thousands of informed citizens.

No system is perfect, but that doesn't mean there aren't some systems that simply go too far.


The distinction between scanning locally before upload, and on server after upload, is an implementation detail. The only reason Apple have done this is to allow them to implement E2EE for iCloud without hosting CSAM.

All arguments relating to anything other than this are arguing against something that doesn't exist.

I don't dispute that fictional proposals in the imaginations of HNers might pose a grave (fictional) threat to civil freedoms.


I don't really understand what you mean when you say it is an "implementation detail". It is definitely a detail of the implementation, but it is an extremely important detail that makes all of the difference.

Given that it is just an implementation detail, I guess they can simply change it and then the rest of us will be at peace?

I think it would do you well to see this quote:

> ”Any proposal must be viewed as follows. Do not pay overly much attention to the benefits that might be delivered were the law in question to be properly enforced, rather one needs to consider the harm done by the improper enforcement of this particular piece of legislation, whatever it might be.”

> -Lyndon B. Johnson


> you'd deal with explicit reports

Where would these reports come from though? Without these dragnet methods, it would seem like a very simple matter to get away with owning this kind of material.


This is a reality you have to accept. It is a simple matter to get away with owning this kind of material, dragnet method or not.

There is no point in trying to eradicate all such material and bring the number to absolute zero. It's impossible. The only way to semi-ensure this would be absolute slavery of the citizenry, which I hope is a non-goal.


Nobody is proposing that though...

It's a fact that NCMEC referrals have identified teachers who were secretly pedophiles, who were subsequently banned from teaching. Most people see this as a good thing. If you want to do away with this, you have to bring a compelling argument.

I support everybody's right to argue for the type of society they want to see, but there's an arrogance/conspiratorial flavor to a lot of the comments here that suggest they don't really understand what they are up against. There are actual benefits.


> If you want to do away with this, you have to bring a compelling argument.

I'm a law-and-order guy myself; I have applied for a (technical) position with the police and would even accept a pay cut for it.

But it is with this as with other security-related things: many people only consider two of the three cornerstones:

- confidentiality

- integrity

- availability

Same here:

Justice is not only about punishing every very bad guy.

It is also very much about not dragging innocent people through the worst accusations an innocent person can be dragged through.

Keeping people safe is not only about rescuing people from hostage situations - it is also about not ramming in the door of someone relaxing on their couch because some random guy or gal called in a hoax over an IP phone.

Too many people already go through huge problems because of false accusations. You say you work in the field, so I guess you must have heard about cases where perfectly innocent people committed suicide because their lives got ruined by what later turned out to be baseless accusations.

Justice is also about not creating a system that creates even more of this suffering, right?

(BTW: I really enjoyed the AN0M trick and a few others. I root for you guys, but let's keep it real and not set up the worst footgun since the well-meant lists of Jews in Europe early last century.)


I completely agree with your characterization of the various concerns.

I don't agree with the characterization that this system will cause baseless accusations.

These systems have been in place for many years and have proven themselves useful and reliable.

Apple have tried to implement it in a way that allows them to turn on E2EE on iCloud, and HN has turned that into a conspiracy theory.


I don't deny the system has some benefits; they just aren't benefits I will gladly give up my own freedom for, nor the freedom of my fellow citizens.

I don't want to do away with anything; I just want "you" out of my devices, where "you" don't belong anyway and where you haven't been up to this point. (Well, you still aren't there because I don't use an iPhone, but hopefully the point is clear.)

Also note that I'm not against the use of such technology in certain situations, such as in unencrypted cloud storage. Though concerns with lack of oversight exist there as well! But installing this on end user devices goes a (huge) step too far.


Sorry, to clarify, you do in fact support Apple running the exact same scans on their server side CPUs? A situation that has the exact same effects on society...?

Your only concern then, is with a future authoritarian policy of your own imagination?


>A situation that has the exact same effects on society...?

Of all your (mis)leading questions, this one sticks out the most as being of particularly bad faith.


Can you explain why?

Commenters here are making claims about false positives, arrests of innocents, lives ruined, etc. These effects are identical whether uploads are scanned before or after transmission.

What are the different effects on society from scanning the upload before transmission?


Yes, but

> A situation that has the exact same effects on society...?

This is patently false.

You do come off as arguing in bad faith, resorting to debasing any concerns of policy misuse as simply a product of the imagination of the person raising the concern. As if authoritarian policies and erosion of freedom/privacy are something completely baseless, impossible and unheard of, instead of happening all the time.


It's still not clear to me whether you are raising an issue with scanning client side, or your concern is actually with some other policy you expect to be implemented in future.

What is the most obviously different effect on society? I don't know how anyone would even know what CPU calculated their file hash.


Thanks for your reply - I meant that police caught criminals before the internet (I do not know how effectively, and am not knowledgeable on this subject generally); however they did that - getting out there, speaking to suspects and victims, and investigating with evidence, I would guess.


Well, police still investigate with evidence, but the potential scope of "evidence" is pretty much the whole physical universe. File hashes and TCP packet captures are evidence, DNA fragments are evidence, weather patterns are evidence, in the same way that people's memories are evidence.

Through the decades, the respect shown to eyewitness testimony has generally declined, and crimes with no eyewitness evidence are still expected to be solved.

For offences with a huge online aspect there is no prospect of "getting out there" until you work out where "there" is, because it could be anywhere in the world.


First, very brave of you to post here, and thanks for that. Second, thanks for all your hard work in keeping this world sane. However, I would point out that this is a complete invasion of privacy, and essentially works around the 4th amendment in both spirit and letter by using Apple as a proxy to spy on us. I realize police just want to do their job and have to push on the limits, but with all due respect I think this goes too far. That's why we push back when something as ridiculous as this happens. I don't want to be policed on my own devices; the only way police should get into them is with a warrant - then you can drop a world of security on me. The pedos will just work around this, while it becomes the first step toward allowing police into all our personal devices. Anyway, again, I mean this in a respectful tone, I just think it goes way too far. Cloud drives are already being scanned, and that's with corporate permission on their own servers, so it's tolerable if undesirable, but what Apple is doing here goes too far.


Do you have any qualms?

For you does the CP protection end justify any means?

Where would you personally draw the line on mass surveillance by LE for the sake of your specific LE goals?

CP aside, are there other crimes that you feel should be folded-in to a dragnet like this?


You probably don't realise it, because you're coming from a perspective that has been heavily influenced in a particular way, but some of these questions are kind of insulting and don't really assume good faith (or even basic decency) on my part.

> "does the CP protection end justify any means?"

Like, is this legitimately a question you think I might answer "yes" to?

This is the equivalent of "do you support the rape of children?".

I'll gladly comment on more specific points if you are genuinely struggling to understand how Apple could honestly implement this system in good faith.


I apologise if you genuinely felt my questions were assuming bad faith. It was not my intention.

> "does the CP protection end justify any means?"

It's a style of argumentation. Not personal. When trying to find where to draw a line in the sand, one way is to draw a line that almost certainly encompasses us both. We are obliged to consider: if not this line (obviously) then what line?

My intent was to find your limit. Do you have any qualms with what you may do under the law? For you personally, how much erosion of innocents' liberties would be acceptable?

Based on your earlier comment, I was not asking Apple, I was asking a LEO who acts with some but limited justification. Legal and moral. And I would like to ask in good faith about how you see those limits.

>> This is the equivalent of "do you support the rape of children?

No need to make it black and white. Almost nobody supports this. I am sure you don't. Please assume good faith on my part too. There are always trade-offs. How far would you go?


How are you expecting me to describe this limit?

I think it's legitimate for companies to implement automated systems, such as CSAM and spam filtering, to limit the amount of unwanted material on their networks. I don't have any problem with Apple, Google, and Microsoft, checking the hashes of files I upload (or attempt to upload, in Apple's case) to their servers against ICSE. I would have an issue if employees of those companies had unfettered, unaudited access to users files.

Outside of giving my opinion of a specific proposal I don't know what you expect me to say.

Perhaps you could describe your own "limit" to how much avoidable suffering is acceptable to you before you would support automated scanning of uploads. I don't personally believe it's possible to precisely explain an overarching "limit" in situations that balance competing moral and philosophical concerns.

---

I'm being rate limited now due to downvotes, might not be able to respond further


Sorry you got rate-limited. On HN, new accounts are rate-limited by default because of past abuses by trolls. I've turned that off now for your account (I'm a mod here).


It's probably timing that is limiting your responses. AFAIK downvotes won't do that. And I didn't down-vote you.

I accept your answer, and acknowledge that different perspectives are valuable.

My own opinion is far less relevant as I probably will not be implementing or executing systems that target information from large swathes of a population for inspection by my organisation.

Nevertheless, let me give it a shot. I asked you about the social cost of false-positives. You reciprocated by asking me about false-negatives. It's tough - on both sides of the equation.

I try to apply weighting. If the dragnet would be targeted to a limited number (say 100) then it could easily be justified, since the relative horror of one over the other is surely in that ball park. Maybe even 1,000.

The problem for me is that mass surveillance such as the subject of this discussion is not numerically constrained. It's trawling the entire ocean floor for the one or two species that may be legally caught.


I don't think police have been defunded so much as the funds have been put into unnecessary things like surplus military gear, settlements to defend the small minority of bad cops, etc. That takes the oxygen out of everything, including what most people consider fair police work.


Well, what they’re searching for is done privately and the victims often don’t go to the police. And good old-fashioned policing was to ignore these crimes.


Where are you hearing that cops are getting defunded? The curious should look up the budgets of the NYPD and LAPD - looking mighty funded to me!


For what it's worth, police funding in the US rivals and exceeds the military funding of other countries.


Asking in earnest: are you speaking in absolute terms or relative terms? That is, does the percentage of funding (as compared to the rest of the budget) exceed most other countries’ military funding, or just the absolute amount?


Absolute.

Today, the U.S. collectively spends $100 billion a year on policing and a further $80 billion on incarceration.

Just the spending on policing is larger than what other countries spend on their militaries. Actually, the only country that spends more on its military is China [1]; after taking incarceration costs into account, it is 2.5 times the military expenditure of India.

[1] https://www.sipri.org/publications/2021/sipri-fact-sheets/tr...


Thanks for the link, but have you read it?

"the only country that spends more on military is China"

United States: 778

China: [252]

India: 72.9

(military spending in US$ billions, from the linked SIPRI fact sheet)

:facepalm:


I compared US' __police__ spending of $100 billion, to the __military__ spending of other countries.

I assume you missed that. The comparison is not between militaries, but US' police vs the military.

:facepalm:


Ah, I did miss that - didn't go up enough to see the parent :facepalm: China does spend more on defense than policing, at least nominally, per page 5 of the 2021 budget proposal hosted on Xinhua:

"National defense spending was 1.267992 trillion yuan, 100% of the budgeted figure. Public security expenses totaled 183.59 billion yuan, 100.2% of the budgeted figure. "

http://www.xinhuanet.com/english/download/20210313budget.pdf


> Can’t the police do good old fashioned police work

There is no "old fashioned" work. The police is a relatively new concept. And these issues (Child Porn) are really 21 century issues.

> Why does EVERYONE need to be surveilled for the 0.01% (or less?) who don’t behave properly.

Because you think they are surveilling people to catch the bad people. You have too much faith in the system.


So, if I understand correctly, the NCMEC provides a bunch of hashes of CSAM for Apple to match. This way Apple doesn’t get exposed to the content of images themselves? Then Apple will provide a user’s details plus the IDs of matching content. This identifies the direction of travel for any CSAM content?

So, now NCMEC and any local NCMEC can provide new hashes and identify – possibly even historically – the epicentre of the distribution of a group of images.

Except, if Apple only gets the hashes, what’s to stop a bad actor in an NCMEC from providing non-CSAM images to Apple for other purposes?

Seems like this technology should be illegal without a warrant. It’s a wire tap.
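
To make the concern above concrete, here is a naive toy of the matching flow as described - provider-held opaque hashes, device uploads checked against them, matches reported. It is not Apple's actual protocol (which uses perceptual hashes and private set intersection); the SHA-256 stand-in, the sample "images", and the account ID are all made up for illustration.

    import hashlib

    # Toy model of the flow described above (NOT Apple's protocol).
    # The matcher only ever sees opaque hashes, so it cannot tell whether
    # an entry was added for CSAM or slipped in for some other purpose.
    blocklist = {
        hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
        hashlib.sha256(b"political-leaflet-bytes").hexdigest(),  # a bad actor's insertion
    }

    def scan_uploads(account, photos):
        """Return the IDs (hashes) of any uploads that match the opaque list."""
        return [hashlib.sha256(p).hexdigest() for p in photos
                if hashlib.sha256(p).hexdigest() in blocklist]

    print(scan_uploads("user-123", [b"holiday-photo-bytes", b"political-leaflet-bytes"]))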


NCMEC's database already contains non-CSAM. What you suggest is not theoretical, it's reality today.

I really think people are missing this point. NCMEC's database is not an infallible, audited and trustworthy source of despicable imagery. It's a mess contributed to by thousands of companies, individuals and police.

It's also so intertwined with the FBI that I don't think it's truly correct to call NCMEC independent, given FBI employees work at NCMEC, including on the database.

The FBI's involvement is particularly notable here, given their past usage of mass surveillance.

This is a truly terrifying precedent.


Wow.

Also, many people trust Apple to do ‘the right thing’. But if Apple are trusting NCMEC and their local equivalents to provide valid CSAM hashes, Apple will never know if their technology is being abused, so they can conveniently point the finger elsewhere if it is.

I can’t imagine this point has escaped their awareness.


> Also, many people trust Apple to do ‘the right thing’.

Well, that's silly of them; that's the motto Google switched to after they gave up on the whole not-being-evil thing.

( https://www.engadget.com/2015-10-02-alphabet-do-the-right-th... )

:P


> NCMEC's database already contains non-CSAM.

How do you know?


> What you suggest is not theoretical, it's reality today.

As others have stated, do you have proof? Inside knowledge of this supposed reality? Even some major news publication with no direct evidence would be credible enough to support this statement.


The database contents are a secret. There are no news articles about it. However, I do know for a fact the database has serious problems (I'm not speculating).


Can you provide more context for this? I could definitely believe this possibility but it’s a big claim to make.


I know you mean well but people who know how bad the database is also can't tell you how they know, because the database is cloaked in secrecy for obvious reasons.

All I can say is that I'm not speculating, I know for a fact the database is as broken as I say it is.


https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

Apple uses sophisticated cryptography to make absolutely certain that you cannot hold them accountable for abuses of this system against you, NONE of which are prevented by its complex construction.

The private set intersection is an alternative to sending you a list of bad-image hashes; it uses significantly more bandwidth than simply sending you the list. This alternative is superior for Apple because if they distributed the hashes it would be possible for someone to prove that they had begun matching against innocent images (such as ones connected to targeted races or religions, or images targeted to out particular pseudonyms). It is inferior for the user for precisely the same reasons.
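
For readers who haven't met the primitive: below is a minimal sketch of a textbook Diffie-Hellman-style private set intersection, just to show what property is being bought. It is emphatically not Apple's construction (their protocol is more elaborate and arranges the roles so the device learns nothing); the prime, the hash-to-group shortcut, and the example sets are all illustrative.

    import hashlib
    import secrets

    P = 2**127 - 1  # Mersenne prime; toy group parameters, not production ones

    def to_group(item):
        # Toy hash-to-group: map an item to a nonzero element mod P.
        return int.from_bytes(hashlib.sha256(item).digest(), "big") % P or 1

    server_set = [b"bad-1", b"bad-2"]                    # secret hash list
    client_set = [b"cat.jpg", b"bad-2", b"dog.jpg"]      # device's items

    a = secrets.randbelow(P - 2) + 1   # client's secret exponent
    b = secrets.randbelow(P - 2) + 1   # server's secret exponent

    # Client blinds its items; server blinds them a second time.
    client_blinded = [pow(to_group(x), a, P) for x in client_set]
    double_blinded = [pow(c, b, P) for c in client_blinded]  # (a real protocol would shuffle)

    # Server sends its own items blinded once; client blinds them again.
    server_double = {pow(pow(to_group(y), b, P), a, P) for y in server_set}

    # Only intersection membership is learned; non-matching entries stay hidden.
    matches = [x for x, d in zip(client_set, double_blinded) if d in server_double]
    print(matches)  # [b'bad-2']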

Some might be fooled into thinking the "threshold" behavior is somehow in their interest. But no: Apple (or parties that have compromised them) can simply register the same images multiple times and bypass it, and the privacy (for Apple, but not for you) makes it impossible to detect that they've done that.


The existence of the Spyware Engine is the problem, not the technical details of how the current version is supposed to work. "Supposed to work", because we have no idea what it will actually do. Did you see the source code?

We are not so naive as to believe for a second that this Spyware Engine will do what is claimed.

This is simply a calculated attempt to legalize a Spyware Engine on a personal device, covered by some BS story intended to get an emotional response from the audience and fog the real issue. It is always done this way, so no surprises here. All limitations of free speech on the web in Russia were introduced under the umbrella of protecting children and are used against political opponents to this very day.

This is a discussion about values [1], not technical details - not least because the Spyware may already be installed. It should not become legal or morally accepted in any way.

[1]https://news.ycombinator.com/item?id=28084578


I agree with you, but if you'd take the time to read my post you would see that I am arguing that it's not good even by their own claims!

They talk a lot about complex crypto to protect privacy but the primary thing it's doing is hiding what apple is matching against, which shields them against accountability.

I fully agree that even if the behavior were currently threading the needle it would still be an extremely bad move.

See also this prior post of mine: https://news.ycombinator.com/item?id=28083838


I've read this https://news.ycombinator.com/item?id=28083838 and agree. Added to my favorites. Very good points.

>.. it would still be an extremely bad move.

My view is that this should be considered not only a "bad move" but "criminal activity". My reasons are described here: https://news.ycombinator.com/item?id=28096948


> They talk a lot about complex crypto to protect privacy but the primary thing it's doing is hiding what apple is matching against, which shields them against accountability.

NCMEC partners are not allowed to share the raw hashes, and I imagine Apple's contract with NCMEC to create a photo-comparison tool that will have auditable code (well, compiled code, but still) includes such a provision to slow or stop CSAM sharing enterprises from completely reverse engineering and cheating the system.


What they are making available is sufficient to 'cheat' the system in the sense that if you have an image you are concerned might match in some database you can modify it until the 'perceptual hash', which you can compute on your own, changes. The novel changed image is then unlikely to be a match in the database.

You don't have to have a copy of the database to be fairly confident that your modifications have made a target image non-matching. You would have to have the database in order to gather evidence that the matching was beginning to be used for unlawful, rights violating purposes, such as collecting targets for genocide.

I think it's a safe assumption that this sort of system is only effective against idiots -- which isn't an argument against it: lots of effective anti-crime measures mostly work against idiots. Adding functionality that destroys accountability while at most improving the system against non-targets, however, doesn't seem like an acceptable trade-off.
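
To make the "modify it until the perceptual hash changes" point concrete, here is a toy 8x8 "average hash" - nothing like PhotoDNA or NeuralHash, and the image data is synthetic - but it shows the trade-off: the hash survives a tiny edit, yet someone who can compute it locally can keep nudging pixels until it stops matching.

    # Toy perceptual hash: 8x8 grayscale "image" -> 64-bit average hash.
    def average_hash(img):  # img: 8x8 list of lists of 0-255 ints
        pixels = [p for row in img for p in row]
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
    h0 = average_hash(original)

    # A tiny, imperceptible edit leaves the hash unchanged...
    edited = [row[:] for row in original]
    edited[0][0] += 1
    assert average_hash(edited) == h0

    # ...but an adversary can keep brightening one region until a bit flips.
    evading = [row[:] for row in original]
    while average_hash(evading) == h0:
        evading[3][3] = min(255, evading[3][3] + 5)
    print(hex(h0), hex(average_hash(evading)))  # hashes now differ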


No, the existence or non-existence of a spyware engine today doesn't constitute the problem. These companies spend billions of dollars, paying thousands of engineers a year, to develop solutions to problems that often don't even exist. It's safe to say that the sheer scale of every one of these tech companies means that this isn't something that incrementally gets worse: within a 6-month sprint they could push an update overnight going from step 0 to step 5 of total spyware.


> The private set intersection is an alternative to sending you a list of bad-image hashes which uses significantly more bandwidth than simply sending you the list.

How can the image hashes take up more space than the images themselves? Are you sure about this?


Using private set intersection takes more bandwidth than the server simply sending you a list of hashes to match against. (Significantly more, once you're talking about testing multiple images.)


That’s not what they wrote; double-check your comprehension.


Obviously I don't understand it then, care to explain?


Notice, for example, that Thunderbird is sending file names and SHA-256 hashes when you open most (e.g. .pdf) attachments, in the clear, to Google. This seems worse to me (in the Apple case, the information is revealed only if enough files from one device match against a predefined hash list) and nobody really cares...

I have just tested with a fresh profile with a freshly downloaded thunderbird-78.12.0.tar.bz2 (x64 Linux, en-US) using a Burp proxy. Here is the function that does it: https://searchfox.org/mozilla-central/source/toolkit/compone...

Here is the about:config tweak to turn it off: browser.safebrowsing.downloads.remote.url

Here is an example request: POST https://sb-ssl.google.com/safebrowsing/clientreport/download...

Content is Protobuf-encoded payload containing:

- full path to the mailbox including the username and "secret" random profile name (e.g. /home/jenda/.thunderbird/ioi8uk0k.tbtest/Mail/hrach.eu/Inbox)

- SHA-256 of the file

- attachment file name (e.g. 10-test-blahblah.pdf)
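
For the curious: assuming the digest in that payload is the plain SHA-256 of the file bytes (which is what Safe Browsing's download protection uses), you can reproduce it yourself; the filename below is just the example from above.

    import hashlib

    # Recompute the SHA-256 reported for a saved attachment.
    with open("10-test-blahblah.pdf", "rb") as f:
        print(hashlib.sha256(f.read()).hexdigest())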


I suspect that if they were strictly bitstream hashes and not perceptual hashes, people would be less concerned.


No, I'm sure people would still make a fuss. Perceptual hashes are required to prevent criminals from slightly changing pixels within CSAM photos to avoid detection.


I think everyone knows why perceptual hashes are being used. I'm not sure how you're certain that people would make a fuss. Almost all of the discussion I have seen surrounding this has spent a tremendous amount of time dwelling on the risk of false positives.


Do you produce most of the email attachments in your inbox yourself? Do you produce most of the photos in your iCloud yourself? The point is that an anti-virus-purposed hash upload is different from uploading hashes of your private iCloud content.


At least mozilla lets you disable that, does apple?


After seeing recent events around Mozilla, I suspect they would not allow disabling it if they could get away with it.

(don't get me wrong, "everybody sucks")


Yes, however, there is little practical advantage in this, if nobody knows about it and so nobody turns it off.


I had no idea. What is the reason for this default behaviour?


E-mail is an old technology which had privacy bolted on top of it as an afterthought.

Everybody knows that e-mail is not the best medium if privacy is of utmost importance.


Apple is forgetting the network effect of the professionals who made them billionaires. I personally spent years evangelizing my clients to switch to the Apple ecosystem.

In my view this is the weaponization of personal devices on a mass scale, with the clear intent of normalizing the surveillance state on a global level.

The fact that this comes after the NSO spyware investigation speaks volumes. They don't care about privacy. They care about control over user data.

I hope this will give power to the Linux and FOSS movement and be the beginning of a serious Apple detox.


The empowerment that you feel you once lent to Apple may no longer be part of their growth plan. Whether or not it actually mattered at the time.


The empowerment at the time was an honest and professional assessment of using Mac OS X vs Windows. Apple embraced the PC-vs-Mac marketing, as I remember.

The empowerment today is the same professional stance: explaining the technical facts to my customers. Nobody will like the idea that their phone or personal computer will actively police them on behalf of Big Brother.

Serious business people don't have to "follow" tech closely. That's why they pay us.


Your professionalism is admirable. In this context, perhaps that's all it counts for.


Thanks. Smart one. :)


As a fellow professional, I think this is kind of nonsense.

Apple (and all these companies) also operate in a world with a populace that WANTS surveillance for these areas, does not agree that it is a slippery slope, and has voted in governments that will eventually force Apple to install an even worse back door.

The Linux and FOSS movement have never been up to the end user device challenge, and they too operate in a world with a populace that WANTS surveillance for this stuff.

The solution to true privacy is never going to be “switch to FOSS” for most users, as that’s sort of like saying “get a degree in math”. It will work for some, but it will be an unstable experience.

The solution has to be to a) ensuring democracy survives and b) convincing most people that the right to privacy is more important than finding child pornographers, and c) to vote for representatives that will uphold that right.


Sorry, I am biased against "political" solutions. In the country where I live, politicians are just mercenaries for hire.

Most of them are direct successors of the communist apparatus, or political projects of inner circles with economic power.

Looking at the EU legislators, they will like Apple's idea a lot, so my options are limited to technical evangelism for the people who care.


Fair, in those situations I think FOSS / darknet solutions increasingly will be the only way to have true privacy.


I'm not usually the most idealistic when it comes to FOSS, but it's stories like this, and Apple's stance of "trust us, we promise we know what we're doing and that nothing is going to go wrong", that make me think all software should be legally required to be open source.


It would be if y’all slapped the AGPLv3 on your code instead of Apache/MIT/BSD.

If we were to perform a cyber analog of what the founding fathers did for the USA 250 years ago, it would be something along the lines of declaring all software free from the tyranny of corporate control and state oppression. Free in perpetuity so that our digital projections onto hardware shall reside comfortably each in their own pursuit of happiness.


> If we were to perform a cyber analog of what the founding fathers did for the USA 250 years ago, it would be something along the lines of declaring all software free from the tyranny of corporate control and state oppression.

… and then proceed to conduct asymmetric warfare against state and corporate cyber systems alike.


If we use servers in China and Russia it’s essentially legal, right?


How would OSS help in this case? Someone can run a hash, find an “offensive” photo and jail you. Shit, if someone is out to get you, they will dig up any photo on your phone, put it in the “bad hashes” database and just wait for Apple to catch you. And then you’re effed, because no court will want to publicly display a photo that may contain child pornography, so you’ll be jailed without a fair trial.

OSS won’t help here. This system must be shut down, and the gov should deal with pedos in a different way.


They said FOSS, not OSS. F means free as in freedom. As in you’re free to modify the software running on your phone at will and to your liking and Apple or some government can’t get in your way.. not even for the kids.


Processor microcode is closed source (Intel/AMD), as is most BIOS code. You would have a hard time building your own devices. Maybe we will see change in the future.


FOSS is expanding down the stack.

> coreboot is a Free Software project aimed at replacing the proprietary BIOS (firmware) found in most computers. coreboot performs a little bit of hardware initialization and then executes additional boot logic, called a payload.

https://github.com/coreboot/coreboot


* enter RISC-V *


RISC-V is "nice to have". I do not know of anybody who sells RISC-V computers or boards at a reasonable price.


Maybe it does not seem a reasonable price to you, but for people who don't know, the HiFive Unmatched [1] sells for $679 [2].

This is the most powerful RISC-V platform you can buy today. It comes as a Mini-ITX board including 16GB of DDR4 RAM, 1Gbps Ethernet, USB 3.2, PCI Express, and NVMe.

Of course it is less powerful than an x64 machine at the same price point, but it should work reasonably well when paired with an SSD and a graphics card.

[1] https://www.sifive.com/boards/hifive-unmatched

[2] https://www.crowdsupply.com/sifive/hifive-unmatched


I find it weird how none of these links state the CPU frequency prominently (or at least I didn't notice it). Not even the product sheet has it.


On the other hand, I don't remember the last time I was interested in a CPU's frequency. In isolation it gives no insight into the performance of a CPU, not even single-threaded performance. Even with the exact specs I wouldn't know how to interpret them, and I doubt many people would.

Nowadays I just check the benchmarks of a CPU to get a rough idea of its performance.


1.5GHz for the four high speed cores, by the way.


So we force Apple to migrate away from ARM? Or should the next iPhone have a swappable CPU?


I’m not sure I understand how microcode is a problem here. Microcode isn’t an OS. I can compile for ARMv8.3 and execute those instructions on Apple silicon just fine. The kernel is the only thing that gets in the way, not the microcode…


I guess they mean the coprocessor stuff - the Intel ME and whatever the AMD one is called.

I mean, this does miss the point that absolutely nothing is stopping anyone from making a RISC-V processor which also has such features; we just have to take the manufacturer's word that they don't? RISC-V isn't the solution to any of these problems, but hopefully it'll enable a few more ethical manufacturers to pop up who do make this stuff their core mission.

Probably not, though.


Force? Apple can continue to mistreat their customers, and they'll beg for more.


So all 12 people in the world who know how to read, modify and install an OS can run their paranoid versions of iOS?


Personally I'm hugely in favour of phrasing it in terms of "Right to Repair". Most people don't know how to repair their car or John Deere tractor either: and that's where the experts come in who do it for you.

In this case, you or me might be the expert, who can provide solutions for other people such as an "installer" to do whatever.

I personally don't really have a problem with Apple's plan here by the way; I think the way they're doing it is actually quite balanced and reasonable. But other people should be free to make a different personal decision on this. What I would like is to modify some other aspects of my iPhone though, like some things on the keyboard that still bug me after several years.


If it were FOSS one person could build a paranoid version and everybody else could use it.


Just buy a device with it already installed, Calyx and Fairphone offer these for instance.


The common, instinctive reaction in tech circles & HN seems to be some version of "consumer action," vote-with-your-dollar stuff. This isn't going to work. It almost never does. At most, it'll effect a minor tweak or delay.

This is a political issue, IMO. A thousand people protesting outside Apple HQ and/or stores is worth more than 100 thousand consumer striking. IMO, the main non-political avenue here is alternatives, especially FOSS alternatives. It's hard to see a way to widely adopted FOSS phones from here, but if we had a viable alternative things would look different. That's producer action, rather than consumer action.

Make it an issue in elections. Stand outside Apple stores with a sign. Push for a Digital Freedom law. Etc. An Apple labour union, maybe. The conscious consumer game is a dead end. It usually is.


> The common, instinctive reaction in tech circles & HN seems to be some version of "consumer action," vote-with-your-dollar stuff. This isn't going to work. It almost never does. At most, it'll effect a minor tweak or delay.

If "vote with your wallet" doesn't work and only causes, say, 1 million people (out of the billion+ customers) to switch off of iOS, doesn't that mean that the rest of the billion+ people really don't care enough? Maybe a few million more accept the tradeoff of getting iOS for 'scanning my photos for csam and having the ability to move off of iOS later if they expand usage or this system otherwise has issues'.

Maybe, just maybe, even if a million people complain about this and apple loses an extreme amount of iPhone sales upwards of 20 million, they can still sell 220 million[0] iPhones to people who don't care. We are certainly the vocal minority here - whether or not that warrants government intervention or otherwise harms our society doesn't really matter when Apple still makes tens of billions in profit a quarter.

0: https://www.macrumors.com/2021/04/07/2021-iphone-sales-forec...


>>If "vote with your wallet" doesn't work and only causes, say, 1 million people (out of the billion+ customers) to switch off of iOS, doesn't that mean that the rest of the billion+ people really don't care enough?

IMO, no. This is silly logic. I don't particularly care that Apple scan my photos.

First, most people don't care about or understand any particular issue well. You might care about the freedom issue here. Another person cares about oceanic biodiversity. Another person cares about factory working conditions. All of them make purchasing decisions that affect all of these. The civilians will always outnumber activists 100-1 on any particular issue.

Second, in this particular case, choices are few and poor. Hence exasperated declarations of going back to dumb phones or DIYing something.

Third, people's individual purchasing decisions don't have any effect on the thing they're trying to affect.

In any case, my point was empirical, not theoretical. There are very few significant examples of consumer action amounting to anything. Where it has (e.g. dolphin-safe tuna), consumer action was one piece of a much bigger puzzle, and more about raising awareness than direct action - i.e. people buying dolphin-safe tuna also became insistent on regulatory action. By the same logic, you could conclude that people don't care enough about anything. In which case, we come to the same conclusion.


From the letter:

> Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.

Apple has no commitment to end to end encryption or user privacy; the premise of this letter is incorrect.

https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...

The data in iCloud is for the most part effectively unencrypted. iCloud Backup serves as a backdoor to e2e messaging, and iCloud Photos aren't e2e at all. If you are using iCloud Photos or iCloud Backup today (on by default and enabled on most iPhones), you are already uploading all of your data to Apple presently in a way that both Apple and the USG can read without a warrant.

This is by design.


Somewhat offtopic, but the subject caused me to make a disappointing realization:

The surest way for someone to obtain child porn images that won't trigger a hit in these known child-abuse-image databases would be to take new photos.

Is there a reason we can be confident that these databases aren't creating a market for novel child abuse at a greater rate than they're taking abusers out of the community? Especially since presumably abusers are only a (small?) subset of people who view these images?


The target of NCMEC's database is stopping the distribution, and hopefully the encouragement, of CSAM. Besides how distribution can negatively impact the mental state of victims, distribution and more eyes on it might cause more people already on barely-legal parts of the internet (e.g. QAnon message boards) to learn about ways to obtain CSAM and become part of the market of buyers for such material. If you stop that, then the population of people looking for CSAM shrinks and it becomes less economical for criminal enterprises to target children specifically for the production of the illegal material.


> The target of NCMEC's database is stopping the distribution, and hopefully encouragement of CSAM.

But potentially with the unintended consequence of encouraging the creation of more child abuse material, in order to evade the detection of recognized images.


Or, if it's just a simple hash, change a single, imperceptible pixel in the photo.


See this other HN thread - https://news.ycombinator.com/item?id=28091750 - they make it a 'perceptual hash' instead of a bytestream hash since they know a lot can and will be done to try to evade PhotoDNA and NeuralHash.


I signed this, but I'm very doubtful this will ever achieve anything.

But this made me wonder: is there a history of people complaining about this sort of thing and actually achieving something?

I think this might have happened with MSFT's Hailstorm/Passport, where in the end industry opposition meant the project was abandoned, but I can't recall other instances.


Last year Apple reacted after the outcry about macOS tracking [1], so there is some hope. That was not a deliberate product and probably more of an unintended behavior though.

I'm really hoping this was just a team at Apple that got blinded by all the cool cryptography they came up with to solve a problem and that the higher ups will react if we screech loudly enough.

[1]: https://sneak.berlin/20201112/your-computer-isnt-yours/


> Last year Apple reacted

How did they react? Isn't it still in effect? Still sending unencrypted messages when apps are opened?

As far as I'm aware, they plan on having OCSP encrypted, and plan on giving users a choice.

I honestly do not get the outrage over Apple's client-side scanning. It's outrageous, sure, but... many of the same people seem to have been OK with unencrypted telemetry being transmitted for every executable run. Not to mention that every executable will also be logged at some point, so Apple knows every program you've ever run.


Not sure about the OCSP encryption but they did remove their VPN bypass in 11.2 (which was the bigger issue for me at least)


is there a history of people complaining about this sort of thing and actually achieving something?

One of the few tiny (and now seemingly completely void) victories I remember was the PIII processor serial number (1999): https://news.ycombinator.com/item?id=10106870


Google cancelled its DoD drone program because of employee protests. https://www.fedscoop.com/google-project-maven-canary-coal-mi...


I’m sure they just got Lockheed/Boeing/General Dynamics/Northrop/Raytheon/IBM to take it on for 10x the price.


Not in the sense of a "signed letter", but more like "huge feedback". Six years ago MSFT planned to reduce free storage for existing OneDrive users to 5GB (after bumping it to 15GB + 15GB of Camera Roll bonus) [1]. Like other users, I was unhappy, as I had been using OneDrive on the promise of bigger storage than Google's.

Sure enough, over 72k votes at OneDrive's UserVoice [2] caused MSFT to back down. They still reduced the storage to 5GB, but existing users (myself included) could opt out of it.

[1] https://www.theverge.com/2015/12/11/9890966/microsoft-onedri...

[2] https://onedrive.uservoice.com/forums/262982-onedrive/sugges...


> is there a history of people complaining about this sort of thing and actually achieving something?

The WASP did it to bring the browser vendors into compliance with web standards - Netscape and MS back then. In the EU you could make the case that user group lobbying influenced regulation.


This letter doesn't really read correctly. If you're going to write an open letter, it should be better worded.

>Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac.

It does a check if it's being uploaded to iCloud Photos. It is not (currently, at least) continuously monitoring photos saved or shared; shared is Messages specific for child accounts.

>Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy.

There is currently no E2E encryption for iCloud, so short of HTTPS... what is this supposed to mean? There is literally no privacy at all if you upload to iCloud currently.

If anything, it feels like Apple's client side system enables something closer to E2EE while maintaining the ability to detect child porn on their platform.


>If you're going to write an open letter, this might be better worded.

>It does a check if it's being uploaded to iCloud Photos.

The letter's wording is fine, because it doesn't rely on things that are unknown, or said by an interested side with an agenda to push a Spyware Engine using some BS cover story. The wording reflects the real danger of such a step.

Your continued attempts to go into tiny little details of what the Spyware Engine would do at the first stage, and to support this BS cover story about children, are irrelevant to the problem. Besides, how do you know? Did you see the source code?

This cover story has nothing to do with the issue; it was invented to get an emotional response from people and make them stop thinking about what is actually happening.

And what is actually happening is an attempt to legalize a Spyware Engine that will do completely different things than are claimed - and who knows what it will do...

We are not so naive as to believe for a second that this Spyware Engine is going to do what it is claimed to do. This attempt at legalization is done just to avoid a future scandal once it is detected.

Again, it doesn't matter what the Spyware Engine would do at the moment of installation. What matters is its existence on a personal device, which should be prohibited completely. No one in their right mind would believe that, even if Apple said the Spyware Engine would not be installed, it would indeed not be installed. It should be checked very carefully; it may very well already be installed. The fight now is about its legalization, and about values - not about Apple, and not about what their specific version of the Spyware Engine does.

>There is literally no privacy at all if you upload to iCloud currently.

This is exactly why some people do not use iCloud and are not going to. The whole idea of putting personal thoughts/ideas/photos into some cloud in an open/decodable format is completely idiotic. So I agree that this malicious practice has been going on for some time now at Apple. It is unacceptable. It encouraged people to lose their own privacy, which is the opposite of what should happen in a healthy, free, democratic society that respects human rights. It was bad long enough, but now there is an attempt to make it even worse: before, at least, people had a choice to avoid using iCloud and not participate in Apple's malicious activity, but now it is proposed to leave people without such a choice, because the Spyware Engine is pushed to be installed on their personal devices.

The installation of a Spyware Engine will have a long-lasting effect on a free, democratic society that respects human rights. I described such effects here: https://news.ycombinator.com/item?id=28084578


Yes, there seems to be a lot of conflation between the two separate systems (in the media, and in comments on here).

As you say, it’s important to be precise, otherwise people won’t take your arguments seriously. To summarise:

One system involves hash checking. This is performed when photos are being uploaded to iCloud. The hashes are calculated on-device. Private set intersection is used to determine whether the photos being uploaded match anything in the CSAM hash list. Photos that match will be reviewed manually - if I recall correctly, this only occurs after a certain threshold has been reached. This system will only match known CSAM (or whatever else is in the hash list).

The second, independent, system, is a machine learning model which runs on-device to detect sexually explicit photos being received on Messages.app. This system is designed to be run on children’s phones. The system will hide these explicit images by default, and can be configured to alert parents - that feature is designed for under-13s. This system doesn’t use the CSAM hash list.


Based on my reading, the first system is interesting in that the threshold is also cryptographically determined. Each possible hit forms part of a key, and until a complete key is made none of the images can be reviewed.
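
For anyone wondering what "each hit forms part of a key" can look like mechanically, below is a minimal Shamir-style threshold sketch: a secret is split so that any t shares reconstruct it and fewer reveal nothing. This is the generic textbook scheme standing in for the idea, not Apple's actual construction; the prime, threshold, and share count are illustrative.

    import secrets

    P = 2**61 - 1  # prime field for the toy

    def make_shares(secret, t, n):
        """Split `secret` into n shares; any t of them reconstruct it."""
        coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
        f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x=0 over the prime field."""
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, P - 2, P)) % P
        return total

    # Imagine one share released per matching image: below a threshold of 3
    # matches, the decryption secret (and hence the content) stays locked.
    secret = secrets.randbelow(P)
    shares = make_shares(secret, t=3, n=10)
    assert reconstruct(shares[:3]) == secret
    assert reconstruct(shares[4:7]) == secret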

It should also be noted that the second system sends no images to Apple, nor are any reviewed by Apple employees in any way. It's just using the same on-device ML that finds dog pics, for example.

I think Apple PR screwed up here announcing all these features at once on one page, and not having the foresight they would be conflated. It also didn't help that a security researcher leaked it the night before without many of the details (iCloud only for example).

Context should also be included. CSAM scanning has been happening for years on most (all?) photo hosts, and is probably one of the blockers to full E2E encryption for iCloud photos (I so hope that's coming now!).

Finally, nothing has really changed. Either iOS users trust Apple will use the system only as they have said or they won't. Apple controls the OS, and all the 'what if' scenarios existed before and after this new feature. As described, it is the most privacy preserving implementation of CSAM to date.


> Should also be noted the second system sends no images to Apple or is reviewed by Apple employees in any way.

You are directly contradicting Apple's public statement:

> Apple manually reviews all reports

( https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni... )

Moreover, it was found in US v. Ackerman that the NCMEC's inspection of a user's private email was an unlawful violation of their fourth amendment rights specifically because AOL passed along the email based on a hit against the NCMEC database without inspecting it. Had AOL inspected it, then the repetition of the inspection by NCMEC would simply have been a repetition of the search performed by AOL, which was permitted under their contract with the customer.

Not only does Apple state that they will "review" the images, they must do so in order to deprive the user of their fourth amendment protection against unlawful search by the government.

> and is probably one of the blockers to full E2E encryption for iCloud photos

Apple is free to provide encryption and they are choosing to not do so.


I was referring to the second system mentioned by the parent, NOT the CSAM system. Conflating these systems has been a huge issue with the random articles coming out. They are distinct.

From the GP > The second, independent, system, is a machine learning model which runs on-device to detect sexually explicit photos being received on Messages.app. This system is designed to be run on children’s phones. The system will hide these explicit images by default, and can be configured to alert parents - that feature is designed for under-13s. This system doesn’t use the CSAM hash list.

> Apple is free to provide encryption and they are choosing to not do so.

Sort of. They cited the FBI a few years ago when they put the brakes on adding E2EE to everything. It would also create a huge target for legislation: 'Apple is helping CP people - think of the children!' The new method of CSAM detection will let Apple add E2EE, and fight off calls for legislation.


I fully agree that some people are conflating the iMessage nudity detection, to the detriment of the public dialog. Though I think there are also problems with iMessage - including the fact that minors are not completely devoid of civil rights relative to their parents, that the same mechanisms could be easily retargeted against adults (with some dictatorships taking on the role of the parent accounts), and that doing so normalizes our children to pervasive electronic surveillance - I think it is a substantially different matter and one of less concern.

I don't think I agree on the "sort of"--

It is unambiguously the effective law in the US, established in court, that if the tech company's scanning is a product of government coercion then the scanning cannot be performed without a warrant.

In Ackerman Google went out of their way to testify that they were not being coerced by the government, but were scanning out of their own commercial interest (to avoid a public loss of reputation for hosting child porn on their systems).

If Apple wants to defend their failure to act in their users' best interest, they need to do so directly by calling out the government's coercion. Or otherwise, they ought to admit (as Google did) that no meaningful government coercion exists and that their actions at the users' expense are driven by their commercial interests.

There are two central possibilities: That the searching is coerced, in which case it is an unlawful violation of the user's fourth amendment rights, or that it is not coerced, in which case it is an activity that Apple is freely engaging with for the benefit of their commercial interests.

In either case -- facilitating an unlawful invasion of users' constitutional rights on behalf of the government, or voluntarily invading the privacy of their users for commercial benefit -- Apple is ethically in the wrong.

I don't doubt that this dichotomy, if widely recognized, would put Apple in a difficult position, but that's a cost of doing business.


>I think Apple PR screwed up here announcing all these features at once on one page, and not having the foresight they would be conflated. It also didn't help that a security researcher leaked it the night before without many of the details (iCloud only for example).

This sums up my entire feelings on the matter at this point.

It's become a bit of hysteria due to this.


> Photos that match will be reviewed manually

This isn't a protection of your privacy. It's because of US v. Ackerman (2016), where the appeals court ruled that when AOL forwarded an email whose attachment's hash matched the NCMEC database to law enforcement without anyone looking at it, and law enforcement then looked at the email without obtaining a warrant, that was an unlawful search; had AOL looked at it first (which they can do by virtue of your agreement with them), gone "yep, that's child porn", and reported it, it wouldn't have been an unlawful search.

Instead, Apple could report the existence of a match to law enforcement without divulging the image, which (if the matching is precise enough) should be enough to create probable cause for law enforcement to obtain a warrant. Then only once a warrant was issued would anyone look at the image. That would be more protective.

What they're actually doing just inserts an unaccountable additional step that reduces your privacy for the specific purpose of undermining your due process, by sparing law enforcement the need to obtain a warrant before invading your privacy.


Don't support Apple! *You must sign up for a Microsoft GitHub account to sign our petition.

While I like the sentiment of protest, I don't understand this logic of putting it on a closed platform of one of Apple's biggest competitors who also doesn't respect your privacy.


You made me ponder about "Microsoft Github account" vs "Github account", or large corporation vs smaller corporation. So many smaller corporations end up getting acquired by larger corporations anyways. It's all kind of a broken system huh?


I wish I could practically escape it. On personal projects, I've moved everything to Sourcehut and GitLab, but so many projects I want to contribute to are still on GitHub for legacy reasons and don't feel it's worth the effort to migrate (though everything heavy on the libre side has moved). Some projects, like Elm (the programming language), use GitHub both as the source of their package management and as identity, so it's impossible to escape the lock-in.

But this is why Microsoft bought it: to get closer to a monopoly (GitHub, Copilot, NPM, VS Code, Azure, etc.), making people buy into a platform they feel they cannot escape. I think it is important to tag the parent company when talking about GitHub so users understand who they are really dealing with by hosting their projects on this Git forge and what restrictions this puts on their contributors.


Apple's new policy will have only one effect - pedophiles will stop using iOS. For the rest of us, our privacy will remain compromised.


Pedophiles in the know would presumably just have to disable iCloud sync, and the hash scan would never happen.


This. It’s not like iCloud and GDrive are the only two cloud storage vendors, or like there’s no way to encrypt things before uploading them.


I was actually looking to buy the new iPhone should they add back TouchID, but this has been quite the turnoff.

Are Pixel phones the only Android phones that allow you to do secure boot with non-stock images, or do the others basically cripple phone functionality long-term once the bootloader is unlocked?


Google does the same with Google Drive. Apple is just late for the party.


Apple does the same with iCloud as Google does with Google Drive.

Apple just now moved it to scan on the device.


Correct. https://9to5mac.com/2020/02/11/child-abuse-images/

And if you have your iCloud sync toggle on, since your photos are intended to be uploaded, the client is the one doing the hashing.


This is different now. They scanned only suspected ones before; now they are expanding it to every user. To avoid the same privacy issues as Google (scanning everything in the cloud), they scan everything on device, and only leak suspected matches upstream while preventing the upload to stop sharing.

IF we can trust that they really scan locally only those files which would end up in the cloud, then this is an improvement. But trust is all we have, because the system is already a full black box.

https://www.apple.com/child-safety/pdf/Expanded_Protections_...


Disingenuous false equivalence for anyone who knows the difference between server side and client side


Isn't it a bit ironic or naive to trust their current software as it is (which is almost a full black box), and then speculate about what they could do without telling us, when they add something?

As far as I understand, you can disable this feature, because it is tied to iCloud sync. Based on their spec [1], this feature avoids doing the same as Google and others (scanning everything in the cloud); instead they scan on device, which limits the data Apple sees. So this is an improvement compared to other available solutions.

[1] https://www.apple.com/child-safety/pdf/Expanded_Protections_...


Exactly. Many people who are upset about having their images scanned before going to iCloud don't seem to realize all the big providers (Apple, Google, FB, Twitter, MS, etc...) have been scanning images against CSAM hash databases for years already.

The client side/server side also does not matter because iOS users have had to trust Apple implicitly since day 1. All the 'what ifs' existed whether or not Apple added this feature.

I speculate that Apple is going to announce an expansion of E2E to more services at the iPhone event this year, and this feature is getting in front of political complaints that could lead to real privacy destroying legislation/LEO complaints.


This makes the most sense for “why” and “why now” IMO.


and Microsoft since 2009:

https://www.microsoft.com/en-us/photodna

Which, in turn, is used by FaceBook, Twitter, etc.


Fairphone allows re-locking the bootloader with the /e/ rom flashed.


The annoying thing about unlocking or relocking the bootloader is that it wipes your phone. Unless you're paranoid that someone can give you a rootkit while you're not looking, as long as you use full device encryption then leaving the bootloader unlocked isn't too bad. Plus, it lets you upgrade your bootloader and OS later. I stupidly relocked the bootloader after installing LineageOS and now I can't upgrade the OS major version without wiping everything.


I'm in the same boat with my Fairphone and /e/. ;)

For me, re-flashing wouldn't even be that much of a hassle since I have contacts and all in my Nextcloud instance and would only have to make a backup of Signal (I barely use my phone - about 2-3 hours total per month), but I just can't get myself to do it.

Should have stuck with my trusted ol' Nokia 3110c I used for a good 13 years, but alas I sent that in to Fairphone for recycling after I bought my Fairphone 3 last year (which I only bought in the first place because finding CNG gas stations is such a hassle here in Germany).


Most people see the adversary in Apple (or governments), but I think there is something else: what about adversarial attacks? Let's assume someone spreads regular memes modified as adversarial examples to generate the same neural hash as the true bad images. Thinking back to past political campaigns, these could spread very easily among some voters for some party. Suddenly you have a pretty serious attack on people of some political spectrum (or whatever group you can target with this)… I seriously think this wasn't fully thought through.
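(To make the worry concrete, here is a rough Python/PyTorch sketch of what such an attack could look like. Apple's NeuralHash model and weights aren't public, so `embed` and `proj` below are purely hypothetical stand-ins for a differentiable embedding network and its hash projection; this is an illustration of the idea, not a working exploit.)

    import torch

    # Hypothetical stand-ins: `embed` is a differentiable image-embedding network,
    # `proj` a matrix projecting embeddings to hash bits. Neither is Apple's real model.
    def soft_bits(x, embed, proj):
        return torch.tanh(embed(x) @ proj)   # smooth relaxation of sign() so gradients flow

    def make_collision(img, target_bits, embed, proj, steps=500, lr=0.01):
        delta = torch.zeros_like(img, requires_grad=True)   # small perturbation to optimize
        opt = torch.optim.Adam([delta], lr=lr)
        for _ in range(steps):
            loss = (soft_bits(img + delta, embed, proj) - target_bits).pow(2).mean()
            opt.zero_grad(); loss.backward(); opt.step()
            delta.data.clamp_(-0.03, 0.03)                   # keep the change visually subtle
        return (img + delta).detach()   # a meme that (hopefully) hashes like the target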


…Maybe we should? All it would take is someone publishing the hash database and then another group working to generate Hamming-similar images. Shouldn't be that hard if it's as broken as people claim. What's that saying, "can't outlaw something that is commonly acceptable behavior"? Point being, attacking the system would either prove, or compel, the false positive rate to be "acceptable", or result in it being dismantled. Still ideologically opposed, but this may be a practical "solution".


It requires a 'collection' of images to trigger an alert to Apple, who then has someone manually review the photos before taking further action.

there's still our justice system/due-process as well


How to complain: don't buy an iPhone next time you need a new phone.


Boycotting products to change a company's mind won't work because:

- not enough people even know about this issue, understand the consequences, or remember to act on it the next time a new phone is due

- not enough alternatives on the market, and we're probably already boycotting them for other reasons

- even with lower sales a company might not see the cause-and-effect

There is a general trend in the industry to take all kinds of liberties with user data. There is pressure on companies to do something about bad things happening in their clouds. Machine learning folks want to apply their tools to big data and the corpus of everyone's photos is probably nice to work with.

So we would have to tackle these issues on a larger scale.


That it is. I'm a bit amused by people's ignorance of this until it is Apple who is doing it. And Apple is a minority compared to other platforms.

I haven't heard many complaints about Windows sending all your file hashes and most of the files themselves to their servers, which it has been doing for a very long time with Windows Defender. It is literally the same as what Apple is doing now, but without restrictions.

Google and other cloud service providers have scanned your photos for a very long time using PhotoDNA. On top of images, Google also scans your emails. People forget and move on.


I don’t use Windows Defender nor Google Photos nor other cloud service providers to scan my photos. I barely use gmail. I knew that Google was terrible so I have already minimized my use of Google services. Same with Facebook. But Apple!? I guess the dominos have finally gotten to me.


It is not so straightforward to avoid Windows Defender if you are using Windows 10. You must dig quite deep into group policies to disable automatic scanning and startup.

Google Drive includes backups of your phone, so it is not limited to Google Photos. It covers literally everything that is stored unencrypted on Google's servers.


Disabling all of Defender is complex, but disabling automatic sample submission is easy. It's an option in the Security settings app, and you're even allowed to disable it during first time set up (or were, last I installed Windows 10).

It nags you once, but you can ask it to stop.

Besides, uploading unseen executable code and scanning photos are far from the same tech. They're very different things.


Every binary is executable by default; images are only a subcategory. As malware goes, any file is a potential threat. Also, I think disabling automatic sample submission does not disable hash uploads, only full files.


Hmm. I wonder if an activism-based shopping platform could be something. Basically, a place where you shop, and as part of your purchase you state why you bought that thing. Can be anything. "I liked the color". "Needed a bigger screen". "Don't want to use Apple any more".


> Boycotting products to change a company's mind won't work because:

In this case not buying the product is to avoid a viperous feature and protect your house from being spied on. Any boycott effects are tertiary.


I am going to give them a chance to reverse this, but I literally just bought an upgrade to my iCloud storage. I thought it would make my life easier not having to worry about storage so much. Now I am just waiting for this reversal to happen, or else I think I will have to cancel my iCloud subscription for a start.


What cloud service are you putting your images on? I would be shocked if google photos hasn’t already been doing this.


You don't need the cloud: syncthing can sync your phone and your computer together without any issues. No nasty cloud provider involved: https://syncthing.net/


And because no one except Apple has access to the private iOS APIs required to run background processes forever, you would have to remember to keep the app in the foreground while it completes syncing on a regular basis.

Also, you can't restore an iPhone from what SyncThing can back up. Only an iTunes or equivalent iCloud device backup can. They also can't back up things like MFA keys / seeds / IVs.

I’m glad ST exists and it’s useful for syncing photos for home use, but it is not a viable iCloud alternative, nor could one be made due to the private iOS API issue.


You can even sync to $nasty_cloud_provider and just not provide the Syncthing password. Problem solved!


What? Syncthing is not a backup program; as the name suggests, it syncs folders and files. There is no password involved.


Syncthing now supports untrusted device sync[0] (i.e. storing an encrypted copy on an untrusted device) in a recent version, though this is still in beta.

[0]: https://docs.syncthing.net/users/untrusted.html



Looks like Syncthing doesn’t have support/client for iOS though?


There may be 3rd party apps like https://www.mobiussync.com


For some reason it’s not available in France. Is it limited to the US?


Cloud services doing this isn't the problem and they are already doing it. The problem is your device doing this.

Imagine if your Tesla reported back to Tesla who was in your car and sent that to the authorities if they thought someone in your car was "illegal".


You honestly can't know if it does that already or not.


Actually, Teslas do client-side machine learning. When Tesla needs training data of a bike, for example, they can remotely "query" all the cars on the road, have the cars do client-side detection on the stored footage and driving data, and send over only the data that contains bikes.


Coming in the not-so-far-future.


The cloud is other people's computers. So get your own server, use someone's you trust, or use E2E encryption.


That would be great if I could run an iCloud back-end on my home servers. Unfortunately, I can’t, and no software other than iCloud can do what iCloud does on iOS due to Apple’s usage of private iOS APIs to accomplish things like persistent background sync or syncing parts of the OS which would be required to do a full device restore that aren’t accessible to user space apps.


Why do you need to put your images on a cloud service at all? Just use a spare disk.


I’m saying it because apple is scanning iCloud photos here. Unless I am mistaken, local only photos are not scanned.


It's just a matter of time till they start scanning all image files on the SSD. Then documents (country-specific probably)


There is hardly any mention of a boycott. Cynicism prevails; Apple is laughing all the way to the bank.



Just a reminder that proprietary software that auto-updates does not serve you and should never have been treated as such. Sometimes it's most convenient / best to use proprietary, and that's fine, but please be aware that all vendors with root level access (google + android, microsoft + windows, etc) are just 1 update away from this situation.


I'm disturbed by this ordeal and Apple don't seem to be backing down. I was thinking of getting one of the new MacBooks later this year but I think I've changed my mind - I won't buy products from a company that treats its customers as pedophiles. My phone is getting pretty old too, don't think the next one will be an iPhone.


I'm not sure what to think of the backlash here. I'm sure that people aren't trying to minimize the evil of child pornography and exploitation. But anything that has the potential to stop or slow it should be fairly considered. People are complaining that Apple is scanning your photos, but they are already doing that. Where was this backlash when they released the Memories feature? Why is scanning your photos for good pictures of your dog different from scanning your pictures for exploitative pictures of children? Is the problem that they could then share that information with the authorities?

But of course this feature could be abused. But Apple already has all the power it needs to be abusive. They can push whatever software they want to your devices. They can remotely lock and wipe all your data. They already have the power to do all of these things. This statement is simply an announcement that they will be using some of this power to try to stop one of the worst evils in our world. Until they prove that they will abuse this power, I suggest that we let them try.


I think the point is that yes, they have always had the power to do these things - but they haven't thus far. We rely on entities with power (companies, governments, people) to exercise restraint in how they exercise it. Those that DO exercise restraint gain trust since exercising restraint demonstrates an understanding of the consequences of not doing so.

What the Apple move is doing is showing that they are willing to relax their restraint. It gets tricky because everyone agrees that the specific goal here is honorable, but the manner by which they are using their power to achieve it is generalizable to areas that are less honorable. Once they are willing to use their power to accomplish one highly honorable goal, it's not a big ask for them to use it for a slightly less honorable goal in the future. Iterate that a few times and you can find yourself in a very bad place. It's the classic slippery slope argument - when you know there is a slope that leads to dangerous places, you need to not ever start down it no matter how righteous the motive is in starting down that path in the first place. There's a reason we have the old saying "the path to hell is paved with good intentions".

The existence of power isn't what matters: it's the intention and willingness to exercise it. Apple is now demonstrating that they have changed their stance in how they choose to exercise their extreme power. That's worthy of scrutiny.

For a concrete example of where I expect this to naturally lead: instead of a database of child pornography being the source of the hashes to search for, the Chinese government provides a set of hashes of all known digital photos of the Tiananmen Square protests of 1989. Does it really seem plausible that a government like China's would NOT use this kind of technology for that purpose? It's not hard to cook up similar examples all over the place.


Yeah, but they are already doing this for pointless things. They already use facial recognition on all the photos on your phone. That's what I don't understand. The only new thing is what they are looking for and their willingness to alert the authorities. The slippery slope argument that this will eventually be used by China to arrest journalists is scare tactics. We have zero evidence that Apple would allow such a thing to happen. And the only thing stopping them is Apple's word. The fact that they are announcing this should actually give confidence that they aren't doing it in the shadows for China. They didn't have to say anything about this. The fact that they did should give you confidence that they are respecting your rights, not evidence that they aren't.


Apple has caved to China many, many times; a quick googling will give you a list.

The difference between the "old" content scanning and the new is indeed that they are now willing to "use" the results of that. Facial recognition was client-side only (or so they said), the results of which never left your phone.

Now they're doing content scanning and sending it to themselves as well as others.

In parallel Apple is starting up a growing advertising business, hiring aggressively and expecting that to be a big part of their future revenue. If they're now "allowed" (by its users) to do content scanning _and_ sharing the results, why wouldn't they use those results for themselves to target you with ads?


I think the remote execution with a remote hash database is the key part that you need to focus on. Checking for faces is something that doesn't need much information from outside your phone itself.

What Apple is proposing is basically adding a feature to scan any user's phone for a collection of hashes. Even if they say they will only use this for CSAM, this sends a strong message to all government agencies around the world that the capability is there. Maybe for US citizens this doesn't sound dangerous, but if I were a minority or a critic of the government in a more authoritarian country, I would jump ship from Apple products right away.


It's important not to conflate the new features. CSAM detection uses hashes of known photos and is only run on photos going to iCloud (turning off iCloud turns off the scanning). Photos sent to iCloud have been checked against CSAM hash lists for years on the server. The change here is moving it from server to client (which I hope is in preparation to make iCloud Photos E2E encrypted).

Completely agree with your second point. All the 'what ifs' have existed forever. Either iOS users trust Apple will only do what stated or they don't. Nothing has changed.


  But anything that has
  the potential to stop
  or slow it should be
  fairly considered
Strip searching everyone hourly will stop most contraband, including child pornography. And, according to you, it should be fairly considered.

Shall we start with you?

(And as people do have naked photos of themselves on their phones often[1], make no mistake, the strip search analogy is NOT an exaggeration)

[1] https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_phot...


If UN or another international body like Interpol suddenly grew the courage to do it planetwide, I'd be the first to volunteer.


You can volunteer to install CCTV in all your rooms, including the toilet, and regularly send the footage to the police station. However, let us normal people be excluded from this dystopian madness.


Considered, sure. But easily rejected. Let’s be grownups here.


Yes, indeed, let's! Grownups recognize that strip searches are unwarranted in this and most other cases, be they physical OR digital...

And certainly you've heard of software bugs. Do you really think a small one, like "accidentally" "forgetting" to check that a photo had been synced, cannot happen, and that an "accidental" scan of ALL photos couldn't possibly happen in any situation, no matter what bug/corruption/etc.? Surely you'll share with the class where YOU hire infallible programmers.

An accidental strip search is a strip search nonetheless.

Luckily, this can all be solved by NOT writing such code. Ever.


You must surely realise that there’s a fairly large difference between being strip-searched and having a hash computed on one of your photos that you’re uploading to iCloud.


I don't see any difference. In both cases your private property is forcibly accessed without any reason to suspect you, purely for preventive reasons.


You must surely realize that slippery slopes exist. And as soon as adversarial code exists in your device, it will only expand.

Reference: history of literally every dictatorship includes many "reasonable" expansions of power in the name of security/safety/etc...


Yes, but I think you’d agree that each slippery slope has a certain degree of probability.

The slippery slope from Apple checking image hashes to hourly strip-searches seems rather unlikely, which makes your analogy unhelpful.


The concern is less about outing people with CSAM and more about Apple building a powerful tool for government surveillance that will most certainly be abused by countries like China and Saudi Arabia when they request that politically dangerous images be added to the database if Apple wants to continue business in their country.

It is incredibly myopic of Apple to implement this feature. You must consider, as an engineer, how your constructions will be abused and used for bad as well as good.


This isn't a powerful tool, it's more akin to a rudimentary hash check. It's not going to match photos that aren't already known to authorities. And any time a user is flagged, the material in question is sighted by an Apple employee before a decision is made whether to forward it onto the relevant authorities.


Ok, so what stops the CCP from submitting hashes of photos of the "tank man" to this hash set? And then jailing all those reported to possess it?


What's stopping them already...?

It operates on iCloud photos. Those are already scanned. If a nation state wanted to flex this before they could have done so.


That's a poor argument, and possibly some kind of fallacy.

'No one has done it before, so they probably won't do it in the future now that we've made it easier.'


It's not easier though. They moved the scanning onto the device instead of doing it server-side. That's all.


Same reason you do not sue someone for patent infringement until it is too late for them to turn back?


Because when the Apple employee who reviews the flagged accounts sees pictures of Tank Man and not CSAM, they won't forward them on.


Unless their business is threatened


What was stopping Apple from being threatened with the exact same threats three years ago?


I think that the technology described in the CSAM Detection Technical Summary (the so-called NeuralHash) is a really bad idea: basically, it uses the output of a neural network (likely a CNN) trained using the now-classical triplet loss (see page 5). The problem with those methods is that they are not always reliable; for instance, the same type of network and the same training procedure is used in neural face recognition, which has proven vulnerable (a recent example being the 'Master Faces' that can bypass over 40% of facial ID systems, etc).
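(For anyone unfamiliar with it, a minimal sketch of the triplet loss idea in PyTorch is below; `net` here is a hypothetical CNN producing a fixed-size descriptor, not Apple's actual model. PyTorch also ships nn.TripletMarginLoss, which does essentially the same thing.)

    import torch
    import torch.nn.functional as F

    # `net` maps images to descriptors; train it so an image and a perceptually
    # identical perturbed copy land close together, while unrelated images are pushed apart.
    def triplet_loss(net, anchor, positive, negative, margin=0.2):
        a = F.normalize(net(anchor), dim=1)    # original image
        p = F.normalize(net(positive), dim=1)  # perturbed copy of the same image
        n = F.normalize(net(negative), dim=1)  # unrelated image
        d_ap = (a - p).pow(2).sum(dim=1)       # should end up small
        d_an = (a - n).pow(2).sum(dim=1)       # should end up large
        return F.relu(d_ap - d_an + margin).mean()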


Let's not forget Microsoft has been doing this same fukery for years:

https://www.microsoft.com/en-us/photodna


Using child abuse as the example crime for this crime-prevention technology makes it feel like they want to silence any criticism of their phone-monitoring technology.


To introduce such invasive technology just a few weeks after the Pegasus scandal shows me that Apple has lost its ability to read the room.


Because it's based on machine learning, why can't some of the network and weights used for the hash be shared? So we can be sure it is not possible to match anything other than children.

The narrow sub-network used for detecting only CP is of course secret, but then we would know it can't be used for revealing pictures of police, activists, etc.


Because the technology is not the point. I just don't want Apple to search anything in my photos. Because those are my photos. Not theirs. My phone doesn't need a program that searches for child abuse photography since there is no such material on it.

I don’t care if their thing is open or closed source : My iPhone is mine and I don’t want them to do anything on it that can get me arrested because of a false positive.

We must not accept those intrusions.

This thing is as serious as if IKEA regularly sent robots into your house to check all the photo albums on your IKEA shelf against some database. Why does it sound dumb when it's IKEA but more acceptable when it's your phone manufacturer?


It doesn't seem to work that way. It's a bunch of arbitrary hashes that might not show up as any content in particular except "this hash matches". In particular, there was someone claiming that images of empty rooms where abuse happened are part of the database, along with other images that appear in the same sets as CSAM.


As stated below, I'm seriously worried about adversarial examples that fool the network into A) not recognizing bad pictures, or B) triggering false positives on harmless pictures to discredit people.

Opening up the network would make it even easier to create the adversarial examples… I'm not sure there is a winning position in the approach they used.


«even easier to create the adversarials», agreed. But if they find one of these pictures, they can retrain in a way that finds all of them on all phones. So it will never be safe to have CP on the phone/iCloud.

And it would be easier to hold Apple accountable for not matching anything beyond the stated purpose.


Do you use iCloud photos now? CSAM checking has already been used for years (on almost all photos services btw).


Anybody else feel the chilling effect, wanting to sign this but unsure of future implications of doing so?


If it's just a list of hashes, any possessor of CSAM could simply modify a few pixels to make it non-matching, no? How "flexible" is the matching?

And if it is flexible, what about false positives? What if I have a picture of my naked son on my device and I get flagged? Will the picture of my son get uploaded to the cloud "for further analysis" even though I don't have iCloud enabled and never signed up for this?

Edit: A sibling post has this link, which answers some of these questions: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
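(On the "how flexible" question: perceptual hashes are generally compared by Hamming distance under some threshold, not exact equality, so flipping a few pixels usually changes only a few bits. A toy average-hash in Python below, purely illustrative and far cruder than Apple's learned NeuralHash:)

    import numpy as np
    from PIL import Image

    def ahash(path, size=8):
        # shrink to 8x8 grayscale and threshold on the mean: a crude 64-bit perceptual hash
        img = np.asarray(Image.open(path).convert("L").resize((size, size)), dtype=float)
        return (img > img.mean()).flatten()

    def hamming(h1, h2):
        return int((h1 != h2).sum())

    # matching is "hamming(a, b) <= threshold", not equality, so small edits rarely
    # break a match, while heavy edits (crops, rotations) flip many bits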


I think it makes sense to take a step back and ask yourself if Apple, or any sensible Western company, would set up a system that could falsely flag millions of people, or even a hundred. Even without going into implementation details, it seems clear that they would not have a system that would flag standard family photos.


I wouldn't over-/underestimate the support capabilities of a big corp like Apple.

I remember the Twitter account takeover hack a year ago, which was possible only because customer support had some 2FA-bypass: https://en.wikipedia.org/wiki/2020_Twitter_account_hijacking

To prevent false-positives, who knows if they have a "review" team that takes a tiny little peek at my naked son. Do you know for sure that no such system is in place or ever will be in place? What if they do get hundreds of thousands of false positives in the future? How would they improve their system if not by reviewing the existing system with real data?


They are not looking at your photos at all, they are comparing hashes of them to hashes of KNOWN predatory material. Your family photos will not be in that database.


It's really hard to not want to throw some Schadenfreude on those who pay mini-bar prices for technology. Sadly I doubt this will be a last straw for most, given their investment in time standing in lines.


Apple in the eyes of many had one redeeming feature over the competition: they appeared to value privacy. The problem is that the 'many' are still a small minority, and that the much larger contingent that Apple serves sees them more accurately for what they are: a consumer electronics manufacturer, where privacy is not an absolute concept but something malleable as long as it serves the bottom line.

The tech world seems to have a problem reconciling these two views.


What is the current flagship phone that could run e.g. Ubuntu Touch? I am looking for a new phone that will not be worse than my current driver from 2018.


If you had the weights of the hashing neural network (NeuralHash), could it be reversed so that a hash could be used to derive a photo similar to the original?


Neural networks are generally assumed to be non-invertible functions. That said, something like an auto-encoder can learn to decode the encoded data back to its original form, with loss.
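(A rough sketch of that inversion idea, assuming you somehow had the hashing network's weights, which you don't: freeze the hypothetical `embed` network and train a small decoder to reconstruct images from its outputs. At best you would recover a blurry approximation, since the descriptor discards most of the information.)

    import torch
    import torch.nn as nn

    # Assumes a hypothetical frozen `embed` producing 128-dim descriptors for 64x64 RGB images.
    decoder = nn.Sequential(
        nn.Linear(128, 1024), nn.ReLU(),
        nn.Linear(1024, 3 * 64 * 64), nn.Sigmoid(),   # reconstruct pixels in [0, 1]
    )

    def train_step(embed, images, opt):
        with torch.no_grad():
            codes = embed(images)                     # descriptors, no gradients into embed
        recon = decoder(codes).view_as(images)        # decode back toward pixel space
        loss = nn.functional.mse_loss(recon, images)  # pixel-wise reconstruction error
        opt.zero_grad(); loss.backward(); opt.step()  # opt wraps decoder.parameters()
        return loss.item()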


I wonder what image transformations would yield a different hash in the NeuralHash algorithm. Anyone know?

From Apple: “The neural network that generates the descriptor is trained through a self-supervised training scheme. Images are perturbed with transformations that keep them perceptually identical to the original, creating an original/perturbed pair.”
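(A toy example of the kind of perceptually-identical perturbation that quote describes, using Python/PIL; Apple hasn't published the exact transformations, so these are guesses, and by design they should NOT change the hash, whereas heavier edits like crops or rotations might.)

    import io
    from PIL import Image, ImageEnhance

    def perturb(img):
        # mild edits that keep the picture looking the same to a human
        img = img.convert("RGB")
        img = img.resize((int(img.width * 0.95), int(img.height * 0.95)))  # slight downscale
        img = ImageEnhance.Brightness(img).enhance(1.05)                   # slight brightening
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=70)                           # lossy re-encode
        buf.seek(0)
        return Image.open(buf)

    # training pairs are (img, perturb(img)): both should map to the same descriptor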


Not a pair. The number of possible perturbations of each example image in deep learning is enormous. This blithe quote implies that a hash code can somehow provide a flexible match against the billions of perturbations even a single image can undergo. It's a specious claim, intended to cow anyone out of disbelieving in the effectiveness of such a brute-force match model.


Google Photos and Gmail openly and heavily scan server-side, and I'm sure a lot of us use them, how do we reconcile that?


On the server side. That is the difference. I know that if I put something on Google Drive, it is in the clear on Google's servers.

What I don't want is Google performing a scan of the internal memory of my Android device!

Server side they can do whatever they want, and I choose what data they can access, but on my device not.


You know it’s still only photos being uploaded to iCloud that are scanned right? Or are you just totally unfamiliar with the actual issue? If you are familiar, can you explain the practical difference between these approaches that makes one worse?


Reading these comments, I think most people have only read Twitter outrage or the flat-out wrong news articles conflating the new features. This is also on Apple, as they had to know the CSAM scanning on the client was a big deal and really needed its own PR page.


If you don't like it, you can use Protonmail instead of Gmail and Mega instead of Google Drive, and still be able to use an Android device. But I totally understand the frustration of users whose smartphone suddenly turned into spyware.


I don't use Android, so genuine question here: does an Android phone automatically upload everything to Google's cloud the way an iPhone does to iCloud?

With the iOS ecosystem, in my experience having everything backed up to iCloud is the default position, and you have to turn it off manually.


The Big Tech is already here, it’s hungry for control and data. Not just for its own good, but to also serve the governments around the world. It’s going to happen eventually, and most people won’t care.


Google is heavily incentivized not to share the data with anyone else.


Boycotting Apple products will not make a difference. Apple doesn't care, what Apple does care about is bad PR.

Generating and sharing bad PR is the only way to get Apple to change something.

Think about the bad PR around butterfly keyboards, the Mac Pro, App Store policies, etc., and Apple reversing course on those decisions.


I kind of stopped after:

>fundamental privacy protections for all users of Apple products.

Users of Apple (should and largely do) accept they are not in control. Apple has always been clear this is a feature not a bug.

This is the logical next step to justify that their lock-in/monopoly has unique features beside making them more money.


Let‘s play a simple game.

Go to https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

and then replace

- "National Center for Missing and Exploited Children“ with „the new Trump administration“

- „child pornography“ with „LGBT content“

Doesn't look fun anymore, does it?

(use „the new Clinton administration“ and „images of guns“ if you are conservative)


I'm not sure of the point you're getting at. You can do this with effectively anything. Replace "gay marriage" with "child sexual exploitation" in "I support gay marriage". Does that mean we shouldn't support anything?


> Does that mean we shouldn't support anything?

Op is pointing out what a universal evil Apple's scanning of personal property at scale is.


I understand you wanted to refute the former comment, but, ironically, that is indeed the case in point: when we cannot trust that the thing we support won't be exchanged, without our knowledge, for something else we don't support in the future, then we indeed should not support it.


Wasn't this covered, yesterday? It seems to be a GitHub Pages site that links to the same resource that was posted here yesterday.

But I think it def deserves a lot of scrutiny, and I support the effort, so I'm not really complaining.


I am relatively sure that Apple has made concessions to the Chinese government to allow its existence in the Chinese market. I would guess that other governments might feel slighted if not offered the same access.


I think the important question is, can we trust a large company like Google, Apple or Intel again in the future, and how do we know that they're not spying on us?


I think at this stage big companies affecting societies in a big way may be forced to publish source code that people of those societies run on their personal devices. At least the client side.


I'm not sure what the solution is. Child abuse is an extremely severe problem. In the future, encrypted messaging may be used by terrorists developing some weapon of mass destruction like a virus. I'm concerned about privacy, but is it tenable to have true E2EE regardless of the harm? For better or worse, it seems that human societies will prioritise physical safety over most things.

Even if we have true E2EE, a bad authoritarian regime in the future can simply make it illegal. It's more imperative than ever to prevent such a regime arising in the first place..


The solution is for law enforcement to show evidence of a crime to get a subpoena.

If you don’t think it works, check out your local list of Megan’s Law participants.


Unfortunately law enforcement don't have super-human powers and very often don't know who exactly is involved..


At this stage it doesn't matter whether it is a carefully calculated effort to gradually strip people of the very sense of any privacy, with long-lasting effects on society (like I described here [1]), or some company trying to legalize a Spyware Engine for its own convenience, covering it with some story to justify it. Usually they use children for such a cover story because people have a direct emotional response to that topic and thus stop thinking just long enough to miss the important issue.

The important issue here is the attempt to install/legalize a Spyware Engine on a personal device.

I think this is much more serious than many people might see at the moment. I think it is an attack on the very fabric of a free democratic society where human rights have real meaning. It is done by people who do not share those values, are not familiar with them, simply do not care about them, or, even worse, are doing it on purpose.

Many think of their personal device as their home. Some even think of it as an extension of their mind. It is a very personal space, and attempts to invade it can/should be considered no less serious than an invasion of your private home.

Even critics can admit that your personal device can indeed be part of your private home space, and the connectivity of such a personal device should not automatically mean that someone, or something like an AI under someone's control, can access it and spy on you without a warrant.

Companies have been doing shitty things for some time now and they are getting away with them. This may be a reason why they move forward with unacceptable practices of surveillance. Perhaps those companies do not understand that surveillance is completely incompatible with free democratic societies that respect human rights, and there is no way they can succeed in combining two incompatible things unless they wish to repeat China's way of doing things. And I believe the latter would not be met gladly; I don't even think it would be possible without the use of firearms, which people would resist with the same effort they have shown many times in history to protect their own freedom.

At this stage I think it is much more serious than just writing a letter.

There should be legal protection of your personal computer from any Spyware Engine/Agent for any reason other than a warrant, to avoid bigger and possibly deadly confrontations. Just like your home, as your personal space, is protected from search without a warrant by the Fourth Amendment [2]. Your personal device is in a way much more personal than your home and deserves to be guarded accordingly. I am not a lawyer, and perhaps the Fourth Amendment is already enough, but then it should be used properly to prevent Spyware Engines on personal devices.

Amendment IV

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized. [2]

The Fourth Amendment originally enforced the notion that “each man’s home is his castle”, secure from unreasonable searches and seizures of property by the government. It protects against arbitrary arrests, and is the basis of the law regarding search warrants, stop-and-frisk, safety inspections, wiretaps, and other forms of surveillance, as well as being central to many other criminal law topics and to privacy law. [2]

[1] https://news.ycombinator.com/item?id=28084578

[2] https://www.law.cornell.edu/constitution/fourth_amendment


Is it ok nowadays to use github for pamphlets, "open letters" (that aren't letters), and other social media and polemic content to protest against privacy invasion of all things?


Why should it not?


Expect people to be swatted by someone placing CP on their laptop.


Except this doesn’t scan the contents on your laptop. It scans what you upload to iCloud.


I was about to buy M1 Macbook. No thanks.


It seems like it is coming together. They'll be able to figure out what kind of person you are by looking at your phone's content. Then at your next vaccination, you'll get one that causes a blood clot or you'll get sterilized. It's not like government hasn't done some of those things. Now it is only going to be automated.


[flagged]


Apple is fundamentally about making Nice Phones, and are among the least morally complicated organizations in the US. If we followed your way, we'd be banning over half the country from occupation.

Just look at the moral complexity of the US armed forces, which is entangled in stories of collateral damage from drone strikes, burning of opium fields in Afghanistan, or hundreds of thousands of civilian deaths in Iraq.


I think, at its core, and in what drives most people to work there, Apple is fundamentally about improving humanity's experience through computing.

I don't mean that as a buzzword soup either. They're the most valuable company in the world for a reason. It also means they're the spotlight for all the moral dilemmas we face as a global society more deeply integrating computing into our daily lives. So I have to disagree.

Apple is trying to take a stand saying, "We will be proactive about policing the one activity that nearly everyone considers morally reprehensible," and tacitly they're saying in that, "This is more important than our whole mission to bring privacy back to computing users."

Because what they're building completely invalidates that. They're effectively saying that -- as of August 5, 2021 -- Apple is privacy-hostile.

Then to connect that back to their true mission, or what anyone working for the most powerful computing company in the world ought to consider their true mission: they're about improving the human condition through computing. Working for Apple now signals that you believe humans are better off without privacy.

Buying from Apple now signals that you believe humans are better off without privacy. Apple is privacy-hostile.

It doesn't have to stay that way. There can still be some pride in making short the number of days from August 5, 2021 to the day that Apple makes a stand for the liberty of all humans to feel secure in their ability to use computers in all forms without an ever-watchful million unknown eyes over their shoulder.


The armed forces is a bad comparison. They have a fundamentally different mission than any given corporation. There are different moral tradeoffs to be made there.

Not to say I agree with the US armed forces' actions -- but comparing the US armed forces to Apple is a bit disingenuous.


I think this is rather hyperbolic. The vast majority of people at Apple have nothing to do with this. And even for the people involved in this, I'm not sure how much I can blame them. Soldiers vs. generals kind of argument. Also, they likely see the pedo-fighting side of things and think they are in the right.


[flagged]


> And even for the ones that would starve, I'm sorry, but I put the well-being of the entirety of society and our collective future over their lives.

God, just listen to yourself.

Personally I think their plans are fairly balanced and reasonable. Like it or not, there are legitimate interests here. It's a difficult trade-off and conversation. Ostracizing people and saying you would literally rather have people DIE is not adding to the conversation. The way you talk about it makes it sound like this is about the Nuremberg laws or something.


What makes you think I haven't listened to myself? And don't bring God into this, he's left the conversation a long long time ago; This shit's on us. I really couldn't care less if you think their plans are balanced or reasonable -- I don't agree with that on a fundamental level because Apple has long passed that threshold. This is just their latest digging a deeper hole. And why should I feel obligated to add to a conversation that has long gone nowhere? What can I add to it that wouldn't just get ignored anyways. I might as well inject my honest opinion, however much it'll be ignored.


This is not really much of a conversation if you don't care what other people have to say, is it?

I don't really understand this almost visceral hate towards Apple. They do some things I like, and some things I don't like. This kind of stuff is not going to change anyone's mind, much less help them understand your position.


The visceral hate here is towards Apple because they are the current subject. Substitute any FAANG, or really, any company behaving similarly, and my reaction would be similar.

Was it ever a conversation in the first place? I'm pretty sure that unless there's enough blow-back financially, Apple is going to do whatever it wants. This was never a conversation, it was a dictum from Apple that we're all reacting to. They're going to do whatever they think will bring them the most profit. I am just following suit. I don't need others to understand my position, I don't need them to change their minds. I need them to acquiesce and do as necessary. This is what Apple dictates; so shall I.


> Was it ever a conversation in the first place? [..] I don't need others to understand my position

What do you think the point of Hacker News is?


It certainly isn't to constrain one's expression of opinion to the well-trodden cliches and kowtows.


It also isn’t to bully people into agreeing with you, regardless of what valid or invalid arguments they may or may not have.

I’d suggest reading the HN guidelines one more time: https://news.ycombinator.com/newsguidelines.html


"If you're a hiring manager, throw the resumes of anyone that worked at Apple in the trash."

Well,... NO. Those are the very people trying to depart Apple, so surely the very ones we should try to support?

I'm not saying they should attract a more favourable review in the application/hiring process, just that the ones who should attract opprobrium are those staying behind to continue supporting Apple. (Assuming you go along with the entire proposition that pressuring developers will alter Apple's corporate misbehaviour in the first place.)


Except that I can't do anything as a hiring manager to those that continue to stay at Apple. But I can help perpetuate the notion that if you work at Apple for any reason, you'll be un-hirable anywhere else.

The point of reasonable doubt passed a long time ago. If you didn't get out years ago, you're already part of the problem.


People have asked of this for Facebook employees for years, with little effect.


Maybe we should ask for more leaktivism from morally inclined employees instead of trying to push water uphill.

Sunlight disinfects power.


Has that ever changed anything? Like, really?

Sure, we get up in arms for a little bit, but come the next news cycle its back to business as usual.


all the time. it triggers resignations, policy changes, impedes further power grabs.

if you're a small player in an evil empire it's the one thing you can really do that makes a difference.

just quitting is what they'd want you to do. That never makes a difference.


One person quitting doesn't make a difference. Most people quitting sure does.


I don't think I've ever heard of a medium-sized company being taken down a peg this way, never mind a behemoth people dream of working at that treats its employees as well as Apple does.

But, good luck trying to spur an exodus, I guess?


That’s such an idealistic stance. 99% of people don’t have the means to engage in militant activism.


It is a very idealistic stance. And precisely because most people can't engage in militant activism, those that can should be.


Or you know, people can just have fun and chill out. Not everyone's life goal is to change the world.


Then go have fun where it doesn't hurt the rest of us.


As much as I dislike what Apple is doing here, Apple is a long way down the list of companies that should be a black mark on your resume.

Oracle is a MUCH better target, for example.


Oracle's on the blacklist too. One does not preclude the other.


In the end, which companies remain off your list?


Horrible. Apple has every right not to facilitate child exploitation, your rights be damned. Don’t like it? Don’t buy an iPhone.

end of story.


So we should sit still and wait until CCTV in every room becomes a normal preventive measure against child abuse? No thanks; we will tear down this initiative, and will keep doing so in the future. However, you can keep sending your private footage to the police station, if you like doing so.


Four of the top ten posts as of writing are about this issue. I'm glad people are finding something they are interested in to talk about, but at this point this issue is hardly intellectually interesting.

It's like people are being forced to buy an iPhone or something.


I guess a lot of owners of Apple devices are also making use of a lot of their services/features, making it hard, or at least inconvenient, costly, and time-consuming, to switch to open alternatives, i.e. vendor lock-in. This is why it's good to try to avoid this situation occurring in the first place. But that's at odds with how you're 'supposed' to use their devices and ecosystem, and you may as well have not chosen Apple in the first place. Most people have already gone 'all-in'. Too bad when they implement something (like this) that could be used to target innocent people at some point in the future. Apple obviously don't intend it to be used in this way. But a bit like the Pegasus debacle with it only targeting criminals, it's obviously a vulnerability waiting to be exploited.


Unpopular opinion: I think the outrage over this is quite overblown.

This kind of hash checking is done by damn near all cloud providers - Google Photos, Dropbox, Gmail, Discord, Reddit, OneDrive, Facebook, Twitter, you name it. Apple have actually been very reluctant to implement this.

If you don't like it, you don't need to enable iCloud Photos. In the exact same way, if you don't want your images scanned by Dropbox, you don't upload the photos to Dropbox. It seems reasonable for Apple to implement something like this to prevent iCloud becoming a repository for CSAM.

Edit: I’m unable to respond to the comment below this about hashes being performed locally, as I’m rate limited, but here’s my response anyway: The hashes are calculated on the local device, but only for photos which are about to be uploaded to iCloud Photos. The hashes are sent with the photos to iCloud. If a certain number of hashes in someone’s iCloud match (calculated using private set intersection), a manual review will be performed.
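(To make the threshold part concrete, a toy Python sketch of the logic is below. This is NOT the real private set intersection protocol, which is cryptographic and keeps both sides blind until the threshold is crossed; the threshold value is a placeholder, and the function names are made up for illustration.)

    THRESHOLD = 30   # placeholder; the exact number isn't given in the technical summary

    def account_flagged(uploaded_hashes, known_csam_hashes, threshold=THRESHOLD):
        matches = sum(1 for h in uploaded_hashes if h in known_csam_hashes)
        return matches >= threshold   # only past this point would a human review happen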


It's being done on the local device, not just the cloud.


Why does it matter where the scanning is done, if the scans are done to the same content under the same circumstances?


iphone user != icloud user

When you use a cloud service, you must (should) be aware of the consequence of your files being stored on another computer on the internet.

Until now, your local files on your device were not scanned in an inscrutable manner.

This is a precedent.


But it’s still only cloud photos being scanned. The only photos on your device being scanned are ones that are being uploaded to the cloud. So again, what’s the practical difference?


There is none. I partly blame Apple for putting these 3 distinct features into the same PR doc, the news articles that are flat-out wrong, and commenters for not reading the original source. Because it's Apple and privacy, people immediately lose all semblance of understanding, it seems.

There are some valid slippery slope arguments to be made. But, the feature as described and implemented today actually increases privacy for CSAM scanning.


If the scan remained in the cloud, it could not be misused to spy on anything else but the uploads. But if it runs on the device and sends findings, it fits the definition of spyware.


That argument just makes no sense. Apple already controls your device and software.


Not sure why you're downvoted (too many facts in one post I guess lol), but you're right. I wanted to add that Apple has already been doing CSAM scanning on photos uploaded to iCloud for years. They moved it to the client instead (I think in preparation to make iCloud Photos E2E encrypted).

Turn off iCloud Photos and it turns off the CSAM scanning.



