Hacker News

Then ask yourself why they shipped this scanning on the client-side. This is the first step towards normalizing client-side scanning of encrypted content across the entire device.



Why would Apple want to do this? It doesn’t benefit them at all. Their competitive advantage is having people trust their devices, if not their values.

People are making an extreme claim that Apple went out of their way to implement a fancy system to ruin their own value proposition, and the evidence they have to offer is mere speculation.


You seem to be operating under the assumption that device owners are the only customers of the Apple ecosystem.


Ambiguous retorts like this make you sound intelligent but offer little to the discussion. If the adversary is the USA or China, I have bad news for you: every major democracy has planned encryption regulation which is unimaginably worse than what was announced here.


Telling me what I already know is not really contributing much, either.


Actually, the discussion benefits from everyone stating their arguments clearly, even if you don’t benefit. Most of the discussion of this change has been FUD, making it difficult to tease apart actual privacy regressions from imagined ones.


Hopefully so they can remove their current ability to decrypt user photos for whatever reason they want. As it stands, they can decrypt any user photo on iCloud. Client-side scanning and this CSAM detection implementation could allow them to remove their ability to decrypt EXCEPT in very specific situations.

It's not true end-to-end encryption, since in some cases the content can be decrypted without the user's key, but it's significantly closer than what they have today.

That said, I don't know whether that is their plan or not, but it is a plausible reason to make this change.
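The "decrypt EXCEPT in very specific situations" idea corresponds to the threshold secret-sharing scheme described in Apple's published technical summary: the server can only recover the key that decrypts the safety vouchers once enough client-side matches have accumulated. A minimal Shamir-style sketch in Python (all names, the threshold value, and the structure are illustrative assumptions, not Apple's actual code):

```python
# Hypothetical sketch of threshold secret sharing: a key split into shares
# such that any THRESHOLD of them reconstruct it, and fewer reveal nothing.
import random

PRIME = 2**127 - 1  # prime modulus for the Shamir field
THRESHOLD = 3       # illustrative match threshold (the real value is higher)

def make_shares(secret, n, t=THRESHOLD, prime=PRIME):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x=0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % prime
                den = den * (xi - xj) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

key = 123456789          # stand-in for the voucher-decryption key
shares = make_shares(key, n=10)

# Below THRESHOLD shares, interpolation yields a value unrelated to the key
# (with overwhelming probability); at or above it, the key is recoverable:
assert reconstruct(shares[:THRESHOLD]) == key
assert reconstruct(shares[3:3 + THRESHOLD]) == key
```

In this framing, each matched image contributes one share, so the server's "specific situation" for decryption is exactly "at least THRESHOLD matches" and nothing less.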


If they can decrypt in a “specialized” situation, then they can decrypt in any situation. All that has to be done is to broaden the classifier step by step. Or someone else gets access to the back door. That’s why there can be zero allowed back doors.


The database ships with iOS. Apple can do anything they want in iOS updates. In fact, this was exactly the “back door” the FBI requested Apple use half a decade ago. Per this standard, all Apple end-to-end products are already backdoored, and nothing new was announced.


Yes, but previously Apple’s stance was, “no, we won’t do that”. And so they earned many people’s trust. Now they are planning to do exactly that, and so they have broken the trust they earned.


Where specifically did Apple say they would never scan for the presence of CSAM in your iCloud Photo Library? In fact, people mistakenly assume that they already do.


They can decrypt iCloud content currently, so it wouldn't be a step back.

On the topic of backdoors, automatic update systems could be used as backdoors.


Probably why every system update requires passcode input.


I disagree. I think it's the first step towards enabling E2EE on iCloud Photos. This system will replace the server-side CSAM scanning they have done for years.


I think this is highly unlikely; everything points to Apple ditching the idea altogether: https://www.bbc.com/news/technology-51207744

Many foreign countries have also clearly stated that they do not want this (E2EE) to happen and would legislate against it (the UK comes to mind first).

I do believe you are correct that this technology was initially developed as a compromise to enable E2EE. But while E2EE on iCloud was indefinitely shelved, somehow this was not.

And someone at Apple thought this could be repurposed as a privacy win anyway?

The other way I can think of it is if the ultimate goal is to add those checks to iMessage. One could argue the tech would make a lot more sense there (it's mostly E2EE with caveats), and it would certainly catch many more positive hashes.

I think someone at Apple massively misjudged the global implications of this and opened the company to a (literal) world of upcoming legislative hurt.


I read that article and see this new method as a workaround for the FBI's complaints, one that allows E2EE to move forward once again.

Technology doesn't live in a vacuum. Given the calls from the government for backdoors to encryption, I think it's safe to assume this is Apple getting out in front of what could likely be heavy-handed legislation to add actual backdoors like master keys.

But, we'll have to wait and see if Apple starts adding more services to E2EE again. It also may all be moot if legislation gets passed that forces companies to be able to break the encryption for warrants.


> Technology doesn't live in a vacuum. Given the calls from the government for backdoors to encryption, I think it's safe to assume this is Apple getting out in front of what could likely be heavy-handed legislation to add actual backdoors like master keys.

I broadly agree, but I cannot foresee a scenario where limiting scanning to this particular issue (CSAM) would be seen by legislators as a sufficient compromise to allow E2EE to be expanded.

And other countries will have very different interpretations, much less palatable to Apple's values, of what should be checked for, and they will have no qualms about legislating to require it.

Quoting the NY Times (via Daring Fireball):

> Mr. Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government.

> “We will inform them that we did not build the thing they’re thinking of,” he said.

They can tell themselves that, but it doesn't matter: they built precisely that thing.



