[flagged] Apple OSes are insecure by design to aid surveillance (sneak.berlin)
43 points by vitplister 11 months ago | hide | past | favorite | 32 comments



> First, iCloud E2EE is opt-in. The setting is buried, and there are no prompts to enable it, so approximately 0% of iCloud users have turned it on. It might as well not exist.

This is a feature, not a bug. If my mom or Aunt Millie forgets her password, I still want her to be able to recover her data.

For most regular people, the main issue in the CIA triad, most of the time, is not confidentiality but availability: not losing photos and videos of their (grand)children is the highest priority.

Human rights activists and journalists can enable more stringent controls separately from the general public because their threat model is different. Apple has added a suite of extra features for these individuals in recent releases:

* https://techcrunch.com/2022/08/12/apple-lockdown-mode-ios-16...

* https://www.apple.com/newsroom/2022/07/apple-expands-commitm...

* https://www.apple.com/newsroom/2022/12/apple-advances-user-s...

For all of Apple/iOS's imperfections, are there other vendors with out-of-box experiences that are at a higher level in this (security/privacy) regard?


> Human rights activists and journalists can enable

This completely misses the point of why one needs privacy. Lack of it harms journalism and activism, making the government too powerful and unaccountable. If only activists and journalists seek out privacy, it becomes much easier to target them. Everyone should have privacy, precisely so that the people who depend on it are protected. It's like freedom of speech: necessary not just for journalists, but for everyone, even if you have nothing to say.

> are there other vendors with out-of-box experiences that are at a higher level in this (security/privacy) regard?

Yes: Purism with their Librem devices, including Librem 5 GNU/Linux phone.


In a perfect world, yes. In reality, a friend lost his phone and forgot his Apple password. Apple was able to help him recover his data after he proved his identity to them.

For most people, data loss is much more of a risk than privacy loss.

I'm a member of the EFF. I back up all my own data. I have Advanced Data Protection turned on. I keep my handful of Dogecoin in a hardware wallet stored in a safe deposit box. I am perfectly competent and willing to take responsibility for irrecoverably losing my data. Most people are not you and me, and defaulting to perfect privacy would cause much more harm than choosing best effort.

> Yes: Purism with their Librem devices, including Librem 5 GNU/Linux phone.

I think they meant other vendors making devices targeted at the general public. There has always been demand, from the relatively tiny market of people like you and me who value that choice, for devices that trade convenience for security. No way I'd set my older relatives up with one.


> In a perfect world, yes. In reality, a friend lost his phone and forgot his Apple password.

I fail to see how this is relevant to unencrypted communication with Apple servers. As far as I can see, there is no reason not to encrypt it except a malicious one.

> targeted at the general public

This is HN, not a general forum. People here who care about privacy should be able to use such devices. So I recommend them.


> This is HN, not a general forum. People here who care about privacy should be able to use such devices. So I recommend them.

In the spirit of respectfully challenging what you're stating: the nuance here is privacy for people like us versus privacy for the entire world.

Even within the HN audience, I don't think it's a stretch to say some individuals would prefer secure (data encrypted in transit) and available (note: not fully E2E-private against three-letter agencies) over perfect privacy.

FWIW, I'm one of those folks with iOS devices who has turned on E2EE for iCloud. I'm attempting an "as close as I can get to great privacy while still balancing the nuances of life" strategy.


I don't think your lifestyle choice conflicts with educating people about backdoored systems. It seems to me that hackers and civilians can work together to identify obviously backdoored systems (e.g. Dual_EC_DRBG) and encourage review.


This isn't wrong, but it's also not just Apple. Virtually all mainstream OSes scream into the cloud constantly, and a disturbing amount of that traffic is either not encrypted at all or has unencrypted SNI fields and other easily fingerprintable content.

Apps do it too. When I looked into it, I was amazed how many apps contain metrics and other telemetry features, and how often this traffic is unencrypted or carries unencrypted SNI data that can at least identify the app.

Then there's DNS, of course, which is still usually plaintext and can leak all kinds of information about what you are doing and running.
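
As a concrete illustration, a standard DNS query carries every label of the hostname in cleartext. A minimal Python sketch (the hostname is illustrative, and real resolvers add EDNS options and more):

```python
import struct

def build_dns_query(hostname: str) -> bytes:
    # Standard DNS query (RFC 1035): 12-byte header + question section.
    # Fields: ID, flags (RD set), QDCOUNT=1, ANCOUNT, NSCOUNT, ARCOUNT.
    header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed, terminated by a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode()
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

query = build_dns_query("metrics.example.com")
# Every label travels in cleartext, typically over UDP port 53,
# readable by any on-path observer:
assert b"metrics" in query and b"example" in query
```

DNS-over-HTTPS or DNS-over-TLS closes exactly this gap, but neither is the default almost anywhere.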

All this stuff taken together can pretty easily be used to fingerprint you.
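
To see how little effort that fingerprinting takes, here is a hedged Python sketch that builds a minimal TLS ClientHello and pulls the hostname back out of the cleartext SNI extension (layout per RFC 6066; the hostname and cipher suite are illustrative, and real ClientHellos carry many more extensions):

```python
import struct

def build_client_hello(hostname: str) -> bytes:
    # Minimal ClientHello carrying a server_name (SNI) extension.
    host = hostname.encode()
    sni_entry = b"\x00" + struct.pack("!H", len(host)) + host  # type 0 = host_name
    sni_list = struct.pack("!H", len(sni_entry)) + sni_entry
    ext = struct.pack("!HH", 0, len(sni_list)) + sni_list      # extension type 0
    extensions = struct.pack("!H", len(ext)) + ext
    body = (
        b"\x03\x03"                               # client_version
        + b"\x00" * 32                            # random
        + b"\x00"                                 # session_id length
        + struct.pack("!H", 2) + b"\x13\x01"      # one cipher suite
        + b"\x01\x00"                             # compression: null only
        + extensions
    )
    handshake = b"\x01" + len(body).to_bytes(3, "big") + body
    return b"\x16\x03\x01" + struct.pack("!H", len(handshake)) + handshake

def extract_sni(record: bytes):
    # Skip record header (5) + handshake header (4) + version (2) + random (32).
    i = 5 + 4 + 2 + 32
    i += 1 + record[i]                                        # session_id
    i += 2 + struct.unpack_from("!H", record, i)[0]           # cipher suites
    i += 1 + record[i]                                        # compression
    end = i + 2 + struct.unpack_from("!H", record, i)[0]
    i += 2
    while i < end:
        ext_type, ext_len = struct.unpack_from("!HH", record, i)
        i += 4
        if ext_type == 0:  # server_name: skip list len (2) + entry type (1)
            name_len = struct.unpack_from("!H", record, i + 3)[0]
            return record[i + 5 : i + 5 + name_len].decode()
        i += ext_len
    return None
```

Anyone on the path can run the equivalent of `extract_sni` on the first packet of every TLS connection, which is exactly why Encrypted Client Hello exists.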

The only way to fix this would be to adopt protocols like QUIC, or newer versions of TLS with encrypted SNI (Encrypted Client Hello), for everything all the time, and to block outgoing plaintext HTTP.

What I really think is that allowing apps carte blanche access to the Internet is just not tenable in 2023. It's a bit analogous to the old MS-DOS days when apps had open unprotected access to all RAM. Outgoing connectivity should be whitelisted.


> Outgoing connectivity should be whitelisted.

The problem is, whitelists suck. Just browsing with NoScript is annoying. Unfortunately you cannot do anything to fix this at an individual level; you end up being your own 1-person full-time IT department.

We need some equivalent of class action; this is where companies like Apple could use their leverage. There's some precedent with things like blocking third-party cookies by default: this could have been the default in 1997, but we had to wait 20+ years and suffer all of the consequences in the process. Meanwhile, the dominant platform (Chrome) won't change it, because of a conflict of interest.

Allowing everything to talk to everything was a mistake, but you can't fix it by cutting your own wire.

> [...] block outgoing plain text http.

I have a TiBook here (an insanely powerful machine by most retrocomputing standards) that struggles to keep up with modern crypto, both TLS and SSH. I really appreciate it when websites offer plain HTTP as an alternative, without forcing an HTTPS redirect. Honoring the Upgrade-Insecure-Requests request header, followed by HSTS, does the right thing for modern browsers.
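
For reference, that mechanism looks roughly like this on the wire (header names are the standard ones; values are illustrative). A modern browser advertises:

```http
GET / HTTP/1.1
Host: example.com
Upgrade-Insecure-Requests: 1
```

A server can then redirect only those clients to HTTPS (adding `Vary: Upgrade-Insecure-Requests`, and `Strict-Transport-Security: max-age=31536000` on the HTTPS side), while continuing to serve plain HTTP to legacy clients that never send the header.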

You can bootstrap a basic Linux userspace, starting at a C compiler and zero lines of code, into having a working shell and an HTTP client, in about one weekend; but the road from there to having modern TLS would be monumental. How much of the present world's knowledge is going to become permanently inaccessible, should we ever have to start bootstrapping?


> What I really think is that allowing apps carte blanche access to the Internet is just not tenable in 2023. It's a bit analogous to the old MS-DOS days when apps had open unprotected access to all RAM. Outgoing connectivity should be whitelisted.

This is a good idea. Users may not actually enforce it, though. I think most people would rather give all apps the permissions they ask for than quit using whatever program they're used to in favor of another one that doesn't send telemetry. Most people don't seem to care that much.


> The only way to fix this would be to adopt protocols like QUIC or later versions of TLS with encrypted SNI for everything all the time and block outgoing plain text http.

Or choose a phone that doesn't act against you in the first place.


This article contains some truths but unfortunately also some untruths. For example:

> Several important connections (TSS, OCSP) are made from Apple devices in plaintext (that is, completely unencrypted). This began for historical reasons, but has been repeatedly reported to Apple. They have not fixed it.

This is inaccurate. Apple did in fact switch from the unencrypted ocsp.apple.com to the encrypted ocsp2.apple.com.

> Apple committed in writing a few major versions (i.e. ~3 years ago) to providing a preference setting for disabling online OCSP checks in macOS when I made a stink about it, within one year.

The author is mistaken about his role in this. The reason was not his "stink" but rather the fact that Mac apps around the world suddenly refused to launch, which everyone noticed:

https://www.theverge.com/2020/11/12/21563092/apple-mac-apps-...

> Apple does not allow plaintext server communications in apps released by developers in the App Store.

This is false, as I can attest as an App Store developer. I have several apps with NSAllowsArbitraryLoads.
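
For context, that exception is declared in an app's Info.plist under Apple's documented App Transport Security keys; a minimal fragment looks like:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>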

I wish that "sneak" would be more careful in his writing. He has a tendency to undermine his own valid points by burying them in carelessness and overblown rhetoric, which causes people to stop taking him seriously.


But he is correct in that Apple never implemented the ability to disable the OCSP checks.


Yes, as a number of people (such as Howard Oakley and myself) had already blogged before sneak did. But OCSP checks are now encrypted, which undermines the starting thesis of the article: "Apple is preserving unencrypted server connections in their operating systems in an effort to enable global location tracking of their userbase by passive monitoring of major internet backbones."

As I said, the article has some valid points, but the problem is that the inaccuracies and overblown rhetoric threaten to overshadow them.


I think there's a lot of this sort of thing that goes on.

Remember when Edward Snowden said you need to remove your phone's battery to be sure you weren't being overheard? The "fix" for that, apparently, was to stop making phones with removable batteries at all, i.e. a general corporate collusion to give the customer what they do not want.


Why is this post flagged? The author showed that your personal data is regularly sent unencrypted by all Apple devices. Isn't this big news?


This swings between so many wildly different things. Yes, Apple should be encrypting the contents of connections, using TLS and/or app-layer encryption. No, I don't think OCSP is inherently evil, although it should also use encrypted traffic if it doesn't today. And finally, I worked tech support long enough to understand why giant customer-facing corps don't enable E2EE by default: it's a support nightmare.

But if you know what you're doing, turn on Advanced Data Protection (https://support.apple.com/en-us/HT202303) and take more of your data protection into your own hands.


> First, iCloud E2EE is opt-in. The setting is buried

Go to Apple ID > iCloud in Settings. Is that really "buried"?


So what should we do?


Stop trusting Apple?


Assuming we have to use their devices, what measures can we take?


This is a weird assumption.


It's only a weird assumption if you're going to be rigid about solutions. Think along the lines of adversarial maneuvering: software and hardware measures to combat and neutralize privacy-leaking behavior.


You do not control your Apple device. You can't see or change the code it runs. Apple's fundamental approach is "you trust us ultimately".


So you're saying there are absolutely no avenues for reverse engineering here? If so, that speaks more to Apple's competence in being able to lock things down.


There are certainly things you can reverse-engineer in relation to macOS and Mac hardware. Not everything is reversible, though, and even third-party projects like Asahi still have to run the Apple-engineered Boot ROM and device firmware. In the fictional, expressly non-real world where Apple deliberately compromises their hardware for intelligence agencies, you are a fish in a barrel. No amount of "adversarial maneuvering" will meaningfully save you.

Companies like Apple, Intel and AMD are proving to the world that the only secure hardware stack is a fully open one. God only knows what the second Snowden leak will reveal once current geopolitical tensions boil over...


Is it not enough to have a full view of all network requests? How can surveillance happen if we have full knowledge of what enters and leaves the device? Is there any evidence of hidden network requests?


It's enough to detect the network requests and read some of them (since they are unencrypted as explained in TFA), but it's not enough to fix that.


Like real Apples, they go rotten.


This is a wonderfully articulate write-up confirming what I've suspected about Apple for a long time. I'm glad I finally have a good reference to send people when this comes up, because the amount of "Apple is good for privacy" I've heard in my little circle of tech-adjacent but not tech-literate people is too damn high.


Even after 2013, when we learned that Apple too participated in the PRISM program?


The entire scandal went in one ear and then out the other.


Why is this downvoted? It's true that Apple doesn't care about your privacy if it sends your personal information unencrypted.



