Apple CEO Tim Cook: 'Privacy Is a Fundamental Human Right' (npr.org)
227 points by davidbarker on Oct 1, 2015 | 178 comments



In that case, why are they supporting CISA? http://fortune.com/2015/09/24/tech-vendor-cybersecurity-bill...


This [1] is the letter linked in the article as the evidence that they support CISA, correct?

Maybe it's because I can't read legalese very well, but this looks like tech companies asking congress to codify our rights for the digital age, and to provide a well defined legal framework for those rights to exist within. I'm not really seeing support for sweeping powers of spying; it seems like they're asking for the opposite in fact. Again, I may be misreading that.

1: http://www.bsa.org/~/media/Files/Policy/data/09142015CongLea...


It's not so much legalese as an attempt to hide a wolf in sheep's clothing.

They put other legislation around it (I did not inspect those, so I can't comment) and among them (in the center, so a person skimming might miss it) they put: "Cyber Threat Information Sharing Legislation". They don't mention which bills they are referring to, and they made sure the acronym doesn't match CISPA. If you search for it you will find that there's no legislation with that name, and the only results you get are for CISPA.

Why do those companies want CISPA? Because it removes all liability from them when they share data without a warrant. Basically, we would no longer be able to sue them for violating our privacy.


> Because it removes all liability from them when they share the data without a warrant.

Aka, a Letter of Marque[1]. Some companies really want to be the privateers of the surveillance-industrial complex.

[1] https://en.wikipedia.org/wiki/Letter_of_marque


> Why those companies want CISPA? Because it removes all liability from them when they share the data without a warrant. Basically we no longer will be able to sue them for violating our privacy.

Nothing in CISPA allows them to violate their EULA.

If they had EULAs that violated your privacy in the first place, then that's a problem between you and the company, independent of CISPA.


This is irrelevant. The EULA is between the company and the user, and it is also imposed by the company. What they write might not necessarily be enforceable.


Great. Then don't agree to those EULAs, and don't use their products/services.

Problem solved.

And we didn't even involve CISPA!


Once that became law, they would no longer even have to mention it in the EULA, because the law takes precedence. In addition, all companies would be doing it, so it's not like you can "vote with your wallet".

So I'm still not following what the EULA has to do with anything in this context.


Or, for EU people, it’s a problem between the company and your government, and you can force the company to change the EULA, which in turn means they have to change their policies or lose in court, like Facebook.


The EULA is something imposed by the company; it is totally optional, but companies use it to restrict what you can do with their product or service. They could mention that they would send data about you, or they could just say they have to follow US laws.


The letter calls for "Cyber Threat Information Sharing Legislation" without specifying _which_ bill. But given that CISA is the only such bill in front of Congress, it looks like Apple has endorsed the bill (without technically endorsing it). Given the specificity of the other bills the letter mentioned, and how publicly unpopular CISA is, this seems like a massively spineless move.

If Apple is serious about privacy they should clarify and affirmatively oppose CISA.


It is a shame this comment will be downvoted only because it does not support the confirmation bias of many HN users.


Does HN now show downvoted comments at the top of the page? Interesting design choice...


I have seen this phenomenon before; it seems that they take into account more things than just how upvoted/downvoted it was, like the posting user's karma, how much the answers were upvoted, or the answering users' karma.

It makes sense; sometimes a really stupid comment (not implying this was the case) can spawn a really interesting discussion. I often look at the bottom of the comments for hidden goodies.


> If you buy something from the App Store, we do know what you bought from the App Store, obviously. We think customers are fine with that.

Users should also be able to install apps on their iPhones without Apple knowing what they're doing. Personal computing devices shouldn't have artificial walled gardens.

But I agree with him on everything else and think he's a hero for what he's doing.


From: http://9to5mac.com/2015/06/10/xcode-7-allows-anyone-to-downl...

"Apple has changed its policy regarding permissions required to build and run apps on devices. Until now, Apple required users to pay $99/year to become a member of Apple’s Developer Program in order to run code on physical iPhone and iPads. As part of the new Developer Program, this is no longer required. Apps can be tested on devices, no purchase necessary."


Does it let you install binary apps or do you have to compile from source? Do they still have to be signed? Who verifies the signature?


You have to compile an app using Xcode, and then it will automatically sign it with a certificate tied to your Apple ID.


Well, if you have a developer cert, you can use iModSign app to change the certificate on a binary and install it on your iOS device. I've done this before with a beta version of WhatsApp.


You have to compile it.


I think that what Apple has done here is an acceptable tradeoff. The app store trades a small amount of privacy for a lot more security. Looking at Android's perpetual issues with this makes me think that Apple made the right choice.


Security? What security?

Every single release of iOS has been totally broken security-wise, as can be seen from the existence of jailbreaks.

Apple doesn't even require seeing the source code, and so there is no way they can stop malicious applications written to evade detection from getting to the App Store.

And thanks to their policy you must browse with Safari WebKit, which is a nice juicy ~40% browser share target.

Of course, both Android and Windows Phone are broken too, since they also expose oversized monolithic kernels written in C to random applications, but at least Android doesn't require you to give up freedom to get non-security.


> Every single release of iOS has been totally broken security-wise, as can be seen from the existence of jailbreaks.

The jailbreaks for recent versions of iOS only allow jailbreaking unlocked devices, at which point security is already compromised.

> And thanks to their policy you must browse with Safari WebKit, which is a nice juicy ~40% browser share target.

In iOS 8 they allowed other apps to use the same JIT engine Safari does[1]. This makes me inclined to take them at their word when they say it was previously disallowed for security reasons (some early jailbreaks could be done just by visiting a web page, using security holes in Safari's JavaScript JIT).

[1]: http://9to5mac.com/2014/06/03/ios-8-webkit-changes-finally-a...


> The jailbreaks for recent versions of iOS only allow jailbreaking unlocked devices, at which point security is already compromised.

What? You must mean something else because unlocked devices aren't (necessarily) compromised.


> What? You must mean something else because unlocked devices aren't (necessarily) compromised.

I mean compromised in the sense that the malicious party now (for example) has access to the user's email, and would be able to reset a whole host of passwords for online services (assuming they don't use 2FA or something similar, which most users don't). If they wanted to install a keylogger, or get saved passwords, then yes they still have to jailbreak my device. This xkcd is relevant: https://xkcd.com/1200/


I think we must be using "unlocked" in different ways. I'm intending it in what I think is the conventional way for this context: when the device's cellular subsystem is not electronically locked to a particular cellular service provider.


You're absolutely right, I should've picked a different word. I mean unlocked in the lock-screen/password sense. Of course, messing with carrier settings is not easily done even if the phone is not carrier-locked, and pulling off an exploit that way is even more difficult.


>Security? What security? Every single release with iOS has been totally broken

Here's a recent report from Computerworld: malware infections delivered via mobile networks - Windows 80%, Android 20%, "iOS and other operating systems were at nearly negligible percentages".

Nothing's perfect but iOS isn't that bad.

(Windows' numbers seem to be mostly PCs with tethering. "Data generated from scans by Alcatel-Lucent's Motive Security Guardian technology, which is deployed worldwide by both mobile and fixed-line networks, and monitors traffic from more than 100 million devices")

http://www.computerworld.com/article/2984444/mobile-security...


I love Apple products, but please stop. Please. Do not be blind to the fact that Apple has to remove apps from the app store often due to security issues. And most of what you hear about on the Android side is either blown up in the media or because people side load apps from ... questionable sources.


You're kidding me, right? Android applications are allowed to run daemons that will continue to run whether you like it or not. [1] Due to Google's lax review process, you can basically write a virus and post it to the store.


daemon (long-running background task) != virus

A virus's primary function is to replicate itself and infect other hosts. You cannot simply install apps (or viruses) on other mobile devices without user intervention. Also, viruses need access to the system and/or other apps to propagate, which is not the case on mobile platforms.

None of those things is true for daemons.

Btw, iOS also has long-running processes if you register the proper functionality (VoIP, background download, etc.) when submitting the app to the App Store. Also, iOS and Android apps can be awakened by the server and perform tasks without the user noticing.
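To illustrate the iOS side: that registration is done declaratively in the app's Info.plist via the standard `UIBackgroundModes` key. A minimal sketch (which modes a given app would actually request is hypothetical here):

```xml
<!-- Info.plist fragment: declares which background execution modes the
     app requests. Apple reviews these against the app's stated
     functionality during App Store submission. -->
<key>UIBackgroundModes</key>
<array>
    <string>voip</string>                 <!-- long-running VoIP connection -->
    <string>fetch</string>                <!-- periodic background fetch/download -->
    <string>remote-notification</string>  <!-- server-initiated silent wakeup -->
</array>
```

The app only gets background execution for the declared modes; anything else is suspended by the OS, which is the key difference from an Android-style free-running daemon.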


Why would you think allowing daemon processes is bad? You want notifications without having your app front and center, right? You need a daemon for that.


I'm not sure knowing what users purchase makes the App Store any more secure. I'm pretty sure limiting what users can install on their devices is a different sort of tradeoff -- one of freedom for security.


Apple might just be better at covering it up. I would rather see it implemented like Gatekeeper on OS X: a safe store, with the user having the power to decide whether or not they leave the store.


While I applaud the spirit, I don't think actually implementing it like Gatekeeper is a good idea:

http://arstechnica.com/security/2015/09/drop-dead-simple-exp...


"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety." -B.Franklin, baby


So have we forgotten XcodeGhost already, or the iCloud hacks?


Neither of which would be aided by opening the walled garden?

Which isn't to say I don't think there are advantages to Android's model and that of more open package repositories in general, especially for power users. However, most of Apple's base is people who don't care about openness or extra repositories; they want their iPhone to work with a minimum of fuss and not have to worry about malware like you do on Android.

The fact that we can only point to a single (XcodeGhost) cromulent exploitation of the Apple App Store stack is a testament to how good Apple is at maintaining the relative security of the iOS ecosystem.


The point is that security is not an inherent or necessarily attainable feature of a walled garden. Some people seem to think that just because Apple approves apps, there aren't going to be any security issues. The approval process does prevent some security problems, and has some other features, but mobile security overwhelmingly comes from things like sandboxing.


Perhaps that's the case, but if it is, then it seems clear that Apple's sandbox is far better than Android's as well. Also, approving apps does cull out a range of malware, because it is human- and computer-analyzed before entry. As I said previously, this has its downsides, but it does make it harder to get malware into the iOS App Store. Not impossible, but harder.


Wasn't XcodeGhost a direct result of bypassing Apple's security model?


As far as I know, the users didn't. Which is to say that the App Store doesn't provide much more security than e.g. a blacklist. Android's problems are more than anything due to a fragmented ecosystem.


XcodeGhost wouldn't have happened if developers weren't able to disable Gatekeeper and the iCloud hack was done with phishing for security question answers.

http://gawker.com/feds-seized-chicago-mans-computers-in-cele...


The iCloud hack was enabled by the automatic backup feature, which, despite storing very sensitive information, didn't provide adequate security for doing so. If there were actual precedent for Apple caring about their users' privacy, there would be outrage over this rather than it being trivially dismissed.


That's not relevant at all to the iCloud "hack". It's a cloud service, the whole point is to store data online.

Those accounts were breached because celebs were phished, not because of brute-forcing (as some people speculated at the time) or some other vulnerability.


Everyone gets phished. Fraud happens. Secret questions are especially susceptible to phishing attacks. This has been known for about a decade. Apple had not updated the security protocols for iTunes, and in turn the App Store and iCloud, in that time. The hack was definitely Apple's fault.

'"In total, the [500] unique iCloud accounts were accessed 3,263 times," the document states.' - http://www.businessinsider.com/fbi-investigation-into-the-fa...

This was a massive failure on Apple's part.


Yes, how can one trust a company's claims about privacy when they seemingly lose no trust over even such a high-profile incident? Celebrities' phones have been targeted since at least the T-Mobile Sidekick.


The security benefits come mostly from curated package management in general, not limiting the device to one particular repository (the App Store).


I don't think the public is sophisticated enough to understand the decoupling of these two ideas. But I also think that one day, probably soon, it will be.


Technical users are members of the public, too.

If you want to treat users like infants, then you should hereby completely forfeit any and all rights to complain about "tech shortages" and technical illiteracy of end users.


So, this is a meta-comment because it's really just about your comment, but I feel compelled to point out the unnecessary negativity, and the unforced errors, so to speak, in representing my words, what I want, and what I've done. I'm not even sure what to call it when you create things I cannot do any longer to maintain consistency with my position - but without any evidence that I've done those things, or want to do those things. It's something like a straw-man argument.

The right thing to do is to ignore it, but on the off-chance that you might be somewhat self-aware, I thought I'd give you the opportunity to catch yourself.


Apologies, the vitriol wasn't directed towards you, specifically. You simply gave me an opportune moment to express dissatisfaction with the paradoxical desire of simultaneously wanting more locked down devices in the name of convenience and "security", while also making programming and/or CS a compulsory school subject.


> "security"

Why the scare quotes? Just because it doesn't provide 100% security doesn't mean that it provides no security.


Which means they're making the wrong choice allowing Macs to run unapproved software, right?


With all due respect, he is not a hero. He is just the CEO of Apple. And Apple, even though it produces great products, profits from exploiting factory workers in countries where workers have fewer rights and where labour is cheap. If he were actively fighting this system of exploitation, that would probably make him a hero. Instead, he profits from it. That makes him not a hero but just another businessman. Furthermore, he is defending privacy only because it strengthens his business and weakens that of Google et al.


> profits from exploiting factory workers in countries where workers have less rights and where labour is cheap

At least in China, the "exploited" workers you speak of have chosen to work at factories rather than do the kind of farm labor their parents did. Do you think they would be better off if they were not able to make such a choice? If they were paid the same wages a factory worker in Norway might receive, do you think the manufacturer would choose to open a factory in China?

You describe Cook as just another evil businessman, whereas I suspect he has done more, practically speaking, to lift people out of poverty than most.


Apple could definitely afford to be a lot more idealistic, but it's just not (no longer) in their culture and frankly that is what a lot of people (stockholders) like about them. Also that's a lot of fallacies for one comment.


Apple could pay a lot more to Chinese workers and still pay less than Norwegian wages. Tim Cook collects hundreds of millions of dollars by exploiting a power imbalance in the labor market.


Are you criticizing a person/organization who provides opportunity to hundreds of thousands of workers because he/it did not tweak the knobs exactly as you would have, if it had been you?

Many people have decided that working at Foxconn, etc., is better than the alternatives they perceive as being available to them. Do you not feel somewhat strange judging their arrangement as exploitative? At what hourly rate would it stop being exploitative? How much is it permissible for Tim Cook to earn in a year? Do you really feel able to decide these things?

Are you being exploited? Why not? Aren't your employment opportunities limited by the circumstances of your life, your education, your abilities, etc., just like everyone else?


I don't know what the issue is with criticizing.

Personally, the stories about workers' IDs being taken from them, working 2+ weeks nonstop, being forced to fill out forms a certain way, workers committing suicide, etc., are all cringeworthy to me.

Not to say there are no benefits...but that does not mean it can't be better. I'd like to read interviews of people who worked there to see if the job was what they were expecting, worse, or better.


> I don't know what the issue with is with criticizing.

No issue, abstractly. But it seems to me that in this case, Apple is providing opportunities that many people are choosing for themselves, opportunities that they would otherwise not have. And for the crime of not providing even better opportunities, Apple is being characterized as exploitative. That seems unreasonable and unfair.

> I'd like to read interviews of people who worked there to see if the job was what they were expecting, worse, or better.

I would too. From what I know, it's complicated, and contrary to what a lot of people would expect. For example, there really has been massive discontent at Foxconn, etc. -- because they weren't given as much overtime as promised.

Do I wish that people had better opportunities? So, so dearly. This is probably the deepest emotion/feeling I have in my life, the sense of how fortunate I have been compared to how difficult the lives of others have been, when many of them are just as smart, just as hardworking and resourceful as I am, but were simply born in circumstances that did not grant them as much opportunity as I have enjoyed. This is a big part of why I am in China and why I study Mandarin.

But saying Tim Cook earns "too much", or that factory workers are "exploited" because they make choices that we wouldn't, were we to be magically swapped into their bodies, but with our abilities, or some other imaginary situation that has never existed, is not very understanding of the situation, or fair or productive.

The key, I think, is to realize that if you were the "exploited" workers, you would make the same choices. Really. If you were them, you would do the same thing. Then I think it's more clear that Apple is providing avenues toward better lives, which I see as a very good thing.


It's not like we have a free worldwide labor market anyway. No one is enslaving those workers, they only choose to take that job.


Choice is a lot less relevant for this discussion than you think.


What he's saying is good, but it doesn't make him a hero. It costs him nothing to take this stance, and is actually helpful to him as one of the main competitors to the company he heads is deep in this specific issue, and it's a perceived weakness of theirs.

You aren't a hero for doing or saying something convenient that benefits yourself, regardless of whether it's good and/or true.


>Personal computing devices shouldn't have artificial walled gardens.

I couldn't agree more with this sentiment. Requiring that all software on your device be signed by a single entity (i.e. Apple) is horrifying when you think about where that might lead in the future.


And where is that?

Like he said, any back door for the good guys is a back door for the bad guys as well.


Accepting the risk that other people might misbehave is the price of freedom.

Do you want a General Purpose Computer? Or do you want an appliance that is ultimately controlled by someone else, where you have to get permission[1] if you want to use it in any way that wasn't pre-approved? What are the odds that the things you want to do will continue to be approved when you ask for permission next year? Or ten years from now?

If someone can restrict how you use your device, then they are de facto the actual owner of the device. Do you want to own the products[2] you buy, or do you want a future where property rights are rare and you have to lease everything?

The War On General Purpose Computing continues, and the people who wish the Turing machine could be stuffed back in its bottle have been winning small but important battles over the last few years. Apple deserves a lot of blame here, as the walled garden was previously limited to stuff like game consoles, but the big problem has been the many developers who chose shiny tech, promises of easy development, end-user convenience, and short-term profit over the freedom to tinker with products you actually own.

Of course there will be costs and risks. Fighting this trend toward walled gardens, like any war, will probably require certain sacrifices. I suggest paying those costs now, as the price of freedom will only increase.

[1] https://www.fourmilab.ch/documents/digital-imprimatur/ (note: this is originally about publication, but the ideas apply similarly to the War On General Purpose Computing. Also, consider when the essay was written; some technology has changed, but the basic idea is still important)

[2] Many products, not just your "phone" (portable computer). Just look at the rush to throw a CPU and 802.11 PHY into absolutely everything. There are many examples, but the current attempt by John Deere ( http://ifixit.org/blog/7192/john-deere-mess/ ) to circumvent the "first-sale doctrine" is a perfect example of what is going to happen to most products if we give up ownership.


What general purpose computers (even phones and tablets) do you see going this direction, other than Apple's iOS? My Mac will run anything I want, my Surface will do the same, and my Android tablet has both side-loading and an unlocked bootloader. Hell, even my router is unlocked, and it's from ASUS, not some small open-source oriented manufacturer.

You claim that if we even allow companies to put out products with locked firmware, it will lead to a dystopian future. So when is this slippery slope going to start? The iPhone was released in 2007, and pretty much nothing has become more restrictive since then (especially considering that pretty much all phones were already locked down). The threat of any computers being TPM-locked by the manufacturer, let alone all of them, died in its cradle. If anything, we're moving in the opposite direction, since Apple now allows sideloading on iOS devices[1].

And what about users who want a phone that is locked down for security purposes? Why shouldn't we be able to choose the phones we want, while those who want sideloading/flashing choose the many options that support that? If either laws or market collusion actually removes this as an option, I'll join the fight. But for right now, it looks like this war on general purpose computing isn't even brewing.

[1]: http://9to5mac.com/2015/06/10/xcode-7-allows-anyone-to-downl...


> What general purpose computers (even phones and tablets) do you see going this direction

Windows. Microsoft, as usual, is late to the party, but Win10 is clearly a step toward the walled garden model. It's not there yet, but stuff like the built-in app store and removing choice from the user betray the direction MS intends to take Windows.

Intel CPUs. Why do you think there has been a push for SecureBoot and the new SGX instructions? Hardware support is needed if you want to change WinTel boxen from a General Purpose Computer into a locked-down appliance. That hardware support now exists in Skylake and later Intel CPUs. Intel even says on their website that the SGX instructions are about creating "trusted" enclaves, usable by software vendors, that cannot be accessed by someone with physical hardware access.

> So when is this slippery slope going to start?

It started many years ago. Some of us have been warning about these problems for almost twenty years. When we warned that these technologies were coming, we were laughed at because the threat didn't exist yet. When implementations started to show up, we were ignored because nobody was using those tools yet to lock down systems. Now they are slowly starting to be turned on, and you've been given yet another warning. Do you intend to wait until the OS is fully locked down? Or do you want to start to fight for your right to run a General Purpose Computer while you still have the ability to do so?

Today there was even a thread on HN about Homebrew having to work around OS X's "System Integrity Protection". Sure, you can disable SIP by jumping through a few technical hoops most people won't understand. Are you going to fight back against this trend, or are you going to wait until you cannot disable SIP "for security reasons"?

Just because you've been ignoring these steps doesn't mean they are not happening.

> TPM

The TPM is only key storage and hashing to check the bootstrap chain-of-trust. The TPM never had any "locking" features. Why are you ignoring all the other hardware changes that have happened after the TPM? Active Management Technology (AMT), Software Guard Extensions (SGX), and UEFI SecureBoot have all happened after the TPM.

> And what about users who want a phone that is locked down for security purposes?

You know what would work a lot better? A hardware switch that had to be flipped to install (sudo) software, and had to be flipped back to boot as normal.

> Why shouldn't we be able to choose the phones we want

Of course you have that choice. That doesn't mean it's a smart choice. You're pushing the (incorrect) assumption that security is in conflict with the end user being able to control their own property. Locking your car door does not require giving up your ability to modify the car's engine. There are other ways to provide security. More importantly, the concept of freedom means that some people will do stupid things with that freedom, but we respect their right to make those mistakes. The answer to malware apps isn't removing everybody's right to use the products they buy as they like, but to educate users and write better UIs that help guide novices.

edit: (accidentally clicked submit before I was done)


Show me a way to prevent people from getting viruses and malware that doesn't involve the equivalent of requiring everyone to know how to service the entire engine on their car by themselves.


It doesn't exist. There is no way to prevent evil from happening when your opponent is a sentient actor that can see your defenses.

Also, stop conflating basic computer literacy (which includes common security knowledge) with the knowledge required to "service the entire engine". We expect people to learn how to drive a car safely. This doesn't require learning how to repair the car or other technical knowledge.

This belief that we should keep users ignorant is highly offensive. If there isn't a clean way for someone to learn the basics of how to use software safely and securely, that is entirely the fault of the software vendors. Unfortunately, instead of addressing these problems (which is probably hard and expensive), it has become fashionable to blame the victim.


My point is that things that are "easy" for the tech elite are not so easy for the general public. People have a hard time figuring out how to (e.g.) "program the VCR" let alone use the computer.


I know that's what you believe, and it's a perception that must change in the tech industry, because it gets in the way of important things like education and it breeds contempt. When you treat people like idiots, they will respond like idiots and learn to hate you because of it.

The "VCR blinking 12:00" problem is a good one, because it is absolutely not caused by a lack of capability. The clock on the VCR is often a very low priority for most people, and while those of us who understand technology think it's a simple thing to set it and move on to other problems, a lot of people estimate that it would require finding the manual, reading it for a while, some trial and error, etc., and they judge it's not worth their time. They have more important things to do. The much easier solution is to either ignore it because they really don't care, or wait until the local nerd stops by and ask them to do it. I would even suggest it's a very good evaluation of opportunity cost.

Now, computer security is very important, but it suffers from a pandemic problem: most people are ignorant (which is not their fault) of just how common security problems are, the cost of failure, and the novel problems that technology has created (e.g. automated attacks that make questions like "am I a target?" irrelevant).

You will never solve those problems by taking people's capabilities away. All you've done is give them a false sense of security, because they trust devices based on an incorrect threat model. Educate them, and they will adjust their behavior. This is significantly less technical than learning how to set a VCR's clock.

As for UI - consider the example I used in another comment: include a hardware switch that must be flipped to allow "sudo"-style access for things like installing software. People understand this (I knew a LOT of non-technical people who regularly used the write-protect switch on 3.5" floppies when they didn't want to erase their homework). Simple metaphors like this, when applied consistently (they shouldn't have to use the switch very often), can help a lot.

You will never solve everything; if you had a truly foolproof way to make a safe OS for everybody, I suspect you would have solved the Halting Problem. So use technology to catch the obvious stuff and provide tools for people, educate them well (this will take a few generations), and most people will be safe enough.

What you absolutely shouldn't do is limit everybody in a futile effort to try to make it safe for everybody. This is an impossible task, so you will inevitably end up in a cycle where you remove more and more features, as clever people find ways to abuse them.


> And where is that?

To censorship[1].

[1]: http://www.bbc.co.uk/newsbeat/article/33280133/apple-removes...


My Galaxy S3 let me flash whatever I wanted on it. Just had to reboot in a special mode and click a button agreeing to void my warranty. No back door required.

Why can't iPhones do the same (from a security perspective, I understand the business reasons)?


I'm saying that the flashing/side-loading is the back door. A malicious actor with physical access to your phone (think screen repair shop, TSA agent, etc) could flash a compromised OS or install malware on your device.

Also, there are a lot of not-savvy people who will follow instructions in a well-crafted email or pop-up and allow themselves to succumb to spyware and malware.


I'm saying that the flashing/side-loading is the back door

Next you'll be saying compilers are backdoors, too. That's not what the term "backdoor" means.

A malicious actor with physical access to your phone

Stop. Physical access means all bets are off.


I should have been more clear. I was using Tim Cook's statement as an analogy for flashing/side-loading. Allowing good guys to do it means bad guys can do it too.

Tricking a user into side-loading malware does not require physical access to the device and is relatively common on the Android side (more so on third-party stores).


My last few Android phones have wiped themselves before allowing me in a state where I could flash a modified version of android.


Well, it's a good thing we have Apple's track-record to show us it's impossible to break out of these walled gardens, otherwise we'd really be in trouble if someone got physical access to our phones.

https://www.theiphonewiki.com/wiki/Jailbreak


Fair enough. I used to keep up with jailbreaks way back before I started buying unlocked phones and had read about difficult-to-crack OS versions (and the as-of-yet unjailbreakable 3rd generation Apple TV), but I wasn't aware that it was still this pervasive.


While I hate the economic stifling of tech innovation that is the App Store culture, I would like to point out that jailbroken iPhones were used by the Chinese government to target protesters in Hong Kong quite recently.


My point is that calling side-loading a back door that can be abused when your phone is in someone else's possession isn't a very good argument, when side-loading actually ends up being the harder way to get software onto the phone: it is password protected (if your phone is) and still doesn't expose functionality beyond what the OS allows (unlike jailbreaking).


The device clears user storage before allowing you to flash a new OS, so that isn't an effective way to compromise a user's data. Application installation requires unlocking the device, so the ability to sideload doesn't give an attacker any meaningful ability over installing from an unscanned app store like Apple's.


The device is wiped of profile data and logins. Your files are intact and accessible through a file explorer.

It is a very restricted and time consuming method to access a device.


back door for the good guys? you mean being able to execute unsigned code?

If apple is ever compelled by the government to not allow a certain app (if they lose in the FISA courts) or if the leadership of apple changes? Stories like this could be just the beginning: http://www.theguardian.com/technology/2015/sep/30/apple-remo...


> Personal computing devices shouldn't have artificial walled gardens

To me this seems excessively moralistic, like "all closed source is bad". There are concrete advantages and disadvantages to a walled garden and whether a device has one or not should be up to the manufacturer. Customers then can weigh the tradeoffs when they make their decisions.

I also don't find privacy to be among the significant disadvantages of walled gardens. Lack of choice leading to lower quality experience, the chilling effect of disallowing specific content or entire categories and the potential for price gouging seem to be far and away the biggest issues.

Another issue is "walled gardens are bad" is a bit ignorant of real restrictions device makers have historically had to navigate such as carrier contracts saying no tethering apps.


Right. I haven't used an iPhone in years but I remember being forced to enter credit card details to install or upgrade even a free app.


There used to be a way around giving up an ATM/credit card? I remember getting home with my big purchase and thinking, I'm not giving this company more money. Plus, I just don't like to give the number out.

I don't think I even gave them my real address.


It's possible to use the App Store anonymously. For example, Apple allows users to create and use accounts via Tor. And it accepts gift cards, which one can purchase with thoroughly anonymized Bitcoin. Getting devices anonymously is the hardest part. Paying cash for used devices is probably the best option.


I like the idea of keeping app signing requirements, but giving each user a device-specific signing key if they request one. I don't think a novice user is going to accidentally get the signing key for their device and then sign a malicious app with it.
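This per-device signing idea can be sketched in a few lines. The code below is a hypothetical toy model, not Apple's design: it uses a symmetric HMAC secret per device to keep the illustration dependency-free, where a real scheme would use asymmetric keys (e.g. Ed25519) so the OS never holds the signing secret itself. All names here (`provision_device_key`, `os_will_run`) are made up for the example.

```python
import hashlib
import hmac
import os

# Toy sketch of a per-device signing key (NOT Apple's actual scheme):
# each device holds its own secret, and only apps signed with *that*
# device's key will run on it. A stolen signature is useless on any
# other device.

def provision_device_key() -> bytes:
    """The user explicitly requests a signing key for this one device."""
    return os.urandom(32)

def sign_app(device_key: bytes, app_binary: bytes) -> bytes:
    """The user signs a side-loaded app with their device's key."""
    return hmac.new(device_key, app_binary, hashlib.sha256).digest()

def os_will_run(device_key: bytes, app_binary: bytes, signature: bytes) -> bool:
    """The OS refuses to launch anything not signed for this device."""
    expected = hmac.new(device_key, app_binary, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

key_a = provision_device_key()   # my phone
key_b = provision_device_key()   # someone else's phone
app = b"my sideloaded app binary"
sig = sign_app(key_a, app)

assert os_will_run(key_a, app, sig)        # runs on the device that signed it
assert not os_will_run(key_b, app, sig)    # rejected everywhere else
```

The point of the design is that a novice never touches the key, so malware can't be signed by tricking them, while a power user who deliberately provisions a key can run whatever they like on their own hardware only.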


I would support something like an Fdroid repo for ipas and apks. It supports private repos. Every company having a repo you could simply add would be the best. Would fix trust and maintenance costs too since you control the repo list, they control the repo.

I can't see Apple or Google giving up the cash cow that is their respective garden though.


You have a misconception: any apple device is not a 'personal computing' one, but a 'corporate controlled' one. There goes your dilemma.


Unfortunately if you remove the walls from the garden you get a malware, crapware, and (ironically given the topic) spyware explosion -- a tragedy of the commons more or less. Mobile devices are used by over a billion people, most of whom don't know much about computers. From the perspective of sleazy marketers and black hat hackers that's a lot of fresh meat. They're better than PCs since they're studded with sensors, allowing the user to be tracked and spied upon to an unprecedented degree.

Look at the Google Play store and the Android ecosystem, which is comparatively more open. If you don't know what you're doing and install apps from the app store without examining them closely, you'll get random ads, trackers uploading your geo-location constantly to god-knows-where, etc.


> If you don't know what you're doing and install apps from the app store without examining them closely

That's completely orthogonal though, because Play Store is Google's walled garden. What Google allow you to do, in addition to downloading things from their walled garden, is also side-load apps completely separately from the walled garden.

There's no evidence that the ability to side-load causes an increase in malware/crapware.


What would we do if we didn't have apple to protect us from our own ignorance?

Apple is fallible. If developers can break out of app review constraints by using carefully included bugs[1] then the only solution is actively monitoring what the apps are doing. If that's the case, what value-add is Apple's walled garden? Just keeping out crappy programs? I think there are plenty of solutions for discovering what programs are good or not, as we've been working on that problem for decades now for computers specifically, and centuries in the general case.

1: http://www.imore.com/researchers-sneak-malware-app-store-exp...


I think you and the downmodders are shooting the messenger. I do not like walled gardens any more than I like taxes or drunk driving checkpoints. I'm just explaining why they are there and why many users actually prefer them, and I'm doing so to let people know there's a problem worth solving. Pretending the problem doesn't exist doesn't help.

Computer security is just far too confusing and difficult for the non-tech-savvy. With walled gardens people can delegate their security to someone that, while not perfect, is much better at it than they are. For most people this is a huge win. This is what happens to most peoples' computers if they are "open":

https://c1.staticflickr.com/1/99/259360546_504c726d0b.jpg

If someone has a better idea I'm all ears, but I don't know of one that could match the Apple Store for ease of use. Since user experience trumps everything, app stores are winning.


None of the security and privacy arguments require a walled garden though. The real benefit is the fine-grained control over what an application can do on your system: a set of fine-grained controls and a good set of defaults, or better yet the ability to subscribe to defaults as recommended by third parties. If Apple had controls much finer than those presented by Android, with a set of sane defaults to which I could add what the EFF recommends, we would all be better off, because Apple's best interests and my best interests are not always aligned.

I'm going to repeat that last part, because I think it's very important. Apple's best interests and my best interests are not always aligned. Why should they have ultimate control over what can and cannot be run on my phone?


"why many users actually prefer them" - Android is more open and from the numbers it seems users prefer it more then Apple.

But in actuality users don't have much choice right now, even more so in iOS. They would use the defaults.


Drama.

Seriously, stop giving the media so much credit. You can trust the Play store as well as you can trust the App store.


"Here is what we do on our platform. Here is what an unnamed search engine company does, or might in theory do, on their platform. You will probably agree: what we do is better in multiple, obvious ways. To be even less subtle, I am indeed trying to cast our competitor in very bad light."

If they're willing to walk the walk, I don't even mind how transparent this "privacy as a competitive advantage" strategy is.


They have been for years and people wondered why they didn't talk about it because of how it differentiates them from the competition. Now they're talking about it and a common comment here is "they're only doing it because Google makes their money from advertising", well yes that's the point. Kind of funny but predictable.

Having a business incentive that's aligned with privacy is the absolute best thing you could ask for if you want privacy. First because it will ensure they remain committed to it, and also because it shows up the competition and forces them to move in that direction as well.

http://www.apple.com/privacy/approach-to-privacy/

This shows pretty clearly that they "walk the walk".


As long as they have your data and don't give it to anybody else, there is no meaningful difference in privacy based on how they process the data. Why is processing words in emails for search indexing and spam filtering less of a problem for privacy than matching ads to words that appear in emails or figuring out helpful suggestions from them? It isn't, but Apple is happy to pretend it is to the rubes who will pay to use an inferior product.


>They have been for years and people wondered why they didn't talk about it because of how it differentiates them from the competition.

Probably because privacy concerns weren't mainstream until recently, and it would have looked like an excuse because their services weren't up to par. Which, to be frank, is still the case.


In fact, it would be stupid of them NOT to capitalize on this golden opportunity.

The unnamed search engine company simply cannot afford to offer the same level of privacy as Apple is capable of offering. Meanwhile, the only other supermassive company whose business model might have allowed them to focus on privacy has just thrown in the towel and decided to become as privacy-invasive as everyone else, starting with the recent free upgrade to their flagship operating system.

Being a hardware company that also happens to sell some cloud-based tools, rather than the other way around, sure has its perks.


Amen. Who the heck wants their own government spying on them? I'd rather be "less safe" whatever that even means. The government works for us, not the other way around.


> Who the heck wants their own government spying on them?

A depressingly large proportion of the population.


I believe it's more of an education problem, than of people actually welcoming the intrusions.


Most people don't know of any intrusions. I couldn't even tell you if my government has ever intruded on me, and to assume they do is a bit too much paranoia for my day-to-day life.


Apple has admitted to giving the US government information.

So they don't run algorithms over your meta-data; instead they just give your personal info to the government.

I'm aware Google does comply with government requests too, but also attempts to push back and keep the government honest.


Apple also attempts to push back.

"Apple only disclosed content in response to 27% of the total U.S. account requests we received during the period from July 1st, 2014 to June 30th, 2015"

Source: http://www.apple.com/privacy/government-information-requests...


You mean the "spying" program that was revealed to have filters to filter out their own citizens communications?

Seriously, why do people conveniently ignore this little fact? Why would these filters exist, as revealed in CLASSIFIED documents, if the government were interested in spying on its own citizens?

If the government were interested in spying on you, they would just do it. They wouldn't need to build machines to protect your privacy rights.


So they record/log every single phone communication in the US, only to filter out every single phone recording made in the US?

That doesn't make sense. How can you tell who is talking on the phone anyway? And what if a US citizen was talking to a non-citizen?

Regardless, the point is that backdoors - even if used correctly 99% of the time - open up your phone and all the nitty-gritty details of your life to adversaries. IDK how the NSA phone logging worked so I can't really comment on how safe it is. None of us can. That's sort of the problem.


Anything can happen anytime in the future, but this is the discussion about what is actually happening. Maybe a giant seamonster comes up and swallows the earth whole? Who knows.

But, right now, there exists filters to filter out US citizens comms from NSA spying, as revealed in CLASSIFIED documents.

This is something Snowden's friends love to conveniently ignore, because it doesn't fit into their artificially constructed world-vision of the US government as giant evil monsters.


Well it conflicts with what we've heard from multiple sources.

I thought I recalled Obama saying that all meta data was stored. Regardless...

How did they detect when a US citizen was using a phone abroad, or a terrorist using a landline phone in the USA? Identities are not tied to phones.

Furthermore, how secure is the system they used? Can adversaries tap into the NSA database? Is the filtering "hard wired" or is it something that is programmable, and how susceptible is it to bugs, viruses, or hacking? How trustworthy are the people that control this data? Is it possible that anyone who works at the NSA is not trustworthy and might defect? Such as Edward Snowden?

These are important questions. Even with 'filtering' there are a lot of questions to be asked, and I doubt all of them have great answers. The point is, there is a system in place to programmatically spy on any phone conversation at any time, among other things. Even if they use it "for good," that should be very concerning for anyone who knows that programming secure systems is mostly an effort in futility - especially for a government that can't even code up a CRUD healthcare website without issue. And the system is completely closed source, not allowing others to review it.

Which brings us to another issue: why was this secret? Isn't this the kind of thing that should be approved by US citizens? And publicly verified for safety and security? This alone is so concerning.

I'm happy you think "filtering" makes all of these questions a non-issue though.


What difference does that make? Foreigners deserve privacy too.


OK then codify that into a law.

But, right now, the NSA does what is legal, which is to spy on foreign nationals.


Legal is meaningless. What they're doing is wrong.


Or right. Take your pick - all morals are relative.

We decide what matters when its time to write the laws.


They deserve privacy from their own governments. Ours owes them nothing.


Is that true? My understanding is only one party needed to be a non-citizen.


Apple's comments on privacy over the past year have seemed very opportunistic.

>But what they don't want to do, they don't want your email to be read, and then to pick up on keywords in your email and then to use that information to then market you things on a different application that you're using. ...

They don't have the technology/infrastructure to perform machine learning techniques that google employs. So they will say 'we think it's a feature that none of our services interoperate.'


Apple actually has a solid argument when they say "Our users are not our products". Very little of Apple's revenue comes from ads so they really have no need for the infrastructure. However, considering they are one of the largest companies in the world with hundreds of billions in the bank, I seriously doubt it would be very hard for them to setup that sort of infrastructure.


> Very little of Apple's revenue comes from ads so they really have no need for the infrastructure

However, they wish to carefully control the advertising on iOS, if you want to uniquely identify a user for retargeting purposes, well, tracking cookies are evil, but iAd is good!

https://developer.apple.com/iad/my-audiences/iad-my-audience...


Found something near the end of that link:

>In the spring of 2015, iAd plans to provide developers with unique audience insights about the users in your segments, viewable in iAd Workbench—including their distribution by gender, age, geography, and iTunes Preferences. With audience insights, app developers can find new users with similar characteristics to their existing customers, and tailor messages for retargeting campaign

Did they abandon this or not? Seems contrary to their current "we aren't using your data" mantra.


They are definitely using our data, but probably not in the same way. More of a we won't use your data unless it's to improve your experience with our device.


I'm not saying it isn't a solid argument. Their PR people clearly think it is the way to go, judging by how often it has come up.

You're right, it wouldn't be hard, but they aren't doing it. Why? Maybe because they wouldn't be able to use this talking point or maybe they plan to be a hardware/OS company forever and never expand their services? IDK


I would guess that Siri requires some beefy machine learning techniques.


> They don't have the technology/infrastructure to perform machine learning techniques that google employs. So they will say 'we think it's a feature that none of our services interoperate.'

You mean the machine learning techniques Google developed to better market people things?

This is like saying, Sweden doesn't have nuclear capability so it's opportunistic for them to claim they are peaceful. Only states that pour resources into building nuclear weapons can have a legitimate claim to being peaceful.

Look at Apple's approach to privacy for Maps, which uses randomized identifiers and trip segmentation so that Apple cannot reconstruct your trip. Or iMessage, which uses full end-to-end encryption. That's not Apple failing to catch up, that's Apple going out of their way to protect your privacy.
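The trip-segmentation idea described above can be illustrated with a short sketch. This is a hedged toy model, not Apple's actual implementation: a trip's location points are split into chunks, and each chunk is reported under a fresh random identifier so the server has no key with which to chain the segments back into one route. The function name and segment size are invented for the example.

```python
import uuid

# Toy illustration of trip segmentation with randomized identifiers
# (NOT Apple's real protocol): no identifier is shared between
# segments, so the server cannot reconstruct the full trip.

def segment_trip(pings, segment_size=3):
    """Split a list of (lat, lon) points into independently-identified chunks."""
    segments = []
    for i in range(0, len(pings), segment_size):
        segments.append({
            "id": uuid.uuid4().hex,          # fresh random ID per segment
            "points": pings[i:i + segment_size],
        })
    return segments

trip = [(37.33, -122.03), (37.34, -122.02), (37.35, -122.01),
        (37.36, -122.00), (37.37, -121.99)]
segments = segment_trip(trip)

assert len(segments) == 2                                   # 3 points + 2 points
assert len({s["id"] for s in segments}) == len(segments)    # no ID links two segments
assert sum(len(s["points"]) for s in segments) == len(trip) # nothing lost
```

The design trade-off is that the server can still compute aggregate traffic data per segment, but linking segments to a person or to each other requires information the client never sends.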


"the machine learning techniques Google developed to better market people things?"

That's a bit of a cheap swipe. Google's machine learning has been applied to things like spam filtering, image search, and speech recognition.


Why not both? I would consider an email provider that one, didn't run an ad network, and two didn't scan my emails at all a huge feature. Neither Google nor Apple fit both criteria however.

This doesn't have anything to do with the technical capabilities of each company; it has to do with the fact that my interests and Google's interests are not aligned. Google, like every other company, is going to do what benefits them the most, and there's nothing wrong with that. But when what benefits them doesn't benefit me, I have to reconsider using their services.


>They don't have the technology/infrastructure to perform machine learning techniques that google employs.

Pretty sure Siri is machine learning...?

Even if it wasn't, they could just hire a bunch of PhD students, maybe acqui-hire a couple of tech companies with their ludicrous amounts of available capital and bam, they have their machine learning algorithm


Siri's voice recognition is awful compared to Google Now and Cortana. It has refused to recognize my accent ever, to the point where it seems almost racist.

Not that voice recognition products count as machine learning. Google has a growing neural network, Apple is not competing in that market at all.


> interoperate

That's what we're calling online profiling now?


>> interoperate

>That's what we're calling online profiling now?

My Google Now cards on Android say "yes it is". They are driven by email, calendar entries and phone sensors all tied to my profile and it's bloody magical. YMMV.


Didn't Apple support CISA recently? Actions are more important than words.


What's with apple and their super-privacy stance lately? First his letter, and now his interview. Not that I mind it, but I feel like they're expecting something bad to come out soon.


They have no business model based on consumer information. Guess who does? One of their big competitors, Google, whose revenues are 90% made up of advertising, thrives on this data.

Further, privacy is obviously a hot topic, well at least in tech discourse. None of my friends are really bothered by any privacy concerns and couldn't care less if Google mines their email to provide targeted ads, honestly... But to the extent that privacy is meaningful to people, safeguarding it is obviously a benefit and thereby something one can sell, and seeing as there've been so many different scandals people probably are at least somewhat sensitive to privacy-centric marketing.

In short, they're selling a consumer benefit while taking a jab at the competition. Not a bad deal, there are barely any cons. Apple never really got into the advertising game, doesn't need to and unlikely plans to. They've had some ad products and neglected it for a long time, probably in large part because ads are a shitty consumer experience and Apple tries to offer the best experience even if that requires higher consumer prices. And finally it's completely in line with Apple's business models on content in the past, creating walled gardens to push paid content (music, books, news, apps), advertising doesn't really fit into that context.

Consider also that it's obviously in line with Apple being the 'nice guys', who recycle materials, appreciate diversity, the artists who just want to be free blabla. Businesses or governments tracking users doesn't fit that narrative and protecting users from these parties does.

And finally, I do believe they genuinely care without any ulterior motive and are willing to push for the things they care about if it doesn't hurt them (much).


I think they see this as a key differentiator between iOS/iCloud and Google services and they want the public to know about it. In a post-Snowden world, it's worth noting the difference between platforms that prioritize user privacy and platforms that share user data with advertisers in return for additional services.


A business strategy perhaps ? Considering everyone else is in the surveillance-as-a-service monetization model.


Apple is essentially the only major US tech company that doesn't rely heavily on advertising income (and thus on online profiling to target ads effectively), and they've realized that this gives them a moral high horse from which to attack their competitors without actually having to change any of their own habits.


Apple has also failed in their own advertising efforts (iAd) and additionally failed to catch up in the software service business (almost all of the mass market services are monetized via advertising). Their privacy stance is the only move left to them.


I always saw iAd not as a money-maker, but as Apple's thoughts on what advertising should be. That is, it was their way to try to make ads less tacky for their devices — especially noticeable in the initial million-dollar ad sales to high profile brands.

The messaging was all about how slick and polished and pretty they were, and how expensive.

Unfortunately Apple failed to realise banner ads are tacky no matter how much polish you throw at them.


I wouldn't necessarily say they have suddenly realized this; they have been leading by example on user experience for a while and are now just being more public about it.

Few people really understand the problem of having no privacy from the government or other collecting agencies well enough to weigh it when purchasing a device or choosing the OS for their phone, tablet, or laptop. If more people knew how much can be collected about a user, I'm sure Apple products would sell more, so I do agree with your point that it is better to advertise this now.


This popular talking point conveniently ignores that ads are not the only way companies can benefit from spying.

The HN crowd absolutely loves using "analytics" to try and divine ways to make their products more profitable. Microsoft even uses this as an excuse for Win10's "telemetry" spyware: they say it's to improve the product. Yet when Apple spies on its users, suddenly it's ok because Apple isn't an ad company.

Apple is using this topic to attack their competitors, but they are not exactly on the moral high ground.


Tracking crashes, interface pain points, etc. is, in itself, a perfectly legit thing for a developer to do. There's been all kinds of software, open-source and otherwise, that asks for permission to phone home with usage logs for roughly as long as there's been broad internet adoption. Win10 telemetry is hated mainly because they're so dishonest about it: you're encouraged to use default settings on installation, they're very secretive about what info is actually gathered and about what the settings do, and they straight-up lie about some of the things you can supposedly disable.


Neither Amazon nor Microsoft rely heavily on advertising income.

Neither does IBM, HP, Oracle, Cisco, Salesforce, etc.

In fact when it comes to "major US tech companies", either by revenue or by market cap, I can only think of Facebook and Google that make money mainly from advertising income.


Everybody you listed, excepting Amazon, make the majority of their revenue from the enterprise, not the consumer. B2B companies don't need to use privacy as a differentiator, and would gain no advantage by doing so; enterprises want "managed" software that enables them to spy on their employees.

Amazon, the consumer-focused exception, makes their money on physical products not produced by them, and on cloud services—both of which are mostly privacy-neutral as far as third parties go. Amazon knows what you tell Amazon (your shopping history; your AWS API calls) but their business model doesn't involve giving that information to anybody else. They could theoretically take a stance on privacy, but they aren't doing much to enable privacy; just coincidentally avoiding doing anything that negates it.


"Consumer-focused major US tech companies," then. Hell, the logic still applies even if it's only directed at Google, who Apple seems to consider its arch-nemesis.

(And while they may not be specifically selling ads, Amazon's business model is absolutely heavily invested in invasive ad tracking, which was my point. Witness how Amazon products that you look at but don't immediately buy follow you around the Web for fucking weeks.)


They did have a pretty bad incident not all that long ago with private celebrity photos being leaked.


Not Apple's fault: poor security on the celebs' accounts.


> But what they don't want to do, they don't want your email to be read, and then to pick up on keywords in your email and then to use that information to then market you things on a different application that you're using. ...

Sounds like they are taking jabs at one of their biggest competitors, Google.


Google is an advertising company

With Windows 10, MS is now also an advertising company.


The only way to have real privacy when it comes to software is by it being: 1) open source and 2) executable on your own servers. Open source in itself is not enough, but it is a required pre-condition, same thing with the fact that you can run it on your own server. In these aspects, Android is way ahead as there exist projects to replace all the proprietary parts.

http://www.replicant.us/


If you are talking about rights, then you're already in trouble. You do not need to talk about rights until someone has the power to violate them. Governments murder people without heeding the power of their rights, human or otherwise, fundamental or otherwise. The only thing that power answers to is power.

The language of rights does not challenge the fundamental structure of the system.


It's certainly an interesting angle with which to attack Google.

A few pertinent details:

> The government comes to us from time to time, and if they ask in a way that is correct, and has been through the courts as is required, then to the degree that we have information, we give that information.

Pretty standard stuff. They'll basically give up your info if any court orders them to.

> There have been different conversations with the FBI, I think, over time.

> I don't support a back door for any government, ever.

He doesn't support back-doors. Notice he didn't say there isn't a back-door.

As far as I can tell, Apple will 'give up' your information more or less to the same degree Google will.

Google also runs some algorithms over your data to match an ad to you. Not sure why this is so bad, no data has been given to a 3rd source.

This just strikes me as fluff to attack Google.

BTW, both companies' transparency reports (or at least part):

http://www.google.com/transparencyreport/userdatarequests/co...

http://www.apple.com/ca/privacy/government-information-reque...

BTW - if you compare requests to number of total users of each platform, Apple gives up more data. Not that it matters too much, both seem to comply with applicable laws, although Google's transparency report does seem more extensive.

Either way, not sure I can see the practical difference here.


Actually he's repeatedly said in public that there's no backdoors.

http://www.engadget.com/2014/01/24/apples-tim-cook-there-is-...


The practical difference is that heavily marketing in this area can be a good thing for privacy. The more it becomes a "core value", the bigger the backlash from having failed at it publicly. It's a start. Edit: To clarify, I agree with you that at present there are no practical differences.


True. But data on devices is encrypted locally. And Apple supposedly doesn't know the keys/passwords.

However, there was the issue that cloud backups weren't encrypted with users' local keys/passwords. Has that been fixed? If not, he's blowing smoke.


Apple supports CISA though which is pretty much poison to privacy...


>If you're in our News app, and you're reading something, we don't think that in the News app that we should know what you did with us on the Music app — not to trade information from app to app to app to app.

I thought this was the most interesting quote in the article. I would've assumed that all Apple products would be colluding together, but Cook seems to be saying that isn't the case. That's actually a pretty big gift to consumers, but I wonder what it means up top. Does "Apple" get my news and mail info separately, or do the news app and mail app get their content and deal with it independently of mama Apple? It's a great sentiment - and I can't imagine he'd say it in this sort of interview without truth to it - but it's hard to imagine it in play.


The quote specifically says they don't share data between apps, but not that they don't collect data from all of the apps and correlate it behind the scenes. Perhaps it is an overly cynical point of view, but true to the spoken words.


That sounds like classic Jobsian tactic of saying everything Apple doesn't do is bad. Similar to how Jobs said smartphones were a bad idea until he got into the market.


It's undeniable that part of this stance is simply asserting that Apple products are designed with user interests in mind. The implication is that some other companies have different interests in mind, and sure, you can draw your own conclusions about the products of those companies. Yes, drawing this contrast is a marketing opportunity, and it makes no sense for Apple to pass it up, so Tim doesn't.

But I think just as importantly this stance is about girding for a fight against government-mandated back doors. After some noise about this from parts of the US government, some important people in government have started to come to their senses, but that doesn't mean it's over, whether in the US or with other governments around the world.


A tad "holier than thou," pointing a finger at your competitor and, to a large extent, the entire internet economy. It would be nice to dial down the spin/FUD machine, especially when you're the CEO. Ad targeting will and has evolved. Good things happen when you understand the user and their needs - I for one am willing to trade my privacy for USEFUL features and products (to paint this as only ads is an injustice). Just like I'm willing to trade my $s for good devices.


The irony of your comment is that most ad companies play the victim and moralize how blocking ads is akin to murder. Ad targeting has evolved alright, but not in a way that has helped consumers one bit.

It's become an arms race to the bottom, essentially. To coddle ad companies as if they are mom and pop shops that are struggling is not only naive, but dangerous.


Ad tracking has certainly evolved - Google, Facebook, et al have built successful enterprises out of tracking every detail of people's web browsing. But targeting? AdWords still only coughs up ads that coincidentally include a random keyword I was searching for, devoid of context, or attempts to sell me something I've just bought.


"I don't support a back door for any government, ever."

Note that he did not say that no Apple product has a back door.


And?

He also didn't say the camera on iPhones stays off when you're not using them. That must mean they're taking pictures of us and uploading them to Tim Cook's personal sex grotto for him to ogle us, right? I mean, he didn't say they DIDN'T do it, so...

That is to say... perhaps we could keep the paranoia and speculation to things that have a basis in reality?


"We believe in privacy" except when it affects our bottom line. We will totally sell you out if you're in China and 100% allow the government to censor your requests. We say we believe in privacy in the West because dumb people believe us but really were just all about that cashhhhhhh.


The claim that the NSA and the FBI are coming around to the idea that everyone should own their own data and that backdoor-less encryption is fine sounds more naïve than I think Tim Cook can possibly be in real life.


Do you have full control over your iPhone?


I'm at


This is just a plain attack on Google. sigh


Then Google should come back with stronger privacy rules and ensure that their users' information won't be sold to any third party.

Oh wait, they can't because that is the majority of their income.


For so long we've heard "consumers don't care about privacy" and a lot of companies have seen success based on that idea. Now Apple is challenging that status quo, differentiating themselves from their competitors and offering consumers a choice on privacy. I can't see that as anything but good.


> "consumers don't care about privacy"

It's worth pointing out that while this was obviously propaganda (or wishful thinking), we now have some proof that this isn't true:

https://www.asc.upenn.edu/news-events/publications/tradeoff-...


What choice? iOS is a walled garden. The majority of users would not be able to change anything.



