> Is it reasonable to describe these single points of failure as backdoors?

Yes. The case Apple is making is, in part, that the FBI is forcing them to create a back door and put Apple's stamp of approval on it with their signing key. Whether or not you agree that Apple must create the back door, they are still being asked to approve its use. Apple argues this is compelled speech and violates the First Amendment. Their brief has more details; the Techdirt summary covers it most thoroughly [1]. Here's one notable passage:

“The government asks this Court to command Apple to write software that will neutralize safety features that Apple has built into the iPhone in response to consumer privacy concerns... This amounts to compelled speech and viewpoint discrimination in violation of the First Amendment.”

> I think many people might argue that industry-standard systems for ensuring software update authenticity do not qualify as backdoors, perhaps because their existence is not secret or hidden in any way.

It's relative. The existence of those systems used to be somewhat hidden from ordinary users; now it's very much out in the open.
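
For anyone unfamiliar with how those update-authenticity systems work, here's a minimal sketch of the check a device performs before installing an update (Python, using the "cryptography" package; the function and variable names are made up for illustration). The device accepts only code carrying a valid signature from the vendor's key, which is exactly why the vendor's signing key is the chokepoint in this case.

    # Sketch: device-side verification of a signed software update.
    # Assumes an Ed25519 vendor key; uses the Python "cryptography" package.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def verify_update(vendor_key_bytes: bytes, update_blob: bytes, signature: bytes) -> bool:
        """Return True only if the signature over update_blob checks out under the vendor key."""
        vendor_key = ed25519.Ed25519PublicKey.from_public_bytes(vendor_key_bytes)
        try:
            vendor_key.verify(signature, update_blob)  # raises if the update was tampered with
            return True
        except InvalidSignature:
            return False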

> Having access to a "secure golden key" could be quite dangerous if sufficiently motivated people decide that they want access to it.

Yeah. So let's not compromise Apple's existing security procedures by forcing that key out into the open.

> I expect that in the not-too-distant future, for many applications at least, attackers wishing to perform targeted malicious updates will be unable to do so without compromising a multitude of keys held by many people in many different legal jurisdictions.

I hope this day comes soon. For now, let's continue fighting for our right to privacy.
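
As a rough illustration of what that future might look like, here's a sketch (again Python with the "cryptography" package; all names are hypothetical) of an updater that refuses to install a build unless at least k of n independently held keys have signed the exact same bytes. Real deployments would more likely use proper threshold signatures, but a simple k-of-n check captures the idea: no order compelling a single key holder is enough on its own.

    # Sketch: accept an update only if enough independent signers approved it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def count_valid_signatures(update_blob: bytes,
                               signatures: dict,          # signer id -> signature bytes
                               trusted_keys: dict) -> int:  # signer id -> public key bytes
        """Count distinct trusted signers with a valid signature over this exact blob."""
        valid = 0
        for signer_id, signature in signatures.items():
            key_bytes = trusted_keys.get(signer_id)
            if key_bytes is None:
                continue  # unknown signer, ignore
            key = ed25519.Ed25519PublicKey.from_public_bytes(key_bytes)
            try:
                key.verify(signature, update_blob)
                valid += 1
            except InvalidSignature:
                pass  # a bad signature does not count
        return valid

    def accept_update(update_blob: bytes, signatures: dict, trusted_keys: dict,
                      threshold: int = 3) -> bool:
        """Install only if at least `threshold` independent key holders signed the build."""
        return count_valid_signatures(update_blob, signatures, trusted_keys) >= threshold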

[1] https://www.techdirt.com/articles/20160225/15240333713/we-re...



