Hacker News

I posted this earlier today, but the current article (from bbc.co.uk) does a poor job covering the issue. In summary, Apple iOS uses a validation system to ensure the Touch ID sensor is not maliciously replaced or modified. The Touch ID sensor has access to the iPhone Security Enclave, where fingerprint data is kept. A malicious sensor could, hypothetically, steal fingerprints from an iPhone user unknowingly. This could be used to unlock the phone and make purchases through Apple Pay without the owner's permission. To prevent this, Apple uses a validation system whenever the Touch ID sensor is repaired. When an iPhone is serviced by an authorised Apple service provider or Apple retail store for changes that affect the Touch ID sensor, the validation pairing is updated. Third-party repairs to the sensor will not update the pairing, and the device will fail validation when using Touch ID. This validation error is shown to users as the mysterious "Error 53".

If the validation fails, the device will function mostly fine, although with Touch ID disabled. However, the device will be prevented from restoring or updating to a new version. Restoring from backup still works. I'm not too sure why restoring or updating is blocked, but my guess is that they want to prevent malicious software from being uploaded in this process.
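A minimal sketch of how such a pairing check might work, purely as an illustration: Apple hasn't published the actual mechanism, so the HMAC scheme, the key names, and the idea of a stored pairing tag are all assumptions here.

```python
# Toy model of a sensor/device pairing check -- NOT Apple's implementation.
# Assumption: an authorised repair records a keyed tag binding the sensor's
# identity to a per-device secret, and updates/restores re-verify that tag.
import hashlib
import hmac


def pairing_tag(sensor_id: bytes, device_key: bytes) -> bytes:
    """Tag recorded when an authorised repair re-pairs the sensor."""
    return hmac.new(device_key, sensor_id, hashlib.sha256).digest()


def validate_sensor(sensor_id: bytes, device_key: bytes, stored_tag: bytes) -> bool:
    """Run at update/restore time: does the installed sensor match the tag?"""
    return hmac.compare_digest(pairing_tag(sensor_id, device_key), stored_tag)


device_key = b"per-device secret"
original = b"sensor-serial-0001"
tag = pairing_tag(original, device_key)  # set during an authorised repair

# The original sensor passes; an unpaired third-party sensor fails
# validation, which in this model is what surfaces as "Error 53".
assert validate_sensor(original, device_key, tag)
assert not validate_sensor(b"third-party-sensor", device_key, tag)
```

Under this model, a third-party repair that swaps the sensor without updating the tag fails the check, which is consistent with the behaviour described above.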

From the Daily Dot article, if a user encounters this error, Apple's current resolution is a full device replacement. It may be overkill, but I don't think Apple expected many people to encounter this issue, so it seems reasonable that they chose this option.

This is a great security feature for users, and I'm really glad Apple engineers considered this situation. Unfortunately the media is blowing this out of proportion and leaving out crucial details about what's happening and the reasoning behind it.

Here is Apple's statement on the matter:

We take customer security very seriously and Error 53 is the result of security checks designed to protect our customers. iOS checks that the Touch ID sensor in your iPhone or iPad correctly matches your device's other components. If iOS finds a mismatch, the check fails and Touch ID, including for Apple Pay use, is disabled. This security measure is necessary to protect your device and prevent a fraudulent Touch ID sensor from being used. If a customer encounters Error 53, we encourage them to contact Apple Support.




What you didn't mention is that this verification only happens on a software upgrade. In the meantime either one of these two things could happen (I don't know which): 1. The sensor could do all the malicious things you mention 2. The sensor is blocked from accessing the security enclave.

The former doesn't seem like a secure solution that one should be really glad of. The latter would also be possible after a software upgrade so there is no need to disable the device completely. In short, they didn't choose a good solution.

Simply disabling a phone at some point well after a repair is just bad.

Edit: The parent post was edited a bit, so my point is now mostly covered. I still don't see a security-related reason to disable the complete device on a software upgrade. Maybe it could enable an attacker to modify the OS somehow in the process. However, I don't agree that this issue is "overblown". This presents a real problem for users that now have an unusable phone. It's important to note that Apple doesn't offer repairs everywhere in the world so many users now can't repair their phone at all.


My mistake, added that part into the parent post. The coverage of this story is all over the place and it's hard to keep track.

You are right though, allowing users to update/restore but disabling the sensor is a better solution. I'm not too sure why Apple chose this route. They haven't commented on the technical reasons behind it so it's hard to say for sure.


> The Touch ID sensor has access to the iPhone Security Enclave, where fingerprint data is kept. A malicious sensor could, hypothetically, steal fingerprints from an iPhone user unknowingly.

No, the CPU reads encrypted data from the sensor and sends them to the SE for decryption and analysis. See the PDF linked here by somebody. What a malicious sensor could do is store user's fingerprint for retrieval by unauthorized parties.
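A toy illustration of this point: if the sensor and the Secure Enclave share a key, the application processor only ever relays ciphertext it cannot read. The XOR "cipher" and the key names below are purely illustrative assumptions, not the real protocol.

```python
# Toy model of the sensor -> CPU -> Secure Enclave path described above.
# Assumption: sensor and enclave share a factory-provisioned key; the CPU
# in the middle just shuttles opaque bytes. The SHA-256 keystream XOR is a
# stand-in for real encryption, chosen only to keep this stdlib-runnable.
import hashlib

SHARED_KEY = b"provisioned-at-factory"  # known to sensor and enclave only


def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR with the keystream (symmetric)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


scan = b"fingerprint scan data"
wire = xor_crypt(scan, SHARED_KEY)  # what the CPU actually sees

assert wire != scan                          # CPU can't read the scan...
assert xor_crypt(wire, SHARED_KEY) == scan   # ...but the enclave can
```

The design point this models is that compromising the CPU (or the OS) shouldn't expose raw fingerprint data; only an endpoint holding the shared key, i.e. the sensor or the enclave, sees plaintext.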

> If the validation fails, the device will function mostly fine, although with Touch ID disabled.

On iOS 8. Once the device is updated to v9, it turns into a brick. Quoting from the OP:

"They repaired the screen and home button, and it worked perfectly." He says he thought no more about it, until he was sent the standard notification by Apple inviting him to install the latest software. He accepted the upgrade, but within seconds the phone was displaying “error 53” and was, in effect, dead. When Olmos (...) took it to an Apple store in London, staff told him there was nothing they could do, and that his phone was now junk. He had to pay £270 for a replacement and is furious.

> If a customer encounters Error 53, we encourage them to contact Apple Support.

This may be a media-friendly euphemism for "it's dead", unless the staff at that London store were clueless.


> What a malicious sensor could do is store user's fingerprint for retrieval by unauthorized parties.

Of course, taking advantage of the exploit in question requires the phone to be stolen by an extremely sophisticated (if not state-level) bad guy, altered by installation of a malicious sensor that has never been documented to exist in the wild, then recovered by the owner, and then stolen again at a later date. All to acquire personal biometric data that could just as easily be obtained with a piece of Scotch tape.

A simple application of Occam's Razor suggests that Error 53 isn't a "security feature" at all, it's just Apple being a rent-seeking asshole.


> *A simple application of Occam's Razor suggests that Error 53 isn't a "security feature" at all, it's just Apple being a rent-seeking asshole.*

I don't think you understand Occam's Razor.


There's always more to learn. What am I missing?


Occam's Razor says that you should select the hypothesis with the fewest assumptions. Saying Apple is a "rent-seeking asshole" assumes that Apple did this maliciously, which is a huge ball of assumptions when they've literally put out a security paper[1] on how Touch ID and the Secure Enclave work.

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf


Does that document explain Apple's motivation for bricking phones that never had the fingerprint reader enabled, and that didn't even use traditional lock passwords?

No?

Well, then it makes sense to look elsewhere for that motivation. Additional data appears to be needed. Lacking such data, assumptions are all we have.


Possibly you're stretching for Hanlon's Razor, except you've got the wrong end of it. Hanlon's is the one that says "never assign to malice what can be assigned to stupidity". This feels like a screwup. The reason why it happens is consistent with security. But the effects are maddening.


Looks like you got this one right, and I'm glad to admit it.


Could be. At least your assumption, unlike mine, is testable. If it's an unintentional bug, the policy behind "Error 53" will become more consumer-friendly in an upcoming iOS update. If it doesn't... well, Occam wins the day.


Today's state-level agency is tomorrow's small town police department, and next week's street criminal. Attacks only get better.

Also, Apple is not only designing phones for you and me, but for businesses who nowadays are the target of state-level security agencies.

Clearly Apple fudged the implementation of this feature, and it's a PR nightmare, but all evidence points to their intentions being genuine.


Exactly this.

Do you really think Apple hasn't been shocked/annoyed at how China, the US, et al have actively tried to hack their customers, including attempts to compromise Apple's own servers?


What if the "owner" of the device, i.e. the person who paid her hard-earned wages to "own" it, is not interested in using the "TouchID" feature?

Is there an untapped niche for a similarly-sized single-board computer (not several computers, a baseband processor, a SIM card that runs code, etc. jammed into a hermetically sealed casing worthy of being in the Museum of Modern Art) that just does simpler, "boring", useless things like send and receive packets, and read, write and store files?

A pocket-sized computer that a user can not just rent but _pwn_, that does not make gratuitous network connections to some "mothership" or allow for easy monitoring and remote control by a third party, and that does not brick itself if its casing is opened. A computer that can be, to the fullest extent possible via open source software, controlled entirely by the user.

Who knows, maybe a person could turn such a small computer into "a device for sending and receiving text, images, sound and video over a network, such as the internet."

Nah, there would be no use for such a thing. Only a device _that can handle payments_ is worth using to send and receive text, images, audio and video.

What is the point of a device used to communicate, e.g., a "phone", if it cannot also be used to spend money?


> A pocket-sized computer that a user can not just rent but _pwn_

Then don't buy an iPhone. Simple. You and the other dozen people on the planet will surely be missed.

The rest of us absolutely want Apple to be as aggressive about security/privacy as they possibly can be. Especially with even moderate countries e.g. UK, Australia being equally aggressive about invading privacy.


Looks like we got a fanboy.


When people visit countries and come back to find the screws on their laptops removed, they don't need to be "fanboys" to want more security.


> A malicious sensor could, hypothetically, steal fingerprints from an iPhone user unknowingly. This could be used to unlock the phone and make purchases through Apple Pay without the owner's permission.

Why in the hell would anyone bother with this, if it's trivial to get a person's fingerprints and reproduce them to unlock the device? [1] Even if you lack Touch ID, the device is still encrypted by the PIN and functions (and is secure) normally without it.

Either it's really over-engineered, or it is what it is: a scare tactic to bring people to Apple repair centers.

I wish they'd rather use this media attention to inform the public that fingerprint authentication isn't there for security first, but for convenience. Apple Pay would function just fine without it. But would it have its appeal of easy payment? Probably not.

[1] https://www.youtube.com/watch?v=2u4ZLGsw1zo


I'm not sure it was over-engineered. Recall when Touch ID was introduced there was huge media backlash: Apple is stealing our fingerprints, how do we know there isn't an NSA backdoor to the fingerprint storage, and so on.

The Secure Enclave system was set up exactly to counter those concerns.

Interestingly, when other phone vendors later implemented fingerprint unlocking there was far less outrage. Even when the fingerprint images themselves were found as unencrypted raster images on device storage.


I think it is definitely over-engineered. If it is a scare tactic to bring people to Apple repair centers why isn't this happening with other Apple products?


Oh you mean apart from all the glue and solder, non-replaceable batteries and the like?


Not sure if I understood you correctly, but replacing the batteries yourself in your MacBook definitely voids the warranty but does not brick the device.


Wouldn't it be more logical to simply disable the Touch ID functionality and treat it like a pre-Touch ID button when it's not replaced by Apple with an OEM part?


From the Daily Dot article, it looks like what you described is what happens (TouchID is disabled) and the phone will function mostly fine, but the author also encountered other bugs possibly related to the issue. However, the bigger problem was whenever he tried to restore or update the device, Error 53 occurred and it failed. Restore from backup still worked though. So essentially, his iPhone was locked to that iOS version and could not be modified.

http://www.dailydot.com/technology/what-is-error-53-iphone/


I don't think that would solve the underlying problem of the Touch ID sensor being compromised. The device would potentially be vulnerable to software-based attacks that re-enabled the compromised Touch ID sensor.


Why? Then the user is left guessing why their phone is acting like it has no TouchID when it does.


Would you rather guess why Touch ID isn't working or have a completely unusable phone?


If you get your phone repaired and TouchID no longer works when you get it back then it's not going to take a genius to put two and two together and figure out that it might be related to the repair.


It's a well-known issue, and every independent repairman will warn you before replacing this part, or else they would quickly get sued out of existence.


We don't know the technical details behind it. For all we know, the button might be like one of those Thunderbolt ports that have direct memory access and can alter the firmware during a software upgrade.


Don't worry. If there's any bullshit in the article, I'm sure Apple will help us discern which parts are true and which parts are false, the expensive way. Every time.


Is "enclave" a standard security term? Does Apple publish technical details on what sort of security Touch ID uses?




