The limitations of Android N Encryption (cryptographyengineering.com)
192 points by razer6 on Nov 24, 2016 | 29 comments



I only worked on the ext4 encryption feature in the Linux kernel that was used to implement Android FBE, so I'm not an expert on vold, but there shouldn't be a reason why vold needs to hang on to the key. Vold is responsible for pushing the keys into the kernel, but what ext4 uses to do the per-file encryption is stored on a kernel keyring, not in userspace. So vold should be destroying the key after it loads it into the kernel --- it certainly can't use the key for anything useful.
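A toy sketch of the lifecycle described above. The real vold pushes fscrypt "logon" keys into the kernel with the Linux add_key(2) syscall; here a plain dict stands in for the kernel keyring, and the descriptor string is made up, so only the "load, then wipe the userspace copy" pattern is real:

```python
import os

kernel_keyring = {}  # stand-in for the kernel keyring ext4 actually uses

def load_key_and_wipe(descriptor: str, key: bytearray) -> None:
    """Push a copy of the key 'into the kernel', then zero the userspace copy.

    Simplification: real code uses add_key(2); the dict just makes the
    ownership handoff visible.
    """
    kernel_keyring[descriptor] = bytes(key)   # kernel now owns its own copy
    for i in range(len(key)):                 # best-effort wipe in userspace
        key[i] = 0

key = bytearray(os.urandom(32))
load_key_and_wipe("ext4:0123456789abcdef", key)
assert all(b == 0 for b in key)               # userspace copy is gone
```

After this point userspace holds nothing useful; anything that needs the key goes through the kernel's per-file encryption machinery.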

As far as being able to remove the keys while the phone is locked --- I can't comment on future product directions, but it's fair to say that the people working on the upper layers of the Android security system aren't idiots, and the limitations of what landed in the Android N release were known to them. It is no worse than what we had in older versions of Android (with FDE, we were using dm-crypt, and the keys lived in the kernel as well), and while it doesn't help improve security with respect to the "evil maid" attack, it does significantly improve the user experience.

I agree that there is certainly room to improve, and note that future changes will require applications to make changes in how they access files, which does make it a much trickier change to introduce into the ecosystem without breaking backwards compatibility with existing applications.

As far as why different keys are being used for different profiles --- this makes it much easier to securely remove a corporate "work" profile from a phone without needing to do a secure erase on the entire flash device, either because an employee is leaving their employer and there is corp data on a BYOD phone/tablet, or because the phone has been lost. Corporate security policy about how quickly corp data should be zapped from the phone might differ from what the user wants when they have temporarily misplaced the phone. (E.g., you might weigh losing unbacked-up personal files against their getting compromised, while hoping someone turns your phone in to lost+found, differently than your company's security policies require.)
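The crypto-erase idea above can be sketched in a few lines. This is a deliberately toy cipher (a SHA-256 counter keystream, nothing like real fscrypt) just to show why destroying one profile's key removes that profile's data without touching the rest of the flash:

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode stream; illustration only, not real crypto."""
    out = bytearray()
    for off in range(0, len(data), 32):
        pad = hashlib.sha256(key + off.to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[off:off + 32], pad))
    return bytes(out)

profile_keys = {"personal": os.urandom(32), "work": os.urandom(32)}
work_file = keystream_xor(profile_keys["work"], b"quarterly numbers")
assert keystream_xor(profile_keys["work"], work_file) == b"quarterly numbers"

# "Securely removing" the work profile = destroying just its key.
del profile_keys["work"]

# The ciphertext still sits in flash, but without the key it is unreadable;
# personal data is untouched and no full-device erase was needed.
assert "work" not in profile_keys
assert "personal" in profile_keys
```

The same reasoning is why per-file (and per-profile) keys make selective wipe cheap compared with whole-disk FDE.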


Actually, that should be "cold boot" attack, not "evil maid" attack.


Looking at the code, it seems to keep the keys in a map called s_ce_keys, though there is a comment referencing b/26948053 which presumably is a bug to stop it doing that.


I know that Apple made a lot of controversial moves in the past years, but I still get the impression that Apple takes privacy seriously and tries to respect it even when there is a fundamental need for data collection [1].

[1] https://backchannel.com/an-exclusive-look-at-how-ai-and-mach...


Absolutely, and the commitment to security and privacy keeps me coming back to iOS. The iOS security white paper[1] is an excellent read for anyone interested in security. Unlike Android devices, iOS security is coupled to a hardware HSM, which Apple refers to as the Secure Enclave. It's used not just to encrypt and decrypt, but to verify the entire boot process and all binaries used by the cellular chips.

Now that Google is manufacturing devices, I expect them to begin following Apple's example. Avoid garbage like TrustZone and do it right.

1. https://www.apple.com/business/docs/iOS_Security_Guide.pdf


"Garbage"? Can you explain why?


There are very good reasons why Apple doesn't trust it and does something different. When so much of the phone's security relies on it, it's very important to get it right.

https://www.google.com/amp/www.slashgear.com/android-soc-sec...


The "trustlets" that run in the TrustZone have enhanced privileges and must be securely written. Once crap like DRM ends up in there, as it invariably does, it becomes actively harmful.

http://bits-please.blogspot.com/2016/05/qsee-privilege-escal...


It should be mentioned that Matthew Green linked the above blog entry in the article, so if you didn't read either post, you should.


The design and implementation surface area for TrustZone is just too big to be secure.

Protecting a few secrets is a bit different from a quasi-hypervisor environment that could be used to implement such a scheme.


For a really in-depth dive about the iOS class keys system he mentions I recommend this 2016 blackhat talk by Ivan Krstic (head of security at Apple): https://www.youtube.com/watch?v=BLGFriOKz6U (relevant portion starts at 6:56)


Given that app developers need to actively use those iOS features, the obvious question is: How much of my app data is actually encrypted that way?

It's much harder to get these things right (e.g. you want your fancy sync and notifications to work even if the phone is locked, etc.) thus I suspect that most apps would just use the Protected Until First User Authentication class and that's about it. Is there any way to confirm or refute that?
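The distinction between the classes mentioned above can be modeled in a few lines. The class names below are Apple's, but the behavior is a simplification (real class keys are wrapped by the Secure Enclave, not held in a Python dict):

```python
import os

class DeviceModel:
    """Toy model of two iOS data-protection classes."""

    def __init__(self):
        self._class_keys = {}

    def unlock(self):
        # On unlock, class keys become available to the OS.
        self._class_keys["Complete"] = os.urandom(32)
        self._class_keys["AfterFirstUnlock"] = os.urandom(32)

    def lock(self):
        # Complete-class keys are evicted on lock;
        # AfterFirstUnlock keys survive until reboot.
        self._class_keys.pop("Complete", None)

    def can_read(self, protection_class: str) -> bool:
        return protection_class in self._class_keys

d = DeviceModel()
d.unlock()
d.lock()
assert not d.can_read("Complete")        # unavailable while locked
assert d.can_read("AfterFirstUnlock")    # sync/notifications keep working
```

This is exactly the usability trade-off in the comment: apps that want background sync and notifications on a locked phone gravitate to the weaker AfterFirstUnlock class.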


In fact, that is indeed the default protection level, which is strong enough for a default setting.


Sure, but it's not any better than Android FDE, which kind of defeats the point of the article. If everyone just uses the default setting, then iOS's encryption is theoretically and architecturally better, but it won't make a difference in practice.


It's a compromise between security and usability, as always. And it is good for most cases. When higher security is needed, a developer can choose to override the default and implement a different mechanism.


Still reading, but in case anyone is interested in the grugq's hardening project (Ironsides) that Matthew referred to, it started as an open source project. I believe I have the fork with the most recent public commits, as I was playing with it a few days before it went away:

https://github.com/patcon/darkmatter

Unfortunately, the wiki was one of the most interesting parts, as I remember, and that wasn't preserved.

Disclaimer: Goes without saying, but VERY stale code, and light-years behind the current project.

EDIT: some interesting things I remember from the wiki: triggers on losing a wifi or bluetooth signal (so actions could fire if the phone was placed in a faraday bag and lost touch with a paired bluetooth device or mobile hotspot), and a trigger for when the temperature dropped (so actions could fire if someone tried to pull a cold boot attack).
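The triggers described from the wiki amount to comparing consecutive sensor readings. A sketch, with trigger names and thresholds invented here (the original project's actual rules are gone with the wiki):

```python
def check_triggers(prev: dict, cur: dict) -> list:
    """Return which hypothetical panic actions fire, given sensor readings
    before and after. Action strings and the 15-degree threshold are made up."""
    actions = []
    if prev["bt_paired"] and not cur["bt_paired"]:
        actions.append("lost bluetooth pairing: wipe keys")   # faraday bag?
    if prev["wifi"] and not cur["wifi"]:
        actions.append("lost wifi: lock device")
    if cur["temp_c"] < prev["temp_c"] - 15:
        actions.append("temperature drop: shut down")         # cold boot attempt?
    return actions

before = {"bt_paired": True, "wifi": True, "temp_c": 25}
after  = {"bt_paired": False, "wifi": True, "temp_c": 5}
assert check_triggers(before, after) == [
    "lost bluetooth pairing: wipe keys",
    "temperature drop: shut down",
]
```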


Geez, as much as I'd love to divorce myself from the Apple ecosystem, I continually find reasons to keep my iPhone instead of opting for Android.


I'm in the same boat. Due to Apple's most recent product decisions I'm losing confidence and I'm starting to look elsewhere. After reading up on the current state of Android and especially how to increase privacy and security, it seems I'll have to stick with Apple for a while longer.


Yeah, I need a new phone (screen dying, probable battery swelling) and would like a newer macbook, but the latest Apple offerings (and prices) on both fronts drove me to look outside the bubble for the first time in a while.

I can say the competition has gotten good enough that it's only obviously worse when you look a bit closer than a quick glance at owner sentiment and specs. That's the best I can say. And those are the handful of stand-out devices—the overwhelming bulk is still trash.

So: Apple's stumbled really hard, but the competition's still basically a whole different product category, so I guess I'll just wait and see what things look like in 2017 :-/


I'm coming to the same conclusion. Despite having been a happy Sony Xperia user for years now, I think my next phone is an iPhone.

Which kind of sucks, because I don't want to spend eight trillion dollars on a phone, but...


Slightly o/t; my 2008 macbook is on its last legs. Are ThinkPads still sending metrics back to Lenovo HQ? I simply can't justify buying the new macbooks.


The consumer-grade ThinkPads did that for a short while, but not anymore. If you buy yourself an X or T series, new or (even better) refurbished, it will last for years and you can service it yourself. Wonderful machines.


I had a Moto X Play where a bug ended up totally disabling the lock screen. You would turn on the device and land on the home screen even though you had set an unlock pattern.

This bug was triggered by using Google Drive while the internal storage was basically full. The day after that the phone got stolen, so I couldn't take a closer look. :(


Maybe the thief filled your Google Drive first :o


I actually wasn't sure if I should even post the Google Drive part, because theoretically this might even be exploitable in some way. I also found other people online with the same bug, so it really was a thing.


This article makes a lot of assumptions that seem to be based on how he thinks things work instead of providing proof of how they work. He states that the keys "seem" to live forever in userspace RAM, yet provides no proof of this.


The code is here. It's pretty explicit. https://android.googlesource.com/platform/system/vold/+/andr...

Here's an email from Paul Crowley to me. Paul is one of the devs on this code: https://twitter.com/ciphergoth/status/801220048622714880?lan...

Not really sure what else I can offer you.


Also, I want to clarify that I don't think the weaknesses of Android N's encryption are the fault of Paul or anyone else on the team. I think this is really a question of corporate priorities. If Google wanted to put a lot of energy behind Android's device encryption, my suspicion is that they would be much farther along than they are now.


Relying solely upon a phone's built-in encryption is much like relying solely upon a condom to prevent pregnancy.

It's only when multiple steps are taken that a method of prevention is even remotely reliable.



