ASLR is a band-aid. If you need it, your system is already insecure. It's just that the attacker may need to crash your system a few times before they get in.
64-bit ASLR is not a bandaid. There are definitely ASLR approaches that don't have enough entropy, but that doesn't mean ASLR as a whole is unworkable.
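If you want to see what that entropy looks like in practice, here's a minimal sketch (Rust, but any language that builds a PIE behaves the same): run it a couple of times on a 64-bit Linux machine with ASLR enabled and the printed addresses move around each run. That moving target is exactly what an attacker has to guess.

```rust
// Minimal sketch: observe ASLR by printing a few addresses.
// Build and run this twice; with ASLR enabled and the binary built as a
// PIE (the default for modern toolchains), the addresses differ per run.
fn main() {
    let on_stack = 0u64;
    let on_heap = Box::new(0u64);

    println!("stack: {:p}", &on_stack);    // stack randomization
    println!("heap:  {:p}", &*on_heap);    // heap/mmap randomization
    println!("code:  {:p}", main as fn()); // PIE text randomization
}
```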
No, they need tech that either contains the attack in its own partition or prevents it entirely by language/compiler-level action on the target. Both exist in academia and the commercial sector with varying capabilities, prices, maturity levels, and so on. Most such things are rejected in favor of band-aids like ASLR.
And the systems continue to get hacked through the very holes covered in bandaids. As he said, if you're using a bandaid, you're covering up something inherently broken.
That's not a practical solution. Sure - you could write super-secure (Ada-style?) code in a verified environment (?), running on a verified kernel (seL4?), on secure hardware (got any ideas how to solve rowhammer?). Realistically though - nobody does that (in a product we can buy). Producing any application in that kind of environment would be too expensive and not possible for most companies. We don't even have secure hardware available. Academia will experiment with that. Some industries will care enough to apply it.
But in mass-produced software/hardware? Realistically, my choice for a productive desktop is OSX/Win/Lin. We can talk about cool, perfect solutions for a very long time. In the meantime I'm making sure my apps are running with ASLR. I hope you're not actually advising people not to use it just because some ideal solution may be on the horizon, one that doesn't run any of the apps they need?
Whoa there. There's an entire spectrum of options in between "ASLR" and "formally verified everything" that defend against memory-safety-related RCE. For instance, writing in a memory-safe, high-level language where reasonable (which is in fact not only practical, it's what Android does).
(That's not to say ASLR isn't great as a way to harden the C and C++ code at the core levels of the system, of course. Daniel Micay's work here is very solid.)
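To make the memory-safe-language point concrete, a toy sketch (Rust here, but a managed language like Java behaves the same way): the out-of-bounds write below is a guaranteed, deterministic runtime error at the point of the access, not a silent scribble over neighboring memory that an attacker can try to turn into code execution. The index value is just an illustrative stand-in for untrusted input.

```rust
// Toy sketch: in a memory-safe language an out-of-bounds write is caught
// deterministically at the point of the access, rather than corrupting
// whatever happens to live next to the buffer.
fn main() {
    let mut buf = vec![0u8; 16];
    let index = 64; // imagine this came from untrusted input
    buf[index] = 0x41; // panics with "index out of bounds", no corruption
}
```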
Yeah, Android pushes memory safety quite hard. Most code in the ecosystem is written in memory-safe languages (Java and friends). That still leaves the entire kernel and lots of performance-critical or legacy code. Languages like Rust could reduce the amount of memory-unsafe code on the platform, but there's still going to be a lot left over, even if it's mostly contained in the language runtime and the low-level libraries.
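As a rough sketch of what "contained in the low-level libraries" means in practice: the unsafe, pointer-level work sits behind a small safe API, and everything layered above it stays in the checked subset. The function below is made up for illustration, not taken from any real platform code.

```rust
// Sketch: unsafe code confined behind a safe interface, the pattern that
// lets the rest of a codebase stay in the memory-safe subset.
fn read_u32_le(buf: &[u8], offset: usize) -> Option<u32> {
    // Bounds check up front; only then touch raw memory.
    if offset.checked_add(4)? > buf.len() {
        return None;
    }
    // The unsafe part is small, local, and justified by the check above.
    let raw = unsafe { (buf.as_ptr().add(offset) as *const u32).read_unaligned() };
    Some(u32::from_le(raw))
}

fn main() {
    let data = [0x78, 0x56, 0x34, 0x12, 0xff];
    assert_eq!(read_u32_le(&data, 0), Some(0x1234_5678));
    assert_eq!(read_u32_le(&data, 2), None); // would run past the end
}
```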
Despite Android's usage of Java, most vulnerabilities are memory corruption bugs. It makes sense to focus on those since they're low-hanging fruit. High-level security/privacy changes are much more subjective and usually have a perceptible impact on users. Hardening the base system is invisible, and that's a good thing.
Android already does an amazing job at the access control level via very locked down SELinux policies. There's a lot of work to do there, but it involves making changes that are going to make some Android developers/users unhappy. For example, `hidepid=2` made it into Android N from CopperheadOS and there's going to be fallout from that: https://code.google.com/p/android/issues/detail?id=205565. I think Google will end up shipping it, but it's not a sure thing.
I was referring to "prevents it entirely by language/compiler-level action on the target". I understand there's a whole spectrum of prevention and mitigation. But "prevents it entirely" is an extreme, just as "formally verified everything" is an extreme.
I'm just ticked off by people lately repeating that ASLR is a bandaid, like it's a bad thing. It's a bandaid, but it can still crash-instead-of-own your app/system with 99.XX% probability. Why complain about it being accepted rather than say: "great, we're nowhere near secure, but at least we have something that works most of the time; now we can work on better protection"? Safe runtimes can fail too (CVE-2015-3837 / serialization bug).
Basically, if anyone reads threads like this and thinks "it's a bandaid, it's not needed / it doesn't protect me", then we're all worse off.
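To put a rough number on the crash-instead-of-own point: with n bits of randomization, a single blind guess lands with probability 2^-n, and every miss is typically a crash rather than a compromise. A back-of-envelope sketch (the 28-bit figure is purely illustrative, not a claim about any particular OS):

```rust
// Back-of-envelope: chance that k blind guesses defeat n bits of ASLR
// entropy, where every wrong guess typically crashes the target.
// The 28-bit figure is illustrative, not a claim about any specific OS.
fn p_success(entropy_bits: i32, guesses: u64) -> f64 {
    let p_one = 0.5f64.powi(entropy_bits);
    1.0 - (1.0 - p_one).powf(guesses as f64)
}

fn main() {
    for &k in &[1u64, 1_000, 1_000_000] {
        println!("28 bits, {:>9} guesses -> {:.5}% success", k, 100.0 * p_success(28, k));
    }
}
```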
It's a bandaid because it covers up instead of fixing the root problems. Getting something through SoftBound+CETS will stop almost all the memory errors because it tries to fix the cause. Same with pcwalton's Rust. Then there are solutions that leave all the problems in place while trying to counter the results of an exploit in a "maybe it will work" way, and those are often bypassed. World of difference.
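For anyone unfamiliar with it, the SoftBound-style "fix the cause" approach boils down to carrying base/bound metadata with every pointer and checking it on every access, rather than hoping a randomized layout makes the corruption unexploitable. A toy model of the shape of that check (the real thing is a compiler pass over C, not something you write by hand):

```rust
// Toy model of the SoftBound idea: a pointer that carries its bounds and
// checks every access. Real SoftBound+CETS instruments C in the compiler;
// this is only meant to show the shape of the check it inserts.
struct BoundedPtr<'a> {
    data: &'a [u8],
    cursor: usize,
}

impl<'a> BoundedPtr<'a> {
    fn new(data: &'a [u8]) -> Self {
        BoundedPtr { data, cursor: 0 }
    }

    fn advance(&mut self, by: usize) {
        // Pointer arithmetic is allowed to wander out of range...
        self.cursor += by;
    }

    fn load(&self) -> u8 {
        // ...but every dereference is checked against the bounds.
        assert!(self.cursor < self.data.len(), "spatial memory error caught");
        self.data[self.cursor]
    }
}

fn main() {
    let buf = [1u8, 2, 3, 4];
    let mut p = BoundedPtr::new(&buf);
    p.advance(2);
    println!("in bounds: {}", p.load()); // prints 3
    p.advance(10);
    p.load(); // aborts here instead of reading past the buffer
}
```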
Note that using bandaids is A Good Thing if you have something broken already. It's just best to avoid what causes the breaks where possible and look for prevention measures. Our industry loves bandaids while systematically ignoring stuff that negates the need for them. So, I call out that problem, but that doesn't mean someone shouldn't use ASLR if it's the best bandaid they have.
I'm talking about things as simple as Code-Pointer Integrity, common tools recoded in a safer language, or app-level sandboxing with or without microkernels. People rarely use strong stuff even if it's a straightforward download, recompile, or configuration. Hell, most won't use protected messaging when it's as easy as Signal. It's a demand-driven problem largely about convenience and access to insecure apps.
Btw, solutions like OKL4 already exist and are fielded with Android and other OS support. Android hardening tech also exists. Cryptophones also exist. These aren't perfect, future tech so much as existing tech that companies and FOSS developers mostly ignore. With the exception of Blackberry, which tried something decent by integrating QNX, with stellar results.
So where does a technical end-user find a relatively secure solution for a phone/small tablet, even if they have to pay a little for it - even $500 extra?
> With the exception of Blackberry, which tried something decent by integrating QNX, with stellar results.
Are you saying Blackberry 10 is significantly more secure than Android and iOS?
I'm not sure if you can get a secure solution at that price. The more secure systems simultaneously have high development costs and almost no buyers. This means they're usually OEM licenses for custom work instead of mass-market products. So, the trick would be a smart group of people licensing OKL4 or something similar, then putting it and a hardened Android on a specific phone.
As far as Blackberry goes, no, I'm not saying it's more secure. I'm saying using the QNX OS made it more secure, reliable, and responsive than it was. That's because of QNX's great design.