Hacker News
Using Rowhammer bitflips to root Android phones (arstechnica.com)
279 points by ryanlol on Oct 23, 2016 | 141 comments



Perhaps we'll finally start being able to get mobile devices with ECC memory, now that it's useful for preventing the nominal owners of devices from having actual control or visibility into their operation.


If you want to leverage Rowhammer to get ECC everywhere, don't root phones; figure out how to crack into a media stream you shouldn't have access to, like a Netflix stream or a Blu-ray rip, which may very well incorporate rooting phones. But that'll get the industry forces screaming for ECC everywhere (and per other threads, whatever else it takes to make hardware proof against Rowhammer) far more than merely rooting, and they'll scream for it for all hardware. No point in protecting Blu-rays if the cheapest hardware is what rips the most easily.

Proofing hardware against Rowhammer is still necessary even if it does happen to help DRM. It is not possible to write secure code of any kind in an environment where Rowhammer can be performed. The casual dismissiveness the industry seems to have about this problem really surprises me... per the article we all just read, this at least threatens to become a Heartbleed-class problem at some point, but will make Heartbleed remediation look like a walk in the park by comparison. Hardware diversity generally protects us, but there are places where there isn't as much diversity, such as phones, where this could become a massive problem.


Hey, don't Blu-ray players have web access?

Rowhammer via JavaScript is already a thing.

See where I'm going with this?


Does ECC memory make this harder to pull off? I was under the impression that ECC memory does not mitigate rowhammer although it may make it harder[1].

> Tests show that simple ECC solutions, providing single-error correction and double-error detection (SECDED) capabilities, are not able to correct or detect all observed disturbance errors because some of them include more than two flipped bits per memory word.

[1] https://en.wikipedia.org/wiki/Row_hammer#Mitigation
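
To make the quoted limitation concrete, here is a toy C sketch of an extended Hamming(8,4) SECDED code (just the textbook construction, not any real controller's implementation): one flipped bit is corrected, two are detected, but three flipped bits in the same word can be silently mis-corrected into wrong data.

  /* Toy extended Hamming(8,4) SECDED demo: 4 data bits, 3 Hamming parity
   * bits plus one overall parity bit. Illustrative only. */
  #include <stdio.h>
  #include <stdint.h>

  /* Layout: bit 0 = overall parity, bits 1,2,4 = Hamming parity, bits 3,5,6,7 = data. */
  static uint8_t encode(uint8_t d) {
      uint8_t d0 = d & 1, d1 = (d >> 1) & 1, d2 = (d >> 2) & 1, d3 = (d >> 3) & 1;
      uint8_t c = (uint8_t)((d0 << 3) | (d1 << 5) | (d2 << 6) | (d3 << 7));
      c |= (uint8_t)((d0 ^ d1 ^ d3) << 1);      /* covers positions 3,5,7 */
      c |= (uint8_t)((d0 ^ d2 ^ d3) << 2);      /* covers positions 3,6,7 */
      c |= (uint8_t)((d1 ^ d2 ^ d3) << 4);      /* covers positions 5,6,7 */
      uint8_t p = 0;
      for (int i = 1; i < 8; i++) p ^= (c >> i) & 1;
      return (uint8_t)(c | p);                  /* overall parity in bit 0 */
  }

  static const char *decode(uint8_t c, uint8_t *out) {
      uint8_t s = (uint8_t)(((((c>>1)^(c>>3)^(c>>5)^(c>>7)) & 1) << 0) |
                            ((((c>>2)^(c>>3)^(c>>6)^(c>>7)) & 1) << 1) |
                            ((((c>>4)^(c>>5)^(c>>6)^(c>>7)) & 1) << 2));
      uint8_t parity = 0;
      for (int i = 0; i < 8; i++) parity ^= (c >> i) & 1;
      const char *verdict;
      if (s == 0 && parity == 0) {
          verdict = "no error";
      } else if (parity == 1) {                 /* odd number of flips: assume one */
          if (s != 0) c ^= (uint8_t)(1u << s);
          verdict = "single error 'corrected'";
      } else {
          verdict = "double error detected (uncorrectable)";
      }
      *out = (uint8_t)(((c>>3)&1) | (((c>>5)&1)<<1) | (((c>>6)&1)<<2) | (((c>>7)&1)<<3));
      return verdict;
  }

  int main(void) {
      uint8_t d, w = encode(0x5);
      printf("1 flip : %-38s data=0x%x\n", decode(w ^ 0x08, &d), d);  /* corrected   */
      printf("2 flips: %-38s data=0x%x\n", decode(w ^ 0x28, &d), d);  /* detected    */
      printf("3 flips: %-38s data=0x%x\n", decode(w ^ 0xA8, &d), d);  /* wrong data! */
      return 0;
  }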


It reduces the occurrence of successful bitflips by a big margin, so it is helpful, but not perfect.


That would be very nice indeed. And having it in a laptop that costs less than €2k would be doubly nice.


Don't count on laptops; Intel treats ECC as one of its market segmentation features. AMD, on the other hand, supports ECC on everything, even the lowest-budget CPU models.


There is now a Mobile Xeon CPU, which is available in the Lenovo P50/P70 series, with ECC RAM.


I priced it there; it costs upwards of ~€3k.


As far as I'm aware, AMD doesn't support ECC on many APUs, which nowadays account for the majority of the CPUs they sell.


I was trying to figure out the roadmap, pricing, and feature sets for their APUs. I gave up. I'm looking for a low-end APU with a fair amount of system RAM (not dedicated VRAM) for use as a graphics coprocessor on a relatively slow bus. I'm finding high-end CPUs with integrated graphics, and low-end CPU-only chips. The only thing close to what I'm looking for is on an older process.


I think you might be looking for Socket AM1, like the Athlon 5350. If you need ECC, get the Asus AM1M-A. None of the boards have more than 2 RAM slots though.

If you have a Microcenter nearby, you can get a $40 discount on an AM1 bundle deal. The motherboard and CPU end up being about $40 total. It makes a decent little Unix server.

The Intel J1800/J1900/J2900 boards are pretty cheap too, but no ECC. Also, ASRock just announced its Goldmont-based J4205 board.


Thanks, it's for an embedded development project so the low total price is very helpful.


Most APUs support it. Many motherboard manufacturers choose not to expose this capability and a few sockets don't even expose it.

Kabini supports it. Construction cores support it. FM2+ does not, but the mobile socket version FP3 does.


If the choice is between insecurity and ECC, market segmentation based on ECC may no longer be possible.


Another choice is: memory chips that work.


Touché.


This hardware defect just keeps on giving.

Since we're unlikely to see larger memory cells again, mitigations will likely be applied. There is a good question on a discussion board [1] from last year about memory scrambling and its utility here, but it got no responses. Can these questions be answered? Some points about memory scrambling are also made here [2] by Kim of the 2014 CMU paper.

[1] https://groups.google.com/forum/#!topic/rowhammer-discuss/tp...

[2] https://github.com/CMU-SAFARI/rowhammer


Hmm, so the purpose of memory scrambling is to avoid excessive current swings on the data bus, because they have some kind of undesirable effect on the voltage regulator. I wonder if the result of such current swings is just excessive EMI or power consumption, or if the supply can actually go out of regulation. If you know the scrambling algorithm then you can generate those current swings intentionally...
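
For illustration, here's a toy C sketch of the general idea of XOR scrambling with an LFSR keystream (the polynomial and seed below are made-up stand-ins, not Intel's actual scrambler). The point is just that scrambling is data XOR keystream, so anyone who knows the keystream can pre-invert their data and put exactly the raw bits they want on the bus:

  /* Toy XOR scrambler: data is XORed with an LFSR keystream before hitting
   * the bus/array. Polynomial and seed are arbitrary examples, not Intel's.
   * If you know the keystream, you can pre-invert your data and choose the
   * exact raw pattern that appears on the wire. */
  #include <stdio.h>
  #include <stdint.h>

  static uint16_t lfsr = 0xACE1;                  /* hypothetical seed */

  static uint8_t keystream_byte(void) {
      uint8_t out = 0;
      for (int i = 0; i < 8; i++) {
          /* 16-bit Fibonacci LFSR, taps for x^16 + x^14 + x^13 + x^11 + 1 */
          uint16_t bit = ((lfsr >> 0) ^ (lfsr >> 2) ^ (lfsr >> 3) ^ (lfsr >> 5)) & 1u;
          lfsr = (uint16_t)((lfsr >> 1) | (bit << 15));
          out = (uint8_t)((out << 1) | (lfsr & 1u));
      }
      return out;
  }

  int main(void) {
      uint8_t desired_raw = 0x00;                 /* pattern we want on the wire  */
      uint8_t ks = keystream_byte();              /* known to the attacker        */
      uint8_t to_write = desired_raw ^ ks;        /* pre-invert against keystream */
      printf("write 0x%02x -> 0x%02x after scrambling\n", to_write, to_write ^ ks);
      return 0;
  }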


I suppose you could get some interesting things to happen if you kick the control loop of the regulator into oscillation and set up some resonance; I wrote an old comment here, related to video cameras, that describes one such experience I had:

https://news.ycombinator.com/item?id=9251479

It also reminds me of the old CD copy-protection schemes that worked by attempting (and in a lot of cases succeeding) to force the CD's scrambling algorithm to output difficult-to-write bit strings:

https://www.pricenfees.com/digit-life-archives/magic-figures...

http://www.cdmediaworld.com/hardware/cdrom/news/0102/sd2_tru...


It seems somebody reversed this Intel scrambler at least to some extent (I only skimmed the article):

http://www.sciencedirect.com/science/article/pii/S1742287616...

They suggest that the scrambler is reseeded randomly on every power up so it may be hard to control raw bits on the data bus.


Dumb question (maybe OT): why is rooting phones so hard in the first place? Shouldn't root permissions be part of device ownership (akin to fair use)? Why do I have to hack my own phone to get unfettered access?


You don't, if you care enough to buy a phone where that's possible. Any Nexus device can be reflashed to a custom Android build for example. Of course you void your warranty that way.

What you're really asking is why is there no root shell by default on all phones, and simply posing the question also answers it - it's a phone, there's zero demand for messing about with terminal emulators on a phone. If you say, but I'd like to install apps as root, the problem is such apps could break the phone in arbitrary ways like by infecting it with malware/adware/spyware that then can't be uninstalled. Modern operating systems are very much designed to limit what apps can do to avoid the nightmare that desktop OS's turned into, where people are afraid of installing apps in case something goes wrong. For better or worse you can install apps on Android/iOS without much fear, which is one reason people do it so much: you can always remove the app again if it's trouble.

Android is pretty customisable and has lots of permissions anyway. There are very few things you can't do if you don't have a carrier-locked phone. So supporting apps with root access would vastly worsen the malware situation in order to satisfy a tiny number of geeks who like editing config files: a very bad tradeoff.

But hey, Android is open source, so if you think I'm wrong go ahead and make money selling phones with root access.


Of course we shouldn't forget little niceties like YouTube not being able to play background music. Only the paid YouTube Red service can do that, and it's simply not available in many countries -- but even in those, YT won't play in the background. Unless, of course, you have Xposed installed... So, as usual, controlling what you can do with your own device results in a) perceived gains for the rightsholder b) pain for ordinary users c) nothing at all for the informed. If this resembles the DRM situation, that's because it is.


What's that got to do with root access?

If you can beat YouTube's DRM then you can just make your own player app. No need for root.

If you can't then what makes you think you could defeat whatever checking for rooted devices YouTube would add to their app?

Anyway, this sort of discussion is pretty pointless. The market has spoken. The number of people who care about root access is tiny compared to the number of people who want a safe device and app experience. Heck the entire iPad/iOS experience is pretty much "let's take a general purpose computer, make it do less, and it'll sell much better".


Um, how do you plan to install xposed if you are not root?


You install your own player app. You don't need xposed. You don't need root. You do need to sideload because the app store forbids anything that annoys youtube.


Just install NewPipe and you can download YouTube videos to your phone, or only download the audio. [1]

Works fantastically.

[1] https://f-droid.org/repository/browse/?fdfilter=NewPipe&fdid...


Oh. And how do you install F-Droid?


The process is called sideloading. Android offers it for free, whereas on iOS you need a developer license to do it. https://f-droid.org/forums/topic/how-install-from-f-droid/


Let's prod further. How do you sideload if you are not rooted? I can't remember what you can and cannot do, but I am reasonably sure you can't just adb install foo.apk if you are not rooted, as the "untrusted sources" option won't be available.


Go to settings=>security and enable "unknown sources" (allow installation of apps from unknown sources).

After that you are allowed to install .apk files (e.g. clicking the "download of xy.apk finished" notification or in a file explorer app).

Edit: that setting is always there, it's only greyed out if you installed a corporate app that explicitly disables it (e.g. the "GoodApp" email/calendar/contacts suite).


It's explained (not well, but it is) in the link ce4 gave you. You enable it in the settings and then just install packages.

EDIT to match your edit: you remember wrong, the option is always there (unless there is some manufacturer out there that blocks it (Amazon maybe?), but most devices certainly don't)


> Only the paid YouTube Red service can do that which is simply not available in many countries

Oh yes, this. I pay for Google Music, which is supposed to include that, but I don't have it by virtue of living on the wrong side of some border.


You can play YouTube audio in the background and with the phone locked if you use Firefox.


Just install Firefox and you're set.


> "Any Nexus device can be reflashed to a custom Android build for example. Of course you void your warranty that way."

Maybe in the USA you do (I'm not sure about consumer protection law there), but in the EU you certainly don't. The law clearly states that it's up to the manufacturer to prove that the unintended software modification caused the hardware problem.

So, if your software modification doesn't pose any problem, or even in the case it does, if it doesn't affect the part of the device you are asking to repair under warranty, then the manufacturer has to make the repair under warranty.


"Of course you void your warranty that way."

Wait.. What? Why would that be the case? And .. why would you state that you'd lose your warranty and prepend 'of course'?

As far as I'm aware you do NOT lose your warranty. That would be rather stupid (of course).


Because if you go about flashing your phone willy-nilly and some bad firmware bricks the phone, it's not the manufacturer's fault, is it?


Yes, if your firmware breaks the phone it won't be covered under warranty. If your phone's hardware breaks, the firmware doesn't matter.

Of course the manufacturers, who have a vested financial interest, will make you think otherwise.


It's cheaper for manufacturers to blame non-official firmware for bricking a phone and to say "I told you so" as a big fuck you.


I'd argue that you have a hard time bricking a phone.

But even if you manage to do that somehow, that is a specific problem. Using a general statement like 'flashing voids your warranty' remains weird (and wrong imo).

"Stuffing your phone in your pockets voids your warranty. Because if you stuff it in your back pocket and sit down on it, hard, it's not the manufacturer's fault if the screen breaks. Is it?"


But you _do_ void the warranty if you break your phone when sitting on it. It's a specific action that does that, and it's not as simple as "only rooting your phone voids the warranty"


I.. think we agree? Are you just chiming in to support my position? Because I think that is what I wrote above. That was the point, I tried to answer the GP's 'if you brick it (by flashing it)..' line of reasoning with 'if you brick it (by sitting on it)..' and showing that there is no connection between flashing and bricking in general.


There's plenty of opportunity. For example, there are reports that flashing the wrong firmware (Galaxy Note) onto a similar phone (Galaxy S III) can brick the touchscreen controller.


There's also plenty of opportunity to accidentally drop your phone in the loo. Common sense and being somewhat careful helps in the vast majority of things, flashing your phone included.

That said, even if you flash stuff without reading instructions or fully understanding what you do: Most errors I've seen are flat-out not working (software stating: "Can't do this, you're messing up") or recoverable/soft-bricks only.

(In the end it doesn't matter if you CAN kill your phone by flashing nonsense: that still wouldn't make a good argument for "flashing voids the manufacturer's warranty". It'd just mean that those individuals who bricked their phone .. might not be covered.)


Since when is dropping your phone in the toilet covered under warranty?

Apple added "been in water detection" to avoid payout on that very activity.


I never claimed that this was under warranty. I'm constantly trying to use this thread to kill any connection between 'rooted my phone', 'flashed a different firmware' and 'bricked my phone'.

GP replied to one of my posts, saying that there IS a real risk of bricking your phone when you flash it (I don't disagree, but consider it unlikely).

By comparing the process with a totally different ~real risk~ (phone -> toilet) I'm trying to drive the point home that it isn't a problem of either flashing your phone or playing some mobile game on that white throne of yours, it is a problem of being careless. Bricking my phone shouldn't be covered by a warranty. Flash or flush doesn't matter.

But trying to imply that the warranty is void because I installed software is overly broad and crazy: if I flash CM today and the phone dies a month from now (dead battery, hardware issues with the screen or power button, antenna issues, pick anything faulty you've had with a vanilla/unchanged handset in the past), then CM isn't to blame and my phone has a working warranty (for all I know, for all I care - and anything else seems completely unacceptable in my world).


It's possible, but rare. As an example, some Samsungs had faulty eMMC chips (used in the Galaxy Nexus, SGS2 and Note 1), and certain operations when installing custom ROMs (eMMC secure erase/wipe) would trigger the infamous BrickBug that crashes the eMMC chip from the next boot on. The problem was faulty eMMC firmware that Samsung later patched and worked around but never acknowledged (it was discovered by reverse engineering [1]).

[1] https://plus.google.com/+AndrewDodd/posts/Drq9uJbkGfq


Are you aware if eMMC firmware is signed?


The thing is that if you drop the phone in the toilet you aren't covered, but that doesn't mean that you void your guarantee by using your phone while you are there; for me, that's the difference.

If I flash my phone and later some part of it breaks because it came bad from the factory, that is independent of me flashing the phone and should be covered.


Then it is your flashing of your phone willy-nilly and bricking the phone by doing that which might void your warranty, not rooting your phone.

Of course, this depends on the laws in your jurisdiction.


Yeah, I'm tired of using these proxies for unwanted behaviour when the behaviour itself is easy enough to see.

"What did you do that bricked your phone?"

"I flashed some custom firmware and stopped the flash mid-way"

"But how? All our phones are locked"

"I unlocked it"

"Aha! That is the real culprit!"


I haven't worked for a computer manufacturer for a couple decades now, but back then it was common to require the user to use the stock software when requesting support. If you removed Windows and installed Linux, they wouldn't support it. If anything went wrong, they'd point to your changes and say that they are a possible cause and basically refuse to try to fix it until you un-did those changes.

Likewise, if you hack special software onto the phone, overriding protections that they put in place to prevent it, I could see them refusing to spend time and money trying to fix the phone until you undid those changes first. And I believe some of those changes can't be un-done fully, even if you re-flash a stock firmware.

The net effect is that it effectively voids your warranty.

So no, technically you don't lose your warranty, but you also can't really make use of it, either.

This really only applies to software things, though. Hardware problems, like the battery exploding or the phone just refusing to turn on, would still be easy to get covered, since they can't even get far enough to see it's been modified.

But if the GPS stops working or it stops receiving calls, they could blame it on the custom firmware, and they might even be correct.


I'm ignoring support - support is a different issue and unrelated when we're talking about the warranty here.

If I hand in my handset to claim my warranty, I'm fine with a 'wipe by default' policy. In fact, in my limited experience (both with my phones and the ones from my closest family) that was implied anyway: you'd get the phone back with a ~fresh~ factory image (and all the crap- and adware that this includes).

So here's the thing: If I claim that GPS doesn't work, you reflash it with a Known Good™ image and GPS still doesn't work - then the device has a hardware problem.

If I can cause a hardware problem by running a different flavor of Android, then .. that's a serious hardware _design_ problem.

I'm not saying that you cannot kill or damage a device. But defaulting to 'you installed stuff, you probably caused it' seems rather simplistic and is probably wrong a good number of times.

You brought up PCs: I think you'd have a hard time defending the stance that installing Linux on a laptop voids the warranty. Granted, the underpaid tech support might not be able to help me with software issues, but if I claim that the wifi doesn't work and send it in... Do you believe they can/could/should refuse to check and fix the problem under warranty?


>Do you believe they can/could/should refuse to check and fix the problem under warranty?

I'm actually saying that they did refuse to attempt to fix the problem unless you restored the original OS that they sold it to you with. (I believe upgrades to newer versions were okay, because those were supported on newer computers. I didn't encounter that situation in the time I was there.)

Having said that, I didn't always toe the company line. I often did my best to help people who were using an unsupported configuration (added internal hardware, different OS) but there was still a limit to what I could do. Unlike most of the others, though, I had a few years experience working at a computer shop, so I actually knew things and wasn't just reading from the company intranet's guides.

I got caught a couple times on software problems that appeared like hardware problems, too. By not forcing them to revert to the original software, and then escalating them to second tier support, I ended up getting in trouble. So there was at least some method to their madness.

Sending it in, though... There wasn't a place to do that. So far as I know, there weren't any physical stores other than big name retailers like Best Buy, and you couldn't just ship it in without going through phone support. So you had to try the advice from the phone support first before shipping it in.


Because the trust model on phones is different.

On old UNIX mainframes the trust model had 2 kinds of actors:

- System administrators

- Users

The biggest security concern was one user getting access to the files of another user on the same system. The sysadmin decided which applications to trust (and most of them were either opensource or made by big companies, so we didn't have to worry about application developers doing nasty things). As a result, applications just ran with all the permissions of the user who started them. They could do anything that the user themselves were allowed to do, like read all your files, open a network connection and send your secrets out over the internet. But so long as users didn't run any dodgy programs it was (mostly) fine.

All modern desktop operating systems inherit that system. But over the last couple decades three things have changed:

- Computers got so cheap that now everyone has their own

- There are more apps than ever, and some of them are malicious

- There aren't enough sysadmins to save grandma

Modern phones have a much better security model, which they inherited from the web. There are two changes:

- Multiuser support is gone at an application level.

- Apps are sandboxed from one another to protect you and your data from malicious apps. The way they do that is by isolating each app on the phone so no app can read&write to any of the data owned by another app.

For example, if you use Google Docs then by necessity you have to trust the Docs app with your documents. But that's it! The Google Docs app doesn't get access to your Facebook login information. And the Facebook app doesn't have access to your Google account. If you had equivalent native apps on your desktop, these apps would be able to read one another's files!
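
As a rough illustration (the path and package name below are hypothetical): each Android app runs under its own Linux UID, so a plain open() of another app's private data directory is refused by the kernel unless you're root or the sandbox is broken.

  /* Conceptual sketch of the per-app sandbox: app A trying to read app B's
   * private files. The path and package name are hypothetical. */
  #include <stdio.h>
  #include <string.h>
  #include <errno.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(void) {
      int fd = open("/data/data/com.example.otherapp/files/secrets.db", O_RDONLY);
      if (fd < 0) {
          /* expected: EACCES (or ENOENT), enforced by per-app UIDs plus SELinux */
          printf("blocked by the sandbox: %s\n", strerror(errno));
      } else {
          /* only reachable as root or if the sandbox is broken */
          close(fd);
      }
      return 0;
  }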

But rooting your phone (at least partially) ruins all that! Rooting (by definition!) gives some 3rd party apps the capacity to access data stored by other apps on your device, which ruins the phone's security model.

The reason why some manufacturers (like Apple) take a hardline stance against rooting is that if rooting your phone was easy, some app developers would undoubtedly decide they'll only function on rooted devices. (This happens on Android today for wifi sniffers and stuff like that.) If enough apps do that, the manufacturers will feel pressure to make it easy to disable app sandboxing, which will result in more and more apps wanting "full permissions on your phone". People will mash "yes" on the install box (nobody reads that stuff anyway) and before you know it there'll be malware on your phone reading your email and threatening to send porn to your boss on your behalf if you don't pay up. DDoS attacks will come from phones, and will cripple the mobile network. And to fix all that, Norton will release an Android version which will start scanning your phone for viruses at the worst possible moment of that show you like on Netflix.

And that is why rooting your phone is hard.


  > […] the data owned by another app.
This makes me sad. I don't want my data owned by an app, I want it owned by me.

(This does not imply allowing all software running on my behalf access to all of my data all of the time.)


The parent is using "owned" in the filesystem sense of the word, meaning which entity controls access and permissions for the files. That's completely separate from ownership of the data in the sense of copyright and data portability.

Apps owning their own data in Android should make you happy, not sad, because it's what prevents a malicious app from hoovering up all of your Tinder messages, or your financial data from Mint.


> Apps owning their own data in Android should make you happy, not sad, because it's what prevents a malicious app from hoovering up all of your Tinder messages, or your financial data from Mint.

That's well and good, but it also prevents YOU from hoovering up all your data.


  > That's completely separate from ownership of the data
No, it isn't. Under the app-controlled model, my access to my data is limited to what the app offers.

  > […] it's what prevents a malicious app from hoovering up all
  > of your Tinder messages, or your financial data from Mint.
That's an implementation of access control using the app-owned-data model. It's not the only possible implementation of access control.


> The reason why some manufacturers (like apple) take a hardline stance against rooting is that if rooting your phone was easy, some app developers would undoubtably decide they'll only function on rooted devices. (This happens on android today for wifi sniffers and stuff like that.)

Are you claiming that wifi sniffers could work without root, and require root for other reasons? An app that uses root to work around a gap in the permission system is not in support of your point.

Also, apps can already force the user to grant them very invasive permissions or exit. Yet I don't see it being universally abused.


This isn't about you, the consumer, getting root access. It's about a piece of code running on your phone (i.e. maybe part of an app with restricted permissions) getting root access.


Most likely related to strict FCC regulations regarding the mobile transceivers.

And it's always been done this way; old "dumb phones" have always been closed proprietary blobs.


That doesn't explain phones with easily and officially supported unlockable bootloaders (eg Nexus). The transceiver is, in any case, a separate processor (the baseband) which is much more locked down.

The real reason is that hardware manufacturers like control.


Well there's also the part where root access fundamentally breaks the security model of the device.

I'd say it's a better way of running a consumer device for the typical user. For example, any application you run on a Windows PC even without administrator access could access and upload your web browser profiles. The average user doesn't expect this.

But a lot of manufacturers do give you the choice to run your own firmware, which can have root user access. It'll require a bootloader unlock, which removes the signature checking.

For example: HTC: http://www.htcdev.com/bootloader

Motorola: https://motorola-global-portal.custhelp.com/app/standalone/b...

Sony: http://developer.sonymobile.com/unlockbootloader/unlock-your...

LG: https://developer.lge.com/resource/mobile/RetrieveBootloader...

Huawei: https://emui.huawei.com/en/plugin/unlock/index

Couldn't find an official page for Samsung but it looks like it's a matter of running fastboot after enabling OEM unlock in developer settings.

The other side of this is that there are quite a few devices that are carrier branded. And usually what happens is that carrier requests that the phone's bootloader is not unlockable.

Sadly, unlike Apple, which can sell its phone without any carrier modifications, Android manufacturers have to accept these kinds of changes, otherwise they won't be able to sell their phones.


> Well there's also the part where root access fundamentally breaks the security model of the device.

Good thing we're still admin/root on our personal computers. Seeing what MS/Google are doing, perhaps not for much longer.



A security model that locks the user out of their own device sounds fundamentally broken.


> In a statement, Google officials wrote: "After researchers reported this issue to our Vulnerability Rewards Program, we worked closely with them to deeply understand it in order to better secure our users. We’ve developed a mitigation which we will include in our upcoming November security bulletin."

Then why am I reading about this in October?


Presumably you're asking why you're hearing about it before the fix is out. Perhaps Google is trying to apply public pressure to the manufacturers/carriers to actually update their devices since they have a habit of not pushing critical security updates very quickly.


Because the existence of this exploit was already publicly known, as you can see in the many links in that article. That statement doesn't have any new information for people who want to break into your phone.


Because nobody at Google dares publish anything on Android without going past legal, PR and, recently, the founder.


On the one hand, you could argue that this is not a good thing because the hardware is fundamentally buggy. On the other hand, and this may be a bit of a contrarian view, if it leads to "the insecurity that gives us freedom", maybe it's not all so bad after all... although in this case, it might be too much of a free-for-all. But given how locked-down mobile devices are by default, this almost feels like a breath of fresh air.

https://www.gnu.org/philosophy/right-to-read.en.html

http://boingboing.net/2012/01/10/lockdown.html

http://boingboing.net/2012/08/23/civilwar.html


nullc makes a similar comment below.

I had thought this was too obvious to bother pointing out, but no, root exploits are not a good thing. The reason phones use kernel sandboxing is to allow users to install apps and have confidence the permissions granted mean something. A root exploit means any app or update you install may turn the phone into spyware, a portable bug-in-your-pocket.

The number of users who care about that is measured in billions (all phone users). The number of phone users who care about getting a root shell for their own use is vanishingly small, especially as people who care about that buy phones like the Nexus that have unlocked bootloaders anyway.

So no - this sort of exploit is pretty damn bad. The number of people hurt is multiple orders of magnitude higher than the number of people "helped", where that "help" is extremely tenuous anyway.


> The number of phone users who care about getting a root shell for their own use is vanishingly small

Only a very small number of people care about how /any/ individual piece of technology could be improved; they have other things to think about. That is not a valid argument. It would be better to have some other means of getting root, but it's not as simple as you make it out. Show a user 2 otherwise identical phones, explain that on one phone they could do things like tethering, and on the other phone it's restricted because they can't get root, and they will choose the one where they can get root.


IMHO the better answer is "buy/recommend phones from manufacturers who give you root access, instead of giving business to companies making vulnerable devices where you can use those vulnerabilities to get root"


Tethering and root access are unconnected: my phone can tether and is unrooted. Maybe if you get your phone from a US carrier these things are related, in which case, buy elsewhere?

There really aren't any reasons to try and root an Android phone. If you want to replace the OS with some custom open source build, just use a phone with an unlocked bootloader and go wild. But you can already access so many features and customise so many things without root it hardly seems worth it.


They are not exactly unconnected. Unrestricted root access would let you set up your own iptables rules and direct packets to the cellular network. This is how tethering worked on rooted phones before the carriers offered it as a service.

Now you are correct, and carriers responded to those demands from users for mobile networking with (usually metered or priced as an add-on) "mobile hotspot" applications.

Contrary to your earlier claims, the advanced Android and iPhone user communities are not insignificant and may have played a part in getting demands like these met.

Places like androidpolice, and sites like engadget, lifehacker, gizmodo and others cater to this market, and drive popular knowledge of root modifications and jailbreaking. Previous incarnations of CyanogenMod boasted download numbers in the hundreds of thousands, if not millions.


But that's his point exactly - one of the phones allows him to get around the customer-hostile limitations of some corporation and makes his device more useful in his life. The other is controlled by the whim of someone who sees you just as a walking wallet.


If the carrier is giving you the device on better terms than an outright purchase then it's not "customer hostile" is it? You can always buy one instead.

Your attitude seems to be "I should be able to get a cheap phone from a carrier by agreeing not to tether, and then I should be able to violate that agreement anyway, and this is a totally moral position". Phones are a commodity. You can get them anywhere. Want to tether? Buy a phone and a plan that allows it. Problem solved.


No, my attitude is "I should be able to do with my 5GB data plan whatever I wish, because that's what I pay for, and the carrier has no business dictating what my hardware can or cannot do." And since I live in the EU, that is an actual reality - my carrier provides the voice and data service, my phone does what I tell it to, and I can choose whichever phone I want. Free market as it should be.

Which is why it baffles me beyond all limits just why so many smart Americans go out of their way to defend corporate practices that hurt them as customers.


I'm not American, I'm European, so you're really putting the ass in assume there. And frankly the moralising so many Europeans are so quick to engage in is an embarrassment.

This really isn't complicated. If your carrier contract doesn't forbid tethering but they sold you a phone that does, take it up with them; that sounds messed up and rare. If your contract forbids tethering then you're just trying to weasel your way out of a market you voluntarily entered into. That's the opposite of the free market.


Read your contract. You don't pay for a "5GB data plan". You pay for a "5GB data plan to be used on your mobile only". Violating your contract is not "free market".


What does "to be used on your mobile only" even mean? Does it mean you can't transfer any data downloaded on the phone to anywhere else? That's ridiculous. Or perhaps it means the connection must be via your phone, which is definitely the case if you are tethering. A data plan is an Internet connection. However you look at it, it's a gross violation of net neutrality principles to say what you can or cannot do with the data transferred through it.


It means what you know it means.


Again, I'm not from America. My contract does not say anything so idiotic. I can use my paid data as I wish.

Why are you trying so hard to support your telco's decision to make your life worse?


I'm not from the US either. And those policies are there to reduce congestion, that's all.


"reduce congestion" is just a weasel way of saying that the providers sell beyond their capacity. I understand overbooking is a very common practice, but it is based on risk assessment -- and providers shouldn't be able to dodge all risk for their decision by disavowing their responsibility in single-sided "contracts".


Yes, of course they sell beyond their capacity. This is the real world we live in.

If everybody in your neighbourhood turned their faucets on at once, water supply would simply stop working. So?


Nice straw man you have there "If everybody in your neighbourhood turned their faucets on at once, water supply would simply stop working. So?"


[flagged]


We're quick to ban accounts that post like this, so please don't.


[flagged]


You can't attack other users like this on HN, no matter what you're replying to.


Parent is just pointing out the BS in hindering tethering based on the amount of data, and how well customers are protected in the EU regarding such practices. It's also absurd (perhaps a bit sad) to defend such policies from the customer's point of view, to your own loss.


However, when presented with the full story (you can get tethering on this phone, but at the cost of security; you can get security on another phone, but at the cost of tethering; and you can get both tethering and security on this third phone, but at a higher price), enough people will opt for option three as to make it at least meaningful to the market, despite those who don't care about security.


You're pulling most of that out of thin air.

Let's see: the device that sells the free-to-root marketing (even if it's a lie in this case), the Nexus: billions of sales.

The device that boasts that root is impossible, BlackBerry: zero sales.


lol. downvote facts. go hackernews!


So, the proof of concept code is at https://github.com/vusec/drammer . Can we get a reliable rooting tool based on this?


Interesting. This is not something that has a simple fix and can be patched. The arms race continues.

Or we could end it and give users (limited) root access to their stock phones. Let us run OCI containers with restricted root user accounts. Bind mount certain filesystems given the correct Android permissions, such as the SD card or internal storage, or the user's emulated root. Or support nested ARM virtualization.

Modern Linux supports a uid 0 with less than complete access in a cgroup, using the LSM to regulate specific capabilities, or creating the cgroup with limited caps to begin with.
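
A minimal sketch of that idea, assuming a Linux kernel with capabilities (real container runtimes layer namespaces, cgroups and an LSM profile on top of this): drop the dangerous capabilities from the bounding set, and a uid-0 process exec'd afterwards is "root" in name only. The list of capabilities dropped here is just an example.

  /* Sketch: shrink what uid 0 can do by dropping capabilities from the
   * bounding set before exec'ing the contained workload. Needs CAP_SETPCAP. */
  #include <stdio.h>
  #include <sys/prctl.h>
  #include <linux/capability.h>

  int main(void) {
      int drop[] = { CAP_SYS_ADMIN, CAP_SYS_MODULE, CAP_SYS_RAWIO, CAP_NET_ADMIN };
      for (unsigned i = 0; i < sizeof(drop) / sizeof(drop[0]); i++) {
          if (prctl(PR_CAPBSET_DROP, drop[i], 0, 0, 0) != 0)
              perror("PR_CAPBSET_DROP");
      }
      /* a uid-0 process exec'd from here can no longer mount, load kernel
       * modules, do raw I/O or reconfigure the network */
      printf("bounding set reduced; exec the contained workload here\n");
      return 0;
  }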

Make access to areas the carrier considers sensitive conditional on a capability, or limit access to the full video decode hardware or shared memory from this root jail.

I have been able to compile but not run the runc binary from OCI/docker/rkt. Nested cgroupfs would solve a lot of these restrictions.


> It's not uncommon for different generations of the same phone model to use different memory chips.

Actually, the same generation of the same phone can use different memory chips produced by different manufacturers. It's very common for Apple and Samsung when they can't source enough parts from a single manufacturer.


Not just because of not getting enough chips. Sometimes it is plain hedging against potential flaws, or supply chain failures.


Or just to have a strong position if one manufacturer tries to raise the price.


Very interesting application of Rowhammer. Funny that it comes at the same time http://dirtycow.ninja allows us to write very reliable and portable exploits (which should be applicable to Android).


Is there any way to statically analyze an app for code that might be attempting to execute a rowhammer attack? I'd imagine that rowhammer requires a tight loop doing nothing but writing to the same value in memory repeatedly, or something similarly recognizable. Such a tool could be used to at least keep any malicious apps out of the play store. It would probably be fine if it sometimes gave false positives on innocuous code that a human (at Google) could override after inspecting the suspect code.
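
For reference, the pattern is less about writes than about repeated row activations that bypass the cache. A rough sketch of the classic x86 hammering loop from the original Rowhammer work is below (addresses are placeholders; on ARM there is no unprivileged cache flush, which is why Drammer relies on uncached DMA memory from the ION allocator instead). Something like this is the signature a detector would have to look for:

  /* Sketch of the classic x86 hammering kernel: repeatedly read two addresses
   * mapping to different rows of the same DRAM bank and flush them from the
   * cache so every iteration reaches DRAM. row_a/row_b are placeholders that
   * an exploit has to pick via physical-address/bank reverse engineering. */
  #include <stdint.h>
  #include <emmintrin.h>                     /* _mm_clflush (x86 SSE2) */

  static void hammer(volatile uint8_t *row_a, volatile uint8_t *row_b, long iters)
  {
      for (long i = 0; i < iters; i++) {     /* typically hundreds of thousands */
          (void)*row_a;                      /* activate row A */
          (void)*row_b;                      /* activate row B */
          _mm_clflush((const void *)row_a);  /* force the next reads to DRAM */
          _mm_clflush((const void *)row_b);
      }
  }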


Doubtful. A malicious app could always programmatically write the attacking binary after it has been installed. At that point, you no longer have serious performance requirements, so you can make the functionality as obfuscated as necessary. Not to mention the possibility of self-modifying code, or downloading the payload from the internet.


How much of this is possible within an Android app? Are Android apps allowed to download a file and then make it executable?


Android isn't like iOS, so yes. An app can load additional code into its processes after the fact.


With root? Yes, you don't even need special permissions past an Internet permission.

SuperSU is a popular app used to limit which apps can get root permission via the su binary. But depending on the way the device is rooted, you can actually use root permission to remove SuperSU while leaving the binary that grants root intact.

This allows you to create a "backdoor" in which your "legitimate app" asks for root, then downloads an update and deletes SuperSU, allowing the payload (and any other apps) to get root access silently.


If you already have root, then you don't need to execute a rowhammer attack to gain root. I'm asking if an Android app with no root access on an un-rooted phone has a way to execute code that wasn't included with the apk, in order to hide the code that executes the rowhammer attack. If not, then a static analyzer that detects rowhammer attacks in the code of an apk would be sufficient.


You can even execute the rowhammer attack from loaded JavaScript: https://github.com/IAIK/rowhammerjs There's no real protection against it. It doesn't matter whether you can scan APKs for this behaviour if any app can open a WebView with the right page.


Thanks, this pretty clearly answers the question.


Even then it's impossible to be certain; see https://en.wikipedia.org/wiki/Rice%27s_theorem - in general, you cannot decide whether a programme will or won't do a certain thing. It's not very intuitive, and the whole thing comes down to a clever reduction to the halting problem. The same reasoning also excludes the existence of a perfect virus scanner (even if you were to define a virus to be a programme that may open a dialog saying "VIRUS").

That doesn't mean that you can't catch most of the cases where a programme would do something like this. As they say, perfect is the enemy of good—in this case perfect is provably impossible, but that doesn't mean we shouldn't strive to find something that improves the current state. Google automatically scans all Play Store submissions for known malicious behaviour, they might be able to detect rowhammer exploits the same way.


>The same reasoning also excludes the existence of a perfect virus scanner (even if you were to define a virus to be a programme that may open a dialog saying "VIRUS").

It's a cute theorem, but it's more misleading than it is useful. Let's say we use a simple bytecode format for your programs. They can only open that dialog by having a call to the open-dialog function with a fixed parameter saying "VIRUS". We may not be able to say if an arbitrary program will run that call, but we can state with certainty whether it has that call. Such a program is either a virus or a virus decoy, and either way should be blocked.

We don't need to answer the question "Are we certain the program does X at runtime?". If we can answer "Does the program have an X-performing module?" that's good enough.

On android we can't answer that because the execution model is too powerful. But it's possible to make a practical app platform with the ability to answer that question.
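
A trivial sketch of that distinction, using a made-up bytecode: whether a program will ever execute the dialog call is undecidable in general, but whether the opcode appears in the program at all is a plain linear scan.

  /* Hypothetical bytecode: presence of the "open dialog" opcode is decidable
   * by inspection even though its reachability at runtime is not. */
  #include <stdio.h>
  #include <stddef.h>

  enum { OP_NOP, OP_ADD, OP_JMP, OP_OPEN_DIALOG };   /* made-up instruction set */

  static int has_open_dialog(const unsigned char *code, size_t len) {
      for (size_t i = 0; i < len; i++)
          if (code[i] == OP_OPEN_DIALOG)
              return 1;                              /* virus or virus decoy: block it */
      return 0;
  }

  int main(void) {
      unsigned char prog[] = { OP_NOP, OP_ADD, OP_JMP, OP_OPEN_DIALOG };
      printf("flagged: %d\n", has_open_dialog(prog, sizeof prog));
      return 0;
  }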


The problem is that your execution model has to become incredibly constrained (no execve, no mmap, no function pointers and no reflection) in order to make that virus scanner work. I don't think that a general purpose programming language will be that useful if you disable all of those features.


Yes, it can.


mprotect(..., PROT_EXEC) works just fine on Android


I don't know why I got downvoted for misunderstanding the question; the other way of understanding the question made it clearly sound impossible.

How was I supposed to understand that someone was really asking if there's a way for an Android app to change its behavior at runtime when all mobile platforms have access to JavaScript...


Sorry for the confusion. I wasn't aware that a rowhammer attack could be carried out even in a high-level language like JS.


Here's a dynamic approach to stopping rowhammer: http://www.openwall.com/lists/kernel-hardening/2016/10/27/18


>a human (at Google) could override

You can't be serious; Google is not paying $200K for developers to play with individual cases of consumer/dev support. You can't even get human support from Google when you make >$100K a month in AdSense/YT.


I think dynamic analysis would be more useful.


So any bets on when a jailbreak based on this is developed?


Root is to Android what jailbreak is to iOS, and the proof of concept[0] that was released is exactly that rooting tool.

https://play.google.com/store/apps/details?id=org.iseclab.dr...


Except jailbreaks are historically much harder to find than root exploits. iOS 10 still has no public jailbreak.

You can see this reflected in the going rates for the respective vulnerabilities. A remote iOS jailbreak can net you a cool $1.5 million, while Android goes for $200,000. http://arstechnica.com/security/2016/09/1-5-million-bounty-f...


Is there any reason to suspect that the same technique wouldn't work to get root on iOS?

Then there's the slight problem of rootless that makes it harder to do a permanent jailbreak, but getting root should be a pretty decent step along the way.


"rootless" is no real obstacle, Rowhammer is almost an arbitrary memory modification exploit. All you need to do(!) is find and modify the appropriate bits, and you will have full control. With "sufficient" effort you should be able to get files in persistent storage modified too, and from there get a permanent jailbreak.


The Google Play store link is broken, probably removed due to ToS violation. Does anyone have an alternate link to download the apk?



Unsurprisingly, Google has purged this app from the store.


They allow quadrooter and stagefright testing apps. Why would this be different?


Because the core of this testing app induces the flaw, while the other apps merely detect the presence of their respective flaws; with those, you don't get to trigger the actual exploit.


Is there any way to test for drammer without triggering it?

The app doesn't actually root the phone, as far as I can tell.


I first read about this a few months ago. Coming from a hardware background, I was particularly struck by how damned clever this is!


It's clever because you aren't a memory design or test engineer. This type of problem has been known and tested for since shortly after DRAM was invented.

E.g. here's a 1977 databook from Mostek, at the time the premier DRAM manufacturer: https://archive.org/stream/bitsavers_mostekdataryProducts_17...

Look at these tests in particular:

   Adjacent Row Disturb Refresh
   Column Disturb
   Pattern Sensitivity
Look at the details of the column disturb test:

Column 0 is written with an all-ones data pattern. A "0" is then written into row 0 of the column 100 times, followed by reading all other bits of the column and checking each bit for a logic "1" output. Row 0 of the column is then rewritten to a "1" and the procedure is repeated for rows 1, 2, 3, ... 63 of the column under test. The entire procedure is then repeated for columns 1-63.
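
In code terms, the quoted procedure is roughly the following sketch (the 64x64 geometry and the count of 100 come from the quote; read_bit/write_bit stand in for real tester operations on the device under test):

  /* Sketch of the quoted column disturb test. A real tester drives the DUT;
   * here a plain array stands in so the loop structure is visible. */
  #include <stdio.h>

  #define ROWS 64
  #define COLS 64
  #define DISTURB_COUNT 100

  static int cell[ROWS][COLS];                       /* stand-in for the DRAM array */
  static void write_bit(int r, int c, int v) { cell[r][c] = v; }
  static int  read_bit(int r, int c)         { return cell[r][c]; }

  int main(void) {
      for (int c = 0; c < COLS; c++) {
          for (int r = 0; r < ROWS; r++)
              write_bit(r, c, 1);                    /* all-ones column pattern */
          for (int r = 0; r < ROWS; r++) {
              for (int i = 0; i < DISTURB_COUNT; i++)
                  write_bit(r, c, 0);                /* disturb one cell with "0" */
              for (int other = 0; other < ROWS; other++)
                  if (other != r && read_bit(other, c) != 1)
                      printf("disturb failure at row %d, col %d\n", other, c);
              write_bit(r, c, 1);                    /* restore before the next row */
          }
      }
      puts("column disturb pass complete");
      return 0;
  }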

That particular test step is only repeated 100 times, because it's a production test and short test time is critical. You can bet that in design verification the chip was characterized a lot more thoroughly.

The downside of increasing memory sizes over the decades (now literally billions of bits in a single chip rather than thousands) is reduced design margins. So we get rowhammer.

You can bet that rowhammer did not come as any surprise to key engineers at the DRAM manufacturers. They just didn't take it very seriously as a possible real world problem, because the access patterns required to cause it are atypical from those of "normal" operation.


> You can bet that rowhammer did not come as any surprise to key engineers at the DRAM manufacturers. They just didn't take it very seriously as a possible real world problem, because the access patterns required to cause it are atypical from those of "normal" operation.

I bet there are also plenty of others wondering whether it's actually a known backdoor that's been used secretly for a long time, and this is just an independent discovery, since everyone else who knew about it was told to keep quiet...


Before Snowden, I might have dismissed your post as simple paranoia. But now, to steal a few words from 'jasoncchild', it just might be "damned clever".

All of which means that right now there are probably guys posting in an HN discussion thread on NSA's private comment site. They're saying something like: it's only clever because he's not a spook ... we've been doing this for the last decade. :)


Thanks for the interesting historical reference!

As you state, this type of problem is not new, and as noted at [1] e.g. Intel have taken this seriously on their Xeon lineup for a while.

It's a shame (LP)DDR4 standards did not make Target Row Refresh mandatory.

From the FAQ at [2]:

I have a phone with LPDDR4 memory. Am I safe against Drammer attacks? Again, we don’t know. Chances are that your DRAM comes with the Target Row Refresh (TRR) mitigation, which makes it harder – but still not impossible, in theory – to induce bit flips. Moreover, TRR for LPDDR4 is optional, so your DRAM manufacturer may have decided to drop this technique and leave you vulnerable.

[1] https://en.wikipedia.org/wiki/Row_hammer#Mitigation

[2] https://www.vusec.net/projects/drammer/


You could implement similar row access counters in the memory controller (inside the CPU). Intel had two generations to get on it; they didn't...


>It's clever because you aren't a memory design or test engineer. This type of problem has been known and tested for since shortly after DRAM was invented

...yeah...sometimes it happens that you are ignorant of the details of a topic...and learning more about it can feel magical...


There's something about Rowhammer that warms the cockles of my heart.


Rowhammer was first presented in 2014

https://en.wikipedia.org/wiki/Row_hammer



