My Take on FBI's “Alternative” Method (zdziarski.com)
149 points by aorth on March 22, 2016 | 86 comments



I've done this kind of "NAND mirroring" technique before, but in the context of SSD failure analysis and debugging. We built a special board that would dump the contents of the NAND to a file and restore it. That way we were free to do failure analysis without losing our only evidence. And the effort to build this infrastructure was not difficult at all and paid for itself on the first use.


I would love to see pictures of the testing / analysis rig, etc. That would be very interesting.


It's from a company I no longer work at. Not much really to it. The hardest part is connecting to the NAND. We had special debug connections provisioned to make it easy. Other options include desoldering the part and putting it into a socket, using an interposer, or using boundary scan. Then we had an FPGA board that connected to the NAND and via USB to a PC. The FPGA board issued the proper signal patterns to control the NAND, and the PC had a GUI interface to issue all kinds of commands to the NAND, including doing raw reads and writes to files. Obviously we cannot decrypt anything stored on the NAND.

When we got a failed SSD, I would read the internal logs and try to root-cause it. Sometimes we had to power up the firmware or change the contents of the NAND to help determine the root cause, and the concern was that the problem would no longer be reproducible. By keeping a backup, I had a good chance of recovering things back to the original state.

Having a backup gave me lots of freedom to experiment. I've seen weird multi-factor failures where a batch of NAND parts would fail due to high temperature, with a particular data pattern written using specific commands. One of those things that not even manufacturing screening could catch in reasonable time. Debugging these failures was always one of those moments where I would go in wondering "what the heck is going on" and come out thinking "ah, now it makes sense."


Care to elaborate on this topic a little bit more?


What we built was a way to save and restore a raw copy of the NAND in an SSD. Since that is the only persistent media on the SSD, it is kind of like a "save game" situation where you can try different things and recover back to the previous state if you screw up. Now this is not exactly 100% true for NAND, because some bits will be unstable, and with multiple write cycles you make the data in a few locations progressively uncorrectable over time.

So with this technique, assuming the iPhone has no other persistent media, they could save the contents, attempt a password unlock and then restore the NAND contents when it gets too many retries.
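The save/try/restore loop described above can be sketched in a few lines of Python. This is purely illustrative: `dump_nand`, `restore_nand`, and `try_passcode` are hypothetical stand-ins for whatever rig-specific operations would actually be used, and the wipe threshold follows iOS's documented 10-attempt limit.

```python
# Hypothetical sketch of a NAND save/restore brute-force loop.
# dump_nand(), restore_nand(), and try_passcode() are assumed
# stand-ins for rig-specific operations, not real APIs.

TRIES_BEFORE_WIPE = 10  # iOS can wipe the key after 10 bad guesses
SAFETY_MARGIN = 2       # restore well before the wipe threshold

def brute_force(passcodes, dump_nand, restore_nand, try_passcode):
    snapshot = dump_nand()              # save the pristine NAND image once
    tries = 0
    for code in passcodes:
        if tries >= TRIES_BEFORE_WIPE - SAFETY_MARGIN:
            restore_nand(snapshot)      # roll the retry counter back
            tries = 0
        if try_passcode(code):
            return code
        tries += 1
    return None
```

The key point is that the retry counter lives in the NAND image itself, so restoring the snapshot resets it.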

As for ways to thwart these attempts, most NAND memory has a pseudo-OTP section and unique serial numbers which could be combined to make things more difficult. But these were not meant for cryptographically secure protection, so I'm sure someone could find a way around them.


I think they know the risk is too high, and they jeopardize being able to use the All Writs Act in a similar situation against a company that lacks all of the resources that Apple has at their disposal. If patent trolls back down when challenged too hard, why not the US government?


They first use the momentum of the monolithic "justice" system to flatten you.

Do you accept our authority? You defy it? Okay, the misdemeanor we charged you with, if you don't plea we will file an unrelated felony against you.

Oh you resisted, oh you showed the evidence we have against you actually exonerates you?

Okay, the prosecution moves to withdraw the case citing lack of evidence. We won't give you the opportunity to win against us.

(This happened to me when I was a kid. I won my case. The FBI is acting just like that. Threaten and lie, then when everyone doubts them, they run. But as they run, I run after them faster, and in the end they lose.)


story time! <mj popcorn gif>



> I think they know the risk is too high

Yes indeed, let's not forget that nobody cares about this particular phone! The FBI has got archives until 6 weeks ago; they would have had full access had they not fucked up (or sabotaged it?), and it's a work phone that's most unlikely to contain anything interesting anyway.

The FBI thought it had a perfect opportunity to establish a precedent: Apple wouldn't fight this case, because TERRORISM! It looks like it's failing, so of course they won't help establish a precedent against themselves.

If you have a hammer, everything looks like a nail; and if you're a techie, everything looks like a technical issue. But this is not a technical issue, it merely masquerades as one.


I was just posting the video[1] in the iPhone SE thread yesterday about the flourishing iPhone NAND Flash memory upgrade market in China. Incidentally, the "NAND mirroring technique" is a crucial step in the process of upgrading iPhone NAND Flash memory. It's increasingly clear that Congressman Darrell Issa's questioning of FBI Director James Comey in the U.S. House hearing, about the exact steps the FBI took to attempt to crack the iPhone 5C, was right on the money! [2]

The iPhone's NAND Flash is in a BGA package, and de-soldering / re-soldering isn't too much of an issue for a skilled technician (even Foxconn workers can do it!)

And did the timing suggest this "3rd party" is in the Far East? Sunday night in the U.S. is daytime in the Far East.

This is getting interesting!

[1] https://news.ycombinator.com/item?id=11331044

[2] https://news.ycombinator.com/item?id=11249033


"I was just posting the video[1] on iPhone SE thread about the flourishing iPhone NAND Flash memory upgrade market in China yesterday. "

I read those posts and was interested ... and I am sorry to digress now in this thread, but:

If you upgrade your iPhone with an amount of memory that no iPhone has ... what does iOS do? Does it care? Does it object?

If I upgraded a 16GB to a 32GB, I would expect iOS not to notice or care. However, I would be worried about upgrading a 32GB to, say, a 128GB, which is a flash memory size that no iPhone SE ships with.

What does iOS do?


That's a very good question! I am guessing that the memory addressing might be tricky to figure out - again, I wouldn't be surprised if repair shops have tried it. It's always safest to use hardware parts with the most compatible specs - similar to how people build hackintoshes [1] :)

You also have to consider chip availability at that specific size and BGA packaging - since Apple is a top consumer of those flash memory chips, in only 32GB, 64GB and 128GB sizes (omitting 8GB/16GB), production of those memory chips may well come in ONLY those sizes due to economies of scale in manufacturing.

[1] http://www.hackintosh.com/#hackintosh_compatible


3. An NSA 0-day is likely also out, as the court briefs suggested the technique came from outside USG.

This seems like very flimsy logic. Not to say that an NSA 0-day is clearly the answer, but ruling it out is nonsensical. Both the NSA and FBI are well practiced in "parallel construction": http://www.dailydot.com/politics/nsa-dea-fbi-snowden-doj-oig...


I came here to say this, and also to offer a rimshot of: "But the NSA is outside the USG. Above the law is still outside of it".


Sure. And shell corps aren't unheard of for laundering government actions--the CIA is particularly known for that tactic.


And we all know that the U.S. Government wouldn't lie to protect National Security assets.


"The weak link in all of this has been Farook and his poor choice of security."

It's only a poor choice if there's anything worth securing on the phone, and most people think there's not; he destroyed his personal phone while leaving this phone intact.

In any case, most people unlock their phone dozens of times a day, perhaps hundreds of times a day (for some). So expecting users to use a secure passphrase as an unlock code is unreasonable. It seems that it would be better to have a 2 stage system -- if the phone hasn't been unlocked with the passcode in X hours (user configurable), then it has to be unlocked with the secure passphrase -- even if someone knows the passcode the phone can't be unlocked with it after that timer expires. And on reboot, only the passphrase should be accepted.

So someone that unlocks their phone frequently can use a simple passcode, but once the phone has been out of their possession for a few hours, only the passphrase can unlock it.
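The two-stage scheme proposed above can be sketched as a small policy object. To be clear, this is a sketch of the commenter's idea, not how iOS actually works; the class and parameter names are made up for illustration.

```python
# Sketch of the proposed two-stage unlock policy: a short passcode
# works only within a user-configurable window since the last unlock;
# after that window expires, or after a reboot, only the long
# passphrase is accepted. All names here are hypothetical.
import time

class TwoStageLock:
    def __init__(self, window_hours=4):
        self.window = window_hours * 3600
        self.last_unlock = None         # None means "not unlocked since boot"

    def unlock(self, secret, passcode, passphrase):
        now = time.time()
        recent = (self.last_unlock is not None
                  and now - self.last_unlock < self.window)
        # Passphrase always works; passcode only within the window.
        if secret == passphrase or (recent and secret == passcode):
            self.last_unlock = now
            return True
        return False
```

So a phone that has been out of its owner's hands for a few hours, or freshly rebooted, rejects the simple passcode outright.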


This could actually be a way to cause some havoc. Leave nothing useful on the phone, but dangle it in front of authorities, making them go crazy trying to get into it. I'm sure this guy was not that clever, though.

What you describe is essentially how Touch ID works on newer iPhones, just with your fingerprint as the simple passcode. On a fresh boot, you have to put in your full passphrase, and every 48 hours after that. After you put in your passphrase, fingerprint unlock is available until either you turn the phone off, 48 hours passes, or fingerprint authentication fails five times in a row.

Unfortunately, the 48 hours thing isn't configurable, so it's not quite perfect, but it's pretty decent.

Certainly for me, it has immensely improved the security of my phone. In situations where I don't trust the fingerprint authentication (e.g. crossing an international border), I can just shut the phone off. Otherwise, it provides "good enough" security. Yes, there are potential attacks on the system, by copying my fingerprints and replicating it, but the five-attempt limit and the 48-hour timeout are good enough. For me, it's a vast improvement over what came before, when I had a passcode of 0000 just to satisfy some apps that wouldn't offer certain features unless a passcode was set.

There are some improvements I'd like to see. The timeout should be customizable. I'd like to be able to set up a "duress fingerprint" which immediately disables fingerprint authentication, as an easier alternative to turning off the phone. But overall it seems like a good blend of security and usability.


>I'd like to be able to set up a "duress fingerprint" which immediately disables fingerprint authentication, as an easier alternative to turning off the phone.

Coercion codes would certainly be a great next step for Apple to implement, not just with fingerprints but with actual pass phrases/PINs as well. Ideally a coercion code system would allow a couple of different levels of response depending on a user's needs and the situation they faced. For example, one simple PIN/finger print could just revert the system to needing the full passphrase, while another (or a longer passphrase for that matter) could cause the machine to activate pre-scripted actions, wipe itself, or even simulate a malfunction.

For what it's worth you can actually already do a limited version of this under iOS if you jailbreak. Touch ID can handle up to five separate fingerprints, and in vanilla iOS AFAIK any successful read is the same as any other in terms of system action. Interestingly however, internally Apple actually registers a separate system event for each different finger. Either writing your own tool or using a tweak like Activator, you can thus set different fingers to trigger different actions, including shutting down/rebooting the device (which in turn activates a long pass phrase) or running a script (which allows alerts, photos, erasure, boot looping, or potentially even something proximate to bricking via messing with nvram).

I'm actually quite curious what, if anything, Apple plans to do with Touch ID in the future because they've clearly left themselves some flexibility in the underlying system that I don't think they're presently using anywhere in iOS. Maybe they're more leaning towards features like "select a different payment system with a different finger" or "unlock and launch straight into a favorite app with different fingers", or even some form of multiuser access based on different finger prints, but if Apple decided to start implementing features against rubber-hose cryptanalysis in iOS 10 it'd genuinely be pretty exciting. I'm not aware of any mainstream consumer electronics even attempting anything like that yet, though if anyone has some existing/historical examples I'd really love to read about them.


Ahh interesting, I didn't realize that Touch ID would also require a password periodically. I don't have much faith in fingerprint ID, but the required password helps.

I like your idea of the duress fingerprint since the police can demand that you supply a fingerprint to unlock a device (but can't demand a passphrase). If you use the wrong one then "Oops, I was so stressed out I forgot which fingerprint to use".

I really think it's unlikely that law enforcement will ever take an interest in my phone, yet I still like knowing that my data is my data.


> I like your idea of the duress fingerprint since the police can demand that you supply a fingerprint to unlock a device (but can't demand a passphrase).

There are cases where a judge can force you to supply a password.

https://www.eff.org/issues/know-your-rights#17


Dump a few hundred megabytes from /dev/random into an encrypted file with a provocative name. The thing every Cypherpunk dreams about.


> I'd like to be able to set up a "duress fingerprint"

I personally just don't use the tip of my thumb, so unless you knew that I would just try it 5 times and lock out touch id.


Unless the FBI goes through your HN history ;)


This is just part of the subterfuge; it's actually a different finger he doesn't use.


One annoying fact on Android is that it doesn't allow you to set a long password for device boot unlock, and a short one for the lock screen. It forces you to use the same password for both.

I would like to be able to shut down the phone when the risk of attacks/theft/loss is high, knowing that a high-entropy password protects it, but without having to enter it all the day.



Pretty sure this is exactly how my Nexus 6P operates.

I have fingerprint unlock set up for the screen - but at least once a day I'll have to enter the passphrase and it won't accept the fingerprint. It says something like "for additional security, please enter your passphrase" instead of "finger moved too fast" or whatever.

Every reboot, you have to use the passphrase to unlock the phone the first time.

I don't see any options for configuring the time period before it asks for the passphrase again, though.


How about proximity to a key fob which also contains a button to lock it down?


Last time I looked into this (many years ago), the only way you can use software that reboots an Android phone is to have that software signed with the same key that signed the firmware running on the phone. [0]

Sadly, it's not straightforward to make a remote reboot-phone button that's usable by the general public.

[0] IIRC, calling halt(8) from the shell didn't work, either.


It's definitely possible if you have root: http://tasker.wikidot.com/devicereboot.


Sure, sure. If you've flashed a non-stock firmware all sorts of useful things become possible.


The speculation that, on Sunday night, some security researcher was able to convince the FBI they had a good shot at cracking the phone defies belief.

A simpler and more likely explanation is that the FBI's lawyers, having overreached, have gone back on an earlier lie and concocted a new one: oh yeah, we just need a couple weeks to try this out. (And for the furore to die down.)

Don't let these little John Yoos off the hook. They are over-aggressive, unethical, and unprofessional.


I think it's more likely to be a mix. The FBI knew for a while, if not from the beginning, that getting into the phone without Apple's help was feasible. The NAND mirroring technique described here has been floating around for a while. But the FBI didn't want to, either because they have ulterior motives in forcing Apple's cooperation or because they just didn't want to take the risk of breaking anything. Now that they're having so much trouble securing Apple's cooperation, they either think the alternative is worth trying, or at least find it to be a good excuse for stopping.


What you say is plausible. Can it be shown by outside parties what FBI knew and when? If so, it seems a fair bet that several agents and lawyers perjured themselves/suborned perjury in their zeal to "win" at all costs.


We know they were aware of the attack no later than three weeks ago, when Congressman Issa brought it up during the FBI's congressional hearing on the matter. Of course, that doesn't mean they took it seriously, but I'd say they should have, given that Issa described it in great detail. In fact the FBI director said they'd check into it. Of course, that could just be meaningless pleasantries meant to placate someone in power. The timing certainly is suggestive, though.


One thing is clear to me from all of this: Rep. Issa is a really smart guy. I don't agree with his politics (for the PATRIOT Act, against gay marriage, and supports an anti-flag-desecration amendment) but I have quite a bit of respect for the guy. A few weeks before pretty much anyone else brought up the point of NAND mirroring, he stood before Congress and confronted Comey about it. In general, he's written very well-thought-out letters about issues before Congress.

In a Congress where the majority (!!!) does not believe that humans contribute to climate change, it's rare to find someone who actually thinks. Congratulations to Rep. Issa for his role in protecting our Constitutional rights.


Hold your horses. It seems the natsec apparatus is divided on this case and many have been vocal about it, including the Secretary of Defense. I don't think Issa himself cares very much about this issue, it's more likely that someone close to him, someone not too keen on having the courts ruling on this, fed him the NAND mirroring line. A good contribution to the debate congress was having but not enough to justify this inadvertent canonization of Rep. Issa.


Oh, I'd definitely like to avoid a canonization of him if possible. He's anti-gay, denies climate change, and supports the PATRIOT Act. But even if this is just a fed line (and this is definitely possible), just think how many other members of Congress would even be able to deliver or understand such a line? Of the 99 others, do you think any would understand NAND mirroring? Even on a superficial level?

My broader point remains that in general, Rep. Issa takes the time to analyze things and write about them, often bringing up issues that are otherwise neglected. Even in his support of the PATRIOT act, he lodged unique complaints about the constitutionality of the law under the fourth amendment.


You might want to look up Issa's voting record on environmental and climate change issues before praising him too highly.


Oh absolutely, in fact I wrote in my post how I disagree very strongly with most of his political opinions and voting record. I think it's unacceptable for someone to deny climate change, be homophobic, or vote for the PATRIOT Act. I would of course not vote for him due to this.

On the other hand, I think that the vast majority of politicians from any party with any views in Congress are so ridiculously brain-dead that I could not vote for them either, even if they were a carbon copy of me on the issues.

There are only 100 US Senators, and they make the most important decisions in the US. If an important decision came down the line, I have no hesitation saying that the vast majority of senators with my viewpoints would make a ridiculously uneducated, stupid decision.

Given how sharply Issa diverges from me on pretty much all issues, it's very likely that his decision on any topic would be something I disagree with strongly. It might drive climate change in the wrong direction, it might discriminate (in my view) against certain people, and any related law may be something that I spend quite a bit of time and money fighting against (I say this as someone who is very politically active.) But I have no doubts it would be an intelligent, researched decision.



A little off topic, but I was wondering what kind of "wear and tear" occurs on a phone at the molecular level? I imagine a phone with a "pin" verification will be tapped fairly regularly in certain places on the screen (especially if the user never changed their pin).

Also wonder if any of that "wear and tear" would show up in the circuits that detect taps. Again, if only specific places on the screen are tapped for the pin (never changed) it may leave enough evidence somewhere somehow at the molecular level.

Anyone here with knowledge in this area that could debunk or validate this concept?


The author assumes the FBI wants to continue pursuing Apple.

They might be calculating that it's too risky now, and this is the cleanest way out without losing face.

They may not even care whether the alternative method works, or even have a viable one.


Of potential interest: "Can This Israeli Company Hack the San Bernardino iPhone Without Apple’s Help?" by Ariel David Mar 21, 2016 7:58 PM in Haaretz http://www.haaretz.com/israel-news/business/.premium-1.71006...

Or:

https://webcache.googleusercontent.com/search?q=cache:8zcNEr...


>Aside from a robust research department, Ben-Peretz attributes the company’s success to its original business in the mobile services sector, which gives Cellebrite good contacts with mobile phone operators and vendors and ensures it gets early access to new phone models and operating systems, allowing it to release updates for its software often months ahead of the competition.

So manufacturers are helping them break in earlier by providing early access? Why would they do that?


I think they mean "mobile carriers and retailers" there. I.e. they're getting it from AT&T, or a Sprint Store (or the appropriate Israeli equivalents), not from Apple.

I would speculate that the relationship is either commercial in nature, or that it also serves as a pentest to prevent things like unlocking/jailbreaking contract phones.


Non-paywall second report that Cellebrite might be the company providing the iPhone hacks to the FBI: http://www.reuters.com/article/us-apple-encryption-cellebrit...


I don't see why anybody believes the FBI aren't just lying at this point. They pretty clearly lied or at least obfuscated the truth on several other points so far. It seems obvious to me they never cared that much about what was on the phone and were only after a legal precedent - this claim about being able to unlock it another way is just a way to save face. It will be interesting to see whether they ever make public any evidence they claim was gathered from the phone or whether they just quietly let it slide into history and hope everyone forgets about it - I'm betting the latter.


<conspiracy theory> Apple releases latest version of iOS (which requires everyone who installs it to type in their passphrase, for both the phone and iCloud, again), and the FBI backs off on demanding that Apple write FBiOS for them.

Maybe Apple got hit with an NSL and were actually required to backdoor their products? Now Apple is delivering the full encryption keys to interested parties? </conspiracy theory>

I know it's silly, but the coincidence of the release of iOS 9.3 and the FBI backing off from its position was striking.

The more reasonable theory is, of course, that the FBI realized that they overstepped their boundaries and put a tool they've used for decades ex parte at risk of being taken away from them by butting heads with a determined adversary.


I'd find the timing scarier if the beta for 9.3 hadn't been going for a while before this all came up, and we weren't already right around the number of releases they always do before a GM build is ready.


IANAL, but since the FBI was the plaintiff, they have a right to withdraw the case without giving any reasons, no?


Of course they have the "right", but after pushing to an extreme here - to the extent of getting everyone up to Obama to pressure Apple to give in - they can hardly just walk away and not say anything. Even the "oh we found another way" sounds pretty lame, but it's probably the best they could come up with.


> they can hardly just walk away

AFAIK there is no law requiring them to do so... (?)


For embedded devices which run code off of ROMs, there are "romulators" [1] which emulate ROMs. They can be fitted into the ROM socket to work just like a ROM except that you can reload it at-will without burning a new ROM. Makes it very easy to recompile and try a new set of code.

Is there nothing like this for Flash memory?

[1] https://secure.transtronics.com/I-PKTROM.htm


> The one that is the most tight lipped is, of course, the one people are paying the most attention to. I'm not at liberty to specify who

Is this some bastardized take-off of responsible disclosure? Why the hesitation in naming the suspected firm?


> We know the FBI hasn’t been reaching out to independent researchers, and so this likely isn’t some fly-by-night jailbreak exploit out of left field. If respected security researchers can’t talk to FBI, there’s no way a jailbreak crew is going to be allowed to either.

There are thousands to tens of thousands of commercial forensic companies (domestic and international, not including Shenzhen one-man shops) that can de-package a chip and hook into the ARM AHB.


Doesn't this kind of circumvention around encryption violate the DMCA?

(Yes, I know, gov't agencies are probably exempt from DMCA enforcement and/or the FBI doesn't really care if they're violating it anyway).


DMCA is about copyright. Breaking encryption has nothing to do with it except insofar as the encryption is protecting copyrighted data.


I think parent is referring to access control circumvention under the DMCA: https://en.wikipedia.org/wiki/Anti-circumvention#United_Stat...


>No person shall circumvent a technological measure that effectively controls access to a work protected under this title

Which, as I said, refers to encryption protecting copyrighted works as a means of control.


Is it possible that Farook had some copyrighted material on his phone?


I think that's irrelevant. The encryption needs to be a means of control like DRM, not just copyrighted material that happens to be encrypted.

Edit:

>a technological measure "effectively controls access to a work" if the measure, in the ordinary course of its operation, requires the application of information, or a process or a treatment, with the authority of the copyright owner, to gain access to the work.

So encryption that doesn't depend on the authority of a copyright owner isn't affected.


His reasoning made me think about JTAG.

JTAG can be used to inject test data into a device. Basically you put data into the device's flip-flops (registers) by shifting a long series of bits into a special port. Then you perform a single step (computation of a function) and read the resulting state back from the registers.

I think it would be possible to inject PIN attempts into the device and prevent the device (by resetting it instead of performing the step) from overwriting the key. Or even simpler - I think it would be possible to read the key out of the device via JTAG operations.
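As a toy illustration of the boundary-scan mechanism: bits shifted in at one end push the scan chain along one position per clock, and the same chain can be shifted back out to read captured state. This simulates only the shift-register behavior; real JTAG involves a TAP state machine, and (as noted below) the port is typically disabled on production iPhones. `shift_in` and `shift_out` are made-up helper names.

```python
# Toy model of a JTAG-style boundary-scan chain: shifting test data in,
# then reading the chain back out. Illustrative only; not a real JTAG
# implementation (no TAP controller, no TDI/TDO timing).

def shift_in(chain, bits):
    # Each bit clocked in at the front pushes the chain along one place,
    # dropping the oldest bit off the far end.
    for b in bits:
        chain = [b] + chain[:-1]
    return chain

def shift_out(chain):
    # Reading the chain contents back (a copy, as shifting is destructive).
    return list(chain)

chain = [0, 0, 0, 0]
chain = shift_in(chain, [1, 0, 1, 1])   # inject test data
# ... the device would perform one clocked step here ...
captured = shift_out(chain)
```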


Apple surely disables JTAG ports on shipping devices, this has been standard practice on security critical hardware for years.


My Take is that they circumvented the existing protection against brute forcing. The iPhone does this by implementing a countdown timer that prevents trying more than a few passcodes before the time between tries becomes unreasonable. If you can speed up the clock, then you can cut down on the timer. There may be other methods to shorten the time as well.


I think I read that it takes 80ms just to run the decryption algorithm on the iPhone, so while having unlimited retries would get them in eventually it will not speed up the brute force attack.


Assuming it is a 4 digit numeric passcode, that's only 10,000 possible combinations. At 80ms each (let's round up to 100ms to be conservative) that's only ~17 minutes. Even if it is a 6 digit pin we're only talking about ~28 hours (if they use a single, serial process).
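The arithmetic above checks out; rounding the reported ~80 ms decryption time up to 100 ms per serial attempt:

```python
# Worst-case serial brute-force time at an assumed 100 ms per attempt
# (the reported figure is ~80 ms; rounding up to be conservative).

ATTEMPT_SECONDS = 0.1

four_digit = 10_000 * ATTEMPT_SECONDS / 60        # minutes
six_digit = 1_000_000 * ATTEMPT_SECONDS / 3600    # hours

print(f"4-digit PIN: ~{four_digit:.0f} minutes")   # ~17 minutes
print(f"6-digit PIN: ~{six_digit:.0f} hours")      # ~28 hours
```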


That is true. Most people use a 4-digit or 6-digit numeric password and that's what the FBI is betting on when they ask to remove the brute force restrictions.

However Apple allows something like a 35-char max with numbers and letters so in the worst case they will never crack it.


I don't know if it is correct but I've seen it mentioned elsewhere in this thread that the passcode they need is 4 digits.


One thing I think we can count on is that now that there has been a retreat we will never hear another word about this from FBI or NSA. And probably from Apple.


Wouldn't these hacking techniques be completely illegal for a corporation to sell?


Maybe the guy that changed the password remembered what he changed it to.


The iCloud password was changed (and yes they know what to). Now the iPhone & iCloud won't - that is, can't - access any data until the 4-digit passcode is entered on the phone, which will self-destruct if 10 wrong codes are entered.

After they enter the 4-digit passcode, then they can enter the new iCloud password.


The NAND mirroring technique this article describes is just absurd.

It would be more believable to me that the FBI would try to steal Apple's secret signing key, or to bribe Apple engineers, in two weeks.


For a government agency tasked with enforcing the law, it's absurd to contend they would steal and bribe to acquire evidence (victimizing one of the world's biggest & richest corporations, involving the "crown jewels" and/or significant engineering efforts, in the process).

NAND mirroring is hard, but not impossible. (Whether the FBI can do it may be approaching absurd.)


Not absurd to contend they would steal or bribe. The FBI's use of sneak-and-peek black bag operations is well documented and has been going on for a very long time. (https://en.wikipedia.org/wiki/Black_bag_operation#Use_by_the...) In the book "The Puzzle Palace", there is a section documenting how the NSA pushed to get the FBI to reinstate black bag ops after Hoover stopped them, so that they could recover cryptographic information to aid or obviate cryptanalysis.

I leave judgement of the ethics of the practice and integrity of the organization as an exercise to the reader.


Is it actually hard? See this discussion just a day or two ago of services in China that are doing iPhone memory upgrades for cheap [1]. This involves desoldering the NAND, copying its contents to a bigger NAND, and soldering the bigger NAND into the phone.

[1] https://news.ycombinator.com/item?id=11331044


Actually doing it may be easy for those familiar with the details.

Doing it with legally-provable assurance that you won't permanently screw it up, and proving chain-of-custody issues along the way, is hard.


What's absurd about it? The process sounds perfectly feasible to me. If you can perform the procedure, then it will definitely allow you to unlock this phone through a tremendously tedious but entirely doable brute force process. Try five passcodes, reset, try five more, repeat until done. I'd feel sorry for the special agent(s) tasked with entering sequential passcodes until one works, but the process would only take a few days if done continuously, or about two weeks if you stuck to regular business hours.
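A rough back-of-envelope supports the "few days if done continuously" estimate. The per-cycle overhead figure below is an assumption for illustration, not a measured number:

```python
# Back-of-envelope for the try-five-then-restore process.
# CYCLE_OVERHEAD_MIN is an assumed figure: time to restore the NAND
# image plus manually enter five passcodes.

CODES = 10_000           # 4-digit passcode space
TRIES_PER_CYCLE = 5      # attempts before restoring the NAND
CYCLE_OVERHEAD_MIN = 2   # assumed minutes per restore-and-retry cycle

cycles = CODES // TRIES_PER_CYCLE                  # worst-case restores
total_hours = cycles * CYCLE_OVERHEAD_MIN / 60

print(f"{cycles} cycles, ~{total_hours:.0f} hours")  # 2000 cycles, ~67 hours
```

At roughly 67 hours of continuous work, that lands right at "a few days", or about two weeks of regular business hours.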


I bet you could build a robot with three stepper motors, a hot dog, and an Arduino to punch in codes for you pretty easily.


There is a passcode that is really tough to replicate: behaviour - a.k.a. habits. Every day you travel down the same road, meet the same people (okay, as your cellphone: the same cellphones and Wi-Fi stations). This habit is like a fingerprint, in that very few people have the exact same browsing behaviour. If you could extract a fingerprint from that information, you would have a sort of biometric identifier that is at least hard to come by.

Problem: every holiday, your phone thinks you're a phone thief.


Seems like that would be incredibly difficult to train and incredibly invasive, and by the time it detected with considerable confidence that the user probably wasn't you, or that you weren't nearby, you could have been compromised already.

I think that kind of approach would be better handled by implants, or by wearables like watches with super-long battery life; Bluetooth tokens already do most of that with half the invasiveness.


I think that Toopher is developing something like that.

Two-factor where your phone supplies the second-factor over a sidechannel if you're in the right place, doing the right thing, at the right time.


One of many problems.



