Fun with your friend's Facebook and Tinder sessions (robertheaton.com)
488 points by aitskovi on Dec 16, 2014 | hide | past | favorite | 55 comments



I really enjoyed the style in which this post was written. It was hilarious and really engaging. It was also interesting to read through all the details of the hack.


Really clever hack and entertaining blog post!

Love this author's writing, especially this other post of his on Playing to Win - http://robertheaton.com/2014/11/03/why-you-should-read-playi...

A hacker at heart.


Thanks for the link, although there were some points in the article where I wasn't sure whether he was being serious or sarcastic.

Just began reading the actual book Playing to Win (it's available free for online reading: http://www.sirlin.net/ptw ). It has already struck me as very intelligently written and insightful when you view its lessons as applying to life (at least the competitive aspects of it) rather than to some video game.


I'm trying to think about whether there's a way for Facebook and/or Tinder to mitigate this attack without degrading the user experience. Because the auth token used comes from the response to the last request ever made from Steve's computer, rotating the auth token on each request wouldn't help in this scenario. Restricting an auth token to an IP address wouldn't work either, since both users are presumably behind the same NAT (all devices on the same residential WiFi router) - not to mention that IP addresses change all the time. Restricting an auth token to a user-agent string would stop the first attempt at this hack, but then someone would simply tell the proxy to mimic the UA of Steve's desktop. Could Facebook then refuse to honor mobile-application authorization requests if the mobile device is mimicking a desktop browser's UA?
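As a sketch of the user-agent-binding idea discussed here (which, as noted, a proxy defeats simply by copying the UA string), the server-side check might look like the following. The function names and in-memory store are hypothetical, not anything Facebook or Tinder actually does:

```python
import hashlib
import secrets

TOKENS = {}  # token -> UA fingerprint; stands in for a real server-side store

def issue_token(user_agent: str) -> str:
    """Mint an auth token bound to the requesting client's user agent."""
    token = secrets.token_hex(16)
    TOKENS[token] = hashlib.sha256(user_agent.encode()).hexdigest()
    return token

def validate_token(token: str, user_agent: str) -> bool:
    """Reject the token if it arrives with a different user agent."""
    fingerprint = hashlib.sha256(user_agent.encode()).hexdigest()
    return TOKENS.get(token) == fingerprint
```

A proxy that copies the victim's UA header passes this check unchanged, which is exactly the weakness described above.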


It seems like the most reasonable mitigation would be to disallow Burp Suite from working at all by using SSL cert pinning. (I'm actually pretty surprised that they don't do this already -- I know that Google pins certs for their own apps in Chrome.)

This, of course, would not completely stop the issue. But, it would make the author's job that much harder, since he'd have to emulate the Tinder protocol without the assistance of the Tinder app -- or would have to hack the Tinder app (and run it on a jailbroken device) to disable the cert pinning.


In the story, Burp Suite was used only on the attacker's machine for ease of use. You could also hand-craft the requests using curl.
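To make that concrete (in Python rather than curl, and with a placeholder host and cookie name instead of the real Facebook ones), the replay is just an ordinary HTTP request carrying the captured cookie:

```python
def build_replay_request(host: str, path: str, cookie: str, user_agent: str) -> str:
    """Hand-craft the raw HTTP request a tool like curl would send."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"User-Agent: {user_agent}\r\n"  # copied from the victim's browser
        f"Cookie: {cookie}\r\n"          # the captured session cookie
        f"Connection: close\r\n"
        f"\r\n"
    )

raw = build_replay_request(
    "example.com", "/me", "session=STOLEN_VALUE", "Mozilla/5.0 (Macintosh)"
)
```

No proxy, no app, no cert pinning to bypass - the server only ever sees a well-formed request with a valid cookie.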

Cert pinning doesn't help when someone installs their own certificate authority. It stops other CAs that came bundled with the browser from working, but if it stopped self-installed certificates from working it never would have gotten off the ground because many organizations demand the ability to use their own certificates for signing things.


Cert-pinning in the application for their own server is totally doable, that's exactly what Google is doing with Chrome.


If you install your own CA into Chrome, it will overrule the cert-pinning that Chrome does. This is very on purpose.


That wouldn't help either: in this article, the Facebook app is uninstalled prior to authorization to ensure the Facebook request is forced to go through the browser. So you might be able to get away with HSTS certificate pinning, but that could likely easily be cleared out (unless it's preloaded - I don't know whether you can clear HSTS preloads in Safari, or if they even have such a thing). Even then I suspect the authorization could be spoofed somehow, as all of these measures only matter on the attacker's machine.


I wonder if it's feasible to cross-reference the TLS/SSL session against the session cookie. While the HTTPS session itself is probably transient, you can tell something is up if the same session cookie is being used with two different HTTPS keys.


You're describing the concept behind channel bound cookies. https://tools.ietf.org/html/rfc5929

As far as I know, it's not supported by any current browser (I welcome feedback to the contrary) but is included in SChannel. Given that we've only recently (arguably) gotten away from SSLv3, I don't have high hopes that it will be viable to require channel binding in the very near term.


Chrome v24+ does support all you need for channel-bound cookies: it supports TLS Channel IDs (previously known as Origin-Bound Certificates). To actually bind cookies, it is the server's responsibility to extract the channel ID from the TLS/SSL handshake, and bind the cookies to it.


Do any cloud SSL terminators like Amazon ELB support forwarding the channel IDs on to the application servers (e.g. in a custom header)? For that matter, is there a configuration setting for, say, Nginx if you want to roll your own SSL terminator to do this? I'm having trouble finding good documentation about how to handle this from the server side.


Cool – hadn't heard of that before. I was thinking more along the lines of a purely server-side approach:

  $_REQUEST["salted_SHA_hash_of_symmetric_TLS_key"]
You'd save the current key to a DB, and manually check it in future requests.
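A minimal sketch of that bookkeeping, assuming the application could even read the negotiated key (which typical web stacks don't expose); the salt and in-memory store here are illustrative stand-ins for the DB:

```python
import hashlib

SALT = b"per-deployment-salt"   # illustrative value
SESSION_KEYS = {}               # session cookie -> salted hash of the TLS key

def record_tls_key(session_cookie: str, tls_key: bytes) -> None:
    """Save the salted hash of the key seen when the session was created."""
    SESSION_KEYS[session_cookie] = hashlib.sha256(SALT + tls_key).hexdigest()

def key_unchanged(session_cookie: str, tls_key: bytes) -> bool:
    """Flag the session if the same cookie shows up under a different TLS key."""
    seen = SESSION_KEYS.get(session_cookie)
    return seen == hashlib.sha256(SALT + tls_key).hexdigest()
```

As the replies point out, the catch is whether the recorded value can itself be tampered with or simply re-recorded on a fresh handshake.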


Well, if you can intercept the request to the server, you can also change that TLS-hash parameter.


Is that actually true, though (especially w.r.t. Forward Secrecy)? Don't both parties generate separate halves of a symmetric key independently, preventing any one party from forcing the use of a particular key on a new session?


If each computer had a unique hardware private key, that could stop it. But I'm not sure that they do? (Or even if some do, can HTML5 access that somehow?)


That's the idea of a Trusted Platform Module, which many machines have had for some time. TPM provides a per-device hardware environment for signing and storing keys in a tamper-resistant manner.

HTML5 can't access TPM directly, but in ChromeOS, you can create or import a client certificate as a 'hardware-backed' certificate, which is then wrapped by the device's TPM.

At this point (if properly configured) an attacker can't exfiltrate client certificates from the device even with root-level access to the machine. Plus, in theory, extracting key material from the TPM should be made difficult by its manufacturer by means of various physical protections.

Obviously this is a very niche edge-case, but it is possible :)



Another reason: SSL certificates cost money. StartSSL has some free option, though.


The costs-money kind of server SSL certificates and client SSL certificates are two very different things.

Client certificates are generated by the user's machine and signed using your server's private key. The user's client presents them to your server to prove that the client is who they said they were when you signed their certificate. These certificates don't cost anything, besides some CPU cycles on both sides of the process.

The kind of server SSL certificates that cost money are generated by you and signed by a CA that most users' browsers will trust. Your server presents them to the client to state to the client that the server belongs to the domain it says it belongs to.

Most CAs will charge you money for the service of signing those certificates, but that process has nothing to do with the lack of adoption of client SSL certificates.

The parent article does a good job describing why client certificates aren't used more often: the UX doesn't make sense to users and there's not a user-friendly way to protect them with a second factor (the way you can encrypt your SSH keys using a passphrase or authentication device).


Couldn't the author just copy Steve's private key to his computer then?


Not if it's a hardware key. You give the processor something you want to encrypt, but you can't look at the actual key itself (the only way to do that would be with an electron microscope).


That said, if you could gain persistent remote access to the computer, you can just repeatedly ask the processor to encrypt things.

This is incidentally part of why the Chromebook design makes it hard to persistently change the machine; a reboot starts from a clean signed image and then mounts a home directory. It's still possible to stick a persistent exploit somewhere in the home directory, but it's not as simple as just dropping a file in /etc/init.


Can't you just sniff for a browser fingerprint and, if too many characteristics have changed, end the session?


The problem is that the proxy that the attacker is using could ostensibly alter the outgoing messages from his phone's web browser to mimic the browser fingerprint of Steve's desktop perfectly.


Hack aside, really wonderfully written. Very funny and well explained.


From some reading I was doing recently, it seems like the cookie stealing could be made more difficult by adding something that's harder to fake. I think for flask-login they mix in the IP and user agent. https://flask-login.readthedocs.org/en/latest/#session-prote...

This probably wouldn't work, though, given that I'm assuming it's the external IP, and a user agent is pretty easy to copy/clone. Seems like there should be another value mixed in that would be hard for a third party behind the same NAT to figure out.
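flask-login's session protection boils down to stamping the session with a hash of the client's IP and user agent and comparing it on later requests. A simplified reimplementation of that identifier (a sketch of the idea, not flask-login's actual code):

```python
import hashlib

def session_identifier(remote_addr: str, user_agent: str) -> str:
    """Fingerprint the client; if this changes mid-session, the session is
    rejected or downgraded, depending on the protection level."""
    base = f"{remote_addr}|{user_agent}".encode()
    return hashlib.sha512(base).hexdigest()
```

Which illustrates the weakness discussed here: both inputs are copyable by the attacker, and the external IP is shared by everyone behind the same NAT.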


IP can cause weird behavior for mobile, since it changes all the time, especially if they hop onto/off of wifi. User agent is trivial to fake.


This is extremely well written and engaging. Any other blogs like this one?



I really liked your article on how to take over Rails servers if they leak their secret.

http://robertheaton.com/2013/07/22/how-to-hack-a-rails-app-u...


Shouldn't the brown-eyed people kill themselves on day 101? Or are they supposed to reason that they, and they alone, might have some other eye colour?


This is true - it depends on whether they know there are exactly 2 different eye colours on the island. I should clarify that!


Also, since you're not allowed to talk about it, if you work out you have blue eyes you could just keep your mouth shut.

They do seem to be going to an awful lot of effort to top themselves.


Wonder if the match referenced at the end is true. Fun read.


Monica and Steve have two kids. As they weren't referred to as "twins", I'm going to assume that they're not. Assuming Monica got pregnant on the first date and again immediately after she gave birth (which is perfectly possible, but an unusual choice and very strenuous on the mother) that puts the episode 18 months ago, no later than July 2013.

Considering that Tinder was launched around August 2012, and assuming Monica and Steve are modern and responsible people (i.e. careful not to rush into the big responsibility of parenthood), I'd say it's highly unlikely.


Maybe they were part of the YOLO crew of the era.


In Chrome, you can view and manage cookies here: chrome://settings/cookies


You can also get a nice tabular view for the domain you're on by going to "Resources" from dev tools and navigating to the "Cookies" subsection.


Thanks for that! I always went through Chrome's preferences, which is like navigating a maze.


> you most likely have 2 minutes alone with his computer

Install a RAT and do whatever you want later. You can have a lot more fun with a RAT than just grabbing FB cookies.


Grabbing cookies can be done easily as an offline attack though, assuming they don't clear cookies on every shutdown (which I'd bet 99% of users don't). You simply copy their cookies file(s) from Chrome and temporarily replace yours with them.

I previously carried out a similar attack whereby I temporarily borrowed the hard drive out of a housemate's laptop when they had left it unlocked in their room, and gained access to many of their frequented accounts (after they had made a point of saying I wouldn't be able to). I was able to maintain access for several months, changing subtle things, before someone else notified them and they cleared their sessions.

The cookies file is generally small enough to easily upload in the background if you're passing around casual programming apps with friends. I don't condone this, but it's a very hard attack to mitigate without services breaking UX. Shopping websites mitigate it by asking you to re-enter your password before changing account details or making a purchase; I'm not sure whether such a UX change would hurt social media.

Note: this was all probably 6+ months ago; Chrome may have mitigated this exact attack since then by encrypting the file with something Google-account-specific.
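For reference, the swap described above only involves Chrome's on-disk cookie database. The paths below are the commonly documented defaults, but they vary by profile and Chrome version, and newer builds encrypt cookie values with OS- or account-bound keys, so treat them as placeholders:

```python
from pathlib import Path

def chrome_cookie_db(platform: str) -> Path:
    """Approximate default location of Chrome's cookie database per platform."""
    home = Path.home()
    locations = {
        "linux": home / ".config/google-chrome/Default/Cookies",
        "darwin": home / "Library/Application Support/Google/Chrome/Default/Cookies",
        "win32": home / "AppData/Local/Google/Chrome/User Data/Default/Cookies",
    }
    return locations[platform]

# The attack is then: back up your own Cookies file, copy the victim's file
# into place, browse as them, and restore the backup afterwards.
```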


Reminds me a lot of my experiences building Tinder++ : http://tinderplusplus.com

(Made as a desktop app so it has permissions to intercept the auth token; it could also have been done as a Chrome extension.)

Source: https://github.com/mfkp/tinderplusplus


TL;DR Lock your machine before leaving the room.


That's not really the tl;dr version of the article.


The only hole I can see here is that Chrome extensions can read HTTP-only cookies. What are your thoughts on this?


First, some Chrome extensions might legitimately need this. But even if they were disallowed -

The guy had physical access to a running Chrome capable of sending those cookies, and the ability to install an extension. That basically means no software policy was going to stop him.


It's much easier to use the Botinder Chrome extension (or whatever it's called now) than dealing with the proxy. Really it's as simple as exporting the cookies, importing them into Chrome, and firing up Botinder.



I made a Tinder auto-liker script a while back to help my non-nerdy friends out. That's where I learnt about the man-in-the-middle attack trick.

Very well written, I should say.


The excellent article aside, I also enjoyed the reference to Darkplace in the header. :)


You and he were...buddies, weren't you?


Someone should automate this process. Very nice write up.


Meta: I was hesitant to click on this link, as the HN comments implied it was "enjoyable", which in my experience, when applied to technical articles, is usually a codeword for "fluffy and uninformative".

But this article has all the technical details, and just-enough-but-not-too-much humor and background story to make this entertaining. Highly recommended, even if you're a "the details, all the details and nothing but the details" technical reader like me.



