Hacker News
Vulnerabilities in Heroku's Build System (titanous.com)
170 points by Titanous on July 3, 2012 | hide | past | favorite | 34 comments



Heroku offered me a paid penetration test contract, but required that I sign a retroactive non-disclosure agreement which would have precluded publishing this article.

Worth pointing out: virtually all paid pentesting engagements are delivered under NDA. In fact, more often than not, they're done under the far-stricter terms of a master agreement with detailed IP clauses.

If you're talking to any firm about having your app tested, get an NDA in place, and don't feel bad about asking. Nobody should put off having their app checked out for fear that the firm they're going to work with will try to make news out of its findings.

Obviously, if your customers find vulnerabilities themselves, all bets are off!


I can't blame Titanous under these circumstances, though.

It's a whole different ballgame when you're testing applications under contract (as I know you know), but as a budding researcher/security guy, it might be better for him to publish via responsible disclosure (as he did just now) than to accept a temporary (presumably secret) contract.

I know that ethical disclosure is a whole can of worms that isn't completely relevant to this conversation, but I'm a firm believer that Titanous went about this the right way. First off, he is the customer (as a Heroku user), and secondly he notified Heroku and waited until the vulnerability was fixed before publishing his findings. If all researchers behaved in such a responsible way, we'd probably be in a better place as an industry.

Anyway, the point I'm making is that, sure, having a penetration testing contract with Heroku might have been nice for his career, but I think being able to point to this research that he's conducted on his own and responsibly disclosed is far better. Hell, I'd offer him a job right now if he seemed interested.


I hope it doesn't sound like I'm blaming him for anything. I'm not.


I do not think your post reads as blaming Jonathan/Titanous. But David's post is a valuable addition - especially the point about the personal-marketing value of publishing.


Of course there was never any question that any new penetration testing I would do with them would be under NDA. What worried me is that the disclosure might never be made public if I was under an NDA that prevented me from publishing it. I didn't want to have any professional or ethical conflicts.


You were right: once you had engaged a unit of Salesforce for an app test, it would have been extremely difficult for you to publish this. Not black-letter impossible, but difficult.


The retroactive part of the deal was a clear alarm bell, and you acted accordingly. I find great joy in knowing that you showed a pristine level of ethical behaviour in your security research. Well done.


Nothing ominous happened here at all.


This is a minor point, but one of my favorite details: the use of a simple table for the "Disclosure Timeline", which is really the clearest way to illustrate a step-by-step sequence of events. I would love to see this become standard practice for any narrative in which order/visual comparison is vital. HTML tables (or even bulleted lists) are so easy to include.

A masterful explanation, on top of being an altruistic deed.



Very glad to see this not only documented, but patched extremely quickly. Heroku continues to impress, and Titanous is a credit to the security profession.


This is a really interesting insight into the way someone found access to a system, but could someone explain to me why he needed PGP keys from Heroku? I'm sure there is some good reason, but if someone could tell me, that would be great.


It's standard practice to encrypt your emails when sending potential security notifications to the party that you're notifying. Not only does it keep out prying eyes, it ensures the person sending the email is who they say they are.


"it ensures the person sending the email is who they say they are"

Although, if you're going to trust the pgp key you get sent when you mail someone asking for one, you don't have any more assurance that you're encrypting it to the person you intended, and you _almost_ may as well have sent it in cleartext in the initial email...

Get your pgp fingerprints through some out-of-band method of communication. (When was the last time anybody even heard of a key signing party? I attended one at the Perl Conference in '98 or '99 - don't think I've been aware of one happening in any of my circles since.)
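(For what it's worth, once you have an out-of-band copy of a fingerprint, checking it is mechanical - a fingerprint comparison is just a whitespace- and case-insensitive string match. A sketch, with made-up fingerprints:)

```python
def normalize_fp(fp):
    """Strip spaces and uppercase, so formatting differences don't matter."""
    return "".join(fp.split()).upper()

def fingerprints_match(out_of_band, from_gpg):
    return normalize_fp(out_of_band) == normalize_fp(from_gpg)

# Made-up fingerprints, for illustration only:
printed_on_card = "0123 4567 89AB CDEF 0123  4567 89AB CDEF 0123 4567"
shown_by_gpg    = "0123456789abcdef0123456789abcdef01234567"
print(fingerprints_match(printed_on_card, shown_by_gpg))  # → True
```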


In this case I was able to confirm the key by matching it with a copy they uploaded to https://policy.heroku.com/security


Good to hear.

Now the paranoid in me is wondering - if there's an attacker deeply entrenched enough to be reading their plain-text email, either on the wire between them and you or with sufficient root privs on their infrastructure, they could probably have arranged for their own pgp key to appear there for you as well. If I were trying to do my utmost diligence, I'd have tried to include a completely non-internet backchannel - perhaps a phone number for Heroku sourced from a dead-tree phone book (if such a thing exists anymore?) - and get someone to read out the key fingerprint.


If they're that owned, it's highly unlikely that you telling them about some additional vulnerability is going to help their attackers. And you'll figure it out soon enough when none of your proposed fixes are enacted.


True - but if they're _not_ that owned, TLS-encrypted email probably would have been sufficient. (Though I'm not sure how easy it is to force/ensure TLS in common email clients…)


TLS only protects a single link: from your client to your server. It doesn't prevent disclosure on that server, on any relaying servers in between, or between their server and their client (remember, they may be reading email over wifi in a coffee shop).

S/MIME email is another end-to-end encryption scheme, like PGP, but it isn't as popular among a technical crowd as PGP is.


This is where the owner of the PGP key has hopefully verified themselves by building a web of trust.

In other words, they have passed the key along to employees and other trusted parties (i.e. ones with a large web of trust of their own), who in turn have used their keys to certify that this key belongs to the security team of Heroku.

To verify the verifications, you could contact one of the trusted parties and ask them how they are sure that the particular key is correct - if in doubt.


Accept nothing less than the full company board singing the entire key to you in binary at a restaurant of your choosing.


Heh…

I'd like to hope the whole "web of trust" idea could solve this problem. If I've got a large enough set of people I've been sending signed or encrypted email to using a particular key, that history gives me a pretty reliable idea that the key is "real". With a bit of luck, if enough of those people have their own group of historically verified keys, I might have a good chance of finding someone I know and trust who'll vouch for the key fingerprint of someone I need to securely communicate with.

(I wonder if pgp signing registration email or payment receipts might help here? Or perhaps including key fingerprints? It'd be nice to be able to mail a user/customer saying "here's our PGP key, and you can check it against the key fingerprint we sent you when you signed up" or maybe " … that we print on every invoice" ?)
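(The lookup I'm describing is really just a graph search: does any chain of keys I've already verified lead to the key I need? A toy sketch of the idea, with invented names and signatures:)

```python
from collections import deque

# who_signed[key] = keys that have vouched for it (invented example data)
who_signed = {
    "heroku-security": {"alice", "bob"},
    "alice": {"me"},
    "bob": {"carol"},
    "carol": {"me"},
}

def trust_path(me, target):
    """Breadth-first search from the target key back to keys I've verified."""
    queue = deque([[target]])
    seen = {target}
    while queue:
        path = queue.popleft()
        for signer in who_signed.get(path[-1], ()):
            if signer == me:
                return path + [me]
            if signer not in seen:
                seen.add(signer)
                queue.append(path + [signer])
    return None  # no chain of vouches reaches a key I trust

print(trust_path("me", "heroku-security"))  # → ['heroku-security', 'alice', 'me']
```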


Widely communicating your key fingerprint makes sense. I put it on my business cards, etc. Back in the late 1990s people were talking about publishing root key fingerprints in newspapers, engraving them on stone tablets, etc. I.e. things which obviously required a lot of money be expended, thus making casual forgery less likely.


Perhaps people who rely on a longstanding online reputation could provide a service as verifiable online keystores for many different organisations, vouched for via their own public key. That would provide a distributed, multiply redundant public-key resource that would be incredibly difficult to hack all at once in order to fake a specific key.


Does it warm your heart that key signing parties still happen about twice each year at Caltech?


ah yes, key signing parties

http://xkcd.com/364/

:)


That's a nice way of handling it, both by Heroku and by you, who found the exploit. I feel content with my choice to use Heroku.


I'm happy to see both parties in this situation acting with a great deal of professionalism. It feels good to be a part of this community.


Perhaps this is a fantastic example of why keeping sensitive configuration in environment variables is probably a bad idea. Your private SSH key isn't part of the environment; it's configuration. Treat it as such.
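(One common alternative - a sketch, not anything Heroku-specific, with hypothetical paths - is to keep secrets in a permission-restricted file and refuse to start if the file is readable by others, instead of leaving them in the environment where any spawned process can dump them:)

```python
import os
import stat

def load_secret(path):
    """Read a secret from a file, refusing group- or world-readable files."""
    mode = os.stat(path).st_mode
    if mode & (stat.S_IRGRP | stat.S_IROTH):
        raise PermissionError("%s is readable by others; chmod 600 it" % path)
    with open(path) as f:
        return f.read().strip()

# Usage sketch (paths are hypothetical):
#   os.chmod("/etc/myapp/ssh_key", 0o600)
#   key = load_secret("/etc/myapp/ssh_key")
```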


Not sure that would have helped much in this case - he used dumping the environment as the particular avenue to get the "private" data, but he mentioned he had source code access and the ability to run untrusted code. If important credentials were in or available to the code, it sounds like they would have been vulnerable anyway.

It's a hard problem: securing credentials that code needs in order to work, from other code running as the same user, when someone has source code access to the "authorised" code.


Well, someone could easily have discovered the environment variable vuln without getting the source code, although it is unclear why so much was disclosed in env vars. It suggests that everything runs as the same user, or you would not get anything.


Thank you for taking the high road and exposing the issue rather than giving them the option to hide it (not saying they would).


This is the concept of Full Disclosure. http://en.wikipedia.org/wiki/Full_disclosure


The page is well composed.



