
Responsible disclosure would have been an email to product-security@apple.com. Does Apple have a bug-bounty program?



"Responsible Disclosure" is an Orwellian term concocted by vendors to control the actions of independent vulnerability researchers who work without real compensation, using information freely available to consumers, in competition with malicious attackers.

The term you're looking for is "Coordinated Disclosure". Yes, Coordinated Disclosure would involve sending the bug to Apple and waiting for them to publish it.

If you'd like to complain that this disclosure is irresponsible, fine. But try not to do it using the vendor's marketing term, because it's not up to them to decide what is and isn't "responsible". Other reasonable people --- myself included --- will probably disagree with you, and say that getting information out to people as comprehensively as possible is usually the most responsible thing you can do with a security bug.


I don't like the term you made up. But fine, I think this is "Irresponsible Disclosure." Is that better? Did anything change?

Vendors and non-vendors alike are all responsible for good security, and that includes working together to make this happen. If you are working against vendors because of some preconceived notion that they are "evil," that's not a good thing.

If it turns out that the author did submit to the vendor and worked together to minimize damage then I'll retract my statement. Until then I think it's irresponsible, not just "uncoordinated".


The problem is that allowing the vendor to define what is responsible, which these days seems to be expanding into giving them unlimited time to fix it, amounts to letting them take unlimited time to fix it.

Cooperation or even coordination takes willingness from both parties. Let's look at the actual page Apple has on reporting security issues [0]:

"When we receive your email, we send an automatic email as acknowledgment. If you do not get this email, please check the email address and send again. We will respond with additional emails if we need further information to investigate a security issue."

Something seems a bit off here. I would have expected a human to get back to you within a few working days for a serious security problem. That might be stated in the auto-response email, but I wouldn't be surprised if it weren't.

"For the protection of our customers, Apple generally does not disclose, discuss, or confirm security issues until a full investigation is complete and any necessary patches or releases are available."

Does this extend to the security researcher that reports the vulnerability? If so, that's probably why there was no coordination.

[0] https://support.apple.com/en-us/HT201220

Edit: removed spelling mistake.


Since when did the term "responsible disclosure" mean allowing the vendor unlimited time to fix it?


When Microsoft decided they needed more than 90 days to release a patch.

https://bugs.chromium.org/p/project-zero/issues/detail?id=10...

I'd say 30 days is enough. Google was generous with ninety. (They too live in a glass house after all).


Not everything is a web app that can be patched in 5 minutes and doesn't need to run in one hundred million different environments.


If it's not unlimited, what's the limit? Apparently a month isn't long enough.


They admitted they never contacted Apple product security, which means they never notified Apple to begin with. That month you see at the top of the writeup appears to be how long they waited for ZDI before deciding to publish, not how long they waited for Apple to fix it.


So what? They owe Apple nothing. They owe you nothing.

Unless you are taking requests from random HN commenters for software you'd like to build for them for free, I suggest you rethink asking highly skilled researchers to donate charity labor to the largest corporation in the world.


This has nothing to do with owing Apple anything and instead has to do with not intentionally compromising the security of millions of innocent people around the world.

And before you interpret this to mean never disclosing publicly, that’s not what I’m saying. But no matter what your opinion is on the best way to handle disclosure, releasing a 0day without any attempt whatsoever to notify the vendor is highly irresponsible and immoral.


No, it isn't.


As a Mac user, I feel it’s irresponsible. I don’t want zero days published before Apple has a chance to fix.

I also think that the vendor has a responsibility to fix the exploit quickly, and if not the researcher should publish and shame the vendor.


Is 3 years considered quickly enough? How about 3 years for a remotely exploitable problem? According to the Telegraph (http://www.telegraph.co.uk/technology/apple/8912714/Apple-iT...), "Apple was informed about the relevant flaw in iTunes in 2008, according to Brian Krebs, a security writer, but did not patch the software until earlier this month [Nov 2011], a delay of more than three years."

It seems to me that nobody but Apple has a responsibility to its users. The public at large certainly doesn't owe Apple (or any other software proprietor) specific performance regardless of whether they report what they've found publicly or when.

Apple is also not being nice to its users by denying them software freedom: most of MacOS is proprietary and the aforementioned bug concerned iTunes, a proprietary media player. So no matter how technically savvy and willing the user is, they're not allowed to diagnose and fix the problem, prepare a fixed copy of the changed files, and help their community by sharing copies of the improved code.

"Responsible disclosure" is indeed propaganda that benefits the proprietor in a clumsy attempt to divert blame for a product people paid for with their software freedom as well as their money.


The author has commented above. It seems Apple was aware of this issue before the author published it. I wouldn't put any blame on the author at all.


> I don’t want zero days published before Apple has a chance to fix.

Because you think you are safe until publication?

What kind of "if I don't know about it, it isn't happening" worldview is that?


By framing it that way you imply, without explicitly stating it, that "coordinated disclosure" means "unlimited time", but that's not the time frame under discussion.

I consider 24 hours' notice the bare minimum for responsible disclosure, and 1 business day in the company's operating timezone a polite courtesy to the human beings who have to respond to uncoordinated security disclosures with emergency builds of their product.

What do you consider the bare minimum notice sufficient to respect the human beings who use the products we find vulns in? One business day? One hour? Zero seconds?

(Siguza, I'd also love to hear from you on this question, if you're willing to share. I know Apple said they don't need advance notice but if they hadn't, and offered no guidance, what would you have chosen?)


> I consider 24 hours notice bare minimum responsible disclosure

You can't possibly be serious? Have I fallen for some trolling here?!


That would be technically impossible, since you had no prior participation in this thread. I would have happily answered questions about my choice, but if your only question is “r u trolln” then there really is very little to say.

Rabble-rouse all you like, but unless you respond with whatever your personal bare minimum delay is, you risk being perceived as the troll in this exchange.


Given...

> I consider 24 hours notice bare minimum responsible disclosure

...it seems rather unfair of you to have a go at my reaction. But somewhat incredibly, it appears you are serious.

I don't have a bare minimum delay. I think the vulnerability discoverer should coordinate a 'sensible' and 'fair' disclosure with the vendor. What 'sensible' and 'fair' mean really depends: how serious is the vulnerability? How many systems are affected? How quickly can the vendor patch, test and document a fix? How quickly can the fix be distributed?

It's a stretch to imagine a scenario where 24 hours is in any way sensible, fair or responsible. I'd be intrigued to know your reasoning.


Their reasoning probably rests on the word 'minimum', which you are typing but also completely ignoring.


> in addition to the spelling of acknowledgement

"acknowledgment" is the US English form: https://en.oxforddictionaries.com/definition/acknowledgement


This made me think of judgement/judgment, so I looked that up too. It's apparently mixed enough everywhere that there's no clean regional split. It always makes me pause to think about which form is correct whenever I have to write it.


Thanks. I thought I was aware of most differences, but clearly there is always more to learn with respect to the differences between the two localizations.


> and that includes working together to make this happen

It also includes disclosure if the vendor drags their feet and does nothing, which is a very common response.

I have no preconceived notion that vendors are "evil," but I sure as hell have a preconceived notion that they're as lazy as they think they can get away with.


I think "irresponsible disclosure" would involve disclosing it to black hats for money.


Well, it has been disclosed to black hats - just not for money. To end users the result is the same; black hats have an unpatched 0day to play with, and we have no mitigations to deploy.


The end result is not the same. You know about the bug as well, whereas if the bug + exploit was sold to black hats, they can use it without your awareness.

This would be a less serious problem if vendors pushed out fixes faster.


Note that 99.99% or more of end users remain unaware of this bug, regardless of this HN post about it.


Well, we do have mitigations. They're not good ones, but they're mitigations:

- Be more paranoid about allowing r/w direct access to your computer.

- Be prepared to power off or otherwise halt your machine if (on 10.13.2) you see unexpected logouts or similar.

- Safeguard your data and/or consider moving it off of the machine or not using it in some situations.

None of those are great things to rely on or to have to do. A real working patch or detection mechanism would definitely be better. But that's not the same as "no mitigations" whatsoever.
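
(If it helps to make the "which build am I on" part of that concrete, here is a minimal Swift sketch for checking whether a machine is still on a 10.13.x build. Treating the whole 10.13 line as potentially affected is my assumption, not something Apple has confirmed.)

    import Foundation

    // Hedged sketch: print the running macOS version so you know whether the
    // "be extra careful about local access" advice above applies to this machine.
    // Assumption (mine, not Apple's): any unpatched 10.13.x build counts as affected.
    let v = ProcessInfo.processInfo.operatingSystemVersion
    if v.majorVersion == 10 && v.minorVersion == 13 {
        print("Running macOS 10.13.\(v.patchVersion); treat untrusted local code with extra caution until a patch ships.")
    } else {
        print("Running macOS \(v.majorVersion).\(v.minorVersion).\(v.patchVersion)")
    }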


It's not "preconceived": vendors pay very low bounties, not even covering the time spent on one issue (and none at all for macOS, I think), and the reason researchers work with them is more the drawbacks of going black hat than the rewards of being a white hat.


Why should a consumer be left in the dark while malicious attackers may discover (or may already have discovered) the same hack? The consumer should know right away that the system is not secure and take action (like not using it any more until a patch is released). That would be responsible.

If you knew something was dangerous, would you leave your family and friends in the dark just to give the company time to fix it?


Just to back you up: representatives of the Chaos Computer Club agree with you on that, and also agree that no vendor has a right to coordinated disclosure.

But they also argue that you should at least try to coordinate disclosure the first time, and only if a vendor doesn't cooperate should you publish future bugs directly. They also suggested coordinating disclosure at least with the club, so that it is handled via the club's official press communications and they can offer legal protection too (very often, vendors will simply sue the researcher).

This information is from a recording of the 34C3 year in review and PCWahl talks.


You're correct, but I think it's worth going into a bit more detail about some of the tradeoffs involved in the concept of "responsibility" as it applies to security research and exploit discovery, because there are a few clashing objectives that shape what the right choice is in each case if we're trying to minimize harm to all parties (if someone just wants to sell to the highest bidder, none of this applies). On the one hand, a vendor-provided, well-tested, full patch is of course the ultimately desired permanent fix.

But what I think many people forget when they get "responsible disclosure" in their minds is that there are often band-aids users can apply to protect themselves immediately, regardless of whether a patch is ready, so long as they know about the issue. And since it's always possible, and generally unknowable, that someone else has already found the exploit and is using it, there is an extra, hard-to-calculate risk. Releasing it without a patch may lead to some users getting exploited, but it could also protect some users from being exploited, or at least allow them to minimize the harm. Once it's known in the wider community, it is also easier to check whether it has been selectively deployed anywhere. The lag time between notification and vendor patching is itself a risk (and of course there is lots of room for perverse incentives in all of this).

So the real core issue with Coordinated Disclosure is that there is not in fact a Right Answer in general, any choice may help one group at the expense of another. Many researchers and organizations try to split the difference with standardized policies that seem to strike the balance, perhaps with occasional exceptions if it's serious enough. But ultimately it really is up to the discoverer and it's wrong to insist they conform to what the vendor finds desirable, particularly since ultimately the responsibility for the blunder lies with the vendor. It's a hard area and researchers should be respected for the work they do on their own terms.


tptacek's position is well thought out and defended, by himself and others, and generally includes what you are covering here when expanded in detail. I feel for him, because he's put in the position of being the mouthpiece for this all too often and it must get tiring repeating himself. Sillysaurus did a summary of his prior comments on this a while back and added some additional resources[1].

I have found myself in total agreement in one instance, only for a big exploit to come out a week or two later that made me really wish some patches had made it out first.

What I think it comes down to is that any vendor needs to assume the exploit can come out any minute after notification, and act accordingly (if it's important, they better damn well get it patched quick). Any researcher should assume that if they act like an asshole and aren't accommodating in some way, they'll get raked over the coals by at least some of the technical public. As tptacek noted, coordination is best, and that requires a dialogue.

Also worth noting is that sometimes there is no patch. Some security problems are of a degree where the entire process is fundamentally flawed, and in those cases there's little to be gained by waiting for the vendor, unless the vendor is working to notify all clients and recommend they cease use of the affected service or product. If, for example, you identify a flaw in how a protocol is defined, and almost all implementations are flawed, the only responsible thing to do might be to publish publicly. Otherwise you're just favoring some groups over others in some way or another.

1: https://news.ycombinator.com/item?id=14010010


So a disclosure can be irresponsible but don’t call it an irresponsible disclosure?


The "responsible" disclosure term is a pretty devious piece of marketing, because it creates situations such as this very thread.

Its use carries an implicit catch that anything that does not meet the narrow definitions of "responsible" is the opposite. Without naming names there are vendors that have been pretty terrible at handling their end of "responsible" disclosure and appear to be getting worse, down to not even acknowledging there is a problem or even that they have been notified of a problem.

The alternative to disclosing a vulnerability is non-disclosure and, frankly, that's what some vendors mean when they say "responsible".


Coordinated disclosure information is here, and edits/suggestions/feedback are welcome:

http://github.com/joelparkerhenderson/coordinated_disclosure


This looks like the usual greyhat posturing to me. He could have gone with "coordinated disclosure" and talked to Apple. Or he could have sold a local vulnerability for whatever that fetches on the black market. But he thought that boasting about it on the tech web would benefit his personal brand more than either of those routes, so this happened.


So what? He found the vulnerability, he wrote the exploit, he gets to decide how it is "disclosed", and anyone else's opinion of his choice is irrelevant.


He did talk to Apple. They didn't coordinate with him.


I don't understand why Apple doesn't have a well-funded bug bounty program. You would think that companies would welcome people finding bugs in their software. Hell, they could give away free MacBook Pro laptops, phones, and iPads along with CASH!!!


I agree. Considering that a dev-spec laptop would be a nice reward, it's weird that they don't leverage their own products as rewards.

A high-spec MBP plus some cash goes a long way. They could even engrave the laptop to make it "sought after".


They do have a well-funded and well-publicized bounty program for security exploits.


Apple's bug bounty program does not cover macOS.


Would you agree that if it were really well known and paid hackers a fair amount, people would not end up throwing this stuff online (even on Twitter), where their PR team is the only one that gets affected?


... for iOS


"...turned out Apple PR channel is much more responsive than product security [...] No wonder nowadays people just throw security issues on Twitter right? What a world we live in."

https://medium.com/@khaost/your-home-was-not-so-secure-after...


They don't for macOS. And the iOS one is invite-only.


> They don't for macOS. And the iOS one is invite-only.

Which may, unfortunately, speak to what management thinks about the security/quality of the macOS codebase.

I wouldn't be surprised if the recent and upcoming exploits led Apple to increase iOS's dominance over its future product pipeline.


Apple's financial reports and behavior the last few years make it clear their priorities are not with the Mac, but with iOS. For better or worse, Apple is a smartphone company that happens to make computers on the side.


It's not even an electronics company anymore; it's a luxury goods company.


> For better or worse, Apple is a smartphone company that happens to make computers on the side.

Apple made computers and operating systems long before it entered the mobile market. The first part is right, but your second part is historically inaccurate.


Yeah but to phrase it pointedly, to what degree does Apple consider Macs to be basically development kits for iOS apps? It's very clear that iPhones make the big cash.


This is stupid though. macOS is the operating system for its flagship desktop/laptop devices. The lack of a bug bounty for macOS (even an invite-only one) seems really bizarre.


Maybe if more people would disclose such vulnerabilities “irresponsibly”, vendors would develop their software more responsibly. Just my 2 cents ¯\_(ツ)_/¯


Unlikely. Security is hard, and a complex product like an OS and all its supporting programs creates a ton of attack surface.


Comments like these really rub me the wrong way. If it were some other company, one that actually has such a program or isn't as evil and uncooperative in so many respects as Apple, I could even agree with the sentiment (though not the wording). But as noted in the comments, they do fucking not for macOS. As noted in other comments, they knew about this bug for a while now anyway, and it requires local code execution first to do anything.

They deny problems until a shit tornado actually starts somewhere, and they absolutely love to control the narrative. They are clearly PR first: they lobby and fight against right-to-repair efforts because they want to "guarantee the quality of authorized repair", sell expensive proprietary software on proprietary hardware (ostensibly to provide absolute perfection by controlling the entire stack), have slogans like "it just works", "light years ahead", "touch of genius", "say hello to the future", and care enough about current social outrages (USA-specific ones, that is) to do truly silly crap like the water pistol emoji or removing a historic game that featured a Confederate flag.

They are also secretive as hell (e.g. zero social media presence/interaction, YouTube comments turned off on their channel to prevent criticism, engineers under stricter NDAs than at Google, Microsoft, etc.), instantly fired an engineer over a mere iPhone X video her daughter filmed on the Cupertino campus, and actually sent police to raid the house of a Gizmodo reporter and confiscate his stuff (which apparently breaks journalist protection laws, both federal and state, but oh well, it's the USA and Apple, money talks, bullshit walks) after he wrote about a leaked iPhone 4G prototype. They ran idiotic, misinforming (but funny, so apparently it's okay!) ads in the past like the "Macs don't get viruses" one, yet still managed to ship bugs like the infamous "goto fail" or "password stored in the hint field" (which are frankly insane to me).

If they cared, they could at least silently toss a few thousand dollars per big bug, offer sentimental/bragging-rights rewards (like Knuth's cheques), and so on, but they choose not to.

They should be thankful that, between "sell 0days on the dark net", "do what Apple says for free" and "post online for cred", actual security hackers pick the last and not the first (I mean, I also would, on the principle of not being a criminal, but there is clearly something wrong with a company's image if we have to fall back on a person's moral compass or even the criminal justice system for any of these choices).

To add to the problem a lot of their fans are outright rabid, instantly plunder the Apple stores at each product release and criticizing Apple anywhere near them online results in being called a Microsoft/Android shill, hater, poor, fucker, faggot and such, and Apple's PR fluff being regurgitated at you.

If someone lives lavishly in a multi-billion ivory tower that also has a diamond mine under it while surrounded by their cultists and won't even toss you leftovers when you help them out, what do they expect?

They are a crazily rich, global, long-established company, and they should stop being excused. They are the world pinnacle of the technology business in every sense, and if they can't deliver they deserve the criticism. The poor guys working for free on OpenSSL, for a bazillion platforms, for single-digit thousands in donations per year (and zero compensation from all the corporate freeloaders in the Fortune 500), got torn a new asshole the size of La Manche over Heartbleed. Yet Apple keeps getting excused for not being able to get their shit together (by their own choice, like not having a bug bounty after all the bugs that happened in 2017), while they enjoy a reputation as saints and geniuses and control everything down to the tiniest detail: shops, repairs, hardware, software, components and information.


That would have been withholding, not disclosure. You need to disclose the vulnerability to those who are vulnerable for it to be disclosure, and nothing else is responsible.


No mention of disclosure, reporting, or CVE in the entire article :-(


That's why they're calling it a 0day, because they haven't done any of those things.



Why do that when you can wreck a security engineer’s vacation?


Can a company the size of Apple not afford 24x7 security resources? For their installed user-base, I don't think this is unreasonable. Security doesn't have a holiday.


Why would your security engineer take CVE-2008-0001 as a holiday?


On the other hand, it could teach a good lesson on technical debt?


I would claim there is a very high likelihood that the person who has to work all night on New Year's Eve to fix this is not the same person who prioritizes tech-debt payoff vs. new features.



