Hacker News
Apple Snubs Firm That Discovered Mac Botnet (forbes.com/sites/andygreenberg)
131 points by VuongN on April 10, 2012 | 87 comments



Good job by Dr. Web on finding this and trying to do the right thing, but these quotes give a different context than you'd get from the article title and lede:

"Sharov believes that Apple’s attempt to shut down its monitoring server was an honest mistake."

"In Apple’s defense, it may not have recognized Dr. Web as a credible security firm when the company contacted Apple earlier this month–I hadn’t heard of the firm either until its discovery and analysis of the Flashback botnet."

It looks like Apple wasn't the only one surprised by this:

"But the better-known security firm Kaspersky confirmed Dr. Web’s findings on Friday. A Kaspersky representative said it hadn’t contacted Apple with its findings and hadn’t had any direct communication with the company, and Kaspersky researcher Kurt Baumgartner wrote in a statement that 'from what we’ve seen, Apple is taking appropriate action by working with the larger internet security community to shut down the Flashfake [also known as Flashback] C2 domains. Apple works vigorously to protect its brand and wants to rectify this.'"


Oh, Kaspersky couldn't possibly be surprised by the existence of Dr. Web. The company is well established and has a good market share in the former USSR. I remember using their products back in 1999 or so.


Yeah, Dr. Web is very well-known in Russia, and among the wider security community. They have some very talented people working for them, and, as far as I know, Dr. Web is still one of the only AVs that can disinfect the Rustock.C rootkit/virus successfully.


I've been in the security industry for about 10 years, and never heard of them. Most of my peers had the same reaction - "Who?"


The first quote is from the forbes writer and not Kaspersky.


Huxley seemed to be using the quote to suggest that Kaspersky didn't know of them.


Either they didn't mean to or they are mistaken.


That kind of "we didn't know" will only get you so far in this day and age, when one second on Google will get you a Wikipedia page on Dr. Web saying they've been around since 1992.

The world is bigger than the firms that are big in America, and that kind of America-centric thinking also won't go down well on the web.


But once you dig down, it becomes a less rosy picture for Dr. Web. The citation for the 1992 date doesn't actually appear to contain that information. It's a bunch of rather poorly written (perhaps by non-native English speakers) reviews that don't give much useful information about the company or product. Secondly, the website that posted all the test results has not had any activity for over a year.

Their download.com cleaner software has not been updated since 2010.

Finally, a Google search comes up with their main website, their Wikipedia page, news stories about their discovery, and several other links to sites that feel like antivirus phishing pages rather than legitimate links.

All-in-all, there isn't enough content to really judge them as absolutely trustworthy. It's really a brand recognition issue in a space where trust is absolutely key in evaluating claims. Kaspersky has the brand and trust worldwide. Dr. Web appears to have it in Russia, but not nearly as much outside of the country.


It's not American-centric thinking, it's Apple being Apple.


American centric? You realize Kaspersky is Russian as well?


Everybody at Kaspersky knows who Dr. Web is. It's a household name in Russia.

So it's probably the article's honest mistake if you infer otherwise from it.


I am surprised that the author of the article wasn't aware of Dr. Web! I had the impression that any self-respecting security expert would be aware of the "Russian big two". Dr. Web is very well known, especially for its powerful malware detection and removal engine. I always keep Dr. Web CureIt handy on USB sticks.


How long will it take to change the mindset at Apple to think about security before shipping? Microsoft did their job years ago, now Apple has to follow.

How long will it take for Mac users to learn that viruses are indeed a threat on all kinds of computers, not just PCs? I can only hope Apple will take a more active role in educating them.


... they have. They don't ship OS X with Java anymore. The only exploit this attack campaign is using is a Java exploit. They install Java when the user attempts to run something that needs it.

The next best thing they'll have to do is, when they install it, disable the web plugin in Safari by default or add an interstitial that prompts the user for it to run.

One 500k node botnet really is not that large in the grand scheme of things.


I think "they have" is a bit overstated, unless you just mean they've shown improvement in the last few years. Things like functional ASLR and more use of Seatbelt in Lion are definitely headed in the right direction; however, there seem to be major gaps in testing and validating security--even basic stuff like continuous fuzzing and automated security testing. Also, their response and turnaround time is generally worse than Microsoft's (close to a year has not been uncommon). I can understand Microsoft's slowness due to their massive compatibility matrix and enterprise customer base--but Apple doesn't have that excuse. Apple has also historically been ignored as a mass malware target, so maybe a 500k node botnet is a helpful wakeup call.

And just to be clear, I'll admit that I'm not a disinterested party here. The Chrome security team carries the bulk of the WebKit security workload (fuzzing, auditing, fixing, etc.). That consumes a tremendous amount of my team's time, and prevents us from focusing more on Chrome. So, I'd definitely appreciate it if Apple were significantly more proactive about security.


Why does Apple need to fix security bugs in Webkit? Are they being exploited somewhere by attackers I'm not familiar with? It doesn't seem like there is a problem to solve.

I'd venture to say that Apple is more proactive than most other vendors about security because they look at the forest and not just the individual trees.


I talked to Alex briefly at CanSec about your strategy. I appreciate the idea of setting a minimum bar, but it can start to sound like selling feel-good security. It's easy to say in hindsight "all you had to fix was these ten bugs and ignore the rest." The truth, however, is that you can't reliably predict that in advance. History has shown that even the low-end mass malware moves to more advanced techniques as targets become harder. And, of course, the strategy you're proposing does nothing for those at risk from targeted attacks (which is something I'm professionally seeing more and more of).


Actually, history does not show that low-end mass malware moves to more advanced techniques and you CAN predict the vulns they'll target in advance, as long as you're familiar with their motivations, capabilities, and incentives. http://www.trailofbits.com/research/#eip

Again, show me an actual attack that has exploited Safari. Ever. Targeted, mass malware, I don't care. Apple has better shit to worry about and their investment in Seatbelt was worth 1000x more than individually fixing the limitless supply of bugs in Webkit. Problem solved, move to next actual issue.


You can scope the discussion to mass malware on desktop Safari, but that's just reductio ad absurdum. Any Fortune 500 company or government has to be concerned with real attacks, not just tamping down the noise floor. And as for Apple, they've historically been more interested in protecting their manufacturer subsidy by stopping iPhone jailbreaks--in which WebKit exploits against Safari have played a key role.


Wow. So you're not exactly unbiased, are you?

At what point were you forced to take on the responsibility for fixing those bugs? It is an open source project, and Apple may feel that their best contributions are in adding new features or fixing rendering bugs.


That's pretty much my point. Apple does actively contribute to WebKit, but they're not very active on the security front. I've gotten a similar feeling when dealing with security reports we've made on proprietary Apple components (core libraries, etc.). Based on those experiences I assume they feel their resources are better invested elsewhere, which is absolutely their prerogative. Of course, that perspective also strikes me as something any potential Apple customer should consider.


I don't understand what the difference is between having java default-installed and installing java on demand. Both methods run a jar when it shows up, right?


Not for 85% of Mac users. They'll use Mac App Store and Facebook, which are both Java-free.


In that case, it still doesn't make a difference whether Java is installed or not. Dylan's point is that the only difference between having Java installed by default and having Java installed on first use is launch time the first time you run a Java program.


Default installed means there's a java web plugin floating around to exploit, for starters.


They have? This is like making a house burglar-proof by bricking the doors. You just cut a small hole when you want to get out... and why would you want to get out, it's so nice inside!


Well, according to the article, they did not. Here's the quote:

"The bug was patched by Oracle in February, but Apple didn’t fix the flaw until earlier this month. “Their response should have been much earlier when they should have updated their Java,” says Sharov"


What do you think Microsoft has done that Apple has not?


Microsoft had its security Waterloo 6-8 years ago, and they have dramatically improved and changed their development practices since then. I once read that they have a whole building of engineers dedicated to security (sorry, I cannot find the link anymore).

Reports involving security breaches of Apple products often show Apple as someone who feels wronged and would rather try to kill the messenger than fix the problem. To their credit, they appear to be looking for fixes that cause the least harm to their users. But to me they appear to be operating on very limited manpower/budget for this so they have to prioritize.

An example: Apple didn't fix a flaw on Windows iTunes for 3 years because it wasn't a priority, and intelligence software companies sold software based on this hole - imagine if Microsoft did this:

http://www.telegraph.co.uk/technology/apple/8912714/Apple-iT...


2002 was the start of Microsoft's major security rethink, prompted in part by exploits like the Code Red worm. For a short period it wasn't a building of engineers, but the whole engineering staff that stopped development and focused on security code reviews and bug fixes[1]. One of the outcomes was the Security Development Lifecycle[2].

[1] http://www.windowsitpro.com/article/security/microsoft-repor...

[2] http://www.microsoft.com/security/sdl/default.aspx


You dodged the question.

You haven't listed a single thing that Microsoft has done that Apple has not.

You are right that Microsoft started to take security seriously before Apple did, but Apple is no longer behind.

http://www.theregister.co.uk/2011/07/21/mac_os_x_lion_securi...


Attempting to revert the PC revolution into centrally controlled display terminals is "pretty fucking behind". Microsoft attempted to do this a decade ago and was rightfully called out for it, but it seems most of the Apple community is happy to not own the devices they rely on. Apparently Macs really are not meant to be Personal Computers.


Are you sure you're not talking about Chrome OS?

Also, what's your proposal for establishing the trustworthiness of code?


Although what I said could be applied there as well, given that I didn't even attempt to reference it, I'm pretty sure. Sorry, arbitrary dichotomies are only fun for ultimately inconsequential things (professional sports, for instance).


It's not an arbitrary dichotomy, it's a factual one. Mac OS doesn't resemble a centrally controlled 'display terminal' at all, whereas Chrome OS does.

You also didn't answer the part about trusted code.


Mac OS hasn't reached the point where one needs to jump through hoops to install arbitrary programs, but given that the beginnings of such a system are present in Lion, and that it's the status quo on all of Apple's more recent devices (and one of their main selling points), one can only conclude that's the direction they will take consumer-oriented Mac OS in.

> You also didn't answer the part about trusted code.

Because it wasn't there when I replied.

Centrally signed code repositories on their own aren't the worst thing; I rely on them myself (apt-get). The problem arises when it's fixed to one possible repository. If the proverbial "average user" cannot install and administer a friend-approved but Apple-unapproved app with nearly the same ease as an Apple-approved one, we end up with a situation where Apple directly controls what average users are capable of. It also causes users who wish to own their device to hope for new exploits to be publicly discovered, which is utterly backwards. I understand that progress toward actual security (isolation, capabilities, proper deputies, etc.) takes significant work, but coarse-grained whitelists aren't the answer.


I agree with your first point, but I disagree that it makes Mac OS into a 'display terminal'. Particularly since although hoops are being introduced for good reason, there is no evidence that Mac OS X will be closed in the way iOS is.

Your second point as I've indicated, is just conjecture. I don't see good evidence for it actually happening. Certainly nothing that Apple has publicly disclosed suggests that it will. It seems pretty unlikely to me.

But yes, if they did go that far, they would indeed control what average users were capable of.

If it ever gets to that point, I'd hope that by that time, there would be some obvious killer apps for another OS to demonstrate why it was a problem.

[edit: Coarse grained whitelists might not be the answer, but I highly doubt that Apple is going to stop there. Every OS release is a step along the way. It's worth noting that iOS has generally developed in the direction of providing more capabilities to programmers, rather than less over time.]


I've actually thought about this a bit, and it's not a "killer app" that will be missing from locked down systems, as the repository can always add anything that becomes popular elsewhere (after a little delay from porting/approval/etc).

The difference starts at the foundation, and manifests in a pervasive lack of respect for the user (whose ultimate control and understanding should be a prime usability concern).

For instance, that whole device-ID brouhaha: iOS apps really get a unique device ID, which they are then supposed to partially obscure according to Apple's guidelines? Why in the world is an app allowed to directly query a fixed identifier in the first place?! There should be a specific ID API that the user controls via a system dialog, the same way a user controls how long a browser stores cookies. Sandboxing plus auditing then make sure apps aren't using something like the Ethernet address to get around the user's choice.
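To make the idea concrete, here is a minimal sketch of what such a user-controlled ID API could look like: derive each app's identifier from a random, user-resettable secret instead of a fixed hardware ID. All names here (the functions, the bundle IDs) are hypothetical illustrations, not any real Apple API.

```python
import hashlib
import secrets

def make_user_secret() -> bytes:
    # A random secret the user can reset at any time, like clearing cookies.
    return secrets.token_bytes(32)

def app_identifier(user_secret: bytes, app_bundle_id: str) -> str:
    # Each app sees a stable ID derived from (secret, app), never the
    # underlying hardware identifier.
    return hashlib.sha256(user_secret + app_bundle_id.encode()).hexdigest()

secret = make_user_secret()
id_a = app_identifier(secret, "com.example.appA")
id_b = app_identifier(secret, "com.example.appB")
print(id_a != id_b)  # True: two apps cannot correlate the same user
# Resetting the secret invalidates every derived ID at once:
print(app_identifier(make_user_secret(), "com.example.appA") != id_a)  # True
```

This mirrors the cookie analogy in the comment: the identifier is stable only for as long as the user chooses to keep the secret.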

But unfortunately, most of the developers who actually know enough to analyze this are on the take of the ad companies and think that their stake in the user's device is equivalent to, or even overrides, the owner's! So Apple kowtows to the advertisers and permits uncontrollable tracking while the end users are stuck with their only choice being 'use or not use' an app based on how much they perceive it abusing them. Instead of being introduced to a world full of self determination and limitless possibilities (as early computer adopters were), modern day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.

(And yes, Android has most of these same problems in addition to some of its own, which is why I said dichotomies aren't useful.)


Instead of being introduced to a world full of self determination and limitless possibilities (as early computer adopters were), modern day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.

I couldn't agree more with this. However that dream seemed to die with the breakup of Alan Kay's original group. Nobody is even approaching this problem except perhaps Kay's own FONC group, and even that seems to be more academic than practical now.

That said, I think that as digital culture matures as more generations grow up with digital creation, programmability will become the primary constraint, and then we might see progress in this area. If Apple doesn't keep up (although I expect they will), this is the domain I expect the killer app to emerge from.

I'm not sure why you bring up the device-id thing. Apple corrected that issue without external pressure. Also, in the real world, I think that expecting end-users to manage a second cookie-like entity with subtly different semantics to cookies is unrealistic.


It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.

A system that's built on a philosophy of eliminating capabilities can never progress into a system that allows a user to gradually learn more and empower themselves, as there's nothing "further down" that unifies the whole thing. Software that starts off requiring significant effort to administer can progress into having a user-friendly interface and be incorporated into systems with sensible defaults.

One shouldn't require a user to have to configure everything out of the gate (say, cookie policy), but one shouldn't prevent them from doing things they know they want. Wasn't the Apple device-id thing "fixed" by only allowing tracking on a per-app basis? With cookies, I can have them deleted every time I close the page.


It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.

Setting the straw-men aside, which systems did you have in mind?


Every mainstream operating system has unique IDs readily available to applications - it's just that native applications don't traditionally include ads.


Please read my answer again, and you will find differences between Microsoft's and Apple's approach to security that answer your question.


You didn't answer my question. You've offered a vague opinion about appearances.

If you had answered my question you would be able to pick out at least one thing that Microsoft had done that Apple had not.

You said: Microsoft did their job years ago, now Apple has to follow.

This is simply not supported by the facts.


Especially the fact that Apple's OS security has been solid all through the past two decades, with very few exploits, while Microsoft's has been wide open during that period, and Microsoft deliberately chose not to close the holes because they were there for "marketing initiatives".

On Windows a massive malware-detection industry has sprung up, and still there are millions upon millions of zombie PCs out there. Meanwhile, despite no such industry on OS X, there are no reports of infections in the wild (from viruses, as the original commenter claims; this article is about a trojan, which is also a lot more prevalent on Windows than on the Mac).

But don't let these facts get in the way; remember, the point of this thread specifically, and a big part of the draw of Hacker News, is that you can bash Apple and get upvoted by other Apple haters. Facts are not relevant.


the easier question to ask is what HASN'T Microsoft done?

quick example: they paid millions, possibly tens of millions, for the best people from industry and academia to build automatic bug-finding tools. these tools currently define the state of the art for finding bugs in applications.

they also have a community outreach program where they will work with companies and individuals that report flaws. that same outreach program will work internally with the relevant product groups to get flaws fixed.

oh and also they invest constantly in improvements to their toolchain and operating system runtime to make exploitable bugs harder (safeSEH, ASLR, DEP, GS cookies, EMET, encoded pointers, safe-unlinked heaps both in user and kernel mode, etc).

and yet, you realize, there are still exploitable bugs. in my opinion you should regard this as the fundamental instability of system software written in C. if microsoft can't get that right (in terms of security and stability), after all that they've invested, who can?


1. Where are those bug searching tools used?

2. I'd agree that the outreach program is something that Apple clearly hasn't done.

3. Apple clearly is doing this too.

HP's work on secret agents in the 90's shows that you can't prove code to be trustworthy. You can only assign trust to the intentions of the originator.

Therefore, the most significant thing you can do to improve security is to verify the provenance of trusted code and the isolation of untrusted code.

Windows clearly has decent technical code-signing infrastructure, but Apple seems far ahead in terms of effectively deploying this model into the field.


"1. Where are those bug searching tools used?"

Microsoft has some good static code analysis tools ( http://msdn.microsoft.com/en-us/gg712340 ), for a start.

"3. Apple clearly is doing this too."

They seem more concerned about not letting users jailbreak their devices than anything else.

"Windows clearly has decent technical code-signing infrastructure, but Apple seems far ahead in terms of effectively deploying this model into the field."

"Far ahead"? How do you justify that claim? Most charitably, it seems that both companies work on security.

Of the two, Microsoft seems to have better documentation, better openness, and better tools.


The fact that jailbreaking works by taking advantage of OS exploits to gain root access, and in fact represents a glaringly obvious, popular, and easy-to-use vulnerability, has nothing to do with Apple fixing it quickly?


By #3 I was referring to the parent's 3rd comment about the focus on improving the toolchain and runtime. I doubt that you're serious about the jailbreak comment.

By far ahead, I was referring to the deployment of code signing technology.

I think it's pretty clear that although Microsoft has solid code signing technology, they are much further behind in promoting effective use of it in the field.


"Microsoft has some good static code analysis tools ( http://msdn.microsoft.com/en-us/gg712340 ), for a start."

The compound word 'PreOrder' exists as a discrete term. ... case it as 'Preorder' or strip the first token entirely if it represents any sort of Hungarian notation.

Aw shit, Apple is gonna get fucked by all those remotely exploitable Hungarian notation bugs.


> They seem more concerned about not letting users jailbreak their devices than anything else.

Ironic that you would use a feature that excludes malware from running but pisses off android fans, as an attempt to claim that Apple is not working on security!


these tools currently define state of the art for finding bugs in applications.

The stuff in Visual Studio is so far behind the state of the art I don't even know where to begin. But let's ignore that and recalibrate by asking this. What do you think the not quite state of the art second best automatic bug finding tool is?


Releasing security fixes in a timely manner, for one.

This Java exploit was fixed long before Apple so graciously bestowed a fix upon us. Then there's that SSL certificate SNAFU where I had to fix Safari myself, after almost all other companies had already issued updates. Too bad that was not possible on iOS, where you just had to sit around and twiddle your thumbs.


I'm guessing Apple's plan involves Gatekeeper (coming soon) and not shipping OS X with Java and Flash by default (already implemented, but installing both is ridiculously easy). I'm not sure how tenable and wise that is, though.


> and not shipping OS X with Java

That won't help the people who actually use Java on OS X. It would be better to allow Oracle to manage the updates for Java on OS X, the vulnerability that Flashback abused was patched by Oracle back in February.


Which, I believe, is the plan in the longer term. Apple is going to leave Java platform development to Oracle - there was a spate of "Apple eliminating Java from OS X" articles back when it was announced (last year?)


It was November 2010: Oracle and Apple announce OpenJDK Project for Java on Mac OS X (http://www.appleinsider.com/articles/10/11/12/oracle_and_app...)


Thanks for being my lazy web.


I agree, but there is a catch: Gatekeeper will help to prevent things like that. The default mode in Mountain Lion only allows running apps downloaded from the App Store and/or signed by a developer whose certificate can be revoked by Apple at any time. This will at least prevent the spread of malicious software components after they have been discovered. And this is the point: Apple needs to react faster, and malicious software has to be detected first; otherwise a revocation system makes little sense...


So the default mode of the entire operating system is to only let you install software approved, for a fee, by Apple? Go business. Am I understanding correctly?


Not quite. You can get a free certificate for non-appstore apps. Apps signed with the free cert aren't reviewed or endorsed by Apple, but they can revoke the certificate after the fact if they find that you've been publishing malicious apps. The Mountain Lion default allows signed apps only, both appstore and non-appstore. You can also set it to 'appstore only' or 'anything goes'.


No, Developer IDs are free.


Is Apple getting dinged for Java installation being too easy? That seems pretty easy to fix. Having Java on by default in Safari was a bad choice (just like disk images and packages last year). I hope they'll do a scan and remove with Software Update soon, if that's possible.


Can you name a single Mac virus that has infected a significant number of machines in the wild since the mid 1990s?

Note: a trojan is not the same thing as a virus. A virus self-propagates; a trojan propagates when a user overrides a security warning.

Viruses have been a problem for Windows for decades, but not a problem on Macs since the late 1980s or early 1990s.


"Hey, this trojan has stolen our CC numbers, photos and personal documents, but at least it's not a virus! THAT would have been really bad...".

And please try to keep up with the latest developments: this trojan requires no user intervention. Not running Little Snitch plus antivirus at this point is just ignorant and asking for it.


I think that's more to do with "herd immunity" than anything else. There haven't been enough macs communicating with macs to have widespread damage from a mac virus.


This 'discovery' did certainly boost Dr. Web's market share though! 'Dr. Web Light' is now the number 2 most downloaded free app on the Mac App Store: http://cl.ly/1z0Z1F0P29221K1y3X01


There is obviously no point in reiterating how Apple is removing Java, how they are adding VMs, code signing, etc.

The only way for them to improve security is to take it seriously, because the amount of code shipped with each release will only go up, never down. The attitude needs to change.

There is of course lots of data supporting this argument. Just do a quick Ctrl+F through http://support.apple.com/kb/HT5130 for 'arbitrary code execution': 21 hits, and many of them in core Apple components. These are almost extinct on Windows by now.
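The "Ctrl+F the advisory" check above amounts to a phrase count over the advisory text, which is easy to script. The advisory snippet below is a made-up stand-in, not the real contents of HT5130:

```python
# Count how many entries in a security advisory mention the phrase
# "arbitrary code execution". This text is a hypothetical sample.
advisory_text = """
CVE-2012-XXXX: ImageIO. Impact: ... may lead to arbitrary code execution.
CVE-2012-YYYY: QuickTime. Impact: ... may lead to arbitrary code execution.
CVE-2012-ZZZZ: Login Window. Impact: ... may disclose passwords.
"""

hits = advisory_text.lower().count("arbitrary code execution")
print(hits)  # 2 for this sample text
```

Against the real HT5130 page this kind of count is what yields the "21 hits" figure the commenter cites.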


because the amount of code shipped with each release will only go up, never down.

Actually, it went down with the Mac OS X 10.6 Snow Leopard release. Up to 7 GB less. [1]

1. http://en.wikipedia.org/wiki/Mac_OS_X_Snow_Leopard#New_or_ch...


I'm nitpicking, but 7GB in binary doesn't necessarily mean 7GB in source code.



One interesting side note in this story is the fact that Mac OS X now has enough market share that it no longer enjoys security by obscurity from targeted malware, let alone herd immunity.


It never did. It has simply been much more secure. The "security by obscurity" claim was a rationalization by Windows fans to justify why the Mac was so much more secure.

Anyone who genuinely understands security understands that obscurity is not a form of security. There are many high-profile targeted attacks against owners of Macs that could have occurred in the past two decades if Apple hadn't been taking security more seriously.


The biggest problem is as long as people think Macs are secure... they will never be.


Which is why there are botnets of millions upon millions of Macs out there infected by viruses, while it's really big news that a trojan manages to get 600,000 infections on Windows.

Oh, no, wait, it's the reverse!


:) Typical. I never mentioned anything about Windows. All I said was that w.r.t. computer security, you are most vulnerable when you don't have the fear of bad things happening to you.


You made an either-or proposition: either you're secure or you're not. The responder nailed it correctly: the issue is not "secure/insecure". His example showed that it's a measure of utility. How much effort expended by Apple and its users is worth the probable reduction in botnets on that platform and their consequent damage? Because botnets are historically so rare on Mac OS (9 or X), the utility equation has always been very much in the don't-worry-about-it category. Not so for Windows, for example.

Will this equation change significantly with the new botnet? I think it's unlikely, but I dunno. Regardless, all-or-nothing seems to be the wrong perspective from which to examine this problem.


I never said Macs are more or less secure. All I am trying to say is that w.r.t. computer security, the sense of being secure and letting your guard down is the most insecure thing. It's not just about viruses, trojans, worms, etc.; it's also about social engineering. I tell my mom not to open random links; I don't mention whether it's in email or on a webpage, or whether it's on Windows or her Android phone.


Given OS X's install base, that accounts for almost 2% of all such devices being infected. That seems rather serious to me.


Annoyingly, the 2012 Java update REMOVED the -uninstall option from Java, so you have to rm it and clean up the install-history plist manually if you want to uninstall Java from Lion.


So Apple first ignores Oracle's warning and fails to issue the patch. Later it reacts by removing Java and tries to shut down the security firm's domain.

How responsible :)


No, Java wasn't removed. Java was updated on systems where it has been installed.


>“We don’t know the antivirus group inside Apple.”

What antivirus group...


>“For Microsoft, we have all the security response team’s addresses,” he says. “We don’t know the antivirus group inside Apple.”

Does Apple even have an antivirus group?


They definitely have people working on OS X's malware protection (XProtect). The definitions file can be found at /System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/XProtect.plist (Flashback A/B/C are in there for me).
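For the curious, definitions files like that can be read with Python's standard plistlib. The sample plist below is a simplified, hypothetical stand-in for the XProtect.plist format (a list of entries with a Description naming the malware family), not the real file:

```python
import plistlib

# Hypothetical sample mimicking the XProtect definitions layout.
sample = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
  <dict><key>Description</key><string>OSX.FlashBack.A</string></dict>
  <dict><key>Description</key><string>OSX.FlashBack.B</string></dict>
  <dict><key>Description</key><string>OSX.FlashBack.C</string></dict>
</array>
</plist>"""

definitions = plistlib.loads(sample)
names = [entry["Description"] for entry in definitions]
print(names)  # ['OSX.FlashBack.A', 'OSX.FlashBack.B', 'OSX.FlashBack.C']
```

On an actual Mac you would pass the contents of the XProtect.plist path mentioned above instead of the sample bytes.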



