
“My heart goes out to the guys,” said Larry Tabb, the founder and chief executive of the Tabb Group, a financial research firm. “On the biggest day of their corporate history, their own platform backfired.”

Maybe I'm just reading too much into this, but this quote makes it sound like Tabb really doesn't understand technology. The platform didn't "backfire", and the gods of finance didn't frown upon their venture; their code was incorrect. This could have been avoided if they had found whatever bug they hit before they went live. Don't get me wrong, I also feel bad for them, because it sucks to have a technical error cause anything more than debugging frustration, but this was preventable.




Everything is preventable (or more correctly, predictable) when you know everything about the system. If we knew everything about the quanta of the universe, we'd know when solar flares and neutrinos and alpha particles were going to affect our satellites and other electronics.

Since we don't know everything (not even everything about our own creations, like stock exchanges and software systems), it becomes risk management. After writing software for decades, I still have no idea whether these circumstances were predictable, let alone preventable, but perhaps they could have spent more time considering the minutiae of the system, writing more tests, or whatever. Yes, the mistake was the fault of the humans designing and implementing the system. But that could be said of any (non-natural, human-made) system that fails.

At some point, someone has to decide that code should just be shipped. We're not perfect enough as people to build bug-free systems. Risk comes with releasing "unproven" code. And yes, it's embarrassing when the fit hits the shan in front of the whole world. But I try to avoid becoming one of those jaded people who can neither tolerate mistakes from others nor admit their own.


I think I came off a bit too grumpy. I know full well that not every bug can be eradicated before a launch, and I don't hold any ill will against BATS for it; I was more trying to comment on the persistence of the idea that computers are magical things that sometimes rise up against their creators. As another comment said, I guess my interpretation of "backfire" is a bit too narrow.


Heisenberg on line 1, or 2, I can't be certain, but he says you're wrong.


Heisenberg doesn't say that we wouldn't have absolute predictability or control if we knew everything. He says we can't know everything. He's correct, but that doesn't negate my statement.


It does.

And it's compounded by quantum effects, multi-body problems, and emergent phenomena.

Your requirement is that we 1) have absolute knowledge of a state of the universe, and that 2) all later states can be predicted from this a priori state.

Heisenberg says "you can never have absolute knowledge".

Numerous other elements argue that even where absolute knowledge is available, it's not possible to predict future states with certainty, or in less than real time.

So, no, everything is not foreseeable.
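
Even a toy deterministic system makes the point. A sketch (my own illustration, nothing from the article; the logistic map at r = 4 is a textbook chaotic system):

    # Two initial states differing by one part in a billion diverge
    # completely within a few dozen iterations, so finite-precision
    # "absolute knowledge" buys you almost no long-range prediction.
    def logistic(x, r=4.0):  # r = 4.0 is the fully chaotic regime
        return r * x * (1.0 - x)

    a, b = 0.400000000, 0.400000001  # nearly identical starting states
    for step in range(1, 61):
        a, b = logistic(a), logistic(b)
        if step % 10 == 0:
            print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  |a-b|={abs(a - b):.2e}")

By around step 30 the two trajectories have nothing to do with each other, even though the rule is one line of deterministic arithmetic and the initial states agreed to nine decimal places.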

Mind: some risks are predictable in a probabilistic way, though generally these are good only for saying "within a period of time T, there's probability P of event X occurring". That's a far cry from saying that event X will happen at time T. Much of my life revolves around clarifying the distinction between these two statements.
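
To put numbers on the first kind of statement, a minimal sketch (assuming, purely for illustration, that the events arrive as a Poisson process; the rate is made up):

    import math

    def prob_at_least_one(lam, T):
        """P(at least one event within window T) for a Poisson process with rate lam."""
        return 1.0 - math.exp(-lam * T)

    lam = 0.1  # hypothetical rate: one event per decade on average
    for T in (1, 5, 10, 50):
        print(f"P(>=1 event within {T:2d} years) = {prob_at_least_one(lam, T):.2f}")

That gives you a probability over a window, and shows it creeping toward certainty as the window grows, but the process is memoryless, so it never sharpens into "event X happens at time T".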

An instance which comes to mind: Schneier's blog has mentioned that the 9/11 attacks were statistically probable given terrorism trends, as was the absence of similar-scale follow-on attacks since. Though with time, a similar-magnitude attack becomes a near certainty.


Your interpretation of the word "backfire" is unconventional.

Also, you're trivializing the challenges likely faced by this kind of venture. It's not always possible to find all bugs, unless you want to code in Ada, formally verify things with OCaml, and are trying to fly a plane.

The problem could have been caused by a strange hardware defect, a rare bug in a library, etc.


>The problem could have been caused by a strange hardware defect, a rare bug in a library, etc.

Or competition, a disgruntled employee, etc. Just speculating, but IMO it's a pretty big coincidence that it failed exactly when they did the IPO.


The NYTimes article says that BATS "would have been the first company to be listed as well as traded on the exchange", and the post mortem currently at the top of http://www.batstrading.com/alerts/ (no direct link that I can see) says that the A-BFZZZ shard "encountered a software bug related to IPO auctions".

I read that to mean that this is the first time they've ever used their IPO-related code in production and it failed badly. So it sounds like it was the normal sort of bug, in their own code, and their own IPO directly triggered the bug.
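
A purely hypothetical sketch of how that kind of bug hides (the function and the failure mode here are invented, not BATS's actual code): an auction cross searches for the price that matches the most volume, and the one case continuous trading never exercises, a book that doesn't cross at all, is exactly what a first real auction can produce.

    # Hypothetical stand-in for an IPO auction cross, not BATS's code.
    def opening_cross(bids, asks):
        """bids/asks: lists of (price, size). Returns (clearing price, matched size)."""
        prices = sorted({p for p, _ in bids} | {p for p, _ in asks})
        best_price, best_volume = None, 0
        for p in prices:
            demand = sum(size for bp, size in bids if bp >= p)
            supply = sum(size for ap, size in asks if ap <= p)
            matched = min(demand, supply)
            if matched > best_volume:
                best_price, best_volume = p, matched
        return best_price, best_volume  # (None, 0) if the book never crosses!

    # Tests modeled on continuous trading always have a crossing book, so
    # they pass. The first one-sided auction returns (None, 0), and any
    # caller assuming a numeric price falls over.
    print(opening_cross([(15.0, 100)], []))  # -> (None, 0)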


Their IPO might have caused a spike in trading volume on their exchange, which might have triggered the bug.


Check the graph: http://www.nanex.net/aqck/3079-2.jpg

It happened in less than a second.


Yes, not to mention the two stocks it failed on.


To be sure, mucking about with stock exchanges ought to carry QA responsibilities closer to those of aeronautics.


In my opinion, not at all. The stock exchange is an inherently risky market; everybody knows that. The only responsibility you have is to yourself and your own money. That's a far cry from aeronautics.


Just because an activity has risk involved doesn't mean it should be thrown to the wolves.

You might also familiarize yourself with the Just World Fallacy: http://en.wikipedia.org/wiki/Just-world_hypothesis


I don't see the connection between the discussion and the just-world hypothesis.


One interpretation is that the JWH means that people get what they deserve. The connection is your assertion that the markets are risky, people should know this, and so quality controls are superfluous.


I still fail to see the connection you're trying to make. I think you're overreaching.


Well yeah, that's what often happens when one takes a libertarian stance.


Thank you very much for that link. It was a fascinating read to say the least.


Aerospace is inherently risky! The planet and skies do not submit to human authority.



