
"...pie-in-the-sky ideas like commercializing Watson..."

I'm not knowledgeable about the product, but it seems to me commercializing Watson is the smartest thing you could do. I feel like Watson could allow IBM to become dominant in almost every field and produce trillions in value for their customers. Am I off base? Has Watson development stalled over the past few years?




So, I used to work on Watson. The way I see it, it's not that development has stalled but that it's nowhere near as impressive as you've been led to believe. The natural language processing is its biggest weakness. I worked with Watson in a medical context and the NLP was absolutely atrocious. But to be fair, parsing English, which is usually informally structured, is really difficult! Compound that with the fact that IBM was asking developers with limited domain knowledge to write the parsing rules and, well... it's about what you'd expect.

I should also point out that IBM talks up Watson's abilities way too much -- to the point that customers have thought that Watson could essentially tell the future.


> The natural language processing is its biggest weakness.

There were several documentaries about Watson after it won Jeopardy, and this weakness was quite evident to those paying attention.

Watson never really understood Alex's Jeopardy "answers" at all. This was quite obvious in "final Jeopardy". Watson's response clearly showed its limitations.

Here's how it went down:[1]

   The category was US Cities, and the answer
   was: “Its largest airport was named for a
   World War II hero; its second largest,
   for a World War II battle.”

   The two human contestants wrote
   “What is Chicago?” for its O’Hare and Midway,
   but Watson’s response was a lame
   “What is Toronto???”
Basically Watson's Jeopardy responses were a very refined equivalent of the Google "I'm feeling lucky" button.
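
A toy way to see what that analogy amounts to (the candidates and confidence scores below are invented for illustration, not Watson's actual output): build a ranked list of candidate answers and commit to whichever one scores highest, even when confidence is low.

    # A toy version of the "top candidate wins" behavior. The candidates and
    # confidence scores are made up for illustration; they are not Watson's output.
    def top_candidate(candidates):
        """Return the highest-scored (answer, confidence) pair, or None if empty."""
        if not candidates:
            return None
        return max(candidates, key=lambda pair: pair[1])

    # Hypothetical candidate list for the "US Cities" final Jeopardy clue.
    candidates = [
        ("Toronto", 0.30),   # wrong, but ranked highest here
        ("Chicago", 0.25),
        ("Omaha",   0.10),
    ]

    print(top_candidate(candidates))  # ('Toronto', 0.3) -- the system commits to it anyway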

I have to believe that the situation gets a lot trickier in a "medical context".

Still, couldn't Watson be harnessed as a super-smart assistant to humans? But there's probably not enough money to be made in that.

[1] http://asmarterplanet.com/blog/2011/02/watson-on-jeopardy-da...


When I was a researcher at IBM's Watson lab in Yorktown Heights, a standard explanation for many of the research projects was to give "luster" to the company.

It's easy to suspect that the main goal of the Watson computer playing Jeopardy was really just luster.

Really, there were some severely negative attitudes toward (1) Research working on projects that could result in revenue and (2) transferring projects from Research to the rest of the company for development or sales. Our group in Research was a rare exception: we transferred two projects, both of which went to customers and got sold. But our group was unusual, and our success was not welcomed by the higher-ups.

At one time Gerstner mentioned "all the exciting projects coming out of Research" -- right, about like the wings about to sprout on all those pigs.

The explanation I like about IBM is one given to me by the manager of the Chicago branch office: "You might think of IBM as an electronics company or a computer company, but you would be wrong. IBM is a marketing company, and it would get into the grocery business tomorrow if it saw a business opportunity there. You might think that Research comes up with new product designs, development turns them into products, and marketing sells them, but that is exactly backwards. Instead, marketing figures out what they can sell, development builds it, and they go to Research if necessary."

So, maybe the current changes amount to IBM trying some new things to market.


In all fairness, the Toronto answer isn't as ridiculous as it seems at first blush. If you confuse Lester B. Pearson with Alastair Pearson (something quite possible for NLP when you have to match "Pearson International Airport"), then you at least end up with a British World War II hero. Not being a U.S. city would involve either underfitting or ignoring category information, and trying to stretch Buttonville or Billy Bishop into a WWII battle takes work, but it seems like precisely the kind of error that would be caused by actual NLP rather than by optimizing PageRank variants.

I also think IBM has the right model in offering Watson as a service on a 30% revenue share: let external developers find all the various products and business models. They'll be better at finding and optimizing new products than a services company would be, and some of them will be able to go after much smaller products. You end up with more diversification and less of IBM's capital at risk, for 30% of what's likely to be a much larger pie than IBM could generate on its own.


It would be very interesting to find out how Watson actually interpreted both the category and the "answer".

I don't know the actual amount of time contestants are given, but on TV the final category is disclosed many minutes ahead of the "answer". Then a contestant has perhaps 30 seconds to formulate a "question" -- an eternity for a massively parallel computer like Watson. Quite different from the rest of the show, which relies on lightning reflexes when played by "champions".

Viewed from 30,000 feet (quite appropriate for an airport question, eh) I think that Watson didn't understand the category. Watson did not know what the words "US cities" meant. Those aren't words that would normally have any sort of double meaning, so if Watson understood the category, why would it have "ignored" category information? That doesn't make sense, especially for final Jeopardy.

As Dr. Venkman might say: "good guess, but wrong!" In Watson's defense, it did not have very high confidence in its answer.

We'll probably never know the real story. That's not the kind of information that IBM would want to disclose, mainly because it would probably make Watson look bad.


The confusion probably arose from the Billy Bishop Toronto City Airport, named after the World War ONE flying ace, Billy Bishop.


I always get dubious when people suggest letting external developers find business models / use cases.

I think if a product were truly compelling, they'd be able to find (and polish) at least one really awesome use case internally.


Looks like they've been writing the second chapter of the AI winter.


Be careful not to assume too much of Watson. IBM's marketing makes it seem like a do-everything general purpose AI, but that likely isn't the case.

The program that won Jeopardy was more of a specialized search engine with really good query parsing. Many of the other things they have announced as "Watson" look like entirely separate programs that they are marketing under one brand name. What looks like one amazing AI is just a mish-mash of domain-specific algorithms with some well-built glue.
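
If that's roughly right, the "many programs behind one brand" setup could look something like this sketch. The domain names and handler functions below are hypothetical and exist only to illustrate the dispatcher-plus-modules idea; they are not IBM's actual architecture.

    from typing import Callable, Dict

    # Hypothetical domain-specific modules; placeholders, not IBM's code.
    def jeopardy_qa(query: str) -> str:
        return f"[search-engine-style answer for: {query}]"

    def medical_nlp(query: str) -> str:
        return f"[clinical-text parse of: {query}]"

    HANDLERS: Dict[str, Callable[[str], str]] = {
        "trivia": jeopardy_qa,
        "medical": medical_nlp,
    }

    def watson_brand(domain: str, query: str) -> str:
        """The 'glue': route each request to a specialized module behind one brand."""
        handler = HANDLERS.get(domain)
        if handler is None:
            raise ValueError(f"no module for domain: {domain}")
        return handler(query)

    print(watson_brand("trivia", "largest airport named for a WWII hero"))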


In fairness, "a mish-mash of domain-specific algorithms with some well-built glue" is also not a bad summary of how some people see the human mind. See, e.g., https://en.wikipedia.org/wiki/Modularity_of_mind.

(For the avoidance of doubt, I am not suggesting that Watson is anything remotely like a do-everything general-purpose AI.)


The human mind is a behavior-copying algorithm; IBM's Watson is effectively an expert system with a search algorithm.

The way they answer questions is very, very different. Watson effectively translates the question into something equivalent to an SQL query, executes it, and answers with the top result.
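
A minimal sketch of that pipeline idea, using SQLite from Python's standard library. The schema, the rows, and the keyword "parsing" are all invented here for illustration; this is not how Watson is actually implemented.

    import sqlite3

    # A made-up "fact table"; the schema and rows exist only for this sketch.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE facts (subject TEXT, predicate TEXT, object TEXT, score REAL)")
    conn.executemany(
        "INSERT INTO facts VALUES (?, ?, ?, ?)",
        [
            ("O'Hare", "named_for", "WWII hero Butch O'Hare", 0.9),
            ("Midway", "named_for", "the Battle of Midway", 0.8),
            ("Pearson", "named_for", "Lester B. Pearson", 0.7),
        ],
    )

    def answer(keyword):
        # "Parse" the clue into a WHERE clause, run it, and keep only the top-scored row.
        return conn.execute(
            "SELECT subject, object, score FROM facts "
            "WHERE object LIKE ? ORDER BY score DESC LIMIT 1",
            ("%" + keyword + "%",),
        ).fetchone()

    print(answer("WWII"))  # highest-scored match wins, right or wrong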

What a human mind does is very, very different. It could perhaps be described like this: "put the electrical impulses on the outgoing nerves that past successful Jeopardy players would have put on theirs in this situation". Your mind doesn't "answer" any question, for starters. It simply reacts to the situation; the difference is that your mind doesn't know or care about the distinction between a question and a rocket falling out of the sky. The same prediction happens either way.

This reaction contains far more data than Watson will ever output. It contains the actual answer, encoded in the hand movements necessary to write it down. It contains instructions to pick up the pen, hold the board, crouch over the board so your hand is within reach, hide the answer from your competitors, and so on. You could easily write a 1000-page book about your mind's response to simple situations (I've read a nearly 2000-page book that talks exclusively about the 3D math needed to control two fingers).

This answer was produced by your cortex, which contains more computing elements than the internet. More data was transferred in your mind to weak computing elements than gets transferred on the internet across an entire US state in the same time frame. Granted, those computing elements are somewhat (a LOT) slower than a CPU, but there are 100 neurons in your cortex for every CPU ever sold, and the number of CPUs ever sold is estimated to be a billion. (And I'm cheating with those numbers; for example, the total data transfer that occurs inside a single CPU easily outstrips internet traffic for a medium-sized city, so your mind looks at about the same amount of data as a dozen CPUs in the same time, which is much less impressive.)
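
For what it's worth, here is the back-of-envelope arithmetic behind that claim, using the comment's own rough figures (these are the commenter's estimates, not established measurements):

    cpus_ever_sold = 1e9      # "estimated to be a billion"
    neurons_per_cpu = 100     # "100 neurons in your cortex for every CPU ever sold"
    implied_neurons = cpus_ever_sold * neurons_per_cpu
    print(f"implied cortical neuron count: {implied_neurons:.0e}")  # 1e+11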

Fundamentally it's the "best" reaction your mind knows how to have, with "best" defined as something along the lines of "if you did this in front of me, you'd get maximum attention from my mind" (in this case hopefully because you'd win Jeopardy -- but it's the attention that matters, not the win; e.g. if your girl/boyfriend plays, you'll play more like them, regardless of who wins).


Well, Jeopardy was 4 years ago, so there's that. If Watson were really believed, internally at IBM, to be as revolutionary as their marketing copy says, then I would expect to hear great triumphs of how IBM deployed Watson internally to, say, their sales department (which has got to have communication issues, given their size), let it absorb information, and then let it answer the questions sales guys have about their products, turning them into believers.

The fact that we've seen little of Watson since Jeopardy makes me wonder if development hasn't stalled.


Well, Watson (Watson Q&A) can only "answer" with a set of verbatim selections from the texts it has previously digested and been trained upon. Lots of folks struggle to understand this, and IBM seems not particularly diligent in correcting them, thus the idea that Watson can synthesize novel answers or insights persists.
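
In other words, the output is extractive rather than generative. A toy illustration of the point, with a made-up two-passage corpus and a crude word-overlap ranking that is nothing like Watson's real scoring:

    import re

    # A two-passage "corpus" and a crude word-overlap ranking, both invented here.
    corpus = [
        "DB2 for z/OS requires at least one general-purpose processor.",
        "Watson was trained on encyclopedias, newswire, and reference works.",
    ]

    def tokens(text):
        return set(re.findall(r"[a-z0-9]+", text.lower()))

    def extractive_answer(question):
        # Return an existing passage verbatim; nothing new is ever composed.
        q = tokens(question)
        return max(corpus, key=lambda passage: len(q & tokens(passage)))

    print(extractive_answer("How many processors does DB2 require?"))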


Given a large existing body of literature -- say, all of IBM's marketing documentation -- it's my understanding that Watson is able to parrot back an answer to questions about things in there.

Something like "What's the minimum supported number of CPUs to run IBM DB2 on a new system Z?" or another customer-focused question.

Things like that don't require any insight, but would be a boon to the sales/marketing team when the customer has questions of that ilk.

Yes, that's positioning Watson simply as a better search, but what was Google, originally, other than better search than Altavista?


This seems like a case where Watson might be helpful, although the effort to train the system is considerable; out of the box, Watson does not really know how to answer any kind of question, so the recommended approach is to (manually) walk it through thousands of Q&As. And even after clearing that hurdle, would there be any payoff for the user _or_ for IBM above the Google results for "system z db2 requirements"?


Color me ignorant... What is the difference between Watson, Cortana, Siri, and Google Now? What does Watson have that those services don't?


> What does Watson have that those services don't?

IBM sales force - on a non-tech level that is.


I'm as ignorant as you, but here's a stab with something I found: "This may seem similar to technology such as Apple's Siri or Google Now, but a cognitive computer becomes smarter over time as it learns from each interaction, a function far beyond typical computers. And let's face it … Siri is stupid." As I understand it, Watson continues to learn, with the goal of becoming a domain expert.


How many queries do you think Google or Siri have had to learn from?



