I'm very unimpressed. I had high hopes; alas, here are some real queries from my lab where Google gives me the right answer within 1-2 links (or I can reach for the CRC Handbook). W|A:
Q: speed of light in glass
A: c (speed of light) and small glasses are not compatible
Q: magnetic moment of cobalt atom
A: wolfram alpha isn't sure what to do with your input
(note: just "cobalt" has the information, but you have to dig a little)
Q: band structure of bismuth
A: Wolfram|Alpha isn't sure what to do with your input
Q: polarizability of sodium atom
A: Wolfram|Alpha isn't sure what to do with your input.
Q: dipole moment of water
A: Wolfram|Alpha isn't sure what to do with your input.
Q: isotopes of copper
A: good job! (this one actually gives me what I'm looking for)
I just read it can do music, so...
Q: bach
A: has a very small amount of info and a cute timeline of birth and death
(same for Charles Mingus and Ozzy Osbourne)
Q: bach english suites
A: Wolfram|Alpha isn't sure what to do with your input.
It's also really annoying that it does the HAL 9000 thing on seemingly every other query.
Nice effort - better luck next time. I'm sticking with the CRC handbook and GOOG.
This has been my problem during testing. I'm very impressed with the quality of the curated data, but getting at it seems very hit-and-miss. I gave the example a week or two back of searching on 'Orson Welles': you can ask 'who directed [OW movie]' and it'll tell you, but 'what movies did OW direct' (and yes, I tried many variants) always gets you an error.
In this respect it feels like the sort of thing that has always annoyed me about expert systems: it's useful to the extent that you are already familiar with what data is curated therein and want a handy way to look it up, but it's rather frustrating for surfing and exploration. I do like it... but I can't help sharing some people's view that it's sort of an ad for Mathematica.
It will find a use as a citation source - you can embed statistics or factual info in your web page and reference Alpha, in relative comfort that the data there is factually correct. On the other hand, the lack of specificity in the citation info means it won't be quite adequate for academic use, at least not just yet. It might be that its best market will be among high school students, who will a) buy Mathematica later and b) grow up with Alpha as past generations did with their slide rules or TI calculators, and stick with it as it evolves.
I haven't taken the time to play with the API yet - superficially it looks flexible, but I'm only a hobby coder. Much depends on where Wolfram draws the line between openness and proprietary use.
You are testing it as a basic lookup engine, which is fine, but it's just a part of the offering. I tried to exercise the inference part.
Q: goog trading volume
A: works! I get a chart, but the volume is in shares; I want it in dollars but haven't been able to get that. Of everything I tried, only "GOOG trading volume / volume in dollars" seems to be understood, but it doesn't show me anything. OTOH, "GOOG trading volume monthly" works as expected. So basic unit conversions work, but not something like "for all days, multiply number of shares by closing price", which would give me the volume in USD.
I also wanted to see a histogram of the daily volumes, but haven't been able to. "GOOG trading volume distribution", "... histogram", and "... variance" all just don't work.
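For what it's worth, the two computations W|A wouldn't do here are one line each once you have the raw daily series. A minimal sketch with NumPy, using synthetic stand-in data (the prices and volumes below are made up for illustration, not real GOOG figures):

```python
import numpy as np

# Hypothetical daily data: closing prices (USD) and share volumes,
# stand-ins for what W|A charts for "GOOG trading volume".
rng = np.random.default_rng(0)
close = 400 + rng.normal(0, 10, size=250)          # daily closing price
shares = rng.integers(2_000_000, 6_000_000, 250)   # daily share volume

# The conversion W|A wouldn't do: dollar volume = shares * closing price
dollar_volume = shares * close

# ...and the histogram of daily volumes it wouldn't draw
counts, bin_edges = np.histogram(dollar_volume, bins=20)
print(f"median daily dollar volume: ${np.median(dollar_volume):,.0f}")
```

Getting the real series in front of such a script (export, API, etc.) is the part W|A's inference layer would ideally make unnecessary.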
A look at another domain shows that it knows something about movies (e.g. the name of a movie shows me a list of actors and other info), but it doesn't cross-reference this information (the name of an actor gets me nothing; I expected at least a list of movies).
If anyone succeeds in figuring out how to get some of these answers, please reply! W|A doesn't seem to have a place where I can ask for help. Probably it just can't make those inferences though.
WA does seem to have an IOR list of common materials, but that is easily obtainable almost anywhere; however, you can then use it with WA to calculate further down the road.
I have a theory... They're collecting queries made with the Alpha release, and they'll just implement the suggestions (in a general form, like slurping up an alcohol database in reply to someone's query complaint), adding case after case until it can handle 1000 times more specific queries.
Just to be fair, I tried a couple of your queries with Google, and the answers were not forthcoming either. But I suppose with Google I could do some SEO and get the answers to show up. Whereas I am not sure how to do that with Wolfram. Of course it only just came out.
The queries I tried were:
Q: distance traveled by space shuttle in one day
A: Lots of links, none of which gives a canonical answer.
Q: explosive energy of five sticks of dynamite
A: Lots of links, only a few of which deal with dynamite the explosive.
Query time was roughly 9:30 AM Central Time Sunday May 16, 2009.
Still, I think both Google and Wolfram will move to this model over time. Which, of course, will be terrible for Google, since revenue-wise it is a one-trick pony. I think it will be better for us, though. It seems more natural to me to ask for the population of Ningbo and just get an answer instead of a list of links.
That took about 2 queries and 2 minutes: the first result for the space shuttle's speed compared a bunch of sources, 3 of which were in agreement.
Q: [dynamite explosive energy]
A: 2.1 million joules
Q: [2.1 * 5]
A: 10.5 million joules
Wikipedia was the first result for that, with the answer right on the page (and the text leading up to the answer in the snippet).
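The back-of-the-envelope arithmetic above is simple enough to sketch. The 2.1 MJ/stick figure comes from the Wikipedia result mentioned; the ~7.7 km/s orbital speed is the commonly cited figure for the shuttle, used here as an assumption:

```python
# Space shuttle: orbital speed (~7.7 km/s, commonly cited) times
# the seconds in a day gives the distance traveled in one day.
ORBITAL_SPEED_KM_S = 7.7
SECONDS_PER_DAY = 24 * 60 * 60
distance_km = ORBITAL_SPEED_KM_S * SECONDS_PER_DAY   # ~665,000 km

# Dynamite: 2.1 MJ per stick (the Wikipedia figure) times 5 sticks.
energy_mj = 2.1 * 5   # 10.5 MJ

print(f"shuttle distance per day: {distance_km:,.0f} km")
print(f"five sticks of dynamite: {energy_mj} MJ")
```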
Curiously, a lot of people do use full-sentence queries - perhaps it's the lingering aftereffect of their 3rd-grade teachers. If Wolfram Alpha can handle those correctly, it'll be a big win. But other search engines have tried - Ask.com is built around asking questions in natural language - and it never seems to work that way: computers don't seem to know which of the many things you say are irrelevant.
I've seen a few comments write Wolfram Alpha off. WA is going to be 10 times better than Google for a specific set of queries, and when it gets better at recognizing questions (or when I get better at asking them, since Wolfram is infallible </sarcasm>) I can see myself using this thing IN ADDITION to Google.
Everybody is trying to use W|A as though it's Orac (Blake's 7), but this is an NLP layer (not a particularly impressive one, it seems) atop a calculating encyclopedia. When it is able to latch on to some encyclopedic value, it can make calculations with it. At first glance it seems useful for scientists; however, they already have much more domain-specific and sophisticated models in use. For example, knowing where the ISS is relative to my location: anyone to whom this matters already has the software, and for the rest of us it's little more than a novelty. For the average person, the data is not accessible enough unless you learn to speak its language.
It's another Powerset, trying to provide more accessible pathways to data tables. NLP seems to be a very tough nut to crack, because Powerset is embarrassingly poor at it (despite some very clever people working on the problem), and W|A follows in their footsteps. Given sufficient data and much better parsing of that data, this could be useful. Comparing it to Google is pointless, because Google is a search engine and W|A is not; W|A is a knowledge-retrieval engine - the CIA World Factbook on steroids.
I think the project should get government funding and expand its datasets dramatically; it would then become another way to look up facts and perform calculations on them. A useful tool for people in their everyday lives, but the sophistication needs to expand exponentially to reach 'everyday usefulness' - for example, if I could do queries like 'trend crime rates in San Francisco by neighborhood' or 'aggregate opinions on Canon 5D II, summarize', etc.
We are looking at something very primitive, but the concept is one very worthy of further research. When we figure out a way for data models on any topic to be represented in a consistent and universally accessible format (akin to what Wikipedia is at the human, natural-language level), and for these nuggets to be shared such that an engine like W|A can process them, we will really be on to something...
W|A is an attempt to let us query all of human knowledge the way we query our own memories. And the best use of it is to see if it also likes Monty Python.
Didn't Google launch with little fanfare? Like Cuil, Alpha's pre-launch hype set high expectations. If Wolfram had said, "Hey, we're trying out something interesting and new... far from perfect, but we want to let people try it out and get some feedback," then these comments would all be, "Gee whiz! Look at this idea they had for music stuff..."
Can it compute the question, though? If not, that's merely cool but useless. Most of the people who would think to ask that already know the answer. It's the question that we need to know (my vote goes to "What do you get if you multiply six by seven?").
Try typing in the name of a college (the abbreviation is fine, too.) It's also interesting that it can not only perform integration, but also tell you the steps it took to get to the answer.
Try typing in the name of a college (the abbreviation is fine, too.)
My alma mater doesn't show up as "U of MN," which gets me a calculus problem, but "University of Minnesota" disambiguates nicely among the different campuses. The information about the flagship campus includes some good data but misses some too. Try your favorite university with the terms
[name of university] cost of attendance
or
[name of university] admission rate
or
[name of university] SAT range
or other things that people actually look up about colleges.
Hah! That was the first query I made when trying out Wolfram. I'm sure they were prepared for some of the popular, "expected" queries. Overall, though, I found most of my queries returned no results. Most of them were similar to the queries in the screencast. It's an interesting product, and I look forward to seeing it improve.
Could somebody please tell me why the f2k we care how fast some random bird can fly? Why the swallow? Why not the bald eagle? Or the chickadee? Who gives a f2k.
Both Google and WolframAlpha answer this question almost instantly, without soaking up the fractional attention of dozens/hundreds of human readers here. Always try the simple automated query first: it's the only polite/efficient thing to do.