I beta tested this service today, and I am a little impressed with this search engine. The thing I liked about this service is that the screenshots of the webpages are prominent and give a fair idea of the contents, unlike Google, where the results are fairly blind and sometimes we just have to keep clicking through links until we land on the required page. It's also unlike the Snap service, where the screenshots are small, and even though they are large on its homepage, the navigation is not as smooth.
But I guess there is a long way to go for this service before we can compare it with Google, because the search results are very poor, at least for the keywords I searched. For "Tom Cruise", the wiki page was right at the bottom and IMDb was nowhere in sight; the top results were lower-profile webpages. Another weak point is that the metadata, or the text details for the results, is very poorly integrated, making it hard to find out anything about the results.
Dubious benefit. $31M in VC, huh? I'm pretty sure you could produce something of equally dubious benefit, fancy coverflow interface and all, with the Yahoo Search API and, say, a Websense URL categorization database... in a matter of an hour.
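To make the "an hour" claim concrete, here is a minimal sketch of the glue the commenter is describing: fetch results from a web search API, tag each URL with a category, and shape the output for a coverflow-style thumbnail strip. The endpoint shown is Yahoo's historical V1 Web Search service (long since retired), `categorize()` stands in for a Websense-style lookup, and the screenshot service is purely hypothetical; this is an illustration of the architecture, not working infrastructure.

```python
import requests

# Historical Yahoo Search Web Services V1 endpoint (retired; shown
# only to illustrate the commenter's point).
YAHOO_V1 = "http://search.yahooapis.com/WebSearchService/V1/webSearch"


def categorize(url: str) -> str:
    """Placeholder for a Websense-style URL categorization lookup."""
    return "uncategorized"


def coverflow_results(query: str, app_id: str, count: int = 10):
    """Fetch search results and shape them for a coverflow-style viewer."""
    resp = requests.get(YAHOO_V1, params={
        "appid": app_id,
        "query": query,
        "results": count,
        "output": "json",
    })
    resp.raise_for_status()
    hits = resp.json()["ResultSet"]["Result"]
    return [
        {
            "title": hit["Title"],
            "url": hit["Url"],
            "category": categorize(hit["Url"]),
            # A hypothetical screenshot service would fill this in;
            # the actual product presumably renders pages itself.
            "thumbnail": "screenshot-service/" + hit["Url"],
        }
        for hit in hits
    ]
```

The point of the sketch is that the hard part of such a product is the page rendering and ranking, not the coverflow wrapper around an existing search index.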
Also... how is this going to work for Flash-based resources??
When I saw the Coverflow idea, I didn't think, "Aha! Time to take on Google!" But then again, I didn't imagine it used on a touchscreen iPod, either. So who knows.
When I search using Google, I rely on the context of the search results (e.g., the text excerpt) more than the ranking to decide which result to choose.
I would like to see someone create a VR-type search experience, with a cocktail-party-style auditory layer: as you moved toward a search cluster, the context of that cluster would increase in volume, and as you moved away, it would decrease. Or maybe this makes sense only to me.