
Yelp is all but meaningless in San Francisco, probably its most popular city. Really average or even crappy cafes, salons, and restaurants have ratings of 4 stars or higher... it's a little baffling. I presume from the stories you hear that it's because the businesses pay.



This may be the case in San Francisco. It's not the case elsewhere, however, at least for this anecdotal sample size of one. :)

Business travel takes me all over the country and Yelp is incredibly reliable. I've lost track of the number of times that someone recommends a place, Yelp says it's a 2-3 star place, we go anyway, and yup, it's average at best.

I hear and see the complaints of restaurateurs about Yelp. Yet my experiences at places usually match what's seen in reviews. There's a restaurant in my hometown where the owner, a friend of mine, constantly complained about Yelp reviews where the sentiment was that his food was too expensive and not of consistent quality (I agreed with the reviews). A few weeks ago he decided to close the place because of declining revenues. Had he listened to the reviews, settled for a bit less margin, and worked with his staff to produce a consistent product, he'd still be in business.

Edit: I neglected to add that it's really important to read reviews for context. Sometimes you just have people bitching because they're a gluten-sensitive nut-allergic vegan and the place didn't cater to their highly specific needs.


I too travel all around the country for business, and I find Yelp to be less than helpful. I agree with you that reading the reviews for context is critical. In fact, that's the only way to really get a read on a place. But with hundreds of restaurants to choose from, and hundreds of reviews each (if there are only a handful of reviews, you can't rely on them at all), it's incredibly time consuming to go through them. I find that the star ratings are pretty much worthless. They get dragged down by people who ding fine dining restaurants for not having Olive Garden prices, and others who mark down local greasy spoons for not having French Laundry service. I don't care how the diner compares to the French Laundry. I care how it compares to other diners. A single rating scale across all types of restaurants is probably counterproductive, but the problem is greatly magnified by throngs of clueless reviewers who have bogus expectations and are rating places based on them.


Collaborative filtering is the obvious way to make a scalable restaurant ratings service. Google tried to do it and apparently failed; then Yelp tried to do it [http://officialblog.yelp.com/2013/06/whats-nearby.html]. Not being a Yelp user myself, I don't know if they are still doing this. Anybody have insight into whether Yelp is still personalizing recommendations or why personalization hasn't worked in restaurants (in contrast to e.g., Netflix)?
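
For anyone who hasn't seen it, the core idea is to predict how much you'd like a place from the ratings of people whose past ratings look like yours. A toy user-based sketch in Python (made-up names and data, plain cosine similarity; nothing like what Yelp or Netflix actually run):

    # Toy user-based collaborative filtering: predict a diner's rating for
    # a restaurant from the ratings of diners with similar tastes.
    from math import sqrt

    ratings = {  # user -> {restaurant: stars} (made-up data)
        "alice": {"Taqueria X": 5, "Cafe Y": 2, "Diner Z": 4},
        "bob":   {"Taqueria X": 4, "Cafe Y": 1, "Ramen Q": 5},
        "carol": {"Cafe Y": 5, "Diner Z": 2, "Ramen Q": 3},
    }

    def similarity(a, b):
        # cosine similarity over the restaurants both users rated
        common = set(ratings[a]) & set(ratings[b])
        if not common:
            return 0.0
        dot = sum(ratings[a][r] * ratings[b][r] for r in common)
        na = sqrt(sum(ratings[a][r] ** 2 for r in common))
        nb = sqrt(sum(ratings[b][r] ** 2 for r in common))
        return dot / (na * nb)

    def predict(user, restaurant):
        # similarity-weighted average of other users' ratings
        num = den = 0.0
        for other, their in ratings.items():
            if other == user or restaurant not in their:
                continue
            s = similarity(user, other)
            num += s * their[restaurant]
            den += s
        return num / den if den else None

    print(predict("alice", "Ramen Q"))  # ~4.1; alice hasn't rated Ramen Q

With this data it predicts roughly 4.1 stars for the one place alice hasn't rated, leaning mostly on the user most similar to her. The hard parts this skips entirely are sparsity, cold-start users, and fake reviews.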


IMO, the star rating is useless as 4 stars can often be worse than 2.5 stars, but some random reviews can be helpful.

So, if you're willing to spend a while going over them, it's better than nothing. However, actual restaurant reviews are more accurate, take less time to use, and you're less likely to read 10 reviews and learn nothing. IMO, Yelp is way overhyped for how little value it's actually providing.


> This may be the case in San Francisco. It's not the case elsewhere, however, at least for this anecdotal sample size of one. :)

Could we stop shutting down opinions using the sample size argument? It's one of the weakest arguments you can make:

'I think different so your opinion isn't valid because anecdote!' 'Oh you don't agree with my opinion? Well that's not valid either because anecdote!'

Not every discussion is capable of having prior polling as evidence (polling has its own issues with bias anyways). A highly voted on comment has >> 1 people who share the same sentiment and the beauty of sites like these is highly up-voted comments bubble to the top.


I was referencing my personal perspective.

I reserve the right to shut down my own opinions, especially the anecdotal ones. I have known my anecdotes to be 91.3% bullshit.


While we're ignoring the very specific statistical requirement of a minimum significance level, could we please stop using the faux-grammatical prepositional "because"?

Regarding upvotes as a proxy for "me too": an upvote doesn't mean "me too" in many cases; it means "this comment was something I thought contributed to the discussion in a meaningful way." Thus using upvotes as a proxy for a sample size counter isn't even remotely accurate, because upvotes do not necessarily correlate to me or anyone else sharing your sentiment. They only correlate to someone saying that your posting added value.

I upvote things all the time with which I disagree because the commenter made a strong argument and it added to the quality of the discussion. I downvote things with which I sometimes agree because perhaps the point was badly made, inflammatory, or otherwise not enriching to the discussion. And then, I also downvote comments that use trendy "I haz cheeseburger"-level diction. Of course, given my own guidelines, I probably ought to downvote myself.


"Yelp is all but meaningless in San Francisco, probably its most popular city. Really average or even crappy cafes, salons, restaurants have greater than or equal 4 star ratings... it's a little baffling. I presume from the stories you hear that it's because the businesses pay."

I have noticed this as well, but if you read the reviews you get a good idea of what is happening in SF: tourists.

Joe and Jane Honeymooners come to SF from Tulsa/KC/Duluth/Des Moines and have the "best meal/coffee/pastry/brunch EVER".

Good for them.


The flip side happens as well, especially with hotels on sites like TripAdvisor. "My $200 room in Manhattan was tiny and looks out on a light well." Umm, welcome to NYC!


Ouch. You consider a tiny hotel room to be ok because "that's how it is in NYC"?


If you're vacationing to NYC from Kansas for the first time, you might book the largest affordable room in the city and be shocked at how small it is. Conversely, if you are familiar with NYC and travel to it enough, you might recognize that a 400 sq. ft. room for $200 a night is an astonishing bargain.

Both types of review are valid, but the former is more valid for new-to-NYC vacationers, while the latter would be better suited to repeat travelers or people who have migrated away from NYC and are vacationing home.

The complaint though, is that there's no way to distinguish between them. How big is big? How small is small? Reviewers seldom list the square footage, and the net result is that one person's anecdotal evidence is countered by another's opposite anecdotal experience. At the end of the day, unless there is an equal number of use cases, we're left reading reviews for context, which renders the star ratings useless.


I don't spend a lot of awake time in my hotel room when I'm in NYC. But my point was that if I have a cheap room in NYC (and $200 is quite a cheap room) my expectation is that it will be quite small.

I actually often stay at a hotel that has small rooms [edit: 170 sq. ft.] that are deliberately engineered for their size, and I rather like it.


Funny thing, I tend to use TripAdvisor more for reviews than I do Yelp because I find the non-local perspective on places to actually be a benefit when it comes to deciding where to go. I am fully aware that those reviews are even more profoundly tourist-tinged than Yelp's; however, there are some very smart travelers. For example, if some lady who has been all over Asia said "this was the best Kung Pao Chicken I've had since I was in Suzhou," then I'll pay more attention than to some San Franciscan claiming their Kung Pao Chicken was the best one outside of Chinatown.

Also, the implication that everything is better in SF/NYC compared to Tulsa/KC/Duluth/Des Moines, and thus the opinion of Joe and Jane Honeymooner isn't as useful, is really rather snobby.


I find Yelp very useful in the following way: good places will exist and have lots of reviews on Yelp. Sort by number of reviews, and ignore the star rating.


How long before someone from Yelp reads this and adjusts the numbers to favour their advertising clientele?


I think it is a result of the five star rating system -- the vast majority of businesses fall between a 4 and a 4.5 star rating. Perhaps a binary rating system would be a better forcing mechanism to determine quality?

But since we're stuck with the five star system...here is how I leverage Yelp review data when it comes to selecting a small business to patronize:

1. Normalize for the true Yelp rating scale of 4 - 4.5 stars. Filter out anything below a 4 star rating. If a small business is unable to achieve 4 stars on Yelp, then something must be really wrong.

2. Quantity of reviews is important. For starters, you need a significant sample size of review data for the star rating to be meaningful. Further on the review quantity front: massive review numbers relative to other similar businesses in a given city are a good signal of popularity (though not necessarily quality). A rough code sketch of points 1 and 2 follows after this list.

3. You need to supplement your Yelp findings from points 1 and 2 with other sources, ideally critical reviews (e.g. Eater, Serious Eats, Time Out, NY Times, Chicago Mag, etc).

4. The content of individual Yelp reviews should be taken with a grain of salt. Your individual priorities and tastes might be radically different so there is limited value in an individual's rating. I've come across some truly great restaurants with 1-star reviews because the reviewer tried to dine at the restaurant on a Monday...when it was closed.
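
To make points 1 and 2 concrete, here's a rough sketch in Python. The data shape, field names, and thresholds are my own assumptions for illustration; they're not Yelp's API:

    # Rough sketch of points 1 and 2: drop anything under 4 stars, then
    # rank what's left by review count. Data shape and thresholds are
    # assumptions for illustration, not Yelp's actual API or fields.
    businesses = [
        {"name": "Cafe A", "stars": 4.5, "review_count": 812},
        {"name": "Diner B", "stars": 3.5, "review_count": 1490},
        {"name": "Taqueria C", "stars": 4.0, "review_count": 96},
    ]

    MIN_STARS = 4.0
    MIN_REVIEWS = 50  # below this, the star rating means very little

    shortlist = sorted(
        (b for b in businesses
         if b["stars"] >= MIN_STARS and b["review_count"] >= MIN_REVIEWS),
        key=lambda b: b["review_count"],
        reverse=True,
    )

    for b in shortlist:
        print(f'{b["name"]}: {b["stars"]} stars, {b["review_count"]} reviews')

Sorting the survivors by review count rather than by stars sidesteps the compressed 4 - 4.5 scale: once everything left clears the bar, volume is the more informative signal.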


This bothers the hell out of me. I reflexively rate anything on any platform 4 stars if I like it, leaving one star as "room for improvement" if I can think of any one thing I wish they'd done differently.

But the modern convention is 5 stars as "good", and getting too many 4s is a failure.


> Normalize for the true Yelp rating scale of 4 - 4.5 stars. Filter out anything below a 4 star rating. If a small business is unable to achieve 4 stars on Yelp, then something must be really wrong.

My experience is that higher star ratings on Yelp equate mainly to presentation over taste and that higher prices tend to mean a better rating (Out of a few average Chinese food places, the higher priced one will be rated the highest).


>higher prices tend to mean a better rating

If you assume an efficient market, they presumably wouldn't be able to charge a higher price if people didn't like them better. Of course, the average rater may care more about presentation than you do. A lot of people do go to restaurants more for the "experience" than for the food.


I wonder if a breakdown by aspect would be better: one to five stars each for things like price, service, choices, quality, and so on. No single overall score for the entire experience with an often useless text description of what supposedly happened.
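
Concretely, a per-aspect review could be as simple as the sketch below (Python; the field names are purely illustrative assumptions):

    # One possible shape for a per-aspect review instead of a single
    # overall star rating. Field names are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class AspectReview:
        price: int    # 1-5: value for money
        service: int  # 1-5: staff and wait times
        choices: int  # 1-5: breadth of the menu
        quality: int  # 1-5: how good it actually was

        def __post_init__(self):
            for name, score in vars(self).items():
                if not 1 <= score <= 5:
                    raise ValueError(f"{name} must be between 1 and 5")

    print(AspectReview(price=2, service=4, choices=3, quality=5))

Averaging each field across reviews would then give a per-aspect profile instead of one mushy overall number.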

But, I think in the long run it doesn't matter. It just seems consumer-based reviews are useless overall and people should stop relying on them.


Reviews and recommendations are harder than you'd expect.

I run the restaurants team at TripAdvisor and we're working to solve problems in finding the best place to eat. If you think you can do better, you may be right; come prove it!

Drop me a line, especially if you're excited to work on big piles of data, NLP, ML, a heap of phone stuff and crazy stuff like using word concreteness scores to analyze review value.


Oddly enough, in the Peninsula I think it is reversed.

You get restaurants that would be a solid 3 stars (maybe even 4!) anywhere else in the country, but because they are competing against all the hot new places that food snobs like me love, they are only 2-3 stars max.


It's interesting to compare to Japan's version, tabelog.com

IIRC the highest rated restaurant on the site is 4.62 (I might be off, but it's not 5.0). Most restaurants are in the 3.0 to 3.6 range.

A curious difference in culture compared to Yelp.


TripAdvisor's reviews have the same problem, but usually they've been faked all the way up to five stars. It's ridiculous.

And these days, it seems average restaurants are sporting the "TripAdvisor Certificate Of Excellence", so now that's meaningless too. I've seen it on one genuinely top-class Indian restaurant, but recently saw it on some distinctly mediocre places in Singapore.


The trick with TA is that you want to look at all 1, 2, and 3 star reviews and ignore the 4 and 5 star ones (just assume they're all fake). Then you know the "worst case scenario" with a specific establishment, the worst it can throw at you.

Stereotypically, most negative reviews are about booking issues (e.g. a lost booking, or someone arrived too late and the booking was given away), or they're moaning about the free food or the free/paid WiFi.

I don't think management replies add anything, unless they're being a jerk/dismissive. It is better just to say nothing than to be a jerk, since others reading the reply will assume that they will be treated the same way.


That sounds like a useful way of looking at the reviews. I'll try it, thanks!


TA does their own review vetting (of course), but it definitely seems like they're less selective than Yelp. As a result, they don't seem to get badmouthed by businesses as often, but that has downsides, as you point out.

In my experience, TA seems to have a larger self-reinforcing effect than Yelp: Top 10 places get more great reviews, stay top 10, even if they're dropping off in relative value or weren't that great to begin with.


Wrong, probably wrong, and wrong. Yelp gets lots of usage from San Franciscans in SF. I'd be curious to see a few of your 4+ star cafes that you don't think deserve it. And of course it has little or nothing to do with payoffs, as all the dropped lawsuits demonstrate.



