
Wait! It gets worse! There are sellers who pay reviewers to write bad reviews on competitors' products.

Even the honest sellers are forced to buy fake reviews, because their competitors are winning with fake reviews. This is the cost of replacing humans with algorithms before the algorithms are on par in accuracy.




How does a human detect a fake bad review?

The problem here is not replacing humans with algorithms; it's replacing trusted humans with arbitrary humans (and not even checking whether one human is pretending to be many humans). This is the same problem as online forums and comment sections. We need a way to build a web of trust, so I can tap my network for reviews but blacklist the shills who pop up.
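A toy sketch of what that web of trust could look like: walk outward from yourself over "who vouches for whom" edges, accept reviews only from people within a few hops, and prune anyone on your blacklist. All names and the graph structure here are made up for illustration.

```python
from collections import deque

# Hypothetical trust graph: each user lists the people they vouch for.
trust_edges = {
    "me": ["alice", "bob"],
    "alice": ["carol"],
    "bob": ["dave"],
}
blacklist = {"dave"}  # known shills, pruned from the walk

def trusted_reviewers(root, max_hops=2):
    """Collect reviewers reachable within max_hops, skipping blacklisted ones."""
    seen, queue = {root}, deque([(root, 0)])
    trusted = set()
    while queue:
        user, hops = queue.popleft()
        if hops == max_hops:
            continue
        for friend in trust_edges.get(user, []):
            if friend in seen or friend in blacklist:
                continue
            seen.add(friend)
            trusted.add(friend)
            queue.append((friend, hops + 1))
    return trusted

print(trusted_reviewers("me"))  # alice and bob directly, carol via alice; dave is blacklisted
```

The hard part, of course, is bootstrapping the graph and detecting when one person controls many nodes; this only shows the filtering side.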


You are right about replacing trusted humans (word of mouth) with arbitrary humans (reviews). But the algorithm part comes in because of the volume of those arbitrary humans' reviews. The algorithm crunches these reviews/ratings and determines the top products, which is deeply flawed at the core. Today, our buying pattern is driven by scale (volume of ratings) rather than quality (word of mouth).


Maybe require that the reviewer purchased the product on Amazon before counting a sub-3-star review?

There's no reason the "average" from the reviews presented needs to count them all... the search ratings could easily exclude reviews from accounts without a confirmed purchase.
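A minimal sketch of that filter, assuming each review carries a verified-purchase flag (the data and field names are invented for illustration):

```python
# Hypothetical review records; only "verified" ones count toward the average.
reviews = [
    {"stars": 5, "verified": True},
    {"stars": 1, "verified": False},  # drive-by review, no confirmed purchase
    {"stars": 4, "verified": True},
    {"stars": 1, "verified": True},
]

def verified_average(reviews):
    """Average star rating over verified purchases only; None if there are none."""
    verified = [r["stars"] for r in reviews if r["verified"]]
    return sum(verified) / len(verified) if verified else None

print(round(verified_average(reviews), 2))  # 3.33 instead of the raw 2.75
```

As the reply below this comment notes, paid reviewers can still make a real purchase and get refunded off-platform, so this filter raises the cost of fake reviews rather than eliminating them.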


As stated in the article: the reviewer did purchase the product on Amazon and is getting refunded through PayPal.



