Fake Amazon Tech Book Reviews Are Hurting My Book (perl.org)
322 points by Ovid on March 13, 2015 | 117 comments



Anything with less than a real critical mass of reviews (less than hundreds) is way too easy to entirely game on Amazon. I wrote a book quite a few years ago on a niche tech topic. To be entirely honest, very few people read it. The positive reviews of the book were almost all from people I knew personally. Then the 1-star reviews started coming in. Each 1-star review had no actual content in the review. It was clear that the writers of these BS reviews hadn't read the book at all, since they didn't mention anything specific whatsoever. However, each of the 1-star reviews DID manage to mention a competing book that was "better". And all the bad reviews mentioned the same exact competing book. It was ridiculously obvious that someone was bashing my book to push a competing book instead, without any actual substance. At first I cared, and asked Amazon to remove the reviews, which they did. Then more BS reviews came back, I had them removed, they came back again, and eventually I just got tired and stopped caring.


How about leaving a 1-star review yourself to explain the spamming? Anyone considering buying the book will not just look at the rating but will also see what complaints people have (at least I do, to see if they mention something I would have a problem with).

I haven't really thought this through, so I'm open to arguments here.


That's actually a good idea, because you can vote on reviews on Amazon and it might become the 'most helpful negative review.'


That's awful. Has no one automated the process of checking for bogus reviews and reporting them?


Yelp has a system. It doesn't seem to be perfect; I've seen what looks like both false-negative and false-positive classification of fake reviews. But it seems to work decently well.

One thing I do like about their system is that you can click a link to read the reviews they consider "not recommended."

Here's what they say about it publicly: http://officialblog.yelp.com/2013/11/yelp-recommended-review...


Let me just say that Yelp's classification system is complete crap. Until a user starts actively using Yelp, their reviews get filtered out, which excludes most people who only post a review because they loved a certain business. Sometimes, five star reviews will vanish after months of being up, all because of a change in the algorithm. And the Yelp business team is extremely unresponsive when there is a clearly illegitimate review that needs to be removed, or really when there is any concern whatsoever.

This may not be the best place to bash Yelp, but if you're reading this, please don't pay Yelp for their exorbitantly-priced business service. It is absolute garbage.


Also it discourages any new people from participating. I signed up, posted a few reviews of businesses I'd been going to for years, then noticed they got filtered out and stopped participating.

And frankly, I don't seem to have a lot of tastes in common with the hardcore Yelp users.


I reached this conclusion when the top two restaurants in a big city were a Mexican place and Greek food truck.


Heh, I actually agree with your (and emodendroket's) observation that Yelp buries reviews by new posters. (That's discouraged me from posting, too.)

Maybe I was too generous by saying their system works "decently well"; though I don't think I'd say "complete crap"--unless my business had been burned by fake bad reviews, which it sounds like happened to you.

Perhaps this illustrates how subtle/difficult this problem is.


It's not automated, but probably outsourced, which makes it much harder to track.


How do you tell a bogus review from a real one, except by having it actually read by a human? This would be pretty expensive.


The good news is that you don't need to have a human read them all, because many bogus reviews tend to be written by the same "customers". Simply seeing a correlation of bad books getting good reviews would result in those "customers" losing the ranking power of their reviews.

What's bad (really bad) here is that not only are the books being ranked highly absolute crap, but they are crap in different categories. E.g., a "book" on Swift that was apparently "written" in a few hours, shoved onto Amazon, and falsely ranked up now ranks ahead of a 700-page opus on Perl, in the Perl category.

That's a total fail on Amazon's part, and really, really disappointing.


I don't find it at all surprising, however. Amazon can't even manage a half-decent search across the products they carry. I have this feeling that somewhere in Amazon, it has been decreed that the search and the rankings and everything else must be filtered through the same system that generates recommendations. You search for "4TB hard drive" and the 8th search result is a 3TB drive. Their site skips over much more relevant items apparently because the items are more strongly 'related' in terms of purchase history or browsing history or something. I expect exactly that kind of system is what is being gamed in these circumstances.

When it comes to reviews, the correct answer is that every user should have their tastes profiled and compared against the tastes of other users, with only the users whose tastes match in other areas counting strongly. I could easily imagine that this sort of thing might be difficult for Amazon to implement, though, depending on their infrastructure.
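To make that concrete, here's a minimal sketch of that kind of taste-weighted scoring, assuming you already have each user's past star ratings keyed by item; the names and data layout are made up for illustration:

    from math import sqrt

    def taste_similarity(a, b):
        """Cosine similarity of two users over the items both have rated."""
        common = set(a) & set(b)
        if not common:
            return 0.0
        dot = sum(a[i] * b[i] for i in common)
        return dot / (sqrt(sum(a[i] ** 2 for i in common)) *
                      sqrt(sum(b[i] ** 2 for i in common)))

    def weighted_score(my_ratings, reviews):
        """reviews: list of (reviewer_ratings, stars) for one product."""
        num = den = 0.0
        for reviewer_ratings, stars in reviews:
            w = taste_similarity(my_ratings, reviewer_ratings)
            num += w * stars
            den += w
        return num / den if den else None

A shill account with no rating overlap with you would contribute nothing to the score it's trying to inflate.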


Amazon doesn't care about sellers. They simply don't. All they care about are customers. Don't be disappointed, as Amazon doesn't owe you anything.


In this case there were a few things that made it glaringly obvious (to a human at least).

The bogus reviews had one or more of these aspects:

- bad reviews from reviewers with no other reviews

- mentions of the same competing book in my bad reviews

- 5 star ratings of that mentioned competing book (and no other reviews beyond those)

- 1-star reviews of another random book. This one was interesting, there was another book (Service Oriented Architecture for Dummies) that is entirely unrelated. But 2 of the reviewers also gave shitty reviews to that book as well as mine. There's no logical reason that the same people badmouthing my book would have also read, reviewed, and badmouthed that other book (totally different topic) within a few days. So they were clearly hired for a number of books at once.

So yeah, you'd think some of those patterns would be easy to automatically detect. Although I assume it's constantly a game of cat and mouse.
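To make that concrete, the patterns above are simple enough to express as crude heuristics; this is just a sketch, and the review/reviewer fields are a made-up data model:

    def suspicion_score(review, reviewer_history, competing_titles):
        """Count how many tell-tale patterns a bad review matches."""
        score = 0
        if len(reviewer_history) <= 1:                  # no other reviews
            score += 1
        text = review["text"].lower()
        if any(t.lower() in text for t in competing_titles):
            score += 1                                  # plugs a rival book
        if all(r["stars"] in (1, 5) for r in reviewer_history):
            score += 1                                  # only rates in extremes
        if len(text.split()) < 15:                      # no real content
            score += 1
        return score  # e.g. queue for a human look when score >= 3

Of course, the hard part is tuning something like this so it doesn't flag terse-but-genuine reviewers, which is where the cat and mouse starts.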


You could probably apply ML techniques, similar to how spam is classified: build a classifier that estimates the likelihood of a review being bogus, then flag it for human review if it exceeds a certain threshold.
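A minimal sketch of that flow with scikit-learn; the toy training data, features, and threshold are purely illustrative, and in practice the labels would come from reviews humans have already judged:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = ["Great book, really helped me learn the topic fast!!!",
             "Chapter 4's coverage of closures is muddled; the examples don't compile.",
             "Amazing, better than every other book, buy this one instead!",
             "Solid intro, though the regex chapter assumes prior knowledge."]
    labels = [1, 0, 1, 0]  # 1 = bogus, 0 = legitimate

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(texts, labels)

    def needs_human_review(review_text, threshold=0.8):
        p_bogus = clf.predict_proba([review_text])[0][1]  # probability of class 1
        return p_bogus >= threshold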


They probably already have this, to be honest. Some of the fake reviews used keywords from the book title, probably in a learned attempt to game the classification.


Interestingly, Amazon can mine a great deal of the reviewers' histories to determine fake reviews. For example, a longtime customer of Amazon would get a higher "trustworthy" score for reviews than one who signed up 2 weeks ago.


Honestly, this should be no harder than spam classification.


Expensive or not, Amazon has an obligation to make sure their review system has high integrity and is not "gamed" or no one will trust Amazon reviews anymore.


Not to mention the fact that pushing $3 spambooks that probably have high refund rates ahead of $30 expert guides in the rankings could and should lose Amazon money.


But these are all Kindle books which you can't request refunds for.


I guess they could limit reviewing to accounts that have spent a decent chunk of cash and aggressively ban accounts that write fake reviews.


Reviews that all mention the same book, or that have little or no content, and that carry the same rating are suspicious and easily detectable by a trivial scan and compare against existing reviews. A harder problem is when the bogus review content becomes more "clever" (e.g. varied ratings, added "entropy" in the content such as misspellings, etc.).
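For what it's worth, that trivial scan might look something like this; the data model and thresholds are just guesses:

    from collections import defaultdict

    def flag_clusters(reviews, competing_titles, min_cluster=3):
        """Group a product's short reviews by (rating, rival title mentioned)."""
        clusters = defaultdict(list)
        for r in reviews:
            text = r["text"].lower()
            mentioned = next((t for t in competing_titles if t.lower() in text), None)
            if mentioned and len(text.split()) < 30:
                clusters[(r["stars"], mentioned)].append(r["id"])
        # several near-identical short reviews pushing the same rival book
        return {k: v for k, v in clusters.items() if len(v) >= min_cluster}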


Amazon Mechanical Turk?


It was only a matter of time before the review system was attacked, and they are losing.

From sponsored "impartial" reviews to niche items with way too many reviews and a mismatched helpful/not helpful or Q&A area, Amazon reviews are starting to lose their value.


I recently came across this[1] book, which was suggested to me after having purchased Randall Munroe's "What If?" xkcd book. At least this book is relatively up-front about what it is in its description, but it's clearly a blatant attempt to piggyback on the original book's success.

There are a few legitimate reviews that indicate this book doesn't really have much content, and then 2+ dozen fake 5-star reviews praising the book highly and saying how much better it is than Munroe's book. If you look at the review history for any of the 5-star reviewers, they all seem to be reviewing the same products, and often they are reviewing 3-5 books per day.

I reported the reviews via feedback, but never received a response. Ultimately it's just disappointing to see this kind of thing happening.

[1] http://amzn.com/1503234924


Wondering what your book was! I'm writing Linux for Hobbyists right now and wondering if it's a bit too niche.


It was Adobe Flex 3 for Dummies. Niche book on a tech topic that flamed out after a few years. If you've never written a book I'd still probably recommend doing it once. But don't expect to make any money, besides whatever advance you get. But it's still damn cool to be able to say you wrote a book.


Wow, that one's super niche haha!! Sure, I'm self-publishing. Just glad to say I'm an author and hoping to make a little bit of cash too, we'll see.


I always thought that technical book writing wasn't about making money directly, but about upselling in other areas.

E.g., improving your resume, looking prestigious to potential consulting clients, or landing paid speaking gigs.


This is correct. I was doing well in my career before the book, but it really took off after the book. However, I was already co-author of one book before that, and I blog prolifically, so that skews this.

The way it works is simple. Publishers hope to sell about 5,000 or so copies of your typical tech book. That's enough to keep the publisher going, but not enough to really make money for them. The publisher keeps those "filler" books going to keep up street cred and hopes to sometimes land the "killer" book that makes them a nice profit.

I can't share the sales info for my book, but let's just say that I think the book is doing better than one might expect for its category.

If you want to write a tech book, you want to go with O'Reilly. My advance was much larger than what O'Reilly would offer, but they sell more tech books in the long run. However, O'Reilly's Perl book section is saturated and they generally don't take new authors.

What I did to reposition my book was a strategy I wish more tech books would take: I focused on jobs. Instead of teaching every esoteric niche of a language (though I covered it well), I focused very heavily on skills that employers look for. I've been doing this for a long, long time and am very familiar with employer needs, so I was well-positioned to do this. That has led to my book doing better than expected: here's what you need for a job!

I now have a successful Perl consulting firm (http://www.allaroundtheworld.fr/) and that book, while I can't prove it helped, definitely didn't hurt.

Oh, and my publisher has spoken to an Amazon rep and a contact I have in Amazon tells me that they're aware of the situation and are looking into it. I don't have much hope, but who knows?


If you have a few minutes to report the issues with the Perl category to Amazon, they accept "catalog feedback" through their 24/7 live chat system.

https://www.amazon.com/gp/help/customer/contact-us

#1: Ignore this section, skip to #2.

#2: From the dropdowns, select: "More order issues", "Give Amazon feedback", "Other feedback topic". In the text box, enter "Catalog feedback".

#3: Click "Chat".

It took me around five minutes to explain that several of the books in the top 100 were incorrectly listed in the Perl category, and they happily accepted the feedback.

Including a direct link to the Perl category helps them tremendously:

http://www.amazon.com/gp/bestsellers/books/6134005011

And I also included a link to the post above.

Edit: Tell them specifically which books, in which top 100 position, are in the wrong category, so that they know which ones aren't Perl.


I wish Amazon had a "This looks fake" button above each review. I would click it multiple times per week.

Most fake reviews are easy to spot. The simple test:

Could this review have been written about any other book in this category without changing a word?

I see fake reviews all the time for Kindle books written by indie authors. A mark of quality in a book is when there appear to be no 5 star fake reviews, but several written by real people, even if there are only 3 or 4 reviews.

Some fake reviews are harder to spot, though. For example, I suspect this account, which has been around for many years, is more than one person (perhaps a PR firm) hand writing unique reviews for each one:

http://www.amazon.com/gp/pdp/profile/A130YN8T37O833

All reviews are either 1 star or 5 stars. There is content-specific information, but it's all stuff you can get from tech specs or descriptions - no sense that the person actually used the product, apart from generic intro paragraphs.

I think there is also fake voting on really good reviews. The review that I felt was the best review I ever wrote was downvoted more than any other review I've written. I don't really know if the downvoting was fake - maybe it wasn't helpful because I delved into too much technical detail:

http://www.amazon.com/review/R14QK0B7HRE5L8

However, most of the glowingly favorable reviews for this book have unanimous thumbs up, and the first 3 are written by "A Customer" which I'm guessing means the account of the reviewer has been terminated.


I don't think it's that easy. My book on Amazon currently has one 5-star review that's just one word and two 5-star reviews with two words each. I am the author and the publisher, and I know for a fact that these reviews are not fake. But they probably come across as fake. I think that there are some people who just want to share their opinion of a book without making the effort to write a good review.

Personally, I'd prefer to have only reviews backed up with an explanation. If a book is good, tell me why. If it isn't good, I'd like to know why you think it isn't. That's the only way to make the next edition better.


I have noticed an increase of reviews that typically contain a subject of "<Number> Stars", and either no review body, or a very short review body, such as "good book".

I too hate reviews like that, but I think it's due to Amazon nagging customers to leave reviews for things that are purchased on the Kindle. I am beginning to suspect they have an app or form somewhere where you tap the number of stars and write your review in a text block.


The Kindle prompts to "Rate this book" when you reach the end of an e-book. It doesn't give you the option to enter a review, but only select the number of stars.


They definitely do! I get follow up e-mails for most of my Amazon purchases which direct me to just such a form.


Crowdsourcing the spam identification would be a great move.

Unfortunately it would also reveal to customers that there are fake reviews, causing them to lose confidence in and lessen the value of reviews. They may then read reviews and subconsciously be evaluating if the comment is fake, not if the product is a good buy.

If you hid the "Irrelevant Review" button under a dropdown it might work better, since it wouldn't be there as a constant reminder.


They do this - Was this comment helpful? Click no and you can report it along with a brief message. It's either ineffective or doesn't do anything at all, can't tell.


"not helpful" and "looks fake" are not the same. A review can be not helpful for many reasons, including:

* one or more incorrect facts

* goes on and on about one minor point while ignoring major stuff

* too brief

* reveals spoiler without a warning

Etc.


Reviews are sorted, by default, by how helpful they are rated.


The effectiveness of reviews is predicated on the population of 'helpful' reviewers being greater than the population of spam reviewers after some level of filtering.

It's unclear if a crowdsourced flag would tip the balance of helpful vs spam in any useful way for categories where there's already a unhelpful level of spam.


You could have a reputation system where vote weight is gained for correct identification of upvoted bogus reviews. Then your spammers are forced to fight each other in order to upvote their own spam. Start everyone's voting weight at zero until they've spent $x.
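A rough sketch of how those vote weights might evolve; the constants and the idea of checking flags against an eventual human verdict are assumptions for illustration, not anything Amazon actually does:

    MIN_SPEND = 50.0  # stands in for the "$x" above; arbitrary for illustration

    class Voter:
        def __init__(self, total_spend):
            # no voting power until a real purchase history exists
            self.weight = 1.0 if total_spend >= MIN_SPEND else 0.0

        def record_verdict(self, flagged_bogus, verdict_bogus):
            if self.weight == 0.0:
                return
            # agreeing with the eventual human verdict earns weight;
            # disagreeing costs more than agreeing earns
            self.weight += 0.1 if flagged_bogus == verdict_bogus else -0.2
            self.weight = max(self.weight, 0.0)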


Would it really reveal the existence of fake reviews? How many people are unaware of them? Sure non tech-savvy people maybe, but those people are aging rapidly. Amazon needs to get ahead of this problem, because if the end result is that their rankings are shit, that will be much much worse for them in the long run.


I worked on the customer review team at Amazon once upon a time. There are definitely fake reviews, and Amazon is well aware of "review rings" where people trade reviews, vote each other's reviews up, etc. Much like Google, they stop some and allow others, because the more feedback you give scammers in general, the better they can adapt.

In short: fake reviews and fake review upvotes very much exist and are WAY more rampant than you'd ever believe. It's an 80/20 or maybe 90/10 problem - most of the review content on Amazon comes from a small percentage of active users, and as soon as you become active on the site you start to get sucked into this sort of thing.


The problem is your "this looks fake" button would be as easy to game as the current system. People with a financial interest in a book would be marking every negative review as fake.


As a complete aside, do you have any "Focused Value" investing books you would recommend?

I think you were probably being downvoted for being overly technical though, yes.


Common Stocks and Uncommon Profits by Philip A. Fisher is the best book I've read about the process of getting to intimately know one specific company. It was written long before the phrase "focus investing" was coined. Warren Buffett and Charlie Munger frequently recommend this book.


It's like that on the Android App Store as well - a ton of one-star reviews with no reason.


Web cache version, since the site is down for me: http://webcache.googleusercontent.com/search?q=cache:blogs.p...

While not really related, I seem to run across a lot of books on Amazon that have a high number of 5 star reviews that, upon reading, are just bad. Seems to happen a lot with books by independent sci-fi authors with Kindle-only books...

Of course, "bad" is just my opinion.


Strangely both the cached version and the website only work for me if I hit Chrome's "stop loading" button while they load.

Seems to be this JS file's fault: http://blogs.perl.org/users/ovid/mt.js


I've taken to looking at a reviewer's "other" reviews when I see a slew of 1- or 5-star reviews for anything... It becomes pretty obvious which ones are real or fake.

I've come to trust those items with all 4-5 star reviews far less than I used to.


Do you leave your own review on those books to counteract the unearned rating?


Usually yes, but not always. I wish Amazon let you just give a star rating, instead of requiring you to add a real comment.


I'd prefer it the other way around, actually. Forget the rating, maybe just give a thumbs-up/thumbs-down, and require at least a couple of sentences' worth of review.

It's fairly easy to ignore a bogus review when you look at it, but the star system makes it too easy to take them at face value.


In light of that recent article about computer-generated prose being in many cases indistinguishable from human-written, I'd say those fake reviews are likely written by a computer. Notice that they're mostly of the form "I didn't think I'd be interested in [TOPIC] but [TITLE] was really interesting and really helped me learn about [TOPIC]."


That's an interesting thought. I'm pretty sure I could write up a nice grammar to churn out "interesting" reviews of books. Instead of whining about my book getting buried by fake reviews, maybe I should chuck it and cash in on said reviews!
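(Half-joking, but the template the parent describes really is about this trivial to mechanize; the titles and topics here are invented:)

    import random

    TEMPLATES = [
        "I didn't think I'd be interested in {topic} but {title} was really "
        "interesting and really helped me learn about {topic}.",
        "{title} is the best {topic} book I have ever read, highly recommended!",
    ]

    def fake_review(title, topic):
        return random.choice(TEMPLATES).format(title=title, topic=topic)

    print(fake_review("Learn Perl in a Weekend", "Perl"))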


There should be very obvious ways to identify "review bots" - especially given that reviews are tied to an Amazon account's purchase and browsing history.

That they don't have an efficient algorithm for this sounds more like they don't really care and never bothered with it.


I'd say it's more likely that this is a complex problem, and will take a bit of time and computing power to work through...

Amazon has hundreds of thousands of products with tens of millions of reviews... correlating that with log history for each review will take a lot of time... not just to run, but to write any automated process, and work through resolving it.

It seems to me that Amazon seems to be pretty responsive when bot reviews are pointed out, and that may be, or at least have been a more effective strategy... But looking at an article a few days ago regarding twitter bot nets, and even seeing them try to draw me in... it's a very large problem all around.

Bad people will do bad things... as will misguided people. The bigger issue is the false positives... we've all read the horror stories of when a legitimate domain gets screwed by (insert popular domain registrar here) because of incorrect reporting/reaction... or when a business' Google Apps account is offline, and nobody can be reached at Google... it happens.

In the case mentioned in TFA... it's probably prudent to ban the publisher in question. In others, the case may well be different.


> I'd say it's more likely that this is a complex problem, and will take a bit of time and computing power to work through...

It is a complex problem - but Amazon has a serious advantage over other sites that have to deal with such issues (e.g. Twitter), in that they have significantly more information on each user. I don't think Amazon is short on computing power either.

Taking into account order and browsing history, product review trends, linguistic similarities in review posts, etc., they should be able to get very low error rates in identification.

Further, unlike something like Twitter feeds, it's quite possible to silently de-prioritize abusive reviewers and associated products. Really, I'm quite surprised at how bad a job they are doing - most of these cases are so blatant and obvious they should not require an author and a live representative to resolve.


If you can't beat them, join them! Of course your comment was meant sarcastically, but I wonder if authors (of genuine books, not the spammy ones) might turn to posting fake reviews and justify it by saying that they have no other choice.

We can see this happening in the case of SEO with many white hat sites employing black/gray hat techniques simply to maintain their current positions in Google.


Are you really suggesting the best solution is to make the reviews more broken and useless?


Good point. If a small number of parties are selling such a bot service, it may be easier for Amazon to detect and block them at origin.


I'm also surprised that Amazon has such a poor handle on their product reviews. Obviously fake reviews absolutely litter the site. Speaking as only a consumer, it makes the process of purchasing much more cumbersome than it ought to be.

It's not reasonable to expect Amazon (or anyone, really) to detect fraudulent behavior with perfect accuracy. I have to agree with the OP, though, in that they should be able to do a lot better. Many fake reviews are blatantly fake, and could easily be flagged by a relatively naive set of heuristics.


> Many fake reviews are blatantly fake, and could easily be flagged by a relatively naive set of heuristics.

Yes, that's true, but I think this is a question of scale. Amazon is the world's largest e-commerce site (well, I think they're #2 after Alibaba now) and I've worked on some of the world's largest sites. Even "trivial" solutions become much harder at that scale.


I don't think this is unique to Amazon.

I've gotten this on the iTunes app store quite a bit. Low ratings from people who don't review any other products (or if they do, it looks super spammy).

On top of that, the bad reviews tend to come in bunches (multiple in a day), are void of any actual useful feedback or complaints and are just vague & angry (sucks, ripoff, etc).

I've had decent luck with having Apple remove any that are WAY off topic, but it still sucks.

Google Play allows you to reply to reviews (which helps quite a bit as far as either helping the customer or correcting a misconception). I'm not sure what Amazon can change on that front.


Of course in this case, he isn't getting bad reviews on his book. False good reviews on other books are pushing those "books" ahead of him.

Any type of system to respond would be useless for this particular problem.


Upvoted for visibility. A bit related: another recent article on fake Amazon reviews: https://news.ycombinator.com/item?id=9136614


OP here.

I really can't say as much as I would like (there's stuff I can't share), but my publisher had a face-to-face with an Amazon rep and internal action was taken. Amazon's investigation is apparently over. The internal position seems to be "we're making money, there are words on pages, so there's no problem here." Amazon's investigation was short and sweet. Some bogus reviewers were removed, but Felicity — one of the worst offenders (http://www.amazon.com/gp/pdp/profile/A1NT2YXTUES4RW/ref=cm_c...) — is still there, despite the obvious fact that these are fake reviews. Many other obviously fake reviews remain. In fact, a new fake book with fake reviewers showed up. I genuinely do not know if this response is because of a careless employee or if Amazon discourages employees from shutting down profit streams.


Much like search engines combat black hat SEO, amazon and reviews systems can do the same.

It would not be too hard for Amazon. Use reviews that are a "Verified Purchase" and highly "helpful" reviews as a training/test set, then machine-learn a weighting for every review based on features of the product, the reviewer, other reviews, etc.

The outcome could actually hinder tactics like this for products that game the system much like how link farms hurt sites they were propping up when Google figured out how to stop sites from gaming a search engine.
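A sketch of that training-signal idea; the feature names, the data model, and the "trusted" cutoff are all assumptions for illustration:

    from sklearn.ensemble import GradientBoostingClassifier

    def label(review):
        # verified purchase with a strong helpful ratio = "trusted" example
        helpful_ratio = review["helpful"] / max(review["votes"], 1)
        return 1 if review["verified"] and helpful_ratio > 0.8 else 0

    def features(review, reviewer):
        return [reviewer["account_age_days"], reviewer["num_reviews"],
                reviewer["total_spend"], len(review["text"].split()),
                review["stars"]]

    def train(reviews, reviewers):
        X = [features(r, reviewers[r["reviewer_id"]]) for r in reviews]
        y = [label(r) for r in reviews]
        model = GradientBoostingClassifier().fit(X, y)
        return model  # model.predict_proba(...)[:, 1] becomes each review's weight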


In this case, the positive fake reviews are for verified purchases of ebooks that are relatively cheap... I'd suggest reviews from accounts more than 2 years old, with over 5 reviews, and more than $500 spent would probably be a better baseline for training.


It's always a cat and mouse game, however. Once they employ machine learning, I'm assuming the spammers will adapt and create reviews that fool the algorithm (though at an increased initial cost).


Trusting "verified purchase" reviews is a pretty easy way to allow a lot of fake reviews. On a 99c ebook your royalty is 35%, that means a verified fake review only costs you 65c.


I wonder if any NLP approaches could yield an improvement on this? I immediately think of doing some NER on the review (looking for the title, authors, chapters, even specific topics) and if none of the above are mentioned, or at least not mentioned "enough", then it can be flagged for review as potential spam. Likewise, if you did sentiment analysis on a 5 star review, but the sentiment was either neutral or negative, it's likely not a very useful review.

I'm sure there's a lot that could be done with this, and some run-of-the-mill NLP seems like it could at least help. I'm not sure about the plausibility of this at a large scale, but it seems like an interesting problem nonetheless.
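A sketch of those two checks (mentions of book-specific terms, and sentiment that doesn't match the star rating) using NLTK's VADER analyzer; the thresholds and the "enough mentions" rule are guesses:

    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)
    sia = SentimentIntensityAnalyzer()

    def looks_spammy(review_text, stars, book_terms):
        """book_terms: title, author, chapter names, key topics of this book."""
        text = review_text.lower()
        mentions = sum(term.lower() in text for term in book_terms)
        sentiment = sia.polarity_scores(review_text)["compound"]  # -1..1
        # a 5-star review that never names anything specific, or whose text
        # doesn't actually read as positive, gets flagged for a human look
        return mentions == 0 or (stars == 5 and sentiment < 0.2)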


I would say just include a code inside the box that you can type in to validate your review as an actual purchaser. Then Amazon could heavily weight these reviews, at least for products known to be prone to scam reviews.


They already have a system for this: "Amazon Verified Purchase" appears below the title of a review from someone who bought the book (from Amazon, of course). They could show only verified reviews by default, or weight verified reviews higher than unverified ones. It's too easy to create accounts and reviews to keep letting this slide.


How about writing high-quality responses to the bad reviews? I have seen this done by some authors and it usually stops the meaningless reviews, since each additional spam review only creates a new opportunity for the author to present their marketing perspective via comment.

If this were a novel, economically hostile and unsubstantiated feedback would be considered economic warfare.


This is a romantic idea, but the obvious problem is that bad computer generated reviews are cheap and easy to produce, so there are naturally too many of them for the author to reply to individually.


Amazon will delete spam reviews if they are reported. Reviewers and reviews get meta-reviewed, and poorly meta-reviewed reviews won't appear in the featured reviews for your book.

BUT that means you still need a critical mass of genuine reviewers who give your book positive reviews, and for technical books, sometimes that's hard to come by. One way to do this is to get lots of people reviewing your manuscript pre-publication. That has lots of other obvious benefits, too.

Reply to reviews, especially if a reader had difficulty with a code example and you can help them.


I used to love reading tech books. Now I am writing a book of my own, and there are SO many incentives to churn out a low-quality book.

As an author, you make very little: maybe 5% of the cover price. So money is not an incentive. You are either writing because a. it looks good on your resume, or b. you really care about this topic and feel like this book should already exist.

Books are sold based on number of pages, not quality of content. The publishers know that if your book has 100 more pages, they can tack on another $5 - $10 to the price. So there's heavy incentive to produce a lot of content.

Readers don't want to buy multiple books: they want the one book that will cover all of their needs. So they will buy mostly based on the table of contents.

So if you want to optimize for your own monetary gain, the best book you can write is one that is big and has an impressive table of contents, but took very little time to write. So we end up with books that have a very poor signal-to-noise ratio. For example: my book has code samples in Python. A couple of readers have asked me to write an appendix that shows them how to set up Python on their computer, play with the REPL, etc. I think this is totally useless. They can google and get up-to-date information, and the appendix will be out of date soon. But it is very easy for me to write and would add pages to the book.

So as an author, the best job you can do is pump out a bunch of fluff and then pay for reviews on amazon. It is really frustrating to see, but that's how the incentives are set up here.


You make significantly more than 5% of the cover price if you self-publish.

There isn't much reason to use a publisher, unless it's a reputable one who fronts you an advance and knows how to market your book for increased sales.


Again, it depends on why you are writing the book. I have heard that you make $10k on avg if you use a publisher, vs $50k avg if you self-publish. If you are doing it for the money, self-publishing might be the way to go.

I'm using a publisher because:

- They have the resources to market the book. The book's not done yet and I've already sold ~2000 advance copies. I'm trying to reach as many people with this book as possible, and it would have taken me a lot longer to sell 2k copies on my own.

- I'm trying to learn how to write a tech book. My editor is the rare breed that cares about making a good book, and has 20 years of teaching + writing experience. I have learned a lot from him.


2000 advance copies? Holy shit! I have the wrong publisher!

(To be fair, they were awesome with me and I had a blast, but damn, 2000 advance copies?)

What's the topic of the book?


Here's my book: manning.com/bhargava/

It is one of their better selling MEAPs :) I have also put a lot of work into it. I don't think 2k is typical.


I had a look at the sample chapter and I like the way that you're explaining concepts with sketches.

I've already learned all the material listed in the table of contents, so I might not get much from reading it, but I wish I had seen that book 15 years ago.


Sometimes review abuse is more subtle than the example given in the article.

I found a recently published technical book on Amazon, which had a couple dozen five star reviews. I read a few of them, and it seemed great. The author's biography said he worked at X, a very well-known software company. I bought a paperback copy, read it, and was disappointed. The book wasn't terrible, but it wasn't written very well, and did not contain the technical depth that it appeared it would on the reviews. The book turned out to be self-published, and the technical editor was the author's boss at X. The other editors were the author's family members.

I went back and looked at the reviews, and found that many of the five-star reviewers worked at X, or if I couldn't figure out their employer, they happened to be located in the same metro area as X's headquarters. One of the Amazon reviewers is even mentioned in the acknowledgements of the book.

I think all the reviewers had good intentions to help their friend and colleague, but I think it's still misleading, as you cannot expect someone to give an impartial public review of their colleague's work.

All in all, the book was not total junk, but perhaps should have been a 3/5 star book instead of 5/5.


This is curious, because my g/f (who read the book and liked it) posted a good review of a novel I wrote and her review was removed because Amazon figured out she knew me. I have no idea how they did this: we don't live together and I'm not sure I've ever bought her anything off Amazon, certainly not at her present address.

So I have the impression that Amazon is pretty good at weeding out good reviews from people you know, while still being complete rubbish at policing obvious spam reviews, and doing a terrible job of suppressing reviews from obvious trolls (my book has a two-star review from a guy who also gave "A Tale of Two Cities" a two-star review, but you wouldn't know that unless you looked at his review history, and who can be bothered to do that?).

As both an author and book-buyer, Amazon's review system is a complete loss. Every single book I look at has both good and bad reviews with almost no way to tell if they are based on standards at all relevant to my taste. Any review that doesn't say something along the lines of "My taste runs to X and this was a great example of X" or "My taste runs to Y and while this was kinda-Y-like it failed in these respects" may as well not exist.


When I search for "perl" on amazon, I don't see any of the spam books now, so it looks like the issue is clearing up. Whether that's due to attention from Amazon or the public, I don't know.


Those reviews certainly seem pretty sketchy, and this sort of issue is definitely not unique to books.

What would be Amazon's best possible approach to dealing with this? Does there exist software good enough at distinguishing potentially fake reviews from real ones?


They should give more weight to the "verified purchase" reviews. I might even go as far as to say disallow reviews unless you actually bought the product; it would stop a lot of this, and the stupid review bandwagon that people seem to jump on.


This. This more than anything. Amazon is now in a position where it doesn't need reviews from people who don't actually own the book. Or better still, there could be a verified-purchase threshold that, once reached, triggers the hiding of unverified reviews.

Yes it still means a person can buy the book, then give it a bad review, then return it, however the effort to do this would be higher AND there would be a purchase pattern that begins to show itself.


You can easily detect that. If a person buys a lot of books and leaves a lot of 5-star or 1-star reviews, then returns them, that's clearly abuse and you ban them.


Or just wait until they have a lot of books and then reject every return. Their purchase history would tell a compelling story for the credit card bank.


Amazon will in fact ban you if you have too many returns, or so I understand.


Or just buy the books used, perform the review, and sell them back, taking a small hit on shipping....


Check out the reviews for the PHP book he mentions. Almost all of them are "verified purchase"! And the only negative review isn't one of them.

When the purchase price is so low, astroturfing with "verified purchase" reviews looks to be quite feasible.


These ARE verified purchases. You can buy them and just have to pay the reviewer for the purchase (or set the ebook to free for a limited time and let the reviewers download/"buy" the ebook in this timeframe). However, the strategy is not sustainable: Most 5 star reviews get "not helpful" marks and the ranking honors the number of purchases as well. The ebooks in question already get newer 1 star ratings and will take a nosedive when no new fake reviews are coming in.


Don't forget you get most of the money back, on top of which you can make your product/s cheaper or free temporarily.


Or perhaps remove non-verified reviews once there were more than just a couple of reviews.

I've never published a dishonest review, but I have reviewed a product or two that I bought somewhere else (not on Amazon), but I certainly "bought the product."


Isn't that why they have a did you find this review helpful button? That way bad reviews can be user flagged and given less weight. Of course that can be abused by spammers as well, so who knows how helpful it really is.

I wonder if Amazon has enough info to create a Googlesque filter bubble, so reviews are weighted per user to account for reviewer/user interest overlap.


They could buy Reddit and crib their spam detection algorithms. :)


The problem isn't unique to Amazon. The same thing happens on the App stores. It really does seem like it's high time something were done about it.


So, here's the thing about Amazon rankings. They're not a meritocracy, and you might think that good reviews help you and bad reviews hurt you. They do/can! But it's closer to SEO than that. Amazon ranks things by what sells, and yes, higher-ranked things sell better, but it's more complicated than just that. Not everybody finds Perl books by looking in the Perl category. I'm looking at the third book you highlighted, about PHP. Its sales rank is 38,xxx and yours is 2xx,xxx (btw, and you may know this - the sales required to move up rankings increase exponentially, so this difference in ranking translates to quite a few sales per day). As long as Amazon believes it's about Perl, even if only tangentially so, it will rank higher in Perl than you because it's selling better. It's in their interest to market what sells better for them.

I learned this from fiction, but it applies to nonfiction too. First of all, books are categorized by what you put in the keywords section in KDP, if they're self-published, or by whatever your publisher put in there if you're not. Let's say you write a book about Perl and cooking. It may outsell pure Perl books because more people like cooking than Perl. It might be a great cooking book but a poor Perl book - or maybe you're lying about the Perl thing, and it's just a cookbook. But it'll top the Perl category as long as Amazon believes you that it's about Perl, and among the Perl books, it will have the highest sales rank.

Unfortunately, what I bet is happening is that this book is legitimately somewhat about Perl, and the author tagged it as such in the keywords. But its sales aren't coming from winning the Perl category alone; they're coming from that and PHP and beginning programming and generally from being ranked in multiple categories.

Your best bet to get your category back is to try to convince Amazon it's not about Perl at all. Good luck - I didn't look close enough to see if that's a reasonable claim or not.


The books aren't about Perl; they're gaming the Amazon system. That being said, it was interesting when you commented about sales rank. Yeah, they sell more than I do, but you can frequently buy them for $0.99, while mine sells for a real price. That might make them break even, but since they're (I'm assuming) self-published, they're probably making far more money overall.


Steps to reproduce:

1) Visit Fiverr [1]

2) Repeat

Sadly, I don't see very many ways to overcome these kinds of review factories. Maybe Amazon should start doing sting operations on these services.

[1] https://www.fiverr.com/search/gigs?acmpl=1&utf8=%E2%9C%93&qu...


I was recently on Amazon looking to purchase an item. Apparently not a lot of people buy this item online, so there were very few reviews.

I got curious whether the people reviewing these items were genuine buyers or otherwise, so I clicked through to one guy's profile (verified purchase). Turns out he posted hundreds of one-liner reviews for a plethora of esoteric products (all verified purchases), all on Jan 12, 2015.

They are out there.


There's no point in playing whack-a-mole with the hired reviewers, they should be punishing the book publishers who are paying for this.


>why are three books in front of me in the Perl category about Swift, HTML, and PHP?

This is a big problem for more than just niche books. When I tried to find a new fiction book by browsing categories, I found the same books listed in every major category; it's difficult to believe the same book ought to be listed in sci-fi, horror, self-help, history, and current events.


Perhaps Amazon should have two categories of reviews, one category has only reviews by verified purchasers, people who have bought the book from Amazon, and the second is all the rest. If there's a huge discrepancy between the verified reviews and non-verified, then it's obvious that something fishy is up.


"why are three books in front of me in the Perl category about Swift, HTML, and PHP?>"

The other thing I notice is that the titles are Kindle. Why are dead-tree products, arguably a more liberal format, mixed with e-book products that are proprietary and closed?


> Come on, Amazon, you can do a better job!

I'm not sure they can. There are things Amazon seems able to do well, but this certainly has never been one of them. And I don't know that anyone else at half Amazon's scale or larger has done much better.


"...I was intrigue and more interested in playing minecraft knowing different tips and tricks that can help me to win the game. "

Lol. chuckle


If Amazon made it policy to remove any negative comments that steer readers to competing products, I think, on balance, it would help.


You can use my site (posted on hacker news back in January): www.fakespot.com

It basically analyzes reviews to detect if they are fake or not.


It's happening on MANY products and ruining Amazon's credibility. It has become open and notorious!!!


Off topic a bit, but is there a review/rating system that isn't gameable like this?



