Solving Amazon’s Fake Review Problem

1008 reviews, a 4.3 star average rating, and an Amazon Choice product. But our review was strongly negative. Who do you trust?

Amazon theatrically insists that it is doing all it can to stop the flood of fake reviews that infest its site, but even the most casual glance shows its feeble attempts to be almost entirely ineffectual.

Indeed, its problem is even more wide-ranging than it appears at first glance, because, quite apart from anything else, fake reviews (both positive and negative) don’t self-identify as being fake.  While some people will disclose that they received a free sample in return for a “fair and unbiased review” (which is always four or five stars), fake reviewers naturally go to great lengths to camouflage their fake reviews.  No-one really knows how many reviews are fake, but whatever the number, it is clearly much larger than any official estimate.

So Amazon can quite honestly say, with perhaps a hint of a “nudge, nudge, wink wink” to go along with it, that they’ve only identified 1% of reviews on their site as being fake and hope that people will think this implies/guarantees that the other 99% are real.  But that is absolutely not the same as saying that only 1% of reviews are fake; all Amazon is saying is that it knows of 1% that are fake, while the other 99% are of as yet unknown validity.  (Here’s a great article about the fake review writing industry and here’s another good take on the problem.)

Sure, it is obvious that the two or three word five-star reviews are utterly useless, but we’ve seen plenty of credible-seeming reviews that are also extremely suspect in nature.  For example, an emergency car external battery jump-starter that we tested features six reviews – five giving five stars and the last one four stars – and all of them report that the device works perfectly; two of the reviewers even went as far as to say they’ve used it repeatedly.  But our unit utterly and abjectly failed to turn over the engine in our car at all, which makes us suspect that all six of these reviews are fake, particularly because most people seldom or never use such a device, let alone use it repeatedly.  Even people who buy these types of units do so to have a standby for emergencies, the same as a fire extinguisher, not as an everyday-use item.

We get that some of the reviews for this charger are hard to identify as less than fully genuine because, other than claiming it works well, they are reasonably well written.  But we also get that some reviews have fewer words than the stars they award, and as such are easy to identify; indeed, some of them seem to be reviewing Amazon’s shipping service or the price paid, not the product itself.

At the same time Amazon is allowing these reviews, it seems to be very sensitive about restricting negative reviews – indeed, more sensitive to restricting negative reviews than positive ones.  We’ve written before about our experience when Amazon refused to publish a negative review we wrote – check out our article, which includes the full review text that was rejected, and see if you think it should have been accepted or rejected.

In our case, we negatively reviewed a Wi-Fi router.  The thing is that a person reading that negative review might be discouraged from buying the specific product reviewed, but that doesn’t mean they wouldn’t buy a different router on Amazon (indeed, while we criticized the router, we praised Amazon’s return service).  It simply gives a buyer guidance about which of the Wi-Fi routers on Amazon’s site they should confidently choose (we just checked – Amazon has over 10,000 listings when you search for “Wi-Fi router”).  Negative reviews can be as helpful as positive reviews to intending purchasers.

In addition to discouraging/censoring fair/honest/accurate negative reviews, Amazon actively encourages positive reviews.  It has its “Vine” program where selected “good” reviewers can pick and choose free products that will be sent to them, in return for which they agree to submit a “fair and unbiased” review.

How fair and unbiased are such reviews?  An interesting survey showed that Vine reviews are more positive than regular reviews (4.53 stars compared to 4.31 stars for ordinary reviews), and on the face of it, that might seem like a trivial and insignificant difference.  But in reality it is a huge difference, even though, theoretically, it should not be.  Which brings us to an interesting point.

Reviews Aren’t Truly on a 1 – 5 Star Scale

If you’re familiar with “grading on a curve” you’ll appreciate the significance of the average review getting 4.31 out of 5.  On a normal 1 – 5 scale, you’d expect to see the average somewhere around 3, because that is what 3 denotes – average.  In words, the five stars (or A – E letter grades) are usually described as

Stars/Grade    Description
One / E        Very much below average, worst
Two / D        Below average
Three / C      Average, normal
Four / B       Better than average
Five / A       Very much better than average, best

and you’d normally expect some sort of distribution of scores like these classic “normal” or “bell” curves.

The implication of a 1 – 5 scale is that 3 is normal, 4 is significantly better than normal, and 5 is outstandingly good, and the expectation is that most results will be 3.  So, as you can see, a bell curve is essentially symmetrical and peaks around the mid-score, although it can validly be tall and skinny (most results close to each other) or broad (a more even spread of results).

But, this is not the way scores are actually given on Amazon.

The key phrase in the preceding section was the revelation that the average review on Amazon is not three stars, but is actually 4.31 stars.  This means that even a product averaging 4 stars in its reviews is getting a lower rating than the average Amazon product, and if you should ever find a product averaging only 3 stars, while on a normal scale this would seem like an average quality product, in the reality of Amazon’s reviews, it is an unusually bad one.

This is the actual distribution of reviews (data taken from here), with a theoretical and symmetrical normal curve superimposed.

In case it is not clear, an astonishing 66% of all reviews on Amazon are given the maximum, perfect, five-star rating.  In theory, five-star and one-star reviews should be reasonably similar in number, and should both be the least common scores.
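To see just how lopsided that is, here is a quick illustration.  The star breakdown below is our own invented distribution – not the actual data – chosen only so that it is consistent with the 66% five-star share and the 4.31 average quoted above.

```python
# Hypothetical star distribution, invented purely for illustration; it is
# chosen to be consistent with the 66% five-star share and ~4.31 average
# cited above, not taken from the actual data.
distribution = {5: 0.66, 4: 0.16, 3: 0.08, 2: 0.03, 1: 0.07}

average = sum(stars * share for stars, share in distribution.items())
print(f"Average rating: {average:.2f}")   # -> 4.31

# On a scale where 3 is supposed to mean "average", a site-wide average of
# 4.31 means almost the entire distribution is crammed into the top of the scale.
```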

There is no valid model that allows for five stars to be the most common score, and these actual results tend to support our experience that negative reviews are more likely to be censored, or subsequently removed (in addition to two recently censored reviews, we’ve since discovered that another four of our reviews – reviews that were originally approved – have been silently removed without us being told).

This destroys any notion of a sensible ranking scale, and when you consider that, in general, people are at least as motivated to share bad experiences as good ones, it makes a mockery of the entire review system.  All of the “action” on Amazon is in the narrow range between four and five stars.

We concede that we’re endlessly urged to reward average behavior in our lives these days.  It starts at school, where the concept of “failure” is being erased from the educational system, and even in sports, where the surely unavoidable concept of one team losing and the other winning is being blurred.  It continues whenever we tip someone generously, not for good or special service, but for average or even bad service, and when we are urged to fill out customer satisfaction surveys with every question given the highest “excellent” response, because, we are told, the rating company views anything less than an excellent score as a failing grade.  “Exceptional” – 5 on the 1 – 5 scale – is being redefined as the new normal, what was formerly 3 on the 5 scale.

So it is likely that many reviewers are adopting a similar strategy where five is now meant to represent “performed as expected with no major problems”, four means “not quite as good as I’d hoped, but I’m keeping rather than returning it”, and 3/2/1 stars are seldom/never used at all.  (We’re not even going to wonder what a person who rates every last ordinary experience as five stars does when coming across something truly outstanding….)

So, when judging products on Amazon, the spread now between a good and bad product isn’t between 1 and 5, it is between 4 and 5.

Now, to Amazon’s seemingly incurable problem with fake reviews.  As we see it, the solution is far from impossible.

How to Solve the Fake Review Problem

The solution is blindingly obvious and very simple.  Amazon needs to shift its focus from the fake reviews, and instead focus on the fake reviewers.  Don’t try to detect fake reviews, one by one; try to detect the fake reviewers.

The second part of this strategy is to make the cost of creating/buying fake reviews much higher than it is at present.  Currently, the lowest possible cost of a fake review is close to $0 (for example, someone writing a quick review of a product not purchased – the only cost is a couple of minutes of their time).  Sure, Amazon now positively differentiates reviews from verified purchasers, but that added cost isn’t as huge as you might think, because in effect, the company arranging the fake review is selling its product to itself.  A common cost is perhaps half the cost of the product being purchased, plus a fee of perhaps $3 – $5 for the review as well.  So if the fake review helps sell three more units of the product, the company arranging/buying the fake review is profiting.

But don’t think that means only inexpensive products are susceptible to fake review scams.  Sure, it costs a company more to give Amazon its profit slice on a $1000 item than on a $10 item, but assuming the company is still making perhaps a 33% margin on the product sale, it only needs about three extra sales as a result of the review to pay for it, no matter what the item costs.  Indeed, if a company is paying $5 for a fake review, that $5 is more easily absorbed in a $100 or $1000 product than in a $10 or $1 product, and the percentage Amazon charges for shipping and fulfillment probably reduces too.
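To put some rough numbers on that, here is a sketch of the break-even arithmetic, using the hypothetical figures mentioned above (the seller effectively loses about half the sale price when “buying” its own product through the reviewer, pays a $3 – $5 fee, and earns roughly a 33% margin on each genuine sale the review helps generate).

```python
# Illustrative break-even arithmetic for a purchased fake review, using the
# rough, hypothetical numbers discussed above.

def breakeven_sales(price: float, margin: float = 0.33,
                    self_purchase_share: float = 0.5,
                    review_fee: float = 5.0) -> float:
    """How many genuine extra sales are needed to recoup one fake review."""
    cost_of_review = self_purchase_share * price + review_fee
    profit_per_sale = margin * price
    return cost_of_review / profit_per_sale

for price in (10, 100, 1000):
    print(f"${price:>4} item: ~{breakeven_sales(price):.1f} extra sales to break even")

# Roughly three extra sales for a $10 item, falling to about 1.5 for a
# $1000 item - which is why the economics work at almost any price point.
```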

Make the cost of a fake review higher, and it will no longer make economic sense.

To take such steps, Amazon needs to change its own internal paradigm.  When Amazon was small and just starting up, it was desperate to add positive reviews to products, to create a unique sense of community validation and to encourage customers to buy things.  But now that Amazon is enormous, and with many products having not just ten, but sometimes more than 10,000 reviews (the router review of ours it censored would have joined 24,000 existing reviews of the same product), it needs to shift from wanting any and every possible review, to only wanting “real” reviews, from “real” reviewers.

Amazon has made a couple of stumbling steps in that direction already.  It only allows people who have already purchased at least $50 worth of product to publish reviews, and it requires that the $50 in purchases be made with a credit card that reasonably matches the person’s claimed identity.

There are a couple of other very simple things Amazon can do.  The linked articles, above, and many other similar articles, clearly show that fake reviewers do two things.  The first is they review products they haven’t purchased.  The second is they publish lots of reviews.  Why not address both of those points?

We agree that sometimes there are reasons why a person might want to publish a review on Amazon for a product they’ve purchased somewhere else.  It might be an amazingly good product – or an amazingly bad one – and you want to share that with the larger marketplace on Amazon.  So we’re not advocating a total ban on such reviews, but we suggest Amazon limit reviews of unpurchased products to perhaps no more than one in every ten reviews written by each reviewer.

That isn’t a problem for most normal people, but for fake reviewers, it means that now they have to buy and review nine other products on Amazon for each fake review they write.  (An additional bit of fine print would be to ensure that a person can’t buy nine items, each costing $1, then qualify to write a fake review of a $100 item.)  The time/hassle/cost of this rule would massively discourage most of the current “fake review factories”.

As for publishing lots of reviews, why not limit every reviewer to no more than one review a day, and also to no more than half the items they purchase from Amazon?  This adds another constraint and cost for fake reviewers.  There is not a lot of money in publishing a fake review – it is a quantity based undertaking, not a quality based one – and if you can only earn one fee a day ($3 – $5 seems the typical range), and if you are further limited by needing to buy as many “real” things as fake things, then for most people the financial benefit of writing a fake review shrinks to less than zero (or, on the flipside, companies buying fake reviews will have to pay much more for them).
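Here is a minimal sketch of how checks like these might look.  The account fields (purchase counts, prior review counts, and so on) are entirely hypothetical – we obviously have no knowledge of Amazon’s actual systems – but the point is that the rules themselves are trivially simple to enforce.

```python
from datetime import date, timedelta

# A rough sketch of the rules proposed above, using hypothetical account
# fields; Amazon's real data model is unknown to us.

def may_post_review(items_purchased: int, reviews_posted: int,
                    unverified_reviews: int, is_verified_purchase: bool,
                    last_review_date: date | None, today: date) -> bool:
    # No more than one review per day.
    if last_review_date is not None and (today - last_review_date) < timedelta(days=1):
        return False
    # Reviews may cover no more than half the items the account has purchased.
    if reviews_posted + 1 > items_purchased // 2:
        return False
    # At most one review in every ten may be of a product not bought on Amazon.
    if not is_verified_purchase and (unverified_reviews + 1) * 10 > reviews_posted + 1:
        return False
    return True

# Example: an account with 8 purchases and 4 reviews already posted, trying to
# add an unverified review, is declined (it fails two of the three rules).
print(may_post_review(8, 4, 0, False, date(2019, 6, 1), date(2019, 6, 2)))  # False
```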

There is also one further very simple bit of quality control Amazon can do.  Require reviews to be a minimum length.  We see reviews that are sometimes only four or five or six words long.  Not even sentences, more like quick phrases, for example: “Great product, works as advertised”.  “Love it, would buy again.”  “Solved my problem, quickly delivered.”  And so on.

Such snippets are not reviews at all; they’re merely a rushed, throwaway platitude wrapped around the gift of the five stars also provided.  Require reviews to be 50 words or longer.  And rather than talking about “using artificial intelligence to detect fake reviews”, why not start off with something less ambitious?  Use a simple grammar/spelling checker to require that reviews meet minimum standards of English literacy, the same as the one automatically included in Word.
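Even the length rule alone is a near one-line check.  Here’s a minimal sketch (the word and sentence counting is deliberately crude; a real spelling or grammar checker would be layered on top):

```python
import re

MIN_WORDS = 50

def passes_minimum_standard(review_text: str) -> bool:
    """Crude length gate: at least 50 words and more than one sentence."""
    words = re.findall(r"[A-Za-z']+", review_text)
    sentences = [s for s in re.split(r"[.!?]+", review_text) if s.strip()]
    return len(words) >= MIN_WORDS and len(sentences) >= 2

print(passes_minimum_standard("Great product, works as advertised"))  # False
```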

Requiring reviews to be longer, better written, and more thoughtful increases the time cost to the reviewer, and also makes fake reviews more obvious for readers to spot.  While it might make sense for someone to spend five minutes on a fake review and end up with a $5 payment and a free product, if the five minutes becomes 15 minutes, then it is no longer as profitable for them.

Plus, one more thing.  Make the entry/qualification requirements steeper – instead of only requiring $50 of purchases to validate an account, require a member to have purchased not just $50 of product, but a minimum number of different transactions, too – perhaps at least two or three different items in multiple separate orders, and perhaps increase the $50 minimum to a higher number.

Best of all, these measures wouldn’t impact or even be noticed by most normal Amazon customers.  What percentage of your Amazon purchases do you review at present?  Would you even notice a restriction that you can only review half the products you buy?  And for those people who are obsessed with sharing their every thought on Amazon, wouldn’t this cause them to focus more carefully on products that they actually had meaningful things to say about?  How many words is your typical review?  If less than 50, how easy would it be to increase it to 50?

(By way of example, the last paragraph, immediately above, is 97 words.  So surely a 50 word review is far from onerous, assuming you actually have something to say about the product in the first place.  And, by way of further example, this paragraph is 50 words.  Exactly 50.)

Others Can Do This.  Why Not Amazon?

One of the frustrating and surprising things is that as a result of Amazon’s unwillingness to police its reviews better, other companies have sprung up to do it for us.  Two such examples are ReviewMeta and Fakespot.  While their results aren’t perfect and some of their logic is unavoidably a bit ambiguous and prone to false-positives, they seem to be more detailed and thoughtful than Amazon’s own efforts.

Amazon has access to much more information than these two companies do.  So why can’t it do a better job than these external services?

One article recently wondered if this is a deliberate strategy on Amazon’s part – to create a “trust hole” that it would then fill with its own branded products.  If that’s their plan, they’ve a long way to go before it is ready, because we’ve had major disappointments buying Amazon’s own recommended “Amazon Choice” products off their site, too.  Many times those products are just as disappointing as other no-name brands, and the reviews just as baselessly positive.

It is More than Just Amazon

To be fair, while Amazon is the highest profile company with a fake review problem, many others have similar/identical problems.

But there’s another insidious source of fake reviews too.  Many publications rely on advertising to exist, and many advertisers have been known to pull their advertising from publications that give their products bad reviews.  This means that many of the “big name” review sites tend to be vague when expressing disappointment with problems, and offer reviews that all skew more positive than neutral, requiring very careful parsing to understand what the reviewer is carefully not saying.  When reviewing a group of similar products, you’ll typically see several different products all get a “best” rating and most of the others get a “nearly best” rating.

At a lower level, many of the smaller sized publications rely on free sample products to review, and the unwritten rule that applies there is “thou shalt not write a bad review if you want more free samples”.

Indeed, companies will even go to considerable lengths to strike back at publishers that have expressed disapproval – see this recent example where BA has stopped giving away copies of Britain’s esteemed Financial Times, apparently due to some criticism leveled against BA by the FT.

Summary

It is hard, everywhere, to find truly credible reviews.  It is however particularly lamentable that Amazon, a company that is becoming more and more a universal commerce platform for sellers of all types and sizes, isn’t setting and enforcing better practices and guidelines.

This is all the more so because, as we’ve outlined above, it would be relatively simple for Amazon to make a few changes to its review policies that would greatly reduce the ease with which fake reviews can currently be published.

Most of all, allowing fake reviews to continue to infest its site does no good to Amazon itself.  So, come on, Amazon.  Make some changes.  Everyone will benefit, including yourself.

5 thoughts on “Solving Amazon’s Fake Review Problem”

  1. clevelandmb

    So true. But not sure we will ever get this controlled. The fakers seem to always find a way around. For example if they need 50 words, they can have canned reviews where they can paste quickly into the text area, couldn’t they.
    I think there are enough verified purchasers on most products that you do not need non-verified. Also, delete reviews older than, say, 1 year if there are 30 or more in the past year. Often old reviews are for a product that has been changed.
The one thing I use reviews for is some kind of hint on how something works. Often users include such (like how to pair two headsets to one transmitter) and the included instructions are not very detailed – or in a foreign language. Also look at the questions, sometimes a hint in the answers on issues with the product.

    1. David Rowell – Seattle, WA, USA – New Zealander now living in the United States.

      Hi, Mike

I agree that the fakers are creative, but that’s not a reason not to try harder. Surely Amazon and its brains-trust can do better than it is currently.

      Your concern about cut and pasted canned reviews is one of the things that third party products already test for. That’s a very easy thing for Amazon to defeat.

      I’d wondered also about deleting old reviews, maybe with an added filter of “old reviews with no likes”. Really, when a product has 1000+ reviews, who is going to read the ones buried at the end of the list.

      Totally agree about how there are often nuggets of great information in the real reviews. Would hate to lose that.

  2. Agree that Amazon reviews have problems with them and I’d like to share a couple of comments.

    To be honest, I was even surprised to read in your article that Amazon was supposed to be trying to eliminate them because in many cases they are so blatant. Recently, I was looking on Amazon for a power washer. I saw one brand that I did not even recognize at all. It had about 25 reviews and they were all five star. That really piqued my interest (since most of the other products had reviews from 1 to 5) and I wanted to see if I could get an idea of why this product was so great.

    After reading all 25 reviews, I saw that all reviews were a variant of one of three themes. However, what I found most interesting was that not one review was written by someone whose native language was English. In fact, not a single review had even passable grammar and this is true even though all of the reviews were very short. I told my wife that I couldn’t buy this power washer since I wasn’t in their target audience which must have been the semi-literate!

    I would think it would be relatively easy to identify reviews that all said the same thing and had bad grammar.

    My second complaint is something you touched on. I’m surprised at the number of reviews that have nothing to do with the product which is misleading when it goes into the overall rating.

    In addition to the shipping problems you mentioned, I’ve seen reviews that don’t have anything to do with the product. It’s like the reviewer clicked on the wrong product. And, some are just based on the reviewer’s stupidity.

    My favorite one so far was a review where the purchaser admits that he purchased the wrong product. He also admitted that he was busy and did not return it in the time frame specified by Amazon. So, when he tried to return it, Amazon refused to take it. As a result he gave this product ONE star.

Again, just don’t know why Amazon can’t clean up some of this stuff. I realize that they have a huge number of products and reviews and that they can’t check everything. But how about having a “this doesn’t make sense” button on each review that a reader could click on to easily notify Amazon that this review doesn’t seem right, and then at least Amazon could check the reviews reported by others.

    However, I do also have one comment on your bell curve analysis. I know in my own reviewing, I just tend to send in reviews on products which are either outstanding (five stars) or lousy (one star). If a product is good, ok, or just satisfactory, I’ll admit I’m just not the type of person to write a review. From my perspective, if a good number of people are like me, I’d more expect to see an inverse bell curve rather than the standard one you described.

    Best – And, please keep up the good fights against the stupidity in our marketplaces!

    1. David Rowell – Seattle, WA, USA – New Zealander now living in the United States.

      Hi, Walt

      Thanks for your great commentary. All excellent points.

      I agree that there’s a lack of symmetry – you can upvote comments but not downvote them. Both upvoting and downvoting are open to abuse of course, but why restrict people to only one action rather than allowing both up and down-voting. Occasionally I feel like you, wishing there were a simple easy quick way to flag reviews as being nonsense so that a “real person” at Amazon might look at them. But there’s a lack of symmetry there, too. The nonsense reviewers are being paid pennies an hour in India or Bangladesh or somewhere, whereas Amazon’s staff are being paid many dollars an hour to police them.

      Your point about an inverse bell curve is a very good one, too. Completely agree. None the less, whether a regular or an inverse one, there should be a certain balance and symmetry between 4 – 5 star reviews at one end and 1 – 2 star reviews at the other end, rather than the astonishing curve as shown currently.

  3. I would not have given much credence to the Amazon review article, but I gave a one-star review to a lousy book recently, and Amazon published the one-star but edited my review to one sentence. Since the review was only a few sentences, I don’t get why they think they can edit what I wrote. They cut out the part about liking the author’s other books and only included the negative comment about the book.

    I won’t trust the reviews in the future.
