Let’s Talk About Amazon Reviews: How We Spot the Fakes

Like a lot of people, we read Amazon reviews as part of our product research. Getting broad feedback on a product can be very useful when we’re looking for widespread issues or seeing how a company handles warranty claims. However, as time has gone by, we’ve begun to read user reviews with a far more critical eye.

Although many reviews on Amazon are legitimate, more and more sketchy companies are turning to compensated Amazon reviews to inflate star ratings and to drum up purchases.

Have you ever seen some random product for sale that’s from some brand you’ve never heard of, and the company has no website—yet its widget has somehow garnered 15,000 five-star reviews since … last week? We sure have. This situation is likely the result of a compensated-review program. Such compensated reviews—orchestrated by businesses that cater to companies that want more public positive feedback—violate Amazon’s terms of use but are difficult to police. (This arrangement is not to be confused with Amazon’s Vine program, in which companies provide products to users in exchange for an honest opinion, although those reviews can be problematic in their own way. You can read our thoughts on them below.)

The compensated-review process is simple: Businesses are paid to create dummy accounts, which purchase products from Amazon and leave four- and five-star reviews. Buying the product makes the reviews tougher for Amazon to police, because they are technically based on verified purchases. The dummy accounts buy and review all sorts of things, and some of the savvier pay-for-review sites even have their faux reviewers pepper in a few negative reviews of products from brands that aren’t clients, to create a sense of “authenticity.” In fact, for extra cash, a company can pay one of these firms to write negative reviews of a competitor’s product. Wirecutter contributor Brent Butterworth has written about this practice as well.

Super shady, we know. Amazon has a history of trying hard to find offenders and shut them down; in April, it sued another round of companies accused of selling fraudulent reviews. But by the time those companies are caught, their clients have already made plenty of sales, and the fraudulent reviewers are likely to pop up again under new names to repeat the process.

Want to know more? Wirecutter headphones editor Lauren Dragan talks to Marketplace Tech about compensated Amazon reviews and how to tell real crowdsourced opinions from astroturfing.

How to avoid getting scammed

You have a few ways to suss out what may be a fake review. The easiest is to use Fakespot, a site that lets you paste in the link to any Amazon product and get a score indicating how likely it is that the product’s reviews are fake.

For example, we ran an analysis on some headphones we found during a recent research sweep for our guide about cheap in-ear headphones. You can see from the results below that the headphones’ reviews didn’t score so well.

(Screenshot: Fakespot’s analysis of the headphones’ Amazon reviews)

We corresponded with an official spokesperson for Fakespot to get a better idea of where these results come from. He said:

The quick answer is that every analysis does two simultaneous things: we analyze every single review posted and we review each reviewer and every review that reviewer has ever posted on that account. We take all that data and run it through our proprietary engine which grades everything and looks for patterns.

The engine adjusts based on the prevailing patterns used by proven fake reviewers and their reviews, so while there is some base criteria, we’re able to use artificial intelligence to keep ahead of the imposters. Every fake reviewer has patterns. And the more data we collect via analyses completed, the more our engine is able to adapt and learn. The secret sauce is not only in the engine but the ability to run the data in the quickest amount of time possible; ensuring swift delivery of an accurate product.
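
Fakespot doesn’t publish the details of its engine, so treat the following as a purely hypothetical sketch rather than the company’s actual method. It only illustrates the two-level idea the spokesperson describes: grade each reviewer’s entire posting history for suspicious patterns, then roll those grades up to the product being analyzed. The data structures, signals, and weights here are all made up for the example.

```python
from dataclasses import dataclass
from datetime import date


# Hypothetical review record. A marketplace would also have internal data
# (purchase history, browsing sessions) that outsiders never see.
@dataclass
class Review:
    reviewer_id: str
    product_id: str
    rating: int     # 1 to 5 stars
    posted: date


def reviewer_suspicion(history: list[Review]) -> float:
    """Score one reviewer's entire history from 0 (benign) to 1 (suspicious).

    Two toy signals: an account that hands out nothing but five-star ratings,
    and an account that posts all of its reviews on just a few distinct days.
    """
    if not history:
        return 0.0
    five_star_share = sum(r.rating == 5 for r in history) / len(history)
    distinct_days = len({r.posted for r in history})
    burstiness = 1.0 - distinct_days / len(history)  # 0 = spread out, near 1 = bursty
    return 0.5 * five_star_share + 0.5 * burstiness


def product_grade(product_reviews: list[Review], all_reviews: list[Review]) -> float:
    """Average the history-based suspicion score across a product's reviews."""
    by_reviewer: dict[str, list[Review]] = {}
    for r in all_reviews:
        by_reviewer.setdefault(r.reviewer_id, []).append(r)
    scores = [reviewer_suspicion(by_reviewer.get(r.reviewer_id, [])) for r in product_reviews]
    return sum(scores) / len(scores) if scores else 0.0
```

A real engine would combine many more signals (review text, timing relative to the product’s launch, networks of reviewers) and keep retraining as fraud patterns shift, which is presumably where the “secret sauce” lies.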

The likelihood of knowing for certain if a review is fake

To get some perspective, we spoke with Bing Liu, a professor in the department of computer science at the University of Illinois at Chicago whose areas of focus include sentiment analysis, opinion mining, and lifelong machine learning; he has written textbooks on these subjects. We wanted to know whether he thinks it is possible for a program, or group of programs, to evaluate reviews and correctly determine their validity. Liu’s thoughts:

It is hard to say without knowing their techniques. The problem with this task is that there is often no hard proof that the detection is actually correct unless the author of the actual fake reviews (not made up fake reviews) from a review hosting site confirms it. Of course, it is easier if the company actually hosts reviews (e.g., Amazon or Yelp) because they can analyze the public information that the general public can see and also (more importantly) their internal data which tracks all the activities after a person comes to the website. A lot of unusual behaviors can be detected. Unfortunately, such data is not available to people outside the site.

In other words: Unless you have a way to confirm with the person (or company) writing the review, or you are Amazon, it’s all conjecture. Keep in mind that these analyses rely on Fakespot’s techniques, so we have to take the company’s word for it; we don’t have a way to verify how accurate its results are. You can still make educated guesses, though. And if you’re in a hurry or in need of a second opinion, Fakespot can be a useful tool when you’re considering a purchase.

All of that aside, we came to a similar conclusion when we read the Rxvoit reviews ourselves, and we can tell you a few of the factors we consider when evaluating customer reviews.

How we spot a phony review

What aspects of the Rxvoit headphones’ reviews felt funny to us? Well, first of all, we noticed that a lot of the positive reviews were posted within a few days of one another. That suggests a coordinated push to generate reviews on a schedule.

In fact, at the time we did our research sweep, the Rxvoit headphones had a five-star rating and a few hundred reviews posted within a week or two. This, for a company that is very new (as in, it has only one product—these headphones) and one we had never heard of. That’s a red flag.

Second, within those reviews we saw a lot of the same wording, and even similarly staged user photos. It was as though someone had said, “Hey, take a close-up of your hands holding the headphones over a countertop.” We know that people do post pictures to accompany their reviews, but it seemed too coincidental that they were all staged the same way, all within a span of a few days.

And lastly, we couldn’t find a company website for Rxvoit. The lack of a Web presence isn’t in itself proof of a shady manufacturer or a signal to look out for fake reviews, but it is worth noting. When your only point of contact with a company is through Amazon, you have no way of reaching customer service directly, which makes warranty claims tough to pursue. It also means it’s far less plausible that a significant number of people would “just happen” to stumble across the product and decide to purchase it, so a sudden spurt of reviews is very unlikely.
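
If you want to check for the first two patterns yourself, they’re simple enough to automate. The sketch below is our own illustration, not a tool we actually use or anything Fakespot does: given the posting dates and text of a product’s reviews (which you would have to collect yourself), it measures how tightly the reviews cluster in time and lists word-for-word phrases that show up in more than one supposedly independent review.

```python
from collections import Counter
from datetime import date, timedelta


def burst_share(dates: list[date], window_days: int = 7) -> float:
    """Largest share of reviews that fall within any single window (default: one week).

    A brand-new product whose reviews nearly all land in the same week shows
    the timing pattern described above.
    """
    if not dates:
        return 0.0
    dates = sorted(dates)
    window = timedelta(days=window_days)
    best = 0
    for i, start in enumerate(dates):
        in_window = sum(1 for d in dates[i:] if d - start <= window)
        best = max(best, in_window)
    return best / len(dates)


def repeated_phrases(texts: list[str], n: int = 5, min_reviews: int = 2) -> list[str]:
    """Return word n-grams that appear verbatim in at least `min_reviews` reviews."""
    counts = Counter()
    for text in texts:
        words = text.lower().split()
        grams = {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
        for gram in grams:  # count each phrase once per review
            counts[gram] += 1
    return [" ".join(gram) for gram, c in counts.items() if c >= min_reviews]
```

Run against the lawsuit example discussed below, repeated_phrases would surface the shared “how bright the lights on the cable are” wording, and burst_share would be close to 1 for a review history packed into a single week. Neither signal is proof on its own; they just point you toward reviews worth reading skeptically.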

What does this look like in the wild? Well, here’s an example of reviews accused of being fake, taken from the most recent Amazon lawsuit.

(Screenshot: reviews cited as examples of fakes in the Amazon lawsuit)

Notice how all the reviews appeared within days of one another. They also reference the same key detail: the light on the cable. In fact, two of the three use the exact phrase “how bright the lights on the cable are.” That’s a good indication that something is sketchy. And although we don’t know what product the lawsuit’s example refers to, if the product’s manufacturer were brand-new and had a few hundred of these kinds of reviews within a few days, chances are good that the company paid for them in some way.

The Vine program

The Vine program, and similar feedback programs, give products away for free (or sell them at a deep discount) to potential customers in exchange for an “honest review”; in Vine’s case, Amazon vets those customers for the helpfulness of their past reviews. While these sorts of reviews are far more ethical than paid-for reviews, they can still be a little problematic. Even when the product page discloses how a review was obtained, several aspects of the purchasing process never get considered in these programs.

For example, returns and long-term use aren’t part of the evaluation. When you get something for free, you’re less likely to follow up on breakage concerns or customer service issues. Additionally, if the reviewer didn’t actually buy the product, that person doesn’t take the purchase and shipping processes into consideration.

But most important, receiving something for free or nearly free can greatly affect one’s opinions. You might notice how few of the reviews through Vine and similar programs are negative or even critical. This isn’t a case of reviewers intentionally being dishonest, but rather the result of unconscious positive bias. Not paying for an item can make difficulties with that item seem less irritating.

Additionally, reviewers may weigh in on items they have no expertise with or real experience using, and therefore have no frame of reference for how well something works compared with the alternatives. It’s hard to say how good something is if you don’t know what else is out there.

So, just know that you can’t always believe what you see when it comes to five-star reviews. While some overnight successes do exist, often a four-star product with authentic reviews and a proven track record is a better buy. Look beyond the overall star rating and read with a critical eye, and you’ll be in good shape.