The Problem With Online Reviews

Online reviews have become one of the primary ways people decide what to buy, where to eat, which hotel to book, and which service to use. This is mostly a good thing — honest peer feedback is genuinely valuable. But the review ecosystem is also riddled with problems: fake reviews, incentivised reviews, review manipulation, and algorithmic amplification of low-quality feedback all undermine the system.

Learning to critically evaluate reviews — rather than just counting stars — is one of the most useful consumer skills you can develop.

Why Star Ratings Alone Are Misleading

A product with a 4.6-star rating out of 847 reviews sounds impressive. But averages collapse all the nuance. A product could have 700 five-star reviews and 147 one-star reviews — meaning roughly one in six buyers had a very poor experience, even though the mean still looks respectable. Always look at the distribution of ratings, not just the mean.
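The gap between a reassuring mean and a worrying distribution takes only a few lines of arithmetic to see. A quick sketch using the hypothetical 700-and-147 split above:

```python
# Hypothetical distribution from the example above:
# 700 five-star reviews and 147 one-star reviews (847 total).
ratings = {5: 700, 1: 147}

total = sum(ratings.values())
mean = sum(stars * count for stars, count in ratings.items()) / total
one_star_share = ratings[1] / total

print(f"Mean rating: {mean:.2f}")               # 4.31
print(f"One-star share: {one_star_share:.0%}")  # 17%
```

The mean comes out around 4.3, which still looks healthy at a glance, even though roughly 17% of buyers left a one-star review.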

Also watch out for rating inflation: in categories where sellers actively solicit reviews, average ratings tend to skew artificially high across the entire market.

How to Spot Fake or Incentivised Reviews

Fake and incentivised reviews are a genuine problem across most major review platforms. Some warning signs:

  • Overly generic language: Reviews that describe a product in vague, positive terms without any specific detail ("This product is amazing! Great quality!") often lack the specificity of genuine experience
  • Sudden review clusters: A product that receives dozens of five-star reviews over a short period — especially shortly after launch — can indicate purchased reviews
  • Reviewer profiles: Check the reviewer's history. A profile that only reviews products from one brand, or has reviewed many unrelated products on the same day, is a red flag
  • Identical or near-identical phrasing: Multiple reviews that use almost the same wording often come from the same source
  • "Received at a discount" disclosures: Incentivised reviews aren't always fake, but they represent a different experience from an ordinary purchase

What to Actually Look For in Reviews

Specificity

The most useful reviews are specific. A reviewer who says "the zip broke after three months of daily use" is giving you far more actionable information than one who says "good quality." Look for reviews that describe real use cases, real timeframes, and real problems or benefits.

Critical Reviews From Verified Purchases

Counterintuitively, a product's three-star reviews often contain the most useful information. They tend to come from people who have mixed feelings and are more likely to explain why — which means they describe the product's actual limitations clearly. Where the platform labels them, filter for verified purchases too: a critical review from someone who demonstrably bought the product carries more weight than one from an unverified account.

Photos and Videos

User-submitted photos are one of the hardest things to fake at scale. A review accompanied by genuine photos of the product in use is significantly more credible than a text-only review.

Recency

Product quality can change — especially when manufacturers switch suppliers or materials. Reviews from three or four years ago may not reflect the current version of a product. Prioritise recent reviews for any purchase where quality is variable.
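If you want to go beyond eyeballing review dates, one simple way to prioritise recency is to down-weight older reviews when averaging. The exponential decay and the one-year half-life below are illustrative assumptions, not established practice:

```python
import math

def recency_weighted_mean(reviews, half_life_days=365):
    """Average star ratings, halving each review's weight per year of age.
    reviews: list of (stars, age_in_days) pairs.
    The 365-day half-life is an arbitrary illustration, not a recommended constant."""
    weights = [math.exp(-math.log(2) * age / half_life_days) for _, age in reviews]
    return sum(w * stars for (stars, _), w in zip(reviews, weights)) / sum(weights)

# Old praise, recent complaints: a pattern a plain average hides.
reviews = [(5, 1200), (5, 1100), (2, 90), (2, 30)]
print(f"Plain mean:            {sum(s for s, _ in reviews) / len(reviews):.2f}")
print(f"Recency-weighted mean: {recency_weighted_mean(reviews):.2f}")
```

With these made-up numbers, the plain mean is 3.50 while the recency-weighted mean drops to roughly 2.3, surfacing the recent quality decline that the plain average obscures.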

Useful Tools for Review Research

  • Fakespot: Analyses Amazon and other platforms for review authenticity; grades products on review quality
  • ReviewMeta: Filters Amazon reviews to remove potentially unreliable ones and recalculates ratings
  • Reddit: Search "[product name] review reddit" for candid community opinions free from algorithmic curation
  • YouTube: Video reviews often show products in real use and are harder to fake than text

The Bigger Picture

No review source is perfect. Professional review sites have their own biases; user reviews can be gamed; even friends' recommendations reflect their individual needs, not yours. The most reliable approach combines multiple sources — editorial reviews, user feedback, community discussion, and your own understanding of what you specifically need from a product.

Healthy scepticism isn't cynicism. It's just good consumer practice.