The internet is a shopaholic's paradise.  If you can dream it, you can probably buy it; from an 85-inch TV to a tin of unicorn meat, millions of items are only a few clicks away from being shipped directly to you.  But with so many items to choose from, it's a mathematical reality that some products will be better than others.  In fact, some will be downright terrible.  And when shopping online, there's only so much you can learn about a product without having it directly in front of you.  So how can you be confident that the products you buy will actually suit your needs?

On the face of it, the answer seems simple: look at the reviews!  Nearly every major retailer lets shoppers rate products.  In fact, ratings aren't just restricted to online shopping: Yelp lets you rate restaurants and businesses, TripAdvisor lets you rate hotels and destinations, and in the growing sharing economy, you can even rate people on services like Lyft and AirBnB.

But what's the best way to rate a product or service online?  Arguably the simplest thing to do is just display the number of positive reviews: this is the philosophy behind the Facebook "like," which only lets users express approval.  However, this isn't necessarily all that helpful: knowing that 100 people like something doesn't tell you whether nobody dislikes it or a million people do.

Thumbs down for the thumbs up?

If you want to incorporate negative reviews as well, you could certainly do that.  Maybe you could show the difference between the number of positive reviews and the number of negative reviews.  This gives people more information, but still seems somewhat lacking.  For example, if a product has a score of 0, is that because nobody's rated it, or because it's a polarizing product and the number of positive reviews equals the number of negative reviews?  One way to avoid this problem would be to show the ratio of positive to negative reviews instead of the difference between them; while 1:1 and 1,000:1,000 reduce to the same number, they tell you rather different things about people's feelings towards a product.
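To see the trade-off concretely, here's a minimal sketch comparing the two summaries.  (The function and its outputs are just an illustration for this post, not any site's actual scoring formula.)  Two products can have the same difference of 0, but displaying the un-reduced ratio preserves the volume of opinion:

```python
def summarize(pos, neg):
    """Two ways to summarize positive/negative review counts."""
    return {
        "difference": pos - neg,
        "ratio": f"{pos}:{neg}",  # shown un-reduced, so review volume survives
    }

# A barely-reviewed product and a heavily-reviewed, polarizing one:
print(summarize(1, 1))        # {'difference': 0, 'ratio': '1:1'}
print(summarize(1000, 1000))  # {'difference': 0, 'ratio': '1000:1000'}
```

Both products score 0 by the difference measure, but the ratios make it obvious that only one of them has been put through its paces.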

You could also ask more of your reviews than a simple "positive"/"negative" dichotomy.  In fact, most major sites that involve reviews — Amazon.com, for instance — let people rate things on a sliding scale from 1 (terrible) to 5 (awesome).  As before, there are many ways you could try to summarize a collection of star ratings, but the most common thing to do is take the average.  For example, if a product has one 5-star review, one 4-star review, and two 2-star reviews, its average would be the total number of stars (13) divided by the number of reviews (4), or 3.25 stars.  Easy peasy, right?
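The arithmetic from the example above takes only a couple of lines:

```python
# Star ratings from the example: one 5, one 4, and two 2s.
ratings = [5, 4, 2, 2]

# Average = total stars divided by number of reviews.
average = sum(ratings) / len(ratings)
print(sum(ratings), average)  # 13 3.25
```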

Well, maybe not.  There are two problems with only looking at averages.  The first is that the average may not mean much if the number of reviews is small: a product with a 5-star average but only one review could just as easily be a fluke as a genuinely great product.  Secondly, if you've never read any product reviews on sites like Amazon.com, you really should.  You'll find that not all of them are the most trustworthy.  In fact, here are some examples of people who are using the platform to hone their comedy skills.  While these reviews are undoubtedly hilarious, they do call the reliability of the ratings into question.  And if the ratings themselves are no good, the average rating won't be much help either.
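The first problem is easy to demonstrate with a quick simulation.  Suppose (purely as a made-up example) that shoppers' true opinions of a product cluster around 4 stars.  Averages computed from just a couple of reviews swing all over the scale, while averages from a hundred reviews settle close to the true mean:

```python
import random
import statistics

random.seed(0)  # fixed seed so the simulation is repeatable

# Hypothetical "true" spread of opinion: most people would give a 4.
population = [5, 4, 4, 4, 3, 2]

def sample_average(n):
    """Average star rating computed from n randomly drawn reviews."""
    return statistics.mean(random.choices(population, k=n))

# 1,000 simulated products rated by only 2 reviewers each...
small = [sample_average(2) for _ in range(1000)]
# ...versus 1,000 simulated products rated by 100 reviewers each.
large = [sample_average(100) for _ in range(1000)]

# The 2-review averages are far more spread out than the 100-review ones.
print(statistics.pstdev(small) > statistics.pstdev(large))  # True
```

In other words, the same displayed average deserves very different levels of trust depending on how many reviews stand behind it.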

Amazon tries to combat this second issue by allowing you to vote on a review as helpful or not.  But is this the best way to ensure that their ratings are trustworthy?  And if you were put in charge, what would you do to ensure that the ratings on your site were reliable?

Teachers: interested in having this conversation with your students?  Then check out our newest lesson, Overrated.
