By Aaron W. Rockwell, Staff Writer

I have too many ideas on this topic, so here they are as quick jabs of thought:

Movie ratings are broken because they do not use segmentation. Hipsters and indie lovers have different tastes than backyard-BBQ dads, yet they all get lumped into the same review schema.

Rotten Tomatoes is my jam, even though it fails at segmenting.

For products and services, companies have an incentive to create and pay for fake reviews. In addition, folks who receive “test products” will generally rate the product higher because they got it for free.

Courtesy: xkcd.com

On the flip side, the only incentives for one-star reviews are honest opinions and competitors trying to gain an (I suspect illegal) edge over a rival.

Always go for the negative reviews. They’re just plain more fun, and they help you distinguish whether a product is truly awful or merely had a few bad employees or freak-of-nature moments in its history.

I read that there are binary-system diehards out there who insist everything can be rated either “this is good” or “this is bad.” But I say “Nay”: in my mind there is a difference between Waterworld (neither good nor bad), A Knight’s Tale (purdy good), Lifetime Movies (bad but SO DAMN WATCHABLE), and Dumb and Dumber To (bad and disappointing).

Part of the Google Maps SEO equation (whether a business still shows up as you zoom out) is based on how many reviews it has and how positive they are.

Amazon’s rating system is broken (IMO). When reading reviews, check whether each one is marked ‘Verified Purchase’; if not, it’s likely a paid review or one written in exchange for a free product.

Courtesy: Google Maps & Trevor Tran

I once went to a pawnshop (on a date) because it had an incredible number of negative reviews. We walked in about an hour before closing, and I joked, “You all still open?” The man behind the counter shot back with a non-friendly rebuttal: “I guess I HAVE to be until you guys leave.”

Did you know there are big databases out there that score how positive or negative individual words are? Really cool stuff; maybe the future of reviews will be based solely on the ‘sentiment’ score of the writing.
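To make that concrete, here’s a toy sketch of lexicon-based sentiment scoring. The word list below is invented for illustration; real lexicons (VADER, SentiWordNet, and the like) contain thousands of scored words.

```python
# Hypothetical mini-lexicon: word -> sentiment score (positive = good).
LEXICON = {
    "great": 3.0, "love": 2.5, "good": 1.5,
    "bad": -1.5, "awful": -3.0, "broken": -2.0,
}

def sentiment_score(text: str) -> float:
    """Average the lexicon scores of the recognized words in `text`.

    Words not in the lexicon are ignored; a text with no recognized
    words scores a neutral 0.0.
    """
    words = text.lower().split()
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment_score("great product but the handle is broken"))  # 0.5
```

A review site could rank or flag reviews by this score instead of (or alongside) the star count, though real systems also have to handle negation (“not good”) and intensity (“very bad”), which a bare word-average misses.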

I’m not sold on systems that show consumers the highest-rated or most-bought items first. I think it’s one of those avalanche effects that keeps those items at the top and creates a barrier for better products.

Ratings are like currency, but with infinite inflation. I’ve always toyed with the idea that raters should get a standard-deviation score, so the star system would be based on each user’s deviations from their own average.

Example of a standard-deviation system: User1 gives eight 5-star and four 4-star reviews. Normally he would flood the market with magical positivity; under a standard-deviation system, his 4-star reviews might count as only 2.5 stars and his 5-star reviews as 3.25 stars.
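The idea above can be sketched as z-score normalization of each rater’s history. The `center` and `spread` parameters here are my own assumptions (not from the article), so the exact numbers differ from the 2.5/3.25 example, which implies a different choice of scaling.

```python
from statistics import mean, pstdev

def normalize_ratings(ratings, center=3.0, spread=1.0):
    """Rescale one rater's stars by their own mean and standard deviation.

    Each rating becomes center + spread * z, where z is the rating's
    z-score within this rater's history. A rater who only ever gives
    one value carries no information, so everything maps to `center`.
    """
    mu = mean(ratings)
    sigma = pstdev(ratings)
    if sigma == 0:
        return [center] * len(ratings)
    return [round(center + spread * (r - mu) / sigma, 2) for r in ratings]

user1 = [5] * 8 + [4] * 4  # the article's example rater
print(normalize_ratings(user1))  # 5s become 3.71, 4s become 1.59
```

Notice how the relentless 5-star rater’s “5” deflates to barely above average, while his rare “4” reads as a real complaint; that is exactly the inflation-correction the article is after.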

And finally: there should be a way to turn off all reviews, like an ad-blocker; then we could navigate the world oblivious to others’ opinions.