I was thinking a little bit about numerical ratings for video games. Honestly, I can't seem to figure out how they work. There are so many different systems out there, so many different games, and so many different methods used that I think the industry needs some sort of shock to give people a clearer understanding of what numerical and other ratings actually mean.
With movies, there's a similar problem, but since that industry has been around longer, people seem to have earned an understanding. You have Siskel and Ebert with their thumbs, star ratings, and online percentage rankings. The thumbs actually seem to be the least influential, IMO. With a two-point system, as well as hundreds or thousands of questionable calls, they just aren't very trustworthy. Besides, one of them bad-mouthed games, too. The next, probably better system currently in use is online rankings. Sites like Yahoo, Fandango, and others gather up professional critic reviews, user reviews, and all the accompanying ratings to give readers a look at what several critics think of a movie, along with a consensus score in a few different categories. To me, this seems like a nice way to go, and the percentages assigned often fall into the same ranges that school grades do. The third is an older, more well-known system, which uses anywhere from one to four stars to rate a movie. A one-star movie is deemed the worst, while a four-star movie is considered the best. This system usually pairs the star rating with a mini-review or info-snippet, so moviegoers can decide whether the movie and its rating are appropriate for them.
Now, I have to ask how video games should be rated. What would be the most efficient way? What do you like? What do you dislike? What don't you understand?
For me, I hate that nearly every website and magazine out there uses a scale of ten or five, and very few offer any explanation of what the numbers actually mean. To muck things up further, the reviewers themselves have no consensus on what the numbers mean. Don't get me wrong; I'm not griping about NWR here. I find the entire industry is in this state. With so many different reviewers under each publication, so many different genres, and so many different sources out there, it seems impossible to learn much about any specific title from its rating. To add fuel to the fire, I've seen stories, heard rumors, and read speculation that reviewers are sometimes forced by their employer to alter their numbers without changing the content of the review.
I personally think the best way to rate a video game might be to adopt something closer to the star system: a general rating paired with a short informational blurb. I second-guess that suggestion, though, because the games industry seems a little more competitive than that, and people may need to learn more from the rating than a few stars can supply. Then again, I think a star rating would really help identify where a game stands in its respective genre, and perhaps prompt the interested party to put stock in the written review. It's hard to say. The important thing is that this system wouldn't really work with numbers; it wouldn't fit in with what we see now, and it would take some getting used to. On a side note, I find this somewhat similar to NWR's pros and cons section, which happens to be my favorite part of the reviews here.
I also thought that the percentage needs to be based on a more concrete, less abstract system when it comes to reviews. Most websites are afraid to give games numbers in the failing ranges. Anywhere but NWR, you rarely see a four, and you never see a three. Perhaps we should treat ratings as more of a ranking. A 10 would place the game at 100%, meaning that if you were to take every game released on current platforms (or comparative platforms, at least), the game described would be in the very top percentile of all games. Fifty percent would mean the game is better than half the comparative games out there. One percent would mean the game is worse than ninety-nine percent of everything that's been released comparably. This would hopefully encourage raters to use the lower end of the scale more, and give consumers a better idea of what they're looking at.

The flaw in this system has to be its lack of objectivity. If someone loves sports games or football in general, Madden might land in the top five percent, even if it isn't a great game. However, if our reviewer doesn't understand football's rules, concepts, strategy, or culture, Madden could wind up ranked below fifty percent. Since reviewers are typically just names to the consumer, such a discrepancy could end up being very confusing, perhaps even more confusing than the way things are now. In essence, each individual's personal taste, as well as the genre of the game, could influence the score, perhaps far more than the actual game's content. I think this type of rating would be most effective when paired with a list of scores the reviewer has given other games. Given that, it could be a powerful concept, though it takes considerably longer to learn from this method of scoring.
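To make the percentile idea concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the function name, the sample scores, and the tie handling. It just shows how a "better than X% of comparable games" figure could be computed from a reviewer's past scores.

```python
# Minimal sketch of the percentile-ranking idea. The names and data here are
# hypothetical; the point is just the math behind "better than X% of
# comparable games."

def percentile_rank(game_score, comparative_scores):
    """Return the percentage of comparable games this score beats."""
    if not comparative_scores:
        return None  # nothing to rank against
    beaten = sum(1 for s in comparative_scores if s < game_score)
    return 100.0 * beaten / len(comparative_scores)

# Example: scores a reviewer has previously given to games in the same genre.
past_scores = [9.0, 8.5, 7.0, 6.5, 6.0, 5.5, 5.0, 4.0, 3.0, 2.5]

print(percentile_rank(8.0, past_scores))  # 80.0: better than 80% of the field
print(percentile_rank(2.0, past_scores))  # 0.0: worse than everything listed
```

Notice that the output depends entirely on whose score list you feed in. Run the same game against two different reviewers' histories and it could land at wildly different percentiles, which is exactly the Madden problem described above.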
Anyone else have their two cents?