9/10 Doesn't Mean What it Used to
As the clock struck 2pm last Friday, something we had long suspected was officially confirmed: Batman: Arkham City will be in the running for game of the year honours. Two days later it was sitting pretty with an average score of 98% on Metacritic, though this is taken from a limited sample of just eight reviewers. It may be early days, but the consensus is that Arkham City is something special.
Visiting some of the bigger video game websites, and feeling rather good about my pre-order, I stumbled across a very detailed video review. The reviewer clearly enjoyed the game and went to great lengths to set out the reasons why they were so smitten, whilst avoiding the PR talk that ruins any review. Their gushing praise culminated in a score of 9/10 which, despite being fractionally below the average, is a bloody good score. However, it seems that this favourable review didn't go far enough for some commentators – a sentiment that was echoed on other sites.
There was much bellyaching among users about the lack of a perfect score for a game that seems to have fulfilled its vast potential. Statements ran along the lines of "I feel its worth a 10/10" and "What's wrong with this reviewer? IGN gave it 9.5!". Unless these cretins have somehow gotten their mitts on an advance copy, they are passing judgement on – and questioning a far more informed source about – a game that they couldn't possibly have played. My intolerance for stupidity aside, the sentiment was clear: 9/10 isn't perceived as being a high enough score for a game that is already being discussed as best of the year material.
For this sad state of affairs, review outlets have no one to blame but themselves. Over the years there has been a distinct shift towards higher average scores doled out by print and online reviewers, culminating in a culture of "80% or bust". On any other scale, five would be deemed average, with anything below being rather less meritorious. Sixes and sevens would be commendable, with eight and above reserved for the cream of the crop. This is clearly not the case today, where in many places anything below an eight is seen as being of questionable worth, and there are far too many AAA titles settling nicely into the 9-9.5-10 bracket. If your job is to offer balanced opinions and to approach things from a critical point of view – they are called critics for a reason – then I'd say you are failing pretty miserably if you are awarding perfect scores. A 10/10 is most unhelpful for an interested consumer, as it is unlikely to offer any constructive information. If I wanted unquestioning praise, I'd have saved myself some time and just read a PR release.
It's not my intention to root out every single cause of this shift, as I think that's already been exhausted by far better and more informed writers than I. But to touch on it briefly: Metacritic has elevated scores far above their station, to the point where sequels are commissioned or cancelled based on scores from this hugely influential site. As useful as it may be, the Metacritic system is rife with problems, including which review sources to use, the need to convert all scores onto the same scale – 1UP, for example, awards letter grades rather than numerical scores – and the simple fact that every reviewer scores slightly differently.
It would make sense that the rise of Metacritic would in turn lead to increased pressure from PR companies for reviewers to award higher scores. It is no secret that tiered, score-based embargoes are placed on many new games to control the information that the consumer receives in the vital opening week at retail. This entails reviewers being told when their reviews can go live, dependent on the score they award. For example, if you have given the game in question 90% or above, you may be able to post your review on Monday, whereas anything lower cannot go live for another couple of days. This ensures a favourable Metacritic score during the first few vital days, when consumers will keep or cancel their pre-orders and buy games at full retail price. For more on this worrying practice, I would recommend the October edition of EDGE magazine, which featured an interesting article that touched upon publisher-enforced embargoes.
[Image caption: Lost Planet 2 suffered a 70% Metacritic average, which meant it was DOA at retail]
When I post reviews here at toomanywires, I don't usually award a score. This is simply my preference as the author: I don't feel the need to reduce 1500 words to one arbitrary number, and I like to think that the majority of people who have made the effort to visit will take the time to read most, if not all, of the review. However, when I review elsewhere, I am usually required to award a score out of ten, and I must admit that I sometimes struggle with it. I have perhaps become overly reliant on awarding sevens and eights, which I feel accurately represent games that are very good, but not quite great. I also feel the need to compare the game I'm playing to similar ones I have reviewed in the past, as reviewing in a vacuum is not particularly helpful for the reader. If I gave Dragon Age 2 7/10, and then six months later awarded Shadows of the Damned an eight, despite not feeling it is a superior game, am I doing something wrong? But then again, what business do I have comparing an action RPG to a third-person shooter?
Although I don't use them here, I do think scores are important and have a vital role in moulding the public's perception of a game. I must admit, before reading a review I will always glance at the score first, and then delve into the meat of the article in search of information to support the score given and to discover how the writer settled upon it. However, when the scale is so bent out of shape, am I doing myself and the writer a disservice by giving it the time of day?
It's not all doom and gloom, as there is clearly no shortage of writers and publications that are wary of this problem. EDGE has earned its reputation as an outlet that is quite critical in its reviews, yet it's not uncommon to find a game scoring 6/10 there with a write-up that is far more positive than not. I have also seen a number of writers on Twitter bemoaning the current state of affairs. For a good dose of common sense, I'd strongly recommend following Gamespot's Kevin VanOrd, whose riffing on this topic over the last couple of days partially inspired this post.
This problem is not something that can be easily fixed, and one might argue that we shouldn't even try. After all, reviews are a hugely personal thing, peculiar to each individual reviewer, so applying any sort of rigid universal scale would be utterly pointless. However, the current system has been loosely settled upon by the majority of the gaming press, and even if each outlet has its own slightly different interpretation of it, as long as the general rules are understood by reviewer and reader then I'm sure we can live with it. Even so, a 9/10 doesn't mean what it used to and, as gamers, I feel it's important that we keep this in mind.