The obsession with MetaCritic

There’s an obsession with MetaCritic these days.

For the uninitiated, MetaCritic is an aggregation service that collates video game reviews and provides an average score for a given game.

Lots of studios and publishers use MetaCritic as their guide to how good a game is, and many have established (on somewhat firm but not rock-solid ground) that a high MetaCritic score = high sales.

You’ll often hear phrases like “We want a MetaCritic score in the 90s”. Warner Brothers – under the hand of Jason Hall – went so far as to attempt to tie royalty rates to MetaCritic scores, in a somewhat-flawed-but-well-intentioned attempt to make quality a focus for developers.

EA uses MetaCritic religiously internally, and so faced the problem of how to guarantee a high MetaCritic score before shipping when the real score only arrives after you ship. Their solution (and creative it was, too) was to hire many of the reviewers whose reviews feed into the MetaCritic pot to write internal, seen-only-by-EA reviews of alpha / beta versions of the game – basically creating their own internal echo of MetaCritic for pre-release versions of the game.

They can then compute their own internal MetaCritic score and go back to the game team with it, rebuilding or modifying the game to push the score up.
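The aggregation step itself is simple arithmetic. MetaCritic’s real formula uses undisclosed per-outlet weights, so this is just a minimal sketch of the kind of internal pre-release score an EA-style process might compute – the weights, reviewer names, and scores below are purely hypothetical:

```python
# Minimal sketch of aggregating internal pre-release review scores.
# MetaCritic's actual per-outlet weighting is proprietary; the equal
# weighting here (and all names/scores) is a hypothetical stand-in.

def aggregate_score(reviews, weights=None):
    """Weighted mean of 0-100 review scores; unweighted if no weights given."""
    if weights is None:
        weights = {name: 1.0 for name in reviews}
    total_weight = sum(weights[name] for name in reviews)
    weighted_sum = sum(score * weights[name] for name, score in reviews.items())
    return weighted_sum / total_weight

# Hypothetical internal reviews of an alpha build:
alpha_reviews = {"Reviewer A": 72, "Reviewer B": 80, "Reviewer C": 64}
print(aggregate_score(alpha_reviews))  # unweighted mean: 72.0
```

The number that comes out is only as meaningful as the reviews that go in – which is exactly the problem the next few paragraphs get into.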

However, this obsession with MetaCritic has some drawbacks. The first (and most obvious) is that EA’s scheme, while good as far as it goes, has some fundamental flaws.

The pre-judging of a product is good and all, but when you get a low pre-MetaCritic score, then what? You can look at the reviews and try to fix what they call out as issues, but by and large the reviews themselves are diverse – every reviewer has their own likes and dislikes within any given genre, and they rarely agree. If 10 reviews hand you 20 things to fix, which do you fix?

The pre-review process is fine as far as it goes, and sometimes there is consensus amongst reviewers on a feature that needs to be fixed – but in those cases I will bet cash money it’s something the dev team already knows about. You don’t need external reviewers to tell you that.

And then there’s the obsession with MetaCritic in the first place – the root belief that a high MetaCritic score = high sales. I’m not always convinced by the logic of that equation. Sure, the correlation is there, but sometimes you’ll get a high MetaCritic score because of high sales. It’s not a precursor; it’s a result. There’s bandwagoning among reviewers just like there is in other media – no one wants to review Spore badly because, well, it’s Will Wright, right? You Don’t Review Him Badly, and besides, everyone else is spooging over it, so you’d better get with the program. Not every reviewer can be Yahtzee.

I’m all for MetaCritic being used as a filter, personally – it’s certainly better than none. But using it as the only filter worries me.

Of course, that raises the question of how a development team gets objective reviews of its work in an ongoing fashion, when the deeper into development you get, the more subjective you become – you can’t see the wood for the trees – but that’s the subject of the next blog post.

Till next time.
