
Add Blizzard to the "We Have No Idea How Metacritic Works" List

If you scroll down a couple of posts you’ll see the Capcom/Metacritic/Dante-Looks-Weird story. I thought that was going to be our Metacritic Post of the Day. But this one is even better.

Tom Chick is a friend of mine, and he also happens to write for us at GameShark from time to time. Tom reviewed StarCraft II and wrote what I felt was by far the best piece of criticism on that game that you are likely to find published. It was fair, well written, and perfectly voiced.

Tom is, quite simply, one of the best in the business regardless of whether or not you agree with him — that’s really not the point. Tom and I disagree all the time and he likes it when people do because that can spur conversation, which he also enjoys.

Most of the time.

Tom was recently part of a GDC strategy gaming panel and was confronted by StarCraft II lead designer Dustin Browder, who, just like so many people in this industry, has no earthly idea how Metacritic works, which to me is the issue more than Metacritic itself. You have to read this one, as it drives home the Metacritic point far better than anything I can say.

The only thing missing is the resolution of that discussion, which I plan on asking Tom about on this week’s podcast. (Is that a teaser? Did I just tease you guys? I think that was my first teaser!)


Bill Abner

Bill has been writing about games for the past 16 years for such outlets as Computer Games Magazine, GameSpy, The Escapist, GameShark, and Crispy Gamer. He will continue to do so until his wife tells him to get a real job.

13 thoughts to “Add Blizzard to the "We Have No Idea How Metacritic Works" List”

  1. I’d say this sentence fragment sums up the point:

    “because the rating should never be the starting point for a discussion about a videogame”

  2. But it *always* is, though. It’s the first thing both readers and general industry folk look at — every time.

  3. I certainly wasn’t saying I’m above it… I love me some Metacritic. And I doubt Mr. Chick would honestly claim such a thing either… I think the point is not to STOP there. I tend to start with a bad review and then work my way towards good ones and see who I feel like I trust.

    I also sympathize with his frustration with the way scores are translated into a Metacritic number. Turning a coarse star rating into a gradient of 100 is definitely not a resolution-sensitive operation.

    I hope that I’m not alone in that there are games that I lustfully adore that have been panned by the critical establishment. It ain’t foolproof.

  4. Metacritic needs to do just one thing (IMO): show the native site scores alongside their translation. Just that tiny bit of due diligence would make the situation better because it would raise awareness of how any specific scoring site actually scored the game. The problem is then they have to answer for one site giving the game a 9/10 translating to a 90, while a B+ at another site translates to an 83 (for example).
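The inconsistency described above — a 9/10 landing at 90 while a B+ lands at 83 — is easy to see in a small sketch. This is a hypothetical converter, not Metacritic's actual table; only the 9/10 → 90 and B+ → 83 values come from the comments here, and the rest of the letter-grade mapping is invented for illustration:

```python
# Hypothetical score translator. Only "9/10" -> 90 and "B+" -> 83 are
# taken from the discussion above; other letter values are made up.
LETTER_GRADES = {"A+": 100, "A": 93, "A-": 91, "B+": 83, "B": 75, "B-": 67}

def to_metascore(score: str) -> int:
    """Translate a native review score into a single 0-100 number."""
    if score in LETTER_GRADES:
        return LETTER_GRADES[score]
    if "/" in score:                      # fractional scales like "9/10" or "4/5"
        num, denom = score.split("/")
        return round(float(num) / float(denom) * 100)
    return int(score)                     # assume it's already on a 0-100 scale

print(to_metascore("9/10"))  # 90
print(to_metascore("B+"))    # 83 -- a different number for a comparable verdict
```

Showing both numbers side by side, as the comment suggests, would at least make the translation visible instead of hiding it behind one aggregate figure.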

  5. Let’s not pretend here, a lot of reviewers like to kiss rump. They’ll claim a game is the best thing ever because it’s an XBox exclusive, or it’s by their favourite developer, etc. etc.

    Metacritic not only combines all the real reviews, but all the biased, extremist reviews that are either “WE LUFF DIS! 100%! SUPA FUN TIEMS!!!” or “WURST GAEM EVAR!!!”. The user reviews are even further skewed than this, either doing what the bad reviewers do, or just agreeing with whatever their favourite reviewer said.

    It’s a good tool, but like everything, people abuse it.

  6. It’s funny that the guy told Tom there are three kinds of reviews: “buy, don’t buy, maybe buy.” Because that’s operating under the assumption that writers like Tom are writing about games strictly as consumer products, not as a creative medium. I think an effective review does both, but it doesn’t come down to those three things. I don’t see how you can assess something as complex as a video game strictly as a product.

    I think there are really, or really should be, two kinds of reviews. All or nothing. 100% or 0%. The game is either “recommended” or “not recommended,” with the text of the review justifying the conclusion and offering more insight into what is on offer.

    As for reviews being a jump-off point for conversation…ideally that’s what you want. Unfortunately, dialogue and actual conversation in games criticism is rare. Too easy to jump in a forum and call someone “stupid” rather than engage and debate.

    The problem is that Metacritic ratings, and those of a lot of critics, are a numbers game, and criticism doesn’t really work like that. It’s also a problem that people want a single, at-a-glance summation of a review, whether it’s stars, numbers, joysticks, or whatever. And unfortunately, that shortchanges a lot of what folks like us do as writers. And it shuts down conversation in favor of a number.

    It doesn’t help that Metacritic forces reviews that aren’t on a 1-100 scale onto one.

    For example, my B+ Killzone 3 review at Gameshark registers as an 83 at Metacritic, but I didn’t give it an 83. I think it’s better than an 83. If you actually read the review, I cite some problems but I’m also heavy on the triumphs of it as well. The number doesn’t match up right.

  7. The best thing about a black and white ratings system is that the Metacritic number would become relevant and easily understood. A 60% game is one that 6 out of 10 critics recommended. The occasional 90%+ game would be the nearly universally praised and recommended ones. Seems like a workable system to me!

  8. It’s really a Rotten Tomatoes system, and I think it cuts to the chase. “Is this game good, check yes or no”. The Rotten Tomatoes thing works because it says “XX% of critics liked this”. It doesn’t put a meaningless number on it, and it takes us right to the bottom line so it serves as a much better summation of the critical reception.
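The aggregation the two comments above describe is simple enough to sketch. In a Rotten Tomatoes-style system the score is just the share of critics who recommended the game; the review data below is invented for illustration:

```python
# A minimal sketch of recommended / not-recommended aggregation.
# The critics and verdicts here are made up, not real reviews.
reviews = [
    ("Critic A", True),   # recommended
    ("Critic B", True),
    ("Critic C", False),  # not recommended
    ("Critic D", True),
    ("Critic E", False),
]

recommended = sum(1 for _, verdict in reviews if verdict)
score = round(recommended / len(reviews) * 100)
print(f"{score}% of critics recommended this game")  # 60%
```

Note the number means something concrete here — "60%" is literally 6 of every 10 critics saying yes — whereas averaging translated 0-100 scores produces a figure with no direct interpretation.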

  9. I had forgotten that Rotten Tomatoes already worked that way. And they deal with a number of different scoring systems and some unscored reviews. I wonder if they could be convinced to aggregate game reviews…

  10. Yeah, I think that’s another tool. That kind of tool is probably much better suited to a raw buy / don’t buy dichotomy.

    But as mentioned it doesn’t convey who and why you SHOULD buy it… it assumes everyone is the same homogenized person. Only the best content need be consumed. That is a useful tool to have available.

    Metacritic is a different tool that attempts to convey more nuance across the board, for better or worse.

    These tools are bad for games that have niche appeal. I hate seeing things I enjoy get bad Metacritic/Rotten Tomatoes scores. For example, I loved Tron: Legacy. Critics did not. And while I can argue all day with the many reviews I’ve consumed, ultimately the scores dictate a lot of what happens to it in the market… for better or worse.

  11. Perhaps just the ability to “zoom in” on scores to see this information would be helpful. Just click the score, and see all the scores (including the crazy outliers) it’s made up of, along with links to those reviews. That would actually be helpful. Then you might be able to better discern if a game is for YOU, rather than simply whether or not a game appeals to the masses. Obviously still a lot of effort on the part of the prospective buyer, but by the same token a lot more informative.

    Just a thought!

  12. It would be somewhat humorous to create a more static reference page with links to these examples of companies really not understanding these systems…
