OpenCritic Arises to Disrupt Metacritic's Dominance

There's a new review aggregation tool on the block.

Analysis by Bob Mackey.

In theory, Metacritic is a great idea. A single site where I can find every published review for the media of my choice? Sure, sign me up.

The dark side of Metacritic, though, can be found in the all-powerful Metascore. These two digits represent the weighted average of every posted review, and, as we've seen in countless arguments throughout the decades, there's no "industry standard" for this type of mathematical evaluation, nor can there ever be. The equivalent of a perfect score at USgamer (5 out of 5) means something much different than a perfect score at, say, PC Gamer, whose scale goes all the way to 100.

These Metascores would be a slight annoyance if they were only used for the sake of Console Warrior pissing contests, but, for a while now, they've had a much more insidious impact on the lives of developers. They often control the fates of bonuses and future projects, which puts unnecessary pressure on both devs and the people who review their games. And the Metascore can be an especially tricky proposition if you write reviews for a site with an atypical scoring system. When I worked at a site that used a letter-grade scale, I remember our evaluations being transformed into numbers much lower than the ones we would have used if a 100-point scale had been forced upon us. A respectable B amounted to a 75 on Metacritic, which always seemed more than a little off.

When it comes to the review aggregation business, there's definitely room for competition, and the new website OpenCritic has arisen as a Metacritic alternative. And unlike its popular counterpart, OpenCritic offers some degree of personalization for readers, allowing them to filter out the websites of their choice (hopefully not ours). The only downside is that OpenCritic also relies on review scores, though it does present them as they were originally published. My review for Her Story, for instance, is represented as a 4.5 out of 5 on OpenCritic, which Metacritic converts to a 90. And even sites that abstain from review scores, like Kotaku and Rock, Paper, Shotgun, are included in OpenCritic's review roundup.
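To make that conversion step concrete, here's a minimal sketch of linear rescaling onto a 100-point scale. The outlet list is invented, and Metacritic's actual conversion and weighting rules aren't public, so treat this as an assumption-laden toy rather than the real formula.

```python
# Hypothetical sketch: rescale scores from different outlets onto a
# common 100-point scale. The outlet scales below are illustrative
# examples, not any aggregator's actual methodology.

SCALES = {
    "USgamer": 5,      # reviews out of 5
    "PC Gamer": 100,   # reviews out of 100
    "Destructoid": 10, # reviews out of 10
}

def to_metascore(outlet: str, score: float) -> int:
    """Linearly rescale a score based on the outlet's maximum."""
    return round(score / SCALES[outlet] * 100)

print(to_metascore("USgamer", 4.5))  # 4.5/5 -> 90
```

Note that linear rescaling is itself an assumption: it presumes a 4.5/5 "means" the same thing as a 90/100, which is exactly the premise commenters below take issue with.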

Even if OpenCritic somehow disrupts Metacritic's kung-fu grip on the industry, there's always the chance that its equivalent of the Metascore can be just as potentially destructive. Until then, though, it's always nice to have another option—just as long as you don't take it too seriously.

This article may contain links to online retail stores. If you click on one and buy the product we may receive a small commission. For more information, go here.

Comments 28

Comments on this article are now closed. Thanks for taking part!

  • #1 benjaminlu86 3 years ago
    If you're having sales problems, I feel bad for you son.
    I got a 99 on metacritic, but a bitch gave me -1.
  • #2 internisus 3 years ago
    This still shares the two-fold fundamental flaw of Metacritic, which is that all critical voices are 1) reduced and 2) treated equally. Consumers need to find reviewers whose tastes and thinking suit them on an individual basis, and complex ideas cannot be communicated through a mere score.

    There really is no way for Metacritic or any alternative to escape these problems because they are built into the basic concept of what they seek to do. Aggregation is a detriment to the industry (and to any artistic medium) because it looks authoritative while failing to be meaningful.

    This results in the attitude that anything less than an 80 must be a poor effort and a failure, a mindset which completely ignores the possibilities of idiosyncrasy and redemption. Metacritic automatically kills anything which might be described as a flawed masterpiece. A more fluid transposition of review scores won't prevent OpenCritic from committing the same crime.
  • #3 MHWilliams 3 years ago
    @internisus Actually, it'll alleviate part of that. From the FAQ:

    "Gamers can create their own personal score by customizing which publications they trust."

    That allows users to tailor the aggregating to their own tastes.

    Once again, people dislike aggregation because they feel it minimizes the entire depth of a work or medium (it does), but they forget the useful aspects of that. For those outside of our medium, aggregation and scores are a helpful tool. Our games are no different from anything on Amazon, books on Goodreads, or films on review aggregators. Folks in those other mediums - I know a few car designers myself - put as much creative effort into their work as game developers.

    If you treat the score as the end all, be all, yeah, that's a problem, but they're only one facet of a review. A facet that is useful to certain other segments of the market outside of enthusiasts.
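The FAQ line quoted above can be sketched in a few lines. Everything here - the outlet names, the scores, and the plain averaging - is invented for illustration; OpenCritic's actual formula may differ.

```python
# Hypothetical sketch of a "personal score": average only the outlets
# a reader has marked as trusted. Data is made up for illustration.

reviews = {
    "USgamer": 90,
    "PC Gamer": 78,
    "Kotaku": 85,
    "Polygon": 60,
}

def personal_score(reviews: dict, trusted: set) -> float:
    """Average the scores from trusted outlets only."""
    picked = [score for outlet, score in reviews.items() if outlet in trusted]
    return sum(picked) / len(picked)

print(personal_score(reviews, {"USgamer", "PC Gamer"}))  # 84.0
```

The design point is simply that the reader, not the aggregator, chooses the sample - which is what MHWilliams argues alleviates part of internisus's objection.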
  • #4 internisus 3 years ago
    @MHWilliams That customization is a nice feature, but I don't think it can save OpenCritic, especially since it takes some initial setup work and therefore lacks the universal convenience of Metacritic.

    Generally, I would argue that aggregated scores are treated as the end-all be-all by most consumers who pay any attention to them. Potential customers who take note of a Steam game's Metacritic score are usually giving it their complete faith; unless they are already significantly informed and interested in a game, they are going to dismiss a 60 immediately rather than investigate it further. The very existence of that score implies authority and encourages that harmful mindset.
  • #5 MHWilliams 3 years ago
    @internisus Because those customers are generally low-time, low-impact consumers who just want to know if a game is worth buying or not. For those people, non-enthusiasts, scores are very helpful.

    Enthusiasts will read deeper into a review, though the issue remains they tend to use scores as pissing contests to see who's the best. That's a fault of the user, not a strike against the tool.

    I have friends who purchase two to three games a year. Scores are very useful to them. Me? I'm at around 13 games this year bought without consulting reviews. Scores matter far less to me, with the content of a review or discussion being more important.
  • #6 Roto13 3 years ago
    @internisus It doesn't require *any* initial setup to work. You can either instantly browse or search it with all of the reviews counted like Metacritic or you can customize the included reviews if you want to go that far. What do you want from the site? For it to scan your brain to see which reviewers and sites you trust and automatically create lists for you?
  • #7 ojinnvoltz 3 years ago
    Loved seeing Jay Sherman on the site. Coincidentally I've been thinking about rewatching The Critic all week. "Rosebud Frozen Peas: full of country goodness and green pea-ness."
  • #8 Zebetite 3 years ago
    If nothing else, some competition might provoke Metacritic into making some changes for the better. The consumer generally (less these days) wins when companies compete.
  • #9 ol'dirty'bus'stop 3 years ago
    metacritic is the devil
  • #10 prymusferal 3 years ago
    The Critic reference alone means that you, Bob Mackey, have won the Internet for the afternoon. Of course, the insightful and on the mark article helped, too.

    "Take that, Birth of Man."
  • #11 docexe 3 years ago
    @internisus I think that for quick reference purposes, aggregators are useful. The problems surrounding them seem more… well, “cultural” in nature to me. Yes, people tend to give that numerical aggregated score a level of “authority” that it doesn’t deserve, but that seems to me a result of those people being uninformed about (or not quite understanding) the critical process (not to mention for that matter, how weighted averages work).

    The really distressing thing is how many people like that work for the big publishers.
  • #12 bobservo 3 years ago
    @prymusferal Guernica had it coming.
  • #13 sleepiest 3 years ago
    The best idea I've heard for a review aggregator is one where you opt in to follow individual people. That way you can follow the reviewers with similar tastes, even if they change publications/have a feature somewhere other than their usual spot.
  • #14 Pacario 3 years ago
    Providing numerical scores is a perfectly legitimate means of expression. The problem, if there is one, is the tendency of critics to rate almost everything between a seven and a ten (on a ten/hundred point scale, of course). It's as if the true, unspoken review standard of the industry is actually a three or four point scale, with seven being "okay," and ten being "a masterpiece" (if still not exactly perfect).
  • #15 lanmao 3 years ago
    Now my website for aggregating aggregate game review websites makes a lot more sense!

    In all seriousness, this is pretty cool.
  • #16 link6616 3 years ago
    @sleepiest Yes, this is what I would love! It could be cool to just have an easy place I can see the collection of reviews from former 1up staff as they've moved about.
  • #17 Ralek 3 years ago
    @Pacario Yes and no. The problem begins when people start applying statistics to those arbitrary numbers and letters, thus pretending that they are dealing with empirical values. It creates a scientific allure that cannot withstand any proper scrutiny.
    If you want to be able to compute an average score for a game, you need to know how many times each score has been given, which makes it necessary to normalize all scores onto one scale - and that creates problems right away. Ignoring that, you need to be able to assign a logical order to those scores, like A > B > C or 100 > 99 > 98, and then make sure that the distance between each score is indeed equal. How do you prove that Person 1's A- is to their B+ what Person 2's 88 is to their 82?
    How do you do that? Well, for the most part, you really just assume that those conditions are met, and that therefore you can compute an average value that still holds any significant meaning and is not just an arbitrary number. Metacritic goes the extra mile, though, and in addition introduces opaque, variable weighting, which makes the whole thing pointless - even if the weights themselves are determined based on the dataset, which we don't even know to be a fact.
    It's mathematical voodoo, breathing life into mostly dead symbols.

    Edit: The point you are making about a tendency to "vote" for the perceived neutral category is indeed present. On the other hand, without any semblance of a normal distribution, there is very little any statistician can do with the data at hand ^^
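Ralek's objection about letter grades can be made concrete: averaging them requires picking a numeric mapping first, and two equally defensible mappings produce different "average" scores. Both mapping tables below are invented for illustration.

```python
# Two plausible but arbitrary ways to turn letter grades into numbers.
# Neither is more "correct", yet they disagree on the average.

grades = ["A", "B+", "B", "A-"]

mapping_strict = {"A": 95, "A-": 91, "B+": 88, "B": 85}    # school-style
mapping_generous = {"A": 100, "A-": 90, "B+": 85, "B": 80}  # evenly spaced

def average(grades: list, table: dict) -> float:
    """Mean of the grades under a given numeric mapping."""
    return sum(table[g] for g in grades) / len(grades)

print(average(grades, mapping_strict))    # 89.75
print(average(grades, mapping_generous))  # 88.75
```

The gap between the two results is exactly the "assumed conditions" Ralek describes: the output depends as much on the chosen mapping as on the reviews themselves.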
  • #18 enigma311 3 years ago
    @internisus literally created an account after reading this site for several years to agree with your post.

    I have realized that art (or personal appeal) trumps a score, in a way that matters to the player. So many interesting and beautiful games. My son loves Pikmin. I enjoyed it, but his love has taught me about timing and lots of other feelings that are appropriate but difficult to quantify.
  • #19 prymusferal 3 years ago
    @Ralek I just had to say that I love your phrase "mathematical voodoo." Too often in too many areas of discourse -- video game reviews, education, whatever -- the numbers are twisted and made to dance in ways unintended. Your comment was a very evocative way of putting it.
  • #20 l4wd0g 3 years ago
    It'd be nice if you could filter it by reviewer. Let's say I like Justin McElroy's reviews but can't stand Arthur Gies; I don't want to block all of Polygon, just one person.

    But it doesn't matter. People will still determine bonuses based on Metacritic, and the whole system feels totally flawed.
  • #21 Pacario 3 years ago
    @Ralek Scores are standard tools for critiquing most forms of expression--they're no different than, say, a college professor grading your paper or critiquing the project you did.

    The problem with many gaming publications, however, is the actual scale they use. IGN, for instance, breaks down titles to tenths of a point, which is of course absurd. The idea is to use scores as a general guide to a game's quality, not insist that game A is three tenths of a point better than game B, which is just silly.

    I've always championed the 5-point scale, which creates broad but useful categories for games. For example: 1-awful, 2-mediocre, 3-good, 4-great, 5-revolutionary/groundbreaking. Or something like that. Within this broad scale, most educated critics who are striving for some sense of objectivity would probably choose similar categories when making their assessments.

    But because there is no universal standard like this, then yes, these aggregator sites have to do the best with what they have.
  • #22 sleepiest 3 years ago
    @l4wd0g polygon is the perfect example of a site where I love some people, and can't stand the rest
  • #23 Ralek 3 years ago
    @prymusferal Thanks, it does sound snappy, does it not? ^^
    And yeah, I totally agree with your point. I think part of the problem is that most people lack even basic insight into how those numbers came to be, how they can be used and interpreted, and how not - not reasonably, at least.
    An example that comes up quite often is the use of _average_ income when talking about, e.g., inequality. Unfortunately, more often than not, this derived statistic is indeed used to twist facts and correlations to suit some specific narrative. Also, most of the time it is intentionally employed, despite the fact that _median_ income might actually be better suited to describe the distribution issues at hand, or at the VERY least, both need to be considered in tandem with each other.
    It's like Churchill so famously said: I only believe in statistics that I doctored myself. :-D

    There is a huge difference, though: anyone grading papers does not do so based on his own arbitrary whims. He works from guidelines that you (really should and can) know beforehand, and the final score is calculated based on a publicly accessible methodology, so that you can sit down and reproduce it, ostensibly coming to the same result. Obviously it is not 100% objective either, but it is open to intersubjective criticism and therefore bound to be more objective than any arbitrary number someone just threw out there without giving you much of any indication of how it came to be or how it was calculated.
    In fact, any paper that wants to be taken seriously will not only provide its results and conclusions, but also the raw data those results are based on, as well as the transformations and calculations that were performed on said data. Without that, there is NO safe way to tell if those numbers have any value whatsoever. The ability to reproduce results is the basic tenet of any serious scientific work.

    I don't really expect games journalism to be up to that standard, but the question remains: if they cannot meet those criteria, should they really be throwing around the same absolute numbers, thus creating a pretense that those numbers are just as "solid" as we are used to from other number crunchers, thus offering what I called "scientific allure"?

    Instead of "This is a rock-solid 89/100," it would be more honest to go with "This is a decent game, which will mostly appeal to ...., but not necessarily to ...." or anything like it. I think Eurogamer's current grading system is as much as can legitimately be done within the confines of pure subjectivity, without any recourse for reproducing and checking results. In the end, I would like to see any game review provide a short conclusion - what the game does right, what it does wrong, and who it might be for and who not - that I can read in under a minute. I can then read a handful of those, from reviewers I know and respect, and make up my own mind without being influenced by a BIG number floating around out there.
    Basically, as far as MC goes, yeah, I would like the numbers to be gone and to be able to filter reviewers to my liking, which means, from now on, I'm indeed going to keep a close eye on OpenCritic, as it's just closer to my vision ^^

    Last but not least, this obviously does not only apply to games, but to all kinds of other media as well, be it movies, TV, music or what have you - same issues all around!
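Ralek's mean-versus-median aside applies directly to review scores: a single outlier review moves the mean noticeably while leaving the median alone. The scores below are invented for illustration.

```python
# Invented data: four broadly similar review scores plus one outlier.
scores = [80, 82, 85, 84, 20]

mean = sum(scores) / len(scores)           # dragged down by the outlier
median = sorted(scores)[len(scores) // 2]  # middle value, outlier-resistant

print(mean)    # 70.2
print(median)  # 82
```

A Metascore-style average would report this game as roughly a 70, while four of its five reviewers scored it in the 80s - which is Ralek's point about derived statistics hiding the shape of the underlying data.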
  • #25 boatie 3 years ago
    Metacritic always seemed like inside baseball to me; I never look at it. I like reading reviews from USgamer/Eurogamer/RPS and some others and usually google for their take first.

    Metacritic is useful for industry people, and it's their albatross to deal with.
  • #26 docexe 3 years ago
    @Ralek You know, part of what you mention is why I prefer a site like Rotten Tomatoes to Metacritic. The “Tomatometer” doesn’t aggregate those arbitrary numbers (that are not even presented in compatible scales), but merely states a percentage of critics who gave a favorable review to a movie based on a sample of reviews, then provides some extracts or succinct quotes from some of those reviews.

    For practical purposes, it seems like a better “quantitative measure” than the arcane weighted average of Metacritic. I would prefer it if a site like that existed for videogames.
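The percent-favorable measure docexe describes can be sketched as follows. The review data and the 60%-of-scale threshold are assumptions for illustration; Rotten Tomatoes' actual fresh/rotten classification is its own editorial process rather than a fixed cutoff.

```python
# Sketch of a Tomatometer-style aggregate: rather than averaging scores
# from incompatible scales, count the share of reviews that are
# favorable. All data and the threshold are invented for illustration.

reviews = [
    (4.5, 5),    # (score, scale maximum)
    (88, 100),
    (2, 5),
    (7, 10),
]

def percent_positive(reviews: list, threshold: float = 0.6) -> int:
    """Percentage of reviews at or above the favorability threshold."""
    favorable = sum(1 for score, out_of in reviews if score / out_of >= threshold)
    return round(100 * favorable / len(reviews))

print(percent_positive(reviews))  # 75
```

Because each review only contributes "favorable or not," the incommensurable-scales problem from earlier in the thread never arises - at the cost of discarding everything a score says beyond its sign.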
  • #27 secularsage 3 years ago
    I really wish we could get away from review scores and aggregations. I develop surveys and scales as part of my professional life, and I dislike the idea of taking highly subjective scores from varied scales, slapping them into an algorithm and then generating a single number by which games are judged. (Given these problems, it would make far MORE sense to use a Rotten Tomatoes-style score that based the percentage on whether or not a review is positive or negative.)

    There's also a deeper problem in that readers start with the score much of the time and then read the reviews to see why they should agree/disagree with it. Though enthusiast sites don't want this to happen, it's a natural pattern, because the review score offers a summary of a lengthy piece and helps them to know whether or not they should read on. If you read the comments on most reviews, they're reacting to the score itself, not the reviewer's perspective or nuanced thinking.

    Nowhere does this become more of a problem than for games with a niche appeal. I know people who LOVE the Dynasty Warriors / Musou games and who buy every one of them. I can't imagine why they do it, but it's their thing. These games always get beaten up in the reviews, and yet their fans are passionate about them. This points to a problem with the review system itself - if the tastes of a reviewer are poorly matched to the appeal of a game, the review is garbage. I've lobbied several sites through forums like this to include a dual review for games (such as the "conventional" perspective and "fan" perspective), but no one seems interested in giving two scores since Metacritic only accepts one.

    I think Eurogamer has the right idea, reviewing games without scores and then labeling the best titles "Essential." This embraces the subjectivity of reviews, highlights the most universally good games and forces the reader to actually pay attention to the review instead of a score.

    Another approach I'd like to see (and one day may launch a site to actually use since no one else seems interested) is a consumer-friendly rating that advises for which audiences a game is a must-buy, a must-try, a bargain bin selection or worth avoiding entirely. I'd value that SO MUCH MORE than a number.
  • #28 Pacario 3 years ago
    @Ralek Everything that is critiqued or graded should be based on objective standards. So whether it's a professor grading a paper or a critic grading a game, there are certain criteria both must meet to be considered good, or at least acceptable.

    And remember, many universities offer degrees in game design--which means, of course, that these professors are also grading their students' game projects much like one would a term paper or anything else. That's the whole point of getting a degree in this field--through the criticism, mentoring, and advice of one's professor, a student will ultimately learn how to create a quality game that will satisfy the standards of his teacher's evaluation. Indeed, the best critics use these same criteria and this methodology when deconstructing and evaluating the games they play.
  • #29 Pacario 3 years ago
    @docexe I'm actually not a huge fan of the Rotten Tomatoes strategy. For example, movies that, say, get 2 and a 1/2 stars (out of 4) are immediately lumped into the "rotten" category, which really isn't fair. A 2 and a 1/2 star score basically means the movie is decent, just not exceptional.

    That said, if all you care about is watching the truly best films, then Rotten Tomatoes serves its purpose.