Op-Ed: Can Console Gaming Save Itself?

There's plenty of talk these days about another video games crash. But what does that mean, exactly? And would it be a bad thing?

Let's talk about the great video game crash of 1983. You've probably heard of it. The year the Atari 2600 market collapsed so hard it nearly killed all video games forever and ever? The time that E.T. and a bad port of Pac-Man nearly destroyed a budding industry single-handedly? That crash?

Well, that's not really how it went. What actually happened was much more complex; Atari released a couple of bogus high-profile games, yes, but far more damage was done by unrestrained third parties who churned out games far worse than E.T. or Pac-Man. The resulting glut of unregulated content of wildly varying quality clogged the Atari 2600 market, making it impossible for consumers to tell good games from bad. And, sure, it didn't help that those two high-profile first-party releases were equally crummy.

The budding console industry choked on its own eagerness to cash in on the Atari craze, and Atari lost money hand over fist... as did parent company Warner Communications... as did countless retailers. Understandably, everyone wanted out of video games.

Now let's fast-forward three decades. Adjusted for inflation, video games as an industry bring in about 12 times as much per year as they did in 1982, at the market's pre-crash peak. The business is edging its way toward being a $100 billion global concern. That's a lot of money flying around. And yet, we constantly hear about how difficult life is for video game makers. We hear about another multimillion-selling game that somehow fell short of publisher expectations. Another round of studio layoffs. Another publisher shutting down a developer it bought and failed to make profitable.

With each passing year, we see fewer and fewer major game releases, and the ones that do appear seem increasingly troubled. Last fall's big launches were a particularly dire lot, riddled with glitches, laden with mercenary monetization models, or both.

If corporations as vast and resource-rich as Microsoft or Sony can't get their tentpole game releases sorted out, what hope is there for the rest of the industry? Looking at the state of the games industry in 2014 and 2015, it's hard not to see another 1983-style crash looming on the horizon. And maybe that's just what video games need.

When people say the 1983 crash caused everyone to scramble to get out of the games business, they're way off the mark. The "everyone" in question consisted primarily of American toy retailers, who didn't want to stock console games anymore, and the console and console game makers who depended on those retailers for distribution. The '83 crash was bad news for Atari, Coleco, and Mattel, all of whom had a lot riding on their game consoles. Everyone else, though, weathered the storm just fine.

PC gaming found itself suddenly bolstered by an influx of developers abandoning the console ship for the safe shores of the computer market. And outside of America, console games were just beginning to find their way in 1983: Japan saw the debut of Sega's SG-1000, the MSX platform, and Nintendo's Famicom that year.

When American consoles collapsed in the '80s, PC and Japanese developers (and Japanese PC developers) were able to come into their own.

In effect, the 1983 crash amounted to the death of a single narrow slice of gaming that burned out on its own greed. But the medium, and the industry, chugged along just fine. The '83 crash was simply a market correcting itself. The console business had grown bloated and unmanageable, and it imploded.

But the death of those early consoles in the U.S. market made the industry stronger. PC gaming in the West improved, and foreign markets didn't have to worry about competing with American imports for a while, allowing them to build their own businesses.

Console gaming eventually bounced back in the U.S., and it returned stronger than ever when Nintendo resuscitated the American console market a few years later. The Atari 2600 sold 30 million units; the NES sold twice that. And the industry kept growing: 15 years later, the PlayStation 2 arrived and sold two and a half times as many units as the NES.

This all sounds great, right? But the reality is that things aren't so great for console gaming. The business seemingly peaked last generation, with roughly 270 million consoles sold altogether — and that's not including portable systems. This generation, however, has gotten off to a much slower start, with only the PlayStation 4 putting up respectable numbers. Xbox One is creeping along in a soft second place, and Wii U might as well not have shown up at all despite its one-year head start.

A pale shadow of its predecessor, at least in terms of sales.

The games industry is growing, sure, but console games aren't where the money is these days. Everyone is scrimping, cutting corners, and flailing desperately to stay solvent in consoles, and even then it's not really panning out too well for them.

There's no single issue at work here. Like the crash of '83, the market's troubles are the result of a complex blend of factors.

The most obvious is the ever-rising budget required for developing big-name games. Kevin Spacey doesn't work for free, after all. Fifteen years ago, Sega's Shenmue was widely hailed as the most expensive game ever made at roughly $70 million. But last year's Destiny supposedly cost half a billion dollars to develop and market. That's insane! And it's in no way sustainable.

Sure, Destiny reportedly broke even within a day, but it's hard to measure the invisible costs the game exacted. Released in a seemingly unfinished state, riddled with bugs and exploits, saddled with a nonexistent story and costly DLC that lacks satisfying content, Destiny has done a great deal to make gamers more cynical and distrustful. And it's certainly not the only culprit here, simply one of many games reinforcing the notion that publishers are out to squeeze every penny from consumers by any means possible, however unethical.

Kinda makes you pine for the days when Oblivion's horse armor was the most egregious monetization attempt.

This seems like the worst possible message for big publishers to be sending at a time when other areas of the games industry continue to pummel the entire concept of value. Mobile releases and Steam sales have cemented the notion that 99 cents (or free!) is a fair price for a game. When the $60 going rate for a packaged console release buys an unfinished piece of software, it's hard to make the case that "premium" pricing buys a premium experience.

And then there's the fact that the younger game enthusiasts who will define the medium's future simply have less spending cash than previous generations did. According to a 2012 NPR report, the all-important under-35 market that game publishers target so eagerly has only about a third of the disposable income the same demographic had a generation ago.

Change is afoot, but unfortunately most of the new models big publishers use to sustain themselves feel, to put it charitably, like they're designed to take advantage of consumers. Either they nickel-and-dime customers to death with microtransactions and frequently questionable DLC, or they simply use paying customers as beta testers, with no firm guarantee that the finished product will ever see the light of day.

Lara Croft, out hunting for some funding.

Some cash-strapped studios are looking to first parties for handouts, which has made possible otherwise prohibitive releases like Bayonetta 2, Street Fighter V, and Rise of the Tomb Raider. But this seems like a short-lived solution; first parties only have so much money to go around, after all. And this option only really works for developers with a proven track record, acting as a sort of life support for legacy franchises established in happier times. A new studio with an untested staff working on an unknown series might get a passing nod at Sony's E3 presentation, but it's not going to get the kind of backing Sony has invested in From Software's upcoming Souls successor, Bloodborne.

Console games are choking themselves out of existence, suffocating on their own costs and the ever-rising demands of a stingy audience. Gamers demand 1080p graphics at 60 frames per second, but they don't want to spend a cent more than they can get away with paying. Like the Hollywood blockbuster model, the AAA console game is spiraling toward extinction.

So what can you, as a console game enthusiast, do? Honestly, not much — at least not directly.

The most important thing to do is make your voice heard, whether with direct feedback or with the time-tested practice of voting with your wallet. It's ultimately on publishers to right their business models before it all comes crashing down, but American corporations in general aren't known for their ability to look beyond the current fiscal year's earnings when they set up their strategies. Their current model may lead them to ruin five years from now, but so long as their shareholders are happy come the next earnings report, most companies are content to stumble aimlessly toward their doom.

The squeaky wheel does enjoy some extra grease sometimes: Fan outrage over the technical problems affecting Halo: The Master Chief Collection and Assassin's Creed: Unity resulted in extra free content.

The secret, in other words, is to make your discontent a near-term concern. Look at how quickly Ubisoft and Microsoft moved to offer make-goods for fans disgruntled with the flaws of Assassin's Creed: Unity and Halo: The Master Chief Collection — amelioration that never would have happened without consumers raising a massive stink on forums and social media. As entities, corporations tend to be more pragmatic than compassionate; consider those tales of auto manufacturers who determine it's cheaper to pay out a handful of lawsuits from people injured by a manufacturing defect than to recall and correct the error. By comparison, free DLC seems downright loving.

In the long term, though, the business model seems in desperate need of change. Fewer and fewer publishers are able to compete at the top tier of game tech and design, and the games that do appear feel increasingly conservative and stale. You can play a drinking game with the trends and buzzwords at each year's E3 press conferences... but only if you have a strong liver. The open-world titles at E3 2014 alone were a case of cirrhosis waiting to happen.

From the outside — that is, from the perspective of someone who doesn't experience the pressures on corporate decision-makers to push the decimal point ever further to the right each quarter — the solutions seem simple. No doubt they're far more complex from within, and thus tend to manifest in strange and sometimes bewildering ways, such as Nintendo of America's recent spate of infuriating choices. Most companies are gravitating toward a model of austerity: Big publishers lean on a few tentpole titles guaranteed to sell millions of units, while smaller companies make harder choices about what to release or localize.

Star Citizen may be one-of-a-kind, but aspiring developers could learn a lot from its success.

And yet, more and more games launch every year. But a growing number of them bypass the traditional publisher model that defines consoles; independent hits like Minecraft and Star Citizen could never have happened under the old-fashioned games business model. Their runaway success has left the old guard stunned. To their credit, the stewards of console gaming are taking action, albeit slowly, and with varying degrees of sensibility. Microsoft's solution was to buy Minecraft outright, as if simply owning a thing gives them the secret of its breakthrough design and model. Sony, on the other hand, has opened its doors to a vast array of indie developers.

It's here where the best chances for console games to survive seem likely to manifest. Indie developers represent the mid-tier game model where modest budgets and tech allow creativity to thrive, unlike AAA games (forced into conservatism by the need to play it safe and recoup all those rising costs) and bottom-tier mobile games (whose bargain-basement price tags result in rapacious monetization at the expense of good, player-friendly design).

Sony definitely looks to be leading the charge in this movement, creating a positive feedback loop through the promotion of promising independent games. Their emphasis on the entrancing No Man's Sky at last year's E3 press conference, for example, won them respect from proponents of independent development while also raising the game's profile, practically guaranteeing it a successful launch that was by no means certain beforehand.

If console games are to survive as a business, we'll need more of these mid-level games out on the market — just the way things were in the PlayStation 2 and Wii eras. There, too, we saw a positive feedback loop: Those platforms' large install bases gave publishers the courage to publish more modest-scale games on those systems, which gave consumers more choice, which made the systems even more appealing. At the moment, Sony is once again in the best position to make this happen, so in many ways it comes down to them.

A radical shift in the games industry could give a chance for developers like Olivier Madiba to shine.

But if they blow it and consoles continue to spiral into an untenable state where we're only seeing a few dozen blockbuster titles, largely cross-platform releases, per year... is that really so bad? The middle game market already exists in abundance on PCs. If the market implodes again, the way it did in 1983, we'll probably simply see PCs rise to take their place, the way they did after 1983. And foreign designers would likely have a better chance to make their mark in the medium — but this time, instead of European and Japanese designers, they would probably be creators from East Asia, Latin America, and the Middle East.

Not that I'm in any hurry to see the console market implode. Not only do I prefer to play on consoles, but my livelihood depends on them. Whatever form the post-console market might take, the ad revenue model that allows sites like USgamer to thrive would likely evaporate.

But games, and gaming, will survive regardless of whether or not the old giants figure out how to be small and nimble again. Sometimes to save the forest you need to burn it to the ground and give new life a chance to take root in the place of sickly old trees whose aging canopies block out the light. It's not the ideal outcome, but anything sounds better than another set of bungled releases like 2014's. Right?
