Graphics Aren’t the Key, But They Could Bleed the Industry Dry.

With the anticipated holiday launch of the PlayStation 4 and Xbox One just a few months away, it’s no surprise to see gamers and developers alike raving about all the flashy gadgets and toys that the new hardware will offer. (I know I’m eager to get my hands on the brand-new tech.) Equally unsurprising is that better graphics ranks at the top of most wish lists. After all, the gap in visual fidelity between the PlayStation 2 and Xbox and their successors, the PS3 and Xbox 360, is immense, so the next installment in the console arms race is expected to deliver.

Ironically enough, a major obstacle standing in the way of innovative and unprecedented visuals is the fact that today’s games are already pushing the boundaries of realism. Titles like The Last of Us, Metro: Last Light, Uncharted 3, God of War 3, and many more have already shown that near-lifelike depictions are possible on current tech. As a result, the next hardware generation can’t deliver a qualitative leap of the same magnitude. However, this isn’t necessarily a bad thing. E3’s motley crew of demos, trailers, and gameplay showcases proved that we’ve got more eye candy to look forward to than ever, so there’s no question that PS4 and Xbox One titles will look fantastic. Yet that same raw hardware horsepower represents a unique variable in the software development process—one that could prove disastrous for even the most anticipated titles.

Let’s look at the sad tale of the recent Tomb Raider reboot. Anyone familiar with the title will attest that Lara Croft, the game’s female protagonist, has come a long way since her polygonal inception on the original PlayStation. The prequel-flavored reboot is easily one of the best-looking games in recent years; cinematic moments genuinely come to life, and the island environments are rendered in breathtaking detail. Of course, detail like that doesn’t just fall out of the sky. Square Enix explained (as reported by Eurogamer) that they “put a considerable amount of effort in polishing and perfecting the game content,” which surely incurred considerable development costs. It was this lavish, graphics-intensive development process that drove the game’s requisite sales figures so high.

Square Enix believed that Tomb Raider would sell a minimum of 5–6 million units within a month of release. Unfortunately, the game only managed to sell 3.4 million units in that window. That a triple-A project could move 3.4 million copies—an absolutely monumental figure—within a month and still be deemed an “extraordinary loss” that “did not translate to actual sales performance” is appalling. If I weren’t aware that we’re talking about a Tomb Raider game here, I’d swear those figures were dragged out of the Great Depression era.

This demonstrates a crucial point for video game development, even outside of the triple-A scene: Graphical emphasis is only as valuable as its development is viable. In short, if allocating considerable time and resources solely to the graphical side of a game’s development is costly enough to necessitate an outlandish sales forecast, then it can’t, or perhaps shouldn’t, be done. The same rule applies to many of today’s technological innovations: Dazzling technology exists, but the vast majority of it is too impractical (i.e., expensive) to implement on a broad scale.

Luckily, history has also proven that a high pixel count isn’t necessary for a game to find success. Look at Borderlands 2, which sold over six million copies by the end of Q1 2013. Considering how much of a cult classic the original Borderlands was, two million sales would’ve been impressive, so reaching six million copies is absolutely mind-blowing. The game was never perceived as a triple-A project, nor did it tout its graphical fidelity as a selling point.

If anything, the best way to sum up Borderlands 2’s art style is, well, style. The game’s cel-shaded aesthetic packs so much flair that it’s practically dripping off the screen. In fact, you could easily argue that the popularity of the game’s aesthetic spawned the slew of cel-shaded titles we know today. And yet, the game’s graphical quality certainly can’t be called photorealistic, nor does it push the limits of hardware. But it works.

Borderlands 2 isn’t the only example of this phenomenon. BioShock Infinite tells a similar story of taking a stylized graphical approach and still finding great success—in this case, four million units moved in a mere month. Granted, Infinite was obviously a highly anticipated title, the third in the popular BioShock series, but the point still stands. These games occupy the end of the spectrum opposite Tomb Raider’s and, intriguingly, echo the next-gen thoughts of many industry leaders.

An early piece from Gamasutra detailed the expectations of several studio heads with regard to next-gen development. Tim Sweeney, the founder of Epic Games, expressed excitement about “bringing together all the features and expectations that gamers have built up from all the main platforms,” such as “Facebook integration,” which allows players to “hook up to social networks and find [their] friends in there.” A similarly non-graphical notion was held by Christian Svensson, senior vice president of Capcom Entertainment. “I’ll tell you something I’m hoping for,” Svensson asserted. “I’m hoping for a much more fluid means of providing updates to consumers, being able to have a much more rapid turnaround in between when content is submitted and when content goes live to consumers, to provide a higher level of service to them.”

However, the truest equilibrium for next-gen development was articulated (as reported by GenGAME) by Masachika Kawata, producer of Resident Evil: Revelations. “Gamers like games, and they like games with the content of something that they want to enjoy and they want to play,” Kawata claimed. “That’s never going to change no matter how good the hardware gets.” Kawata went on to add, “Really it’s just important for everyone who’s developing for the next generation to focus on the games themselves and the gameplay, and let the additional graphics power and stuff be an amazing bonus.”

Whether or not you’re a die-hard graphics enthusiast bent on reproducing reality on your TV is irrelevant. History agrees with Kawata, as does the prevailing opinion among gamers: Good graphics don’t make a game matter; a game makes good graphics matter. It’s more prudent, however, to point out that there’s virtually no room for an excessive focus on graphics in the next generation of software development. With networking opportunities, cloud computing, the aforementioned social avenues, and more to account for, development for the PS4 and Xbox One (and, to some degree, the Wii U) will surely be a tight balancing act.

Of course, there’s no excuse for a sub-par presentation either. The GPUs of the Xbox One and PS4 are powerful enough to provide all manner of wow factor without incurring exorbitant development costs, so we can expect to see plenty of eye candy in the next generation as well. However, we don’t need another Tomb Raider ordeal, so studios will have to strike an appropriate balance—Kawata’s golden threshold, if you will—in order for the future of gaming to stay healthy.
