The Changing Face Of Gaming

Once upon a time, gaming news was something we saw once a month or so, when the latest copy of print magazines such as GamePro and Electronic Gaming Monthly hit shelves or landed in our mailboxes. Today, gaming news, like every other variety of information, runs on a twenty-four-hour cycle. Reports arrive not monthly, but daily, if not hourly. This Internet-heavy generation comes on the heels of Sony's tremendous success with the original PlayStation, which paved the way for video games to be seen as part of mainstream culture rather than a shameful subculture celebrated only by the socially inept, secluded in dimly lit basements.

Now games top the media revenue charts, with numbers that dwarf other forms of entertainment (electronic or otherwise). Games like Call of Duty: Modern Warfare 3 become national phenomena, while World of Warcraft chugs along with a cool ten million subscribers, an astronomical number of people paying for one game continuously, month after month. World of Warcraft also serves as a great example of one of the greatest truths of modern gaming: what was once a mostly solitary experience is now a social event.

Most games today include multiplayer modes to extend their lifespan. Even those that don't are almost always connected to some sort of service, whether through the console they run on or the digital storefront through which they were downloaded, that keeps players updated on their friends' trials and travails in the world of gaming.

Depending on how one looks at it, this is either good or bad. For those who still want gaming to be a solo pastime, free of the influence or interaction of other gamers and friends, it has become extremely difficult. Consciously cutting out that online framework is a difficult process and, in some cases, either illegal or outright impossible. That's partly because publishers, especially on the PC, have been pursuing a "perfect" form of DRM for a while now.

DRM, both in its current form and in whatever theoretical form it takes in the future, is apparently here to stay for PC gamers. With the proliferation of broadband, online gaming has become a regular, popular pastime, and digital game downloads have become viable. This also means, though, that piracy is thriving; there is, after all, bandwidth to spare. DRM has become the standard response, with companies such as Ubisoft and Blizzard going so far as to make their games, even those with single-player components, completely unplayable unless they're connected to the company's authentication servers. This isn't so bad in something like the recently released Diablo III (except when that game experiences service hiccups), which has a drop-in/drop-out party system that allows for a continuous blend of single-player and multiplayer. For a game like Assassin's Creed: Brotherhood, though, which has a segregated and distinct single-player experience, the inability to play without logging on and staying connected to Ubisoft's authentication servers can be devastating.

There's another element to this social evolution of gaming, though. As greater numbers of players have flocked to games, maintaining their attention and providing a measured learning curve has proven a new challenge for developers. From the classic days up through the N64 generation (perhaps the PlayStation 2 generation as well), the key seemed to lie in offering players a finely honed, well-crafted gameplay experience. It's worth noting, as well, that rigid genres were in vogue back then, and games that bridged them were rarer. RPGs were RPGs; first-person shooters were first-person shooters.

And then someone realized that blending elements of one genre into another created a highly addictive experience. Part of the appeal of RPGs, after all, is that, despite their slow and sometimes awkward gameplay, one's characters are continually improving, getting stronger. Soon this began to show up in action games, with upgradeable weapons and abilities that could be learned (in truth, harkening back to River City Ransom on the NES). Eventually, Call of Duty: Modern Warfare came out, and its multiplayer system utterly demolished players' expectations. Performing well in matches earned players experience points, which granted access to new equipment and abilities. These in turn evolved the gameplay, all while providing a measurable bragging point for the player who'd achieved a higher level than their foes. Since then, it's actually rare to see a game that doesn't employ at least trace elements of an RPG framework, allowing players to customize their experience in some manner.

And it goes both ways. MMOs have hit a rut, with World of Warcraft clones reaching critical mass. Star Wars: The Old Republic, despite the tremendous license behind it and incredible design from one of the best RPG houses in the world, has come up short in sales, with flagging retention rates among players that point to a potential implosion down the line. Games like TERA, therefore, have begun to experiment with different approaches to MMO gameplay, replacing the hot-bar system that has become an unspoken tradition with fast-paced, precise hack-and-slash combat.

It's not just how games are made that has changed, though. They're marketed differently too: targeted directly at the consumer, coming from companies large or small, and available through channels that no longer require a physical disc to change hands. Indie games on Steam, Xbox LIVE (and its Arcade marketplace), and the PlayStation Store all come to mind, chock full of games that will never be seen on store shelves. This has also led to the double-edged sword of DLC and patching.

Patches are something PC gamers have dealt with for decades, and expansion packs have long been available as well, though they were almost always substantial additions priced at a significant fraction of the original game's cost. Systems that support downloadable content, consoles and PCs alike, also open games up to post-release patching, which sometimes leads to developers shipping a product that isn't wholly finished on release day, only to provide tweaks and additions down the line. On the flip side, DLC can allow developers to extend a player's enjoyment of their game with more of what made it great to begin with. It can also be abused, letting developers withhold content at release, ship it on the disc where players can't legally access it, and then sell it to them later for extra money.

No system is perfect, and the one under which the gaming industry currently operates is far from it, but it's interesting to watch it evolve as games become increasingly complex. The precedents being set now, regarding DLC in particular, as to what's acceptable for a developer to do and what steps too heavily on players' toes, are likely to influence how that element of design and marketing is handled from here on out. Looking at how far we've come, though, from the humble days of Pac-Man munching quarters at arcades, from Street Fighter II and Turtles in Time to Call of Duty: Modern Warfare and its blockbuster ilk, the medium of gaming is still very young and constantly shifting. The operative question on most developers' minds, then, is, "Where is it going next?"

And for the few brilliant ones, "Where can I take it?"

By
Shelby Reiches
Contributing Writer
Date: May 18, 2012

*The views expressed within this article are solely the opinion of the author and do not express the views held by Cheat Code Central.*
