Are Games Better Now Than They Used to Be?

In the last two weeks, Pillars of Eternity has occupied no fewer than sixty hours of my time, not counting the time spent thinking and writing about it. This is, by a wide margin, the most I’ve played a single-player game in years. It’s also, perhaps, the most I’ve enjoyed a single-player game in years.

And yet, Pillars of Eternity is a game that simply resurrects an old genre rather than meaningfully innovating upon it. During all of those hours of play, there was little in the game that I could point to and say “that’s new” or “I’ve never seen anything like that before.” Pillars of Eternity ultimately sticks to an existing, even outdated, formula and executes it well.

Old is new again in Pillars of Eternity

Granted, there’s a certain itch that it scratches in this particular gamer, and it’s by no means everyone’s cup of tea. But the fact that I enjoyed it as much as I did got me thinking about the progression of games with time, and whether it can be called a true progression. Specifically, it had me wondering whether games today have made real advancements in game design upon games from earlier generations.

Naturally, in the technical areas of graphical and audio fidelity, games have improved by leaps and bounds. One cannot return after a long absence to even brilliant games of the early 3D era such as Ocarina of Time or Final Fantasy VII without being struck by the blockiness of their models or the eyestrain-inducing blurriness of their textures. Similarly, where games once featured simplistic (if sometimes wonderful) MIDI soundtracks, today full orchestral scores have become common. By and large, it’s hard to argue that these aren’t improvements.

But a technical comparison focuses only on what might be called the craft of game development, and not its art. It’s comparing The Empire Strikes Back with Attack of the Clones and remarking only upon the visual improvements made between them. Visual quality is important, and it may well enhance one’s enjoyment. Advancing graphical quality may even be an art form in its own right. But such a limited comparison ultimately neglects what makes the medium of video games an art form.

Technical superiority doesn’t necessarily make for a better product

And not all such advancements are necessarily improvements. I have a hard time believing, for example, that having voice actors speak every line of dialogue in modern games is any more of an improvement over older text-heavy games than movies are an improvement upon novels. Indeed, in many cases, it seems the ubiquity of voice-acting and its associated expense has led to a decrease in the depth and complexity of character interaction in modern RPGs.

But the question of advancement isn’t merely about narrative. In my view, there’s no less art in the design of Sim City than there is in BioShock. And that comparison gets to the heart of the question: Has the design of games ultimately improved?

Is there a better designed side-scrolling action game than Super Metroid or Castlevania: Symphony of the Night? Is there a better designed fighting game than Street Fighter II or Virtua Fighter? Is there a better designed first-person shooter than GoldenEye, Half-Life or Counter-Strike? Is there a better designed real-time strategy game than StarCraft?

In my view, most of these games are clear improvements on their predecessors. They not only defined their genres but realized their potential. But when a game realizes its genre’s potential, where do you go from there?

Is there a better RTS than StarCraft: Brood War?

Perhaps you go to new genres. You create Minecraft or DotA. But can games get better in meaningful ways within existing genres? Or is the progression of modern games simply a function of technical improvements, changing subject matter, and additional complexity and features? Is it only more and different rather than better?

Consider, for a moment, the first-person shooter genre. When Halo: Combat Evolved popularized automatically regenerating health, it was widely copied and considered an advancement in the FPS realm. But years later, genre staples such as Counter-Strike: Global Offensive, Team Fortress 2, and Left 4 Dead have simply done away with it, returning to the older standard of health recoverable only through special items and abilities, or not at all.

There was never anything particularly better about regenerating health, even perhaps within the games that used it. It was simply different, something new.

With Halo, combat evolved. Or did it?

So what if the future of existing genres holds no better? And what makes a game better in the first place? That, of course, is a very hard question to answer.

Certainly, modern user research methodologies have ensured that our games are better designed for use by actual humans, but that still doesn’t answer whether the underlying design as a game has improved. You can only test and iterate once you have something playable, and user research is unlikely to change the soul of the game. Perhaps there is no better.

But it’s hard not to feel that history has indeed shown us better. For my money, StarCraft was simply better than any real-time strategy game at its launch, bringing asymmetric gameplay while perfectly integrating resource collection, multiple research paths and intense unit skirmishes into relatively quick matches. Street Fighter II built the competitive fighting game scene where all its predecessors had failed by perfecting input controls, developing the combo mechanic, and offering a roster of diverse yet (roughly) equally strong characters for players to pit against each other. Counter-Strike: Global Offensive, the current king of competitive first-person shooters, features only minor gameplay differences from its 1999 predecessor, Counter-Strike.

There are too many genre-defining games from earlier generations to list here, but very few from the past decade.

Street Fighter II defined its genre, and continues to be played today

So have video games achieved most of what they can, or is there significant room for growth?

Of course, that conversation is hardly new. Much hand-wringing has been done over whether the industry has realized the potential of video games as a medium: whether video games have had their Citizen Kane.

But what if early system-simulation games like Sim City or Civilization have already found that potential? Ian Bogost made a similar argument in The Atlantic recently, arguing that such system-based games are better than character-focused games because they do what only games can: teach people to see the world as complex systems. I disagree insofar as I happen to believe that there’s a great deal that narrative-driven games can do that other media cannot, but I agree that system-simulation games are one of the strongest examples of design as art.

Gaming’s Citizen Kane?

So what if gaming’s Citizen Kane has come and gone, forgotten and unidentified because we were looking too hard for beauty in storytelling, when in reality it lay in the flow of urban traffic or the progression of research upgrades? What if there’s really nowhere to go from here, except to improve our technology and tell different stories and model different things?

What if there’s no better to be had, unless it comes within nascent genres?

I think it’s an open question, and I’m not myself convinced that it’s the case. I certainly hope it’s not.

But sometimes when I play a new critically acclaimed game, it’s hard not to feel like I’d rather just go play Final Fantasy VII.
