The Graphics Parity Dilemma

Nov 24, 2014

As someone whose passion is covering the video games industry, I am always eagerly looking forward to the storylines and trends that take hold of a generation of consoles. It’s been pretty hard to miss that a growing trend is graphics comparisons between PC, PS4, Xbox One, and sometimes, the Wii U. When third-party developers release a game on multiple platforms, it’s only natural to wonder which ended up with the best version. Games like Metal Gear Solid V: Ground Zeroes saw the PS4 with a visual edge over the Xbox One, boasting a resolution of 1080p compared to 720p. And really, it seems that nearly every third-party game has at least a higher resolution on the PS4 than on the Xbox One.

Over time and to the delight of many PS4 owners, this scenario has continued to repeat. However, in the last couple of months we’ve seen a possible growing trend of developers choosing to level the playing field by offering the same resolution and frame rate across every console their games are released on. We see this trend in the latest Assassin’s Creed Unity running at 900p and 30fps, and in Rocksteady announcing that Batman: Arkham Knight will feature the same resolution and frame rate across each system it is being developed for. As a result, many next-gen adopters have become a bit frustrated with this trend. But is this such a bad thing? First, let’s address the validity of many console owners’ concerns.

Rocksteady Studios wants parity. Did they ask Batman?

When Rocksteady made its announcement, Twitter timelines were flooded with frustrated PS4 owners who felt that third-party devs were preventing them from playing games that harness the full power of their console of choice. Granted, it’s frustrating to see a “next-gen” game fail to deliver the top visual output and fluidity of gameplay. Are you really telling me that these $400 and $500 consoles can’t both produce the same resolution? Plus, why should the PS4 be punished for the learning curve seemingly required to develop a game for the Xbox One? That is a legitimate worry and complaint. Still, given that we are only one year into this generation’s life, this feels like a knee-jerk reaction. It might help if we look at this from the developer’s point of view.

Product exposure in media is very limited – whether that is TV, the internet, or even social media. A developer has only so much time to broadcast to the consumer what makes its game so special. On top of that, we’re talking about a plethora of third-party developers competing for our attention. So, for example, when a top video game website’s main article is about the new Assassin’s Creed game’s graphical comparison between next-gen consoles, it becomes even more difficult for Ubisoft to communicate what the game is and why we should buy it. Graphics parity eliminates this risk and forces those of us who cover video games to focus on the deeper qualities of a game and not just how it looks.
900p still looks great, if you ask me.

The truth is, this industry is always changing. As developers become more familiar with these new consoles, achieving a higher, consistent resolution will become more feasible. And let us remember, a “next-gen” game means more than just 1080p/60fps. It means bigger worlds and more things going on in those worlds. Remember all of the various scenarios and worlds represented in BioWare’s Mass Effect trilogy, and imagine how much further those elements will be pushed in future Mass Effect games. If games like that require less focus on 1080p and more focus on other things, do they still fit the mold of what we envisioned as “next-gen”?
A good solution for gamers is to be patient and see where this trend goes. If it goes somewhere you are not pleased with, you still have the most powerful tool of all – your wallet. If you’re not happy with the changes in the industry, you don’t have to buy into those changes, and developers will hear you loud and clear when you aren’t buying their games anymore. So ask yourself: “What is ‘next-gen’ to me, and what will I do if the industry disagrees with my definition?”