
When it comes to video games, 4K is more than a buzzword

Screenshot: Assassin’s Creed Origins/Ubisoft

It took switching between two feeds of the exact same scenes from two different consoles to convince me that 4K was something more than the next needless step in the game industry’s endless march toward the unreachable promised land of bigger and better graphics. As I sat there flipping between the PlayStation 4 Pro and Xbox One X incarnations of Bayek of Siwa, the star of Assassin’s Creed Origins, the benefits of this resolution arms race finally started to become clear. Yes, Microsoft has turned 4K into a flashy buzzword to hypnotize people into dropping another several hundred bucks on an upgraded version of a game console they might already own, but the extra horsepower of its fancy new machine illuminates a problem that never fully occurred to me before giving it a try: Modern console games look better than ever, but they’re getting harder to see.

Not to the point that they’re incomprehensible or ugly, but the widening gulf between what big-budget games are being designed to look like—best exemplified by the pristine visuals pumped out of high-end computers—and the scaled-down iterations hitting consoles has left Xbox and PlayStation users staring at blurrier, compromised versions of the same games. For most people, myself included, it’s not something that matters. After all, if you’re not actively comparing Wolfenstein II coming out of a PlayStation 4 to Wolfenstein II coming out of a $1,000 computer, the difference isn’t something that would even cross your mind. Aside from some slight technical advantages that only the hardest of hardcore tech fetishists care about, it’s still the same game, with the same story and frenetic action.

It’s also still going to be a surprisingly punishing game. I’ve written a bit about Wolfenstein’s difficulty here in the past, but something that didn’t come up in that confessional was the occasional niggling feeling that, in addition to being bombarded by Nazi gunfire from all sides, I just plain wasn’t able to see everything as well as I thought I should have. This comes partially down to art and level design—turns out, Nazis always wear gray and black and tend to hang out in matching gray and black army bases—but when games this overwhelmingly rich in visual detail get scaled down, the slight difference between a high resolution and the highest resolution starts to make matters worse. The fewer pixels being thrown on the screen to help render these lavish environments and characters, the less clear everything becomes and, if you’re like me anyway, the more likely you are to catch yourself squinting in an attempt to make out distant enemies or special surfaces that are only differentiated from their surroundings by slight changes in color or texture.


Once I started noticing it in Wolfenstein and Assassin’s Creed, where the Xbox One X’s bump in horsepower makes a visible difference in the legibility of an image even when compared to the PlayStation 4 Pro, my mind was flooded with memories of momentary frustrations and straining my eyes over hard-to-read scenes. It doesn’t help that modern blockbuster games are so overstuffed with graphical details while adhering to homogenous, muddy color palettes. When you combine all that busyness—props and characters and sets and foliage, all with dense textural definition of their own—with the blurring and fraying that’s being done to these games so that our current consoles can run them, the resulting image is just flat-out harder to discern. It can be enough to give anyone a headache, and it’s even more damaging for visually impaired players who already wrestle with accessibility problems.

Games don’t operate with the same visual language and focus as TV or film. In those media, small visual details certainly matter for building a scene and an artistic identity, but in games they are an integral part of what it means to even play at all. Developers have been struggling with how to deal with that ever since games transitioned into 3-D. Those early years were a mess, with digital worlds looking like living smears and the ability to see more than 10 feet in front of you being a luxury. It’s not nearly as big a problem now as it was in the Nintendo 64 era. Technology has advanced and developers have learned how to subliminally guide players to their goals, but the growth in size, detail, and complexity of games just presents more challenges for visually communicating everything players need to keep track of. It’s why so many games over the last decade adopted contrived super-vision modes that highlight important objects and repaint the world in easily divisible colors. And it’s why even the modest bump in clarity I saw jumping over to an Xbox One X felt so impactful.


This isn’t to say Microsoft’s new premium console is some sort of miracle machine that renders games in a form that was previously unimaginable. Compared to the PCs that have been able to pump out fully 4K visuals for years now, the improvement is a slight one, a smoothing out of the kind of rough edges you only notice if you’re really looking for them. Is it a reason to run out and spend at least $1,000 on a new TV and game console? Hell no. Is it such a drastic difference that it makes the games running on the older versions of these machines look like years-old trash? Also a big no. It’s a small step forward, but that step was enough to open my eyes to an issue I hadn’t even realized was plaguing modern games, and it happens to be an issue that, at least for me, has had an actual adverse effect on the physical act of playing. It might not be worth the upgrade, but it’s literally easier on the eyes.
