The dust has settled on E3, and we now know a lot more about how Microsoft hopes to revive its fortunes with the console formerly known as Scorpio. Like the PlayStation 4 Pro, the Xbox One X is a souped-up system laser-focused on displaying 4K images. You can argue over the degree to which either console produces “true” or “native” 4K, with Microsoft holding the technical edge, but I think such discussions miss the point. 4K is simply the wrong target in the first place.
The Xbox One X and PS4 Pro are unusual devices in that they provide significant power improvements without breaking compatibility with the existing Xbox One or PS4. Previously, console power upgrades were restricted to generational shifts — the PS3 that the PS4 replaced in 2013 ran on essentially the same hardware as the launch model from 2006. But the shift away from exotic components to the PC-style x86 architecture found in current consoles means it’s much easier to give them linear upgrades within the same generation.
This could be awesome if done properly — it’d mean you’d always have the option of buying modern hardware, or you could save money by buying the original model. But the way the PS4 Pro and Xbox One X have been designed and promoted is anything but inclusive. By focusing on 4K output, their hardware is wasted for a large majority of potential customers. 4K resolution requires a huge amount of power to render in real time, and the benefits are dubious even if you are one of the few with a compatible TV.
Most games on the regular PS4 and Xbox One run at 30 frames per second in 1080p resolution, or close to it. On the PS4 Pro and Xbox One X, you’re mostly getting the same thing just rendered with more pixels, regardless of what TV the console is hooked up to. Even if you have a 4K TV and are looking for a way to make use of it, this feels like the wrong way to spend the silicon.
What would be the right way? As it turns out, a platform does exist that both offers more power and lets you choose how to use it: the PC. And if there are any PC gamers out there who attempt to run games at 4K, 30 frames a second, and with Xbox One levels of graphical detail, well, I’ve yet to meet them. In my experience, most players on PC consider 60fps table stakes and will tweak settings like texture resolution and shadow quality in order to achieve it — or even higher frame rates.
I’m one of them, and I actually had to make this choice a couple of months ago when shopping for a new monitor. (As an aside, it’s really hard to find good PC monitors!) I’d narrowed it down to two options, seemingly the only 27-inch IPS G-Sync models available in Japan: Asus’ PG279Q and PG27AQ. They are more or less identical products, but the former is 2560x1440 at 144Hz (“overclockable” to 165Hz) and the latter is 4K at 60Hz. My PC is powerful enough to play games at 4K, but I ended up going for the 1440p model.
1440p is still a big resolution upgrade over 1080p, but it doesn’t require nearly as much processing power as 4K. And it comes with benefits of its own: 4K monitors are limited to 60Hz right now, but you can get more than double the frame rate at 1440p. G-Sync is the key ingredient here — it matches the monitor’s refresh rate to your GPU’s output, so you get smooth, tear-free motion while displaying every single frame your PC is capable of processing each second. The effect is honestly transformative: fast-paced games feel almost surreally responsive, to the point where it’s very hard to go back. (AMD has similar monitor technology called FreeSync, and Apple made “ProMotion” adaptive refresh rates the headline feature of its new iPad Pro.)
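To put that trade-off in rough numbers, here’s some back-of-the-envelope arithmetic of my own (pixel count isn’t a perfect proxy for GPU cost, but it’s a reasonable first-order one):

```python
# Rough pixel-throughput comparison for common display targets.
# These figures are my own arithmetic, not vendor benchmarks.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x 1080p)")

# Pixels the GPU must fill each second at typical targets:
targets = {
    "4K @ 30fps":    3840 * 2160 * 30,
    "4K @ 60fps":    3840 * 2160 * 60,
    "1440p @ 144fps": 2560 * 1440 * 144,
}
for name, throughput in targets.items():
    print(f"{name}: {throughput:,} pixels/sec")
```

4K is exactly four times the pixels of 1080p per frame, while 1440p is only about 1.78x — and notably, 1440p at 144fps actually pushes slightly more pixels per second than 4K at 60fps, just spent on motion rather than density.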
But all this talk of 144Hz is probably in the weeds when the PS4 Pro and Xbox One X aren’t even targeting 60fps most of the time, outliers like racers and fighting games aside. My personal opinion is that 60fps makes a vastly bigger difference to the actual experience of playing games than 4K resolution — just look at Microsoft’s own Halo 5, which appeared to have been entirely designed around this principle — and I would be happy to buy updated PS4 or Xbox One models that focused on this aspect of performance. Unfortunately, that’s not what we’ve got.
And for 1080p TV owners, it’s disingenuous to suggest that these products will deliver a notably better experience, as Microsoft’s Dave McCarthy did last week at E3. “I wouldn't say from a 1080p TV perspective you're going to be all that disappointed either, right?” he told The Verge. “I mean, you have automatic supersampling from the Xbox One X to your 1080p TV. It's still going to look pretty damn amazing.” If you’re not familiar with supersampling, it basically means rendering the image at a higher resolution than your screen can display, then scaling it back down. It can reduce artifacts like aliasing, but it’s a blunt approach to boosting image quality that makes very little sense for the hardware. It’s not a choice I can imagine many PC gamers with 1080p monitors making when they still have headroom to improve graphical effects or frame rate.
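Here’s a minimal sketch of the general idea — not how the Xbox One X actually implements it — using the simplest possible downscale, a 2x2 box filter on a grayscale image:

```python
# Supersampling in miniature: render at 2x the display resolution,
# then average each 2x2 block down to one output pixel.
# A box filter is the crudest choice; real renderers use better ones.

def downsample_2x(image):
    """image: 2D list of grayscale values at 2x the target resolution."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block_sum = (image[y][x] + image[y][x + 1] +
                         image[y + 1][x] + image[y + 1][x + 1])
            row.append(block_sum / 4)  # average the 2x2 block
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]

# ...comes out with intermediate values along the edge, which is
# what softens the "jaggies" aliasing produces.
print(downsample_2x(hi_res))  # → [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 in the output is the point: the staircase edge gets a blended pixel instead of a hard step, which is the modest quality gain McCarthy is describing.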
To some degree, Microsoft and Sony have been restricted by their original console designs. Both new systems are still built around low-power Jaguar CPU cores, originally used in mid-range laptops, and while the Pro and X’s boosted GPUs are helpful for rendering higher resolutions, the relatively weedy CPUs are likely to limit the degree to which frame rates can be increased. On a fixed-refresh TV without adaptive sync, anything between a locked 30fps and a locked 60fps means torn or juddering frames, so developers would have to hit a solid 60 — and that may be a stretch for many games even on the Xbox One X. Could Sony and Microsoft have made more fundamental improvements to their CPUs as well? Maybe, but almost certainly not without significant implications for compatibility.
But that’s not to say that 4K is the only way to improve visuals. PC games at 1080p look much better than PS4 and Xbox One games, owing to the better effects made possible by more powerful GPUs. And even if you do have a 4K TV, the biggest difference you’ll see will come from HDR, not resolution — a feature already possible on the cheaper Xbox One S and regular PS4. The PS4 Pro at least has a good reason to push more pixels if you own a PlayStation VR headset, where the extra resolution really can make a tangible difference to image quality, but this E3 Microsoft downplayed the prospects of VR on its console platform.
To be clear, neither Microsoft nor Sony are mandating that developers work on 4K output — studios are free to use the extra power to deliver better 1080p performance if that’s what they want to do. But the design and positioning of these systems makes it a lot easier and more desirable to concentrate on resolution at the expense of all else. It’ll be riskier and probably more time-consuming to work on a pristine 1080p Xbox One X release when Microsoft has pushed the “native 4K” message so strongly, even if ultimately it would make a more noticeable difference to consumers.
It’s getting harder to buy a non-4K TV these days, and it makes marketing sense to cater to people who don’t have much content that can give their new sets a workout. But I worry that the focus on resolution above all else is going to hold back game development overall. Sony clearly overpromised when it made 1080p a selling point for the PS3, and the vast majority of developers ended up targeting 720p on that system and the Xbox 360. This time the resolution bump is far less profound, yet we’re met with hardware seemingly not designed to chase after anything else.
I used to spend the vast majority of my gaming time on consoles, and I would have been very open to picking up more powerful versions. But I just don’t see the value proposition that Microsoft and Sony are putting forward here for most people. I hope one day we see the PlayStation 5 and Xbox Two come out with designs that focus on performance, not pixels. Until then, though, I think I’m going to be getting a lot more use out of my PC.