How Nvidia's GeForce Experience harnesses meat and machine to make default game settings gorgeous

Gallery Photo: GeForce GTX 680 vs. Radeon HD 7970

In April, Nvidia CEO Jen-Hsun Huang told us that the days of painstakingly tweaking PC game settings to get a decent gaming experience could soon be over. With GeForce Experience, he said, the company would use supercomputers to figure out the optimal settings for each game, tailored to each hardware configuration. Today, the tool enters closed beta, supporting around 30 games to start. We got to try it out for ourselves this week, and sat down with some Nvidia engineers to see how it works.

First of all, you should probably know that Nvidia isn't actually scanning your laptop or desktop with a cloud-based supercomputer and magically calculating the results. It's mostly human labor: Nvidia's game testers play through the hottest new games to figure out just how demanding they are and what kind of framerate is acceptable, and a committee of graphics experts decides just how important each game setting (including jargon-filled terms like Screen Space Ambient Occlusion) will be to the quality of the final moving images.

When you launch the GeForce Experience client, Nvidia's supercomputers merely attempt to match your PC's hardware components against the thousands of combinations that Nvidia engineers have plugged into data centers in Santa Clara and Moscow, then tell you which settings should give you the best image quality while maintaining a target framerate of 40-60 frames per second (with a 25FPS minimum). To start, the GeForce Experience supports Nvidia's current-gen Kepler and last-gen Fermi GPUs, desktop and mobile alike, but not SLI (save the GTX 690) or stereoscopic 3D. Nvidia selectively disables portions of the desktop GPUs in its datacenters to emulate the mobile ones.
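Conceptually, the matching step boils down to a table lookup plus a quality-first search: find the detected GPU in a precomputed benchmark table, then take the highest-quality preset that still holds the framerate target. Here's a minimal sketch in Python; the table entries, preset names, and fps figures are invented for illustration (only the 40-60FPS target and 25FPS floor come from Nvidia).

```python
# Hypothetical sketch of the lookup GeForce Experience performs: match the
# detected hardware against precomputed benchmark data, then pick the
# highest-quality preset that sustains the frame-rate target.
# GPU names, presets, and fps numbers below are illustrative, not Nvidia's data.

TARGET_FPS = 40   # Nvidia's stated target band is 40-60 fps
MINIMUM_FPS = 25  # with a hard 25 fps floor

# Precomputed benchmark table: (gpu, preset) -> measured average fps
BENCHMARKS = {
    ("GTX 660 Ti", "ultra"): 38,
    ("GTX 660 Ti", "high"): 52,
    ("GTX 660 Ti", "medium"): 74,
    ("GT 640M LE", "high"): 19,
    ("GT 640M LE", "medium"): 31,
    ("GT 640M LE", "low"): 48,
}

QUALITY_ORDER = ["ultra", "high", "medium", "low"]  # best first

def pick_settings(gpu):
    """Return the best-looking preset that holds the target frame rate."""
    for preset in QUALITY_ORDER:
        fps = BENCHMARKS.get((gpu, preset))
        if fps is not None and fps >= TARGET_FPS:
            return preset
    # Nothing hits the target band; accept anything above the hard floor.
    for preset in QUALITY_ORDER:
        fps = BENCHMARKS.get((gpu, preset))
        if fps is not None and fps >= MINIMUM_FPS:
            return preset
    return None  # hardware below minimum requirements

print(pick_settings("GTX 660 Ti"))  # "high": first preset at or above 40 fps
print(pick_settings("GT 640M LE"))  # "low"
```

The real system presumably keys on far more than the GPU (CPU, resolution, per-setting weights from the human committee), but the quality-first search under a framerate constraint is the core idea.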

When you start up the GeForce Experience, you'll find yourself confronted with an experience a good bit like Valve's Steam. It's a game launcher, with a list of supported games on the left for you to choose from. The software will scan your computer for games, and though it might not find them all right away, we noticed that it does root through multiple hard drives for Program Files, Steam, and directories that include the word "game" in their titles. It tells you whether your computer meets each game's minimum hardware requirements, checks that your Nvidia drivers and profiles are up to date, and downloads new ones in the background to save you time.
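The game-scanning behavior described above can be sketched as a simple directory walk. The folder-name heuristics here are our guess at the behavior we observed, not Nvidia's actual logic:

```python
# A rough sketch of game detection: walk the top-level directories of a drive
# and keep those that look game-related. The directory names and the "game"
# substring heuristic come from the behavior described in the article; the
# rest is an assumption for illustration.
from pathlib import Path

KNOWN_DIRS = {"program files", "program files (x86)", "steam", "steamapps"}

def looks_like_game_dir(path):
    name = path.name.lower()
    return name in KNOWN_DIRS or "game" in name

def scan_drive(root):
    """Return the top-level directories under `root` worth searching for games."""
    root = Path(root)
    return [p for p in root.iterdir() if p.is_dir() and looks_like_game_dir(p)]
```

A real scanner would then recurse into the candidates and match known game executables against a database, which is presumably where the per-game profiles come in.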

The stars of the show, though, are the two panels on the right. You get a list of every tweakable setting in a game, what those sliders and toggles are currently set to, and what Nvidia thinks they should be for the best results. You get some sample images of the game that point out exactly what those settings actually mean (protip: tesselation rocks). And then, with the push of a single button, you can optimize all those settings at once. Nvidia actually modifies the game's configuration files, so if you want to stop using GeForce Experience at that point, you can still launch the game any other way and have your settings intact.
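Since the tool writes directly into each game's own configuration files, the "optimize" step amounts to patching a settings file on disk, which is why the settings survive even if you stop using GeForce Experience. A toy sketch, assuming an INI-style config with a hypothetical [Graphics] section and invented keys:

```python
# Minimal sketch of writing recommended settings back into a game's own
# config file. The INI format, section name, and keys are invented for
# illustration; real games use all sorts of config formats.
import configparser

def apply_settings(config_path, recommended):
    """Merge the recommended graphics settings into the game's config file."""
    cfg = configparser.ConfigParser()
    cfg.read(config_path)  # silently skips the file if it doesn't exist yet
    if not cfg.has_section("Graphics"):
        cfg.add_section("Graphics")
    for key, value in recommended.items():
        cfg.set("Graphics", key, value)
    with open(config_path, "w") as f:
        cfg.write(f)
```

Because the change lives in the game's own file rather than in a driver-side override, any launcher (Steam, a desktop shortcut) picks up the same settings.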

Now, here's the tricky part. Nvidia doesn't claim that it will have the very best settings for every configuration; if you tweak things on your own, you still might be able to do better. What Nvidia's trying to do here is give you a much, much better experience than you'd get right out of the box, better than the game's defaults or its automatic modes, which often fail to recognize newer hardware and fall back to lowest-common-denominator settings that look awful and slow things down. "It took a lot of brainpower to be a PC gamer," an Nvidia engineer explained.

[Screenshot: Call of Duty: Modern Warfare 3]

When it works, it works really well. Nvidia showed us Borderlands 2 and Call of Duty: Modern Warfare 3 running on a Core i7 PC with a GeForce GTX 660 Ti inside, and by default, both looked pretty bad. Borderlands 2 was running like it was constrained by Xbox hardware, with a fixed low framerate, rough edges, and missing eye candy everywhere. One click made it gorgeous. Modern Warfare 3, as you can see in the images above and below, had its global graphics setting at minimum, so it was rendering the entire game at a much lower resolution than the 1080p monitor we were using. One click made it better. Mind you, in that case Nvidia was merely correcting a simple mistake that any seasoned gamer could easily find and fix as well, but according to the company, many gamers never bother to look at settings at all.

[Screenshot: Call of Duty: Modern Warfare 3]

On a laptop or older desktop, the problem might not always be that a game isn't taking full advantage of all that it can do. Instead, it could be that the game settings need to be balanced just right to work properly. Nvidia showed us Crysis 2 running on a gaming notebook where the default settings were actually turned too far up, and yet the game was running at a 4:3 aspect ratio on a 16:9 monitor, with huge ugly black bars on either side. It didn't run well. One click, and the GeForce Experience lowered the settings, properly stretched out the game across the full monitor, and got a playable framerate in the bargain.

[Screenshot: Nvidia's suggested Battlefield 3 settings on the GT 640M LE]

Unfortunately, it's not all peaches and cream quite yet. As we discovered when we fired up the GeForce Experience on our home desktop and laptop, it's definitely a beta, and Nvidia still has a lot of work to do. For instance, when we ran the GeForce Experience on our Acer Aspire M5 review unit, with a GeForce GT 640M LE inside, we were told the computer didn't meet the requirements of any of our games, even the ones we'd run playably in our review. The tool also gave us asinine suggestions, like turning the resolution down to 1024 x 768 and pumping up the anti-aliasing.

With several Steam games, the GeForce Experience launcher didn't seem to realize that Steam needed to be running in order to launch games, leading several to hang or crash to desktop as a result. It also uses quite a bit of memory at the moment, consuming over 200MB on both our systems, and it doesn't automatically close when games are launched. Take a look at Nvidia's suggested settings for Battlefield 3 on the GT 640M LE in the screenshot above (or here), and the game's automatic settings below (or here), to see what Nvidia is up against.

[Screenshot: Battlefield 3's automatic settings on the GT 640M LE]

We're not going to judge a beta too harshly, but the GeForce Experience will depend heavily on things we can't see right now: how quickly Nvidia can add new games, how many games Nvidia adds, and how well it can optimize for hardware configurations that aren't bog-standard. Nvidia promises to support every new Nvidia GPU, and says the department has enough employees to add a handful of new games a month. It depends on developers continuing to make bad (or quickly outdated) decisions about how their games autodetect settings, so that Nvidia can have something to correct. It also depends a little bit on whether gamers agree with the decisions Nvidia makes about what looks good, when the hardware isn't capable enough to handle everything.

This is a huge step in the right direction, though, and we're really looking forward to the idea of just popping a game into our PC and hitting play. We want the "performance of a PC with the simplicity of a console" that Nvidia promises GeForce Experience will be. We're going to keep on testing GeForce Experience... and here's hoping we see something similar soon from rival AMD.