After all the hype and hoopla of last month's E3, you might be forgiven for thinking that consoles are the only way to game. And yet, the stalwart PC continues to defy exaggerated reports of its demise with hugely popular series like Battlefield, Crysis, Diablo, StarCraft, and Total War offering gaming experiences that simply cannot be touched by consoles.
It’s not hard to see why gamers would opt to stick with the trusty old desktop tower: PCs offer the highest resolutions, the greatest customizability, and the added flexibility of letting you play with multiple monitors. It was for these reasons that we built the Verge Gaming Rig in December, stretching a $999 budget into a comprehensively specced machine that could handle pretty much anything at a 1080p resolution.
Today we’re bringing the Gaming Rig back for a review of its performance in the latest games and a comparison against two hot new contenders: the Nvidia GeForce GTX 680 and AMD Radeon HD 7970. Yes, we’ve gotten ahold of the two most fearsome single-GPU cards on the market and we’re not afraid to use them to crush our previous benchmark records. The only question is: if our PC was already a strong performer, what value can a souped-up new graphics card really add? Read on for the answer.

The 28nm contenders

Though it was highly tempting to start off our original Gaming Rig with the best available graphics card and build around it, we held back in December and opted for the extremely good value of the $215 Gigabyte GeForce GTX 560 Ti. The chief reason for our restraint was the knowledge that both Nvidia and AMD were planning major overhauls to their graphics card lines, moving them to new architectures and a smaller 28nm process in early 2012. Not every GPU refresh is worth waiting for — oftentimes old chips are cynically "updated" with nothing more than new part numbers, particularly in laptop graphics — but when the manufacturing process is being downsized, you should pay close attention. A die shrink allows chip designers to fit more transistors into the same space, or the same number of transistors into a smaller space, resulting in either more performance for a given power and thermal envelope or lower power consumption at the same output.
Performance improvements are just the tip of the iceberg: these cards can now regulate their own clock speeds
No longer content with having to choose between all-out performance and efficiency, both AMD and Nvidia went for major redesigns with their latest generation cards. The HD 7970 and GTX 680 introduce new built-in sensors to detect the card’s power, temperature and utilization, and dynamically adjust clock speeds to match your use — throttling themselves way down when you’re just on the desktop reading a webpage, or squeezing out every available watt when pushed to render all the wrinkles on Max Payne’s face. This new level of intelligence, along with the predictable bump in performance stemming from the move from 40nm to 28nm, makes the current generation of graphics cards a truly compelling one to explore for anyone considering an upgrade. Anticipating the arrival of these cards was the reason we didn’t take the plunge into the high end market of late 2011 — has the wait been worth it?
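That self-regulating behavior can be pictured as a simple control loop. The sketch below is a toy model only — the clock bins, power budget, and thresholds are invented for illustration and bear no relation to Nvidia's or AMD's actual firmware logic:

```python
# Toy model of dynamic GPU clock management: each tick, the card reads its
# sensors and picks the highest clock its power and thermal budget allows.
# All numbers here are hypothetical, not real GTX 680 or HD 7970 values.

POWER_BUDGET_W = 195     # hypothetical board power limit
TEMP_LIMIT_C = 80        # hypothetical throttle temperature
CLOCK_STEPS_MHZ = [324, 706, 1110]  # idle bin, base clock, boost clock

def pick_clock(utilization: float, power_draw_w: float, temp_c: float) -> int:
    """Return the clock bin (MHz) the current conditions allow."""
    if utilization < 0.05:
        return CLOCK_STEPS_MHZ[0]    # desktop idle: drop to the minimum clock
    if power_draw_w < POWER_BUDGET_W and temp_c < TEMP_LIMIT_C:
        return CLOCK_STEPS_MHZ[-1]   # headroom available: run at boost clock
    return CLOCK_STEPS_MHZ[1]        # at a limit: hold the guaranteed base clock

print(pick_clock(0.02, 15, 35))   # reading a webpage -> 324
print(pick_clock(0.99, 180, 70))  # gaming, under budget -> 1110
print(pick_clock(0.99, 200, 85))  # over budget -> 706
```

The real cards adjust across many more clock steps and far faster than any software loop could, but the principle — sensors in, clock decision out — is the same.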
Installation

Modern graphics cards are basically self-contained computers that fit inside your PC



Before we delve into the experience of plugging the new $499+ graphics cards into our system, a few words are merited on how the Gaming Rig has held up over the past six months. The good news is that all internal components have performed flawlessly, giving us consistently trouble-free operation. If we could redo anything, it would be to choose a higher-quality case. The Fractal Design Core 3000 that we went for originally was a definite budget buy, but we’ve had a number of issues with it, particularly when it comes to swapping components.
The side cover is slightly warped and doesn’t fit correctly, one of the bundled fans developed an annoying whine at regular operational speeds leading us to disconnect it, and the hard drive holders that slot into the provided cages bend out of shape easily. Still, this case does provide ample air circulation, including perforated grilles on its front, back, top, and bottom, and we were pleasantly surprised to see it didn’t turn into a dust haven after its first half year on the job.
The GTX 680 is a high-end card with the space and power requirements of a mid-ranger
As to the new graphics cards, Nvidia’s GTX 680 slotted effortlessly into the spot vacated by our GTX 560 Ti. Like its less powerful elder sibling, the 680 only requires two 6-pin auxiliary power connectors and has a max TDP (Thermal Design Power) of less than 200W. In practice, that means Nvidia’s new flagship GPU has power and heat dissipation demands roughly equivalent to yesteryear’s mid-range card. So you won’t need to buy a massive new case and power supply just to upgrade your graphics.
We had less luck with the Sapphire-branded Radeon HD 7970 from AMD, whose 11.1-inch length forced us to remove one of the hard drive cages to accommodate it. Admittedly, Sapphire’s dual-fan cooling solution and elaborate heatsink design are not the stock AMD configuration, but you’re unlikely to find a variant of this GPU that fits comfortably into a regular case. The HD 7970 also falls behind the GTX 680 in a couple of other areas. Firstly, it requires more power: a 6-pin and an 8-pin connector, with a higher maximum board power of 250W — although that matters less now that the new generation of cards can monitor and manage their own power consumption. Secondly, its selection of ports is less useful: the GTX 680 comes with two dual-link DVI outputs, one HDMI, and one DisplayPort, whereas the HD 7970 gives you one DVI, one HDMI, and two Mini DisplayPorts. The latter can be adapted to DVI and full-size DisplayPort, but that’s not as convenient as having both on the back of your PC. Finally, all three cards are dual-slot designs: they plug into one PCI Express x16 slot (the new GPUs support PCIe 3.0), but their coolers take up the space of the adjacent slot as well.

The Games
None of this shiny new hardware would mean much without the software to truly push it to its limits, so it's about time we stopped fiddling with our case and got down to the heart of the matter: benchmarking the games we all want to play.
Below you'll find the average and minimum frame rates we obtained while playing with each card. Quality settings were kept consistent across all tests, but the GTX 560 Ti was only tested on a single monitor, while the two new cards were pushed to three monitors. For the full spec of our base system, check out our original PC building guide from December.
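Average and minimum fps tell different stories, which is why we report both: a healthy average can hide momentary dips that you feel as stutter. A quick sketch — using invented frame times, not our actual benchmark data — shows how:

```python
# Why minimum fps matters: compute average and minimum fps from per-frame
# render times. The frame times below are invented for illustration only.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) for a list of per-frame render times."""
    avg_frame_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_frame_ms = max(frame_times_ms)   # the slowest frame sets the minimum fps
    return 1000 / avg_frame_ms, 1000 / worst_frame_ms

# Mostly smooth 60fps gameplay (16.7ms frames) with one 50ms hitch:
frames = [16.7] * 59 + [50.0]
avg_fps, min_fps = fps_stats(frames)
print(round(avg_fps), round(min_fps))  # -> 58 20
```

One bad frame barely moves the average, but the 20fps minimum is exactly the kind of dip we call out as choppy in the results below.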
Benchmark Legend
GTX 560 Ti at 1920 x 1200
Radeon HD 7970 at 4800 x 900
GTX 680 at 4800 x 900
Batman: Arkham City
GTX 560 Ti: 52/25 · HD 7970: 85/51 · GTX 680: 68/43
One of the few games to resist our attempts to play at max quality and 1080p resolution, Arkham City would only produce playable frame rates on our incumbent GTX 560 Ti when we turned its DirectX 11 processing off. That’s a downer, since DX11 is where the shiniest and prettiest new graphical innovations can be found, but even without them, the game looks very attractive. Needless to say, you can enjoy the DirectX 11 extras with the new generation of cards at that resolution, but what’s truly impressive about them is the ability to expand the non-DX11 gaming experience to three screens. The Radeon HD 7970 and GTX 680 both push a 4800 x 900 resolution with ease when DX11 is switched off, and come decently close to providing playable frame rates with it on: both averaged 29fps at the game’s Extreme setting.
Also, if you’re wondering about how the Gaming Rig might fare with Mass Effect 3, use our Batman: Arkham City numbers as a guide. Both games are built on Epic’s Unreal Engine 3, with the caped crusader title throwing up the tougher GPU challenge.
Battlefield 3
GTX 560 Ti: 53/40 · HD 7970: 42/36 · GTX 680: 48/38
A repeat of our experience with Batman: Arkham City, Battlefield 3 was not smooth enough for our liking with Ultra settings enabled on the GTX 560 Ti, but the new cards fixed that and added the option to play on three monitors with High graphics quality selected. There’s no readily perceivable difference in detail between those two top-end quality settings for this game, so don’t fret about missing out on some infinitesimally small upgrade. The Radeon HD 7970 and the GTX 680 match each other in performance terms, with the Nvidia card also averaging north of 30fps on Ultra mode at 4800 x 900 — it’s just a shame that its 36fps average was spoiled by regular dips down into the 20fps range where gameplay starts to get choppy.
Battlefield 3: now in Ultra detail
Call of Duty: Modern Warfare 3
GTX 560 Ti: 43/40 · HD 7970: DNF · GTX 680: DNF
Although it spent the majority of the 2011 holiday season being compared to Battlefield 3, Modern Warfare 3 is a far less intensive game graphically. That’s because BF3 was designed specifically to push PCs until they cry for mercy, whereas the Call of Duty devs were content to port the console game over. In any case, you can max out everything in MW3 at 1920 x 1200 with the card already present in the Verge Gaming Rig, and our expectation was that you should be able to expand that same quality to three displays with the HD 7970 and GTX 680. Regrettably, in spite of seeming to support the crazy 16:3 aspect ratio and high resolution, the game stretched itself horizontally when we tried to play it on multiple monitors, leaving you without any tangible upgrade over the already competent GTX 560 Ti (unless you’re willing to invest the time and effort required to figure out a workaround).
Crysis 2
GTX 560 Ti: 44/36 · HD 7970: 42/33 · GTX 680: 43/36
Like Modern Warfare 3, Crysis 2 is a game preceded by its reputation, and also like MW3, it’s actually lighter on your system than you might imagine. We were pleasantly surprised to see it maintain perfectly fluid frame rates at its Hardcore graphics setting on our oldie card at 1920 x 1200, which the Radeon HD 7970 and GTX 680 promptly converted into a 4800 x 900 future soldier battlefield. AMD and Nvidia’s new high-end cards produced almost identical frame rates.
The two cards match each other on performance almost every single time
Diablo III
GTX 560 Ti: 50/37 · HD 7970: 96/71 · GTX 680: 52/37
You won’t be shocked to hear that Blizzard’s latest isometric loot parade is handled just fine by our original configuration, never dropping below 37fps at 1920 x 1200. It lacks proper support for tri-screen gaming, but we figured out a way to trick it into playing along. Switching the game to its Fullscreen Windowed mode extended it across our three monitors, giving us a truly widescreen view into Sanctuary, but a few issues as well. Occasionally, large sections of the peripheral displays would, like the gray squares in a mapping application, be left temporarily unrendered, and enemies would randomly pop in and out of view on those screens. From the game's perspective that rendering is happening off-screen, so the behavior is understandable, but it does spoil what's otherwise a pretty fun experience. These problems arose far more often on AMD’s card than Nvidia’s, but the entire solution is hacky and imperfect, whatever your graphics card of choice. Overall, the only in-game lag we encountered was caused by the connection to the Battle.net servers; the GPUs' frame rates were never a problem.
Dirt 3
GTX 560 Ti: 50/41 · HD 7970: 52/47 · GTX 680: 60/52
If you need just one reason to justify upgrading your graphics card, it might be the glint of sunlight gently rolling along the roof of your rally car as you glide through the snow in Dirt 3. This game has a wonderful habit of serving up breathtaking visuals like that on a regular basis, which is ably supported by the GTX 560 Ti at our typical single-display resolution. The Radeon HD 7970 and GTX 680 maintain the same standard of maxing out graphical options, only at the even more impressive 4800 x 900 resolution.
If you're not impressed by a single GPU powering a game across three monitors at the same time, you should be
L.A. Noire
GTX 560 Ti: 30/26 · HD 7970: 29/23 · GTX 680: 30/24
A weird 30fps cap on this game confounded our testing somewhat (given that we’re looking for a 30fps minimum frame rate to consider something playable), but the GTX 560 Ti was once again capable of handling the maximum visual quality at 1920 x 1200. Moving to the newer cards and three monitors was no problem in terms of load; however, there was an issue in how the game dealt with the extreme 16:3 aspect ratio. Instead of filling out the side monitors with more visual information, L.A. Noire expands your regular field of view to fill them, cropping out the top and bottom of the image. That leaves you with a really tight view into the game, one which you may consider playable, but not comfortably so.
Max Payne is more grizzled than ever, with every wrinkle lovingly recreated
Max Payne 3
GTX 560 Ti: 49/33 · HD 7970: 44/36 · GTX 680: 41/31
None of our cards were capable of maxing out all the graphics options on this DirectX 11 title at 1920 x 1200, but the last few notches that remain out of reach represent, as with Battlefield 3, extremely small upgrades that most people won’t ever notice. Our first attempt at starting up Max Payne 3 with the Radeon HD 7970 in place was met with a crash, as were our subsequent dozen tries until we figured out the problem. AMD’s Catalyst Control Center — the very software you need to configure your multi-monitor setup — was causing some sort of incompatibility that made the game crash repeatedly. Although we were running the latest drivers, the version of CCC on our machine was the one that shipped in the box with the GPU, and once we wiped it and reinstalled the latest release, everything worked correctly, leading to an even sweeter 4800 x 900 Max Payne 3 experience at nearly maxed-out settings. Still, this is a major black mark for AMD’s software team. The GeForce GTX 680 kept up in the performance stakes without causing any such headaches.
Rage
GTX 560 Ti: 48/35 · HD 7970: 56/50 · GTX 680: 55/39
John Carmack’s latest contribution to the realm of PC graphics isn’t as daunting as Doom 3, which rendered everyone’s gaming rig obsolete upon its release. Rage is pretty, but also smart about where it uses its graphical flourishes, resulting in a balanced game that most people should be able to play at reasonable quality. Our trusty GTX 560 Ti did better than that, keeping consistently high frame rates at almost maximum quality. AMD’s Radeon HD 7970 had no problem matching that performance while adding two further displays, a feat duplicated by Nvidia’s GTX 680.
It's not like the old days when a single game could bring your PC to its knees
The Elder Scrolls V: Skyrim
GTX 560 Ti: 49/37 · HD 7970: 45/27 · GTX 680: DNF
Guess which superbly popular Bethesda game doesn’t have triple monitor support. Go on, guess. Our default GPU keeps up the trend of playing Skyrim at nearly full quality — once you turn off the memory-intensive antialiasing, you can have all the extras and Ultra options you like — but the HD 7970 and GTX 680 are stuck offering only minute improvements while locked down to a single-screen resolution.
We managed to dig up a forum thread with a solution to this little omission, but it only worked with AMD’s card: you simply set your ultra-widescreen resolution in the game’s config file and, although the menus will appear heavily zoomed-in, the actual gameplay adapts perfectly well to 4800 x 900. You’d be right to recoil at the thought of digging through your Documents folder to mess around with unfamiliar files, but that seems to be the state of multi-monitor gaming at the moment: you have to be prepared to hack your way to success.
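For reference, the gist of that tweak looks something like the following — a sketch assuming the stock SkyrimPrefs.ini layout, so back the file up and check your own copy before editing anything:

```ini
; Documents\My Games\Skyrim\SkyrimPrefs.ini (make a backup first)
[Display]
iSize W=4800
iSize H=900
```

Set the width and height to match your combined desktop resolution; the game reads these values at launch in place of anything you pick in its own settings menu.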
The Witcher 2
GTX 560 Ti: 36/28 · HD 7970: DNF · GTX 680: DNF
Another no-show for the multi-display support party. As we noted in the original Gaming Rig writeup, The Witcher 2 can be played with almost everything turned on, but its Ultra setting and the brutally punishing Ubersampling remain out of reach — even with the shiny new HD 7970 and GTX 680. Nvidia’s latest GPU comes close with a 32fps average frame rate, which translates to decent, usually smooth gameplay, but it occasionally dips down to 20fps, which detracts from gameplay fluidity. Given the choice between the highest quality setting and a consistently smooth frame rate, we’ll take the smooth frame rate every time. This shortfall is actually a limitation of our overall system specification, with the bottleneck clearly lying outside the graphics cards, but the bigger issue here is that a game lauded for its graphical chops can’t be played on three displays.

Don’t let anyone tell you IPS monitors aren’t fast enough for gaming
IPS displays’ slow response times have traditionally made them unsuitable as gaming monitors, but times have changed. The latest generation of IPS panels is now quick enough to keep up with fast FPS action without introducing perceptible lag, while wide color gamuts and viewing angles make for the best visual experience you can presently get. Why buy a $499 graphics card and then plug it into a mediocre LCD screen?
Most of our testing was done using a 24-inch LaCie 324i and a 24-inch Dell U2410, both spanning 1920 x 1200 in resolution and offering an attractive, evenly lit picture. If there’s any downside to going with IPS for your multi-monitor setup, it’s that the bezels are not as thin as you can get elsewhere and the displays themselves are a bit bulkier. That, and the price of a complete set of three high-quality displays will likely be eye-watering.

Multi-monitor gaming

Alas, it’s not all smooth sailing
Caveat emptor
The new generation of multi-monitor-gaming-on-a-single-card GPUs is indeed impressive, but if you want to make that $499 (and above) leap, bear in mind that you'll have to upgrade more than just the card itself. You’ll likely need new monitors, and don't assume you'll get away with the same old bureau, either: multi-display gaming requires a space closer to a dining table than a desk. Monitors need to match in screen size and resolution — ideally they should be identical, full stop, so that bezel gaps and positioning line up — and have the thinnest possible bezels to maximize immersion.
The overwhelming conclusion from our testing was that the major upgrade when moving to a card in the $499 tier comes from vastly expanding your playing canvas: 1920 x 1200 on one monitor offers 2.3 megapixels of resolution, whereas 4800 x 900 across three raises that to 4.32 megapixels. That’s nearly double the pixels while sacrificing nothing in terms of visual fidelity, a staggering feat achieved by both the Radeon HD 7970 and GeForce GTX 680. We haven’t discussed tri-screen gaming on the GTX 560 Ti we already own for one simple reason: the card doesn’t support it. You’d need to buy a second GPU to work in SLI with this card in order to power a game across three displays. While that might be a reasonable (and more budget-friendly) solution for many people, we opted to focus on what the true high-end cards can do, particularly since this 28nm generation is the first to really promise tri-screen gaming from a single card and deliver it.
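The canvas arithmetic above, spelled out — nothing here is measured, it's just the raw pixel counts:

```python
# Pixel-count comparison: one 1920 x 1200 monitor vs. three monitors
# spanning 4800 x 900 (i.e. three 1600 x 900 panels side by side).
single = 1920 * 1200   # 2,304,000 pixels, ~2.3 megapixels
triple = 4800 * 900    # 4,320,000 pixels, ~4.32 megapixels

print(single, triple, round(triple / single, 2))  # -> 2304000 4320000 1.88
```

The triple-wide canvas works out to 1.875 times the pixels of the single display — "nearly double", as above — and every one of those extra pixels has to be rendered by the same single GPU.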
Alas, it’s not all smooth sailing, as a number of games exhibit either incompatibilities or weird behavior when plugged into a trifecta of displays. You’d do well to make sure that your favorite game supports tri-screen output with the particular graphics card you have in mind before deciding to pull the trigger on making a new purchase. Other games we tested, such as Driver: San Francisco and NBA 2K12, also exhibited problems: the former completely refused to recognize any resolution above the standard single display options, while the latter emulated L.A. Noire in cropping the top and bottom off its field of view to fill the full width of our triple-widescreen setup.
The issue with NBA 2K12 stems from its rather two-dimensional perspective — if the game tries to keep the action in focus on the center screen, when it moves closer to either basket it’ll be forced to draw parts of the crowd that it’s probably not designed to display. We can imagine the same issue affecting football sims (of both the American and European variety), with first-person games like Crysis 2 representing the other extreme in being best suited to a super widescreen view.
Is it worth it?
Triple-monitor gaming is simply amazing. A sensory revelation. You still spend the majority of the time with your eyes anchored on the middle screen, but the sense of atmosphere that comes from the two auxiliary displays is spine-tinglingly good. After a while, you may even learn to look around using your neck rather than the mouse. In truth, this is how any sort of visual simulation is supposed to operate. The tight horizontal field of view of a typical monitor, no matter how resplendent its color reproduction may be, is just unnatural — human beings have peripheral vision, which has gone (mostly) neglected, and it’s only once you move to this sort of surround view that you’ll understand what you’ve been missing.
Remember, gaming is, in and of itself, a luxury activity. So when you're trying to make a value judgment about money spent on equipment for playing games, the thing you're really measuring is how extravagant you want to be in satisfying your itch to play. You're not weighing up the purchase of food versus booze here, you're just trying to decide how much you're willing to spend to get your fix of intoxication.
Wrap-up
Bottom line: if you don’t plan on gaming on multiple monitors, don’t buy the top-end card

With their new 28nm cards, Nvidia and AMD are making the biggest generational leap we’ve seen in graphics cards for a good number of years. Dynamic overclocking, self-aware power management, and the ability to play almost any game across three displays are all novelties we’ve not seen in one card before. The last item on that list is clearly the only good reason for anyone with a midrange 2011 GPU to make the upgrade. Before you do it, however, think carefully about the expense — the GeForce GTX 680 starts at $499, and you’d probably need to spend just as much on the right set of matching, thin-bezel monitors to go with it. On the other hand, if you need extra reasons to justify a tri-screen setup, consider that there are plenty of good uses for it outside of gaming, such as using Photoshop with the peripheral displays dedicated to your tools and palettes.
AMD Radeon HD 7970

The Radeon HD 7970 started life with a higher MSRP than its Nvidia nemesis — it's now actually cheaper, but these prices fluctuate regularly — chews up more power, and needs more space inside your case. It has 3GB of onboard memory, but we couldn't tell the difference between that and the GTX 680’s 2GB: with our reasonably specced midrange rig, we hit other performance bottlenecks before we could push the card to use a video buffer larger than 2GB. The fact that the Catalyst Control Center caused Max Payne 3 to malfunction was a real disappointment, particularly as the difference between these cards increasingly comes down to drivers and software rather than pure firepower. Our Sapphire variant of this card is also equipped with a noisier cooler than the GTX 680, making the final choice between these two heavyweights a relatively straightforward one.
The Verge Score: 6
Nvidia GeForce GTX 680

Nvidia’s GTX 680 is an incredibly elegant upgrade. Jen-Hsun Huang’s company has taken the power requirements of its new flagship GPU right down to the level of its mid-range cards, while at the same time upgrading performance, features, specs, and capabilities. Basically, it’s exactly what AMD has done with the HD 7970, but better. For our gaming rig, replacing the GTX 560 Ti with the GTX 680 was an absolute cinch, with the new card working quietly and reliably from the first moment to the last. More affordable cards are in the pipeline from both Nvidia and AMD that will offer better value for money, but if you’re keen to get the very best single-GPU card on the market today, this is the one to choose.
The Verge Score: 9

