We're at a place in PC gaming where buying a top-of-the-line video card is starting to look interesting again. A few things made that happen. First, 4K monitors finally became a reasonable purchase for desktop users, with the release of 60Hz IPS panels like the Dell 2715Q I've been using. 1080p 60Hz gaming doesn't require a $500 GPU, but the horsepower is welcome when gaming at 4K, or at 1440p at 144Hz. Second, we know that impending virtual reality gaming on the PC is going to require fast graphics--90Hz is the baseline refresh rate for both Oculus and SteamVR, and we're expecting displays of at least 1080p from both. For high-end gamers, performance is a practical need once again; extra frames aren't just for show.
Nvidia's GeForce GTX 980 card seemed to address that need. It's both powerful and power-efficient, thanks to its second-generation Maxwell GM204 core, and its launch was well received by reviewers and gamers alike (aside from the GTX 970's recent memory revelations). And while I still think the GTX 980 is a great buy for anyone building a new high-end PC, it's no longer the best option available. That title now belongs to Nvidia's new Titan X, which goes on sale this week. I've spent the past week testing one for 4K gaming.
The GeForce Titan X
I'm not going to dive into the deep technical attributes of the Titan X; what you should know is that, on paper, it's a 50% bump up from the GTX 980. There are 50% more CUDA cores (3072 vs. 2048), 50% more texture units, and 50% more transistors. Essentially, it's a fully loaded Maxwell GPU (GM200), and Nvidia even packed 12GB of GDDR5 memory into the thing for future-proofing. That's more than enough for future ports of next-gen console games (surpassing the PS4's 8GB of GDDR5).
As with previous Titan-class GPUs, packing so many CUDA cores into a single die offsets the need for a high core clock--the Titan X starts at 1000MHz and boosts to 1075MHz, compared to the GTX 980's 1216MHz at load. Those conservative clocks are necessary to keep the thermal load at a "reasonable" 250 watts, which is in line with past Titan cards and the power-hungry Kepler-class GTX 780. It also means you don't get as much overclocking headroom with the Titan X as you would with the GTX 980, which sits at a comfortable 165W TDP. And with the same cooling design as the GTX 980, the Titan X is just as quiet at idle as its sibling, and only very slightly louder at load. Maxwell's efficiencies don't go to waste here. The upshot is that the Titan X relies on more cores instead of higher clock speeds for performance. It's a scaled-up version of the GTX 980's GPU--the largest Nvidia has made so far--built to squeeze out the frames needed for smooth 4K gaming. It also costs almost twice as much as a GTX 980, at $1000.
So let's take a look at some benchmarks and see what a thousand dollars of video card gets you today.
| Benchmark | GeForce GTX 980 | GeForce GTX Titan X |
| --- | --- | --- |
| 3DMark Fire Strike (4K) | 3011 | 4063 |
| Batman: Arkham City GOTY | 49fps | 58fps |
| Metro: Last Light (no AA) | 31.1fps | 36.4fps |
| Tomb Raider (2013) | 30.8fps | 42fps |
In my time with the GTX 980 and a 4K monitor, I've tried to run most games at native resolution with as many features turned on as possible. Usually I end up having to turn off anti-aliasing completely to get playable framerates, and few games hit that 60fps mark. On my Dell, that means occasional stuttering.
Running the same benchmarks and games on the Titan X yields better framerates, as expected--to the tune of a 20-40% improvement, depending on the game. The thing to note, though, is that the games that benefit the least are the ones that were already running pretty well at 4K: Batman: Arkham City, BioShock Infinite, and DiRT Showdown. The extra 10fps in those games is welcome, and pushes them right up to the 60fps milestone. The games that previously hovered closer to 30fps also get a boost, into the 40-50fps range. An average of 40fps is nothing to scoff at when we're talking about maxing out settings at 4K, but it's still not what I would consider a comfortable framerate. I'm still going to turn down some graphics features.
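If you want to see exactly where each benchmark lands, the per-game gains are simple arithmetic on the numbers from the table above. Here's a minimal sketch (the figures are from this review's own table; the dictionary name and structure are just for illustration):

```python
# Percent improvement of the Titan X over the GTX 980,
# using the benchmark numbers from the table above.
benchmarks = {
    "3DMark Fire Strike (4K)":   (3011, 4063),   # scores
    "Batman: Arkham City GOTY":  (49,   58),     # fps
    "Metro: Last Light (no AA)": (31.1, 36.4),   # fps
    "Tomb Raider (2013)":        (30.8, 42),     # fps
}

for name, (gtx980, titan_x) in benchmarks.items():
    gain = (titan_x - gtx980) / gtx980 * 100
    print(f"{name}: +{gain:.1f}%")
# 3DMark Fire Strike (4K): +34.9%
# Batman: Arkham City GOTY: +18.4%
# Metro: Last Light (no AA): +17.0%
# Tomb Raider (2013): +36.4%
```

Note how the two games that were already near 60fps see the smallest relative gains, while the 30fps stragglers get the biggest percentage lift.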
A Note on 4K G-Sync Monitors
One reason Nvidia is really pushing the Titan X as a 4K-suitable GPU is that it's also heavily promoting G-Sync monitors. G-Sync, which carries a small price premium, offers a smoother gaming experience in two ways: it allows you to play games at 144Hz (which is truly awesome for multiplayer shooters), and it can help games running at 40-50fps look smooth on 60Hz monitors. At CES, we saw the first 1440p G-Sync monitor that supported 144Hz--both the GTX 980 and Titan X would pair well with that monitor. On the 4K front, I tested the Titan X on an Acer XB280HK 4K G-Sync monitor, which runs at 60Hz. With G-Sync turned on, a game running at 45fps looked almost as smooth as 60fps does on my Dell panel. The advantages diminished when framerates dipped to 30fps or below, as in my Metro: Last Light test.
I'm sold on G-Sync's smoothing capabilities, but I still hesitate to recommend one of these monitors if you do much more on your PC than gaming. That's because the majority of G-Sync monitors use TN panels, which are OK for gaming but really not ideal for desktop imaging work. Colors are more washed out, and the viewing angles are borderline unacceptable--I have to keep my head fixed at the center of the screen to keep small black text legible. Acer's XB270HU is much more interesting, since it's a 1440p IPS panel that also has G-Sync and 144Hz. But with far fewer pixels to push than a 4K display, I don't think you'd need a Titan X for smooth gaming on it.
Finally, Nvidia shared more details about its VR Direct initiative, which includes its implementation of asynchronous timewarp to reduce judder in poor-performing VR games. Maxwell cards will support that, as well as VR SLI if you run two cards in parallel. Most of the PC-based VR demos we saw at GDC (including those at Oculus, Crytek, and Valve) were running Titan X cards, though likely more for the card's raw horsepower than for Nvidia-specific VR features.