
Nvidia Announces $1000 GeForce GTX Titan Videocard

By Norman Chan

Likely the most powerful single-GPU videocard available.

Given surprising reports that AMD will stick with its 7000 series GPUs (Southern Islands) as its videocard focus for most of 2013, the PC graphics space looks to be pretty uneventful in a year when we're expecting both Sony and Microsoft to launch their next-generation console systems. Nvidia's Kepler family of GPUs remains the best choice if your priorities are performance, power draw, and acoustics, while AMD continues to bundle its 7000 series cards with new games to compete on price. And competitive pricing is one thing Nvidia's newest card, the GeForce GTX Titan, could not care less about. At $1000--the same price as the dual-GPU GeForce GTX 690--the Titan is an ultra-premium card that will be produced only in limited quantities to serve the enthusiast who has to have the best graphics card available. That gamer can even buy three of them for tri-SLI.

Let's go over some basic specs first. The GeForce GTX Titan houses a GK110 Kepler chip first built for Nvidia's Tesla K20 GPUs used in supercomputing clusters. It's a major spec bump over the GK104 chip inside the GTX 680: the Titan has 2688 CUDA cores/stream processors, 224 texture units, and 48 ROPs packed into 7.1 billion transistors, and it pairs them with 6GB of GDDR5 VRAM on a 384-bit memory bus. The tradeoff for all that silicon is a lower core clock--837MHz versus the GTX 680's 1GHz--which keeps max TDP at 250 watts. That leaves some headroom for overclocking, too.
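For a sense of what that 384-bit bus buys you, the back-of-the-envelope bandwidth math is simple. Note that the GDDR5 data rate below is my assumption (it matches the 6Gbps effective rate Nvidia quotes for the card), not a figure from this announcement:

```python
# Rough memory bandwidth math for a 384-bit GDDR5 bus.
# The 6 Gbps effective per-pin data rate is an assumed figure, not one stated above.
bus_width_bits = 384
effective_rate_gbps = 6.0                    # GDDR5 effective data rate per pin (assumed)
bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gbps
print(f"~{bandwidth_gb_s:.0f} GB/s")         # ~288 GB/s, versus ~192 GB/s on the GTX 680's 256-bit bus
```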

Other notable attributes include a new cooler design with a wider cooling-fin stack and a blower on the side of this 10.5" videocard. Also unique to the GTX Titan is GPU Boost 2.0, a software controller that automatically raises and lowers clock speed based on temperature rather than power draw. Unlike GPU Boost 1.0 (and AMD's own PowerTune), this ties performance to temperature and noise, since the fan spins at a rate determined by the temperature of the card. For example, the stock temperature target of the Titan is 80 degrees C, and the fan only ramps up (linearly with temperature) after the card exceeds that mark. That means GPU Boost 2.0 will clock up the card in a game until it hits that 80-degree ceiling, so you get the most frames churned out without affecting fan speed and acoustics. In the software, users will be able to raise that temperature target to 95 degrees, with a hard shutdown at 105 degrees.
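To make that control behavior concrete, here's a minimal sketch of a temperature-target boost loop in Python. This is not Nvidia's actual firmware logic; everything except the 837MHz base clock and the 80/95/105-degree figures from this article--the boost step, fan floor, and function names--is invented for illustration:

```python
# Minimal sketch of a temperature-target boost controller, in the spirit of GPU Boost 2.0.
# Only the 837MHz base clock and the 80/95/105 C thresholds come from the article;
# the step size and fan floor are hypothetical.

BASE_CLOCK_MHZ = 837      # Titan base clock (from the article)
TEMP_TARGET_C = 80        # stock temperature target (user-adjustable up to 95)
SHUTDOWN_TEMP_C = 105     # hard shutdown threshold
CLOCK_STEP_MHZ = 13       # hypothetical boost bin size

def boost_step(current_clock_mhz: int, gpu_temp_c: float) -> int:
    """Raise clocks while under the temperature target, back off once it is
    exceeded, and bail out entirely at the hard shutdown limit."""
    if gpu_temp_c >= SHUTDOWN_TEMP_C:
        raise SystemExit("thermal shutdown")
    if gpu_temp_c < TEMP_TARGET_C:
        return current_clock_mhz + CLOCK_STEP_MHZ                    # headroom left: clock up
    return max(BASE_CLOCK_MHZ, current_clock_mhz - CLOCK_STEP_MHZ)   # over target: clock down

def fan_speed_percent(gpu_temp_c: float) -> float:
    """Fan holds its floor speed up to the target temp, then ramps linearly."""
    FLOOR = 30.0   # hypothetical minimum duty cycle
    if gpu_temp_c <= TEMP_TARGET_C:
        return FLOOR
    span = SHUTDOWN_TEMP_C - TEMP_TARGET_C
    return min(100.0, FLOOR + (100.0 - FLOOR) * (gpu_temp_c - TEMP_TARGET_C) / span)
```

The point of the sketch is the input: the clock decision reads temperature rather than board power, so a quieter fan profile directly translates into lower sustained clocks, and a higher temperature target into higher ones.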

Nvidia is manufacturing the GTX Titan directly and selling it through the same OEM partners that sell the GTX 690, so don't expect much difference between OEM models. And though performance may be a toss-up between it and the dual-GPU 690 (benchmarks come out later this week), deep-pocketed system builders who are averse to dual-GPU videocard builds but still want to play games maxed out at 2560x1600 may like what the Titan has to offer.