nVidia GTX680 Review
Whereas Fermi introduced the idea of parallel geometry, Kepler takes it to the next level with four main Graphics Processing Cluster (GPC) units, each containing two Streaming Multiprocessors, which are in turn made up of 192 CUDA Cores apiece.
We'll let nVidia themselves explain:
Inside the new Kepler GPC resides the next generation Streaming Multiprocessor (SMX). The SMX not only provides more performance than Fermi's SM, but does so while consuming significantly lower power.
Most of the key hardware units for graphics processing reside in the SMX. The SMX's CUDA Cores perform the pixel/vertex/geometry shading and physics/compute calculations. Texture units perform texture filtering and load/store units fetch and save the data to memory. Special Function Units (SFUs) handle transcendental and graphics interpolation instructions. Finally, the PolyMorph Engine handles vertex fetch, tessellation, viewport transform, attribute setup and stream output.
The primary focus of the Kepler architecture is power reduction. With each SMX packing six times the CUDA Cores of a Fermi SM (192 against 32), nVidia have managed to double the performance delivered per watt. This fanatical dedication to power saving means that the GTX680 often runs beneath its TDP cap. Rather than let all that potential go to waste, nVidia have introduced GPU Boost, whereby the GPU automatically overclocks itself until it approaches the 195W TDP limit, giving you the best of both worlds: lots of performance, delivered in an ecologically friendly way.
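To make the GPU Boost idea concrete, here is a toy sketch of a boost loop: the clock steps up while an estimated power draw stays under the 195W cap. The step size and power model are invented for illustration; nVidia's actual algorithm lives inside the driver and hardware and is not public.

```python
# Toy sketch of a GPU Boost-style loop. The power model and step size
# are hypothetical; only the 195W TDP and 1006MHz base clock come from
# the GTX680's spec sheet.
TDP_LIMIT_W = 195        # GTX680 board power cap
BASE_CLOCK_MHZ = 1006    # GTX680 base clock
STEP_MHZ = 13            # hypothetical boost increment

def estimated_power(clock_mhz, load):
    """Hypothetical power model: draw scales with clock and load."""
    return load * (clock_mhz / BASE_CLOCK_MHZ) * 170.0

def boost_clock(load):
    """Raise the clock in steps while the estimated draw stays under the cap."""
    clock = BASE_CLOCK_MHZ
    while estimated_power(clock + STEP_MHZ, load) <= TDP_LIMIT_W:
        clock += STEP_MHZ
    return clock

print(boost_clock(0.8))  # a lighter load leaves headroom, so the clock climbs
print(boost_clock(1.0))  # a full load pins the clock near the 195W cap
```

The point of the sketch is the feedback loop: lighter workloads leave thermal/power headroom, and the boost logic converts that headroom into extra clock speed instead of letting it go to waste.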
It's not only about power though. The newest 300 series drivers introduce the ability to enable FXAA via the control panel, making it available in hundreds more games than before. Further, nVidia have developed TXAA, which takes advantage of the phenomenal texture performance of the GTX680 to bring Pixar levels of quality to anti-aliasing.
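FXAA works as a cheap post-process: it looks for pixels whose local luminance contrast suggests an aliased edge and blends them with their neighbours. The sketch below illustrates that principle only; it is a deliberate simplification, not nVidia's actual shader (the real FXAA does directional edge searches rather than a plain box blend, and the threshold here is made up).

```python
# Simplified illustration of FXAA's core idea: detect pixels with high
# local luminance contrast, then blend them with their neighbours.
# Not nVidia's shader -- a toy version of the principle.

def luma(rgb):
    """Perceptual luminance of an RGB triple (channel values in 0..1)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_pass(image, threshold=0.25):
    """image: 2D list of RGB triples. Returns a copy in which pixels whose
    neighbourhood contrast exceeds the threshold are box-blended."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            hood = [image[y-1][x], image[y+1][x],
                    image[y][x-1], image[y][x+1], image[y][x]]
            lumas = [luma(p) for p in hood]
            if max(lumas) - min(lumas) > threshold:
                # An aliased edge: average the neighbourhood to soften it.
                out[y][x] = tuple(sum(p[c] for p in hood) / 5 for c in range(3))
    return out

# A 3x3 patch with a hard black/white vertical edge:
img = [[(0, 0, 0), (1, 1, 1), (1, 1, 1)] for _ in range(3)]
smoothed = fxaa_pass(img)
```

Because FXAA runs on the final image rather than on geometry samples, it can be switched on from the driver for any game, which is exactly why the control panel toggle suddenly covers so many titles.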
Finally, the latest 300 drivers introduce Adaptive VSync. Anyone who turns VSync off to eliminate the stuttering when the frame rate drops will have experienced screen tearing. Equally, anyone who runs with it on all the time will have noticed the jerk that occurs when the frame rate dips and VSync is forced straight from 60 FPS down to 30 FPS. Adaptive VSync aims to eliminate both problems by toggling VSync dynamically: above 60 FPS VSync stays on, but as soon as the frame rate drops below that magic mark the driver turns VSync off, giving you a much smoother experience.
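The behaviour described above can be sketched as a small model: with VSync on, every frame waits for a refresh boundary, so the delivered rate quantises to 60, 30, 20 FPS and so on, while Adaptive VSync simply drops the sync whenever the GPU can't sustain the refresh rate. The function names and the fixed 60Hz refresh here are illustrative, not the driver's internals.

```python
import math

REFRESH_HZ = 60  # illustrative fixed refresh rate

def effective_fps(raw_fps, vsync):
    """With VSync on, each frame waits for a refresh boundary, so the rate
    quantises to 60, 30, 20, ... With it off, frames display immediately
    (at the cost of tearing)."""
    if not vsync:
        return raw_fps
    return REFRESH_HZ / math.ceil(REFRESH_HZ / min(raw_fps, REFRESH_HZ))

def adaptive_vsync(raw_fps):
    """Enable VSync only when the GPU can sustain the refresh rate."""
    return effective_fps(raw_fps, vsync=(raw_fps >= REFRESH_HZ))

print(effective_fps(45, vsync=True))  # 30.0 -- the jarring 60 -> 30 drop
print(adaptive_vsync(45))             # 45   -- sync drops out, no lurch
print(adaptive_vsync(80))             # 60.0 -- synced, so no tearing
```

The quantisation in `effective_fps` is why plain VSync feels so jerky: a GPU rendering 45 frames per second still only displays 30, whereas the adaptive scheme lets all 45 through.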