Nvidia GTX295 Quad SLI Page: 1
We were blown away by the improvements Nvidia made with the Quad SLI 9800GX2s we reviewed HERE, so we were intrigued to see if quad SLI had been enhanced further with the release of the GTX295. Had the problems of linking four GPUs been ironed out? Was heat still an issue? How do they scale? Is it a worthy purchase? All of these questions we wanted to answer, and not only that, but we wanted to see if the addition of a GTX285 as a dedicated PhysX processor would add anything to the setup. Call us crazy, call us mad, but while we have all this delicious hardware in our test lab it seemed a shame not to see how far things would go and report back to you.
The two flagship cards we will be using for this article hail from XFX and Zotac, but for all intents and purposes these cards are identical under the skin, so we do not expect any difficulties in getting them to work in tandem. There has been much discussion and misinformation regarding the mixing of different manufacturers' cards, but as long as the family is the same (GTX295 + GTX295) then you are good to go. Even different BIOSes are fine to use together, so you could, for example, run an overclocked card with a stock clocked card. In this scenario the stock clocked card is usually best placed in the uppermost slot to prevent any conflict in clock speeds. Both cards will run at the speed of whichever card is in the primary PCIe slot, which is where the 'master card' (the card that dictates the clock speed) should be positioned. If both cards are capable of the overclocked card's speed then there should be no issue in running the overclocked card in the primary PCIe slot, in effect automatically overclocking the 'slave card' to the master card's clock speed. Cool, eh?
I cannot confirm if this procedure works with the GTX295 as both the cards we have for testing today are stock clocked versions but I have no reason to think it wouldn't work - usual disclaimers apply.
So then, onto Quad SLI, which has been around for almost 3 years in one form or another. The OEM 7900GX2 was a behemoth of a card that was shrunk down to the 7950GX2 for retail release. That card, while performing well in some games, was ultimately deemed a failure thanks to driver issues and an eventual lack of support; it seemed Nvidia totally abandoned the whole idea of Quad SLI, along with support for the 7950GX2, much to the dismay of those who had purchased the card(s). Fast forward 18 months and Nvidia tried again with the 9800GX2, which is still a formidable card by today's standards. The 9800GX2 was a massive leap forward in both support and compatibility and proved to the world that quad GPU gaming was certainly viable. Despite a few minor driver issues, the cards ran flawlessly and even managed to fend off a challenge from ATI's answer to Quad SLI, CrossfireX.
With ATI now in full flow with their own take on multi-GPU gaming, Nvidia appeared to struggle to compete. The 4870x2 was dominating the graphics card scene with no reply from the green camp. Heat and power issues plagued Nvidia's rumoured GX2 successor, and it wasn't until the GTX295 arrived, five months after the 4870x2, that Nvidia finally had an answer to the ATI dual GPU card.
You have probably read all about the GTX295 in our previous two reviews, so I won't bore you by regurgitating the same spiel. It is safe enough to say, however, that the GTX295 was worth the wait. If you game at the highest resolutions and you're a big fan of AA, there is simply no faster card on the market today; that is, unless you intend to use two of them.
Let's move on to the setup we intend to use for today's article...
To ensure that all reviews on Overclock3D are fair, consistent and unbiased, a standard set of hardware and software is used whenever possible during the comparative testing of two or more products. The configurations used in this review can be seen below:
CPU: Intel Nehalem i7 920 Skt1366 2.66GHz (@3.835 GHz)
Motherboard: Gigabyte X58 UD5
Memory: 3x1GB Kingston HyperX DDR3 2000MHz @ 9-9-9-24
HDD: Hitachi Deskstar 7K160 7200rpm 80GB
GPU: Zotac GTX295 / XFX GTX295
Graphics Drivers: GeForce 182.06
PSU: Gigabyte ODIN 1200w
Setting up SLI was a painless experience. I simply plugged the first card in, installed the latest drivers from Nvidia (182.06), shut down the system and installed the second card. The second card was identified, and after a few automatic screen refreshes the installation was complete. I then went into the Nvidia control panel to enable SLI and set the PhysX acceleration, and checked in device manager that all GPUs were present (see below). That was it, job done.
You will most certainly need a powerful PSU to run two of these cards in SLI, along with a high end base system. Luckily for us, our test rig's PSU, a Gigabyte Odin 1200w, was up to the job, but Nvidia recommend a 1000W PSU for these cards with good cause, as we show below.
Below right is a shot of the SLI'd cards highlighting the blue LED, which is redundant in a single card configuration. While not exactly groundbreaking, it is a useful addition that reduces the trial and error when plugging in your primary monitor.
During the testing of the setup above, special care was taken to ensure that the BIOS settings used matched whenever possible. A fresh install of Windows Vista was also used before the benchmarking began, with a full defrag of the hard drive once all the drivers and software were installed, preventing any possible performance issues due to leftover drivers from the previous motherboard installations.
To guarantee a broad range of results, the following benchmark utilities were used:
3D / Rendering Benchmarks
• 3DMark 05
• 3DMark 06
• 3DMark Vantage
• Unreal Tournament III
• BioShock
• Call of Duty 4
• Crysis
• Oblivion
• Far Cry 2
Power consumption was measured at the socket using a plug-in mains power and energy monitor. Because of this, the readings below are of the total system, not just the GPU. Idle readings were taken after 5 minutes in Windows. Load readings were taken during a run of Crysis.
As you can see, the total system draw of our test rig with two GTX295s is massive. A 'normal' high end rig is likely to draw even more, as our rig had no extra fans, hard drives, watercooling etc. to power, so ensure you have a meaty PSU for your setup.
Temperatures were taken at the factory clocked speed during idle in Windows and after 10 minutes of running FurMark with settings maxed out (2560x1600 8xMSAA). Ambient temperatures were taken with a household thermometer. As we use an open test bench setup, consideration should be given to the fact that temperatures would likely increase further in a closed case environment.
Surprisingly, adding a second GTX295 did little to affect the temperatures of the cards. I should state, however, that the noise output of the cards, even at idle, was clearly audible. At full load the cards were noisy enough to become distracting, but this was outside of a case, so it is likely the cards' noise would be subdued somewhat in an enclosed space. I would still recommend a good set of headphones or speakers to drown out the whooshing noise nonetheless.
Let's move on to our benchmarks, where we pit the Quad SLI setup against the ATI 4870x2 and a single XFX GTX295 in our full suite of GPU tests...
3DMark did not really show the effectiveness of the SLI setup as we were clearly CPU limited, especially at low resolutions. 3DMark Vantage showed incredible scaling, but consideration has to be given to the fact that PhysX acceleration was enabled, which had a dramatic effect on the results. A massive 33k was attained in 3DMark Vantage, testament to Nvidia's PhysX processing power. Sadly, similar results could not be achieved with its forebear, 3DMark 06.
Let's see if this transfers over to our real world gaming benchmarks.
Unreal Tournament 3 is the highly anticipated game from Epic Games and Midway. The game uses the latest Unreal engine, which combines fast gameplay along with high quality textures and lighting effects. All benchmarks were performed using UTbench with a fly-by of the DM-BioHazard map. As usual, all benchmarks were performed 5 times, with the highest and lowest results being removed and an average calculated from the remaining three.
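The averaging method used throughout our benchmarks (five runs, discard the highest and lowest, mean of the remaining three) is a simple trimmed mean. A minimal Python sketch of the calculation, using made-up FPS figures purely for illustration:

```python
def trimmed_average(runs):
    """Drop the highest and lowest result, then average the rest."""
    if len(runs) < 3:
        raise ValueError("need at least three runs")
    trimmed = sorted(runs)[1:-1]  # discard the two extremes
    return sum(trimmed) / len(trimmed)

# Five hypothetical fly-by results in FPS (illustrative numbers only)
fps_runs = [101.2, 98.7, 104.5, 99.9, 97.3]
print(round(trimmed_average(fps_runs), 2))  # → 99.93
```

Discarding the outlying runs this way helps filter out one-off stutters or suspiciously fast runs before averaging.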
BioShock is a recent FPS from 2K Games. Based on the UT3 engine, it employs a large number of advanced DirectX techniques, including excellent water rendering and superb lighting and smoke effects. All results were recorded using FRAPS, with a total of 5 identical runs through the same area of the game. The highest and lowest results were then removed, with an average being calculated from the remaining 3 results.
Call of Duty 4 is a stunning DirectX 9.0c based game that looks awesome and has a very full feature set. With lots of advanced lighting, smoke and water effects, the game has excellent explosions along with fast gameplay. Using the in-built Call of Duty features, a 10-minute gameplay demo was recorded and replayed on each of the GPUs using the /timedemo command a total of 5 times. The highest and lowest FPS results were then removed, with an average being calculated from the remaining 3 results.
Again we see some CPU limitation at low resolution, especially in Unreal Tournament III but as the resolutions were increased along with the AA/AF, the SLI setup came into its own, scaling extremely well. Bioshock was particularly impressive with the SLI setup almost doubling the FPS at our highest resolution. Call of Duty 4 showed impressive gains across the board too with over a 33% increase in frames per second.
Let's move on to some more challenging titles...
Crysis is without doubt one of the most visually stunning and hardware-challenging games to date. By using CrysisBench - a tool developed independently of Crysis - we performed a total of 5 timedemo benchmarks using a GPU-intensive pre-recorded demo. To ensure the most accurate results, the highest and lowest benchmark scores were then removed and an average calculated from the remaining three.
Oblivion from Bethesda is now an 'old' game by today's standards, but is still one of the most visually taxing games out there. The benchmark was run in the wilderness with all settings set to the maximum possible. Bloom was used in preference to HDR. The test was run five times, with the average FPS then being calculated.
Ubisoft developed a new engine specifically for Far Cry 2, called Dunia, meaning "world", "earth" or "living" in Farsi. The engine takes advantage of multi-core processors as well as multiple processors, and supports both DirectX 9 and DirectX 10. Running the Far Cry 2 benchmark tool, the test was run 5 times with the highest and lowest scores being omitted and the average calculated from the remaining 3.
Again we see some good gains were to be had across our range of games. Crysis was particularly impressive, especially at high resolutions. This game is usually a slideshow at high resolutions with AA applied, but the SLI rig cut through it like a knife through butter. A 30-40% gain in frames per second is certainly nothing to be scoffed at, as the games above are renowned for being FPS killers, yet the SLI setup, even when pushed to the limit, was not fazed by the challenge.
Now it's time to have some fun...
Balls to the Wall & Fanny to the Floor
Recently I have been asked to overclock the system as far as it will go in reviews, and while that's not really appropriate in most situations, this time I believe it is justified. So, in answer to your prayers, here's the 'Balls to the Wall' section of the review.
Although the setup would quite obviously clock higher on water and indeed extreme cooling, I clocked this on air, with no hardware mods or software tweaks other than using the BIOS and Rivatuner. This should hopefully give a fair representation of what to expect on a standard air cooled setup but I would be surprised if someone who spends this amount of cash on a PC would not invest a little more in watercooling.
Nevertheless, I'll see what I can do:
Using the Gigabyte EX58-UD5 motherboard, I managed to overclock the OC3D test rig's i7 920 CPU to 4000MHz. This increases to 4200MHz under load (Turbo Boost), which should hopefully give the GPUs adequate headroom and minimise any CPU bottlenecks. I say hopefully as we are talking about a serious amount of GPU power here.
The GPUs would not clock as high together as they did alone, with a measly (in comparison) 651MHz on the cores and 1202MHz on the memory being the maximum semi-stable clock speeds I could reach. We could have flashed the cards to EVGA BIOSes and then used 'Precision' to raise the GPU voltages over stock, which would have allowed the cards to clock in excess of 700MHz, but as they are not owned by OC3D, I didn't relish the thought of killing the cards, explaining it to the editor and no doubt paying for replacements thereafter!
During testing, I had the harebrained idea of adding a further card to run PhysX on its own, leaving the Quad SLI setup to deal with the graphics. For this, I intended to use the fastest single GPU available, the Nvidia GTX285.
Vista picked up the GTX285 and installed drivers, and sure enough it showed up in device manager and the Nvidia control panel. I duly set the GTX285 as the PhysX card and ran a quick Vantage test. Everything worked a treat, but even with the GTX295s overclocked and the CPU running at 4200MHz, Vantage returned a relatively weak score of 32k, little more than when I ran the cards at stock.
Despite repeated tests it was clear that the system was slower with the GTX285 in situ than without. This is most likely due to the fact that when three cards are used on the UD5, two of the slots drop to 8x PCIe, which then throttled the GTX295s. So, taking the GTX285 out, I ran another test and the score was back up to what you see below.
It was a shame we couldn't breach the 40k mark, but I am under no illusion that the cards are easily capable of that under the right conditions. With the voltage tweaks available under the EVGA banner, the extra MHz allowed would certainly make a difference, as the scaling we have seen thus far from the raw grunt of the cards is nothing short of amazing.
No LOD bias or other 'cheat' tweaks were used to attain the Vantage score, just the overclocks applied to the cards. I would like to have shown that adding a third card for PhysX was worthwhile, but with the power of two GTX295s there is really no benefit at all; in our case, it was actually detrimental to the performance of the system. I never thought I would say that adding a £300 GPU would not increase performance!
Call of Duty 4 saw some improvements at higher resolutions, thanks in part to the extra 400MHz of CPU clockspeed giving the GTX295s more breathing space. Interestingly though, at the lower resolution the average FPS actually decreased! I still feel that the tremendous amount of raw GPU power was being held back by our CPU, but alas, 4.2GHz was all our poor i7 had to give.
Well, that about wraps up the extended overclocking section. It would have been nice to have some pots and LN2/dry ice to push things to breaking point, but that would give the reader an unrealistic view of what was easily attainable. Still, it would have been fun seeing the editor's face as I attached hulks of copper to the £400 GTX295s and poured liquid nitrogen over them.
Let's move on to the conclusion, where I explain the pros and cons of owning a GTX295 setup...
To say I was in awe of the benchmark results is an understatement. Although the scaling was nothing groundbreaking, it was still very good and made a significant impact on the results. The setup simply tore through everything we threw at it, and while I didn't expect it to struggle, I was impressed with the ease with which it played every game, even with the settings maxed out.
With all that power should come responsibility. If you intend on buying this setup but running it with a CPU at its stock clocked speed, then don't. Two cards would be a complete waste of money for such a system; even the range-topping i7 965 chips do not supply adequate CPU cycles to 'feed' the cards. The benefits of a Quad SLI setup only begin to show when CPU clockspeed is increased dramatically over stock.
I would also be aghast at anyone wishing to purchase and run a setup such as this with anything less than a 24" monitor. In the majority of benchmarks we ran, the cards were quite obviously CPU limited at lower resolutions. At times even 1920x1200 showed signs of being limited, so one might even suggest Quad SLI is only for the owners of 30" panels. While I wouldn't necessarily agree with that statement, I would say that if you intend on gaming at high resolutions and like to apply all the trimmings without worrying about trivial things such as frame rates and electricity bills, then Quad SLI should certainly be on your shopping list. How you pay for the setup will depend on which bank will offer you a re-mortgage, such is the cost of ownership.
- Unparalleled performance
- Easy setup
- E-Peen glory
- Noise might be an issue for some
- Power consumption certainly will be.
- Even members of OPEC will cringe at the price.
Thanks to Zotac and XFX for providing the GTX295s for today's review. Discuss in our forums.