nVidia GeForce GTX 460 768MB GDDR5 Review
Published: 16th July 2010 | Source: nVidia | Price: £139.99 |

Introduction
Over the last 12 months, nVidia have been all about one thing - delays. Month after month, I'm sure many of you waited for the range topping GeForce GTX 470 and 480 graphics cards. Indeed, nVidia were late to the DirectX 11 party, but today no one seems to care. The bottom line is that team green have taken back the performance crown and are succeeding in their catch-up endeavours. Not so long ago, we also saw the release of nVidia's answer to Eyefinity, known as 3D Vision Surround. Step by step, it seems there is light at the end of the tunnel.
One slight problem though. Not everyone in the market today can afford to drop upwards of £300 on a new DX11 graphics card. With ATi enjoying healthy sales figures from Radeon HD 5670, 5750 and 5770 SKUs, it is about time that nVidia returned with some more competition. Today we are pleased to present to you exactly that; meet the nVidia GeForce GTX 460.
I have lied slightly, in that nVidia already have a "mid range" offering on the market today known as the GTX 465. As my colleague highlighted, the cut-down GTX 470/480 graphics card suffered from all of the power and heat issues of its larger siblings but without any of the performance that made them worthwhile. To make matters worse, it is priced well above the £200 mark, making it terrible value for money. Not good news at all.
This is where the GTX 460 comes into the picture. Like the observant chaps that nVidia are, they went away and reworked the GF100 architecture to offer a native mid range graphics card. That's right, no added baggage and no inflated price tags. I think we're all eager to see what it has to offer so let's jump straight to the specifications.
| Manufacturer | nVidia | nVidia | nVidia |
| Model | GeForce GTX 460 | GeForce GTX 460 | GeForce GTX 465 |
| Stream Processors | 336 | 336 | 352 |
| Memory | 768MB | 1024MB | 1024MB |
| Memory Interface | 192-bit | 256-bit | 256-bit |
| Core Frequency | 675MHz | 675MHz | 606MHz |
| Shader Frequency | 1350MHz | 1350MHz | 1215MHz |
| Memory Frequency | 3600MHz | 3600MHz | 3206MHz |
| RRP | £150 | £180 | £220 |
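As an aside, these figures let us estimate peak single precision shader throughput. On Fermi, each stream processor retires one fused multiply-add (two FLOPs) per shader clock, so a quick back-of-the-envelope sketch (our own arithmetic, not an official spec sheet) looks like this:

```python
# Rough peak single-precision throughput from the table above.
# Each Fermi CUDA core retires one FMA (2 FLOPs) per shader clock.
def peak_gflops(stream_processors, shader_mhz):
    return stream_processors * (shader_mhz / 1000.0) * 2

print(round(peak_gflops(336, 1350), 1))  # GTX 460 -> 907.2
print(round(peak_gflops(352, 1215), 1))  # GTX 465 -> 855.4
```

Interestingly, despite having 16 fewer stream processors, the GTX 460's higher shader clock gives it slightly more raw throughput on paper than the GTX 465.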
First of all, we would like to make it clear that there are two versions of the GTX 460. If you decide to save £30 and purchase the 768MB version, not only will you lose 256MB of video memory, data will also be transmitted over a narrower 192-bit memory interface. It remains to be seen whether this causes significant harm to performance, but it's safe to say that downgrades such as these are never good news.
Next, I would like to point out the specifications of the GTX 460 relative to the preceding GTX 465. Despite being £70 and £40 cheaper respectively, the two GTX 460s have just 16 fewer stream processors but considerably higher memory, shader and core clocks. Is that value for money we smell? Let's find out.
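The memory downgrade on the 768MB card is easy to quantify: effective bandwidth is just the effective memory clock multiplied by the bus width. A rough sketch using the table's figures:

```python
# Effective memory bandwidth from the spec table:
# bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
cards = {
    "GTX 460 768MB": (3600, 192),
    "GTX 460 1GB": (3600, 256),
    "GTX 465": (3206, 256),
}
for name, (mem_mhz, bus_bits) in cards.items():
    print(f"{name}: {mem_mhz * bus_bits / 8 / 1000:.1f} GB/s")
```

So the 768MB card gives up a quarter of the 1GB card's bandwidth, which is worth bearing in mind at higher resolutions.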
Most Recent Comments
Would also love to see how these compare to a GTX 480 when they are in SLI.
Once again guys, amazing review, and it has basically cemented my decision on buying 2 of these.

EDIT: There are 2 brands of this card you might want to avoid - I was reading on Futuremark that the Palit and Gainward ones (Palit owns Gainward?) are, strangely, shipping without ANY VRM heatsinks?
It's about time
(I mean Nvidia, not OC3D) 
EDIT. Where are my manners today?
Thanks Mul, and thanks to Tom and OC3D for making this happen. I really look forward to the reviews on here (especially when Bryan is beating the crap out of a graphics card, takes me right back to our 9800/5950 rival days).
|
Originally Posted by name='thestepster'
EDIT: there is 2 brands of this card i was reading on futuremark u might want to avoid the palit and gainward ones (palit owns gainward?) they are what seems to be strangly shipping without ANY VRM heatsinks? |
That said, even out of the box these are good enough.
|
Originally Posted by name='AlienALX'
Notice at the time of release they were the cheapest. They are obviously designed for out of the box users who do not intend to overclock them at all and thus, perfectly adequate. No such thing as a free lunch, as they say, and peformance will cost you.
That said, even out of the box these are good enough. |
I'm not a big fan of Palit any way. They did a version of the 9800GT with a tiny flower cooler and pathetic ram coolers. Again, this is all fine if you intend to keep them at stock but that's where it ends.
IMO the best looking one for playing around with is the Asus. It doesn't look the nicest but it has heatpipes poking out and everything. Looks the best for Overclocking.
Also to mention something pointed out by Mul...
"Due to logistical issues with our original GTX 460 supplier, we had to request a nVidia Reference GTX 460 sample instead. While this is not really an issue, I must warn that there are no pretty photographs of decorated packaging or accessory sets "
To be completely honest mate I like the virgin reference cards. I think they look more mean. Once a company starts putting stickers on them they lose all their charm hehe. Mind you, I'm a bit of a Ratrod flat black man myself, so that may be why. But when the 5770 released I tried everything to get the reference plain ones with the ATI logo on the fan. At least that way you're not paying for some dumb sticker with a semi clad woman accross it lol.
Yeah, they're cutting costs. Kinda looks daft as well, cause the card's not what you would call substantially cheaper.
Totally agree on the Asus being the best non-1GB about, and the vmod shows numbers around 930 core, which can differ obviously, but that's awesome. So that's why I want one of those and I'll gladly pay the extra for it.
I said this to AMD the other day (where's he been? He's so quiet haha) but the amount of people who actually overclock their systems is very small tbh. So for the most part someone would buy a machine with a GTX 460 in and not care about overclocking/volting and so on.
|
Originally Posted by name='AlienALX'
Yeah definitely man. Of course to you and I those heatsinks make all the difference. And yeah, they may only be a few quid cheaper, but it all counts when you are a small OEM making machines in batches. All adds up man.
I said this to AMD the other day (where's he been? he's so quiet haha) but the amount of people who actually overclock their systems is very small tbh. So for the most part some one would buy a machine with a 460 gtx in and not care about overclocking/volting and so on. |
Would two of these be faster than say, a 470?
|
Originally Posted by name='AlienALX'
I'd much rather have the Nvidia OEM card. Best looking of the lot IMO. And you're not paying EVGA £50 to put a sticker on it
|
The Asus, though, as it's a totally custom PCB and cooler with volt modding.
|
Originally Posted by name='siravarice'
Great review
Would two of these be faster than say, a 470? |
|
Originally Posted by name='tinytomlogan'
As with all of these cards just buy the one that you like the best, the PCB on them atm is all the same. The only thing to consider is voltage tweaking options and coolers.
|
|
Originally Posted by name='silenthill'
Hi Tom, if I bought one of these 460s and added it as a PhysX card to my two 480s, do you think it would increase the performance when I play Metro with PhysX on advanced? Also, would it take the load off one of my 480s? I noticed that the one which is dedicated to PhysX heats up more than the other one. In general, would it be a wise purchase?
|
If you have 2x 480's run them in SLI, do not bother with one as dedicated PhysX, that's just stooooopid.
I was running a 280GTX at the time and offloaded PhysX onto an 8 series onboard. The results were crazy. As Tom said, it would be downscaling. The 480 has more artillery to handle the PhysX than the 460 will ever have.
In Arkham Asylum it recommended a 9800GTX minimum. However, I don't even think that would have done much over a 280GTX tbh. Sure you're offloading it, but if the card you are offloading it onto is a lot slower it will hurt performance.
I'll see if I can find the results and do a write up in a bit
|
Originally Posted by name='tinytomlogan'
But as I said DONT have the 480 as physx just leave them both in SLI
|
|
Originally Posted by name='tinytomlogan'
No mate with 2 480's thats complete lunacy and wouldnt be worth it imho. Metro is just another crysis intended for us to want to buy more and more Nvidia kit. Even with 3x 480's or 2x 480s and one as physX its not that high.
If you have 2x 480's run them in SLI do not bother with one as dedicated PhysX thats just stooooopid. |
Best regards
|
Originally Posted by name='thestepster'
yeah try and pick up an 8800 or 9800 if needs be that will more than likely do the job well enuff
|
What I mean is (sorry for the vague post a minute ago Tom, got a lot on my mind today) - would running, say, a 280GTX and a 9800GT and offloading the PhysX onto the 9800GT fare any better than just using the 280? I doubt it. The 280 is a far faster card and has PhysX. The 9800GT obviously can not shove PhysX out the door any quicker than a 280. But it's finding out if offloading really does speed things up or just gets bottlenecked by the inferior card.
Again I think this comes down to mere bragging rights TBH. That friend of mine who got a Classified and three 280s also got a 9800GTX for PhysX. Sadly he never could put out real world figures as it just failed in the head.
|
Originally Posted by name='silenthill'
I suppose it's designed for the future like CRYSIS
|
Higher spec graphics are reserved for future systems.
Olbocks. Mate of mine had SLI 295 and got crappy scores. What they're really saying is
We cannot optimise this code as it is buggy and slow, and we can't be bothered putting in the time and effort so here, have it as it is with a watered down lame ass excuse
R* are known for sloppy code dude. Their games even slow down on the consoles. And GTAIV had no AA, nothing. Compare it to Just Cause 2 and you will see a far better example of optimised coding.
The 1GB card seems to be the best IMO.
Also I think that some of these cards will hit 1GHz core - in a review of the Asus card they had it at 960 core, so quite close.
|
Originally Posted by name='AMDFTW'
woooo,i have seen nothing but good things of these,lets hope scan have mine in stock for wednesday,over 100fps max in heaven is very good IMO.
the 1gb card seems to bethe best IMO also i think that some of these cards will hit 1ghz core in a review of the asus card they had it at 960 core,so quite close. |
|
Originally Posted by name='AlienALX'
No such thing mate. That's simply an excuse for sh*t coding. That's the same excuse R* used with GTAIV on the PC.
Higher spec graphics are reserved for future systems. Olbocks. [...] Compare it to Just Cause 2 and you will see a far better example of optimised coding. |
I bought it in 2008 long after it released and didn't bother to update it. Just loaded it on and played. It suffered with constant slowdowns, gray outs on the smoke (IE smoke would lose all of its detail and just turn to gray sheets of colour) and so on.
Worst of all though was the total mess up at the end. I don't know if you have ever finished Crysis? I haven't as my game was impossible to complete

I got to the end on the aircraft carrier, went down into its guts and forgot to grab the TAC cannon. Came back up and out and the door slammed shut behind me. I was then treated to close to two hours of
USE THE TAC CANNON... USE THE F***ING TAC CANNON
Before I finally realised I did not have it and was locked out from getting it, making the game impossible to finish. The only way around it was to reload my save game about two hours before and do it all over again.
See, at E3 and IGN they don't bother to play a game long enough to find all of this out. Firstly IGN reviewed GTAIV by basically copying the 360 review word for word before handing it a 9.9/10. A couple of my computer mags I used to read did exactly the same damn thing. "wow game of the year" etc etc.
Yet none of them noticed it was absolute doggy toffee? Of course they did, they were being paid off. That's the problem with reviewers nowadays. They're absolutely terrified of biting the hand that feeds. Look at DRIV3R. That got glowing reviews from certain magazines. The reality was it was completely shafted and totally incomplete.
To relieve the situation in Warhead they just cut whacking great lumps out of it. However, compare Crysis' code to, say, HL2 or Doom 3, which when they released screamed along like a dragster. Back in those days BFG could say "TURN IT ALL ON!" and actually be safe in the knowledge the game would hit over 50fps.
That was all before the dawn of the 360 era. Since then we have been playing lazy hand me down rubbish..
Oh, I also don't rate Crysis much as a game either. Graphically Far Cry 2 is pretty much on par yet you get pretty much double the frames. Shame about the game though.
|
Originally Posted by name='AlienALX'
Crysis on release was totally broken. As in V1.
I bought it in 2008 long after it released and didn't bother to update it. [...] Graphically Far Cry 2 is pretty much on par yet you get pretty much double the frames. Shame about the game though. |

The only scenarios where a pad works was something like Fallout 3 where you had VATS. The input lag on the 360 controller is pretty bloody poor too (on the PC) and makes my game stutter when I spin around. Of course it's not the game (though tbh AGAIN Fallout 3 is far from perfect and carried all of the bucket of poo that Oblivion did) but it does work (once you figure out editing the .ini to stop it crashing all the time).
Here. As a better analogy...
High graphics are reserved for future systems.
So wait. Firstly they are saying that they're clairvoyant. How do they know if their code is going to run well on hardware that doesn't even exist? How do they know what the future even holds in terms of hardware?
Think about it like this. If EA or whoever cannot even get their own code to run on say, SLI 8800GTX (the best out at the time) with a C2QEE then man, Houston we have a problem. How do they know how their game is going to react on new technology?
Secondly just take an example of good coding.
Race Driver : GRID. Seriously, how stunning was that game on release? Yet there I was with my lowly 9800GT with it cranked running Vsync and an easy 4XFSAA and 8XFSAA if I wanted microscopic lag with everything on full.
Dirt 2. Again, absolutely hauls ass on a single 5770, let alone adding another one.
And in contrast? The original DIRT. A bit of a dog really eh? DIRT 2 is faster and runs better and looks a country mile apart.
Honestly mush, if our graphics cards were being utilised to the fullest we would have games that would literally make you shoot your load. Now part of the blame is down to the hardware industry as they don't even pause for breath without going on to the next one (it's called greed) and the coders (in their defense) can't keep adjusting their code and engines to suit.
However, the buck stops for excuses when you take into account someone like id Software or Valve. Both spend the time needed to fully debug their code and make absolutely sure they have exhausted every last effort before releasing it. Thus HL2 absolutely bloody flew along on modest hardware with all kinds of eye candy.
How on earth, though, could Crysis have made it out the door and been pressed onto DVD without anyone even realising the ending was totally bloody broken?
I meant it
I'd probably be assassinated by the gaming industry within a month.
|
Originally Posted by name='tinytomlogan'
No mate with 2 480's thats complete lunacy and wouldnt be worth it imho. [...] |
|
Originally Posted by name='AMDFTW'
do the 4 series fare better with a dedicated physx card or not
|
|
Originally Posted by name='Br1t1shB33f'
Nice to see N(shody)a have made something worthwhile. Only issue is I think the new ATi 6xxx is too close for this card to gain Nvidia a decent market share in the midrange card market.
|
|
Originally Posted by name='thestepster'
This is prob now the best bang for buck card out there and I predict there's been a lot of ppl waitin on it. I personally havent, but its came as a nice surprise for me, as I'm an avid folder and ATI cards fail supremely bad at this.
|
Nvidias problem right now is ATI are in the lead. Nvidia can try and overtake all they like (and probably will !) but ATI have been in a 4-5 month head start and lead dude.
I can absolutely bloody bet that they knew the 460 was coming and already have the counter attack fully in place. Why? because they should be more than ready. Even a simple price drop of the 5850 immediately gives them a better card for the same money. I also promise you that even though the 5 series were very reasonably priced ATI have been charging a premium for them before now because they had absolutely and utterly no competition AT ALL. They said jump, the paying customer asked how high.
Things are very very different now though
|
Originally Posted by name='AlienALX'
Folding isn't something a lot of people do. You can do it with any Nvidia card (so for example a 9800GT).
Nvidias problem right now is ATI are in the lead. [...] Things are very very different now though. |
I'm happy for both companies to do well TBH, as this will drive prices down and make for some interesting developments.
|
Originally Posted by name='AlienALX'
Folding isn't something a lot of people do. You can do it with any Nvidia card (so for example a 9800GT).
Nvidias problem right now is ATI are in the lead. [...] Things are very very different now though. |
To give this one weight: the current fastest supercomputer only does 1.75 petaFLOPS, so simple maths shows that the F@H distributed computing network is about 2.8 times more powerful.
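The arithmetic above is easy to sanity check, taking the roughly 5 petaFLOPS network figure quoted later in the thread as given:

```python
# Sanity-checking the comparison: F@H network vs fastest supercomputer.
fah_pflops = 5.0        # network throughput quoted later in the thread
top_super_pflops = 1.75 # fastest supercomputer of the day
print(round(fah_pflops / top_super_pflops, 2))
```

So "about 2.8 times" is in the right ballpark; 2.86x to be precise, on those assumptions.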
Coincidence? I think not.
ATI basically poked the big doggy with a stick. Then they realised the big doggy was locked in a cage so poked it some more. When they realised they weren't going to get bitten they knew they were in full control. Hence the price ramps.
A folding rig is just that. You can't really use it for anything else. Do you really think enough people would selflessly set up an entire PC and let it eat electricity 24/7, and that it would be enough for Nvidia to beat ATI?
Sadly, folding has shown how unreliable Nvidia cards are if anything. I read the Folding pages of Custom PC near on every month, and the amount of people saying "I have had to stop folding because my 2xx card died" is telling.
Unlike CPUs, GPUs do not have the technology in place to combat the stresses of being run 24/7. No consumer GPU has ECC, and no GPU can recover from a serious error without being rebooted. The GPU chip itself has already been pushed to its thermal limits by the manufacturer. Look at the overclocking headroom in, say, an i7 920. Then look at the overclocking headroom in a GPU.
Most people buy gaming cards for playing games on dude, but I do really admire your attitude
Folding@home is more stressful than gaming, but then you have to look at things: their most popular client is the GPU one and that's taken 95% of it. Then the PS3 slips in there with poorer numbers than an ATI card, then the CPU client is great but, depending on platform/overclock, isn't so good.
They also don't have to suffer the whole hot/cold cycle, and not all cards run at the top of their thermal range - none of mine are near it. My 9800 ran for a few days without problem at 90 degrees (the fan fudged up and wouldn't spin very fast while a new cooler was coming for it).
Yeah, I get that most of them do go into gaming rigs, but there are utterly tens of thousands of them out there making up most of the 5 petaFLOPS of the network.
|
Originally Posted by name='siravarice'
So almost 480 performance for £280 if you got these in SLI. Very nice job.
|
|
Originally Posted by name='siravarice'
So almost 480 performance for £280 if you got these in SLI. Very nice job.
|
Try something like S.T.A.L.K.E.R.: Call of Pripyat if you want to be fair to ATI. I would be very interested to see the results from that.

~Bex
|
Originally Posted by name='AlienALX'
Crysis is ATIs achilles heel. It always was and it always will be because Crysis was Nvidia sponsored. It utilises pretty much every aspect of Nvidia cards.
Try something like S.T.A.L.K.E.R : Call of Pripyat if you want to be fair to ATI.. I would be very interested to see the results from that. |
It confirms what I thought though. As I say, Crysis is heavily loaded toward Nvidia, and that's why pretty much every benchmark where they put ATI cards against Nvidia cards uses it. It basically makes the Nvidia cards look really great.
Hilarious how a pair of 460s actually beat a 480. Not saying much for Nvidia's flagship card is it?
As for the blank screen thing? Quite possibly the bridge connector. I had so much crap out of mine when I ran the 8600s in SLI. Constant artifacts, blank screens and crashing. In the end I cleaned the cards' contacts with alcohol and that sorted it. Common problem with SLI apparently.
|
Originally Posted by name='silenthill'
Great review, well done Mul. Great price too - from CPL, Scan and Overclockers the cheapest is £152. Seems that nVidia have actually come up with something worth buying, but ATI will definitely bring their prices down on certain cards to compete with this card, so don't be hasty guys.
|
A lot better folding with pretty good fps > bad folding with slightly better fps > saving say £25 for the same fps and bad folding - that's the order most people who are both avid gamers and folders would agree with tbh. (The only time I could see it being different is if they already had a top notch dedicated folding rig set up and wanted to make a pure bang for buck gaming machine, so they didn't have to interrupt any of the cards in the dedicated machine.)
I'd be happy to pay say £25 more on an Nvidia card than an ATI card if it gave the same gaming performance, just for the extra folding grunt, as in all honesty £25 is very little if you're trying to get good gaming and folding performance.
Should be picking one of these up either on thurs or in 2-3 weeks depending on how the bills are this week.
I'd be pretty tempted to get a 2nd 460 tbh, but I'd need a new mobo + new PSU, as my current board is CrossFire only and my PSU only has a 6+2 and a 6-pin.
|
Originally Posted by name='I Hunta x'
yeah its not a huge % of the market realy, quite a large % of folders are just people who use there main/only card when the pc's on and there not gaming tbh.
id be pretty tempted to get a 2nd 460 tbh but id need a new mobo + new psu as my current board is crossfire only and my psu only has a 6+2 and a 6 |
Keep up the great work m8.
He has a 775 C2D though, so finding a half decent SLI board should be cheap and easy. I sold an eVGA 680i about 6 months back with an E4500 in it for £65. Sure, it didn't support all the latest quads and wasn't the greatest overclocker in the world, but hey, tri-SLI capable man.
How did you check the CUDA cores? Mine is showing the same amount in GPU-Z.
|
Originally Posted by name='w3bbo'
Nice review Tom. I would personally prefer some competitor values in the graphs other than vs OC but I also appreciate the time it takes to do the tests!
Keep up the great work m8. |
Also, did you find that the core and shader were linked in Afterburner and that you could not unlink them? It's hurting my overclock.