ASUS nVidia GTX590 Review

Far Cry 2 and Metro 2033


Far Cry 2

The Dunia-engined Far Cry 2 is a great test of pure GPU horsepower, and the GTX590 finally puts up a decent showing. It's neck and neck with the HD6990.

 

Metro 2033

Finally, Metro 2033 responds well to dual GPUs, and the GTX590 is once again on a par with its AMD rival, sitting right between the single and dual-card setups of its big brother.


Most Recent Comments

24-03-2011, 08:54:42

tinytomlogan
Following hot on the heels of the dual-GPU AMD HD6990, nVidia bring us the GTX590 twin-GPU card, the successor to the GTX295.




24-03-2011, 09:00:42

sources95
sweet mother of jesus!

24-03-2011, 09:05:30

MR J DAWG
that thing is BEAST!!!!

24-03-2011, 09:05:49

Zeals
Wow they really turned down the GPUs in this thing.

24-03-2011, 09:06:54

sources95
Just imagine that thing with a waterblock...

24-03-2011, 09:09:52

tinytomlogan
I'll post the F@H stats here in a bit, but basically 6800 WUs were 13k on each core. All the details are in the video though. (Had to be careful about uploading too early on this one, but the vid will be up ASAP.)

24-03-2011, 09:15:06

Zeals
Quote:
Originally Posted by sources95 View Post

Just imagine that thing with a waterblock...
I doubt it will make much difference; heat isn't an issue with this card from the looks of it. If only Nvidia had given it an extra 6-pin to supplement the GPUs for better overclocking, or if AMD had a cooler as effective as Nvidia's.

24-03-2011, 09:20:38

Morbious Stone
Nvidia is clearly the loser here, and as for the heat and noise, aftermarket/vendor cards will fix that... and hey, if you have a couple of CPU fans lying around you could easily mount them to the stock 6990 MacGyver-style... I wonder why nobody's covering the EVGA dual-GPU GTX 460 2Win or whatnot.

24-03-2011, 09:27:57

Zeals
On a side note Tom is there any chance that you'll be using Crysis 2 as a new benchmarking game?

24-03-2011, 09:40:03

sources95
Why is it clocked so low?

24-03-2011, 09:42:11

Black
I'll stick with my gtx570 and will buy another one if needed.

I'm a bit disappointed by the new Nvidia card.

24-03-2011, 09:48:32

sources95
So essentially my 580s can eat this thing?

24-03-2011, 09:51:33

Zeals
Quote:
Originally Posted by sources95 View Post

Why is it clocked so low?
It's only running on two 8-pins and the PCIe slot, so the GPUs can't get enough juice to pump up the clock rates.

@ sources95

Your 580s will eat this as a light snack.
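For a rough sense of what "not enough juice" means in numbers, here's a minimal back-of-the-envelope sketch. The figures are assumptions rather than anything from the review: 75 W from the PCIe slot, 150 W per 8-pin connector, and roughly 244 W TDP for a single GTX 580.

# Rough back-of-the-envelope look at the GTX 590's power budget.
# Assumed figures (not from the review): the PCIe 2.0 slot supplies 75 W,
# each 8-pin PCIe connector supplies 150 W, and a single GTX 580 is rated
# at roughly 244 W TDP.

SLOT_W = 75
EIGHT_PIN_W = 150
GTX580_TDP_W = 244

budget = SLOT_W + 2 * EIGHT_PIN_W      # 375 W available with slot + 2x 8-pin
two_full_gpus = 2 * GTX580_TDP_W       # ~488 W if both GPUs ran at GTX 580 clocks

print(f"In-spec budget:          {budget} W")
print(f"Two GPUs at 580 clocks: ~{two_full_gpus} W")
print(f"Shortfall:              ~{two_full_gpus - budget} W")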

24-03-2011, 09:59:16

sources95
Wow, I didn't realise by how much. I reckon once other manufacturers such as MSI and Asus do a whole revamp of the board, they'll have it clocked at around the same as the 6990. Really strange that Nvidia would do this :/

24-03-2011, 10:00:21

wraithien
AMD might have issues with noise and power; however, I'm very surprised Nvidia haven't been able to beat them with this, given how well their other cards perform. One other thing I had to note, Tom: you mentioned the AMD 6990 was priced at £599, but I haven't seen one over £550. It's all the more disappointing, then, that Nvidia are charging more for a card which can't outperform its competitor.

24-03-2011, 10:02:18

Alpert_P
Water block's already confirmed for the 6990.

24-03-2011, 10:21:22

Ya93sin
Looking at reviews from other places on the internet, some have the 590 beating the 6990, while others have it vice-versa.

It does seem that the 6990 is better value for money, but I'm sure when other manufacturers bring out custom coolers and such that will make a difference to the heat and noise issue, making the 6990 an even better buy, depending on the price of course

And I thought £5 was the lowest note, 600 of them is £3k

24-03-2011, 10:36:35

Bungral
Hmmm I've owned a GTX295 and also a couple of 4870x2s but I think I'll be giving the dual GPU cards a miss this time around.

The heat of the 4870x2 literally heated up my room to uncomfortable levels. Can't be dealing with that again from a 6990 and the power just isn't there in the 590 judging from this review.

I'll read a few more later for comparison sake.

I remember getting the 295 in the £4**'s and the 4870x2 in the high £3**'s.

Prices are now obscene.

24-03-2011, 11:02:13

Jerome
Thanks Tom & OC3D staff

24-03-2011, 11:35:39

Andrew Moore
Had these 590s sat at 800/1600/4000 stable and issue-free, so either these are fantastic silicon or yours is a little, erm... lacking.

Either way: 6990 average OC headroom ~10% (830-910MHz ish), GTX590 OC headroom ~32% (607-801MHz). Result = horrendously quick, and I'm flabbergasted :|
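A quick sketch of where those headroom percentages come from, taking the clock ranges quoted above at face value:

# Overclocking headroom implied by the clock ranges quoted above (MHz).

def headroom_pct(stock_mhz, oc_mhz):
    """Overclock headroom as a percentage of the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"HD6990: {headroom_pct(830, 910):.1f}%")   # ~9.6%, i.e. the ~10% quoted
print(f"GTX590: {headroom_pct(607, 801):.1f}%")   # ~32%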

24-03-2011, 12:14:57

nepas
What driver were you using? As there appear to be some problems!

http://www.youtube.com/watch?v=sRo-1VFMcbc

"oh ****" @18 secs

Looks like the drivers supplied with the card are faulty; NV released some new drivers this morning (just for this card).

EDIT: Just noticed the GPU-Z screenshot; you were using the bad drivers. You're lucky it didn't blow on you. As well as that vid I posted, it's happened to other reviewers too (TPU for one).

24-03-2011, 12:47:54

dugdiamond
whoa - £600 @aria

for an extra £60 i could get THREE MSI GTX560Ti Twin Frozr II OC editions, and have them in tri-SLI

24-03-2011, 12:50:40

OEOTS
Not unless you're modding the PCB (at least if that would even be possible in this case).

The GTX560, like the 460, only supports two-way SLI.

24-03-2011, 12:51:46

Ya93sin
But Tri-SLI doesn't have the greatest scaling, so the performance wouldn't be so close.

24-03-2011, 12:58:54

dugdiamond
at £600, i just think its a bit steep

EDIT: i cannot justify paying that amount of money to play ONLY about a dozen "worth-playing" games that would benefit from such a card!

24-03-2011, 13:08:47

UkGouki
Quote:
Originally Posted by dugdiamond View Post

whoa - £600 @aria

for an extra £60 i could get THREE MSI GTX560Ti Twin Frozr II OC editions, and have them in tri-SLI
Don't you mean three GTX480s? They're the same price as 560s, but you'd need a monster PSU and a hell of a good cooling case, or waterblocks on them...

24-03-2011, 13:12:05

CueBall
I think Nvidia did listen to its users for once. The card may be beaten by a 6990 at very high resolutions; however, it runs a lot cooler. There isn't a vacuum cleaner whirring away in your PC. Secondly, if you look at all the different benchmarks out there, it seems Nvidia aimed at the 1920x1080 resolution. This is the most used resolution, and if you look closely the 590 beats the 6990 most of the time. So, well done Nvidia. Great card.

24-03-2011, 13:25:07

sleeper52
Tom,

Now that you've tested both GTX 590 and AMD HD6990, which one would you recommend in a WATERCOOLED setup?

24-03-2011, 13:43:15

Ya93sin
Quote:
Originally Posted by CueBall View Post

I think Nvidia did listen to its users for once. The card may be beaten by a 6990 at very high resolutions; however, it runs a lot cooler. There isn't a vacuum cleaner whirring away in your PC. Secondly, if you look at all the different benchmarks out there, it seems Nvidia aimed at the 1920x1080 resolution. This is the most used resolution, and if you look closely the 590 beats the 6990 most of the time. So, well done Nvidia. Great card.
It's a bit odd imo, if you are going to be spending £600 on a GPU you would think the average user of that GPU would have a res of at least 1900x1200

24-03-2011, 13:50:21

Icekube
Quote:
Originally Posted by dugdiamond View Post

whoa - £600 @aria

for an extra £60 i could get THREE MSI GTX560Ti Twin Frozr II OC editions, and have them in tri-SLI
It would be awesome if you could run more than 2 in sli, sadly there lies the limitation of the 560

At the end of the day it's down to how much money you are happy to part with against performance, a 590 is around £575 on Scan, so about £35-40 more than a hair dryer, sorry 6990, and £220ish less than a 580 sli.

In most games by the looks of it the 580 sli cleans house, with the 6990 performing better than the 590, but I personally could not live with a wailing banshee in my rig, and I'm not prepared to splash out £800ish on 2 x 580's, so I'm more than happy to pay a little extra to get a 590 over the 6990 for the substantially reduced noise and more refined drivers.

Having said that I'd love to see someone do a direct comparison with 2 x 570's against a 590 and 6990, as it does seem to be the best bang for £/$ setup from what I have seen.

24-03-2011, 13:59:58

Rastalovich
Quote:
Originally Posted by nepas View Post

What driver were you using? As there appear to be some problems!

"oh ****" @18 secs

Looks like the drivers supplied with the card are faulty; NV released some new drivers this morning (just for this card).

EDIT: Just noticed the GPU-Z screenshot; you were using the bad drivers. You're lucky it didn't blow on you. As well as that vid I posted, it's happened to other reviewers too (TPU for one).
I mentioned this the other day. To my knowledge there are both a BIOS and a driver revision, and in fairness Nvidia should have contacted reviewers about at least the drivers. The BIOS is less important in this case.

There's also been a new GPU-Z out this week that shows the specs of both dual cards that much better. I would hope the memory overclock in today's review is an oversight of GPU-Z.

You can "alternatively" power the 8-pin connectors to draw more power (if you care less about that sort of thing) and get 580-based clocks out of it. But unless you manage the cooling, I'd not suggest it for 24/7.

The price of these dual cards is fantastic, but to be fair, dual cards are by nature high priced.

I'll go back to it again, just with the 6990/xfire/6970/xfire OC3D reviews, some of the graphs are still showing significant drops in framerates for overclocked runs in benches of the games - which just can't be the case. Something is definitely not right.

Imo you can pick either of these dual cards. If you favour DX11 games I'd get the 6990 (well, not me, cos I like PhysX and quality), and if you play DX9/10 you can pick either the GTX590 or the 6990. Both of them are spanking performance for what they are, and if you complain about the price of both, you really ought not be considering them.

Fastest single gpu card for me everytime.

24-03-2011, 14:21:36

chaotic_russ
where is the video review ?

24-03-2011, 14:25:05

AmEpic
holy smoke, thats a beast of a card

24-03-2011, 14:26:48

CrissTM
Hehe, exactly. Where's the video?

And btw, I would love to see Bad Company 2 in the benchmarks.

24-03-2011, 14:31:19

VonBlade
Quote:
Originally Posted by sleeper52 View Post

Tom,

Now that you've tested both GTX 590 and AMD HD6990, which one would you recommend in a WATERCOOLED setup?
Well, the primary problem with the HD6990 is heat and noise. A quick look at the graphs will show that where there is a difference, the HD6990 whips the GTX590. So with watercooling solving the heat/noise issue, it's an easy pick for the HD6990.

But you'd still be better off going GTX570 SLI, assuming your budget won't stretch to GTX580s in SLI.

24-03-2011, 14:32:43

VonBlade
Quote:
Originally Posted by CrissTM View Post

And btw, I would love to see Bad Company 2 in the benchmarks.
We'll be rejigging our game benchmarks soon and BC2 is likely to make an appearance, although personally I can tell you the GTX590 will eat it for breakfast at any setting.

24-03-2011, 14:46:28

sleeper52
Quote:
Originally Posted by VonBlade View Post

Well, the primary problem with the HD6990 is heat and noise. A quick look at the graphs will show that where there is a difference, the HD6990 whips the GTX590. So with watercooling solving the heat/noise issue, it's an easy pick for the HD6990.

But you'd still be better off going GTX570 SLI, assuming your budget won't stretch to GTX580s in SLI.
In all honesty, I am considering GTX 570s in SLI. The problem with doing this is that I would have to buy two waterblocks, thereby increasing the cost and power consumption. If you consider that a 6990 performs the same, and perhaps better when overclocked, you would end up with the lower cost and better value.

24-03-2011, 14:56:52

CrissTM
Quote:
Originally Posted by VonBlade View Post

We'll be rejigging our game benchmarks soon and BC2 is likely to make an appearance, although personally I can tell you the GTX590 will eat it for breakfast at any setting.
As far as I can tell, Bad Company 2 does like AMD/ATI more, but my problem is AMD/ATI drivers.

My question is whether a single GTX570 (for now, SLI later) will be stronger than a GTX 295 (never mind the DX10/11)?!

I do play BC2 (too much) and I don't know if it's worth the jump.

24-03-2011, 15:06:52

Icekube
Can't believe for the 1st time TTL seems to be pretty much the last reviewer to put a 590 vid up

Quote:
Originally Posted by CrissTM View Post

As far as I can tell, Bad Company 2 does like AMD/ATI more, but my problem is AMD/ATI drivers.

My question is whether a single GTX570 (for now, SLI later) will be stronger than a GTX 295 (never mind the DX10/11)?!

I do play BC2 (too much) and I don't know if it's worth the jump.
I play bc2 quite a bit, at 1920 x 1080 on a 24" with everything at max, and I'm getting between 80-90 average fps with a max of around 110, if that's any help to you as far as single 570's go.

24-03-2011, 15:17:18

tinytomlogan
Quote:
Originally Posted by Andrew Moore View Post

Had these 590s sat at 800/1600/4000 stable and issue-free, so either these are fantastic silicon or yours is a little, erm... lacking.
You may well have, but if you actually test them the scores drop like a brick once you push further, because there's not enough juice to fuel them properly...

This card deffo wouldn't go any further and keep increasing the scores.

Quote:
Originally Posted by Icekube View Post

Can't believe for the 1st time TTL seems to be pretty much the last reviewer to put a 590 vid up
YouTube is being a PITFA today. That and I'm meant to be having some time off!

Game-wise VB is correct, and we will be bringing some new games in to test with; we just want to choose wisely.

We are waiting to test Crysis 2 and Shift 2 to see if they are demanding enough and not just gay console ports like MOH and COD.

@ NEPAS

We tested originally on the beta driver and then again on the new driver after it was released; I just forgot to GPU-Z the new drivers. Not bad considering the card was only here 36 hours too :S

24-03-2011, 15:28:56

Icekube
Quote:
Originally Posted by tinytomlogan View Post

YouTube is being a PITFA today. That and I'm meant to be having some time off!
And rightly so with the amount of work you and team put in, just looking forward to seeing it after teasing us all

24-03-2011, 15:31:16

Hay11Mac
Another great review, can't wait to see the vid (god i'm such a geek)

Enjoy your time off Tom, I know you will be living it up Saturday

24-03-2011, 15:33:46

tinytomlogan
I'm just taking a few days to do some bits and bobs for myself; it's actually the first time I've taken time off since joining OC3D officially. So it's the first time in 18 months and I still end up working!

24-03-2011, 15:36:44

DT-525
Crysis 2 is just a gay port, Tom. Just played it on the PC and there are no user settings for texture quality or AA; it's complete bulls**t. I guess we can put Crytek on the list of PC sellouts. Good game though. There's no DX11 or 64-bit built into the game as of yet; Crytek say they will bring out a patch in the next few weeks.

24-03-2011, 15:47:54

BamBam
I had hoped to see this just smash the 6990 and do it all cool and quiet. Now I'm not sure if I should get the GTX 590 or go with my original, better-performing although much more expensive, EVGA GTX 580 FTW HC in SLI; they look sexier than hell and single-slot is always great. Or I could buy two GTX 590s for the epeen and to see the envy on the faces of my programming class.

24-03-2011, 15:53:10

tinytomlogan
Quote:
Originally Posted by sleeper52 View Post

Tom,

Now that you've tested both GTX 590 and AMD HD6990, which one would you recommend in a WATERCOOLED setup?
Wait for the video, dude; it's covered in that, but tbh it's pretty much covered in this review as well.

24-03-2011, 15:56:00

murphy7801
Quote:
Originally Posted by tinytomlogan View Post

Wait for the video, dude; it's covered in that, but tbh it's pretty much covered in this review as well.
I know you stated the best bang for pound was SLI GTX 570s; could you possibly please do a review of that?

24-03-2011, 15:58:02

adamjamesroe
I'm really annoyed that the HD 6990 is better; I had really high hopes for the GTX 590. Hopefully there will be a non-reference one with more power connectors so the clocks can be ramped up a little more and push it closer to GTX 580 SLI.

24-03-2011, 15:58:51

adamjamesroe
Quote:
Originally Posted by murphy7801 View Post

I know you stated the best bang for pound was SLI GTX 570s; could you possibly please do a review of that?
That would be nice, before I invest in my second GTX 570.

24-03-2011, 16:03:32

sleeper52
Quote:
Originally Posted by tinytomlogan View Post

Wait for the video, dude; it's covered in that, but tbh it's pretty much covered in this review as well.
So it's the 6990 then. Thanks Tom.

Quote:
Originally Posted by adamjamesroe View Post

I'm really annoyed that the HD 6990 is better; I had really high hopes for the GTX 590. Hopefully there will be a non-reference one with more power connectors so the clocks can be ramped up a little more and push it closer to GTX 580 SLI.
Same here. I actually sold my 5970 hoping to jump on the GTX 590. It looks like I'm gonna have to stick with AMD still.

24-03-2011, 16:06:02

CueBall
Quote:
Originally Posted by Ya93sin View Post

It's a bit odd imo, if you are going to be spending £600 on a GPU you would think the average user of that GPU would have a res of at least 1900x1200
Well, depending on your point of view, of course. Not saying it's a brilliant card.

Just saying I can see what they aimed at. Scoring-wise at higher resolutions I think it's a bit disappointing.

I would have thought that with the GPU used from the 580 it would perform a bit better.

And spending 600 quid on a GPU is a bit silly anyway, for me at least.

All cards nowadays perform well at higher resolutions anyway.

For me, getting a card like the 590 or 6990 isn't about getting the highest performance but the WOOT factor, or the show-off factor. These cards are more for benchers, in my point of view anyway.

People like Tom, for instance, who love searching for the limit and fiddling around with it.

Doesn't mean, though, I like what Nvidia has done with it and also what they're aiming for.

So yeah, I say way to go, green team. Red team isn't doing badly either, but they should have paid more attention to sound and heat.

24-03-2011, 16:07:40

adamjamesroe
Quote:
Originally Posted by sleeper52 View Post

So it's the 6990 then. Thanks Tom.

Same here. I actually sold my 5970 hoping to jump on the GTX 590. It looks like I'm gonna have to stick with AMD still.
Well, I hope to get a second GTX 570, which in SLI will hopefully be faster than the GTX 590 and cheaper, but nothing beats a card like this. If you could turn back time and someone offered me GTX 570 SLI or a GTX 590, even though the 570 SLI would be better I think I would pick the GTX 590; nothing can beat it in my mind. Like the GTX 295, simply a work of art. Nothing better than one card with the performance of nearly two.

24-03-2011, 16:08:56

tinytomlogan
TBH with this sort of money to spend I have to say the sensible money is 2x GTX570's.

Dual waterblocks and some serious overclocking would be nerd p0rn

24-03-2011, 16:10:18

adamjamesroe
Quote:
Originally Posted by tinytomlogan View Post

TBH with this sort of money to spend I have to say the sensible money is 2x GTX570's.

Dual waterblocks and some serious overclocking would be nerd p0rn
Indeed it would.

24-03-2011, 16:14:42

sleeper52
Quote:
Originally Posted by tinytomlogan View Post

TBH with this sort of money to spend I have to say the sensible money is 2x GTX570's.

Dual waterblocks and some serious overclocking would be nerd p0rn
The problem with this is that the cost of two waterblocks would bring the total to around $1000 for two GTX 570s in SLI, whereas a single GTX 590 or HD6990 with a waterblock would cost $800 for practically the same performance.

24-03-2011, 16:17:37

adamjamesroe
Quote:
Originally Posted by sleeper52 View Post

The problem with this is that the cost of two waterblocks would bring the total to around $1000 for two GTX 570s in SLI, whereas a single GTX 590 or HD6990 with a waterblock would cost $800 for practically the same performance.
Yeah but SLI looks cool.

24-03-2011, 16:21:23

sleeper52
Quote:
Originally Posted by adamjamesroe View Post

Yeah but SLI looks cool.
haha point taken. But you could argue you can add another 6990 for CrossfireX later too lolz.

24-03-2011, 16:23:06

Mustang

24-03-2011, 16:23:48

adamjamesroe
Quote:
Originally Posted by sleeper52 View Post

haha point taken. But you could argue you can add another 6990 for CrossfireX later too lolz.
Yes, but then instead of having a hoover running all the time, you will have two hoovers running all the time. But yes, HD 6990 CF would look quite epic.

24-03-2011, 16:24:14

adamjamesroe
Quote:
Originally Posted by Mustang View Post
lol

24-03-2011, 16:28:23

piotrekhc
Tbh I was expecting it to be much faster than the 6990; well, at least it costs less than the AMD one, which is good.

24-03-2011, 16:34:53

sleeper52
Quote:
Originally Posted by adamjamesroe View Post

Yes, but then instead of having a hoover running all the time, you will have two hoovers running all the time. But yes, HD 6990 CF would look quite epic.
But on watercooling there wouldn't be hoovers, though.

24-03-2011, 16:59:35

adamjamesroe
Quote:
Originally Posted by sleeper52 View Post

But on watercooling there wouldn't be hoovers, though.
I know, but you would need some massive rads to cool them, and it would cost a fortune. Tbh I don't really like waterblocks; the only ones I like are the EVGA full-cover ones like on the GTX 580 and GTX 590 FTW.

24-03-2011, 17:08:22

VonBlade
Quote:
Originally Posted by sleeper52 View Post

The problem with this is that the cost of two waterblocks would bring the total to around $1000 for two GTX 570s in SLI, whereas a single GTX 590 or HD6990 with a waterblock would cost $800 for practically the same performance.
I'm sure you've tested the GTX570 in SLI against the HD6990 to make such outlandish claims.

GTX570 can be overclocked to around GTX580 performance. So two would be GTX580 SLI speeds.

Now if you want to claim that a HD6990 or the GTX590 is equal to a GTX580 SLI setup, good luck with that

24-03-2011, 17:12:03

adamjamesroe
TBH it also annoys me that you can get an HD 6990 for £540 and a GTX 590 for £580+, even though the GTX 590 on paper is worse.

24-03-2011, 17:17:20

Tortuga
Quote:
Originally Posted by adamjamesroe View Post

TBH it also annoys me that you can get an HD 6990 for £540 and a GTX 590 for £580+, even though the GTX 590 on paper is worse.
On the egg here in the States, the 590 looks about 10 dollars cheaper than the 6990, so that's not very significant. Like it says in the review, it might be worth it for the reduction in noise.

*EDIT: Well, the cheapest 590 and the cheapest 6990.

24-03-2011, 17:40:49

warfox101
Before seeing the 590 benchmarks, the 6990 did not look so good with the heat and noise issue. But now I have to say I think I'll grab me two 6990s and waterblock them. Thanks for the review, Tom.

24-03-2011, 17:51:31

Rastalovich
Quote:
Originally Posted by tinytomlogan View Post

TBH with this sort of money to spend I have to say the sensible money is 2x GTX570's.

Dual waterblocks and some serious overclocking would be nerd p0rn
Quote:
Originally Posted by VonBlade View Post

I'm sure you've tested the GTX570 in SLI against the HD6990 to make such outlandish claims.

GTX570 can be overclocked to around GTX580 performance. So two would be GTX580 SLI speeds.

Now if you want to claim that a HD6990 or the GTX590 is equal to a GTX580 SLI setup, good luck with that
Watercooling indeed.

.. which you do realize is the equivalent of suggesting 2x480 in SLI, and it would have more memory and cost less money. The 480 is now £196 and the 570 around £260 (@ Scan as of right now, for example)

Watercooled or 3rd party cooled, I honestly never thought I'd see this suggestion after all the palaver.

24-03-2011, 17:57:52

dugdiamond
they have only been on sale for 8hrs in the UK. no-one has had the opportunity to really see what this baby can do, single or SLI'd.

i will sit on the fence and wait a few weeks, before i decide which is the best step forward for me.

by then the prices may have dropped a bit too

24-03-2011, 18:07:57

Dark NighT
Quote:
Originally Posted by dugdiamond View Post

they have only been on sale for 8hrs in the UK. no-one has had the opportunity to really see what this baby can do, single or SLI'd.

i will sit on the fence and wait a few weeks, before i decide which is the best step forward for me.

by then the prices may have dropped a bit too
Smart!

Then again, first-day purchases of such hardware are a bit stupid in my opinion.

I am looking to replace my two GTX 280 cards with either two 560 Tis or, if the price goes down, two 570s. However, the 590 is looking very interesting, and given time to let the drivers mature, I might pick one up for the sole reason that it runs quiet and gives my audio card some breathing room, as it's right on top of one 280 card now and does get a bit hot.

24-03-2011, 18:46:28

sources95
Well that settles it time to buy waterblocks for the 580s

24-03-2011, 18:46:37

sleeper52
Quote:
Originally Posted by VonBlade View Post

I'm sure you've tested the GTX570 in SLI against the HD6990 to make such outlandish claims.

GTX570 can be overclocked to around GTX580 performance. So two would be GTX580 SLI speeds.

Now if you want to claim that a HD6990 or the GTX590 is equal to a GTX580 SLI setup, good luck with that
Again, cost. Watercooled GTX 580s in SLI would cost at least $1200, and watercooled GTX 570 SLI is $1000. Not that I'm claiming the 6990 performs better than GTX 580s in SLI, because I never claimed that. Regarding GTX 570s in SLI, the benchmarks do suggest they perform similarly to the HD6990, albeit at stock. Overclocked, I don't know. Please stop suggesting that I make 'outlandish claims', because I never did. You are putting words in my mouth, sir.

24-03-2011, 19:17:25

Game Over
Quote:
Originally Posted by adamjamesroe View Post

I'm really annoyed that the HD 6990 is better; I had really high hopes for the GTX 590. Hopefully there will be a non-reference one with more power connectors so the clocks can be ramped up a little more and push it closer to GTX 580 SLI.
I bet our Snow White competition first-place winner would be thrilled. That baby is rocking a 6970, which chip-to-chip looks to be superior to the 590, or at least extremely close to even. I hope people at least look at reviews and benchmarks before lining up to buy the new card.

24-03-2011, 21:43:39

chaotic_russ
I expect the GTX595 to come soon: a fixed GTX590 with a few improvements to the VRMs etc.

24-03-2011, 23:59:34

unknownuser200
Well, I was watching your review among very many others; I liked how the card was quiet and had good performance.

HOWEVER, here's my question: I currently run 2560x1600, and I was curious if I'm going to run into a VRAM issue, and whether in my case I'm better off with the 6990?

25-03-2011, 02:32:02

Game Over
Also looking forward to maybe seeing these in SLI. I wonder if they will do any better on the fluctuations between high and low frame rates than the 6990, which was all over the place in the Crysis test.

25-03-2011, 02:45:36

Zeals
Quote:
Originally Posted by DT-525 View Post

Crysis 2 is just a gay port, Tom. Just played it on the PC and there are no user settings for texture quality or AA; it's complete bulls**t. I guess we can put Crytek on the list of PC sellouts. Good game though. There's no DX11 or 64-bit built into the game as of yet; Crytek say they will bring out a patch in the next few weeks.
Yeah, atm the game is an obvious console port; it doesn't even scale properly with multiple GPUs. Hopefully with the patches it will become more of a PC game and a less obvious port. A Crysis that's playable on max settings isn't Crysis.

25-03-2011, 07:34:18

Rastalovich
Quote:
Originally Posted by Zeals View Post

Yeah, atm the game is an obvious console port; it doesn't even scale properly with multiple GPUs. Hopefully with the patches it will become more of a PC game and a less obvious port. A Crysis that's playable on max settings isn't Crysis.
I've played the 64bit beta bins. Why they never hit the market, I don't know. They were iffy, running the cpu @ 100% across the cores, but it was beta and there are test consoles running at the same time.

Apparently the "hooky" download that was leaked of the game also had 64bit bins in it.

25-03-2011, 07:43:55

hdaviator
Quote:
Originally Posted by Zeals View Post

Yeah, atm the game is an obvious console port; it doesn't even scale properly with multiple GPUs. Hopefully with the patches it will become more of a PC game and a less obvious port. A Crysis that's playable on max settings isn't Crysis.
Let's face it, if Crysis hadn't been such a huge performance-hungry beast, nobody would be talking about it three years later. It's a good game but nothing to write home about. I think people put the game on a pedestal.

25-03-2011, 08:24:21

Karlos_the_n00b
Just wanted to share EVGA's offerings. I love their waterblock cards, although I could never afford one.

http://www.techpowerup.com/142842/EVGA-Storms-Forth-GTX-590-Launch-with-Four-Classified-Series-Products.html

Tom, any chance of you doing a quick video to show how this card runs three monitors? Would really like to know if it's as good as Eyefinity.

25-03-2011, 10:32:30

Jerome
I think the 590 is geared to use the full PCI Express 2.0 x16 bandwidth at full load and at triple-monitor resolutions, so it doesn't always need to be faster. Maybe it'll struggle with 3D Vision, and enthusiast overclockers, gamers etc. would go for single-GPU SLI instead for those reasons and more, since their motherboards are designed for it. And as quad SLI is a rare performer in games and these are GeForce cards, one 590 is ideal for the media/entertainment computer and for motherboards with one PCI Express 2.0 x16 slot for graphics. Getting a proper enthusiast rig needs the motherboard to begin with.

I'll be upgrading when I've lived a lot of overclocking life on my cards, I think. The 590 might need PCIe frequency OCs to get more from it; I read the review and they didn't get a massive OC on the 590.

25-03-2011, 12:45:37

Bungral
Glad I went for the 570 now... Have good choices on a future SLI setup that should still beat both dual chip cards.

25-03-2011, 16:48:47

Zeals
Quote:
Originally Posted by Jerome View Post

I think the 590 is geared to use the full PCI Express 2.0 x16 bandwidth at full load and at triple-monitor resolutions, so it doesn't always need to be faster. Maybe it'll struggle with 3D Vision, and enthusiast overclockers, gamers etc. would go for single-GPU SLI instead for those reasons and more, since their motherboards are designed for it. And as quad SLI is a rare performer in games and these are GeForce cards, one 590 is ideal for the media/entertainment computer and for motherboards with one PCI Express 2.0 x16 slot for graphics. Getting a proper enthusiast rig needs the motherboard to begin with.

I'll be upgrading when I've lived a lot of overclocking life on my cards, I think. The 590 might need PCIe frequency OCs to get more from it; I read the review and they didn't get a massive OC on the 590.
You're meant to use x16 bandwidth when using a dual-GPU card. The problem with this card when it comes to overclocking is its inability to get more power; if Nvidia had decided to add an extra power connector, or perhaps if a non-reference card has an added power connector, then you'll be able to overclock it to more respectable speeds. Finally, I doubt people will buy an Nvidia card for triple-monitor setups; AMD Eyefinity is more mature and scales better, and at high resolutions memory makes a difference and the 6990 has an extra gig of it.

25-03-2011, 18:50:34

Glax
To be quite honest, I was expecting to be blown out of the water with heaps of reasons why it would be superior to the 6990. But with TTL's unbiased video and third-party criticism, it's really tough (for me) to pick a clear victor. Like everyone else I'm vacillating betwixt the two companies and waiting for non-reference models with aftermarket coolers and whatnot (particularly MSI's Twin Frozr 6990 and 590).

Thanks for another video Tom! Hope the weather is nicer in England, starting to get swamplike down here again in Georgia

-Gentlemen

25-03-2011, 22:21:42

Jerome
Mm, a dual 480 would be bottlenecked slightly at triple-monitor HD resolutions, so the 590 is, I think, geared about right for today's PCI Express standard. It suits media/entertainment PCs for the all-round PC user with three monitors and no spare PCI Express slots for single-GPU SLI, and helps with power consumption, noise, heat etc. There's a lot of users like that, enjoying Blu-ray videos, editing, gaming etc. Film makers might use this card. LAN party gamers want small chassis; this card can do for that. GeForce

26-03-2011, 05:28:36

silenthill
[/size][size="3"]Tbh all you are really buying is a convenient form of GTX570 SLI. If you want a all-in-one silent GTX 570 SLI package.I believe bothNvidia and AMD have delivered nothing useful for us SLI users because I got41000 P score in vantage with my two 480s with a bit of overclocking.

26-03-2011, 08:13:01

Fez
Could do with a couple of these in my gaming rig haha

28-03-2011, 04:31:38

oneseraph
Quote:
Originally Posted by sources95 View Post

Why is it clocked so low?
Nvidia has a serious power problem with the 590: if they increase the clocks, the power draw goes above PCI Express limits. So overclock at your own risk. Just remember that in this case you are risking your motherboard as well as your graphics card.

28-03-2011, 04:44:58

oneseraph
Quote:
Originally Posted by VonBlade View Post

Well the primary problem with the HD6990 is heat and noise. A quick look at the graphs will show that the HD6990 whips the GTX590 if there is a difference. So Watercooling solving the heat/noise issue it's an easy pick for the HD6990.

But you'd still be better off going GTX570 SLI, assuming your budget wont stretch to GTX580s in SLI
Well, no, actually. Two 6950s in CrossFire will outperform two 570s in SLI because the 69xx cards scale better in dual- and triple-card configurations. The other advantage of the 6950 is that they are substantially less expensive. If you are worried about noise with these cards, get them with an aftermarket cooler; they are still cheaper than a pair of 570s. And if you decide to BIOS-mod them, they will be faster than a pair of 580s.

28-03-2011, 05:55:11

Rastalovich
Quote:
Originally Posted by oneseraph View Post

Nvidia has a serious power problem with the 590: if they increase the clocks, the power draw goes above PCI Express limits. So overclock at your own risk. Just remember that in this case you are risking your motherboard as well as your graphics card.
Nope. Every card with an additional power source plugged into the pcb in addition to the pcie slot feed, is intended to go over the pcie limits.

Theres nothing stopping any manufacturer putting 4x 8pin pcie power connectors on a card. There are no limits. You can have a 700W++++ card if you want.

The 590 """could""" be overclocked to the 580 levels, and probably beyond, not by conventional methods - AND - if the circuitary around them is up to it. Which will depend on the build.

Time will tell, just keep an eye on those overclocking records.

29-03-2011, 04:55:29

oneseraph
Quote:
Originally Posted by Rastalovich View Post

Nope. Every card with an additional power source plugged into the pcb in addition to the pcie slot feed, is intended to go over the pcie limits.

Theres nothing stopping any manufacturer putting 4x 8pin pcie power connectors on a card. There are no limits. You can have a 700W++++ card if you want.

The 590 """could""" be overclocked to the 580 levels, and probably beyond, not by conventional methods - AND - if the circuitary around them is up to it. Which will depend on the build.

Time will tell, just keep an eye on those overclocking records.
Yes, you are correct; high-end graphics cards usually have additional power inputs, which can be 6-pin, 8-pin or both. That said, the following is an excerpt from the PCIe 2.0 spec:

A11: PCI-SIG has developed a new specification to deliver increased power to the graphics card in the system. This new specification is an effort to extend the existing 150watt power supply for high-end graphics devices to 225/300watts. The PCI-SIG has developed some boundary conditions (e.g. chassis thermal, acoustics, air flow, mechanical, etc.) as requirements to address the delivery of additional power to high-end graphics cards through a modified connector. A new 2x4 pin connector supplies additional power in the 225/300w specification. These changes will deliver the additional power needed by high-end GPUs. The new PCI-SIG specification was completed in 2007.

What that means is that, at maximum, with the two 8-pin connectors plus the 75 watts from the PCIe slot, there is a maximum of 375 watts available under PCIe 2.0. Nvidia says the GeForce GTX 590 is a 365 W board. By the way, there is a reason Nvidia did not add a third 8-pin connector, or a fourth for that matter: they will not build a card outside of the PCIe 2.0 specification. If they did, they would have to warn anyone who installed such a card that they are voiding the warranty of their motherboard. If it were as simple as just adding power ad hoc, Nvidia probably would have done it so they could run the 590 at higher clock rates.

Sure, a manufacturer could put as many power connectors on a board as they want. Doing so would put the product outside the PCIe 2.0 specification, so no one would be stupid enough to install the thing. Which is why none of them are doing it; neither Nvidia nor AMD have engineers so moronic that they would design a board out of spec.

I hope that cleared things up for you.
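As a minimal sketch of the arithmetic in that excerpt, assuming the usual per-source figures (75 W from the slot, 150 W per 8-pin); the third connector in the example is purely hypothetical, included only to show why it would push a card past the ceiling being described:

# PCIe 2.0 power budget for a single card, built from its power sources.
# Figures assumed: 75 W from the slot, 150 W per 8-pin PCIe connector.

def board_budget_w(eight_pin_connectors, slot_w=75, per_eight_pin_w=150):
    """Total in-spec power available to one card, in watts."""
    return slot_w + per_eight_pin_w * eight_pin_connectors

print(board_budget_w(2))  # 375 W: slot + 2x 8-pin (GTX 590 / HD 6990 layout)
print(board_budget_w(3))  # 525 W: what a hypothetical third 8-pin would imply
print(375 - 365)          # ~10 W between Nvidia's quoted 365 W and the 375 W cap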

29-03-2011, 05:09:12

thewire
Quote:
Originally Posted by oneseraph View Post

Well, no, actually. Two 6950s in CrossFire will outperform two 570s in SLI because the 69xx cards scale better in dual- and triple-card configurations. The other advantage of the 6950 is that they are substantially less expensive. If you are worried about noise with these cards, get them with an aftermarket cooler; they are still cheaper than a pair of 570s. And if you decide to BIOS-mod them, they will be faster than a pair of 580s.
You make a fine point.

29-03-2011, 05:13:17

thewire
Quote:
Originally Posted by oneseraph View Post

Yes, you are correct; high-end graphics cards usually have additional power inputs, which can be 6-pin, 8-pin or both. That said, the following is an excerpt from the PCIe 2.0 spec:

A11: PCI-SIG has developed a new specification to deliver increased power to the graphics card in the system. This new specification is an effort to extend the existing 150watt power supply for high-end graphics devices to 225/300watts. The PCI-SIG has developed some boundary conditions (e.g. chassis thermal, acoustics, air flow, mechanical, etc.) as requirements to address the delivery of additional power to high-end graphics cards through a modified connector. A new 2x4 pin connector supplies additional power in the 225/300w specification. These changes will deliver the additional power needed by high-end GPUs. The new PCI-SIG specification was completed in 2007.

What that means is that, at maximum, with the two 8-pin connectors plus the 75 watts from the PCIe slot, there is a maximum of 375 watts available under PCIe 2.0. Nvidia says the GeForce GTX 590 is a 365 W board. By the way, there is a reason Nvidia did not add a third 8-pin connector, or a fourth for that matter: they will not build a card outside of the PCIe 2.0 specification. If they did, they would have to warn anyone who installed such a card that they are voiding the warranty of their motherboard. If it were as simple as just adding power ad hoc, Nvidia probably would have done it so they could run the 590 at higher clock rates.

Sure, a manufacturer could put as many power connectors on a board as they want. Doing so would put the product outside the PCIe 2.0 specification, so no one would be stupid enough to install the thing. Which is why none of them are doing it; neither Nvidia nor AMD have engineers so moronic that they would design a board out of spec.

I hope that cleared things up for you.
Yeah, you're right about the specification. It's a shame those GF110 chips are so power-hungry; Nvidia really had to tune the 590 down to make it work. I was hoping for something better from them. Epic fail!

29-03-2011, 06:47:27

Rastalovich
Thing you need to bear in mind also is that that specification is years old (2007), where the thought of anything coming close to 365w was crazy talk. Even tho there are/were professional cards that make a mockery of that.

Another point is that the PCI-SIG would not specify that the slot AND external power would put forward a limit to the power considerations regarding it. They would ONLY concentrate on the slot itself. All the quoted paper suggest is that "with the addition of the suggested/new" 8 and 6 pin power connectors (i.e. the quoted NEW 2x4pin) - this is at a point in time (2007) when we moved from 4 pin molex as very much a standard to 6 and 8 pin pcie.

Time has moved on, and besides manufacturers of mobos moving onto pcie 2.0a/b/c/etc addendums to the original PCI-SIG submission for the original 2.0, there's nothing preventing the psu manufacturers suggesting a 10x pin pcie power connector. Or to hell with it, here's a 20pin.

Reading the PCI-SIG on 3.0 last week, there is no mention about wattage boundries, only the inference that "they can supply more power" whilst at the same time "will be more efficient" - which can be read a number of ways. Plenty of bandwidth speech.

In effect there is no limit put forward by the 2.0 specification, except for the slot itself and suggestions of what can be added to it, taking into account the technology available at the time of writing.

29-03-2011, 10:38:52

thewire
Quote:
Originally Posted by Rastalovich View Post

Thing you need to bear in mind also is that that specification is years old (2007), where the thought of anything coming close to 365w was crazy talk. Even tho there are/were professional cards that make a mockery of that.

Another point is that the PCI-SIG would not specify that the slot AND external power would put forward a limit to the power considerations regarding it. They would ONLY concentrate on the slot itself. All the quoted paper suggest is that "with the addition of the suggested/new" 8 and 6 pin power connectors (i.e. the quoted NEW 2x4pin) - this is at a point in time (2007) when we moved from 4 pin molex as very much a standard to 6 and 8 pin pcie.

Time has moved on, and besides manufacturers of mobos moving onto pcie 2.0a/b/c/etc addendums to the original PCI-SIG submission for the original 2.0, there's nothing preventing the psu manufacturers suggesting a 10x pin pcie power connector. Or to hell with it, here's a 20pin.

Reading the PCI-SIG on 3.0 last week, there is no mention about wattage boundries, only the inference that "they can supply more power" whilst at the same time "will be more efficient" - which can be read a number of ways. Plenty of bandwidth speech.

In effect there is no limit put forward by the 2.0 specification, except for the slot itself and suggestions of what can be added to it, taking into account the technology available at the time of writing.
The PCIE 2.0 spec sets the total power standard for High performance PCIE cards. The specification is very clear about the overall power and thermal limit's regardless of power source. It is true that the PCIE 3.0 standard "may" increase some of those power/thermal limits, however the PCIE 3.0 standard is not finalized and is not available on any motherboard available to the public.

29-03-2011, 10:59:21

Rastalovich
Quote:
Originally Posted by thewire View Post

The PCIE 2.0 spec sets the total power standard for High performance PCIE cards. The specification is very clear about the overall power and thermal limit's regardless of power source. It is true that the PCIE 3.0 standard "may" increase some of those power/thermal limits, however the PCIE 3.0 standard is not finalized and is not available on any motherboard available to the public.
How bizarre, cos you can look at the 150w specification here : http://www.pcisig.com/specifications/pciexpress/graphics/

.. and you can search for anything in addition (if you have membership) and there's nothing to be found. Outside of just 150w (and suggestions on how to achieve up to 365w). It used to have 'suggestions' of how to reach 300w, which are now deleted (or struck through as is the method).

Also you can download the 3.0 base spec plus the to-come 3.1 spec.

29-03-2011, 12:20:14

oneseraph
Quote:
Originally Posted by Rastalovich View Post

How bizarre, cos you can look at the 150w specification here : http://www.pcisig.com/specifications/pciexpress/graphics/

.. and you can search for anything in addition (if you have membership) and there's nothing to be found. Outside of just 150w (and suggestions on how to achieve up to 365w). It used to have 'suggestions' of how to reach 300w, which are now deleted (or struck through as is the method).

Also you can download the 3.0 base spec plus the to-come 3.1 spec.
That is 150 watts per 8-pin connector: 2x 8-pin = 300 watts, plus 75 watts from the PCIe 2.0 slot = 375 watts. Yes, the PCIe 2.0 spec includes up to two 8-pin connectors. See, 375 watts, as I suggested. Though it wasn't really a suggestion; it is the PCIe 2.0 production specification, not to be confused with the PCIe 2.0 white paper, as they are not the same thing.

The PCIe 3.0 spec was released to the PCI-SIG partners on November 18, 2010. PCI-SIG expects the PCIe 3.0 specifications to undergo rigorous technical vetting and validation before being released to the public. This process, which was followed in the development of prior generations of the PCIe Base and various form factor specifications, includes the corroboration of the final electrical parameters with data derived from test silicon and other simulations conducted by multiple members of the PCI-SIG.

The PCIe 3.0 final production spec is likely to change as the many PCI-SIG stakeholders produce functioning silicon from the current spec. As a result, neither Intel nor AMD plan on including PCIe 3.0 on their current chipsets (Sandy Bridge, Bulldozer). Both companies have "suggested" that they don't intend to integrate PCIe 3.0 until late 2012 or 2013. This is early speculation from both companies, so those estimates could be substantially delayed.

Now on to the matter of the physics. The main reason there is a 375 watt per slot limit in the PCIE 2.0 spec is because when you put that much power in you need to get that much heat out. Given the space constraints of the form factor the PCI-SIG partners agreed that 375 watts as an upper thermal and electrical limit per slot would be sufficient. The limit can be doubled simply by using 2xPCIE 2.0 slots (crossfire, SLI).

It is unlikely that the PCI-SIG partners will increase the 375 watt limit in the PCIE 3.0 production spec for two main reasons. One the increased production cost is prohibitive and two as the pitch of GPU silicon is reduced the number of transistors that can be included will increase while producing far less heat. That is to say that future GPU's are likely to use less power and produce less heat while improving performance. There simply isn't any need to increase the thermal or electrical profiles.

I have two questions for you Rastalovich

If Nvidia could have simply added another power connector why didn't they?

Is it because they are stupid or because they are smart?

29-03-2011, 12:30:03

Rastalovich
They would go along with the PCI-SIG suggestion of what they've ratified in conjunction with what the ATX x.x standard put forward as a method of power supply. i.e. "we've made an 8pin pcie connector" - and PCI-SIG adjust their documents accordingly once it's passed through the ecn.

375w wouldn't be put forward as a limit due to heat disappation within a pc case as they know full well you can put 4x card in xfire/sli within a said case. 8x if you could the productivity servers you can install parallel gpu setups in.

To insist 375w was max and to allow the further addition of pcie x16 electical slots would be silly, don't ya think

EDIT: I have a feeling that maybe you're not seeing my view of how the system works, so I've put together a "brief" explanation in as best laymen's terms as I can:

In the model we're looking at, there are 3 prominent bodies:

ATX

PCI

Graphic card manufacturers

ATX come out with the standards of which power supply manufacturers are suggested to abide by when producing psus for the industry.

PCI have standards that apply to the use of busses, in the main, that are most commonly looked at as slots (even though they can be integrated also).

Graphic card manufacturers obviously produce the cards in our little model that display stuff - basically. AMD, Intel, nVidia, Silicone, Matrox and a few more.

Aside from these 3 there are many groups, with their own standards to which the above three bear in mind when putting forward their own studies/papers/standards, which range from safety people, environmental people, electrical, motherboard and other component people, the list does go on quite a bit.

Between all these groups there is a whole load of interaction, co-operation and studies. As an example, a graphic card manufacturer will come forward wanting to make an oem card that oems can use in their mass produced pcs aimed at business and the public. The oem has told them, as they usually do -very strongly- that they don't want ANY external power connections to this card, but it has to be more powerful than the present integrated/embedded selection. In this case, the gfx people can think of power and look directly at what PCI have put forward. They'll comply with their papers on what they've had motherboard manufacturers in turn comply with.

As time goes by, the consumer market gets more demanding. When PCI came out with their new standard for their gfx slot, they gave everyone the boundries at which the power would/could be used. As the gfx cards advance, their manufacturer saw the easy option of adding an additional molex cable to the side of their cards pcb to go beyond what PCI had stated would be available. Everyone spoke, and PCI added their errata/addendum to their previous paper on power use. It now includes "to achieve the power required for blah blah, a single molex is used" and so on.

Now the PCI's paper will include this addition. The bar has been raised as far as the gfxcard people are concerned, and time continues to move on, advances are made on this new power level.

Oems (HP, Dell, Acer, etc) btw are still insisting on NO extra pcb plugs.

The gfx people have reached a new era in their rnd, they need to surpass this poxy molex supply. They talk with the ATX people, who come up with a new type of connector (the 6 pin pcie for example). They produce their new paper ATX x.x and in turn PCI will catch wind of this - run a bunch of tests, do some studies, tell everyone they're happy, and bring out their new errata/addendum to the existing PCI standard.

Now the PCI's paper will include yet another raising of the power level supporting the *new* 6 pin pcie connector.

Repeat for the use of 2x 6pin, 8pin and 6+2pin, 2x 8pin and so forth.

Theoretically, the gfxcard people and the ATX people could be in discussion about a new 10 pin or 6+2+2 pcie connector. Each of the people will talk to each other, tests will be done, as per usual, regulations and studies will be re-issued with a new proposed power level. The ATX people say they'll bring out psus with 1x 8+2+2 connectors for the lower end of the market and 2x for ... possible dualing of these newer cards. Bringing a possible new power threshold that 2x 10pin connectors on a single gfxcard can handle. (purely theory, I can't see this happening with the proposed new die shrinks also - but who knows - it is possible)

As each of these groups talk with each other, conduct their own internal test, bars are continuously raised. A quoted paper stretching back to 2007 regarding the power levels for pcie use can only suggest about what is currently available. It, at that time, has little idea that 2x 8pin may become that popular.

One thing is for sure, stress to pcbs due to the plugging and unplugging, pull and such like of additional power sources is not favored. Which is why alot of oems dislike them. One of the defenses against a 2x 10pin power arrangement is that it emulates the mobo power connectors which would come too close to stressing the rear or top of the card. But ingenious inventions could work around it somehow. 3x 8pin is obviously suggesting a similar cable to what mobos have now. Sticking those at the back of an 11inch pcb is not wanted I don't think.

29-03-2011, 17:47:59

Jerome
375 watts, is that the max 2x 8-pin + mobo power can give? Because the AMD 6990 can draw 450 watts with the faster BIOS, according to AMD's website a few days ago.

8-pin = 150 watts

mobo PCI Express slot = 75 watts

I think.

29-03-2011, 18:37:07

oneseraph
Quote:
Originally Posted by Rastalovich View Post

They would go along with the PCI-SIG suggestion of what they've ratified in conjunction with what the ATX x.x standard put forward as a method of power supply. i.e. "we've made an 8pin pcie connector" - and PCI-SIG adjust their documents accordingly once it's passed through the ecn.
If by "suggestion" you are referring to industry excepted standards then I agree. Trade organizations create standards, IP companies design to those standards, manufacturers build to those standards. The result is a larger and more stable market for the consumer which at the end of the day is what it's all about.

Quote:
Originally Posted by Rastalovich View Post

375w wouldn't be put forward as a limit due to heat disappation within a pc case as they know full well you can put 4x card in xfire/sli within a said case. 8x if you could the productivity servers you can install parallel gpu setups in.
You're right, 375 watts was not put forward as a total thermal limit within a PC case. 375 watts is the electrical/thermal limit of a single PCIe 2.0 slot. PCIe does not restrict the total number of potential PCIe 2.0 slots. ATX, however, seems to think that seven expansion slots are enough, so for the most part seven PCIe slots are the most you can get on a standard ATX motherboard.

Quote:
Originally Posted by Rastalovich View Post

To insist 375w was max and to allow the further addition of pcie x16 electical slots would be silly, don't ya think
No, because by putting that load on another PCB in another PCIE 2.0 slot the surface area for thermal dissipation has at least doubled. What is silly is suggesting putting that same thermal load "in this case 750 watts" on a single PCIE 2.0 card. Thank you for making my point for me.

Quote:
Originally Posted by Rastalovich View Post

EDIT: I have a feeling that maybe you're not seeing my view of how the system works, so I've put together a "brief" explanation in as best laymen's terms as I can:
That's true; best I can tell, you think industry standards are just suggestions because there is no enforcement body "other than the marketplace". What you seem to be saying is that in theory a graphics chip manufacturer could design and build a single 750-watt graphics board. OK, sure, in theory; my point is they won't, because of those pesky industry standards. That, and putting that much heat in that small a form factor without an extraordinary cooling solution is a good way to start a fire.

So this whole back and forth started because I said:



Nvidia has a serious power problem with the 590: if they increase the clocks, the power draw goes above PCI Express limits. So overclock at your own risk. Just remember that in this case you are risking your motherboard as well as your graphics card.





and you replied



Nope. Every card with an additional power source plugged into the pcb in addition to the pcie slot feed, is intended to go over the pcie limits.

Theres nothing stopping any manufacturer putting 4x 8pin pcie power connectors on a card. There are no limits. You can have a 700W++++ card if you want.

The 590 """could""" be overclocked to the 580 levels, and probably beyond, not by conventional methods - AND - if the circuitary around them is up to it. Which will depend on the build.

Time will tell, just keep an eye on those overclocking records.





Now the original question that I replied to was



"why is the Nvidia 590 clocked so low"





I stand by my answer. Nvidia says that the 590 at load draws 365 watts. Nvidia says that the clock rates they set for the reference 590 are to ensure compliance with PCIE electrical/thermal standards. The fact that Nvidia did not characterize this as a problem doesn't make it any less the case. The fact is, if Nvidia could have set the reference clocks higher they would have. Nvidia would love to claim the fastest single graphics card title; as it stands AMD's 6990 holds that title, costs between $75 and $100 less and uses less power. The one key drawback of the 6990 reference design is noise, so cheers to Nvidia for making the 590 quiet. That said, the OEMs already have aftermarket cooling in the pipeline for the 6990, so soon it will be very quiet as well, and you can bet it won't carry a $75 premium.

That said, a pair of 6950s in crossfire will outperform both the 590 and the 6990. Two 6950s in crossfire will outperform the more powerful 570s in SLI because the 69xx cards scale better in dual and triple card configurations. There are very, very quiet versions of the 6950 available from several OEMs, and the 6950 is a lot less expensive than any of the other options above. So I say if you need extreme performance and you're smart (like a good value), get a pair of 6950s and call it a day.

29-03-2011, 18:46:07

oneseraph
Quote:
Originally Posted by Jerome View Post

Is 375 watts the max that 2 x 8-pin + mobo power can give? The AMD 6990 can draw 450 watts with the faster BIOS, according to AMD's website a few days ago.

8 pin = 150 watts

mobo PCI express = 75 watts

i think
Yeah Jerome, that is exactly right. The PCIE 2.0 spec is:

8 pin 150 watts each

PCIE slot 75 watts

So 2x8 pin + PCIE slot = 375 watts

Wow, if there is a faster BIOS that can cause a 75 watt increase in power draw over the PCIE spec, that is one BIOS I would have to say "Not a chance in Bleep" to. There is no way I would flash that to my card. Crazy.

J, can you post a link? That is one train wreck I just have to see.

Thanks man
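As a rough sanity check on the arithmetic above, here is a minimal Python sketch that uses only the figures quoted in this thread (75 W from the slot, 150 W per 8-pin connector); the function name and the two-connector example are purely illustrative and are not taken from NVIDIA's or AMD's own material.

```python
# Minimal sketch of the PCIe power-budget arithmetic discussed above.
# Figures are the ones quoted in the thread: 75 W from the slot,
# 150 W per 8-pin connector. Connector counts are illustrative.

SLOT_WATTS = 75        # PCIe x16 slot feed
EIGHT_PIN_WATTS = 150  # per 8-pin PCIe power connector

def board_power_budget(num_8pin: int) -> int:
    """Total spec power budget for a card with the given number of 8-pin plugs."""
    return SLOT_WATTS + num_8pin * EIGHT_PIN_WATTS

if __name__ == "__main__":
    budget = board_power_budget(2)  # GTX 590 / HD 6990 layout: 2 x 8-pin
    print(f"Slot + 2 x 8-pin budget: {budget} W")  # 375 W

    # The HD 6990's second BIOS position is rated for up to 450 W,
    # i.e. 75 W beyond the figure above.
    extreme_bios = 450
    print(f"Extreme BIOS overshoot: {extreme_bios - budget} W")
```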

29-03-2011, 19:30:20

Rastalovich
Quote:
Originally Posted by oneseraph View Post

If by "suggestion" you are referring to industry accepted standards then I agree. Trade organizations create standards, IP companies design to those standards, manufacturers build to those standards. The result is a larger and more stable market for the consumer, which at the end of the day is what it's all about.
Correct, they create standards, and those standards aren't set in stone; they're constantly revised and amended as the technology around them evolves. 2.0 becomes 2.0a... up to 2.1, which they're adapting towards the coming 3.0.

As do/will the power statements.

Quote:
Originally Posted by oneseraph View Post

You're right, 375 watts was not put forward as a total thermal limit within a PC case. 375 watts is the electrical/thermal limit for a single card in a PCIE 2.0 slot. PCIE does not restrict the total number of potential PCIE 2.0 slots. ATX, however, seems to think that 7 expansion slots are enough, so for the most part 7 PCIE slots are the most you can get on a standard ATX motherboard.
Almost. 375 was the most that was wanted/could be supplied, within reason, given what the gfx people wanted and what atx power could realistically supply.

PCI (slots/busses) and atx do not, between themselves alone, decide how many slots a mobo can offer for expansion. There is no standard to this. It's mostly down to controllers, chipsets to service the busses and what the mobo manufacturer has in mind. They often tie up a number of lanes with onboard devices, again really depending on what embedded companies they have 'onboard' with their design.

Quote:
Originally Posted by oneseraph View Post

No, because by putting that load on another PCB in another PCIE 2.0 slot, the surface area for thermal dissipation has at least doubled. What is silly is suggesting putting that same thermal load (in this case 750 watts) on a single PCIE 2.0 card. Thank you for making my point for me.
No, because you can put 1000W, for example, on a pcie 2.0 card if you choose to do so, as long as you manage the waste (heat) effectively. They could create a 4x gpu card that was 22 inches long and only fitted in a customized case if they really wanted to. And it would be within the PCIe base standards.

Quote:
Originally Posted by oneseraph View Post

That's true, best I can tell you think industry standards are just suggestions because there is no enforcement body "other than the marketplace". What you seem to be saying is that in theory a graphics chip manufacturer could design and build a single 750 watt graphics board. OK, sure, in theory; my point is they won't because of those pesky industry standards. That, and putting that much heat in that small a form factor without an extraordinary cooling solution is a good way to start a fire.
Both the 590 and 6990 can go beyond these 375w "standards" already. Keep a good eye out on hwbot in the days/weeks to come, where we will see how enthusiasts use the cards, and their 2x 8pin connectors, to overclock the crap out of them.

Quote:
Originally Posted by oneseraph View Post

I stand by my answer. Nvidia says that the 590 at load draws 365 watts. Nvidia says that the clock rates they set for the reference 590 are to ensure compliance with PCIE electrical/thermal standards. The fact that Nvidia did not characterize this as a problem doesn't make it any less the case. The fact is, if Nvidia could have set the reference clocks higher they would have. Nvidia would love to claim the fastest single graphics card title; as it stands AMD's 6990 holds that title, costs between $75 and $100 less and uses less power. The one key drawback of the 6990 reference design is noise, so cheers to Nvidia for making the 590 quiet. That said, the OEMs already have aftermarket cooling in the pipeline for the 6990, so soon it will be very quiet as well, and you can bet it won't carry a $75 premium.
nVidia have done exactly that with the reference 590, correct. But there are scarier things that their partners are coming out with that also tip the balance over this magic 375.

Quote:
Originally Posted by Jerome View Post

Is 375 watts the max that 2 x 8-pin + mobo power can give? The AMD 6990 can draw 450 watts with the faster BIOS, according to AMD's website a few days ago.

8 pin = 150 watts

mobo PCI express = 75 watts

i think
Yeah, given the standard pcie feed, plus additional power supplied to the card if the psu can handle the supply, or if you use another source of supply like adapters.

2x 8-pin sockets 'can' and will supply beyond 450w if required; that can be necessary for overclocking 2x 8-pin cards, especially when going beyond a stock cooler.

29-03-2011, 20:01:24

Jerome

29-03-2011, 21:38:17

thewire
Quote:
Originally Posted by Jerome View Post
Thanks Jerome

Looks like AMD and Nvidia both agree with oneseraph.

"Dual-BIOS Support

The AMD Radeon™ HD 6990 graphics card features dual-BIOS capabilities. This feature is controlled by the Unlocking Switch, which toggles between the factory-supported Performance BIOS of 375W (BIOS1), and an Extreme Performance BIOS (BIOS2) that can potentially unlock higher clock speeds and up to 450W of mind-blowing performance!

Caution:

Do not use the 450W setting unless you are familiar with overclocking and are using high-quality system components to ensure maximum system stability. If you encounter system instability or other unexpected system performance while using the 450W setting, return the graphics card to the factory-supported 375W setting, as your system may not be properly equipped to handle the increased demands of the 450W setting.

The following procedure describes how to switch between BIOS settings using the Unlocking Switch on your AMD Radeon™ HD 6990 graphics card.

Locate the yellow caution sticker adjacent to the AMD CrossFireX™ connector on your AMD Radeon™ HD 6990 graphics card. This sticker covers the Unlocking Switch and must be removed to access and change dual-BIOS switch positions.

WARNING: Before proceeding, thoroughly review the documentation for your AMD Radeon™ HD 6990 graphics card and assure that your computer meets all minimum system requirements.

Remove the sticker and set the Unlocking Switch to the desired setting:

Position 1 — 450W Extreme Performance BIOS (BIOS2).

Position 2 (shipping position) — 375W factory-supported Performance BIOS (BIOS1).

WARNING: AMD graphics cards are intended to be operated only within their associated specifications and factory settings. Operating your AMD graphics card outside of specification or in excess of factory settings, including but not limited to overclocking, may damage your graphics card and/or lead to other problems, including but not limited to, damage to your system components (including your motherboard and components thereon (e.g. memory)); system instabilities (e.g. data loss and corrupted images); shortened graphics card, system component and/or system life; and in extreme cases, total system failure. AMD does not provide support or service for issues or damages related to use of an AMD graphics card outside of specifications or in excess of factory settings. You may also not receive support or service from your system manufacturer.

DAMAGES CAUSED BY USE OF YOUR AMD GRAPHICS PROCESSOR OUTSIDE OF SPECIFICATION OR IN EXCESS OF FACTORY SETTINGS ARE NOT COVERED UNDER YOUR AMD PRODUCT WARRANTY AND MAY NOT BE COVERED BY YOUR SYSTEM MANUFACTURER’S WARRANTY."
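Boiling the quoted switch description down to data, a rough sketch of the two positions might look like the following; the dictionary layout is purely illustrative (this is not AMD firmware or tooling), and the 375 W reference figure is the slot-plus-two-8-pin budget discussed earlier in the thread.

```python
# Illustrative restatement of the HD 6990 dual-BIOS positions quoted above.
# Not AMD tooling - just the numbers from the quoted text expressed as data.

PCIE_SPEC_BUDGET_W = 375  # slot (75 W) + 2 x 8-pin (150 W each)

BIOS_POSITIONS = {
    1: {"name": "Extreme Performance BIOS (BIOS2)", "limit_w": 450},
    2: {"name": "Factory-supported Performance BIOS (BIOS1)", "limit_w": 375},  # shipping position
}

for position, bios in BIOS_POSITIONS.items():
    over = bios["limit_w"] - PCIE_SPEC_BUDGET_W
    status = f"{over} W over the 375 W budget" if over > 0 else "within the 375 W budget"
    print(f"Switch position {position}: {bios['name']} -> {bios['limit_w']} W ({status})")
```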

29-03-2011, 22:37:57

BrokenThingy
Quote:
Originally Posted by Rastalovich View Post

Almost. 375 was the most that was wanted/could be supplied, within reason, given what the gfx people wanted and what atx power could realistically supply.

PCI (slots/busses) and atx do not, between themselves alone, decide how many slots a mobo can offer for expansion. There is no standard to this. It's mostly down to controllers, chipsets to service the busses and what the mobo manufacturer has in mind. They often tie up a number of lanes with onboard devices, again really depending on what embedded companies they have 'onboard' with their design.
I have read all the back and forth between you and oneseraph. I decided to do a little research of my own. I thought you both might be full of it. Here is what I found out.

Standard ATX allows 7 expansion slots.

http://www.formfactors.org/FFDetail.asp?FFID=1&CatID=1 look at 3.3.1 Expansion Slots

oneseraph 1 Rastalovich 0

PCIE power limit 375 watts

oneseraph 2 Rastalovich 0

Quote:
Originally Posted by Rastalovich View Post

No, because you can put 1000W, for example, on a pcie 2.0 card if you chose to do so, as long as you manage the waste (heat) effectively. They could create a 4x gpu card that was 22 inches long and only fitted in a customized case if really wanted to. And it would be within the PCIe base standards.
Nvidia and AMD have both stated that they are staying within the pcie 375 watt limit.

oneseraph 3 Rastalovich 0

Basically everything I look up points to oneseraph being right. No offense but the more you comment the less you appear to know. I suggest letting it go mate, you are on the wrong side of the debate.

29-03-2011, 23:14:03

MrBlack
Quote:
Originally Posted by BrokenThingy View Post

I have read all the back and forth between you and oneseraph. I decided to do a little research of my own. I thought you both might be full of it. Here is what I found out.

Standard ATX allows 7 expansion slots.

http://www.formfactors.org/FFDetail.asp?FFID=1&CatID=1 look at 3.3.1 Expansion Slots

oneseraph 1 Rastalovich 0

PCIE power limit 375 watts

oneseraph 2 Rastalovich 0

Nvidia and AMD have both stated that they are staying within the pcie 375 watt limit.

oneseraph 3 Rastalovich 0

Basically everything I look up points to oneseraph being right. No offense but the more you comment the less you appear to know. I suggest letting it go mate, you are on the wrong side of the debate.
You said it. I was curious myself, checked it out, and oneseraph got pretty much everything right. Keep up the good work oneseraph - props to your tech wisdom.

29-03-2011, 23:46:33

oneseraph
Quote:
Originally Posted by Jerome View Post
Now that is something you don't see every day. This thing reads like one of those stupid medication commercials we have here in the States, the kind where they tell you how great their allergy meds are and then a voice in the background starts saying things like "in some cases 'xmed' has been known to cause kidney failure" or "in the event of brain hemorrhage please consult your doctor immediately".

This is what happens when you operate so far out of spec. First you get the following hilarious disclaimer.

"WARNING: AMD graphics cards are intended to be operated only within their associated specifications and factory settings. Operating your AMD graphics card outside of specification or in excess of factory settings, including but not limited to overclocking, may damage your graphics card and/or lead to other problems, including but not limited to, damage to your system components (including your motherboard and components thereon (e.g. memory)); system instabilities (e.g. data loss and corrupted images); shortened graphics card, system component and/or system life; and in extreme cases, total system failure. AMD does not provide support or service for issues or damages related to use of an AMD graphics card outside of specifications or in excess of factory settings. You may also not receive support or service from your system manufacturer."

Really, going beyond the PCIE 375 watt limit by a solid 75 watts could damage your system components (including your motherboard and the components thereon, e.g. memory)? Wow, that's surprising. Could cause complete system failure, you say? Hmm, that doesn't sound good. And if I do this you're not going to help me, you say.

Anyway, you see what I mean.

Thanks for the link J man, I haven't laughed that hard in a while.

30-03-2011, 05:40:35

Rastalovich
Quote:
Originally Posted by BrokenThingy View Post

I have read all the back and forth between you and oneseraph. I decided to do a little research of my own. I thought you both might be full of it. Here is what I found out.

Standard ATX allows 7 expansion slots.

http://www.formfactors.org/FFDetail.asp?FFID=1&CatID=1 look at 3.3.1 Expansion Slots

oneseraph 1 Rastalovich 0

PCIE power limit 375 watts

oneseraph 2 Rastalovich 0

Nvidia and AMD have both stated that they are staying within the pcie 375 watt limit.

oneseraph 3 Rastalovich 0

Basically everything I look up points to oneseraph being right. No offense but the more you comment the less you appear to know. I suggest letting it go mate, you are on the wrong side of the debate.
Excellent research, and best of luck with your ATX standard mobo with the agp for advanced graphics and isa lanes for the 7 expansion slots. Let us know how you get on with that, I assume your current mobo fits these standards.

There is no debate as to whether PCI-SIG mention 375w as the operational limit, but as I'm trying to explain to you all about how the system works, this is/was dependent on what was available at the time of testing and writing of the said document, i.e. 2x 8-pin atx pcie power connectors - it would not have been envisaged that anyone would go beyond that at the time.

Back with the base 2.0 specification they would have stated 75w for the slot, plus *whatever additional power could feasibly and realistically be supplied*. They added molex, upped the power; 6-pin, upped the power; 8-pin, upped the power... 'til the documents now read 375. Another change comes along and so the documents will/may change.

As a statement of intent, for sure both parties stated they were sticking within this, and they have. They're all nice and friendly behind the scenes, even though they poke tongues out at each other when the other's not looking.

This isn't a political battleground or anything buddy, I'm merely trying to explain to you how the system works.

At the current moment in time, PCIe 2.0 (latest 2.1) 'should' (I can't confirm this) be what is being used on modern, up-to-date mobos being released. 2.1 is (should be) the final stepping stone to 3.0; it is *practically*, for all intents and purposes, the same in everything except data usage (for argument's sake). Whether this itself carries the same electrical properties as 3.0 I can't tell you, as the documents within PCI-SIG, which you'll need membership to look at, do not give specific numbers on what the power requirement for base 3.0 will be. I did speculate the other week, knowing that both nvidia and amd have released these cards that so flagrantly play the "I'm not touching you" game with 375w (amd especially, as they have a bigger hand in mobos these days), that they are in fact using "3.0" in-house and are gearing up for it for future releases.

The reasoning behind many of the failures when going beyond 375 in testing gfx cards is down to how these 8-pin sockets are supplied. If you go down the standard route of conventionally hooking up your psu and using the designated 8-pin cables, your system can, and usually will, shut down as the psu trips out. BUT if you supply the 8-pin socket with a combination of power sources, using adapters in the main, much more than 375 can be there if the card requires it - i.e. what overclockers will tend to do if they're intent on breaking the boundaries.

For sure, there will be disclaimers all over the websites of these manufacturers explaining to you how awful it will be for your system if you go beyond 375, and they'll probably even use it as a means to refuse warranty. But hey - it's a disclaimer. Intel have disclaimers about how many volts they want you to put over their cpus, and how much notice of that do enthusiasts take? In today's climate we need disclaimers on bridges saying that dangling your baby off the edge could result in harm and the bridge people won't be held responsible.
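To illustrate the point that the 150 W connector figure is a power budget rather than a hard physical wall, here is a rough per-conductor estimate. It assumes the usual three 12 V conductors in an 8-pin PCIe plug and a nominal 12 V rail; those are working assumptions for the sketch rather than figures quoted in the thread, and whether a given cable and PSU can actually deliver the extra current still depends on wire gauge and the supply itself.

```python
# Rough per-conductor current estimate for an 8-pin PCIe power plug at
# various draws. Assumes three 12 V conductors per plug and a nominal
# 12 V rail; these are illustrative assumptions, not spec quotations.
# Whether a real cable/PSU can deliver the higher figures depends on
# wire gauge and the supply, which is why PSUs can trip out.

NOMINAL_VOLTS = 12.0
TWELVE_V_CONDUCTORS_PER_PLUG = 3  # typical 8-pin PCIe power plug

def amps_per_conductor(watts_per_plug: float) -> float:
    """Current each 12 V conductor carries for a given per-plug draw."""
    return watts_per_plug / NOMINAL_VOLTS / TWELVE_V_CONDUCTORS_PER_PLUG

if __name__ == "__main__":
    for watts in (150, 225, 300):  # spec rating, then heavier overclocking draws
        print(f"{watts} W per plug -> {amps_per_conductor(watts):.2f} A per 12 V conductor")
```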

30-03-2011, 05:50:49

Jerome
Good, thanks Rast'. EVGA makes/sells a PCIe power booster: http://www.evga.com/products/moreInf...0Hardware&sw=4

30-03-2011, 05:54:22

Rastalovich
Ahh, that kinda makes sense.

They're harking back to the old requirement of plugging molex connectors into mobos if you intended to use the second pcie slot for graphics.

EDIT: oops, ofc what I should be saying is: how dare they, this breaks so many regulations, it's ridiculous!

30-03-2011, 07:11:59

I Hunta x
Is it bad that I find it funny when people try to argue with rasta?

30-03-2011, 07:27:49

Rastalovich
+rep

No, cos you've been around so long that you already know how much of a pain-in-the-arse, opinionated, waffle-spouting member I can be.

Really I'm only trying to advise how the respective document system works, and I probably attract the arguments by the way I write things. Sue me; it took me ages to get an English qualification, whilst at the same time I got math, electronics and physics, and went on to work in the field where we get these document releases.

And despite how members may take things, or increasingly tend to over the last so many years, I honestly don't mean any offense by any of it. If someone takes it, I'll be the first to apologise.

30-03-2011, 07:29:25

tinytomlogan
Quote:
Originally Posted by Rastalovich View Post

+rep

No, cos you've been around so long that you already know how much of a pain-in-the-arse, opinionated, waffle-spouting member I can be.

Really I'm only trying to advise how the respective document system works, and I probably attract the arguments by the way I write things. Sue me; it took me ages to get an English qualification, whilst at the same time I got math, electronics and physics, and went on to work in the field where we get these document releases.

And despite how members may take things, or increasingly tend to over the last so many years, I honestly don't mean any offense by any of it. If someone takes it, I'll be the first to apologise.
The banter can get heated but it's always welcomed

30-03-2011, 09:56:24

Karlos_the_n00b
Quote:
Originally Posted by I Hunta x View Post

Is it bad that I find it funny when people try to argue with rasta?
If so then we're going to hell together. lol

30-03-2011, 10:12:46

cupojoe
Quote:
Originally Posted by Rastalovich View Post

+rep

No, cos you've been around so long that you already know how much of a pain-in-the-arse, opinionated, waffle-spouting member I can be.

Really I'm only trying to advise how the respective document system works, and I probably attract the arguments by the way I write things. Sue me; it took me ages to get an English qualification, whilst at the same time I got math, electronics and physics, and went on to work in the field where we get these document releases.

And despite how members may take things, or increasingly tend to over the last so many years, I honestly don't mean any offense by any of it. If someone takes it, I'll be the first to apologise.
This post pretty much says it all; pain in the arse is right. You say you have an English qualification, so no excuse there. You claim something about math, electronics and physics, so no excuse there. You claim that you will be the first to apologize if someone is offended.

Well, I am offended. Everyone who has suggested that you are wrong has been responded to with off-point, often rude, sarcastic and hyperbolic comments. So all these people don't know what they are talking about? You think you are the only person who understands the way things work? My, how lucky we all are that you are here to inform the rest of us. What would we idiots do without you.

What a load of SH*T!

The company I work for is a member of the PCI Special Interest Group and I am a consulting engineer. So when I say that you are wrong I think everyone on this forum will appreciate my meaning.

So here goes, you are wrong.

Now anyone can argue for the sake of argument. It takes real character to admit when you are wrong. So what's it gonna be, do you have the stones to admit when you are wrong or are you that other guy?

30-03-2011, 10:25:32

MrBlack
Quote:
Originally Posted by cupojoe View Post

This post pretty much says it all; pain in the arse is right. You say you have an English qualification, so no excuse there. You claim something about math, electronics and physics, so no excuse there. You claim that you will be the first to apologize if someone is offended.

Well, I am offended. Everyone who has suggested that you are wrong has been responded to with off-point, often rude, sarcastic and hyperbolic comments. So all these people don't know what they are talking about? You think you are the only person who understands the way things work? My, how lucky we all are that you are here to inform the rest of us. What would we idiots do without you.

What a load of SH*T!

The company I work for is a member of the PCI Special Interest Group and I am a consulting engineer. So when I say that you are wrong I think everyone on this forum will appreciate my meaning.

So here goes, you are wrong.

Now anyone can argue for the sake of argument. It takes real character to admit when you are wrong. So what's it gonna be, do you have the stones to admit when you are wrong or are you that other guy?
I'm betting he is the other guy.

30-03-2011, 10:42:28

Rastalovich
Quote:
Originally Posted by cupojoe View Post

The company I work for is a member of the PCI Special Interest Group and I am a consulting engineer.
So you should be aware of how the system works and the possibility of future statements/papers/addendums that may come on the back of technology developments.

And I do apologise, unreservedly, if any offense is taken.

30-03-2011, 10:50:52

BrokenThingy
Quote:
Originally Posted by Rastalovich View Post

Excellent research, and best of luck with your ATX standard mobo with the agp for advanced graphics and isa lanes for the 7 expansion slots. Let us know how you get on with that, I assume your current mobo fits these standards.

There is no debate as to whether PCI-SIG mention 375w as the operational limit, but as I'm trying to explain to you all about how the system works, this is/was dependent on what was available at the time of testing and writing of the said document, i.e. 2x 8-pin atx pcie power connectors - it would not have been envisaged that anyone would go beyond that at the time.

Back with the base 2.0 specification they would have stated 75w for the slot, plus *whatever additional power could feasibly and realistically be supplied*. They added molex, upped the power; 6-pin, upped the power; 8-pin, upped the power... 'til the documents now read 375. Another change comes along and so the documents will/may change.

As a statement of intent, for sure both parties stated they were sticking within this, and they have. They're all nice and friendly behind the scenes, even though they poke tongues out at each other when the other's not looking.

This isn't a political battleground or anything buddy, I'm merely trying to explain to you how the system works.

At the current moment in time, PCIe 2.0 (latest 2.1) 'should' (I can't confirm this) be what is being used on modern, up-to-date mobos being released. 2.1 is (should be) the final stepping stone to 3.0; it is *practically*, for all intents and purposes, the same in everything except data usage (for argument's sake). Whether this itself carries the same electrical properties as 3.0 I can't tell you, as the documents within PCI-SIG, which you'll need membership to look at, do not give specific numbers on what the power requirement for base 3.0 will be. I did speculate the other week, knowing that both nvidia and amd have released these cards that so flagrantly play the "I'm not touching you" game with 375w (amd especially, as they have a bigger hand in mobos these days), that they are in fact using "3.0" in-house and are gearing up for it for future releases.

The reasoning behind many of the failures when going beyond 375 in testing gfx cards is down to how these 8-pin sockets are supplied. If you go down the standard route of conventionally hooking up your psu and using the designated 8-pin cables, your system can, and usually will, shut down as the psu trips out. BUT if you supply the 8-pin socket with a combination of power sources, using adapters in the main, much more than 375 can be there if the card requires it - i.e. what overclockers will tend to do if they're intent on breaking the boundaries.

For sure, there will be disclaimers all over the websites of these manufacturers explaining to you how awful it will be for your system if you go beyond 375, and they'll probably even use it as a means to refuse warranty. But hey - it's a disclaimer. Intel have disclaimers about how many volts they want you to put over their cpus, and how much notice of that do enthusiasts take? In today's climate we need disclaimers on bridges saying that dangling your baby off the edge could result in harm and the bridge people won't be held responsible.
Quack Quack quack quack quack......

I can't confirm this, I can't confirm that - what a bunch of double-talk. All these excuses just to get out of admitting you're wrong. It's like cupojoe said, arguing for the sake of argument.

Man up and admit you're wrong

30-03-2011, 10:53:37

SieB
What has become of this thread...

30-03-2011, 10:54:32

tinytomlogan
Aye, calm it down now you lot please.

/warning shot across the bow

30-03-2011, 11:12:55

oneseraph
Quote:
Originally Posted by tinytomlogan View Post

Aye, calm it down now you lot please.

/warning shot across the bow
Roger that!

Apologies to all for my part, that includes you Rastalovich.

31-03-2011, 03:09:47

cupojoe
Quote:
Originally Posted by oneseraph View Post

Roger that!

Apologies to all for my part, that includes you Rastalovich.
Way to step up oneseraph.

I share your sentiment.

12-04-2011, 15:18:07

Chaney
So um, idk what this has to do with the lot of it all... But I did have my first cup of tea... watching this vid review on YouTube... First off, the cup of tea was awful... I hate it and, secondly, don't understand how you all can drink it... But after I hopped in the car and headed off for a Red Bull and got back to watch, I quickly began to see a huge grin just building on my face... mmmmmm GTX 590 = lesbian porn between two 580s... Daddy likey