[h] 4 Weeks with NVIDIA GeForce GTX 980 SLI

96Firebird

Diamond Member
Nov 8, 2010
5,746
342
126
Seems like his biggest complaint with the 290X setup was the noise and heat, which is understandable because he had the reference coolers. I wonder how his opinion would change if he had better coolers on the 290Xs.

Edit - I really hope AMD completely rebuilds their reference cooler from the ground up for their next big release. They got absolutely slammed in reviews because of it, on an otherwise great GPU.
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Edit - I really hope AMD completely rebuilds their reference cooler from the ground up for their next big release. They got absolutely slammed in reviews because of it, on an otherwise great GPU.

yes ++++++++++++++++++++++++++++++
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
It was just an alright GPU, IMO. Great perf/$, but all the other characteristics were disappointing. Perf/W and perf/mm² were not improved at all over the 22-month-old Tahiti, and heat was a major issue. Even with custom-cooled cards, Hawaii ran hotter than any other chip on the market under comparable cooling solutions. All of those dings added up to a mixed bag, and even bringing 64 ROPs to the table did not make a massive difference except when gaming at a resolution (4K) that hardly anyone games at yet, and which still requires major concessions to run at fluid frame rates.
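Rough numbers, for anyone who wants to check the perf/mm² claim. The die sizes are the commonly published figures; the ~25% average uplift of the 290X over the 280X is my ballpark from launch reviews, so treat it as an assumption:

```python
# Quick perf/mm^2 sanity check. Die areas are published figures;
# the 290X-over-280X performance uplift is a ballpark assumption
# taken from launch-review averages.

tahiti_mm2 = 365.0        # Tahiti (HD 7970 / R9 280X)
hawaii_mm2 = 438.0        # Hawaii (R9 290X)
perf_uplift = 1.25        # assumed: 290X ~25% faster than 280X

area_growth = hawaii_mm2 / tahiti_mm2       # ~1.20x more silicon
perf_per_mm2 = perf_uplift / area_growth    # ~1.04x, essentially flat

print(f"Die area grew {area_growth:.2f}x, performance {perf_uplift:.2f}x")
print(f"Perf/mm^2 changed by only {perf_per_mm2:.2f}x")
```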

Anyways, the 980 shines when overclocked to the max. Performance is untouchable (even against the superior scaling of multiple OC'd 290Xs) while noise, temps, and power are fantastic.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It was just an alright GPU, IMO. Great perf/$, but all the other characteristics were disappointing. Perf/W and perf/mm² were not improved at all over the 22-month-old Tahiti, and heat was a major issue. Even with custom-cooled cards, Hawaii ran hotter than any other chip on the market under comparable cooling solutions. All of those dings added up to a mixed bag, and even bringing 64 ROPs to the table did not make a massive difference except when gaming at a resolution (4K) that hardly anyone games at yet, and which still requires major concessions to run at fluid frame rates.

Anyways, the 980 shines when overclocked to the max. Performance is untouchable (even against the superior scaling of multiple OC'd 290Xs) while noise, temps, and power are fantastic.

The problem with "comparable cooling solutions" is that those solutions were designed for GK110 and are a compromised fit for Hawaii. Custom cards with properly designed coolers (Sapphire, PowerColor) weren't any hotter. Whether support for 4K, the current "state of the art", is important or not is a matter of opinion. I think there are plenty of 1080p solutions already, though.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The problem with "comparable cooling solutions" is that those solutions were designed for GK110 and are a compromised fit for Hawaii. Custom cards with properly designed coolers (Sapphire, PowerColor) weren't any hotter. Whether support for 4K, the current "state of the art", is important or not is a matter of opinion. I think there are plenty of 1080p solutions already, though.

I really doubt that MSI and ASUS designed their cooling solutions around one particular chip only, GK110, to the point where the same cooling solution would suffer on smaller chips like GK104, Tahiti, Hawaii, Pitcairn, Tonga, etc. That's an absurd statement and the first time I'm hearing it. Not believable. Let's just all agree with the overwhelming facts: Hawaii runs hotter and generally consumes more power than GK110 when all else is equal. And every modern GPU supports 4K; my point was that the 64 ROPs didn't make enough of a difference at 4K to make it a viable single-card solution in most games without sacrificing graphics settings.
 
Last edited:

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
I like this quote particularly

My recent NVIDIA driver experience has been less than golden. If you noticed from the previous page that I am running an older WHQL driver, you would be correct. I am currently running v344.16. NVIDIA WHQL driver v344.48 has been released. Trying to install v344.48 proved to be nothing but trouble; it would not install properly for me. During my first try at a successful install I was met with my very first Windows 8.1 Blue Screen. And it is a new BSOD, for those of you wondering. Upon reboot, one of my three NV Surround screens was not showing quite correctly. Lots of popping and flashing going on.

We are aware of forum conversations that go like this. "NVIDIA drivers suck." "No, AMD drivers suck." "No, NVIDIA drivers suck more." "No, AMD....." The fact of the matter is that in the land of the high end PC, there are always going to be some folks with driver issues. We are all early adopters compared to the rest of the world, so let's just consider ourselves beta testers and move on.

When I had 4 x HD 7970, it took months before the drivers were fixed for my specific resolution.

Same thing when I had the R9 290X Quad.

Like he said, early adopters are beta testers. No matter what camp you choose, especially with an ultra-high-end rig at odd resolutions, you'll get driver issues at some point.


Great editorial.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
I like this quote particularly

Like he said, early adopters are beta testers. No matter what camp you choose, especially with an ultra-high-end rig at odd resolutions, you'll get driver issues at some point.

Great editorial.


Indeed, a well-thought-out and balanced article. But I would not cut the quote at that place.

It's not a big deal anyway, but it's a deal:

If you can't handle that, go buy a Mac. (And be shunned forever.)
The driver sets of today have gotten considerably better than where we were 5 years ago, and we have seen both camps take steps backwards at some junctions.
Overall though, both Red and Green continually move forward.

And yeah, AMD drivers suck more than NVIDIA's, but not by much.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Indeed, a well-thought-out and balanced article. But I would not cut the quote at that place.

It's not a big deal anyway, but it's a deal:


If you are referring to this:

And yeah, AMD drivers suck more than NVIDIA's, but not by much.

It is a matter of opinion; that's why it's an editorial and not a review.
 

Zardnok

Senior member
Sep 21, 2004
670
0
76
I really doubt that MSI and ASUS designed their cooling solutions around one particular chip only, GK110, to the point where the same cooling solution would suffer on smaller chips like GK104, Tahiti, Hawaii, Pitcairn, Tonga, etc. That's an absurd statement and the first time I'm hearing it. Not believable. Let's just all agree with the overwhelming facts: Hawaii runs hotter and generally consumes more power than GK110 when all else is equal. And every modern GPU supports 4K; my point was that the 64 ROPs didn't make enough of a difference at 4K to make it a viable single-card solution in most games without sacrificing graphics settings.
You doubt it?

[Image: heatsink base; only three of the five heat pipes make contact with the core]


Please note how only three of the five heat pipes make contact with the core and explain how that does not diminish the performance of the cooler.
 
Feb 19, 2009
10,457
10
76
I really doubt that MSI and ASUS designed their cooling solutions around one particular chip only, GK110, to the point where the same cooling solution would suffer on smaller chips like GK104, Tahiti, Hawaii, Pitcairn, Tonga, etc. That's an absurd statement and the first time I'm hearing it. Not believable. Let's just all agree with the overwhelming facts: Hawaii runs hotter and generally consumes more power than GK110 when all else is equal. And every modern GPU supports 4K; my point was that the 64 ROPs didn't make enough of a difference at 4K to make it a viable single-card solution in most games without sacrificing graphics settings.

Actually, that was the problem. Gigabyte used a direct-heatpipe design for their WindForce cooler, and on Hawaii two heatpipes had NO contact with the GPU die because it is smaller than the GK110 the cooler was designed for; this led to the card running its fan a lot higher than required. Tom's Hardware (Germany) did an article on this early on, showing the problem clearly. Gigabyte even admitted it and put out a statement saying only an early batch was affected; computerbase.de had a review sample from that batch, throttling at nearly max fan speeds.

This led to Gigabyte modifying it: they basically added a shim, so it's no longer direct-heatpipe (reducing its performance). The same thing occurred with ASUS and MSI, whose copper contact plates feeding the heatpipes were much larger than the Hawaii die, but those weren't direct-touch designs, so they were affected less.

The only coolers that were designed specifically for Hawaii are the Tri-X and PCS+, and guess what, both perform amazingly. "All else being equal" would mean a custom cooler designed for optimum performance on that GPU, which isn't the case for several brands, because they re-use the GK110 design for the R9 290/290X.
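You can even sanity-check the geometry. Die areas are published (GK110 ~561 mm², Hawaii ~438 mm²); the square-die assumption and the 6 mm pipe width below are just illustrative guesses for a flat five-pipe direct-touch array:

```python
import math

# Back-of-the-envelope: how much of each pipe in a flat array of five
# direct-touch heat pipes actually sits over the GPU die. Die areas are
# published figures; the square die and 6 mm pipe width are assumptions.

PIPE_MM = 6.0   # assumed pipe diameter
N_PIPES = 5     # pipes laid side by side, 30 mm across

def pipe_coverage(die_area_mm2):
    """Fraction of each pipe's width overlapping a centered square die."""
    die_w = math.sqrt(die_area_mm2)            # assume a square die
    die_lo = (N_PIPES * PIPE_MM - die_w) / 2   # die edges within the array
    die_hi = die_lo + die_w
    cover = []
    for i in range(N_PIPES):
        lo, hi = i * PIPE_MM, (i + 1) * PIPE_MM
        overlap = max(0.0, min(hi, die_hi) - max(lo, die_lo))
        cover.append(overlap / PIPE_MM)
    return cover

for name, area in [("GK110", 561), ("Hawaii", 438)]:
    row = ", ".join(f"{c:.0%}" for c in pipe_coverage(area))
    print(f"{name} ({area} mm^2): {row}")
```

On those assumptions the two outer pipes keep roughly half their width over a GK110-sized die but only about a quarter over Hawaii, which is exactly the kind of marginal contact the photos show.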

Comparing efficiency, temps, and noise, a well-designed custom Hawaii card is clearly a different beast compared to the reference design:

[Charts: average power draw, peak power draw, load temperature, load fan noise, custom-card temperatures, and GPU-only power draw]


When given a decently designed cooler, Hawaii is very competitive versus GK110 in a lot of metrics, and at 4K it surpasses it, especially in multi-card configs. It's a very good chip for its generation, but Maxwell, being a next-gen GPU, takes it to another level.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I really doubt that MSI and ASUS designed their cooling solutions around one particular chip only, GK110, to the point where the same cooling solution would suffer on smaller chips like GK104, Tahiti, Hawaii, Pitcairn, Tonga, etc. That's an absurd statement and the first time I'm hearing it. Not believable. Let's just all agree with the overwhelming facts: Hawaii runs hotter and generally consumes more power than GK110 when all else is equal. And every modern GPU supports 4K; my point was that the 64 ROPs didn't make enough of a difference at 4K to make it a viable single-card solution in most games without sacrificing graphics settings.

[Image: cooler base on Hawaii; the two outside heat pipes overhang the die]

Note how the two outside heatpipes don't even touch the chip. I agree, it's absurd. They did it anyway, though.

Edit: Didn't notice Zardnok had already pointed it out. I really thought that anyone who had been paying attention knew it already. It wasn't a secret and was discussed on many occasions. Again, Sapphire and PowerColor coolers have no issues at all keeping Hawaii cool and running quiet.
 
Last edited:

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Great review. WOW, what performance for the watts used! I tested my rig below, and it uses a lot more with my watercooled 290s cranked up.

I turned all of my fans to max (3 Corsair SP120s on my RX360 radiator and 9 XSPC 140s on my MO-RA3 420 external radiator). I run dual D5 pumps with my 3930K cranked to 4.6GHz. Inside my case I also have a 230mm fan and a 90mm fan. My Sabertooth motherboard has a chipset fan. Both of my Sapphire Tri-X GPUs are watercooled and overclocked via Afterburner to 1075 core / 1400 memory with power settings maxed.

I decided to run my licensed version of AIDA64, which has a stress-test feature to stress various components. With fans set to high, my idle draw was 175W (155W with fans set to low). Max power usage with ALL components stressed was 910W. When I ran the test without stressing the GPUs, the lowest total usage was 410W (it went as high as 425W).

All tests were done with my Kill A Watt meter. I only tested my computer, not my monitor.

It's a fair conclusion that the GPUs use as much as 500W stressed, or 250W per GPU.
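If anyone wants to redo this arithmetic from their own Kill A Watt readings, here's a minimal sketch using my wall numbers from above. Keep in mind wall draw includes PSU losses, so the true DC load on the cards is somewhat lower:

```python
# Rough GPU power estimate from Kill A Watt wall readings.
# Wall draw includes PSU losses, so the true DC load on the
# cards is somewhat lower than these numbers.

full_stress_w = 910    # all components stressed (CPU + both GPUs)
no_gpu_stress_w = 410  # same test with the GPUs left unstressed
num_gpus = 2

gpu_total_w = full_stress_w - no_gpu_stress_w  # ~500 W at the wall
per_gpu_w = gpu_total_w / num_gpus             # ~250 W per card

print(f"Both GPUs: ~{gpu_total_w} W at the wall")
print(f"Per GPU:   ~{per_gpu_w:.0f} W")
```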

No doubt the GTX 980 uses less.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
[Image: cooler base on Hawaii; the two outside heat pipes overhang the die]

Note how the two outside heatpipes don't even touch the chip. I agree, it's absurd. They did it anyway, though.

Edit: Didn't notice Zardnok had already pointed it out. I really thought that anyone who had been paying attention knew it already. It wasn't a secret and was discussed on many occasions. Again, Sapphire and PowerColor coolers have no issues at all keeping Hawaii cool and running quiet.

"Sigh"... The heat pipes are touching each other, aren't they? You act like there isn't any heat being transferred to the 1st and 5th pipes, and that only the center pipe and two-thirds of the 2nd and 4th pipes are collecting any heat from the GPU. Heat conducts through metal.
 
Last edited:

caswow

Senior member
Sep 18, 2013
525
136
116
"Sigh"... The heat pipes are touching each other, aren't they? You act like there isn't any heat being transferred to the 1st and 5th pipes, and that only the center pipe and two-thirds of the 2nd and 4th pipes are collecting any heat from the GPU. Heat conducts through metal.

You can't be serious. :rolleyes:
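To put a rough number on how much heat actually bridges between touching pipes, Fourier's law (Q = k·A·ΔT/L) gives an order of magnitude. Copper's conductivity is standard; every dimension below is an assumption, purely for illustration:

```python
# Order-of-magnitude check on pipe-to-pipe conduction using
# Fourier's law: Q = k * A * dT / L. All dimensions are assumed
# values for illustration; only copper's conductivity is standard.

K_COPPER = 400.0  # W/(m*K), thermal conductivity of copper

def conduction_w(area_mm2, path_mm, delta_t_c):
    """Steady-state heat flow through a solid copper bridge."""
    return K_COPPER * (area_mm2 * 1e-6) * delta_t_c / (path_mm * 1e-3)

# Assumed: flattened pipes touch along a ~0.5 mm wide strip over the
# ~40 mm where they cross the cold plate, with ~3 C between neighbouring
# pipes and a ~3 mm conduction path through the copper.
pipe_to_pipe = conduction_w(area_mm2=0.5 * 40.0, path_mm=3.0, delta_t_c=3.0)

print(f"Conducted pipe-to-pipe: ~{pipe_to_pipe:.0f} W")  # ~8 W
```

For scale, a single 6 mm heat pipe with good die contact is commonly rated to move on the order of 30-50 W by phase change, so some heat does spread sideways, just nowhere near what direct contact delivers.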
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I'm sure he's serious, and I agree with him.

Here is the ASUS GTX 980 STRIX:
[Image: ASUS GTX 980 STRIX cooler base]
http://www.techpowerup.com/reviews/ASUS/GTX_980_STRIX_OC/4.html

Hawaii is just a hot chip. We can try to excuse it by saying AMD somehow didn't know what they were doing and it's just 'a lousy cooler'. Then, with another stroke of strangeness, we have AIBs not 'knowing what they are doing'.
Maybe the obvious common denominator is a very hot chip, more pronounced when a leaky chip is examined.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
"Sigh"... The heat pipes are touching each other, aren't they? You act like there isn't any heat being transferred to the 1st and 5th pipes, and that only the center pipe and two-thirds of the 2nd and 4th pipes are collecting any heat from the GPU. Heat conducts through metal.

This is even worse, though:

[Image: heat pipe with its end circled, not sealed off]


Even after the official response above, someone has brought it to our attention that the 3rd heatpipe isn't even sealed off, therefore having no function whatsoever, as shown below.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
After reading the OP, I'm curious as to why we are on the topic of third-party heatsinks for the 290(X).

On point, I just want to say that I really like rice with linguica and kidney beans... yum!
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
After reading the OP, I'm curious as to why we are on the topic of third-party heatsinks for the 290(X).

On point, I just want to say that I really like rice with linguica and kidney beans... yum!

I think it's because Kyle made a big deal about the heat and power the 290s he had put out, and how hot they ran, but somewhere someone thought to counter with yet another IF statement. The IF being: "If they used better coolers"?

I dunno. And those beans are awesome!
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Are they going by the pic? Is there no solder inside that pipe that is circled?
Even if it was wide open, the metal of the pipe STILL conducts heat.

The gas inside the heat pipe is supposed to dissipate the heat better.

BUT

I remember I had to cut a VRM cooler in 2009 to make some room in the case (Antec 900), and the VRM heatsink (the vertical one) was getting freaking hot, so the heat pipe was still doing its job even though it was cut and had no gas inside.

What an ugly rig, ughhhh
[Image: the rig with the cut-down VRM cooler]
 
Last edited: