[HEXUS] Overclocked 980 series big power consumption


CakeMonster

Golden Member
Nov 22, 2012
1,621
801
136
Yep, even though the chip has tons of OC headroom, NV have decided to set the stock speed at the most efficient point. If they had released it at higher speeds (and they easily could have), we would have been praising the 780Ti killer and the performance leap, but complaining about power use...
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Still not bad though, considering it uses less power than a 780Ti or 290X, both of which are slower cards.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Yep, even though the chip has tons of OC headroom, NV have decided to set the stock speed at the most efficient point. If they had released it at higher speeds (and they easily could have), we would have been praising the 780Ti killer and the performance leap, but complaining about power use...

^ This X 1000
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
Still under the Titan/780Ti/290/290X? I was expecting the power consumption to be higher than everything else when I clicked this thread.

Not bad...
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
It gives nVidia the chance to sell these cards at their current price, and the option to get more performance out of them down the line at higher power consumption if needed. Or the next major release can look even better performance-wise.
 

amenx

Diamond Member
Dec 17, 2004
4,418
2,740
136
Imo they should have bumped up performance a bit more to widen the gap vs the 970 and 290X, then maybe left GM200 as the follow-up to it.
 

mindbomb

Senior member
May 30, 2013
363
0
0
These premium cards trade energy efficiency for overclocking headroom...but if the user never overclocks (which is a possibility), then it is just a lose-lose situation, where you spend more and get less.
 
Last edited:

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
It has everything to do with the increased core clock and the power limit not being the same as reference, even at the stock 100%. GPU Boost 2.0 at work.

While I don't have the numbers for the GTX 980s, I do have them for the GTX 970s.

The ASUS STRIX card allows for ~163w with the 100% power limit and ~196w with the power limit set to 120%.
The MSI Gaming card allows for 200w with the 100% power limit and 220w with the power limit set to 110%.
The Gigabyte G1 card allows for 250w with the 100% power limit and 280w with the power limit set to 112%.
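
As a sanity check, the raised limits are just the stock limit scaled by the slider percentage (an illustrative sketch, not vendor tooling; the base limits are the per-card values above):

```python
# Illustrative sketch: board power limit = stock limit * power-limit slider.
cards = {
    "ASUS STRIX 970":  (163, 1.20),  # ~163 W at 100%, slider up to 120%
    "MSI Gaming 970":  (200, 1.10),  # 200 W at 100%, slider up to 110%
    "Gigabyte G1 970": (250, 1.12),  # 250 W at 100%, slider up to 112%
}

for name, (base_watts, max_slider) in cards.items():
    max_watts = base_watts * max_slider
    print(f"{name}: {base_watts} W stock limit -> ~{max_watts:.0f} W at {max_slider:.0%}")
```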
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
http://hexus.net/tech/reviews/graphics/75153-gigabyte-geforce-gtx-980-g1-gaming/?page=9

Gigabyte G1 shows the same thing. How is a 100MHz factory overclock causing a 25% increase in power consumption? Even with a change in the max possible power limit, why does the card make such heavy use of it for such a small overclock?

Do the reference 980s throttle often while these aftermarket cards sustain their clocks? I don't think the reference 980s could be throttling much because they run relatively cool.
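
For reference, rough f*V^2 scaling says the clock bump alone shouldn't do it; it takes a higher sustained boost voltage (or the reference card throttling) to explain the gap. A rough sketch with assumed reference clocks and voltages, not measured data:

```python
# Rough illustration (assumed numbers): dynamic power scales roughly with
# f * V^2, so a modest clock bump that also rides a higher boost voltage
# moves power more than the clock alone suggests.
def dynamic_power(p_base, f_base, f_new, v_base, v_new):
    """Scale baseline board power by frequency and voltage-squared."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

p_ref = 165.0  # assumed reference-980 board power, watts
# 100 MHz factory OC alone (assumed 1216 -> 1316 MHz, same voltage):
print(dynamic_power(p_ref, 1216, 1316, 1.212, 1.212))  # ~179 W, about +8%
# Same OC, but the card also sustains a higher boost voltage:
print(dynamic_power(p_ref, 1216, 1316, 1.212, 1.262))  # ~194 W, closer to +20%
```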
 
Last edited:

Sohaltang

Senior member
Apr 13, 2013
854
0
0
People really care about 50 watts in a gaming PC?? Give me moar speed! Double the speed and IDC if the GPU pulls 500 watts. I'll deal with the heat and PSU. Just give me speed.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
People really care about 50 watts in a gaming PC?? Give me moar speed! Double the speed and IDC if the GPU pulls 500 watts. I'll deal with the heat and PSU. Just give me speed.

On high-end cards, ultimately I doubt it. You hear it argued here because it's something to bicker over. 50W is nothing.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
My reference 980 with a +200 MHz core OC is barely showing any increase in power consumption over stock. It looks to be in the range of 10-15 watts, if that. I'm comparing over repeated Metro LL bench runs with a warmed-up card.
 

CakeMonster

Golden Member
Nov 22, 2012
1,621
801
136
Are you checking the actual clock speeds and voltage live while running benchmarks? Is it throttling or adjusting at all, or holding the +200MHz difference 100% of the time?
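
One way to watch it live is to just poll the driver once a second while the bench runs. A minimal logging sketch, assuming nvidia-smi is on the PATH:

```python
# Poll GPU clock, power draw and temperature once a second so throttling shows
# up as clock dips instead of being missed between manual readings.
# Stop with Ctrl+C.
import subprocess, time

QUERY = "clocks.sm,power.draw,temperature.gpu"

while True:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"]
    ).decode().strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```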
 

CakeMonster

Golden Member
Nov 22, 2012
1,621
801
136
Do the components of the individual card matter for power draw? Do cards equipped with more power inputs or higher-quality components draw more by themselves at idle and under load? Or is it the voltage settings, clock speed, and power limit that decide power draw regardless of layout?
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Or is it the voltage settings, clock speed, and power limit that decide power draw regardless of layout?


Mostly this^

Cards with larger VRM systems are actually more efficient, as more phases means less heat, more reliability, and less voltage ripple. The MOSFETs are the transistors that convert the PSU's 12V into a GPU-ready voltage, such as 1.25V. This process makes the MOSFETs put out a fair amount of heat, which is why you see passive heatsinks over the MOSFETs and typically nothing else in a GPU VRM setup. The hotter the MOSFETs get, the less efficient they become, just like in a PSU.
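
As a back-of-envelope illustration of the phase-count argument (assumed per-phase on-resistance and load current, not measured values):

```python
# Conduction loss in the VRM scales with I^2 * R per phase, so splitting the
# same GPU current across more phases lowers per-phase current and total heat
# dumped in the MOSFETs. All numbers below are assumptions for illustration.
def conduction_loss(total_current_a, phases, r_on_ohm=0.004):
    per_phase = total_current_a / phases
    return phases * (per_phase ** 2) * r_on_ohm

gpu_current = 180.0  # assumed: ~180 A at 1.25 V is roughly a 225 W core load
for phases in (4, 6, 8):
    print(f"{phases} phases: ~{conduction_loss(gpu_current, phases):.1f} W lost as heat")
```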
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Are you checking the actual clock speeds and voltage live while running benchmarks? Is it throttling or adjusting at all, or holding the +200MHz difference 100% of the time?

I'm watching my power meter in real time as the bench runs its course. I pretty much hover right around the 296-watt system power mark.

4790K @ 4.8GHz
980 @ 1440MHz
2 SSDs
2 sticks of RAM
3 case fans

Power consumption of this machine is pretty incredible, actually.


320 watts in Far Cry 3.
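
For a rough sense of what that wall reading means for the card itself (assumed PSU efficiency and rest-of-system draw, not measured):

```python
# A wall meter reads the whole system through the PSU, so the GPU's own draw
# has to be backed out. All numbers here are assumptions for illustration.
wall_watts = 320.0       # Far Cry 3 reading quoted above
psu_efficiency = 0.90    # assumed for a decent 80 Plus Gold unit at this load
rest_of_system = 110.0   # assumed: OC'd 4790K + board + SSDs + fans under game load

dc_power = wall_watts * psu_efficiency
gpu_estimate = dc_power - rest_of_system
print(f"~{gpu_estimate:.0f} W estimated GPU draw")  # ~178 W
```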
 
Last edited:

mindbomb

Senior member
May 30, 2013
363
0
0
Or is it the voltage settings, clock speed, and power limit that decide power draw regardless of layout?


Mostly this^

Cards with larger VRM systems are actually more efficient, as more phases means less heat, more reliability, and less voltage ripple. The MOSFETs are the transistors that convert the PSU's 12V into a GPU-ready voltage, such as 1.25V. This process makes the MOSFETs put out a fair amount of heat, which is why you see passive heatsinks over the MOSFETs and typically nothing else in a GPU VRM setup. The hotter the MOSFETs get, the less efficient they become, just like in a PSU.

I've heard the opposite: smaller VRM systems deliver less current, but are more efficient since the work gets divided among the components better.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I've heard the opposite: smaller VRM systems deliver less current, but are more efficient since the work gets divided among the components better.


Correct when delivering less current, but not at full load. A large VRM system will not use all of its phases at idle or when delivering a small amount of current (low GPU load).
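
Extending the earlier back-of-envelope numbers with an assumed fixed switching loss per active phase shows why controllers shed phases at light load:

```python
# At light load the per-phase switching overhead dominates, so fewer active
# phases win; at heavy load the I^2*R term dominates and more phases win.
# All numbers are assumptions for illustration.
def vrm_loss(total_current_a, active_phases, r_on_ohm=0.004, switch_loss_w=1.0):
    per_phase = total_current_a / active_phases
    conduction = active_phases * (per_phase ** 2) * r_on_ohm
    switching = active_phases * switch_loss_w
    return conduction + switching

for current in (15.0, 180.0):          # idle-ish vs full game load
    for phases in (2, 8):
        print(f"{current:>5.0f} A, {phases} phases: ~{vrm_loss(current, phases):.1f} W lost")
```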
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
I'm watching my power meter in real time as the bench runs it's course. I pretty hover right around the 296 watts system power mark.

Are the clock speeds fluctuating though? It might be hovering around the same wattage because it is at its TDP limit already and is throttling clocks to keep below it.
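
Roughly how a power-limited card behaves, as a deliberately simplified model (not NVIDIA's actual GPU Boost algorithm; bin size and numbers are assumptions):

```python
# Each tick, drop one clock bin if board power is over the limit, otherwise
# step back up toward the requested (offset) clock.
BIN_MHZ = 13  # assumed boost-bin size

def next_clock(clock_mhz, target_mhz, board_power_w, power_limit_w):
    if board_power_w > power_limit_w:
        return clock_mhz - BIN_MHZ   # throttle one bin to get back under the limit
    if clock_mhz < target_mhz:
        return clock_mhz + BIN_MHZ   # recover toward the OC target
    return clock_mhz

# A card already parked at its power limit just oscillates around the same
# wattage, so a +200 MHz offset shows up as clock dips rather than extra draw.
print(next_clock(1440, 1466, board_power_w=182, power_limit_w=180))  # -> 1427
```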
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Are the clock speeds fluctuating though? It might be hovering around the same wattage because it is at its TDP limit already and is throttling clocks to keep below it.

I have my reference 980s in SLI overclocked to +200 on the core, which translates to a 1466MHz boost clock. Looking at the Afterburner graph, there was absolutely no throttling after playing Crysis 3 for an hour or so; it stayed pegged at the boost clock.

Don't know how much wattage they are pulling, but the cards stay at 80C and I've never seen the fan ramp up past 60%.
 