Is it possible SLI Classified could be pulling this much juice??


YBS1

Golden Member
May 14, 2000
1,945
129
106
But it's crashing out of 3DMark, right?

What happens if you just play a game?

Your CPU shouldn't load up much except in the physics test and in the combined, but I'd imagine you'd be bottlenecked by your CPU in the combined test either way.

If it's crapping out on the first test, I doubt it really matters that your CPU is at 5.1GHz, meaning your CPU isn't making a huge impact on power consumption.

That said, my highest draws in 3DMark are in the GPU tests, even though my CPU isn't being worked very hard there. I draw less power in the combined test than I do in the first two tests.

Try Crysis 3 maxed out and see if it crumbles there too; that's actually a fairly balanced load that should work both the CPU and the GPUs.

Yes, just as the loading screen finishes, the instant the actual rendering starts to "fade in", it shuts down; not a crash, it turns OFF. It doesn't happen at 4.8GHz with the cards all roided up, only at 5.0-5.1GHz, and it's fine at 5.1GHz as well so long as the cards are left in a relatively tame state (1.2V, 1300MHz). It's just the combination, which is what led me to believe I'm pushing up against some kind of power limitation.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yes, you are reaching the maximum power that 105 amps can sustain; you could easily pass 125 amps on a 1500W unit too. Use your 650 for one card. A card on a full-cover block at 30C can use up to 25% less power at identical voltages and clocks than the same card on stock air at 80C. Kill-a-Watts are $6 on eBay.

doesn't happen at 4.8GHz with the cards all roided up, only at 5.0-5.1GHz, and it's fine at 5.1GHz as well so long as the cards are left in a relatively tame state (1.2V, 1300MHz). It's just the combination

That's proof right there. I'm guessing both PSUs on a power strip, a dedicated 20-30 amp breaker, a dedicated circuit, CPU at 1.55V+, VGAs at 1.4/1.7V+, pushing the same test for the highest overclocks, would show 1800-2100W on a Kill-a-Watt. You're probably at 1100-1400W right now at only 1.27V on the VGAs...

These guys use ZERO peripherals, are only a few hundred MHz above you, and still like to have ~1000 watts per card (not including PCIe)
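
As a quick sanity check of those numbers, here is a minimal sketch (Python) that converts a 12V rail amp rating to watts and compares it against the load estimates from this thread; the 105A rating and the wattage figures are taken from the posts above, not measured:

```python
# Rough sanity check: 12V rail amp rating vs. estimated DC load.
# The 105A rating and the load estimates come from this thread; treat them
# as assumptions, not measurements.

RAIL_VOLTS = 12.0

def rail_capacity_watts(rail_amps: float) -> float:
    """Approximate sustained wattage a 12V rail rated for `rail_amps` can deliver."""
    return rail_amps * RAIL_VOLTS

capacity = rail_capacity_watts(105)          # ~1260W for a 105A rail
for estimated_load in (1100, 1400, 1800, 2100):
    headroom = capacity - estimated_load
    status = "within the rail rating" if headroom > 0 else "OVER the rail rating"
    print(f"load {estimated_load} W -> headroom {headroom:.0f} W ({status})")
```

At roughly 1260W for a 105A rail, the 1100-1400W estimate above leaves little to no margin, which lines up with the shutdowns described in the thread.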

[Images: extreme LN2 benching rigs, including a quad-CrossFire HD 7970 setup]
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
^Ya crazies.

Easy to see what you need when you look at what the pros are doing. Take 400W out for the CPU, and these guys are giving themselves headroom of 1400 watts and 1100 watts per GPU respectively, while you're allowing 450 watts per GPU. Obviously you will hit the power wall sooner than they will. 6,000 and 4,800 watts total. They bench at 2.3V+ on the CPU and 1.9/2.0V+ on the VGAs, with all kinds of extra capacitors and phases soldered on. That is Kingpin and Andre Yang, 4x Titan and 4x 5870. Their CPU clocks are 5.5-6.5GHz with VGAs at 1.8-2.0GHz.
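
For anyone who wants to reproduce that per-GPU headroom math, here is a minimal sketch (Python); the total wattages, card counts, and the 400W CPU budget are the figures stated in this post and the thread, not independent measurements:

```python
# Per-GPU headroom = (total PSU wattage - CPU budget) / number of GPUs.
# All figures below are taken from this thread.

def per_gpu_headroom(total_psu_watts: float, cpu_watts: float, num_gpus: int) -> float:
    return (total_psu_watts - cpu_watts) / num_gpus

print(per_gpu_headroom(6000, 400, 4))   # Kingpin's 4x Titan rig: ~1400W per GPU
print(per_gpu_headroom(4800, 400, 4))   # Andre Yang's 4x 5870 rig: ~1100W per GPU
print(per_gpu_headroom(1300, 400, 2))   # the OP's single 1300W PSU: ~450W per GPU
```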
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
BTW, congrats on 9th place for 2x GPU. Try the CPU at 5.2-5.3GHz, use the 331.40 driver, add that 2nd PSU, and up the volts from 1.27 to ~1.45, and I see you taking 3rd or 4th place! :p
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Keep in mind that 91% efficiency at full draw is about 1090 on a 1200w power supply. Which means if you are drawing 2x400W (800W) for the cards and 250W for the CPU, that's 1050W, only 40W less than the max internal power. This doesn't count cooling and other peripherals, or the MB itself.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
I have a Lepa 1600 watt on the way, we'll see if that solves the issue.
 

Slomo4shO

Senior member
Nov 17, 2008
586
0
71
Keep in mind that 91% efficiency at full draw is about 1090 on a 1200w power supply.

What? What does the efficiency have to do with the load capacity of the PSU? A 91% efficient model loaded at 1200W would pull 1200W/0.91 = 1318.68W from the outlet. Having a lower-efficiency model doesn't lower the DC load capacity of the unit; it just means the unit is less efficient at delivering that power...
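
To make the distinction concrete, here is a minimal sketch (Python) of the relationship being described: efficiency converts the DC load into AC draw at the outlet, and does not shrink the rated DC capacity. The 91% and wattage figures are the ones used in the posts above:

```python
# Efficiency relates DC output to AC input; it does not change the rated DC capacity.

def ac_draw_from_outlet(dc_load_watts: float, efficiency: float) -> float:
    """Watts pulled from the wall to deliver `dc_load_watts` to the components."""
    return dc_load_watts / efficiency

print(ac_draw_from_outlet(1200, 0.91))   # ~1318.7W at the wall for a full 1200W DC load
print(ac_draw_from_outlet(1050, 0.91))   # ~1153.8W at the wall for the 1050W estimate above
```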
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Grab a $6 Kill-a-Watt off eBay to go along with your new $340 PSU. Actually SEE how much juice you're using.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Just a quick update on this:
I've had the new PSU for a few days now and got around to installing it tonight. That was a pain in the butt I wasn't expecting, due to numerous cumulative things. One being that the PCIe cables are quite short compared to the previous PSU's, which in itself wouldn't be an issue, but with the HAF-XB's unusual layout it was a challenge. This was compounded by the wiring itself being quite stiff. That also led to the PCIe plug/wiring of the primary card solidly impacting the fan housing of the 200mm top-mount fan, preventing the top panel from sliding forward enough to close... *sigh*, had to get the Dremel out to shave that down a bit.

Anyway, long story short ---> I guess a 3930K clocked north of 5GHz and two overvolted 780 Classifieds can indeed shut down a 1300 watt PSU. That is to say, yes, the new 1600 watt PSU performed beautifully and solved the issue. Now to figure out where these cards top out, I guess; didn't have time for that tonight. To be honest, that will probably be put on the back burner until I get my other new toys installed. I've had a pair of 1TB Samsung Evos sitting on the desk for a week now crying out for 8.1 to be installed. :twisted:
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
/jelly

Shoot, 300W more headroom; I'd go dual supply >.<

Also, you should look into cable extenders to alleviate the cable problems with the new PSU, and for beautification.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The general guesstimate is about 400W with 1.3V going through them. On OCN someone running 4 Titans @ 1.3V and an OCed SB-E was shutting down his 1500W PSU.

Just out of curiosity, what would we expect at the following voltages (or if you can link me to an article that might have the info):

@1.125V
@1.175V

I've got a dinky 750W, and it seems running clocks higher than 1125MHz at 1.175V crashes my driver.

I've run a suicide test at 1.225V, 1250/1570, on one card with no issue.
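
Nobody in the thread quotes numbers for those exact voltages, but as a very rough ballpark you can assume dynamic power scales roughly with voltage squared at similar clocks; this is a generic rule of thumb, not something measured here, and it ignores leakage and clock differences. A minimal sketch, using the ~400W-at-1.3V guesstimate quoted above as the baseline:

```python
# Very rough ballpark only: assumes power scales ~V^2 at similar clocks and
# ignores leakage and clock-speed differences. The 400W @ 1.3V baseline is
# the "general guesstimate" quoted above, not a measurement.

def scaled_power(base_watts: float, base_volts: float, new_volts: float) -> float:
    return base_watts * (new_volts / base_volts) ** 2

for volts in (1.125, 1.175, 1.225, 1.3):
    print(f"{volts:.3f} V -> ~{scaled_power(400, 1.3, volts):.0f} W per card")
```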
 

seitur

Senior member
Jul 12, 2013
383
1
81
A 3930K alone at a ~5GHz OC will draw ~400W+, maybe even more depending on the chip lottery, temperatures, etc.

http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/10

A 3930K@4.7GHz platform draws about 200W more in the Prime95 CPU test than a 2600K@5GHz.
The extra cores and beefier platform come at a wattage price.
An additional 400MHz at such high frequencies comes with a big additional power draw.


So a 3930K @ 5.1GHz + overclocked 780 SLI + other stuff (how many HDDs, memory sticks, and other components do you have?) can EASILY draw more than 1260W.
BTW, simply buy a watt meter and check for yourself. :)
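
Here is a minimal sketch (Python) of that kind of back-of-the-envelope total; the per-component wattages are the rough figures floated in this thread, and the 100W line for the rest of the system is purely an assumed placeholder:

```python
# Back-of-the-envelope DC load estimate using rough figures from this thread.
# Compared against ~1260W, i.e. a 105A 12V rail (105 * 12).

components = {
    "3930K @ ~5.1GHz, heavy OC": 400,             # seitur's estimate above
    "GTX 780 Classified #1 @ ~1.3V": 400,         # "general guesstimate" from earlier posts
    "GTX 780 Classified #2 @ ~1.3V": 400,
    "Motherboard, RAM, drives, fans, pump": 100,  # assumed placeholder, not from the thread
}

total = sum(components.values())
print(f"Estimated DC load: ~{total}W vs. 12V rail capacity at 105A: {int(105 * 12)}W")
```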
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
My 3930K pulls ~200W at 4.4GHz and 1.3V. It's rubbish. 4.5GHz isn't stable despite 1.4V pumped through it, and the power usage climbs well above 300W for that single 100MHz. So far I haven't found a set of stable settings, so I keep knocking it back to the much easier to achieve 4.4GHz.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
What does the 100% power target value refer to with the new BIOS? Stock 100% is 250W.
Check with a BIOS editor what the value actually says:

[BIOS editor screenshot]

This is my modded BIOS. As you can see, the maximum wattage that one of my Titans can pull is 375W. 100% is still 250W, though. If it is the same with your card, your cards cannot possibly draw more than 250W*120%*2 = 600W (add to that the losses at the power supply, of course).
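
As a quick sketch of that math (Python): the 250W base, the 375W BIOS cap, and the 120% slider setting are the figures from this post, applied to two cards; the helper below is illustrative, not a real BIOS tool:

```python
# Board power allowed at a given power-target setting, capped by the BIOS maximum.
# Figures (250W base, 375W cap, 120% slider) are from the post above.

def board_power(base_tdp_watts: float, target_percent: float, bios_max_watts: float) -> float:
    return min(base_tdp_watts * target_percent / 100.0, bios_max_watts)

per_card = board_power(250, 120, 375)   # 300W per card at a 120% power target
print(per_card, per_card * 2)           # 300.0 600.0 -> the 600W two-card figure above
```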
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
^ With the Skynet BIOS these cards are opened up to a maximum of over 500W, 100% is somewhere around 400W and 130% is well over that. Take a look at OCN, people have used Killawatt. GK110 is going up above 350W and near 400W with 1.3V+ pumped through it.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yeah, you can easily exceed the Titan's power draw with the Classified, since there are specific tools to unlock the card fully with EVBot and some modded BIOSes going around. Additionally, TDP is not power consumption. The 250W TDP does not correlate to 250W of power consumption; TDP has never equaled maximum power consumption.

I remember overvolting my GTX 680 Lightnings, and with a single card I could go past 800W of power consumption measured at the wall when I overvolted to 1.4xV and unleashed the kraken. With a single card at stock settings? It was usually in the 300s. The point is, TDP is not a measurement to pay attention to in terms of power consumption; TDP is NOT max power consumption and does not measure power consumption directly. Never has, never will, especially when you overclock or overvolt.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
^ With the Skynet BIOS these cards are opened up to a maximum of over 500W, 100% is somewhere around 400W and 130% is well over that. Take a look at OCN, people have used Killawatt. GK110 is going up above 350W and near 400W with 1.3V+ pumped through it.

That's why I asked exactly which BIOS he used. :)
I suspected that there are some variants out there which have 100% set to a higher value than 250W, but I was not sure.

@TDP:
With today's GPUs, TDP = max power consumption if the card is not overclocked. The TDP of the Titan is 250W, and if your Titan reports 100% power usage (no modded BIOS), it draws almost exactly 250W. That's because the card will boost as high as possible (barring the temp target) until the power-regulating circuitry kicks in and limits the clocks so the card doesn't exceed the power target.
Before we had boost or the power-limiting "feature", this was different. A card could draw much less than its TDP or quite a bit more. Today, though, these features lead to the cards pretty much staying at their TDP if the GPU load is sufficiently high.
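
A toy model of what is being described, purely conceptual and not Nvidia's actual boost algorithm: the card keeps raising clocks while measured board power is under the power target and backs off once it is over, so sustained draw ends up pinned near the target.

```python
# Purely conceptual toy model of a boost/power-limit loop, NOT Nvidia's real algorithm.
# It only illustrates why sustained draw ends up sitting near the power target.

POWER_TARGET_W = 250.0   # the stock 100% target discussed above
CLOCK_STEP_MHZ = 13.0    # arbitrary step size for this sketch

def measured_board_power(clock_mhz: float) -> float:
    # Stand-in for the card's power telemetry (assumed simple linear model).
    return clock_mhz * 0.25

clock_mhz = 800.0
for _ in range(100):
    if measured_board_power(clock_mhz) < POWER_TARGET_W:
        clock_mhz += CLOCK_STEP_MHZ   # headroom left: boost higher
    else:
        clock_mhz -= CLOCK_STEP_MHZ   # over the target: back the clock off

print(f"settled clock ~{clock_mhz:.0f} MHz, draw ~{measured_board_power(clock_mhz):.0f} W")
```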
 

Abwx

Lifer
Apr 2, 2011
11,964
4,935
136
What does the 100% power target value refer to with the new BIOS? Stock 100% is 250W.
Check with a BIOS editor what the value actually says:

[BIOS editor screenshot]

This is my modded BIOS. As you can see, the maximum wattage that one of my Titans can pull is 375W. 100% is still 250W, though. If it is the same with your card, your cards cannot possibly draw more than 250W*120%*2 = 600W (add to that the losses at the power supply, of course).

ROFL.......step is in mW...:D
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
That's why I asked exactly which BIOS he used. :)
I suspected that there are some variants out there which have 100% set to a higher value than 250W, but I was not sure.

@TDP:
With today's GPUs, TDP = max power consumption if the card is not overclocked. The TDP of the Titan is 250W, and if your Titan reports 100% power usage (no modded BIOS), it draws almost exactly 250W. That's because the card will boost as high as possible (barring the temp target) until the power-regulating circuitry kicks in and limits the clocks so the card doesn't exceed the power target.
Before we had boost or the power-limiting "feature", this was different. A card could draw much less than its TDP or quite a bit more. Today, though, these features lead to the cards pretty much staying at their TDP if the GPU load is sufficiently high.

That doesn't even make sense, because TDP is not a measure of maximum power consumption and never has been. Anyway, this isn't really worth arguing about because I do like GPU Boost 2.0 quite a lot, so I guess we can agree to disagree about the TDP tidbit. That said, the Classified does allow quite a bit more in terms of power consumption than any Titan, especially with EVBot and the modded BIOS. It can go pretty much as sky-high as the 680 Classified and 680 Lightning could; I don't think that's possible with the Titan unless you hard-solder a VRM mod to it and whatnot.

I just find it confusing that the GTX 780 was allowed to be modified with custom PCBs like this, yet the Titan wasn't. Confusing. It seems the other way around would make more sense, if anything. Not that I'm complaining, being a 780 user myself. ;)
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
That doesn't even make sense, because TDP is not a measure of maximum power consumption and never has been. Anyway, this isn't really worth arguing about because I do like GPU Boost 2.0 quite a lot, so I guess we can agree to disagree about the TDP tidbit. That said, the Classified does allow quite a bit more in terms of power consumption than any Titan, especially with EVBot and the modded BIOS. It can go pretty much as sky-high as the 680 Classified and 680 Lightning could; I don't think that's possible with the Titan unless you hard-solder a VRM mod to it and whatnot.

I just find it confusing that the GTX 780 was allowed to be modified with custom PCBs like this, yet the Titan wasn't. Confusing. It seems the other way around would make more sense, if anything. Not that I'm complaining, being a 780 user myself. ;)

It hasn't been historically, but since Kepler it effectively is, because the power target and the TDP are closely related if not the same.

If Nvidia wants to stay ahead, they will have to rethink their Greenlight program and allow custom versions of all their cards.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
That doesn't even make sense, because TDP is not a measure of maximum power consumption and never has been. Anyway, this isn't really worth arguing about because I do like GPU Boost 2.0 quite a lot, so I guess we can agree to disagree about the TDP tidbit. That said, the Classified does allow quite a bit more in terms of power consumption than any Titan, especially with EVBot and the modded BIOS. It can go pretty much as sky-high as the 680 Classified and 680 Lightning could; I don't think that's possible with the Titan unless you hard-solder a VRM mod to it and whatnot.

I just find it confusing that the GTX 780 was allowed to be modified with custom PCBs like this, yet the Titan wasn't. Confusing. It seems the other way around would make more sense, if anything. Not that I'm complaining, being a 780 user myself. ;)

Somehow I think EVGA, being their biggest partner, got a pass for the 780 Classified. You have to use an external software tool or EVBot to do the adjustments, and I wouldn't be surprised if most people who have bought a Classified never go beyond using Precision and the standard 1.21V maximum.

I'm still interested in how the 780 HOF does so well. I wonder if there is some hidden voodoo on the card pumping extra juice into it for those 1300MHz clocks you see some of them getting. There is no way either of my Classifieds can do 1300 stable with the standard 1.21V.