GK110 Fully Enabled

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If the rumors of AMD's GPU refresh coming October/November are true, then I think we'll see a fully loaded GK110 on the GeForce side, too. Nvidia played this game with Fermi, releasing cut-down parts just fast enough to be clearly faster than AMD's (in part because they probably had to with their first-gen Fermi parts). This time they obviously had more wiggle room and timed GK110 accordingly (and they probably needed the time to get GK110 right), but I can definitely see AMD matching the GTX 780 and forcing Nvidia to drop its price and release a GTX 785 that will actually outperform Titan in games.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
If the rumors of AMD's GPU refresh coming October/November are true, then I think we'll see a fully loaded GK110 on the GeForce side, too. Nvidia played this game with Fermi, releasing cut-down parts just fast enough to be clearly faster than AMD's (in part because they probably had to with their first-gen Fermi parts). This time they obviously had more wiggle room and timed GK110 accordingly (and they probably needed the time to get GK110 right), but I can definitely see AMD matching the GTX 780 and forcing Nvidia to drop its price and release a GTX 785 that will actually outperform Titan in games.

I suspect that will be true as well. Nvidia doesn't want to be caught with their pants down if the 9970 happens to match the Titan.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Anyone else surprised this was rated at 225W? Clocks weren't listed, but they figure 900MHz.

Really specially binned chips then, or what?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
If a GeForce variation appears, will it be >$1K, or will it supplement Titan at $1K while Titan takes the $800 spot?

Damn, this is really making it seem like the 9970 won't be <$600. If it matches Titan, and a 785 exists... oh God, my wallet...
 
Feb 19, 2009
10,457
10
76
Sounds really boring to me; in a few months it's nearly 2 years into this gen and we're getting such tiny bumps.

Would these cards really entice users who have owned a 680 or 7970 for this long to upgrade? I guess GPUs are becoming like CPUs: keeping them for longer and longer.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Sounds really boring to me; in a few months it's nearly 2 years into this gen and we're getting such tiny bumps.

Personally, that's better than no bumps.

Would these cards really entice users who have owned a 680 or 7970 for this long to upgrade? I guess GPUs are becoming like CPUs: keeping them for longer and longer.

When my cycle comes to upgrade, the whole package is a factor, not just the performance boost (though that is always the most important factor). Efficiency and features are very important.

With the stagnation, cards seem to hold their values just a little better, so the upgrade cost (at least in my experience) isn't as steep.

However, if the costs get into the >$800 price points - woof. Even selling both my cards won't get me a new card :(
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Anyone else surprised this was rated at 225W? Clocks weren't listed, but they figure 900MHz.

Really specially binned chips then, or what?

Titan consumes 200-210W at ~850 MHz (German and French reviews with handicapped boost). So it's not too far out there.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
If the rumors of AMD's GPU refresh coming October/November are true, then I think we'll see a fully loaded GK110 on the GeForce side, too. Nvidia played this game with Fermi, releasing cut-down parts just fast enough to be clearly faster than AMD's (in part because they probably had to with their first-gen Fermi parts). This time they obviously had more wiggle room and timed GK110 accordingly (and they probably needed the time to get GK110 right), but I can definitely see AMD matching the GTX 780 and forcing Nvidia to drop its price and release a GTX 785 that will actually outperform Titan in games.
Wait, so Titan was their midrange card as well?!
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Wait, so Titan was their midrange card as well?!

Titan is GK110; that's the high-end chip. The high-end chip can be cut down, like say a GTX 480... but it's still their high-end chip.

Titan would be like the 480: SMX disabled but full ROPs and bus.

The GTX 680/770 would be the 560 Ti of Fermi: a fully enabled mid-range die.

The GTX 780 is unique in that it still has the full ROPs and bus but has two SMXes disabled, which is probably why, clock for clock, it's barely any slower than Titan.

A fully enabled GK110 isn't all that inspiring unless they boost the TDP limit for GeForce. Kepler seems to be running against other bottlenecks, not of the shader variety; any boost this card would bring would be single digits at the same clocks over Titan.

Nvidia still has a wealth of TDP to tap into, though, so if AMD does by some stretch become competitive with Nvidia in these upper segments, expect Nvidia to release this with power consumption more in line with the TDP budget.

I don't think a fully enabled GK110 is very well balanced for gaming; it's more workstation-oriented. However, Nvidia could put 770 RAM speeds on it and boost its clocks so it uses a fair bit more power than Titan and the 780; that would create some separation.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Titan at full boost already uses more power than the 7970 GHz, about 20W more. There is not really any more headroom left.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Where?

[Power consumption charts: Tom's Hardware gaming power consumption and TechPowerUp average power draw]
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
How are Tom's and TPU measuring card power on its own? Sure, they can measure power usage from the two external power connectors, but they can't measure the power going through the PCI-E slot.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
It's total system power consumption. Not Titan's.

Like that makes a difference. It doesn't matter what makes it draw more power; your PC will draw more power with a Titan in the system than with a 7970 GE. Maybe there's increased chipset traffic, maybe there's increased CPU load from more driver overhead; who knows what it is. The fact of the matter is that system power consumption is more relevant than just the GPU's power consumption.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Like that makes a difference. It doesn't matter what makes it draw more power; your PC will draw more power with a Titan in the system than with a 7970 GE. Maybe there's increased chipset traffic, maybe there's increased CPU load from more driver overhead; who knows what it is. The fact of the matter is that system power consumption is more relevant than just the GPU's power consumption.

I don't usually agree with Ibra but in this case I think his statement is relevant to the discussion of how much TDP Nvidia has to work with. If the extra system power consumption comes from other components besides the GPU, they need to be subtracted out for a TDP discussion.

When considering power consumption by itself, I agree with you. A GPU needs the rest of the system to function, and if other components use more power with a different video card, that's relevant to a consumer's decision.
 

Saylick

Diamond Member
Sep 10, 2012
3,921
9,134
136
It's total system power consumption. Not Titan's.

Please correct me if I'm wrong here, but if we assume everything else in the test rig is held constant between tests, the difference in power consumption between the bars on the chart will be a result of the video cards themselves.
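That controlled-comparison logic is just a subtraction. A minimal sketch with made-up wall-power numbers (not taken from any review):

```python
# Estimate the per-card power difference from total-system measurements,
# assuming everything else in the test rig is held constant between runs.
# The wattage figures below are hypothetical, for illustration only.

system_with_titan = 350.0    # W at the wall with the Titan installed
system_with_7970ghz = 330.0  # W at the wall with the 7970 GHz installed

# With the rest of the rig unchanged, the delta is attributable to the
# swapped cards (plus second-order effects like extra CPU load feeding
# the faster GPU, which is why this is only an estimate).
card_delta = system_with_titan - system_with_7970ghz
print(f"Titan draws about {card_delta:.0f} W more in this scenario")
```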
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Right, Titan isn't turning in 30% more performance with the same system load...

The conversation was about GK110 power consumption specifically and how that relates to what Nvidia has left on the table this gen.

Total system consumption is a moot point in this discussion; Titan was pulling 53% more frames than the GHz in the AnandTech graph at 1440p.

Anyone with a basic understanding of how a PC works knows that 53% more frames delivered by a GPU means the CPU and everything else is working that much harder to keep up with the faster GPU.
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
How are Tom's and TPU measuring card power on its own? Sure, they can measure power usage from the two external power connectors, but they can't measure the power going through the PCI-E slot.

I'd assume they use something like a PCI-Express bus extender (riser card), as hardware.fr does:
http://www.hardware.fr/articles/781-1/dossier-vraie-consommation-73-cartes-graphiques.html

As for the Quadro K6000, a max of 225W sounds very good considering it has 12GB of GDDR5. I thought the memory would use a fair bit all by itself; I know from measurements I did for bitcoining (SHA256) that the 3GB 7950 I tried consumed about 30W less at the wall with the memory set to 310MHz vs what it pulled at 1500MHz. That was with one of the newer Gigabyte Windforce models, which doesn't report memory voltages.
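That riser-based approach comes down to metering each supply path separately and summing volts times amps. A minimal sketch with hypothetical rail readings (not hardware.fr's actual data):

```python
# Card-only power from per-rail measurements, as a riser setup allows:
# the slot's 12V and 3.3V pins plus each external PCIe connector are
# metered individually. All currents below are invented example values.

rails = {
    "PCIe slot 12V":   (12.0, 4.5),   # (volts, amps)
    "PCIe slot 3.3V":  (3.3, 1.0),
    "8-pin connector": (12.0, 10.0),
    "6-pin connector": (12.0, 5.5),
}

total_w = sum(volts * amps for volts, amps in rails.values())
print(f"card-only power: {total_w:.1f} W")
```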
 

Granseth

Senior member
May 6, 2009
258
0
71
Can't you just log how many amps and volts the card uses to get the wattage? There's a sea of software solutions to read those values, at least for AMD's GPUs.

And if you read the AnandTech review, they're using a much newer card than the one they got as a review sample; it can't be any surprise that it gets more efficient as production improves.
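Sketched out, the wattage is just the product of each logged voltage/current pair, averaged over the log. The samples below are hypothetical, not from any real card:

```python
# Compute average board power from logged (voltage, current) samples,
# e.g. from a GPU sensor readout tool. Values are invented examples.

samples = [
    (0.95, 120.0),  # (core voltage in V, current in A)
    (1.00, 150.0),
    (1.05, 180.0),
]

watts = [volts * amps for volts, amps in samples]
avg_power = sum(watts) / len(watts)
print(f"average power: {avg_power:.1f} W")
```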
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
As for the Quadro K6000, a max of 225W sounds very good considering it has 12GB of GDDR5. I thought the memory would use a fair bit all by itself; I know from measurements I did for bitcoining (SHA256) that the 3GB 7950 I tried consumed about 30W less at the wall with the memory set to 310MHz vs what it pulled at 1500MHz. That was with one of the newer Gigabyte Windforce models, which doesn't report memory voltages.

The energy consumption of additional memory capacity is negligible. Refreshing the memory uses little energy, but writing and reading over the bus uses a lot (moving data is costly; Nvidia mentioned this in a presentation about future architectures a few years ago, because data locality is king). But since the bus width is constant, the amount of data transferred per unit of time is also constant. Thus the dynamic power of 3, 6 and 12 GB on a 384-bit bus is the same.
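That bus-width argument can be put into rough numbers. A back-of-envelope sketch, assuming an invented energy-per-bit figure (the real GDDR5 value depends on the implementation):

```python
# Dynamic power of a memory interface scales with bits moved per second
# (bus width x per-pin data rate), not with capacity. The pJ/bit figure
# is an assumed round number for illustration, not a GDDR5 datasheet value.

bus_width_bits = 384         # Titan / GTX 780-class bus
data_rate_gbps = 6.0         # per-pin data rate, Gb/s
energy_per_bit_pj = 20.0     # assumed transfer energy, picojoules per bit

bits_per_second = bus_width_bits * data_rate_gbps * 1e9
dynamic_power_w = bits_per_second * energy_per_bit_pj * 1e-12

# Same bus and data rate => same dynamic power whether 3, 6 or 12 GB
# hangs off it; only static/refresh power grows with capacity.
for capacity_gb in (3, 6, 12):
    print(f"{capacity_gb:>2} GB: ~{dynamic_power_w:.0f} W dynamic")
```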