FX5800: Most Powerful Graphics Card in History!


SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Wreckage
If you can read German, it would seem that the GT206 will use a lot less power
http://www.hardware-infos.com/news.php?news=2505

Wow, even with Google Translate that's hard to read.

Now read this

http://www.anandtech.com/video/showdoc.aspx?i=3340&p=2

Under load, the GTX+ once again draws around 3% less power than EVGA's KO edition; it would seem that the move to 55nm actually doesn't buy NVIDIA much in the way of power savings.

Doubt the GT200 will be any different, which is why they're going right to 40nm next.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I see your point with the memory clocks, but we can't say much until we know how much of an impact denser ICs have on power consumption. In this case it's 4GB of GDDR3 vs 1GB.

edit - The power savings are probably determined by the clocks of the new 55nm-based chips. The GTX+ is clocked a lot higher than the GTX card in both core/shader and memory clocks.

It also depends on whether the 55nm part is a simple die shrink OR nVIDIA decided to cut some unnecessary fat, which might be possible seeing as the GT200 was 6 months late (i.e. its 55nm derivative was probably already in the works).
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
The GTX+ and the GTX KO are clocked the same, other than the KO having a slightly higher memory clock (+25MHz).

Idle: 187.9W vs 192.9W
Load: 264.3W vs 272.7W
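
Quick sanity check on that "around 3% less power" figure, using the idle/load numbers above (rough arithmetic only; these look like total system draw, not the card alone):

# Back-of-the-envelope check of the AT figures quoted above (watts, total system)
gtx_plus = {"idle": 187.9, "load": 264.3}   # 55nm GTX+
gtx_ko   = {"idle": 192.9, "load": 272.7}   # 65nm EVGA GTX KO

for state in ("idle", "load"):
    delta = gtx_ko[state] - gtx_plus[state]
    pct = delta / gtx_ko[state] * 100
    print(f"{state}: {delta:.1f}W lower, about {pct:.1f}% less")
# load comes out to ~8.4W, i.e. roughly the 3% AnandTech mentions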
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
As for the 12 watt difference between the two 260s: the 4870s are showing a 10 watt difference with identical clocks under load. As the reviewer clearly mentioned, the majority of that difference is likely due to chip variation. If you take the 4870 as an acceptable deviation, then the difference between the parts would be 2 watts, and that's with different core and memory speeds (different thermals between the chips could also have the fans spinning at different speeds, accounting for some of this).
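
In rough numbers (just restating the deltas cited here, nothing measured by me):

# Deviation argument in plain arithmetic (watts, load figures cited in this thread)
delta_260s  = 12   # 65nm vs 55nm GTX 260
delta_4870s = 10   # two identically clocked 4870s, i.e. pure chip variation
print(f"Difference attributable to the parts themselves: ~{delta_260s - delta_4870s}W")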

Doubt the GT200 will be any different

It already is, profoundly. Look at the AT charts you linked: the 260 consumes less power than either of the G9x cores despite utterly dwarfing them. The two cores are entirely different beasts.

I'm not saying that any of this will add up to the kind of huge decrease in power usage we are hearing nVidia talk about (that would put the 290 in the 4850 power range, depending on clocks), but RAM makes up a very small portion of a board's power draw, and how one core reacts to a die shrink doesn't prove much about how a very different core will react.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@BenSkywalker

Do you really think the GT200 will benefit more from a 55nm transition than the G92b did?

The G92b saved about 8W and the GT200 will save 47W? Sorry, but I have a hard time seeing that.

If RAM clocks don't matter then prove it.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SSChevy2001
@BenSkywalker

Do you really think the GT200 will benefit more from a 55nm transition than the G92b did?

The G92b saved about 8W and the GT200 will save 47W? Sorry, but I have a hard time seeing that.

If RAM clocks don't matter then prove it.

Well so far that German site and the Quadro are pointing to lower power usage.

Why are you hoping and praying this is not so?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Do you really think the GT200 will benefit more from a 55nm transition than the G92b did?

Absolutely, I don't think anyone would reasonably question that. The GT200 core is 1.8 times the size of the G92; furthermore, there was no reason for nV to do anything to the G92 besides increasing yields per wafer. We know that the GT200b was respun, multiple times based on all reports. We also know that these parts are aimed at being refresh parts with higher performance and lower cost than what they replace; further, we know that they are planning on releasing an x2 part. The G92 was a straight optical shrink; the GT200b clearly had more to it than that.

If RAM clocks don't matter then prove it.

Ignoring a rather large variable there, aren't you? Namely the 4x increase in the amount of RAM:

The OCZ PC3-16000 and PC3-14400 Platinum Series will now be available in 2GB modules or 4GB (2x2048MB) dual channel kits and are backed by a lifetime warranty.

Specifications:

2000MHz DDR3
CL 9-9-9-28 (CAS-TRCD-TRP-TRAS)
Available in 1GB and 2GB modules and optimized kits
Unbuffered
Platinum Z3 XTC Heatspreader
Lifetime Warranty
1.8 Volts (1GB modules)
1.9 Volts (2GB modules)
240 Pin DIMM
1.85V EVP
EPP 2.0-Ready
Part numbers: 1GB Module - OCZ3P20001G, 2GB (2x1024MB)D/C PN - OCZ3P20002GK, 2GB Module PN - OCZ3P20002G, 4GB (2x2048MB)D/C PN - OCZ3P20004GK

Scroll down a little bit. So we have evidence that a 2x increase in RAM, comparable to the RAM utilized on the Quadro, increases voltage demands by ~6%. What do you have showing a comparable decrease for clock speeds?
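
That ~6% is just the voltage bump in the OCZ spec above (rough arithmetic; DDR3 modules used here as a stand-in for denser GDDR3, which is obviously not a perfect comparison):

# Voltage OCZ lists for the 1GB vs 2GB (denser) DDR3 modules, from the specs above
v_1gb_module = 1.8
v_2gb_module = 1.9
increase = (v_2gb_module / v_1gb_module - 1) * 100
print(f"~{increase:.1f}% higher voltage for the denser modules")   # ~5.6%, call it ~6%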
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@BenSkywalker

http://www.anandtech.com/video/showdoc.aspx?i=3415&p=9

The interesting thing is just how little difference going from 512MB to 1GB of RAM makes in terms of power. The two 4870s come in at just about the same power draw in both idle and load tests.
4870 1GB vs 4870 512MB (double the memory):
279.3W vs 278.6W

Even if you had 100% efficiency you're only going to get a 37W saving. There's no way it's going to be even close to that going from 65nm to 55nm. I would say at best you might see about 17W on the core. That still doesn't account for the remaining 30W.
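
Spelling out my arithmetic (my own estimates based on the numbers in this thread, not anything measured):

# Rough breakdown of the claimed GT200b saving (watts)
claimed_saving  = 47                # the 65nm -> 55nm saving being floated for GT200b
ram_delta_4870  = 279.3 - 278.6     # load difference from doubling the 4870's RAM (AT review above)
core_best_case  = 17                # my best-case guess for what the core shrink alone buys

print(f"Doubling the RAM changed load power by only {ram_delta_4870:.1f}W")
print(f"Left unexplained by the core estimate: ~{claimed_saving - core_best_case}W")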
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
This just isn't going to get anywhere. I can throw you links showing the 1GB 4870 using 10 watts less than the 512MB version; it doesn't mean anything. The power draw of the RAM isn't significant on ANY graphics card currently available, but if you want to think it is, have at it.

Even if you had 100% efficiency you're only going to get a 37W saving.

From a refresh and a die shrink.... OK.

A link for you. That is just a generic example, and it is a different type of shrink than what we are talking about (one full step down instead of one half), although it's just a straight die shrink. I'm not saying it will end up being exactly what's being reported here, as I have yet to see a neutral party test the Quadro, but whatever power savings the Quadro sees on a die shrink will be very close to what we can expect from the GTX.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Nice card, but if the HD 3870-based FireGL was able to considerably outperform the 8800GTX-based Quadro in rendering tests, I can't see this card outperforming an RV770-based FireStream card (whatever its name is). I might be wrong, but I highly doubt it. Quite cheap too, I'll buy 4 of them for cup holders!!
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: keysplayr2003
Originally posted by: Zap
Originally posted by: keysplayr2003
It is a bit interesting to see that they can fit 4GB of memory on the card.

Part of it is just using higher density ICs, though that's usually only good for doubling the RAM.

Yeah. I was thinking the same thing. 2GB would make more sense for higher density RAM.
4GB makes it sound like it has a 1024-bit memory controller, or two 512-bit ones. LOL.

Alright, I have a GTX 280 in front of me, taken apart.

The PCB has pads on both sides for double the amount of memory ICs.

The reference fansink has extrusions for the extra unused memory IC pads.

So, double the number of ICs, and higher density ones. Easy!