FX5800: Most Powerful Graphics Card in History!


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
This thread should be renamed to "Quadro FX5800: Most Powerful Workstation Card in History!"

The most interesting thing about this recent release isn't the 4GB of RAM, or the reuse of the infamous FX 5800 moniker, but the TDP values.

The Quadro FX 5800 has a TDP of 189W; the GTX 280's is 236W. That's a whopping 47W difference. It gets more interesting, I suppose: the Quadro not only packs in 4GB of RAM, but judging by the fillrate numbers, it's actually clocked higher than the GTX 280.

So could this new card be using a 55nm version of the GT200? ala GT206??
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Cookie Monster


So could this new card be using a 55nm version of the GT200? ala GT206??

True, this could be our first look at the "refresh".
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Somewhat related question regarding video card memory: is it the case that on a 32-bit OS, the more video card memory you have, the less system memory is available? Also, isn't a 4870 X2 technically a more powerful card? Anyway, 4GB is pretty nifty; does it actually make a difference for workstation-type jobs? It's funny how just a year or two ago 512MB was a lot of video memory.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: SlowSpyder
Somewhat related question regarding video card memory: is it the case that on a 32-bit OS, the more video card memory you have, the less system memory is available? Also, isn't a 4870 X2 technically a more powerful card? Anyway, 4GB is pretty nifty; does it actually make a difference for workstation-type jobs? It's funny how just a year or two ago 512MB was a lot of video memory.

Yep, I believe if you ran a video card with 4GB of video RAM in your system, you'd have to use a 64-bit OS, or the 32-bit OS would have hardly any RAM left to use, lol. As 1GB becomes the standard for video card RAM, most gamers will have to move to a 64-bit OS unless they don't care about losing more and more system RAM.
 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: Wreckage
True, this could be our first look at the "refresh".

Er, the second: the Quadro CX was also supposedly a GT206.

Interestingly, the specs on the NVIDIA product page don't show the memory interface for the FX. The CX has a 384-bit memory interface (looking today, that has been removed from the NVIDIA spec sheet for some reason).

Assuming they used the same memory on the two cards, (512 / 384) x 76.8 = 102.4GB/s, which closely matches the 102GB/s listed on the Quadro FX page.

If it is a 55nm chip, that implies they have still kept the 512-bit memory interface.
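The scaling argument above can be sketched in a few lines (a rough back-of-the-envelope sketch only; the 76.8GB/s Quadro CX figure and the bus widths are the ones from this thread, and it assumes both cards run their memory at the same clock):

```python
# Rough estimate: scale the Quadro CX's memory bandwidth from its
# 384-bit interface up to a 512-bit interface, assuming both cards
# run their memory at the same clock.
cx_bandwidth_gbs = 76.8   # Quadro CX (384-bit interface)
cx_bus_bits = 384
fx_bus_bits = 512

fx_bandwidth_gbs = cx_bandwidth_gbs * fx_bus_bits / cx_bus_bits
print(f"Estimated FX 5800 bandwidth: {fx_bandwidth_gbs:.1f} GB/s")  # 102.4 GB/s
```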
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Don't forget that Quadro cards are almost always clocked lower than the GF cards. This could account for the 189W power draw.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: keysplayr2003
Don't forget that Quadro cards are almost always clocked lower than the GF cards. This could account for the 189W power draw.

This one's clocked higher. Look at the fillrate numbers ;)

The Quadro FX 5800 maintains a 512-bit bus. It's simple: since the 4GB of GDDR3 is clocked at 800MHz, that leaves us with 512/8 x 800 x 2 = 102.4GB/s.

 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: keysplayr2003
Don't forget that Quadro cards are almost always clocked lower than the GF cards. This could account for the 189W power draw.

Except according to TechReport it has a higher fill rate: 52.0 Gtexels/s compared to the GTX 280 at 48.2 Gtexels/s.

That suggests maybe a higher clock? i.e. 52/48.2 x 600 = 647MHz
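That back-of-the-envelope estimate works out like so (a sketch only; it assumes the Quadro has the same texture-unit count as the GTX 280, so fill rate scales linearly with core clock, and uses 600MHz as the approximate GTX 280 clock):

```python
# Back out the Quadro FX 5800's core clock from its texel fill rate,
# assuming the same texture-unit count as the GTX 280 (so fill rate
# scales linearly with core clock).
gtx280_fillrate_gt = 48.2   # Gtexels/s
quadro_fillrate_gt = 52.0   # Gtexels/s (TechReport figure)
gtx280_clock_mhz = 600      # approximate GTX 280 core clock

estimated_clock_mhz = quadro_fillrate_gt / gtx280_fillrate_gt * gtx280_clock_mhz
print(f"Estimated Quadro core clock: {estimated_clock_mhz:.0f} MHz")  # ~647 MHz
```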
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Cookie Monster
Originally posted by: keysplayr2003
Don't forget that Quadro cards are almost always clocked lower than the GF cards. This could account for the 189W power draw.

This one's clocked higher. Look at the fillrate numbers ;)

The Quadro FX 5800 maintains a 512-bit bus. It's simple: since the 4GB of GDDR3 is clocked at 800MHz, that leaves us with 512/8 x 800 x 2 = 102.4GB/s.

Cookie, what does the "8" bolded above represent? 8 64 bit registers?

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: keysplayr2003
Cookie, what does the "8" bolded above represent? 8 64 bit registers?

That's simple: the 512-bit memory interface. Note the "bit": there are 8 bits per byte.

So a 512-bit memory interface is the same as saying a 64-byte memory interface. To calculate the memory bandwidth, it's memory interface width x memory clock frequency x transfers per clock. (The unit of frequency is Hz, where one Hz equals one cycle per second, or s^-1.) Memory bandwidth is therefore expressed in bytes per second.

(512/8) bytes x 800MHz x 2 (double data rate) = 102.4GB/s

We get ~102GB/s.


edit: correct me if I'm wrong though
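The formula above, spelled out as a quick sketch (using the 512-bit width and 800MHz GDDR3 clock quoted in this thread):

```python
# Memory bandwidth = bus width (bytes) x memory clock x transfers per clock.
# GDDR3 is double data rate, so there are two transfers per clock.
bus_width_bits = 512
mem_clock_mhz = 800
transfers_per_clock = 2   # DDR

bus_width_bytes = bus_width_bits / 8   # 64 bytes moved per transfer
bandwidth_mbs = bus_width_bytes * mem_clock_mhz * transfers_per_clock
print(f"{bandwidth_mbs / 1000:.1f} GB/s")  # 102.4 GB/s
```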
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Cookie Monster
Originally posted by: keysplayr2003
Cookie, what does the "8" bolded above represent? 8 64 bit registers?

That's simple: the 512-bit memory interface. Note the "bit": there are 8 bits per byte.

So a 512-bit memory interface is the same as saying a 64-byte memory interface. To calculate the memory bandwidth, it's memory interface width x memory clock frequency x transfers per clock. (The unit of frequency is Hz, where one Hz equals one cycle per second, or s^-1.) Memory bandwidth is therefore expressed in bytes per second.

(512/8) bytes x 800MHz x 2 (double data rate) = 102.4GB/s

We get ~102GB/s.


edit: correct me if I'm wrong though

I was just wondering what this had to do with the core clocks. If anything.
Arrggh.. I'm just too tired to think right now. Got to hit the pillow.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Oh, I was just pointing out that the refresh of the GT200 probably retains the 512-bit memory interface, rather than shrinking down to 384-bit etc.

Should get some sleep.

No need to monitor the forums 24/7 ;)
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: keysplayr2003
Originally posted by: Zap
Originally posted by: keysplayr2003
It is a bit interesting to see that they can fit 4GB of memory on the card.

Part of it is just using higher density ICs, though that's usually good for just merely doubling RAM.

Yeah. I was thinking the same thing. 2GB would make more sense for higher density RAM.
4GB makes it sound like it has a 1024 bit memory controller, or 2 512's. LOL.

not really, my first thought was just quadrupled memory capacity...they don't even have to use higher density, they can basically just throw more chips onto each channel.



Originally posted by: pcslookout
Originally posted by: SlowSpyder
Somewhat related question regarding video card memory: is it the case that on a 32-bit OS, the more video card memory you have, the less system memory is available? Also, isn't a 4870 X2 technically a more powerful card? Anyway, 4GB is pretty nifty; does it actually make a difference for workstation-type jobs? It's funny how just a year or two ago 512MB was a lot of video memory.

Yep, I believe if you ran a video card with 4GB of video RAM in your system, you'd have to use a 64-bit OS, or the 32-bit OS would have hardly any RAM left to use, lol. As 1GB becomes the standard for video card RAM, most gamers will have to move to a 64-bit OS unless they don't care about losing more and more system RAM.

They'll have to care unless hard drive technology makes an unprecedented breakthrough in performance.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: Arkaign
Nvidia already had a 5800, and look how well that turned out ;)

Yea that's the first thing that came to my mind... :D

 

aldamon

Diamond Member
Aug 2, 2000
3,280
0
76
Originally posted by: Wreckage
Originally posted by: Cookie Monster


So could this new card be using a 55nm version of the GT200? ala GT206??

True, this could be our first look at the "refresh".

NVIDIA said in their recent conference call that everything being manufactured now is 55nm.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Clocked higher, and a full 47 watts lower power consumption. Looking over some numbers, this would seem to place this 55nm GT2xx part below the power usage of a single 4870 by a reasonable amount. This of course makes the talk of upcoming X2 parts seem far more viable, particularly considering this should be closer to the 290 part than the 270 offering, if these numbers are correct.

I'd like to see some neutral tests on those power claims though; it's hard to believe that nV could really be THAT far ahead of ATi in terms of power usage.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: BenSkywalker
Clocked higher, and a full 47 watts lower power consumption. Looking over some numbers, this would seem to place this 55nm GT2xx part below the power usage of a single 4870 by a reasonable amount. This of course makes the talk of upcoming X2 parts seem far more viable, particularly considering this should be closer to the 290 part than the 270 offering, if these numbers are correct.

I'd like to see some neutral tests on those power claims though; it's hard to believe that nV could really be THAT far ahead of ATi in terms of power usage.
The 4870 eats too much power because it doesn't drop memory speeds.

4870 idle current:

Core/Mem (MHz) | Current | VDDC
800/975 | 25.7A | 1.236v
500/900 | 17.6A | 1.083v - ATI's default idle setting
800/400 | 12.2A | 1.236v
800/200 | 9.5A | 1.236v
550/200 | 9.5A | 1.236v
550/200 | 4.1A | 1.083v - what I use for idle

With the speed drops I pointed out, a 4870 should be able to match or beat the GTX 280 on idle power.

I would say most of the power saving on the FX 5800 comes from the slower GDDR3 memory it uses, not from a more power-efficient core.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Actually, you may have realized this, but the shader core (basically the biggest part of the GPU) is the main factor in idle power consumption for the GTX series. The shaders drop down to a mere 100MHz at idle.

I don't think a 47W difference comes from the memory being clocked at 800MHz compared to the GTX 280's 1100MHz. Not to mention it's 4GB vs 1GB; if anything, I'd think power consumption would have gone up, not down.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@Cookie Monster

Here's a review I was looking for to confirm my theory, since I don't have a GTX card to test with.

http://www.bit-tech.net/hardwa...60-amp2-216-edition/10

Our test system with the Zotac GeForce GTX 260 AMP²! Edition consumes the same power as it did with BFG Tech's GeForce GTX 260 OCX Maxcore installed at idle, but is 12W more efficient when the system is under load with Crysis running. Some of this can be attributed to the slightly lower core and shader clocks, but more so to the lower memory clock.
Load power consumption:

337W Zotac GTX 260 AMP²! - 650/1400/1050MHz
349W BFG Tech GTX 260 OCX Maxcore 896MB - 655/1404/1125MHz

12W from a 75MHz drop in memory speed
1100 - 800 = 300MHz drop in memory speed
12W x 4 = 48W

It's more than likely not going to be a full 48W drop, but it should account for a good amount of that 47W difference.
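That extrapolation, as a quick sketch (this assumes power scales linearly with memory clock, which is a rough approximation; the 12W per 75MHz figure comes from the bit-tech GTX 260 comparison quoted above):

```python
# Linear extrapolation of the memory-clock power saving: bit-tech measured
# a 12W load-power gap from a 75MHz memory-clock difference between two
# GTX 260 cards; scale that to the 300MHz gap between the GTX 280 (1100MHz)
# and the Quadro FX 5800 (800MHz).
watts_per_mhz = 12 / 75           # from the two GTX 260 measurements
memory_clock_gap_mhz = 1100 - 800

estimated_saving_w = watts_per_mhz * memory_clock_gap_mhz
print(f"Estimated saving from slower memory: {estimated_saving_w:.0f} W")  # 48 W
```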