EXPreview gets a 512SP GTX 480


v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Or those are wall watts. If the video card is drawing close to 400 watts and the extra load on the PSU has pushed it from being at optimum efficiency to way less than optimum efficiency then yes, an extra 100 watts to the GPU could translate to an extra 200 at the wall.
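For what it's worth, that arithmetic can work out. Here's a quick sketch with invented numbers (the 85%/75% efficiency figures and the 500 W baseline are assumptions for illustration, not measurements from the review):

```python
def wall_watts(dc_load_w, efficiency):
    """Power drawn at the wall for a given DC load and PSU efficiency."""
    return dc_load_w / efficiency

# Hypothetical system: PSU near its sweet spot before, pushed past it after.
before = wall_watts(500, 0.85)  # ~500 W DC load, PSU at 85% efficiency
after = wall_watts(600, 0.75)   # card adds 100 W DC, efficiency sags to 75%
print(round(after - before))    # ~212 extra watts at the wall
```

So an extra 100 W of DC load really can show up as roughly double that at the wall, if the PSU falls off its efficiency curve.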
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The fans used on that Arctic Cooling heatsink have a range of 800-2000RPM, I believe. So even at max speed the fans shouldn't be terribly loud; they don't spin at 5000RPM or whatever the blower on the 480 was rated for.

And yeah, the power consumption makes absolutely no sense. That can't be right.


What I was going to say.

I'm pretty sure the reference Nvidia heatsink would just saturate even at 100% fan speed and everything would melt down.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
You do understand the power figures you quoted are just ATX specs, right, and not the limits of the plugs?

Since quite a few high-output power supplies already have dedicated 12V rails for the PCI-e connectors rated at 30A (360W) or more (not counting the PCI-e slot's power), it's not impossible for a power supply to deliver more than the suggested 150W per 8-pin connector. (Power supplies like the Enermax Galaxy EVO 1250 with 30A PCI-e rails, the Antec Quattro 1200 with 38A PCI-e rails, etc.)
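The arithmetic behind those figures, sketched out (all wattages are the spec and rail numbers quoted above):

```python
# Official ATX/PCI-e budget for a card with one 6-pin and one 8-pin plug
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
spec_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W by the book

# What a dedicated 30 A PCI-e rail on the 12 V line can actually source
rail_w = 30 * 12  # 360 W from the connectors alone, before the slot's 75 W

print(spec_budget, rail_w)  # 300 360
```

Which is why a card can exceed the 300 W "limit" without the plugs melting: the spec numbers are budget ceilings, not the physical capacity of the wiring.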

You learn something every day I guess :).
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
94°C with that cooler!!!! An OC'd (999MHz) HD 5970 only ran at 71°C with the same cooler (http://www.overclock3d.net/reviews/gpu_displays/sapphire_5970_toxic_review/7). That's two overclocked Cypress processors under the one cooler and it still ran 23° cooler. That's painful. Nvidia surely isn't really going to release this?
Well, it's probably easier for two smaller chips to run cooler than one big chip. All that heat is going to be concentrated in one place, after all.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, it's probably easier for two smaller chips to run cooler than one big chip. All that heat is going to be concentrated in one place, after all.

Well, apparently in this case it is. :D I think it might have more to do with power draw than one processor or two though.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well, apparently in this case it is. :D I think it might have more to do with power draw than one processor or two though.
Even if it was the same power draw, having it spread over two smaller chips would make it much easier to cool.
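As a toy model of that point (every thermal-resistance number here is invented purely for illustration), splitting the same total power across two dies can come out ahead even if each small die has a worse die-to-cooler path than the one big die:

```python
ambient = 40.0       # °C inside the case (assumed)
total_power = 300.0  # W of heat, same in both scenarios (assumed)

theta_big = 0.20     # °C/W, one large die to the cooler (made up)
theta_small = 0.30   # °C/W per small die: worse per-die path (made up)

# Die temperature = ambient + power through the die * thermal resistance
one_big_die = ambient + total_power * theta_big             # 100 °C
two_small_dies = ambient + (total_power / 2) * theta_small  # 85 °C each

print(one_big_die, two_small_dies)
```

Each small die only pushes half the heat through its own path, so even with a worse per-die thermal resistance it can sit cooler than one die carrying everything.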
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
94°C with that cooler!!!! An OC'd (999MHz) HD 5970 only ran at 71°C with the same cooler (http://www.overclock3d.net/reviews/gpu_displays/sapphire_5970_toxic_review/7). That's two overclocked Cypress processors under the one cooler and it still ran 23° cooler. That's painful. Nvidia surely isn't really going to release this?

Those two overclocked Cypress chips use less power than the 480. That is probably the biggest reason. I'd say there could easily be a 100W difference.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
That much more power is not unheard of. There would have to be a reason for them not to have released it in the first place, and this is likely a production sample of some sort.

The CUDA core block that was 'bad' need not simply be entirely broken. It is perfectly possible that all of the chips work as 512-core parts... just that the block they disabled is leaky as all hell and does this to the power draw/temperatures....

Looks like a re-spin awaits us for a 512-core part.... Does anyone know where they got this card from, though? Did they manage to BIOS-crack a regular 480, or did they dig up one of the early sample cards from when they were still binning? If the former, that would be an interesting experiment to see if you got a lucky card with no leakage or one that melts down immediately ;).
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
That much more power is not unheard of. There would have to be a reason for them not to have released it in the first place, and this is likely a production sample of some sort.

The CUDA core block that was 'bad' need not simply be entirely broken. It is perfectly possible that all of the chips work as 512-core parts... just that the block they disabled is leaky as all hell and does this to the power draw/temperatures....

Looks like a re-spin awaits us for a 512-core part.... Does anyone know where they got this card from, though? Did they manage to BIOS-crack a regular 480, or did they dig up one of the early sample cards from when they were still binning? If the former, that would be an interesting experiment to see if you got a lucky card with no leakage or one that melts down immediately ;).

Yeah, I can't see them releasing this part if these numbers are accurate. I think Fermi has taught Nvidia some lessons. Even on the high end, while performance and performance for the money matter above all else, heat, power, and noise are still factored in by many customers. At least they are when one part is significantly worse in those areas than the competition, as is the case with the 470 and 480 vs. the 58xx cards.

I think Nvidia has to have taken notice of that. I just don't see them releasing this part with all those negatives after seeing the 470 and 480's lukewarm reception (and rumored lukewarm sales). They'd still have the fastest single GPU but not the fastest card, so it wouldn't change anything as far as that goes. Sure, they could maybe sell them for a few bucks more than the GTX 480, but with that power consumption and heat output I doubt they'd sell enough to make it worthwhile. I just don't think most people, even enthusiasts, are willing to buy a more expensive 350-400 watt part that gives you less than 6% more performance than what is on the market now, and it would likely be priced close to the better-performing and less power-hungry 5970, further hurting it.
 

tincart

Senior member
Apr 15, 2010
630
1
0
A smart marketing idea would be to send out 512SP cards to enthusiast overclockers (like AMD did with the TWKR CPUs). No one complains about power consumption or heat since they're not commercial products, and they get publicity for any extreme overclocking projects the enthusiast crowd does with them. You get some more benchmarks on the net that make nV look good without any reviewers slagging the cards for their failings as consumer products.

I agree that, IF this is a genuine review that gives a reasonable look at what the final 512SP product would be, it is not marketable.
 

Paratus

Lifer
Jun 4, 2004
17,690
15,938
146
I can't wait until they drop this into an OC'd dual-GPU card. Think how fast it would be! Official power draw would probably be around 300W. Plus, with NV's super extra awesome financial deals with TSMC, I bet a dual card would only be around $525!

The 5970 is in deep trouble now!
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I can't wait until they drop this into an OC'd dual-GPU card. Think how fast it would be! Official power draw would probably be around 300W. Plus, with NV's super extra awesome financial deals with TSMC, I bet a dual card would only be around $525!

The 5970 is in deep trouble now!


:D:D
:D
:D

You are kidding, right?
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Why would I be kidding......



;)

Because the only card that would come close would be a sandwiched GTX 470, and even then it would be hard to keep cool and the power consumption would be noticeably higher, unless they created a special GTX 480 SKU with its own power brick and water cooling. And given the performance difference and much higher cost, the HD 5970 would still be far more cost-effective.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Because the only card that would come close would be a sandwiched GTX 470, and even then it would be hard to keep cool and the power consumption would be noticeably higher, unless they created a special GTX 480 SKU with its own power brick and water cooling. And given the performance difference and much higher cost, the HD 5970 would still be far more cost-effective.

I disagree.
A full 384SP dual GF104 (GTX 475) would beat it easily. No need for the hot GTX 470.
 

taserbro

Senior member
Jun 3, 2010
216
0
76
Call me crazy, but that card in the review looks as official as a three-dollar bill. I'd wager it's more along the lines of an early engineering prototype that someone sneaked out of the official chain of custody and installed an aftermarket cooler on.
The chances of this ending up being anything like the final product aren't looking too good, but in the sensationalist-journalism sense, I guess their traffic boost was well worth whatever they paid for it.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Maybe for once nV is trying to keep expectations low, so when it does come out and it actually works, it is a surprise.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I disagree.
A full 384SP dual GF104 (GTX 475) would beat it easily. No need for the hot GTX 470.

I hope that happens, so it can force the competition to drop prices, and it would make a nice SKU, for I'm pretty certain they will have to drop clocks a bit even with the far more power-efficient GF104.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
You do understand the power figures you quoted are just ATX specs, right, and not the limits of the plugs?

Since quite a few high-output power supplies already have dedicated 12V rails for the PCI-e connectors rated at 30A (360W) or more (not counting the PCI-e slot's power), it's not impossible for a power supply to deliver more than the suggested 150W per 8-pin connector. (Power supplies like the Enermax Galaxy EVO 1250 with 30A PCI-e rails, the Antec Quattro 1200 with 38A PCI-e rails, etc.)

Yep, and also something to note is that the PCI-e slot itself can deliver up to 75 watts on top of the connectors.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I hope that happens, so it can force the competition to drop prices, and it would make a nice SKU, for I'm pretty certain they will have to drop clocks a bit even with the far more power-efficient GF104.

Hey, I agree again... that's 4 times. :D:D:D