New dual GTX 460 incoming!? Confirmed by Fudzilla


Keysplayr

Elite Member
GTX275 at 633MHz core / 1404MHz shader peaks at 219W at load.
GTX295 at 576MHz core / 1242MHz shader peaks at 289W at load. That's roughly -9% core and -10% shader to fit into the 300W restriction.

In all, power consumption increased about 32% over a single GTX275. They had only 81W of room to play with.

GTX460 1GB at 675MHz core / 1350MHz shader peaks at 160W at load (150W for the 768MB version).

That leaves 140W of breathing room before they hit the 300W restriction.

This leads me to believe that not only can a dual GF104 incorporate its full 384 shaders, it can be clocked higher as well.

Remember, the GTX295's core and shader clocks were reduced by only about 10% to fit the envelope, and they started out at 219W on a 55nm process.

Somebody else who is much more "math worthy" can fill in the blanks, but I see a full-fledged dual GF104 with 2GB of GDDR5 on a 256-bit bus at higher clocks as a walk in the park compared to the much tougher endeavor of putting two 55nm GT200s together.
They have a LOT more breathing room than on the last dual-card endeavor, about 60W more than they had with the GTX295. IMHO, it would be a cinch, especially if they actually used the entire additional 140W, which I don't think they even need to come close to. But yes, this is MY speculation.
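
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the headroom numbers quoted above. The wattages are the load figures from the post and the 300W value is the board-power ceiling it refers to; nothing here is an official specification.

```python
# Minimal sketch of the power-headroom arithmetic in the post above.
# Assumes the 300W board-power ceiling and the load figures quoted there.
PCIE_LIMIT_W = 300

load_watts = {
    "GTX275": 219,      # single GPU, at load
    "GTX295": 289,      # dual GPU, at load
    "GTX460 1GB": 160,  # single GPU, at load
}

# Headroom left under the 300W ceiling for each single card
for name in ("GTX275", "GTX460 1GB"):
    print(f"{name}: {PCIE_LIMIT_W - load_watts[name]}W of headroom")
# -> GTX275: 81W of headroom
# -> GTX460 1GB: 140W of headroom

# How much the dual GTX295 actually grew over a single GTX275
growth = load_watts["GTX295"] / load_watts["GTX275"] - 1
print(f"GTX295 over GTX275: +{growth:.0%}")  # -> about +32%
```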
 

Lonyo

Lifer
Keysplayr said:
GTX460 1GB at 675MHz core / 1350MHz shader peaks at 160W at load (150W for the 768MB version)... That leaves 140W of breathing room before they hit the 300W restriction... I see a full-fledged dual GF104 with 2GB of GDDR5 on a 256-bit bus at higher clocks as a walk in the park compared to the much tougher endeavor of putting two 55nm GT200s together.

The best place to start is Xbitlabs, since they have the best power breakdowns I am aware of.
[Xbitlabs power consumption chart: GeForce GTX275 and GTX295]

So basically the GTX275 and GTX295 are waaay below their TDPs, and the 295 comes in at about 50% more power than its not-quite-equivalent single card.

And now for the HD5870 and HD5970.
[Xbitlabs power consumption chart: Radeon HD5970 review]

Again, both are waaay below TDP, but this time the dual card draws nearly 90% more power than the single HD5870.

(Furmark power: http://www.xbitlabs.com/images/video/radeon-hd5970/5970_fur.png; but ATI limits performance/power consumption under Furmark, IIRC.)

[Power consumption chart: HD5970 vs HD5850 CrossFire and GTX295 vs GTX275 SLI]

Here's the HD5970 using slightly less than HD5850 Crossfire, while the GTX295 uses SIGNIFICANTLY less than GTX275 in SLI.

Obviously what they did for the GTX295 was get some very efficient chips in there, since the power is significantly lower than two cards.

http://www.techspot.com/review/91-asus-geforce-9800-gx2/page10.html
The 9800GX2 uses about the same power as two 8800GTs in SLI.
I could link the 4870X2 using the same power as two CrossFired 4870s as well.

Most of the time, a single card with two regularly clocked GPUs will use about the same power as SLI/CrossFire with the equivalent cards.
When the chips are binned (as in the case of the HD5970), the power consumption will be a little less than SLI/CrossFire.
When the chips are seemingly heavily binned, power consumption increases only a small amount.

Based on the past examples of the 9800GX2, the HD4870X2 and the HD5970, throwing two regular GTX460 dies onto a single card would result in power consumption roughly equal to two GTX460s in SLI, which would be (based on TDP) up to 300W. If they use binned chips they could reduce it.

Using the GTX275 -> GTX295 as an example of how things go isn't right, because pretty much every other dual-GPU card does not exhibit that sort of change.
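
The rule of thumb described here (two unbinned dies land near 2x a single card, binned dies somewhat below) can be expressed as a rough estimator. This is only a sketch: the binning factors and the 160W input are illustrative assumptions drawn from the figures quoted in the thread, not measurements of any real dual GTX460.

```python
# Rough dual-card power estimate from a single card's load figure, following
# the pattern described above: ~2x for regular chips, less when binned.
def dual_card_estimate(single_card_w: float, binning_factor: float = 1.0) -> float:
    """binning_factor = 1.0 models plain SLI-like scaling (2x);
    values below 1.0 model binned, lower-power chips (illustrative only)."""
    return 2 * single_card_w * binning_factor

gtx460_1gb_load = 160  # W, load figure quoted earlier in the thread

print(dual_card_estimate(gtx460_1gb_load))        # 320.0 -> two stock dies, no binning
print(dual_card_estimate(gtx460_1gb_load, 0.90))  # 288.0 -> mild binning
print(dual_card_estimate(gtx460_1gb_load, 0.66))  # ~211  -> aggressive GTX295-style binning
```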
 

tincart

Senior member
Very informative post, Lonyo. Thanks.

Maybe this variable-voltage binning will be a good thing for NV if they are doing a dual-chip card.
 

Keysplayr

Elite Member
I'd say that the GTX275 to GTX295 is the best example, seeing how it's the most recent dual-GPU endeavor from Nvidia. Don't forget the current GF104 is at A1 stepping and its power consumption is already nice and low. I would wager that if, as you speculate, Nvidia used binned chips to achieve lower power consumption on a dual card, they will do it again. If it works, it works. Nothing wrong with that at all. But if you insist on using only the 9800GX2 and all the other ATI X2 cards and discount the GTX295, then we'd have no choice but to agree with your position here. That is not the case, because the GTX295 exists and must not be excluded. Exception to the "rule" or not, the "rule" was violated by the GTX295. If it was done once, it can be done again. And as I said, there's MUCH more room to play with this go-round.
 

tincart

Senior member
One solution to this vexing issue might be to wait for some official announcement of specifications from a real source. Like nVidia. I bet they will know about stuff like that.
 

Lonyo

Lifer
Keysplayr said:
I'd say that the GTX275 to GTX295 is the best example, seeing how it's the most recent dual-GPU endeavor from Nvidia... If it was done once, it can be done again. And as I said, there's MUCH more room to play with this go-round.

Keysplayr said:
This leads me to believe that not only can a dual GF104 incorporate its full 384 shaders, it can be clocked higher as well.

You think they can do full shaders + clock higher + bin for lower power?
I guess it depends on how many they want to make. If they want it to be a reasonable-volume part, rather than super rare, then I don't see full shaders and higher clocks.

Your argument was that, based on the binning of the G200b, NV can manage to increase shaders and clocks.
I showed that a dual card typically uses about the same power as SLI, and then you say it's silly to talk about that because they could bin it like the G200b. The G200b was binned for low power and had reduced clocks. You are arguing for binning with full shaders and increased clocks. It's not impossible, just rather unlikely IMO.
 

Daedalus685

Golden Member
If I were to place money on that announcement, I would put it on some sort of dual-GPU card with one GPU dedicated to PhysX. I'd have to think the true 490 (or whatever) would be a huge announcement made by Nvidia themselves. It is likely some sort of crazy custom OC card (one that might go to 11, but not two... ;) ) or some silly "Buy our MB and get a free GT240!" deal.

Fun to imagine what it could be, though :D. What other products have been announced by the partner first? Some sort of super-overclocked card, perhaps... shrug.
 

Keysplayr

Elite Member
Lonyo, I'm getting annoyed posting from my BlackBerry. I'll address your post after my last one later.
 

Keysplayr

Elite Member
Lonyo said:
You think they can do full shaders + clock higher + bin for lower power?

Of course.
Lonyo said:
I guess it depends on how many they want to make. If they want it to be a reasonable-volume part, rather than super rare, then I don't see full shaders and higher clocks.

Was the GTX295 widely available? For a very long period of time? Yes, and yes.

Lonyo said:
Your argument was that, based on the binning of the G200b, NV can manage to increase shaders and clocks.

No. The binning was your assumption and your assumption alone. I just ran with it.
So no, that was not MY argument.


Lonyo said:
I showed that a dual card typically uses about the same power as SLI, and then you say it's silly to talk about that because they could bin it like the G200b. The G200b was binned for low power and had reduced clocks. You are arguing for binning with full shaders and increased clocks. It's not impossible, just rather unlikely IMO.

Again, the binning was your argument, not mine. I'm not saying it isn't possible that they were binned strictly for dual-card operation, though, because of course it is possible. Just as it's possible for a dual GTX460.

So I'm going to take it that your vote is for ~310W for a dual vanilla GTX460 card?

If Nvidia is able to do the exact same thing with a dual GTX460 card that they did with the GTX295, we're looking at a 30 to 40% increase in power consumption over a stock 336-shader 1GB GTX460. They'd still end up around ~225W max load. That still leaves 75W, give or take a few watts, for enabling shaders and upping clocks. Yes, I absolutely "think" it is doable. No question. Whether it's done that way or not, as tincart profoundly declared, remains to be seen by the folks that make 'em.
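
As a quick sanity check of the figures in that last paragraph, here is a small sketch applying a GTX295-style 30 to 40% increase to the GTX460's quoted 160W load figure. The wattages are the ones used in this thread, not official numbers.

```python
# Apply a GTX295-style 30-40% increase to a stock GTX460 1GB load figure
# and see what is left under the 300W ceiling discussed in the thread.
gtx460_1gb_load = 160  # W, at load

for increase in (0.30, 0.40):
    dual = gtx460_1gb_load * (1 + increase)
    print(f"+{increase:.0%}: ~{dual:.0f}W load, ~{300 - dual:.0f}W left under 300W")
# -> +30%: ~208W load, ~92W left under 300W
# -> +40%: ~224W load, ~76W left under 300W
```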
 

Wag

Diamond Member
What's an oxymoron?
A zit-faced idiot. ;)

I really hope such a card arrives sooner rather than later, as my single 295 is really having a hard time pushing my Dell U2711 at 2560x1440 (native res) in some of the newer games. I would probably pick up two of these if they are in the $500 range.
 

bryanW1995

Lifer
I think it is now technically possible for Nvidia to surpass the 5970, using either a beefier 460 or a 480 with all the crap cut out. But by the time they could put the thing together and come to market with it, they'll have a single-GPU card that's faster anyway.

Think about it: wouldn't they have come out with a 384 SP GTX 460 if they could? When will they have enough "good" parts to release this part (probably with higher clocks, as a GTX 475)? This part would be very strong and command a premium, and they're supposed to then take a bunch of them for testing a dual-GPU card, then take even more specially binned units to sell that card? After all this they still might not be able to clearly eclipse the 5970. I'll admit it's possible, but it's not likely. They're going to focus on winning mindshare back with GF104 and a revamped GF100, then they're going to update Fermi and kick the crap out of AMD in a year or two.
 

Lonyo

Lifer
Keysplayr said:
If Nvidia is able to do the exact same thing with a dual GTX460 card that they did with the GTX295, we're looking at a 30 to 40% increase in power consumption over a stock 336-shader 1GB GTX460. They'd still end up around ~225W max load. That still leaves 75W, give or take a few watts, for enabling shaders and upping clocks.

If they were able to do the exact same thing, we would already have a dual GPU card.
GTX275 TDP = 219W
GTX295 TDP = 289W

GTX470 TDP = 215W
GTX470 X2 (hypothetical) TDP = ~284W
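
For clarity, the ~284W figure is just the GTX275 -> GTX295 TDP ratio applied to the GTX470's TDP; the "GTX470 X2" is Lonyo's hypothetical, not an announced product. A one-line sketch:

```python
# Scale the GTX470 TDP by the same ratio the GTX295 bore to the GTX275.
gtx275_tdp, gtx295_tdp, gtx470_tdp = 219, 289, 215

ratio = gtx295_tdp / gtx275_tdp       # ~1.32
print(f"~{gtx470_tdp * ratio:.0f}W")  # -> ~284W for a hypothetical GTX470 X2
```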
 

happy medium

Lifer
The EVGA announcement was for their 11th anniversary. Nothing about a dual-Fermi card.

Seems funny that with all these Nvidia price drops, no one is writing an article about it? Just the one from Europe.

I mean the GTX 480 dropped $60 and the GTX 470 $90. Maybe the price drop is being made for the upcoming full 384 SP GTX 475? It would fit rather nicely at $350, right in between the $440 480 and the $280 470. Rather nice gap to fill, huh?
 

bryanW1995

Lifer
The price drop SHOULD be bigger news than it has been so far. As to the reasons, I can think of two likely culprits, both of which could be right:

1. GTX 480/470/465 sales haven't been keeping up with supply, probably because it was so late that most people who wanted to upgrade with Win 7 already bought an AMD card.

2. They're making room for GF104 and/or a revamped GF100, which is very good news for consumers.
 

RussianSensation

Elite Member
bryanW1995 said:
1. GTX 480/470/465 sales haven't been keeping up with supply, probably because it was so late that most people who wanted to upgrade with Win 7 already bought an AMD card.

As much as I would love the idea of a full version of GF104 to replace GF100, I think your first point better explains the reason for the price drops.

So far, AMD has mopped the floor with NV in the DX11 generation.

As of today, AMD has sold 16 million DX11 videocards. http://www.xbitlabs.com/news/video/...ation_DirectX_11_Family_of_Chips_in_2010.html

So the 9-month head start, the initially high prices of the GTX470/480, and the sloppy performance and power consumption of the GTX465 virtually guaranteed that ATI has kept outselling Fermi since March 26, 2010. I guess NV overestimated the demand for PhysX, 3D Vision and CUDA, and underestimated the importance of price/performance and performance-per-watt :rolleyes:
 

Lonyo

Lifer
RussianSensation said:
So far, AMD has mopped the floor with NV in the DX11 generation... So the 9-month head start, the initially high prices of the GTX470/480, and the sloppy performance and power consumption of the GTX465 virtually guaranteed that ATI has kept outselling Fermi since March 26, 2010. I guess NV overestimated the demand for PhysX, 3D Vision and CUDA, and underestimated the importance of price/performance and performance-per-watt :rolleyes:

Or ATI were selling DX11 parts for $40 to $800 while NV were selling from $250 to $500.