New dual GTX 460 incoming!? Confirmed by Fudzilla


Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
GF104 launched with A1, an indicator that things went well and yields aren't terrible? You decide. It's very likely they're harvesting fully functional units IMO.

A dual GPU card? Why not. Availability may suck, but look at the 5970's supply history. Will a dual GF104 win any crowns? No one knows what a fully functional GF104 x2 clocked to 800+ can do.

Regarding 300w limits, who cares? Asus doesn't. If they can't meet it, they'll probably come damn close IMO.

We have a very good candidate for dual gpu goodness.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Originally Posted by Hauk
Regarding 300w limits, who cares? Asus doesn't. If they can't meet it, they'll probably come damn close IMO.

A massively overclocked gtx460 consumes considerably less power than a gtx275 and runs considerably cooler. Yet Nvidia managed to get TWO gtx275's together in one card and keep the power draw under 300w and temperatures under control.

How do people keep overlooking this?

A GF104 with all of its shaders unlocked and clocked higher than a current reference gtx460 will still run cooler and draw less power than a gtx275. A dual, fully unlocked GF104 shouldn't be too much of a problem to make, and it'll be faster than an hd5970.
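
Rough numbers with the commonly published board powers (a sketch; the extra draw assumed for a big overclock is an illustration, not a measurement):

[CODE]
# Rated board powers in watts, as commonly published.
gtx275_tdp = 219
gtx460_tdp = 160          # 1 GB reference card
oc_overhead = 25          # assumed extra draw from a big overclock

print(gtx460_tdp + oc_overhead, "<", gtx275_tdp)   # 185 < 219
# Nvidia already fit two GTX275-class chips under 300W (GTX295),
# so two chips that each draw less than a GTX275 leave headroom.
[/CODE]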
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally Posted by tviceman
A massively overclocked gtx460 consumes considerably less power than a gtx275 and runs considerably cooler. Yet Nvidia managed to get TWO gtx275's together in one card and keep the power draw under 300w and temperatures under control.

How do people keep overlooking this?

A GF104 with all of its shaders unlocked and clocked higher than a current reference gtx460 will still run cooler and draw less power than a gtx275. A dual, fully unlocked GF104 shouldn't be too much of a problem to make, and it'll be faster than an hd5970.

No they didn't. They managed to put two 'GTX270's together, not a full GTX275.
GTX275 core at GTX260 clocks.

That's not to say a dual GPU GTX460 is difficult - it shouldn't be that hard - but it's unlikely the dies will be much faster than stock GTX460 dies (be it through functional units or clocks).
The HD5970 is an HD5870 core at HD5850 clocks, so a dual HD5850 should work at around the power limit, and the GTX460 uses about the same amount of power.
Either they will be carefully selected dies for low power, with some performance enhancements, or they could just slap two vanilla GPUs on there (probably still at the lower end of the voltage scale) and not have any problems. Expecting more than just 2x GTX460 is being optimistic though.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Faster than a 5970 is pretty optimistic. I imagine it might be faster in some things (Heaven Benchmark) because it's a newer architecture, but for the overall crown I doubt it. I'd be interested in seeing if Quad SLI is now a reasonable solution with such a card.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Originally Posted by Lonyo
No they didn't. They managed to put two 'GTX270's together, not a full GTX275.
GTX275 core at GTX260 clocks.

That's not to say a dual GPU GTX460 is difficult - it shouldn't be that hard - but it's unlikely the dies will be much faster than stock GTX460 dies (be it through functional units or clocks).
The HD5970 is an HD5870 core at HD5850 clocks, so a dual HD5850 should work at around the power limit, and the GTX460 uses about the same amount of power.
Either they will be carefully selected dies for low power, with some performance enhancements, or they could just slap two vanilla GPUs on there (probably still at the lower end of the voltage scale) and not have any problems. Expecting more than just 2x GTX460 is being optimistic though.

I had to redo my homework on what you said. You're right. And it looks like current power draw for a gtx460 1 gig under load is ever so slightly less than a gtx260 - with big overclocks increasing power draw ever so slightly over a gtx260.

Hmmmmm... perhaps they'll end up doing a hybrid x2 card like the gtx295 was. Fully unlocked GF104 cores at gtx460 speeds. Time will tell. If they can't match or beat the hd5970 most of the time, they may not come to market with one at all.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
Why does it necessarily have to beat a 5970? Can't Nvidia release a competitive price/performance product to challenge it? The 5970 is $600 and two 460's are $400. If the 460 x2 is $450-$500 it will beat the 5970 at price/performance.
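
Rough math on that, with performance normalized to 1.0 for all three setups - treating them as roughly equal is an assumption for illustration, not a benchmark result:

[CODE]
# Price/performance, assuming equal performance (1.0) across setups.
cards = {
    "HD5970":          600,
    "2x GTX460 (SLI)": 400,
    "GTX460 X2":       475,   # midpoint of the $450-500 guess
}
for name, price in cards.items():
    print(f"{name:16s} {1.0 / price * 100:.3f} perf per $100")
# The X2 lands between the SLI pair and the 5970 on price/performance.
[/CODE]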
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally Posted by SHAQ
Why does it necessarily have to beat a 5970? Can't Nvidia release a competitive price/performance product to challenge it? The 5970 is $600 and two 460's are $400. If the 460 x2 is $450-$500 it will beat the 5970 at price/performance.

Yes but if it can do both, all the better.
 

shangshang

Senior member
May 17, 2008
830
0
0
Originally Posted by SHAQ
Why does it necessarily have to beat a 5970? Can't Nvidia release a competitive price/performance product to challenge it? The 5970 is $600 and two 460's are $400. If the 460 x2 is $450-$500 it will beat the 5970 at price/performance.

I think you're on to something here. A dual gtx460 for $500 would pretty much obliterate the sales of the 5970. And even at equal price, a good argument can be made for the dual gtx460 since SLI scales better than Xfire. And I think NV drivers are better quality-controlled. Based on my reading of various complaints, I feel that ATI drivers still have small bugs creeping up in them.
 

Meghan54

Lifer
Oct 18, 2009
11,684
5,228
136
Originally Posted by shangshang
And I think NV drivers are better quality-controlled. Based on my reading of various complaints, I feel that ATI drivers still have small bugs creeping up in them.


You mean like burning customers' cards up with a driver upgrade?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally Posted by MrK6
Faster than a 5970 is pretty optimistic. I imagine it might be faster in some things (Heaven Benchmark) because it's a newer architecture, but for the overall crown I doubt it. I'd be interested in seeing if Quad SLI is now a reasonable solution with such a card.

actually, it's only a "newer" architecture in the sense that it was late as shit coming out in the first place.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally Posted by Wreckage
Originally Posted by SHAQ
Why does it necessarily have to beat a 5970? Can't Nvidia release a competitive price/performance product to challenge it? The 5970 is $600 and two 460's are $400. If the 460 x2 is $450-$500 it will beat the 5970 at price/performance.

Yes but if it can do both, all the better.

be careful wreckage, this isn't the internal message service at hq you know.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Originally Posted by MrK6
Faster than a 5970 is pretty optimistic. I imagine it might be faster in some things (Heaven Benchmark) because it's a newer architecture, but for the overall crown I doubt it. I'd be interested in seeing if Quad SLI is now a reasonable solution with such a card.

SLI scales well in dual card configs. Beyond two cards, scaling takes a tumble for the worse, whereas tri-CrossfireX performance is better than tri-SLI.

[Images: 5870.jpg and gtx480.jpg - CrossFire vs. SLI scaling charts]

Source
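
Quick way to read those charts: scaling efficiency is multi-GPU fps over n times single-GPU fps. The fps values in this sketch are made-up placeholders, not numbers from the source:

[CODE]
# Scaling efficiency = multi-GPU fps / (n * single-GPU fps).
def scaling_efficiency(single_fps: float, multi_fps: float, n: int) -> float:
    return multi_fps / (n * single_fps)

# Placeholder fps figures; plug in the chart's numbers to compare.
print(f"{scaling_efficiency(60, 110, 2):.0%}")  # 2-way: 92%
print(f"{scaling_efficiency(60, 140, 3):.0%}")  # 3-way: 78% - the "tumble"
[/CODE]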
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally Posted by tviceman
I had to redo my homework on what you said. You're right. And it looks like current power draw for a gtx460 1 gig under load is ever so slightly less than a gtx260 - with big overclocks increasing power draw ever so slightly over a gtx260.

Hmmmmm... perhaps they'll end up doing a hybrid x2 card like the gtx295 was. Fully unlocked GF104 cores at gtx460 speeds. Time will tell. If they can't match or beat the hd5970 most of the time, they may not come to market with one at all.

Well, how much power did a single GTX275 draw at full load?
And, how much power did a GTX295 draw at full load?

I'm going to wager that it wasn't double a GTX275.

A single GTX460 pulls 160W (at 10W over the threshold, it just barely missed being able to drop the second 6-pin PCI-e connector).

So, who here thinks the Dual version of GTX460 will pull 320W?

Show of hands?

Refer to GTX275 vs 295 power consumption.
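
To put that in numbers, using the published TDPs rather than measurements (and the extrapolation at the end is just that, an extrapolation):

[CODE]
# Rated board powers in watts, as published:
gtx275_tdp = 219
gtx295_tdp = 289   # dual GPU at lower clocks, binned chips
gtx460_tdp = 160

factor = gtx295_tdp / (2 * gtx275_tdp)
print(f"GTX295 = {factor:.0%} of two GTX275s")   # ~66%

# Same factor applied to a dual GTX460 - an assumption, not a spec:
print(f"~{2 * gtx460_tdp * factor:.0f} W")       # ~211 W
[/CODE]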
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Originally Posted by Keysplayr
Well, how much power did a single GTX275 draw at full load?
And, how much power did a GTX295 draw at full load?

I'm going to wager that it wasn't double a GTX275.

A single GTX460 pulls 160W (at 10W over the threshold, it just barely missed being able to drop the second 6-pin PCI-e connector).

So, who here thinks the Dual version of GTX460 will pull 320W?

Show of hands?

Refer to GTX275 vs 295 power consumption.

Well that's what I've been saying constantly - that a dual graphics card will have lower power consumption than the two separate cards that would otherwise make up a dual card configuration.

The problem with a gtx275 vs. gtx295 comparison is that the gtx295 is running at slower clocks, equal to a gtx260. Even still, the power consumption of a gtx275 at gtx260 clock speeds is higher than a straight up gtx260. So a dual, 384-shader GF104 probably still can - and will - happen.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally Posted by Keysplayr
Well, how much power did a single GTX275 draw at full load?
And, how much power did a GTX295 draw at full load?

I'm going to wager that it wasn't double a GTX275.

A single GTX460 pulls 160W (at 10W over the threshold, it just barely missed being able to drop the second 6-pin PCI-e connector).

So, who here thinks the Dual version of GTX460 will pull 320W?

Show of hands?

Refer to GTX275 vs 295 power consumption.

No one is saying a dual GTX460 will exceed the spec, the argument is about what they will do with the GPU.
A dual GTX460 core would be fine, no one has questioned that. People are discussing the possibilities of higher clocks or more functional units (i.e. full 384 SP cores) being used on a dual chip card.
When you add those in, it stops being a GTX460 and becomes a full GF104, and the power draw may increase.
GTX460 x2 is no problem, but whether they could/will/might do more than vanilla GTX460 cores is the debate.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Ok, here's a good question: will a 384 sp dual gpu gtx 460 be able to clock high enough to beat the 5970? If so, when will it come out? By the time they have enough "good" gtx 460's available to even make the card, won't we be on to the refresh/next gen? This is just another problem nvidia has with being so late to the game this time: by the time they could truly be competitive in thermals as well as performance, the bar will probably be set higher.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Originally Posted by bryanW1995
This is just another problem nvidia has with being so late to the game this time: by the time they could truly be competitive in thermals as well as performance, the bar will probably be set higher.

I don't see AMD coming out with anything that will outperform the 5970. The Southern Islands high end part will no doubt be faster than an hd5870, but since they're staying at 40nm, it'll also be a bigger chip and it'll consume more power. In other words, they more than likely won't be able to make another dual GPU part until they move to 28nm.

GF100 is likely being put through the grinder much like the NV30 chip was, and it's going to come out much improved like NV35 did. The refresh parts from both camps should be really, really competitive out the gate.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I'm betting that, due to the low volume x2 cards are produced at, the chips will be cherry-picked, allowing for possibly better clocks and/or thermals.

ATI apparently does this with HD5970 (the Cypress cores are special low leakage bins).

But I am a little confused on how Nvidia will be able to do this with GF104 under the constraints of the PCI-E 2.0 300 watt limit. Isn't Cypress more power efficient?

That being said, I am sure ASUS will be able to make one seriously good ARES dual GPU product out of GF104. Hopefully the overclocking will be better than what the ARES 5970 accomplished.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally Posted by cbn
ATI apparently does this with HD5970 (the Cypress cores are special low leakage bins).

But I am a little confused on how Nvidia will be able to do this with GF104 under the constraints of the PCI-E 2.0 300 watt limit. Isn't Cypress more power efficient?

That being said, I am sure ASUS will be able to make one seriously good ARES dual GPU product out of GF104. Hopefully the overclocking will be better than what the ARES 5970 accomplished.

Cypress is more power efficient, yes, but that doesn't matter when it uses more power (and gives more performance) per single chip.
An HD5850 and GTX460 use pretty much the same power (~150w), the chip on the HD5970 is a binned fully fledged core (HD5870 core) which is also underclocked.
If they used a regular HD5850 core it would probably require less power.
Since the GTX460 uses the same amount of power as the HD5850, and 20~30w less than an HD5870, sticking two normal GTX460 cores on one board isn't a problem from a power perspective, and might not even need binning.
The performance would be lower per GPU, but there's also the scaling issue on a dual chip card, which allows NV to make up some of the performance difference.

No one is saying a dual GF104 would necessarily be faster than an HD5970, but it will fit within the same power envelope using currently binned chips. If they use low power binned chips with some speed increases they should still be able to slip under the power limit and also increase per GPU performance, which combined with the scaling superiority in most situations would result in two very close cards.

Even though per-GPU ATI has the edge, do some binning and add scaling differences, and the efficiency calculation changes a bit.
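
The envelope math, using the approximate draws cited above (the 10% bin saving at the end is an assumed figure, purely for illustration):

[CODE]
# Approximate load draws cited in the post, in watts:
hd5850_w = 150
gtx460_w = 150
pcie_limit_w = 300

print(2 * gtx460_w)               # 300 - two stock cores land at the limit
# Faster chips (more SPs, higher clocks) need binned low-leakage dies
# to claw back headroom; assume a 10% saving from binning:
print(round(2 * gtx460_w * 0.9))  # 270 - ~30 W of headroom for clocks/SPs
[/CODE]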
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
It does NOT say the dual chip won't be 384 SPs.
We were speculating about an article, and it was pleasant.........WAS

Do you just like to hear yourself argue, or just like throwing wrenches in any positive Nvidia thread there is?

You're arguing with a poster who's only been a member since April?....LOL...let him be happy!....haha
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
The only thing I'm gonna touch is a GF104, after reading all these reviews - ATI reviews, nVidia reviews, everything about the GPU and video card market.

ATI stuck a broom handle in nVidia's chief executive with the 5970 uber dual card. nVidia has yet to answer; we have to wait almost a year until they can answer ATI and grab the performance crown from the 5970. Oh, and the Ares dreams about being as fast as the 5970 - this coming from an nVidia fanboy.

I'm simply gonna get a dual GPU GF104 when the time is right. If I wanted to play Crysis, Crysis 2, and BF2 - which I don't - I'd have a use for this card, but MW2, UT3, etc. run great for me. No need to waste money on something I don't need. Thanks
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Originally Posted by busydude
SLI scales well in dual card configs. Beyond two cards, scaling takes a tumble for the worse, whereas tri-CrossfireX performance is better than tri-SLI.

Old driver used there (197.41); scaling got much better with the next two releases.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Nvidia has had the Radeon 5970 to aim at for ages. If they release a dual card it will obviously beat it, unless ATI can surprise them with some massively performance-enhancing drivers (unlikely this late on).

I would think it's highly likely that they worked out exactly what they needed to come out on top, and I bet that was a dual 384 shader card. Obviously there's a chance they won't be able to reach the target clocks & power usage, in which case I expect them to just not bother releasing a dual card.