Dual GPU Fermi on the way


exar333

Diamond Member
Feb 7, 2004
8,518
8
91
:rolleyes: Did you even bother to read what I wrote?
It was a fun reply to a theoretical scenario where NV had already released a dual-Fermi exceeding 300W.

I have no idea what you are talking about. :)

Yes, I did read it. Nowhere did it say that NV would release a 300+W part. If you are not up to speed, that is over the current PCI-E specs.

Let me help you with the math:

2x 256bit GDDR5 @ 1200MHz > 2x 256bit GDDR5 @ 1000MHz. Does that help?
 

nOOky

Diamond Member
Aug 17, 2004
3,272
2,355
136
:)

Whatever. Bring on the new cards already!
Alternatively, bring on the games (not console ports) that take advantage of these monsters!

Here's hoping the new nvidia cards play the reverse role this time around and drive the cost of the ATI cards down. Like most mainstream gamers I don't buy the $300-$600 cards; I limit it to around $200. The general trend has been gradually shifting up of late. Anyone know of any projected prices for the new nvidia cards? I suspect any new dual-GPU card will be outrageous...
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,722
418
126
Yes, I did read it. Nowhere did it say that NV would release a 300+W part. If you are not up to speed, that is over the current PCI-E specs.

He said that IF a dual-GPU Fermi card uses more than 300W, thus breaking the PCI-E specs, THEN AMD COULD/WOULD release a 5975 with higher clocks. That was in response to another poster's hypothesis, http://forums.anandtech.com/showpost.php?p=29052295&postcount=29, not to the OP.


Let me help you with the math:

2x 256bit GDDR5 @ 1200MHz > 2x 256bit GDDR5 @ 1000MHz. Does that help?

And what about the core speed?

The Crossfire 5870 setup is faster than a 5970 because it has 2x RV870 cores at 850MHz vs 2x RV870 at 725MHz, plus its memory is clocked at 1.2 GHz vs 1 GHz.

Saying that the 5970 loses to CF 5870 because of bandwidth while disregarding core speed is silly.
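
If anyone wants to check the bandwidth side of it, here's a quick back-of-envelope sketch (assuming the usual 4x effective data rate per GDDR5 memory clock; that convention is my assumption, not something stated in the thread):

# Back-of-envelope GDDR5 memory bandwidth.
def gddr5_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    transfers_per_s = mem_clock_mhz * 4  # GDDR5 data rate is 4x the memory clock
    return bus_width_bits / 8 * transfers_per_s / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gb_s(256, 1200))  # CF 5870: 153.6 GB/s per GPU
print(gddr5_bandwidth_gb_s(256, 1000))  # 5970: 128.0 GB/s per GPU

Per GPU that's 153.6 vs 128 GB/s, a 20% memory edge for the CF 5870s, on top of the ~17% core clock advantage.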
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
Wow... is it just me, or is GPU power consumption completely ridiculous? Pretty soon you'll need a second power supply for the damn things.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Wow... is it just me, or is GPU power consumption completely ridiculous? Pretty soon you'll need a second power supply for the damn things.

I think it's expected in the $600 price range. In the $100 price range power use seems quite acceptable.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Wow... is it just me, or is GPU power consumption completely ridiculous? Pretty soon you'll need a second power supply for the damn things.

...or you could just get a decent one to begin with. If power consumption keeps on going up, so will the PSUs.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Taking note of the considerable sacrifices ATI had to make with the 5970, I have my doubts about NV being able to release a full-fledged "GTX 395". Last round they sort of split the difference between the GTX 260 and GTX 280 for the 295; this go-around they may only be able to do 2x GTX 360 chips (which wouldn't be a bad card at all, but might not be quite as market-dominating as last gen's dual-GPU was at the time).
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
...or you could just get a decent one to begin with. If power consumption keeps on going up, so will the PSUs.

Depending on where you live, you start running into limitations on how many watts you can pull through the wall outlet on common circuits... most US households are wired to top out around 1500W.
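
Quick sanity check on that figure (15A @ 120V is the common US branch circuit, and the 80% derating for continuous loads is standard practice):

# Continuous power budget of a typical US 15A/120V wall circuit.
volts, amps, continuous_derate = 120, 15, 0.8
print(volts * amps * continuous_derate)  # 1440W, i.e. roughly 1500W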
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Hopefully it won't go that far :p. The day my computer overloads my wall outlet, I'm finding another hobby.
 

shangshang

Senior member
May 17, 2008
830
0
0
This thread needs to stop until SINGLE FERMI is released to the public.

(but it looks like I just extended it!)
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
I still think it's ridiculous and kinda outrageous that gamers aren't demanding more performance for less power. What they won't put up with in CPU power use they'll easily put up with for the GPU, which is quite silly.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I still think it's ridiculous and kinda outrageous that gamers aren't demanding more performance for less power. What they won't put up with in CPU power use they'll easily put up with for the GPU, which is quite silly.

Great post. If you think about it, an OC'd i7 at ~3.6GHz uses around 160W, and that's not a whole lot different from some of the last P4 Preshotts that were single-core and had MUCH slower IPC.

Imagine if modern GPUs were the same size as an old GF256 card, but with the speed of a 5870. With a dual-GPU setup, you can easily use 3-5x the power of the CPU just for graphics. It's insane.

Idle power (on the high end) has improved recently, but load is still ridiculous.
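
Rough numbers to put the 3-5x in perspective (the two-card setup is my hypothetical; 294W is the 5970's rated board power):

# Hypothetical high-end rig: OC'd i7 vs. two dual-GPU cards.
cpu_watts = 160
gpu_watts = 2 * 294  # e.g. two HD 5970s at 294W each
print(round(gpu_watts / cpu_watts, 1))  # ~3.7x the CPU, just for graphics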
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
Bwahahahahahaha-hahahahahaha-hahahahaha

nVidia still doesn't have the 360/380 out yet, and they're already talking about a 395, heh?
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
I still think it's ridiculous and kinda outrageous that gamers aren't demanding more performance for less power. What they won't put up with in CPU power use they'll easily put up with for the GPU, which is quite silly.

I tend to think that the type of people who buy 200-300W TDP GPUs would be plenty willing to buy a 200W TDP CPU. But since they are such a small portion of the market, there's no real need to.

Modern CPUs are restricted to roughly <125W TDP by OEMs, not enthusiasts.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
It shouldn't be expected, though. Demand more performance from the manufacturers for the same or less power.

I do think this happens generally. Overall power may go up, but when you look at frames-per-watt measurements, most segments do improve. At the highest end, though, I think AMD and Nvidia will keep raising the ceiling if they can, and have to in order to one-up each other.

Take a card like the 5770: it uses less power than a 9800GT or 4850, but is faster.

http://www.techpowerup.com/reviews/HIS/HD_5770/28.html

But at the highest end a lot of people aren't buying for frames per dollar or frames per watt, so you're not likely to see those cards win in those areas.
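
Just to make the frames-per-watt idea concrete (the fps and wattage numbers below are invented purely for illustration; the real measurements are in the TPU link above):

# Toy frames-per-watt comparison with made-up numbers.
cards = {
    "HD 5770": (60, 108),  # (fps, load watts) -- hypothetical
    "9800 GT": (48, 125),  # hypothetical
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} frames per watt")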
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
Ya, what a riot. I mean, none of the tech firms ever plan for future products, do they? :rolleyes:

If the latest rumors are correct, nVidia still hasn't taken delivery of the final silicon revision, A3 or whatever it is. If so, talking about dual GPU is like talking about furniture color when you haven't got a house yet.

But I understand. nVidia can't compete with 58XX so they have to talk about awesomeness of future unreleased products.
 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
If the latest rumors are correct, nVidia still hasn't taken delivery of the final silicon revision, A3 or whatever it is. If so, talking about dual GPU is like talking about furniture color when you haven't got a house yet.

But I understand. nVidia can't compete with 58XX so they have to talk about awesomeness of future unreleased products.

Actually, you don't understand at all ;) Forget that the particular company is nVidia and that you can't appear to be relevant at this forum unless you bash them. Do a little research on business and the way businesses plan. Most companies that are successful always have a few generations of future products planned out, projected a number of years into the future.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
Actually, you don't understand at all ;) Forget that the particular company is nVidia and that you can't appear to be relevant at this forum unless you bash them. Do a little research on business and the way businesses plan. Most companies that are successful always have a few generations of future products planned out, projected a number of years into the future.

I'm sorry, for a moment there I thought you were talking about 3dfx... :D

Anyway, nVidia is in no danger of going under like 3dfx, but the parallel with Rampage is becoming kind of funny... :) Talking smack with no actual product in hand.