Fermi's (gtx 380) NDA to lift tonight? Finally some real benchies.


SHAQ

Senior member
Aug 5, 2002
738
0
76
Errr, because it does not need it? :rolleyes:

And it's for regular Fermi, not overclocked. :twisted:

I'm not following you, I'm afraid... you claimed it's not important, now you're claiming it's the same as if they would use

Errr, no, most likely it isn't - think about it: 3x1.4B trannies on 55nm vs 2x3.2B trannies on 40nm but higher clocks and reportedly higher voltage

Doesn't matter OC'd or not. I think people need a special case for Xfire'd 5970's.

183W x 3 = 549W, and 549 / 2 = ~275W TDP for the 380. We will see. Care to wager that 2 380's will consume more power than 3 280's? MUHAHAHA
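If anyone wants to sanity-check that back-of-the-envelope math, here's a quick Python sketch (the 183W per-card figure is just the number quoted in the post above; treat all of it as speculation, not spec):

# Back-of-the-envelope for the wager above.
# 183 W is the per-card figure quoted in the post (rumor, not a confirmed spec).
per_card_tdp_w = 183
three_card_total = 3 * per_card_tdp_w        # 549 W for three cards
break_even_380_tdp = three_card_total / 2    # 274.5 W, i.e. ~275 W per GTX 380

print(three_card_total, "W total for three cards")
print(break_even_380_tdp, "W break-even TDP per GTX 380")

In other words, two 380s only win that wager on power if each one stays under roughly 275W.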
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
That's my understanding of it. They design the cards to cool themselves sufficiently. They can't really design these cards and expect the user's case to just happen to have enough airflow to cool them.
Of course not, very true. My guess is the "certification" is mostly a marketing gimmick. However, a case with well-designed airflow runs the cards cooler and quieter, especially if you go for multi-GPU configurations. My GTX295 would crank its fan to 100% in games, even in the winter, and the temps only came down after I removed the shroud around it and opened up the back of my case. However, this shouldn't be a necessity, see below:
That says to me that I could probably use these things as a mini-stove burner when I'm too lazy to walk upstairs to make myself some tea. For me, that's a problem. For others it won't be, of course, but personally I don't care if it's 200% faster than my 5850 if it's outside my budget and going to put out more heat than the space heater I have in my basement.
Which is one of the main reasons I bought a 5870 the day of its release. The rumors of power consumption/heat are one of the main reasons I'm not that excited about Fermi.
Nice new article up on BSON discussing GF100's A3 yields and cost per chip based on different yield results.

http://www.brightsideofnews.com/news/2010/1/21/nvidia-gf100-fermi-silicon-cost-analysis.aspx

The quick summary: @ 40% yields, which is likely where GF100 A3 will be, the cost per chip will be $131. @ 60% yields, which is a best-case scenario as 40nm matures and improves, cost per chip will be $87. Expected initial prices for the GTX 380 are $499-549 and the GTX 360 will be $349-399.
I would imagine that if prices end up being at the lower end of BSON's estimates, it will be with a mail-in rebate to get there. We shall see soon! Can't wait for the official reviews!
Interesting article; I'm not sure of its accuracy, but it's good to consider the numbers. They're projecting $131 for the chips, but it states the A2 silicon was $208 a chip. Are retail Fermi parts only going to be A3? Even then, I'm very doubtful of the $350-$400 price point - wishful thinking at best. NVIDIA doesn't have a supply of these cards built up; they're going to milk supply and demand. Also, the article doesn't even consider how much more expensive the Fermi boards will be to manufacture compared to Cypress boards, yet insists they'll be priced similarly or even lower? We'll see, I suppose.
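To show where figures like that come from, here's a minimal sketch of the cost-per-good-die math. The wafer price (~$5,000 per 40nm wafer) and candidate-die count (~94 GF100-sized dies per 300mm wafer) are my own assumptions for illustration, not numbers taken from the article:

# Rough cost-per-good-die model (same style of math as the article).
# ASSUMED inputs, not the article's actual figures:
WAFER_COST_USD = 5000          # assumed price of a TSMC 40nm wafer
CANDIDATE_DIES_PER_WAFER = 94  # assumed GF100-sized (~530 mm^2) dies per 300mm wafer

def cost_per_good_die(yield_rate):
    good_dies = CANDIDATE_DIES_PER_WAFER * yield_rate
    return WAFER_COST_USD / good_dies

for y in (0.40, 0.60):
    print(f"{y:.0%} yield -> ${cost_per_good_die(y):.0f} per good die")

With those assumptions it comes out to roughly $133 at 40% yield and $89 at 60%, the same ballpark as the quoted $131 and $87, so their estimates imply similar wafer-cost and die-count inputs. None of this covers the board, memory, or cooler, which is exactly the point raised above.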

IIRC, this is exactly why NV went with the slanted fan approach a while back. It allows them to have a larger gap in between the fan and an adjacent card on motherboards that have the PCIe slots only one slot apart from each other.
I'm not sure why ATI hasn't done something similar, or why so many motherboards place the PCIe 16x slots so close to each other. Running a setup like this would worry me a bit because that top card is not getting a lot of cool air:

...then again, I pick my motherboards with dual gpu in mind so I make sure that the PCIe slots are two spaces apart instead of one. I like to always have at least one empty slot in between the cards.
Check the picture again; see the vents in the front? The cards draw in air from there too. Air will follow the path of least resistance; if the front gets somewhat occluded by the card below, it'll draw in air from the front. Personally, I like this design much better. Slanting the fan disturbs airflow, creating turbulence and noise; keeping the flow straight should provide not only better flow and lower fan speeds, but a quieter card.

Sure, Fermi GTX 380s are going to be toasty cards. I think that's obvious from the specs we've gotten, but I'm sure they will be fine in a case with decent airflow. If you want to run three of them right next to each other in 3-way SLI, you should probably pay a little extra attention to the thermals around the cards, but that has long been the case when running three high-end cards.
Like I said, the cards will keep themselves cool. It's very difficult to overheat any modern graphics card running its fan at 100% and installed properly. However, dealing with the noise is the problem. If the cards run so hot that they're always noisy, that's a problem in my book. We'll see.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Check the picture again; see the vents in the front? The cards draw in air from there too. Air will follow the path of least resistance; if the front gets somewhat occluded by the card below, it'll draw in air from the front. Personally, I like this design much better. Slanting the fan disturbs airflow, creating turbulence and noise; keeping the flow straight should provide not only better flow and lower fan speeds, but a quieter card.

The vents in the front are just for looks, and maybe to allow some air to reach the caps behind the fan. There is a shroud that surrounds the entire fan on the 5870 and directs the airflow over the heatsink.

[Image: 18.jpg (HD 5870 fan shroud)]


It seems that the dual 5870 cards operate just fine in the pic I posted, but I'm sure the temps on the top card are higher than on the bottom card.

Like I said, the cards will keep themselves cool. It's very difficult to overheat any modern graphics card running its fan at 100% and installed properly. However, dealing with the noise is the problem. If the cards run so hot that they're always noisy, that's a problem in my book. We'll see.

Having owned almost every top-end GeForce and Radeon for the last 4-5 years, I'm not too worried: NVIDIA will put a decent and not-too-noisy cooler on Fermi. I'm pretty sure nothing either company puts out will rival the GeForce FX 5800 Ultra or X1800XT/X1900XTX in terms of noise.