
My current system already consumes 520W at 100% load.
My ol' Enermax Noisetaker 600W can't take no more 🙁

I'm gonna go for a 1kW PSU next time...hopefully a quiet one.
 
Originally posted by: Gamingphreek
Originally posted by: Corporate Thug
Originally posted by: Gamingphreek
Link

My goodness!! This chip is going to be insane!!

As for the power and heat requirements, I'm taking the 600W recommendation with a grain of salt. Nvidia always beefs up the recommendation for those of us who use Deer and Powmax and the like. A quality 420 or 450 should have no problem running a G80-based system (once again, I'm just guessing though).

-Kevin

(Man, it's been a while since I've visited the Video Forum!!)


I doubt a 420 or a 450 will support this card.

That's absurd. A current 7950GX2 system (the closest I could find real quick for power consumption) consumes only 248W at peak load, and that's the entire system, not just the card. Are you saying that just the graphics card will consume as much as an entire 2x SLI system? Somehow I doubt that.

400-450W should be absolutely fine based on current numbers.

-Kevin


I was referring to the complete system, not just graphics.

Sorry if I wasn't clear.
 
Originally posted by: gersson
My current system already consumes 520W at 100% load.
My ol' Enermax Noisetaker 600W can't take no more 🙁

Shens, unless you've got a pair of Kentsfield rigs cranking. 😛

 
Originally posted by: gersson
My current system already consumes 520W at 100% load.
My ol' Enermax Noisetaker 600W can't take no more 🙁

I'm gonna go for a 1kW PSU next time...hopefully a quiet one.

Ya know, somehow I SERIOUSLY SERIOUSLY doubt you use 520W. Your system is incredibly fast, but nowhere near 520W. Seeing as a comparable 7950GX2 system only consumes 248W, I seriously doubt that you use 2x the power of that system.

I was referring to the complete system, not just graphics.

Sorry if I wasn't clear.

Well, while that is possible, I still seriously doubt that just another graphics card will add another 250W.

-Kevin
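To put rough numbers on that doubt, here is a back-of-the-envelope check (my own illustration: the 248 W figure is the 7950GX2 system draw quoted above, while the per-card deltas are hypothetical guesses, not measurements):

```python
# Back-of-the-envelope PSU headroom check. 248 W is the quoted peak draw
# of a complete 7950GX2 system; the card deltas below are hypothetical
# guesses for how much extra a G80 might draw, not measured numbers.
system_peak_w = 248
psu_rating_w = 450

for card_delta_w in (100, 150, 250):
    projected_w = system_peak_w + card_delta_w
    headroom_w = psu_rating_w - projected_w
    print(card_delta_w, projected_w, headroom_w)
```

Only the extreme +250 W guess would exceed a 450 W unit, and that is exactly the scenario being doubted here.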
 
Interesting; looks like a nice card on paper. Hope those early adopters solve all the driver problems for us cheapies. The other interesting thing is that these leaks are written like they come directly from Nvidia PR. That suggests they're taking a kicking on the high end these days, as the last couple of releases were very secretive and much downplayed. :beer:
 
Sweet jesus...this card is going to be 20x as fast as what we have now.

Anyway, don't we already have 16xAA (if you enable it in RivaTuner)? (Not with FP16 HDR though.) This is FP128 HDR?
 
Anyway, don't we already have 16xAA
Yes, all single NV40 and G7x cards already support 16xAA. If this is nVidia's "new" AA then it's not really new at all unless they've changed the sample pattern in the G80.
 
Originally posted by: BFG10K
Anyway, don't we already have 16xAA
Yes, all single NV40 and G7x cards already support 16xAA. If this is nVidia's "new" AA then it's not really new at all unless they've changed the sample pattern in the G80.

What AA would they use for the new one? And which do they use now?

(Not with FP16 HDR though.) This is FP128 HDR?

We have 64-bit HDR; the G80 supports 128-bit HDR with 32-bit FP precision per channel, from what I understand. Now that I look, I may have mistyped that in the title.

-Kevin
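For what it's worth, the bit-depth arithmetic behind those figures is simple (an illustrative sketch of the standard channel math, not something from the article):

```python
# HDR framebuffer bit depths: total bits = color channels x bits per channel.
channels = 4  # RGBA

fp16_total = channels * 16  # current "FP16 HDR" format
fp32_total = channels * 32  # the rumored G80 format: 32-bit FP per channel

print(fp16_total)  # 64
print(fp32_total)  # 128
```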
 
I can see it now...
The Elder Scrolls V: Return to Morrowind

*WARNING: Bethesda Softworks is not responsible for any computer explosions this game may cause.
128-bit HDR?! This thing is gonna be a house.
 
What AA would they use for the new one?
Two possible scenarios I can think of, if indeed they are changing things:

-Increase the multi-sampling component from 4x to 8x.
-Change the existing super-sampling component to use rotated grid.

And which do they use now?
Now they use 4xRGMS + 2x2 OGSS.
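As a quick illustrative check (my arithmetic, under the usual assumption that hybrid-mode sample counts multiply), the 16x figure for that current mode works out as:

```python
# Hybrid 16x mode on current hardware: the multisampling and
# supersampling components multiply to give the effective sample count.
msaa_samples = 4       # 4x rotated-grid multisampling (4xRGMS)
ssaa_samples = 2 * 2   # 2x2 ordered-grid supersampling (OGSS)

print(msaa_samples * ssaa_samples)  # 16
```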
 
16xAA

How usable is this going to be with this card?
I mean playable framerates at 1600-and-up resolutions?

We'll see, I guess.
 
I mean, all these great features of the card with HDR and 16xAA...

They mean nothing if I can't run them at 1920x1080.

I'll wait for the benchies to see what we'll be getting, I guess.
 
Originally posted by: Moya
I'm kind of relieved; I thought I was going to have to buy a new PSU. Looks like I'll be fine with 550 watts. I don't see myself doing SLI anytime soon anyway.

are you kidding me? lol


haha, the G80 is going to cost upwards of $600 and you're relieved that you won't have to spend money on the PSU.

OH LORD THE IRONY!
 
In that other thread (the X1950XTX vs 7950GX2 IQ comparison), look closely at the 8x and 16x AA shots. I think even the reviewers said there wasn't any difference, but there was an incredibly significant performance drop. I might be wrong, though. I wonder if 16xAA will be better utilized on the G80. Better drivers, maybe? Ooh, that reminds me, I can't wait to see the new drivers nV is gonna drop with these cards. Hopefully they'll be exceptional. But probably not, if I know anything about nV and new releases...
 
Originally posted by: m21s
16xAA

How usable is this going to be with this card?
I mean playable framerates at 1600-and-up resolutions?

We'll see, I guess.

Probably will depend on the engine.

F.E.A.R. ... probably not 😛
 
Originally posted by: santz
Originally posted by: Moya
I'm kind of relieved; I thought I was going to have to buy a new PSU. Looks like I'll be fine with 550 watts. I don't see myself doing SLI anytime soon anyway.

are you kidding me? lol


haha, the G80 is going to cost upwards of $600 and you're relieved that you won't have to spend money on the PSU.

OH LORD THE IRONY!

It's highly likely that the 8800GTX/GTS will come with an external PSU as part of the whole package.

As for the whole 16xAA question, performance depends on how efficient the AA algorithm is. There are more variables too, such as the memory controller and what kind of AA this 16xAA is actually performing.

What I'm thinking is that the whole AA aspect of the G80 is much different from the G7x AA engine.



 
I have a strange feeling no one will be using 16xAA unless they're running SLI with these bad boys.

I hope I'm wrong though 🙂
 
I would rather Nvidia improve SSAA performance to the point of general usability than add another pointless 8x MSAA mode that does nothing other than enable "ATI has 6x...we got 8x!" bragging.

On Radeons, 4x MSAA is indistinguishable from the 6x mode unless you zoom in on the image and scrutinize every single slanted edge. Going from 4x to 8x in NV's case is most probably going to show the same (lack of) difference. But then again, there will be people who claim to have microscopic eyes...


 
Originally posted by: StrangerGuy
I would rather Nvidia improve SSAA performance to the point of general usability than add another pointless 8x MSAA mode that does nothing other than enable "ATI has 6x...we got 8x!" bragging.

On Radeons, 4x MSAA is indistinguishable from the 6x mode unless you zoom in on the image and scrutinize every single slanted edge. Going from 4x to 8x in NV's case is most probably going to show the same (lack of) difference. But then again, there will be people who claim to have microscopic eyes...


There is a noticeable difference between 8xS and 4x.
 