
G80 Reviews thread

Page 6
Good stuff and 8xMSAA is a pleasant surprise.

I really hope the xS modes haven't been removed though as that's the main reason I'm getting a 8800 GTS.
 
Originally posted by: BFG10K
Good stuff and 8xMSAA is a pleasant surprise.

I really hope the xS modes haven't been removed though as that's the main reason I'm getting a 8800 GTS.
Click
NVIDIA has introduced a new method of anti-aliasing called Coverage Sample Anti-Aliasing. This differs from traditional Multi-Sample AA by taking samples from a coverage area instead of subsamples from a pixel. There are several new AA modes. 2x AA is 2-sample MSAA. 4x AA is 4-sample multisample anti-aliasing. This is simple enough to understand, right? NVIDIA has deviated from their previous practice of mixing Multi-Sample AA + Super-Sample AA.

In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers. 8x AA is really 4x multisample anti-aliasing with 4 additional coverage samples. 8xQ AA is true 8-sample multisample anti-aliasing. 16x AA is 4x AA + 12 coverage samples; this mode provides the best balance of performance and image quality in most applications, as performance is only a little lower than the 4x mode. 16xQ AA is 8x multisample anti-aliasing + 8 coverage samples. This is the highest-quality mode of anti-aliasing available on the GeForce 8 series.
I'm confused though. I didn't think Nvidia offered a 4xS before?
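For anyone keeping score, here's my own summary of the sample counts behind those mode names. This is a sketch tabulated from NVIDIA's public CSAA material, not something pulled out of the drivers:

```python
# My own summary of the GeForce 8 AA modes discussed above -- sample
# counts are from NVIDIA's public CSAA material, tabulated here for
# convenience; this is not extracted from the drivers themselves.
AA_MODES = {
    # mode: (full color/Z samples, total coverage samples per pixel)
    "2x":   (2, 2),    # plain 2-sample MSAA
    "4x":   (4, 4),    # plain 4-sample MSAA
    "8x":   (4, 8),    # CSAA: 4x MSAA + 4 extra coverage samples
    "8xQ":  (8, 8),    # true 8-sample MSAA
    "16x":  (4, 16),   # CSAA: 4x MSAA + 12 extra coverage samples
    "16xQ": (8, 16),   # CSAA: 8x MSAA + 8 extra coverage samples
}

def extra_coverage(mode):
    """Coverage samples beyond the full color samples (the cheap part of CSAA)."""
    color, coverage = AA_MODES[mode]
    return coverage - color
```

The cheap extra coverage samples are why 16x performs close to plain 4x: only four full color/Z samples are actually stored and blended.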
 
In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers.
That doesn't mean anything since they're basically just saying the control panel doesn't present those options.

The fact is 16xS (for example) has never been offered through the control panel but it's been there since the NV40.

As long as the driver hasn't removed the AA code or the registry settings you should be able to still access the xS modes through tweakers just like you can for many other AA modes.

I didn't think Nvidia offered a 4xS before?
It used to be in the control panel back in the NV2x/NV3x days but now you need a tweaker to access it. It's pretty worthless though as there's absolutely no reason not to use 8xS instead.
 
So what new/popular games can justify this much horsepower... Nothing, from what I see.
NWN2? G3? CoH? FEAR:EP is more demanding than the original FEAR. Also, for some reason DM:MM is a pretty intensive game with all settings maxed out.
You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI.
For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSUs to power them and not so much for the X1900 CF rigs?
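A back-of-envelope way to frame that question. Every wattage here, and the 75% loading rule of thumb, is my own illustrative assumption, not a measured figure from any review:

```python
# Rough PSU sizing sketch. All numbers are illustrative assumptions,
# not measurements.
def system_draw(gpu_watts, n_gpus, cpu_watts, rest_watts=80):
    """Estimated total DC load in watts (drives, fans, board lumped into rest_watts)."""
    return gpu_watts * n_gpus + cpu_watts + rest_watts

def psu_ok(load_watts, psu_watts, max_loading=0.75):
    """Rule of thumb: keep sustained load under ~75% of the PSU rating."""
    return load_watts <= psu_watts * max_loading

# e.g. two hypothetical ~130 W cards plus a ~90 W CPU:
load = system_draw(gpu_watts=130, n_gpus=2, cpu_watts=90)  # 430 W estimated
```

On those made-up numbers a 510 W unit is already past the 75% mark while a 700 W unit has headroom, which is one way the monster-PSU recommendations can be justified even when per-card draw looks similar.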
That doesn't mean anything since they're basically just saying the control panel doesn't present those options.
So why did they change the SSAA availability? Does CSAA look better than SSAA?
 
Just read Anandtech's review, and all I can say is :Q

Definitely a big :thumbsup: for nVidia

'Tis will be a merry holiday season. 🙂

(Whoa, apoppin' style is rubbing off on me, heh)
 
Originally posted by: nib95
Anand's review is pretty technical and all, but the actual benchmarks are lame.
Here's hoping they get that retail unit soon so we can get a proper review.

I agree the benchmarks were totally useless for me, as I don't have a monitor that supports those exotic resolutions. The watts-per-frame figure was the most useless; watts, or total cost per hour of gaming and per hour of idle, would have some use to me. It might make me more responsible about turning my power sucker off when not in use. :beer: IMO, huge monitors just add to the power drain and overflowing dumps.

No one can argue that at DX9 this card rocks. As drivers improve, bugs should go away and fps should increase.
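Since the quoted post asks for cost per hour rather than watts per frame, the conversion is trivial. The electricity rate and wattages below are placeholders, not measured figures:

```python
# Convert system draw into running cost, as the post above asks for.
# Rate and wattages are placeholder assumptions.
def cost_per_hour(system_watts, cents_per_kwh=10.0):
    """Electricity cost in cents for one hour at a given draw."""
    return system_watts / 1000.0 * cents_per_kwh

def daily_cost(gaming_watts, idle_watts, gaming_hours, idle_hours,
               cents_per_kwh=10.0):
    """Cents per day for a given gaming/idle split."""
    return (cost_per_hour(gaming_watts, cents_per_kwh) * gaming_hours
            + cost_per_hour(idle_watts, cents_per_kwh) * idle_hours)
```

Even a hypothetical 400 W gaming rig at 10 cents/kWh only costs a few cents per hour, which puts the idle hours, and the monitor, into perspective.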
 
So why did they change the SSAA availability?
Probably because the masses find those settings unwieldy. You have to really understand the modes to use them well.

Does CSAA look better than SSAA?
No. It may be better at edges than 8xS but CSAA is still MSAA which means it can't AA textures. I would still be taking 8xS over any of the CSAA modes.

As for 16xS, I'd say it's still the best AA mode we have available, both for edges and textures.
 
This is crazy... I mean, graphics cards have been an enthusiast hobby of mine for years already... but all those AA modes make me dizzy 🙂

So...8xS does textures and everything? What about the "transparency anti-aliasing" as mentioned in the AT review? How does that come into play?

Looks like I'd be sitting there an hour just to find out which AA mode is the best 🙂 (meaning highest quality with moderate performance impact)




 
And yes... I haven't seen ONE G3 review/test run so far... IMHO probably one of (if not THE) most demanding games right now!
 
So...8xS does textures and everything?
Yes, everything.
What about the "transparency anti-aliasing" as mentioned in the AT review? How does that come into play?
When SuperSampling AA isn't being used and MultiSampling AA is, TrAA performs AA on alpha textures.
Looks like I'd be sitting there an hour just to find out which AA mode is the best 🙂 (meaning highest quality with moderate performance impact)
For this card it looks like 16xAA with TrSAA would be a great combination. That, of course, is until a SuperSampling mode can be tweaked from the registry as BFG10K suggests.
 
Originally posted by: josh6079

You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI.
For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSUs to power them and not so much for the X1900 CF rigs?

Ah I see. But just consider that not all GPU "functionality" is being utilized when benching these DX9 titles. I'm thinking that, when they arrive, full-blown DX10 titles loaded with all the bells and whistles (developer pulls out all the stops) just might make the G80s work a bit harder, or more completely, in turn pulling more power to feed that tremendously large 681-million-transistor core.

Because something just doesn't add up.

1. Transistor count skyrocketed from 298 million transistors on 90nm, to 681 million on 90nm. HUGE increase. This card "should" be drawing more power than it was shown to pull.
I'm just guessing here, but TSMC did not pull off this kind of miracle. I think the core is not being fully utilized (well of course it isn't. No DX10 titles yet) and will pull more power when it actually has to do what it was designed for.

MHO

EDIT: Oh yeah, my point! Spend the extra few bucks on a PSU that will exceed the manufacturer's specs the first time around. Better to do that up front now than to find out it just can't cut it later.
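The "something doesn't add up" argument can be made concrete with a naive first-order scaling estimate. The transistor counts are the public figures mentioned above; everything else is an assumption, since clock gating, separate clock domains, and leakage all break this simple model:

```python
# Naive first-order estimate: at the same process and voltage, dynamic
# power scales roughly with switching transistors x clock. Real chips
# gate idle blocks, so treat this as an upper-bound style sketch only.
def naive_power_ratio(t_old, t_new, f_old=1.0, f_new=1.0):
    """Expected power ratio if every transistor were equally busy."""
    return (t_new / t_old) * (f_new / f_old)

# G71 -> G80 transistor jump alone (298M -> 681M, both on 90 nm):
ratio = naive_power_ratio(298e6, 681e6)  # ~2.3x
```

That ~2.3x naive ratio versus the much smaller measured increase is exactly the gap being pointed at here: either large parts of the chip sit idle under DX9, or the design is far better power-gated than the last generation.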
 
Originally posted by: keysplayr2003
Originally posted by: josh6079

You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI.
For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSUs to power them and not so much for the X1900 CF rigs?

Ah I see. But just consider that not all GPU "functionality" is being utilized when benching these DX9 titles. I'm thinking that, when they arrive, full-blown DX10 titles loaded with all the bells and whistles (developer pulls out all the stops) just might make the G80s work a bit harder, or more completely, in turn pulling more power to feed that tremendously large 681-million-transistor core.

Because something just doesn't add up.

1. Transistor count skyrocketed from 298 million transistors on 90nm, to 681 million on 90nm. HUGE increase. This card "should" be drawing more power than it was shown to pull.
I'm just guessing here, but TSMC did not pull off this kind of miracle. I think the core is not being fully utilized (well of course it isn't. No DX10 titles yet) and will pull more power when it actually has to do what it was designed for.

MHO

EDIT: Oh yeah, my point! Spend the extra few bucks on a PSU that will exceed the manufacturer's specs the first time around. Better to do that up front now than to find out it just can't cut it later.

Yeah... this is true. I wonder how big those L1 caches are in the streaming processors... and I just wonder whether this new arch really is that much of a power saver...

Who knows.

Like you said, maybe when it's more utilized it'll draw more power...

Unless nVidia pulled some kinda voodoo power magic
 
Originally posted by: DeathBUA
Originally posted by: keysplayr2003
Originally posted by: josh6079

You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI.
For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSUs to power them and not so much for the X1900 CF rigs?

Ah I see. But just consider that not all GPU "functionality" is being utilized when benching these DX9 titles. I'm thinking that, when they arrive, full-blown DX10 titles loaded with all the bells and whistles (developer pulls out all the stops) just might make the G80s work a bit harder, or more completely, in turn pulling more power to feed that tremendously large 681-million-transistor core.

Because something just doesn't add up.

1. Transistor count skyrocketed from 298 million transistors on 90nm, to 681 million on 90nm. HUGE increase. This card "should" be drawing more power than it was shown to pull.
I'm just guessing here, but TSMC did not pull off this kind of miracle. I think the core is not being fully utilized (well of course it isn't. No DX10 titles yet) and will pull more power when it actually has to do what it was designed for.

MHO

EDIT: Oh yeah, my point! Spend the extra few bucks on a PSU that will exceed the manufacturer's specs the first time around. Better to do that up front now than to find out it just can't cut it later.

Yeah... this is true. I wonder how big those L1 caches are in the streaming processors... and I just wonder whether this new arch really is that much of a power saver...

Who knows.

Like you said, maybe when it's more utilized it'll draw more power...

Unless nVidia pulled some kinda voodoo power magic


But to look at it another way: by the time true DX10 titles start popping up, the G80 will be the G81, on a smaller manufacturing process (80nm possibly, but more likely 65nm). It also might be on a 512-bit memory bus with a GB of GDDR4 by then as well, bringing the power draw back up again. So yeah, buy a bit larger/better than the manufacturer states and have no worries.
 
Great new features. Not sure about "must have 450 watt supply or + (or 30A)". My OCZ PowerStream 420W has 30A and will most likely meet the power requirements. This might be another case of NVIDIA being conservative (recall the 480-watt power supply requirement for the 6800 Ultra). Also, let's not forget that a P4D system with an X1950XTX probably uses more power than a Core 2 Duo with an 8800GTX if we consider whole-system consumption in aggregate 🙂 Great time to build a new system.

I must say this is probably the only time I can recall in the last 3-5 years where the top card actually justifies the price premium over the GT/GTS version at stock speeds.

GTX is simply amazing!

GTS stock performance is very disappointing for me from AT's review. Hopefully the overclocking results at Firingsquad are indicative of its potential.
 
Originally posted by: RussianSensation
Great new features. Not sure about "must have 450 watt supply or + (or 30A)". My OCZ PowerStream 420W has 30A and will most likely meet the power requirements. This might be another case of NVIDIA being conservative (recall the 480-watt power supply requirement for the 6800 Ultra). Also, let's not forget that a P4D system with an X1950XTX probably uses more power than a Core 2 Duo with an 8800GTX if we consider whole-system consumption in aggregate 🙂 Great time to build a new system.

I must say this is probably the only time I can recall in the last 3-5 years where the top card actually justifies the price premium over the GT/GTS version at stock speeds.

GTX is simply amazing!

GTS stock performance is very disappointing for me from AT's review.


I thought the GTS did pretty well. And the price of the GTS seems right in line with its performance deficit to the GTX.

GTS $499.00, GTX $649.00 (GTS about 77% of the price of the GTX)
GTS 96 shaders, GTX 128 shaders (75% of the shaders of the GTX)
GTS 640MB, GTX 768MB (83% of the RAM, but a narrower bus)

I'd say the price/performance ratio is right on the money between GTX and GTS.
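Those ratios are easy to sanity-check with a few lines; the prices are the launch figures quoted in this thread:

```python
# Sanity check of the GTS-vs-GTX ratios listed above, using the launch
# prices quoted in this thread.
gts = {"price": 499.0, "shaders": 96, "ram_mb": 640}
gtx = {"price": 649.0, "shaders": 128, "ram_mb": 768}

ratios = {k: gts[k] / gtx[k] for k in gts}
# shaders: exactly 0.75; price: ~0.77; ram_mb: ~0.83
```

So you pay about 77% of the money for 75% of the shaders, which is why the GTS looks fairly priced at stock clocks.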

Methinks we will see a $550 8800GT(O) soon enough. Mmmmmm. could be...

And don't forget, lads! These are beta drivers on a completely new architecture, not an NV40 descendant. Performance WILL improve.
 
From the clocks FS got when overclocking, the card is a beast. Of course, it's not guaranteed.

I would wait until better prices come around. I don't think the GTS's 11 frames per second in Cell Factor warrants the $500 price.
 
Now I like the fact that these cards own and all... the cooler looks like crap, but from the reviews it does well enough in the sound category.

I didn't see heat output mentioned anywhere.

Anyways.

If only the price was right... at $570 CDN, the cheapest I found, that is a little steep just for the 8800GTS... hopefully by Boxing Day, when I'm ready to blow some money, it'll be around a more reasonable $450 (which I doubt, and that's still way too high for me 😛).

Ah, can't wait till the 8600GT... if only those product cycles weren't so long compared to the high end.
 
As everyone has stated, this card is AMAZING! It really has it all, MUCH improved IQ, overscan correction (HUGE deal for me), extremely quiet, and absolutely AMAZING performance at 2560x1600 (especially using SLI). The thing has it all. ATi has its hands full with this beast out in the open. I don't think I've been this excited about a video card since the 9700Pro. The thing is simply groundbreaking. Way to go nVidia, you did a GREAT job!!! :thumbsup:
 
Originally posted by: flexy
Originally posted by: Hyperlite
Originally posted by: otispunkmeyer
Originally posted by: theprodigalrebel
Anyone who has doubts in these reviews should just go to BFG's website.
Look at the picture in the third column here

🙂

And Newegg has them for $660/$500 (GTX/GTS)

lol

i want one that says

FTMFCSCGASPFGIW

see if you can guess what that stands for !

for the mother fcking cork sucking christ... something, I dunno :laugh:


I don't even know what to say about G80. It's unbelievable, but true. ARGH! I need money!!! And a 700-watt PSU 😉

>>>
[...]GASPFGIW
>>>
G@Y A$$ S***NG PERFORMANCE FOR GAMERS I WANT ? 🙂

OK, it was

for the mother F**king c*ck sucking cum guzzling ass slapping p***y fisting gag inducing win

Don't tell the mods!
 
Originally posted by: BassBomb
Now I like the fact that these cards own and all... the cooler looks like crap, but from the reviews it does well enough in the sound category.

I didn't see heat output mentioned anywhere.

Anyways.

If only the price was right... at $570 CDN, the cheapest I found, that is a little steep just for the 8800GTS... hopefully by Boxing Day, when I'm ready to blow some money, it'll be around a more reasonable $450 (which I doubt, and that's still way too high for me 😛).

Ah, can't wait till the 8600GT... if only those product cycles weren't so long compared to the high end.

Someone did.

I can't remember which site it was now, but I remember seeing about 87 degrees somewhere.
 
Originally posted by: BFG10K
In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers.
That doesn't mean anything since they're basically just saying the control panel doesn't present those options.

The fact is 16xS (for example) has never been offered through the control panel but it's been there since the NV40.

As long as the driver hasn't removed the AA code or the registry settings you should be able to still access the xS modes through tweakers just like you can for many other AA modes.

I didn't think Nvidia offered a 4xS before?
It used to be in the control panel back in the NV2x/NV3x days but now you need a tweaker to access it. It's pretty worthless though as there's absolutely no reason not to use 8xS instead.


Right now I am unable to find 4xS/8xS in the registry, and I've been poking around for over a week now 🙁. But these are entirely new drivers. At editors' day this was the first question I asked. Hybrid modes such as 4xS/8xS and even 16xS (with 8xQ) should be possible. I do not, however, know how much priority is being put into adding them to the G80 drivers. I assume with 4xS/8xS it's relatively simple; hard to say whether 16xS is possible through 8xQ. If anyone does manage to find them in the registry, that'd be great.
 
Originally posted by: BFG10K
So why did they change the SSAA availability?
Probably because the masses find those settings unwieldy. You have to really understand the modes to use them well.

Does CSAA look better than SSAA?
No. It may be better at edges than 8xS but CSAA is still MSAA which means it can't AA textures. I would still be taking 8xS over any of the CSAA modes.

As for 16xS, I'd say it's still the best AA mode we have available, both for edges and textures.

Just imagine what SLI AA is gonna be like!
 
Parroting everyone else: absolutely amazing what nVidia has done here. The sheer performance of the GTX on beta drivers in DX9 is just incredible :Q. One thing though: I'm at college and am currently working on a Dell 9300 with a 7800GTX Go. Does anyone have any word/news on a G80 mobile part? I'd be interested to see how they could shrink that kind of power into a mobile part; maybe I'll have to wait for the refresh, especially if it's on a smaller die.
 