Rage3D G80 review


avi85

Senior member
Apr 24, 2006
988
0
0
Originally posted by: Gstanfor
Originally posted by: enz660hp
Originally posted by: tanishalfelven
THE WEIRD THING is that an 8800GTS at 8800GTX clocks is almost equal in performance. Makes one think that the power of these cards is not being fully utilized.

Uuuuhh... if they are at equal clocks the performance is equal? I don't get it...

Remember the "magical" performance increase GF3 received when the 8500 launched? :D ;)
I have a suspicion history may well repeat itself...

No, I don't, please remind me...
 

Ratchet R3D

Junior Member
Nov 29, 2006
7
0
0
Hey guys, thanks for all the comments on my review. It took me a long time to put together, so it's good to see that all the work hasn't gone unappreciated.
Originally posted by: Woofmeister
Look at the peak power requirements in the review. The GTX draws 354 Watts at peak load. Apparently the SLI certification Nvidia is giving PSUs actually means something.
That figure represents the entire system load, not just the graphics card, as CP5670 mentioned. I don't have a way to test power draw at the graphics card level.
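If anyone wants a rough back-of-envelope way to approximate the card's share from wall readings, something like this works; the idle figure and PSU efficiency below are assumptions for illustration, not measurements from the review:

```
// Back-of-envelope estimate of card-attributable power from wall readings.
// Only the 354 W peak comes from the review; the idle figure and the PSU
// efficiency are illustrative assumptions.
#include <cstdio>

int main() {
    const double peak_wall_watts = 354.0;  // whole-system peak, measured at the wall
    const double idle_wall_watts = 220.0;  // assumed reading with the GPU near idle
    const double psu_efficiency  = 0.80;   // assumed PSU efficiency at this load

    // Wall readings include PSU conversion losses; scaling by efficiency
    // approximates the DC power actually delivered inside the case. Note
    // the CPU also works harder under a game load, so this delta slightly
    // overstates the GPU alone.
    const double delta_dc = (peak_wall_watts - idle_wall_watts) * psu_efficiency;
    printf("Rough card-attributable load delta: %.0f W DC\n", delta_dc);
    return 0;
}
```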

Originally posted by: CP5670
Very nice review, as I've come to expect from Rage3D. It would have been good to see some scores for Splinter Cell: Double Agent instead of Chaos Theory, but I guess that game has issues with the 8800s.
I wanted to get Splinter Cell: Double Agent for this review but couldn't get my hands on it in time (I still haven't got it). Can it be benchmarked in the same way as SC:CT?

 

CP5670

Diamond Member
Jun 24, 2004
5,668
768
126
Good to see you here. :)

I wanted to get Splinter Cell: Double Agent for this review but couldn't get my hands on it in time (I still haven't got it). Can it be benchmarked in the same way as SC:CT?

Probably, but you may want to hold off on it in any case. Last I heard, that game had some serious stability bugs with 8800 cards and a lot of people couldn't get the game to run at all (which is a pity, as it seems the 8800s may be the only cards that can handle it well).
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Hey Ratchet, great review. I don't quite agree with this, though:
And that's not even considering the image quality improvements you get with it; looks better...
The G80 doesn't seem to be rendering as far as the 7 series in those HL2 pics. Just comparing the G80's 4xAA to the G71's 4xAA, the G80's image gets brighter in the distance, like at the log supports for the dock next to the crane.

Also, if you look at the crane's railing on its platform, the G80's AA leaves out a good chunk of it until some Transparency AA is applied, even at 16xQ.

Considering the G80 only makes a ghost effect of things rendered in the distance, I don't quite agree that it all of a sudden has better image quality, but it certainly is playable with a greater number of MSAA samples than the 7 series.

Is there any way you can put up some EverQuest II numbers? This game is still a hog on my rig when completely maxed out, and I've got a 4400+ X2 with an X1900XT. I'm curious to see how the G80s handle all of its shadows and such.
 

Ratchet R3D

Junior Member
Nov 29, 2006
7
0
0
Originally posted by: CP5670
Good to see you here. :)

I wanted to get Splinter Cell: Double Agent for this review but couldn't get my hands on it in time (I still haven't got it). Can it be benchmarked in the same way as SC:CT?

Probably, but you may want to hold off on it in any case. Last I heard, that game had some serious stability bugs with 8800 cards and a lot of people couldn't get the game to run at all (which is a pity, as it seems the 8800s may be the only cards that can handle it well).

Those problems might be worth investigating then.
 

Ratchet R3D

Junior Member
Nov 29, 2006
7
0
0
Originally posted by: josh6079
Hey Ratchet, great review. I don't quite agree with this, though:
And that's not even considering the image quality improvements you get with it; looks better...
The G80 doesn't seem to be rendering as far as the 7 series in those HL2 pics. Just comparing the G80's 4xAA to the G71's 4xAA, the G80's image gets brighter in the distance, like at the log supports for the dock next to the crane.

Also, if you look at the crane's railing on its platform, the G80's AA leaves out a good chunk of it until some Transparency AA is applied, even at 16xQ.

Considering the G80 only makes a ghost effect of things rendered in the distance, I don't quite agree that it all of a sudden has better image quality, but it certainly is playable with a greater number of MSAA samples than the 7 series.

Is there any way you can put up some EverQuest II numbers? This game is still a hog on my rig when completely maxed out, and I've got a 4400+ X2 with an X1900XT. I'm curious to see how the G80s handle all of its shadows and such.

Yep, there's a problem with fog in the Source engine on the G80, as I mentioned on that page. NVIDIA thinks it's a problem with the game and not their card or drivers, so we'll have to wait and see how that one plays out. It's particularly bad in Episode One.

Unfortunately I don't have EQII so I can't test that.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yep, there's a problem with fog in the Source engine on the G80, as I mentioned on that page. NVIDIA thinks it's a problem with the game and not their card or drivers, so we'll have to wait and see how that one plays out. It's particularly bad in Episode One.
I thought the fog issue in Source happened a lot closer, like in this scene.

However, even without the fog issue the G80 is still missing a chunk of railing on the crane's platform until Transparency AA is applied. The only modes that seemed to render that railing correctly without needing TrAA or AAA were the xS modes on the 7 series.

If you were to throw the G80s in SLI, what AA methods would that unlock? Could you do 32xAA or something, since each single card can perform 16xQ AA on its own?
Unfortunately I don't have EQII so I can't test that.
Blast. Does anyone else have an 8800GTS(X) and EQII?
 

Yreka

Diamond Member
Jul 6, 2005
4,084
0
76
Josh,

Performance on EQII isn't looking so good. From what I've read around, that game is, believe it or not, CPU-limited even with 3GHz+ C2Ds. Apparently it uses an antiquated culling system to track player and NPC positions that runs on the CPU, while more modern engines would offload some of that work to the graphics card. Shadows are also drawn on the CPU, I believe, so forget about adding those. I didn't see any performance gain from my X1900XTX, although the visuals are a little better.
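To picture the kind of per-frame CPU work that implies, here's a minimal sketch of brute-force distance culling over entity positions. It's a hypothetical illustration, not EQ2's actual code:

```
// Minimal sketch of CPU-side distance culling over player/NPC positions.
// Hypothetical structures for illustration; not EQ2's actual code.
#include <cstdio>
#include <vector>

struct Entity { float x, y, z; bool visible; };

// Walk every entity each frame and flag those within view range.
// With thousands of entities this loop runs on one CPU core, so the
// graphics card ends up waiting on it no matter how fast it is.
void cullByDistance(std::vector<Entity>& entities,
                    float camX, float camY, float camZ, float viewRange) {
    const float range2 = viewRange * viewRange;
    for (Entity& e : entities) {
        const float dx = e.x - camX, dy = e.y - camY, dz = e.z - camZ;
        e.visible = (dx * dx + dy * dy + dz * dz) <= range2;
    }
}

int main() {
    std::vector<Entity> mobs = { {10, 0, 10, false}, {900, 0, 900, false} };
    cullByDistance(mobs, 0, 0, 0, 250.0f);
    for (const Entity& e : mobs)
        printf("entity at (%.0f, %.0f): %s\n", e.x, e.z, e.visible ? "draw" : "cull");
    return 0;
}
```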

Too bad too, it really is a gorgeous game with all of the effects turned on. SOE really needs to give their code a douche, but I am not holding my breath for that to happen.
 

Ratchet R3D

Junior Member
Nov 29, 2006
7
0
0
Originally posted by: josh6079
yep there's a problem with the fog in the Source engine and the G80, as I mentioned on that page. NVIDIA thinks it's a problem with the game and not their card or drivers, so we'll have to wait and see how that one plays out. It's particularly bad in Episode One.
I thought the fog issue in Source happened a lot closer, like in this scene.
Yep, that's what I am seeing. It appears that in some games it's farther out than in others. In Half-Life 2 it's farther out than it is in Episode One and (apparently) CS: Source (which is why I used Half-Life 2 for the AA comparo). Here is what it looks like in Episode One: G80 Fog Issue with Half-Life 2 Episode One
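For reference, the usual fixed-function-style linear fog blend looks something like the sketch below. This is a generic formula for illustration, not the Source engine's actual code; the point is that each game and map sets its own fog start and end, which is why the bug shows up at different distances:

```
// Generic linear fog factor: 1.0 = unfogged, 0.0 = fully fogged.
// A sketch of the common fixed-function formula, not Source's actual code.
#include <algorithm>
#include <cstdio>

float linearFogFactor(float dist, float fogStart, float fogEnd) {
    const float f = (fogEnd - dist) / (fogEnd - fogStart);
    return std::clamp(f, 0.0f, 1.0f);
}

int main() {
    // The same object at distance 500 is nearly fogged out with a short
    // range but almost untouched with a long one; anything mishandled in
    // this blend shows up as fog crowding in too close.
    printf("short range: %.2f\n", linearFogFactor(500.0f, 100.0f, 600.0f));
    printf("long range:  %.2f\n", linearFogFactor(500.0f, 400.0f, 4000.0f));
    return 0;
}
```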

Originally posted by: josh6079
However, even without the fog issue the G80 is still missing a chunk of railing on the crane's platform until Transparency AA is applied. The only modes that seemed to render that railing correctly without needing TrAA or AAA were the xS modes on the 7 series.
I loaded the level again and checked it out closer. It looks like alpha-transparent textures are used to make the inside of the railings, while geometry is used to create the rail itself. So the screenshots are right, and the only way those parts get anti-aliased is with the mixed xS modes of the 7-series or with TrAA/AAA enabled.
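That also explains why plain MSAA can't help here: multisampling only smooths triangle edges, while the railing's inner edges come from the texture's alpha channel. As a rough sketch of the distinction (real OpenGL calls, but the railing setup is assumed and the coverage constant needs GL 1.3+ headers):

```
// Why plain MSAA misses alpha-test cutouts, and the alpha-to-coverage
// alternative that transparency AA modes build on. Sketch only; the
// railing setup is hypothetical.
#include <GL/gl.h>

void drawRailing(bool useAlphaToCoverage) {
    if (useAlphaToCoverage) {
        // Fragment alpha becomes a coverage mask over the MSAA samples,
        // so cutout edges inside a polygon get blended like real edges.
        glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    } else {
        // Classic alpha test: a hard per-pixel keep/discard decision.
        // MSAA resolves only the quad's outer edges, so the cutout edges
        // inside it stay jagged unless supersampling touches every pixel.
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.5f);
    }
    // ... bind the railing texture and submit the quad here ...
}
```

As I understand it, transparency multisampling works along these lines, while the supersampling variant shades the cutout at every sample; either way, the result is what the 16xQ + TrAA shots show.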
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: josh6079
I thought the fog issue in Source happened a lot closer, like in this scene.

Actually, I play CS a lot and sometimes I have noticed that fog with my X1900XTX as well.

However, even without the fog issue the G80 is still missing a chunk of railing on the crane's platform until Transparency AA is applied. The only modes that seemed to render that railing correctly without needing TrAA or AAA were the xS modes on the 7 series.

You have a valid point here.. I guess this can answer your question partially.. (post #1032)

If you were to throw the G80s in SLI, what AA methods would that unlock? Could you do 32xAA or something, since each single card can perform 16xQ AA on its own?

AFAIK (I don't own an 8800GTX SLI setup myself), this is feasible.. Technically it was feasible with the 7 series and Quad SLI as well, but due to limited resources it was unusable..

Blast. Does anyone else have an 8800GTS(X) and EQII?

Sorry, can't help with that. It would be nice if someone had it to investigate, though..

On a side note, thanx for the quality review Ratchet.. How typical of you ;)
I like Combatant's work as well, and all of your stuff in general.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yep, that's what I am seeing. It appears that in some games it's farther out than in others. In Half-Life 2 it's farther out than it is in Episode One and (apparently) CS: Source (which is why I used Half-Life 2 for the AA comparo).
Hmmm.....strange.
I loaded the level again and checked it out closer. It looks like alpha-transparent textures are used to make the inside of the railings, while geometry is used to create the rail itself. So the screenshots are right, and the only way those parts get anti-aliased is with the mixed xS modes of the 7-series or with TrAA/AAA enabled.
Which is my point. The G80's image quality isn't really better than other cards' in itself, but rather better when image quality per level of performance is measured. The G80 performs so well, though, that the SSAA modes would really benefit from its horsepower; you yourself said, "it is f***ing fast."

I have yet to see a product that offers the best image quality in every game. The R580s offered HDR+AA and great AF, the G71s offered the best AA, and now the G80 offers the best AF and HDR+AA. It just seems like they are only aiming to surpass the competition's features and not build on their already strong ones. It'd be like ATI releasing a card that performs better than anything out and can do 8xS, but uses worse AF than their R5*** series.

You said:
The G80 ROPs support multisampled, supersampled, and transparency adaptive antialiasing...
so hopefully we'll see the xS modes in a driver update or, at the very least, through a registry hack. It makes no sense to keep the more demanding AA modes on the weaker cards.
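For anyone wondering why the xS modes are the demanding ones, here's some rough sample arithmetic. The mode compositions are the commonly cited ones for the 7-series hybrids, so treat them as assumptions:

```
// Rough arithmetic on hybrid AA cost: xS modes layer ordered-grid
// supersampling on top of multisampling, so shading cost scales with
// the SS factor. Compositions below are commonly cited, not official.
#include <cstdio>

int main() {
    struct Mode { const char* name; int ssFactor; int msSamples; };
    const Mode modes[] = {
        {"4xMSAA", 1, 4},  // pure multisampling: 1x shading cost
        {"8xS",    2, 4},  // assumed 1x2 SS + 4x MS
        {"16xS",   4, 4},  // assumed 2x2 SS + 4x MS
    };
    for (const Mode& m : modes)
        printf("%-6s -> %2d samples/pixel, %dx shading cost\n",
               m.name, m.ssFactor * m.msSamples, m.ssFactor);
    return 0;
}
```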
Actually, I play CS a lot and sometimes I have noticed that fog with my X1900XTX as well.
Is it to that extent?
On a side note, thanx for the quality review Ratchet.. How typical of you
I like Combatant's work as well, and all of your stuff in general.
QFT. Great article, and very thorough.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: josh6079
so hopefully we'll see the xS modes in a driver update or, at the very least, through a registry hack. It makes no sense to keep the more demanding AA modes on the weaker cards.

We all hope for those modes.. I guess this would be feasible in the near future as well..

Is it to that extent?

Nope, certainly not..
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
How can you say the 8800GTX/GTS is better than the 9700 release? The 9700 is still being used today. We have no freaking clue how the 8800 does in DX10; it could be awesome or it could be horrific.

I think the 8800 is a good card, but don't push this bull about it being better than the 9700.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Zstream
How can you say the 8800GTX/GTS is better than the 9700 release? The 9700 is still being used today. We have no freaking clue how the 8800 does in DX10; it could be awesome or it could be horrific.

I think the 8800 is a good card, but don't push this bull about it being better than the 9700.

Someone's jealous? ;)

Compared to the 9700's launch and its ability to play then-current DirectX 8 titles, this launch is in the same league. We might not know the future benefits of these cards, but for now, they're as much faster in DirectX 9 games as the 9700 was in DirectX 8 games.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: munky
Originally posted by: Gstanfor
Originally posted by: enz660hp
Originally posted by: tanishalfelven
THE WEIRD THING is that an 8800GTS at 8800GTX clocks is almost equal in performance. Makes one think that the power of these cards is not being fully utilized.

Uuuuhh... if they are at equal clocks the performance is equal? I don't get it...

Remember the "magical" performance increase GF3 received when the 8500 launched? :D ;)
I have a suspicion history may well repeat itself...

I just hope it doesn't receive the same "magical" performance increase the FX 5800 received shortly after it launched.

I don't think we'll be seeing that again anytime soon. Besides, the G80 can already use the existing driver optimizations if the user so desires (and while there is some IQ loss as a result, the end result is still pretty darn good).
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: avi85
Originally posted by: Gstanfor
Originally posted by: enz660hp
Originally posted by: tanishalfelven
THE WEIRD THING is that an 8800GTS at 8800GTX clocks is almost equal in performance. Makes one think that the power of these cards is not being fully utilized.

Uuuuhh... if they are at equal clocks the performance is equal? I don't get it...

Remember the "magical" performance increase GF3 received when the 8500 launched? :D ;)
I have a suspicion history may well repeat itself...

No, I don't, please remind me...

My pleasure. Click here.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Nightmare225
Originally posted by: Zstream
How can you say the 8800GTX/GTS is better than the 9700 release? The 9700 is still being used today. We have no freaking clue how the 8800 does in DX10; it could be awesome or it could be horrific.

I think the 8800 is a good card, but don't push this bull about it being better than the 9700.

Someone's jealous? ;)

Compared to the 9700's launch and its ability to play then-current DirectX 8 titles, this launch is in the same league. We might not know the future benefits of these cards, but for now, they're as much faster in DirectX 9 games as the 9700 was in DirectX 8 games.

Emm, what exactly are you talking about?

http://www.techwarelabs.com/reviews/video/ati_radeon9700p/index_5.shtml

Seems to be more than 30% with AA + AF.

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Cookie Monster
DX10 is where G80 really belongs.

It's not unified for nothing, you know.

Hope you are right....... Actually, I hope we see at least one game using more than token DX10 in the next 2 years. DX9 took forever, and it looks so much better than DX8.


edit: Many have stated that the G80 is a bigger jump than the 9700 Pro was. They are likely right (hoping this is not bigger-than-Elvis hype), but as far as I understand, for current games on a standard decent monitor, the improvement in IQ and gameplay is quite small. We need games that can use this stuff!
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Does anyone think we'll see a GT version (as specified in the driver registry) within the next 3 months?
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: ronnn
Originally posted by: Cookie Monster
DX10 is where G80 really belongs.

It's not unified for nothing, you know.

Hope you are right....... Actually, I hope we see at least one game using more than token DX10 in the next 2 years. DX9 took forever, and it looks so much better than DX8.


edit: Many have stated that the G80 is a bigger jump than the 9700 Pro was. They are likely right (hoping this is not bigger-than-Elvis hype), but as far as I understand, for current games on a standard decent monitor, the improvement in IQ and gameplay is quite small. We need games that can use this stuff!

As far as IQ goes, I think the much-needed AF overhaul and AA improvements will be apparent no matter what game it is (DX9). And as for the improvement in gameplay being quite small... well, I'm not sure what benchmarks you're referring to. :confused:

But I do get excited just thinking about the potential developers have to create software that fully utilizes these next-gen architectures (G80 and R600).

Nelsieus

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I think we'll definitely see a GT version at some point. You only have to look at previous GPU families to see this is an inevitable consequence of the binning selection process used to maximize profits from delivered yields.
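For those unfamiliar with binning, the idea is simple enough to sketch: every die gets tested, and each one is sold as the highest SKU it qualifies for. The thresholds below echo the published GTX/GTS shader counts and clocks, but the classifier and sample dies are a hypothetical illustration:

```
// Toy illustration of die binning: test each die, sell it as the highest
// SKU it qualifies for. Thresholds echo the published GTX/GTS specs, but
// the classifier and sample dies are hypothetical.
#include <cstdio>

struct Die { int workingShaderUnits; int stableClockMHz; };

const char* binDie(const Die& d) {
    if (d.workingShaderUnits >= 128 && d.stableClockMHz >= 575) return "8800GTX";
    if (d.workingShaderUnits >= 96  && d.stableClockMHz >= 500) return "8800GTS";
    return "salvage (a future GT?)";  // partially defective or slow dies
}

int main() {
    const Die samples[] = { {128, 600}, {96, 510}, {80, 450} };
    for (const Die& d : samples)
        printf("%3d units @ %d MHz -> %s\n",
               d.workingShaderUnits, d.stableClockMHz, binDie(d));
    return 0;
}
```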
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
If you were to throw the G80s in SLI, what AA methods would that unlock?
AFAIK only SLI 8x is operational at the moment.

nVidia are sitting on an absolute gold mine with regard to the potential new AA modes revolving around 8xQ/16xQ, but they're really dragging their heels.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: BFG10K
If you were to throw the G80s in SLI, what AA methods would that unlock?
AFAIK only SLI 8x is operational at the moment.

nVidia are sitting on an absolute gold mine with regard to the potential new AA modes revolving around 8xQ/16xQ, but they're really dragging their heels.

CSAA at 32x with SLI just might be possible. Not to mention nVIDIA will definitely bring the 4xS/8xS/16xS modes back.

As BFG10K said, they are sitting on a gold mine of potential.
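For context on what makes CSAA different: it tracks more coverage samples than it stores full color/Z samples, which is how 16x-grade edge quality comes at roughly 4x storage cost. The counts below are the commonly documented G80 modes, so treat them as assumptions:

```
// Rough sketch of CSAA bookkeeping: coverage samples outnumber stored
// color/Z samples. Counts are the commonly documented G80 modes; treat
// them as assumptions, not official spec.
#include <cstdio>

int main() {
    struct CsaaMode { const char* name; int coverage; int colorZ; };
    const CsaaMode modes[] = {
        {"8x",    8, 4},
        {"8xQ",   8, 8},
        {"16x",  16, 4},
        {"16xQ", 16, 8},
    };
    for (const CsaaMode& m : modes)
        printf("%-4s: %2d coverage samples, %d color/Z samples\n",
               m.name, m.coverage, m.colorZ);
    return 0;
}
```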
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: BFG10K
If you were to throw the G80s in SLI, what AA methods would that unlock?
AFAIK only SLI 8x is operational at the moment.

nVidia are sitting on an absolute gold mine with regard to the potential new AA modes revolving around 8xQ/16xQ, but they're really dragging their heels.

Odd for nvidia to drag their heels, as you put it. But I have a little thought. Far-fetched? Maybe, but here goes.

Maybe nvidia is holding back intentionally. Why, you ask? Nvidia has released the last two gens well before ATI's corresponding gens. This gave ATI plenty of time to see how Nvidia hardware performed, tweak their clocks and drivers, and do whatever they had to do to be competitive or just best nvidia's fps in games. It could be possible (some may not see it this way, but possible) that nvidia is intentionally holding back and "dragging heels" to give DAAMIT a false sense of security about the G80's current performance with nvidia's latest driver offering. DAAMIT will do what they need to do to get their R600 just a scooch faster than current G80 performance for the R600's January debut, which is what ATI has done the last two gens.

I think Nvidia likes to be first to market for obvious reasons, but the downside of being out first is letting your competition know exactly what they have to do to best you. So put a grain-of-salt label on this theory if you desire; it's just a few thoughts jotted down here in the forum.

Keys