To Anandtech:

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
The G80s can do HDR+AA, we've had other cards that could do it since Q1 of '06, and yet none of your Oblivion benchmarks ever enable it. Why?

I only ask because I was recently looking through the 8800GTX "Best of the Best" article, and when I came to this section I was disappointed not to see such a feature enabled when comparing nothing but cards that all support it. Who would get that kind of card, run the game at that resolution, and not use AA with HDR in that game?
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
They probably wanted to compare to the 7950s and 7900s -- if they used AA they couldn't have the 7xxx cards in the chart. :(
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: schneiderguy
They probably wanted to compare to the 7950s and 7900s -- if they used AA they couldn't have the 7xxx cards in the chart. :(

Nah, that doesn't seem like a good reason to omit the results. If anything it's because of time constraints. Not 100% sure current drivers allow for it, either.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
If that is true, then why did they even have the 7 series in the "GeForce 8800 Roundup: The Best of the Best" article? By that title they should only review the different G80s, not an ATi card or a previous nVidia card.

Also, if they really did want to incorporate the 7 series in that article, why even bench Oblivion? It's not like we didn't know how the G71 performed in that game in the first place. The new and useful information would be the G80's overclocked performance and what kind of performance you can get with its IQ features enabled.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: gersson
Nah, that doesn't seem like a good reason to omit the results. If anything it's because of time constraints. Not 100% sure current drivers allow for it, either.
Others have published results.

 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: josh6079
Originally posted by: gersson
Nah, that doesn't seem like a good reason to omit the results. If anything it's because of time constraints. Not 100% sure current drivers allow for it, either.
Others have published results.

I needs pics! :p
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I needs pics!
Indeed. We need some owners to put up some screenies ASAP.
Here's your answer, josh: AT video reviews suck, that's why.
It depends on what information you're seeking. Their reviews are pretty comprehensive in other respects, just not for HDR+AA, which is what this thread is about.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
While you're at it, ask AT why they refused to bench the G70/G71 cards at HQ driver settings when a number of more insightful review sites admitted that the shimmering problem really does exist when using default settings. :thumbsdown:
 

Rangoric

Senior member
Apr 5, 2006
530
0
71
*ponder* Aren't previous reviews worthless for comparison in this case?

Or do all reviews use the exact same computers, just changing the Video Card?

As a general comparison of DIFFERENT BRANDS of the same video card, including the previous generation as a reference point is a good idea.

I'd rather know the extra frames are from the video card and not the brand spankin' new chip/memory.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I don't like AT reviews anymore. For in-depth GPU architecture reviews, go to beyond3d. For theoretical tests, beyond3d and digit-life are the best. For IQ comparisons you've got to go to hothardware, hardocp and rage3d. For performance reviews I go to either hardocp or bit-tech (especially for the 8800 series). For numerous game benches, xbitlabs. Guru3d for the pics :) and rage3d for colourful graphs :D

There are others like techreport and AT, but they seem way too generic.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: munky
While you're at it, ask AT why they refused to bench the G70/G71 cards at HQ driver settings when a number of more insightful review sites admitted that the shimmering problem really does exist when using default settings. :thumbsdown:

Umm they did.

Link

Most of the time it is 1-5fps slower for the 7 series, whereas the 8 series only loses around 1fps or so. Don't be so quick to jump on one issue, munky.

cuz they're paid by nvidia not to write on every page that the 7xxx series can't do HDR+AA when the X1xxx can.

Oh, get off of it. They mentioned it earlier in the article. The X1xxx series cards would be below a frame per second at those resolutions, the 7 series can't do it at all, and the 8 series cards were already hovering around 20fps

Link

Additionally, AT showed earlier how little of a performance loss was seen when CSAA was enabled.

Come on, people. You are smarter than this. Think.

-Kevin
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: josh6079
This guy's thread was more helpful than Anandtech's article as far as determining what kind of performance increases you can see from overclocking a G80 and using logical settings. $500 for a G80 and then run HDR with no AA, 16xAF, and 1920x1440? What do they think it is, a 7950GX2?

Umm, did you see the resolutions they were benching at? Enabling AA would be illogical.

Additionally, in their next article they overclock the 8 series and bench it at those settings.

-Kevin
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gamingphreek
Originally posted by: munky
While you're at it, ask AT why they refused to bench the G70/G71 cards at HQ driver settings when a number of more insightful review sites admitted that the shimmering problem really does exist when using default settings. :thumbsdown:

Umm they did.

Link

Most of the time it is 1-5fps slower for the 7 series, whereas the 8 series only loses around 1fps or so. Don't be so quick to jump on one issue, munky.
It's interesting that Anandtech didn't delve into the 7 series HQ performance hit or "Default" image settings until after nVidia introduced a different cash cow.

Munky was obviously talking about their prior 7 series reviews that were stuffed to the brim with illogical settings, not the first G80 review that finally did an overdue analysis of the 7 series' horrid AF.

The X1xxx series cards would be below a frame per second at those resolutions...
At what resolutions, 1920x1440?
...the 8 series cards were already hovering around 20fps

Link
The GTS was at stock, which is the point of the article. An overclocked GTS can get performance very similar to, if not better than, a reference GTX. Considering a reference GTX scored 24 frames at a resolution next to no one uses, I'd say that their 1920x1440 numbers should have had some AA enabled as well.
Additionally, AT showed earlier how little of a performance loss was seen when CSAA was enabled.
So why not include it? You can't make the argument that enabling it would have been too much and then say that enabling CSAA gives too little of a performance hit.

EDIT:

Umm, did you see the resolutions they were benching at? Enabling AA would be illogical.
This coming from the same person who defends "how little of a performance loss was seen when CSAA was enabled."

Apparently it was logical enough for other review sites to do at higher resolutions than Anandtech's latest 1920x1440 bench.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
It's interesting that Anandtech didn't delve into the 7 series HQ performance hit or "Default" image settings until after nVidia introduced a different cash cow.

Munky was obviously talking about their prior 7 series reviews that were stuffed to the brim with illogical settings, not the first G80 review that finally did an overdue analysis of the 7 series' horrid AF.

When was the last time they did a 7 series review? This was a review of the G80, not an in-depth analysis of what went wrong with the G70. If you want that, find a site that reviewed the G70 midway through its lifespan, not one that is reviewing a completely new card. But just for kicks, they did discuss its AA and AF algorithms.

At what resolutions, 1920x1440?

No. At 2560x1600 with HDR+AA (which were the settings AT was graphing them at), the X1xxx series would be below a frame per second. It is pointless to bench a card if it is going to run that slow. You wouldn't even be able to accurately compare it to the others.

The GTS was at stock, which is the point of the article. An overclocked GTS can get performance very similar to, if not better than, a reference GTX. Considering a reference GTX scored 24 frames at a resolution next to no one uses, I'd say that their 1920x1440 numbers should have had some AA enabled as well.

READ WHAT THE REVIEW SAYS. They are not benching at 1920x1440. They are benching at 2560x1600. At 2560x1600, there is no point enabling AA because the GTS is at 17.8fps and the GTX is at 24fps. No matter how little CSAA costs, the hit brings them down too far to be worth benching.

Next to no one uses it because no other card can run it. Next to no one uses it because next to no one is going to shell out 500-650 dollars for a card. The people buying this card are probably going to be driving huge 30" displays, or something. At 1600x1200 the 8 series was CPU-limited. Granted, AA would probably have fixed that a little.

So why not include it? You can't make the argument that enabling it would have been too much and then say that enabling CSAA gives too little of a performance hit.

Because even a minor performance hit at 17.8 and 24 fps is huge for them. The results go from playable to horribly unplayable. Even just a 5 fps drop (which is about all there is for 2x) absolutely kills both of the cards.
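Just to put rough numbers on that (this is my own back-of-the-envelope sketch, using only the 17.8fps and 24fps figures quoted in this post, not anything AT published):

# Rough estimate of what a ~5fps AA hit means at 2560x1600,
# based on the GTS/GTX figures quoted above.
baseline = {"8800GTS": 17.8, "8800GTX": 24.0}
aa_cost = 5.0  # approximate hit mentioned for 2x AA

for card, fps in baseline.items():
    with_aa = fps - aa_cost
    print(f"{card}: {fps} -> {with_aa:.1f} fps ({aa_cost / fps * 100:.0f}% slower)")

Even that small absolute drop is a 20-30% relative hit on numbers that are already borderline.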

Your last link "other review sites" is broken, so I can't argue anything there other than stand by my point and say that at the resolutions AT was initially benching at, it is illogical to enable AA and bench.

-Kevin
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Great job, Gamingphreek, for taking your time explaining things to Josh.

I'd be irritated by Josh's sections and sections of pointless questions.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
No. At 2560x1600 with HDR+AA (which were the settings AT was graphing them at)
Exactly: the newest G80 benchmarks (the ones I linked in my OP) were intended to show which G80 was the best pick, and they were benched at 1920x1440, not 2560x1600. That is the bench I am referring to.
This was a review of the G80, not an in-depth analysis of what went wrong with the G70. If you want that, find a site that reviewed the G70 midway through its lifespan, not one that is reviewing a completely new card.
Like I said, why include it then? A 7 series GPU has no place in a G80 "roundup" article.
the X1xxx series would be below a frame per second. It is pointless to bench a card if it is going to run that slow.
It was pointless to bench the X1k series in the first place, seeing as the article was a G80 Roundup.
READ WHAT THE REVIEW SAYS. They are not benching at 1920x1440.
Yes they were.

You're confused and looking at the initial G80 reviews, not the "Best of the Best" review that this thread is based on.
They are benching at 2560x1600. At 2560x1600, there is no point enabling AA because the GTS is at 17.8fps and the GTX is at 24fps. No matter how little CSAA costs, the hit brings them down too far to be worth benching.
It's just as illogical to bench the game at 2560x1600. Next to no one has that size of monitor. A more practical approach would be to bench the game with HDR+AA at lower resolutions of 1600x1200, 1920x1440, and the like.
The people buying this card are probably going to be driving huge 30" displays, or something.
The people I've seen who have this card do not have 30" displays with that kind of resolution. Many bought it for the increased IQ and better performance, not to power a 30" monitor they've been struggling to drive. Even if they do have a 30" monitor, they'll probably use some 8800GTXs in SLI, considering an SLI setup is almost a necessity for that kind of display.
At 1600x1200 the 8 series was CPU-limited.
That's debatable.

If they turn on all of the eye candy, use a resolution of 16x12 or greater, and don't skimp on AA, most games will show a GPU bottleneck before a CPU bottleneck.
Because even a minor performance hit at 17.8 and 24 fps is huge for them.
Once again, you're concentrating on the wrong benchmark. Their latest one was at 1920x1440, and the frames were 25.2 for a stock GTS and 33.1 for a stock GTX. Once they overclocked the cards without using aftermarket cooling, the GTS's frames were 35.1 and the GTX's were 41.8. This bench was more practical and showed enough cushion for some AA. The G80s can use 4xAA like the G71s and R580s could use 2xAA (it uses 2 ROPs on the new architecture whereas before it used 4), so the performance hit from using just 4xAA is literally next to nothing.
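To put a rough number on that cushion (my own quick sketch, using only the stock and overclocked figures above):

# Overclocking headroom at 1920x1440, from the stock vs. overclocked fps quoted above.
stock = {"8800GTS": 25.2, "8800GTX": 33.1}
overclocked = {"8800GTS": 35.1, "8800GTX": 41.8}

for card in stock:
    gain = (overclocked[card] - stock[card]) / stock[card] * 100
    print(f"{card}: {stock[card]} -> {overclocked[card]} fps (+{gain:.0f}%)")

That is roughly a 39% gain for the GTS and 26% for the GTX, which is plenty of headroom to spend on some AA.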
Your last link "other review sites" is broken...
Hmm... strange, it worked for me a few hours ago. They showed some Oblivion frames at 2048x1536 using HDR with 16xAA and 16xQAA. I posted their scores in another thread. I'll see if I can find it.
I can't argue anything there other than stand by my point and say that at the resolutions AT was initially benching at, it is illogical to enable AA and bench.
It was illogical to bench the cards at 2560x1600 without providing other, smaller resolutions. Having only one resolution as a benchmark is weak, especially if that one resolution is as rare as 2560x1600.

Anandtech is a well-known hardware review site with an educated staff backing it. Their mark of approval comes stamped on some of the hardware we buy (i.e. my Lanparty board came with an "Anandtech Editor's Choice Award" sticker on it, as if that means they looked over all of that product's capabilities and determined it was the best board of its time). When they're looking into the overclocking options of the G80, they want to see the maximum performance at each G80's highest clocks. By not using any AA, their numbers could be distorted, since nothing is as performance-hungry as higher levels of AA. For those trying to decide between a GTS and a GTX, the overclocking they did means little if they leave out AA in a demanding game.

I'm sick of seeing such unrealistic benches from an educated and popular site like this. They need to use more common resolutions with practical settings, not some 2560x1600 res without half of the card's eye candy on.

EDIT:

I found those numbers I posted for that broken link:


Oblivion 16xAA+HDR / 16xAF (fps)

Card            1280x1024   1600x1200   1920x1200   2048x1536
XFX 8800GTX     95.921      86.899      70.591      54.103
Asus 8800GTX    95.838      86.988      70.941      54.006
EVGA 8800GTS    82.566      68.585      55.360      41.117

Oblivion 16xQAA+HDR / 16xAF (fps)

Card            1280x1024   1600x1200   1920x1200   2048x1536
XFX 8800GTX     70.278      47.045      38.784      30.360
Asus 8800GTX    70.184      46.995      38.156      30.303
EVGA 8800GTS    51.778      33.739      28.488      21.998
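For what it's worth, here is a quick script of mine (not from that review) that works out the cost of stepping up from 16xAA to 16xQAA using the 2048x1536 numbers above:

# Relative cost of 16xQAA vs. 16xAA at 2048x1536, per the figures above.
aa16 = {"XFX 8800GTX": 54.103, "Asus 8800GTX": 54.006, "EVGA 8800GTS": 41.117}
qaa16 = {"XFX 8800GTX": 30.360, "Asus 8800GTX": 30.303, "EVGA 8800GTS": 21.998}

for card, fps in aa16.items():
    drop = (fps - qaa16[card]) / fps * 100
    print(f"{card}: {fps:.1f} -> {qaa16[card]:.1f} fps ({drop:.0f}% slower with 16xQAA)")

So 16xQAA costs roughly 44-47% at that resolution, but plain 16xAA with HDR is clearly playable on all three cards.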
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
I'd be irritated by Josh's sections and sections of pointless questions.
What does that useless opinion of yours have to do with Anandtech's benches again?

Let me rephrase:

Great job, Gamingphreek, for taking your time responding to Josh's utterly pointless questions/interrogations.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
What exactly about Josh's question is pointless...? One of the major benefits of the X1k cards over NV's 7 series was HDR+AA, and Oblivion was used more than any other title to highlight this difference. So, now that NV can do it too, I think a comparison between the two makes perfect sense. This is even more so because even an X1900XT is pretty much enough (and the 8800GTX overkill) for most games, so you have to focus on the areas that G80 improves upon.

Also, Kevin generally makes good points, but I think he's off base this time. A high resolution like 2560x1600 doesn't negate the need for AA. A monitor with that kind of resolution is generally also a very large monitor, so the aliasing will be just as bad at 2560x1600 on a 30" monitor as it would at 1680x1050 on a 20" screen. The fact that almost no one has a 2560x1600, 30" monitor just adds to the irrelevance of the benchmark. Sure, it's impressive that G80 can play Oblivion at 2560x1600, but how many people really care? I prefer quality over quantity. I have a 20" screen (probably get a 24" soon) because it allows me to play with a lot of eye candy. A 30" LCD would just be a whole lot of expensive ugly in most cases, and this benchmark only serves to prove that.

I think that as the regular readers and forum members of AT, we have the right to ask these kinds of questions. By the same token, AT can choose to respond, pay attention to, or simply ignore these types of questions. Hopefully, they will view them not as negative feedback, but as constructive criticism and a genuine desire to help make AT a better site, which I believe is the motive behind the original question.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: nitromullet
What exactly about Josh's question is pointless...? One of the major benefits of the X1k cards over NV's 7 series was HDR+AA, and Oblivion was used more than any other title to highlight this difference. So, now that NV can do it too, I think a comparison between the two makes perfect sense. This is even more so because even an X1900XT is pretty much enough (and the 8800GTX overkill) for most games, so you have to focus on the areas that G80 improves upon.

Also, Kevin generally makes good points, but I think he's off base this time. A high resolution like 2560x1600 doesn't negate the need for AA. A monitor with that kind of resolution is generally also a very large monitor, so the aliasing will be just as bad at 2560x1600 on a 30" monitor as it would at 1680x1050 on a 20" screen. The fact that almost no one has a 2560x1600, 30" monitor just adds to the irrelevance of the benchmark. Sure, it's impressive that G80 can play Oblivion at 2560x1600, but how many people really care?

I think that as the regular readers and forum members of AT, we have the right to ask these kinds of questions. By the same token, AT can choose to respond, pay attention to, or simply ignore these types of questions. Hopefully, they will view them not as negative feedback, but as constructive criticism and a genuine desire to help make AT a better site, which I believe is the motive behind the original question.

Well said. I think AT should use resolutions and AA settings that the vast majority of people use. What percentage of people in the PC market are using 2560x1600? Less than a half of one percent is my guess.

1280x1024, 1680x1050, 1600x1200, and 1920x1200 are the most commonly used resolutions.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: nitromullet
What exactly about Josh's question is pointless...?... the original question.

Everything except the bolded part.

I think there is a review that points out there are still driver bugs preventing AA+HDR in Oblivion on the 8800.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
I'm getting the vibe from this thread that you guys are accusing Anandtech of being so wrapped up with Nvidia that somehow their reviews suck now.

Too bad they don't.

If you want other info that they don't provide, go elsewhere for it. It's that easy. If you are lucky, they might get back to you on all these questions, but the easiest answer is: if you don't like their reviews, either don't visit the site or find the information elsewhere.

I've always found Anandtech articles to be the best overall. They are very complete and don't do that silly "playable settings" crap. I hate that.

Great review AT. :thumbsup: