No. At 2560x1600 with HDR+AA (the settings AT was graphing them at).
Exactly. The newest G80 benchmarks, the ones I linked in my OP, were intended to show which G80 was the best pick, and they were benched at 1920x1440, not 2560x1600. That is the bench I am referring to.
This was a review of the G80, not an in-depth analysis of what went wrong with the G70. If you want that, find a site that reviewed the G70 mid-way through its lifespan, not one that is reviewing a completely new card.
Like I said, why include it then? A 7-series GPU has no place in a G80 "roundup" article.
the X1xxx series would be down at single-digit frames per second. It is pointless to bench a card if it is going to run that slowly.
It was pointless to bench the X1k series in the first place, seeing as the article was a G80 Roundup.
READ WHAT THE REVIEW SAYS. They are not benching at 1920x1440.
Yes they were.
You're confused and looking at the initial G80 reviews, not the "Best of the Best" review that this thread is based on.
They are benching at 2560x1600, and at that resolution there is no point enabling AA because the GTS is at 17.8fps and the GTX is at 24fps. No matter how small the CSAA hit is, it brings frame rates down too far to be worth benching.
It's just as illogical to bench the game at 2560x1600; next to no one has a monitor that size. A more practical approach would be to bench the game with HDR+AA at lower resolutions such as 1600x1200 and 1920x1440.
The people buying this card are probably going to be driving huge 30" displays, or something.
The people I've seen who have this card do not have 30" displays with that kind of resolution. Many are buying it for the increased IQ and better performance, not to power a 30" monitor they've been trudging along on. Even if they do have a 30" monitor, they'll probably run 8800GTXs in SLI, considering an SLI setup is almost a necessity for that kind of display.
At 1600x1200 the 8 series was CPU limited.
That's debatable.
If they turn on all of the eye candy at a resolution of 16x12 or greater and don't skimp on AA, most games will show a GPU bottleneck before a CPU one.
Because even a minor performance hit at 17.8 and 24 fps is huge for them.
Once again, you're concentrating on the wrong benchmark. Their latest one was at 1920x1440, and the frames were 25.2 for a stock GTS and 33.1 for a stock GTX. Once they overclocked the cards without aftermarket cooling, the GTS's frames were 35.1 and the GTX's were 41.8. This bench was more practical and showed enough cushion for some AA. The G80s can use 4xAA the way the G71s and R580s could use 2xAA (the new architecture spends 2 ROPs on it where the old one used 4), so the performance hit from just 4xAA is next to nothing.
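To put those overclocking numbers in perspective, here is a quick back-of-the-envelope calculation (my own illustration, not from the article) of the relative gain from the stock and overclocked frame rates quoted above:

```python
# Frame rates quoted from AnandTech's 1920x1440 bench (stock vs. overclocked).
stock = {"8800GTS": 25.2, "8800GTX": 33.1}
overclocked = {"8800GTS": 35.1, "8800GTX": 41.8}

for card in stock:
    gain = (overclocked[card] - stock[card]) / stock[card] * 100
    print(f"{card}: {stock[card]} -> {overclocked[card]} fps (+{gain:.1f}%)")
```

That works out to roughly a 39% gain for the GTS and 26% for the GTX, which is why the overclocked results leave room for AA.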
Your last link "other review sites" is broken...
Hmm, strange; it worked for me a few hours ago. They showed some Oblivion frames at 2048x1536 using HDR with 16xAA and 16xQAA. I reflected their scores in a post of mine in another thread. I'll see if I can find it.
I can't argue anything there other than stand by and say that at the resolutions AT was initially benching at, it is illogical to enable AA and bench.
It was illogical to bench the cards at 2560x1600 without providing other, smaller resolutions. Having only one resolution as a benchmark is weak, especially when that one resolution is as rare as 2560x1600.
Anandtech is a well-known hardware review site with an educated staff behind it. Their mark of approval comes stamped on some of the hardware we buy (e.g. my Lanparty board came with an "Anandtech Editor's Choice Award" sticker on it, as if to say they had looked over all of that product's capabilities and determined it was the best board of its time). When they're looking into the overclocking options of the G80, they want to see the maximum performance at each G80's highest clocks. By not using any AA, their numbers could be distorted, since nothing is as performance-hungry as higher levels of AA. For those trying to decide between a GTS and a GTX, the overclocking they did means little if AA is left out in a demanding game.
I'm sick of seeing such unrealistic benches from an educated and popular site like this. They need to use more common resolutions with practical settings, not a 2560x1600 res with half of the card's eye candy turned off.
EDIT:
I found those numbers I reflected for that broken link:
Oblivion 16xAA+HDR / 16xAF
               1280x1024  1600x1200  1920x1200  2048x1536
XFX 8800GTX       95.921     86.899     70.591     54.103
Asus 8800GTX      95.838     86.988     70.941     54.006
EVGA 8800GTS      82.566     68.585     55.360     41.117

Oblivion 16xQAA+HDR / 16xAF
               1280x1024  1600x1200  1920x1200  2048x1536
XFX 8800GTX       70.278     47.045     38.784     30.360
Asus 8800GTX      70.184     46.995     38.156     30.303
EVGA 8800GTS      51.778     33.739     28.488     21.998
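Those two tables make the AA cost easy to quantify. A short sketch (my own, using the numbers above, GTX and GTS only since the two GTX cards are near-identical) computing the percentage drop from 16xAA to 16xQAA at each resolution:

```python
# Oblivion HDR / 16xAF frame rates from the tables above.
# Keyed by card, then resolution: (16xAA fps, 16xQAA fps).
frames = {
    "XFX 8800GTX":  {"1280x1024": (95.921, 70.278),
                     "1600x1200": (86.899, 47.045),
                     "1920x1200": (70.591, 38.784),
                     "2048x1536": (54.103, 30.360)},
    "EVGA 8800GTS": {"1280x1024": (82.566, 51.778),
                     "1600x1200": (68.585, 33.739),
                     "1920x1200": (55.360, 28.488),
                     "2048x1536": (41.117, 21.998)},
}

for card, rows in frames.items():
    for res, (aa, qaa) in rows.items():
        drop = (aa - qaa) / aa * 100
        print(f"{card} @ {res}: {aa} -> {qaa} fps ({drop:.1f}% drop)")
```

Stepping up to 16xQAA costs roughly 27-46% on the GTX and 37-51% on the GTS depending on resolution, which is exactly the kind of AA scaling data the roundup should have included.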