Extelleron
Diamond Member
Dec 26, 2005
Originally posted by: Matt2
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Genx87
Is the new buzzphrase ATI fanbois are going to use "Proper Review"?
I thought the war cry was, "G80 is not optimized for DX10!".
I think at the moment it's, "Wait a few more months for the driver bomb!".
To put it into perspective: how much performance did G80 gain from release drivers to current? I'm not talking stability; I'm talking pure performance.
FEAR 2560x1600 4xAA/8xAF
GeForce 8800GTX Release: 32 FPS
GeForce 8800GTX @ Ultra launch: 49 FPS
Oblivion HDR + AA 2560x1600 4xAA/8xAF
GeForce 8800GTX Release: 17.0 FPS
GeForce 8800GTX @ Ultra launch: 22.6 FPS
Two titles, gains of 53% and 33%.
From Firingsquad.
What about something more like 1920x1200 or 1600x1200? I'm really not interested in 2560x1600 numbers because I dont have a 30" LCD.
Oblivion, Release -> Ultra launch sees a 4% performance improvement @ 1920x1200 and a 2% improvement @ 1600x1200.
FEAR, Release -> Ultra launch sees a 6% performance improvement @ 1920x1200 and another 6% improvement @ 1600x1200.
CoH, Release -> Ultra launch sees a 35% performance improvement @ 2560x1600, an 18% improvement @ 1920x1200, and a 20% improvement @ 1600x1200.
It seems most of the performance increases come at 2560x1600, but CoH sees significant increases across the board.
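For anyone who wants to check the math, the gain percentages here and in the quote above are just (new - old) / old applied to the FPS numbers. A quick sanity check against the quoted FiringSquad figures (my own arithmetic, not from the article):

```python
# Percent gain from old FPS to new FPS: (new - old) / old * 100.
# Raw numbers are the FiringSquad figures quoted above.
results = {
    "FEAR 2560x1600":     (32.0, 49.0),   # (release FPS, Ultra-launch FPS)
    "Oblivion 2560x1600": (17.0, 22.6),
}
for title, (old, new) in results.items():
    gain = (new - old) / old * 100
    print(f"{title}: {old} -> {new} FPS, +{gain:.0f}%")
# Prints +53% for FEAR and +33% for Oblivion, matching the figures quoted.
```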
Unlike the GTX, though, the HD 2900XT shows clear PROBLEMS with performance. In the Inquirer review it performed faster at 2560x1600 than at 1920x1200 in one or two tests. Clearly something very weird is going on.