Originally posted by: mazeroth
Very interesting...
A side note, drakore. You're only getting 2.79 GHz out of your E4300 on water? I was able to hit 3.25 GHz without raising my voltages at all, and raised them to 1.38 V to hit 3.3 GHz with the stock cooling. You should definitely be able to get higher than that, I would think.
Originally posted by: apoppin
<div class="FTQUOTE"><begin quote>Pretty impressive</end quote></div>
not really ... depends on how they tested ... and how well the Cats run in Vista vs. FW in Vista
-and i believe the results are reversed with XP ...
<div class="FTQUOTE"><begin quote>AMD has a lead here by over 3,000 points in the default test in 3DMark 06</end quote></div>that is simply ridiculous ... AMD is optimized better for 3DMark ... my HD2900xt scores over 1000 more 3dMarks in 06 than my GTS ... it doesn't perform 10% better overall with the latest cats [period]
the ONLY thing i use 3DMark06 for is to track changes in my rig and to roughly compare with other GTSes, to make sure i'm in the 'ballpark' and that there is no serious inconsistency.
--and the next 7.7 drivers ... well, they are still in the future for the regular user
If I have a bench that's 100 seconds long and a /single/ frame gets rendered at 5000 fps (say, because you're looking at a sky that can render at 5000 fps for one frame), it's not going to matter and I don't care. Let's say that, arbitrarily, this demo ends up averaging 40 fps (a total of 4000 frames in 100 seconds).
But ...
what if, instead of one instantaneous frame that no one cares about, you pause on the sky for even 1 second? That will insert 5000 more frames into your benchmark, giving you an average of ~90 fps rather than 40. And all of this boost comes from 1/100th of the time you spend running the benchmark.
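To make the arithmetic concrete, here is a minimal sketch in Python using the hypothetical numbers from the example above (40 fps baseline, one second of sky at 5000 fps, 100-second run); these are illustrative values, not real benchmark data:

```python
# Hypothetical numbers from the example above -- not real benchmark data.
TOTAL_SECONDS = 100
BASELINE_FPS = 40
SKY_FPS = 5000

# Baseline run: every second renders at 40 fps.
baseline_frames = TOTAL_SECONDS * BASELINE_FPS                 # 4000 frames
baseline_avg = baseline_frames / TOTAL_SECONDS                 # 40.0 fps

# Skewed run: 99 normal seconds plus 1 second staring at the sky.
skewed_frames = (TOTAL_SECONDS - 1) * BASELINE_FPS + SKY_FPS   # 8960 frames
skewed_avg = skewed_frames / TOTAL_SECONDS                     # 89.6 fps

print(f"baseline average: {baseline_avg:.1f} fps")  # 40.0
print(f"skewed average:   {skewed_avg:.1f} fps")    # 89.6 -- more than double,
# even though 99% of the run played at exactly 40 fps
```

The skew happens because the frame-count average weights fast sections by how many frames they produce, not by how long you actually spent in them.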
yes my numbers are exaggerated for demonstration (though the Oblivion pause screen can render at 3k+ fps on some cards) ...
but while in the real world we don't see cases where the difference is >2x performance, we do see real skew in the averages because of things like this.
You also have to remember that the "minimum FPS" - especially in STALKER - is also dependent on the CPU, as Keys and i are finding out.
It isn't perfect to use min, but it is as close as we can get at this point.
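A minimal sketch of why min is more robust to that skew than the average; the per-second fps samples below are invented for illustration, not measured:

```python
# Made-up per-second fps samples for a 10-second run: nine "normal" seconds
# plus one sky-gazing outlier. Illustrative only.
fps_samples = [40, 38, 41, 39, 40, 42, 37, 40, 39, 5000]

average = sum(fps_samples) / len(fps_samples)   # dragged way up by the outlier
minimum = min(fps_samples)                      # ignores the fast outlier entirely

print(f"average: {average:.0f} fps")  # ~536 fps -- meaningless as a summary
print(f"minimum: {minimum} fps")      # 37 fps -- closer to what you actually felt

# Min has the opposite fragility: a single slow hitch (e.g. a CPU stall)
# defines the whole number, which is why it "isn't perfect" either.
```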