<<
<< 3DMark2K1 rates pixel and vertex shaders. So a GeForce 3 or Radeon 8500 would probably have significantly higher scores. >>
I think only the games are counted. >>
Yes, only the game tests are counted, but the extra features of the 8500 and GF3 that drive those extra "theoretical" tests also translate into the "real world" performance seen in the game tests (particularly the Nature test). Btw, there is a significant difference in the Nature test between tweaks and cards. I got a slideshow at 3 fps with my V5; my GF3 runs it at 50 fps now, and ran it at 30 fps before I tweaked it.
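(To make the "only the games are counted" point concrete: a composite score like 3DMark's can be thought of as a weighted sum over the game tests only, with the theoretical tests contributing nothing directly. The sketch below is purely illustrative; the test names and weights are made up, not MadOnion's actual formula.)

```python
# Purely illustrative: a composite benchmark score built only from game-test
# fps. Test names and weights are hypothetical, not the real 3DMark formula.

def composite_score(game_fps, weights):
    """Weighted sum of game-test fps; theoretical tests contribute nothing."""
    return sum(weights[test] * fps for test, fps in game_fps.items())

game_results = {              # average fps per game test (made-up numbers)
    "game1_low_detail": 95.0,
    "game1_high_detail": 40.0,
    "game2_low_detail": 110.0,
    "game2_high_detail": 55.0,
    "game3_low_detail": 90.0,
    "game3_high_detail": 50.0,
    "game4_nature": 30.0,     # only runs on DX8-class cards like the GF3/8500
}
weights = {test: 10.0 for test in game_results}   # flat weights, illustrative

print(f"Score: {composite_score(game_results, weights):.0f}")
```

A card that can't run a test (like Nature on a pre-DX8 card) simply contributes 0 fps for it, which is how the extra features end up reflected in the final number.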
anime, you're right, I should have clarified what I meant. I consider "tweaking" to be running your system out of spec. I don't really consider software/BIOS optimizations to be tweaking; I consider them stock. For instance, I don't run my system if it is unstable at turbo RAM timings/CAS 2/4-way enabled. I won't go to CAS 2.5 or turn 4-way off just to bump up my FSB a few points. I also don't consider driver or AGP settings to be tweaks. Yes, the average user might never upgrade drivers or set their AGP settings to where they *should* be set, but to me, these are normal settings. If you have a 4x AGP card and you're running 2x AGP, you're not running stock specs; you're running under spec.
BujinZero, I haven't owned a GF2, only Voodoos prior to this GF3 Ti200, but I must say I am extremely impressed with this card's speed. That's not saying a whole lot, b/c the V5 struggled in FPS games at 32-bit color and resolutions above 1024. I never tried running any of the newer-gen games on it, but I'm sure its struggles would have been magnified.

Having seen the difference, I'm thinking the difference between a GF2 and a GF3 would be noticeable at higher resolutions in 32-bit color, as well as at lower resolutions in 32-bit color with AA on. You also might see differences in newer games that take advantage of the more robust GF3 and 8500 T&L engines. Basically, the GF3 and the 8500 make high-resolution and anti-aliased gaming a more viable option. No, you won't see a big difference between a GF3 and a GF2 in Quake3 at 16-bit and 1024, but the difference (in fps at least) might be significant at 32-bit and 1280.

Average fps is deceiving as well, b/c I'm a firm believer that anything over 70 fps isn't very noticeable at all. However, in a scene where your frames start to stutter at 30 fps, it becomes very noticeable. So even though a GF3 at 120 avg fps doesn't seem that noticeable over a GF2 at 70 avg fps, you need to take into consideration the lowest framerate within that average, b/c slowdowns will be very noticeable.
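To illustrate that last point, here's a minimal sketch with made-up frame times (none of these numbers come from actual hardware) of why two runs with healthy-looking averages can feel very different once you look at the worst frames:

```python
# Illustrative only: two runs with decent average fps can feel very different
# if one dips into slideshow territory during heavy scenes.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) for a run, from per-frame times in ms."""
    avg_frame = sum(frame_times_ms) / len(frame_times_ms)
    worst_frame = max(frame_times_ms)
    return 1000.0 / avg_frame, 1000.0 / worst_frame

# Hypothetical runs: mostly ~8 ms frames (~120 fps) vs mostly ~14 ms (~70 fps),
# each with one heavy scene near the end.
gf3_run = [8.0] * 95 + [20.0] * 5    # heavy scene still holds 50 fps
gf2_run = [14.0] * 95 + [33.0] * 5   # heavy scene drops to ~30 fps -- the stutter you notice

for name, run in [("GF3-ish", gf3_run), ("GF2-ish", gf2_run)]:
    avg, low = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, min {low:.0f} fps")
```

The averages come out around 116 vs 67 fps, but the number that matches what you actually feel in-game is the 50 vs 30 fps minimum.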
Chiz