Here are several reasons Mad Onion does not get my respect.
1. 3DMark 2001 does not allow the Geforce 2 chipset to score on the DirectX 8 tests, even though the card runs DirectX 8 games and demos fine.
2. There is a 25% difference in scores between 1000MHz and 1500MHz on the same system, yet frame rates in actual games show very little improvement on the faster setup (less than 10%).
3. I can fire up my computer on a cold morning, take off the case cover and put a fan there, go to 2.2V and 1620MHz, overclock my Geforce 2 Ti 500 to 300/540 (the heck with the artifacts), and get just one pass through without crashing. My score improves from 5500 to 5800, even though a second attempt will send my computer and video card crashing hard. Gee, now I can send my bogus score to Mad Onion for bragging rights, big deal!
Maybe I got the whole thing wrong, but isn't the whole point of a video card benchmark to compare the real-world performance of different video cards in different systems? Why can't they make a benchmark that incorporates a wide variety of games, both new and old, allows testing with and without FSAA, loops 5-10 times so you can't squeeze out one bullsh!t score, and includes an artifact tester? That would be a real score, and it would let you compare the real-world speed of older video cards against brand new ones.
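Something like this rough Python sketch is the kind of looping I have in mind. To be clear, run_pass, benchmark, and the FPS numbers in it are all made up by me for illustration, not anyone's real tool or API:

import random
import statistics

def run_pass(game: str, fsaa: bool) -> float:
    # Stand-in for running one timed demo loop; a real harness would
    # launch the game and measure actual frames per second.
    base = 35.0 if fsaa else 60.0
    return base + random.uniform(-2.0, 2.0)

def benchmark(games, loops=10):
    # Run each test many times and keep the minimum and median FPS, so one
    # lucky cold-boot pass can't produce an inflated one-off score.
    scores = {}
    for game in games:
        for fsaa in (False, True):
            fps = [run_pass(game, fsaa) for _ in range(loops)]
            scores[(game, fsaa)] = (min(fps), statistics.median(fps))
    return scores

if __name__ == "__main__":
    for (game, fsaa), (worst, typical) in benchmark(["old game", "new game"]).items():
        mode = "FSAA on" if fsaa else "FSAA off"
        print(f"{game} ({mode}): min {worst:.1f} fps, median {typical:.1f} fps")

Reporting the minimum alongside the median means a rig that only survives one lucky pass gets punished instead of rewarded.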
Sometimes I read about benchmarks to figure out if I want to buy a new video card and try to make an educated decision. But without a valid standard test, the only way to be sure is to buy the new card and test it against the old one in your own system, kind of an expensive way to find out.
Whaddya guys think, am I just a complaining fool?