Originally posted by: ShawnD1
Originally posted by: JBT
My old GeForce2 Ti 64MB scored around 5000 in 3DMark01. My laptop with a Mobility 9000 64MB, which is slower than a desktop 9200, gets 7000-8000; I forget the exact score since I don't usually benchmark it. Maybe I'll give it a run or two when I get home just to be sure.
I realize 3DMark sucks, but it's a decent benchmark for general system performance.
3DMark is the AOL of benchmarking, seriously. 3DMark is also flawed because it will give a card an incredibly low score if the card is unable to render a scene. I benchmarked both my GF2 Ti and my FX5200 in 3DMark 2001, and since some of the scenes could not be rendered on the GF2, it got an incredibly low score. 3DMark 2001 said the FX5200 was much, much faster than the GF2, but another program called GL Excess said the GF2 was faster. I tried playing Neverwinter Nights with both cards and found that the GF2 Ti really was faster than the FX5200, just as GL Excess said.
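He's right about the math, too: if the composite score is a weighted sum of per-test frame rates, one unrenderable test drags the total down no matter how fast the card is everywhere else. A toy sketch with made-up weights (NOT 3DMark 2001's actual scoring formula, just the failure mode):

```python
# Toy composite score: weighted sum of per-test average fps.
# Weights and fps numbers are invented to show the failure mode;
# this is NOT 3DMark 2001's real formula.
WEIGHTS = [10, 10, 10, 20]  # hypothetical: last test weighted heaviest

def composite(fps_per_test):
    return sum(w * fps for w, fps in zip(WEIGHTS, fps_per_test))

# GF2 Ti: faster in the three tests it can run, 0 on the scene
# that needs hardware features it lacks.
gf2_ti = [60, 55, 40, 0]
fx5200 = [45, 40, 30, 25]   # slower in every shared test, but runs all four

print(composite(gf2_ti))   # 1550
print(composite(fx5200))   # 1650 -- "wins" despite losing every shared test
```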
Please don't use NWN as a yardstick to measure vid card performance by. It's so biased towards OLD nvidia technology it's not funny.
For example, I've been using a GeForce256 DDR (geforce 1!) for the past 4 years or so(1). When I upgraded to a 9800 PRO 128MB 256-bit, my "feeling of performance" in NWN got WORSE, not better. Indeed, benchmarking(2) proved that I was only getting about 3-4 fps more in NWN with the spiffy new (& expensive) hardware at the exact same settings I had used previously! And a casual glance at the Bioware forums shows that other ATI R3xx hardware owners have an equally tough time with this game.
(1) Note that my overclocked GeForce256 DDR, an AGP 2x card, could probably trump the PCI-based GeForce2 MX that is the subject of this thread! Hilarious!
(2) my benchmarking of NWN: using Fraps to benchmark the introductory in-game cinematic of SoU, where all the kobolds attack that dwarf.
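For anyone who wants to do the same comparison: Fraps' benchmark mode can log a per-frame timestamp CSV, and it's easy to boil that down to average/min/max fps. A minimal sketch, assuming the "frametimes" CSV layout (header row, then one "frame number, timestamp in ms" line per frame); the filename below is hypothetical, as Fraps names the log after the game exe and capture date:

```python
# Sketch: turn a Fraps frametimes log into avg/min/max fps.
# Assumes the CSV has a header row followed by "frame, time (ms)" lines.
import csv

def fps_stats(path):
    # Read per-frame timestamps (milliseconds since capture start).
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if len(row) >= 2]

    # Per-frame durations: gaps between consecutive timestamps.
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    positive = [d for d in deltas if d > 0]  # guard against duplicate timestamps
    worst_fps = 1000.0 / max(positive)  # longest frame = lowest instantaneous fps
    best_fps = 1000.0 / min(positive)
    return avg_fps, worst_fps, best_fps

# Hypothetical filename for a NWN capture:
avg, worst, best = fps_stats("nwmain 2004-01-01 frametimes.csv")
print(f"avg {avg:.1f} fps, min {worst:.1f}, max {best:.1f}")
```

Min fps during the cinematic is the number to compare between cards; averages hide exactly the kind of stutter that makes NWN "feel" slower.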