32bit 3DMark

Deeko

Lifer
Jun 16, 2000
30,213
12
81
Ok, I'm gonna prove to the 3DMark supporters that I'm not just a stubborn idiot. I listened to what RobsTV and Ahfung said, so here's what I want you guys to do. Run 3DMark with the same settings as the default test, but at 1024x768x32 with 32-bit textures. I'm too lazy to put my full score here, but the overall was 3500...surprisingly only 400 lower than my "standard" default score. Let's see if your arguments hold up, guys :)
 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
I don't know what you're trying to prove in this thread, but the explanation for the only slightly differing 32-bit and 16-bit scores is simple. The 3DMark score consists largely of the two "game" benches, which have rather high polygon counts compared to even new T&L games. While the Voodoo5 has loads of bandwidth and fill rate to combat 1024x768x32bit, it is horrible at polygon throughput. This caps the framerate in the meaningful "game" tests, especially the medium- and high-polycount ones.
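jpprod's bottleneck argument can be sketched with a toy model: a frame can't finish until both the pixels are filled and the triangles are transformed, so frame time is the larger of the two costs, and raw fill rate stops mattering once triangle throughput is the cap. The rates and scene figures below are made-up illustrative numbers, not real Voodoo5 or 3DMark specs.

```python
# Toy bottleneck model (my own illustration, not 3DMark's internals):
# per-frame time is the larger of the fill-limited time and the
# geometry-limited time.

def fps(pixels_per_frame, fill_rate, tris_per_frame, tri_rate):
    fill_time = pixels_per_frame / fill_rate   # seconds to fill pixels
    geom_time = tris_per_frame / tri_rate      # seconds to process triangles
    return 1.0 / max(fill_time, geom_time)

# Made-up numbers: 1024x768 with 3x overdraw, a 30k-triangle scene,
# huge fill rate but modest triangle throughput.
pixels = 1024 * 768 * 3
print(round(fps(pixels, 733e6, 30_000, 6e6)))  # geometry-limited: 200
```

With a high-polygon scene the triangle term dominates and extra fill rate buys nothing, which is why a card strong in bandwidth but weak in T&L loses so little going to 32-bit here.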
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
They tell me that 3DMark scores don't look the way you'd think they should because in 16-bit the cards aren't bandwidth limited, while in 32-bit they are, so the 32-bit scores should look more realistic or whatever. So I wanted to see if what they said is true, or if 3DMark really is the piece of trash I've been saying it is.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0

Deeko wrote in part: "Ok, I'm gonna prove to the 3DMark supporters that I'm not just a stubborn idiot."


No need to post full scores, as the only thing the scores matter for is comparing identical systems.
O.K., I ran the test in 32bit, and it was just as expected.
3DMark showed me that my system was optimized at 32bit.
It ran at 98% of what other "identical" systems run at in 32bit.

So what's your point?? If it is proving that 3DMark actually does accurately compare identical systems, then you have succeeded. No work is needed here on my machine. But on another test machine that scored lower, I did discover that somehow DRAM Bank Interleave had gone from 4-way to disabled in the BIOS. After adjusting the BIOS to the correct setting, sure enough, that machine also scored the same as identical systems.

Give it a rest. This program is not an indicator of which video card is best. No one is saying their video card is better because it scores higher. If your system scores 100, and other identical systems score 98 or 105, then yours looks good! If yours scores 80, though, then you can improve it. This program will help you do just that.

BTW, ATI's 16-bit performance is well documented as poor, and 32-bit is where it shines. This is attested to by the fact that your card lost about 10% in 3DMark when going from 16-bit to 32-bit. In comparison, my nVidia card lost about 25% when going from 16-bit to 32-bit. Again, these results are what was expected, and look very accurate.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Duron 650
GeForce DDR 130/301
(I honestly forget which driver rev I'm using, let me check....... 6.47)

1024x768 16bit - 4411

1024x768 32bit - 3555



1600x1024 16bit - 2860

1600x1024 32bit - 1733
Edit-

1600x1024 32bit - 1892 Clocked at 130/340/Edit

Those 1600 scores are not typos. I have a 32MB board and it refuses to run 1600x1200 32-bit, so I dropped to the highest res I could run in 32-bit for comparison. I'm taking a ~850 point hit at 10x7 and over 1100 at 1600x1024. Of course, I have slightly less bandwidth than you do (~10%), so no surprise there. Try running at the two higher resolutions I have listed; odds are you should edge me out in 32-bit at the very least :)
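For anyone who wants those drops as percentages rather than raw points, a quick sketch using the GeForce DDR scores from the post above:

```python
def drop(score_16bit, score_32bit):
    """Absolute and percentage score loss going from 16-bit to 32-bit colour."""
    points = score_16bit - score_32bit
    percent = 100.0 * points / score_16bit
    return points, percent

# BenSkywalker's GeForce DDR numbers from this post
for label, s16, s32 in [("1024x768", 4411, 3555), ("1600x1024", 2860, 1733)]:
    points, percent = drop(s16, s32)
    print(f"{label}: -{points} points ({percent:.1f}%)")
# 1024x768: -856 points (19.4%)
# 1600x1024: -1127 points (39.4%)
```

The relative hit nearly doubles at the higher resolution, consistent with the card becoming bandwidth limited in 32-bit.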