3DMark 2000 is TOO inaccurate! Take a look

ndee

Lifer
Jul 18, 2000
12,680
1
0
Hey there,
yesterday I ran 3DMark 2000 with only the CPU Speed test enabled. With my CPU at 714MHz, the CPU Speed score was 140 points. Then I overclocked my CPU to 742MHz, and the CPU Speed score dropped to 119. Now somebody has to tell me how this is possible!
 

Soulflare

Golden Member
Apr 16, 2000
1,801
0
0
I think the CPU test is buggy. With my system (P3-700E, 256Mb Micron
PC-133 CAS2 SDRAM, Asus P3B-F, ATI AIW 128 32Mb) that test will only
be completed if I enable every other test. Turning off even one test,
such as the 64Mb one, will cause the CPU test to fail every time. :|


BTW: My 3DMark score was 2007 (800x600, 32-Bit Color)
CPU Marks came in around 240 (1024x768, 32-Bit Color)
The CPU test never worked for me at 800x600.


 

jpprod

Platinum Member
Nov 18, 1999
2,373
0
0
The 3DMark CPU test doesn't test your CPU speed directly; it tests theoretical speed in games that aren't fillrate limited but CPU/T&L limited. If your PCI, AGP or memory bus speed is higher at 714MHz than at 742MHz (or some other settings differ), it is possible that your V3 gets better polygon throughput in the test with the lower CPU speed.
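jpprod's point about bus speeds can be sketched with some quick arithmetic. All the numbers here are assumptions for illustration only: a fixed 7x multiplier (so 714MHz = 7 x 102MHz FSB and 742MHz = 7 x 106MHz FSB), and a hypothetical board that steps its PCI divider from /3 to /4 at the higher FSB, which would leave the PCI bus *slower* after the overclock:

```python
# Sketch: how a CPU overclock can change the derived bus clocks.
# Assumed: 7x multiplier; PCI and AGP clocks come from the FSB via
# fixed dividers. These dividers and FSB values are hypothetical.

def derived_clocks(fsb_mhz, mult=7, pci_div=3.0, agp_div=1.5):
    """Return (CPU, PCI, AGP) clocks in MHz for the given FSB and dividers."""
    return (fsb_mhz * mult, fsb_mhz / pci_div, fsb_mhz / agp_div)

# 714MHz config: 102MHz FSB, PCI divider /3 -> PCI at 34MHz
print(derived_clocks(102))
# 742MHz config: 106MHz FSB, but if the board switches to a /4
# divider, PCI drops to 26.5MHz -- slower buses despite a faster CPU.
print(derived_clocks(106, pci_div=4.0))
```

If the card sits on a bus that just got slower, a throughput-bound test score can fall even though the CPU clock went up.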
 

dkozloski

Diamond Member
Oct 9, 1999
3,005
0
76
ndee, the CPU speed test doesn't test the CPU speed directly. It tests video functions that are affected by CPU speed. An increase in CPU clocking may adversely affect video functions, and this is reported as a reduced CPU speed score.
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
I've been saying this for how long now? It's the worst benchmark ever. NEVER EVER use it for anything other than a stress test, it's actually good at that.
 

LocutusX

Diamond Member
Oct 9, 1999
3,061
0
0
That's right, what we need is a good Direct3D game with a well-implemented benchmarking feature.
 

dawks

Diamond Member
Oct 9, 1999
5,071
2
81
Hehehe. Good luck finding a good D3D game/engine. OpenGL is where it's at.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
DaZ, maybe that's your opinion, but I can name plenty of good D3D games. In fact, about the only thing gaming under OpenGL is good for is first person shooters; anything outside of that is purely the realm of D3D.
I don't even OWN a single OpenGL game, never have in fact. And I've only ever played under OpenGL a few times. All the games I own are either D3D or hybrid D3D/Glide.
 

PG

Diamond Member
Oct 25, 1999
3,426
44
91
I don't understand my 3DMark 2000 scores either.
I have a 333 celeron, TNT2 M64 PCI, and 160 MB memory. I am using the 6.18 drivers. They seem to work the best for me.
At 1024 X 768, 16 bit color, I get fill rates of about 140 for single texture and 199 for multi-texturing. The 64MB texture rendering speed is only about 1 FPS.
When I keep the resolution the same and try 32 bit color, my fill rate drops to about half: about 55 for single and 101 for multi. But then for the 64MB texture rendering speed I get 23 FPS. How does this make sense? Am I missing something?

I have tried this over and over with different resolutions and I always get the same type of results when I compare 16 bit color to 32.

PG
 

PG

Diamond Member
Oct 25, 1999
3,426
44
91
Wow, this got buried fast.

What does everyone think of my results? Is 3Dmark messed up or can it be some of my system settings?

PG
 
Jun 18, 2000
11,198
771
126
Well, your fill rate should get cut in half because of memory bandwidth constraints. In a perfect, non-bandwidth-limited world, the 16 and 32 bit scores would be the same.

Dunno about the texture rendering speed. That is quite the tough nut to crack. (lol nut)
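The halving can be shown with a back-of-envelope bandwidth ceiling. The numbers are assumptions for illustration (a TNT2 M64-class card with a 64-bit memory bus at 143MHz SDR, and a pixel cost of color write + Z write, ignoring overdraw and texture reads, so this is a deliberately crude upper bound):

```python
# Crude fill-rate ceiling from memory bandwidth alone.
# Assumed card: 64-bit bus at 143MHz SDR (TNT2 M64-class figures).

def fillrate_limit_mpix(mem_clock_mhz, bus_bits, bytes_per_pixel):
    """Upper bound on fill rate (Mpixels/s) if memory bandwidth is the only limit."""
    bandwidth_mb_s = mem_clock_mhz * (bus_bits / 8)   # MB/s
    return bandwidth_mb_s / bytes_per_pixel

# 16-bit color: 2B color + 2B Z = 4 bytes/pixel
# 32-bit color: 4B color + 4B Z = 8 bytes/pixel -> exactly half the ceiling
print(fillrate_limit_mpix(143, 64, 4))
print(fillrate_limit_mpix(143, 64, 8))
```

Doubling the bytes written per pixel halves the ceiling, which matches the roughly 2:1 gap PG sees between 16-bit and 32-bit fill rates.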