Hey guys,
I have a friend who has an Athlon XP 1800+ on an Abit KR-7 mbd (KT266A chipset) w/512MB of Crucial RAM and a GeForce 3 card. He is running 2k Server. I have a 1400 Athlon on an Abit KG-7 mbd (older AMD chipset), also w/512MB of Crucial RAM and an identical GF3 card. When I run the 3DMark 2001 default benchmark w/o overclocking anything, I get about 6650 3DMarks. We both use the same Detonator drivers and the same VIA 4-in-1s, yet his system only gets about 6600 3DMarks w/o overclocking anything. I'd contend that (b/c of the KT266A chipset and the newer, faster XP processor) he should be getting around 7000 3DMarks w/o overclocking anything.
My question is this:
Is his lower 3DMark score attributable to the way 2k Server allocates the processor and memory? I have been racking my brain trying to figure this one out.
:disgust:
