I recently got my computer back up and had been tweaking/overclocking it here and there to see what I could squeeze out of my system. For simplicity, I used 3DMark2001SE to compare my scores, and I found an odd discrepancy. Look at the following scores:
DDR433 w/360FSB - 15733 *both of these scores with an ~2.16ghz clock speed*
DDR346 w/346FSB - 15988
Maybe this makes sense to you, but to me it seemed weird. The higher memory and FSB speeds, albeit run asynchronously, lost by a good 250 points, which is a decent amount for a small system tweak, since your video card accounts for most of your points in 3DMark. So, to satisfy my curiosity, I started running benches under all sorts of FSB, CPU, and memory settings. My main area of interest was roughly the 333-400 FSB range. Here is my second series of scores:
DDR400 w/333FSB - 15300 *2.08ghz clock speed*
DDR333 w/400FSB - 15357 *2.1ghz clock speed*
DDR333 w/333FSB - 15646 *2.08ghz clock speed*
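To put exact numbers on the gaps in both score lists above, here's a trivial Python check (the labels are just shorthand for the settings quoted):

```python
# 3DMark2001SE scores from the runs above.
# First series: synchronous 346/346 vs asynchronous 433 memory on a 360 FSB.
gap_first = 15988 - 15733
print(gap_first)  # 255 points in favor of the synchronous run

# Second series: each asynchronous run vs the fully synchronous 333/333 run.
sync_333 = 15646
print(sync_333 - 15300)  # 346 points vs DDR400 w/333FSB
print(sync_333 - 15357)  # 289 points vs DDR333 w/400FSB
```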
I would have thought that, despite the top two scores being run asynchronously, the extra memory speed or FSB speed, respectively, would have beaten a completely lower 333MHz memory and FSB. Now we're talking roughly a 300-point difference. This made me think: I know some people with slower CPUs but good memory run asynchronously with their memory at a higher rating, but this test suggests that hurts gaming performance rather than helping it. Still, I knew 3DMark wasn't exactly a reliable tool, so I moved on to benching a real game, Far Cry. I tried to keep the recording time and number of frames as close as possible between runs, and I used the same point in the game for every test to keep the results comparable.
2004-05-08 16:43:36 - FarCry DDR333 FSB333 2.08ghz
Frames: 2077 - Time: 30703ms - Avg: 67.648 - Min: 30 - Max: 80
2004-05-08 16:51:06 - FarCry DDR400 FSB333 2.08ghz
Frames: 2050 - Time: 29297ms - Avg: 69.973 - Min: 58 - Max: 82
2004-05-08 16:58:15 - FarCry DDR333 FSB400 2.1ghz
Frames: 1966 - Time: 30046ms - Avg: 65.433 - Min: 24 - Max: 80
2004-05-08 17:03:40 - FarCry DDR400 FSB400 2.1ghz
Frames: 2215 - Time: 30937ms - Avg: 71.597 - Min: 38 - Max: 82
This showed the opposite trend from 3DMark: running higher memory speeds helped performance marginally, which is what one would assume. I took another bench in the middle of tons of action just for fun, to show a more "true" FPS value you'd expect when actually playing the game. My previous benches were a slightly more relaxed run through the trees and off a cliff.
2004-05-08 17:13:33 - FarCry DDR433 FSB433 2.16ghz
Frames: 3883 - Time: 67891ms - Avg: 57.194 - Min: 38 - Max: 94
So... I'm not sure what to conclude. The 3DMark and Far Cry benches say the exact opposite things. Should I do more testing, or let this lie? Thoughts, comments?