
Why do people live and die by 3DMark to bench their systems?

For one, the bot match tests themselves cannot be used to accurately compare systems. The bot match tests utilize "bots," which are players controlled by AI, so CPU power is very important here. Variations in what the CPU is doing while running the bot match test can actually change the actions of the bots in such a match. So much so, in fact, that bot match benchmarks run back to back on the same machine can yield different results. The other problem is that the resultant score is based on the average of both flyby benchmarks.

I don't like UT2K3 because it DOESN'T show real-world results. The fps you get in a flyby is worthless, because you don't play a flyby. The botmatch is worthless because it's so limited. It will be completely different from what you will see in actual gameplay. Try running the fps counter while playing and you'll see what I mean.
Not to mention that 9700 owners report higher scores in actual gameplay than when they benchmark, while Nvidia owners report lower scores in gameplay compared to the benchmark.

Don't get me wrong. I use UT2k3, 3DMark, SiSoft Sandra, etc. Why? Because I can. And nobody has a perfect benchmark, so I use them all. Is it a waste of time? Of course it is. So is using a computer for playing games.
 
I suppose you'll say that UT2k3 is a good benchmark. A good benchmark for what?
UT is a game that you PLAY. You don't play 3DMark. There is no PERFECT benchmark; the best indicator is to run a program that measures and records your fps in a real game, but the problem with that is that factors change each time you play. A benchmark in a game will always be more useful than a synthetic benchmark, especially an inaccurate one.

As I see it, there are no excuses for flaming people, especially when they're posting reasonable questions. And questions about 3DMark scores are reasonable, whatever you say.
Back in the day, I quietly informed the posters that we can't really help them based on their 3DMark score, and that they should run some game benchmarks so we could get a better idea and then help them with their problems. People didn't respond to that. Nowadays I yell at them. Guess what? No whiny 3DMark posts in a while. Ya gotta do what ya gotta do.
 
NFS4, I don't get it either. The most important thing with any video card is the minimum frame rate in games! I want high minimum frame rates in all my games, which would be 30 to 60 fps in the most heavy battle/fighting scenes. I don't care about maximum fps. I care some about average fps, but minimum is the most important.


 
You don't play 3DMark



1) Registered owners of 3DMark2k1 SE CAN play 3DMark. It has a game embedded in the benchmark where you can drive futuristic trucks around a dirt track. Just to set the record straight... 3DMark can be played.
 
Originally posted by: imtim83
NFS4, I don't get it either. The most important thing with any video card is the minimum frame rate in games! I want high minimum frame rates in all my games, which would be 30 to 60 fps in the most heavy battle/fighting scenes. I don't care about maximum fps. I care some about average fps, but minimum is the most important.

Oh Jesus, I was starting to wonder when people were going to start bantering this around. ONE website posts a benchmark displaying P4s showing lower minimum frame rates than Athlons, and now people are coming out of the woodwork ranting about how important minimum frame rates are..... sheesh 😕
 
Originally posted by: fkloster
Originally posted by: imtim83
NFS4, I don't get it either. The most important thing with any video card is the minimum frame rate in games! I want high minimum frame rates in all my games, which would be 30 to 60 fps in the most heavy battle/fighting scenes. I don't care about maximum fps. I care some about average fps, but minimum is the most important.

Oh Jesus, I was starting to wonder when people were going to start bantering this around. ONE website posts a benchmark displaying P4s showing lower minimum frame rates than Athlons, and now people are coming out of the woodwork ranting about how important minimum frame rates are..... sheesh 😕

Minimum fps has always been important; it's just rarely reported or even considered by reviewers.
 
Originally posted by: sandorski
Originally posted by: fkloster
Originally posted by: imtim83
NFS4, I don't get it either. The most important thing with any video card is the minimum frame rate in games! I want high minimum frame rates in all my games, which would be 30 to 60 fps in the most heavy battle/fighting scenes. I don't care about maximum fps. I care some about average fps, but minimum is the most important.

Oh Jesus, I was starting to wonder when people were going to start bantering this around. ONE website posts a benchmark displaying P4s showing lower minimum frame rates than Athlons, and now people are coming out of the woodwork ranting about how important minimum frame rates are..... sheesh 😕

Minimum fps has always been important; it's just rarely reported or even considered by reviewers.


...And the reason for that would be....what?

 
UT is a game that you PLAY. You don't play 3DMark. There is no PERFECT benchmark; the best indicator is to run a program that measures and records your fps in a real game, but the problem with that is that factors change each time you play. A benchmark in a game will always be more useful than a synthetic benchmark, especially an inaccurate one.

Thank you. You just confirmed why the UT2k3 benchmark is just as worthless as 3DMark.

Yes, you can play UT2k3, but you can't play the UT2k3 benchmark. Not to mention that the benchmark doesn't even portray real gameplay.
 
The reasons why the minimum framerate is important should be fairly obvious IMO.
Weakest link in a chain and all that.

If I'm scoring an average of 100 with a peak of 250, I'm not gonna see any difference between the average and the peak, 'cause both are more than sufficient for smooth gameplay; however, I can definitely tell the difference if the framerate drops down to 30 all of a sudden.
All I care about in the end is that the game stays playable at all times, rather than running extremely high with little dips here and there, which is why the min framerate is more important than the average framerate IMO.

Note that I'm not saying your beloved P4 is worse than the Athlon or anything; for all I know it could very well have just been a fluke. In fact, I do believe that's the case.
 
Am I the only one here that is astonished @ how ridiculous a 'minimum frame rate' speculation would be? I mean, what would cause one machine to have a 3 fps lower minimum frame rate than another.... 😕 How would 'one' even go about 'controlling' for minimum frame rate?
 
Originally posted by: fkloster
Am I the only one here that is astonished @ how ridiculous a 'minimum frame rate' speculation would be? I mean, what would cause one machine to have a 3 fps lower minimum frame rate than another.... 😕 How would 'one' even go about 'controlling' for minimum frame rate?

That's roughly what I said. I don't believe it's due to the P4 being any worse than the AXP, but on the other hand I don't know, and neither do you (taking a guess here).
Someone knowledgeable (such as Ace's) should have a look at it and try to come to a well-founded conclusion.
 
3DMark is not useless. Sandra is not useless (the functioning modules contained within, that is). UT2003 bench is not worthless. The old UTbench is not worthless. The Quake 3 NV15 demo is not worthless. The JKIIffa is not worthless. ScienceMark (probably my favorite) is not worthless. SPEC2000 is not worthless.

Why are these not worthless? Simple. They are standard benchmarks that can be repeated with little variation between runs. As such, they are extremely useful for comparing and contrasting the performance of different systems, configurations, tweaks, and overclocks. Without these STANDARD benchmarks that can be repeated, there would be no basis for comparison other than clock cycles, FLOPS, MIPS, texels/second, triangles/second, pixels/second, and other numbers that have little meaning. This is why we have benchmarks. This is why they are not a waste of time. This is why they are to be taken seriously.
 
Actually, I thought of another good reason to use 3DMark. It gives people a reasonable idea as to how games that make use of the future DX standard will run on their current system. If I remember correctly, 3DMark 2001 came out before any game that supported DX8. However, I still agree that people should not "live and die" by their 3DMark scores. People need to start using multiple benchmarks. That will give them a better idea as to how their system performance will be affected all around.
 
I use 3DMark all the time when overclocking my system. It is very good for making sure things are stable. If I can run through it, the chances are my system will run well with pretty much anything else. It's also good because it's a benchmark that everyone has. I can tell my bud my benchmark for UT2003, but if he does not have it, then there's nothing to compare it to. It's an easy and pretty accurate benchmark that is, most importantly, free. But I must admit some people get mad over some pretty stupid stuff when it comes to benchmarks.
 
Originally posted by: Daovonnaex
3DMark is not useless. Sandra is not useless (the functioning modules contained within, that is). UT2003 bench is not worthless. The old UTbench is not worthless. The Quake 3 NV15 demo is not worthless. The JKIIffa is not worthless. ScienceMark (probably my favorite) is not worthless. SPEC2000 is not worthless.

Why are these not worthless? Simple. They are standard benchmarks that can be repeated with little variation between runs. As such, they are extremely useful for comparing and contrasting the performance of different systems, configurations, tweaks, and overclocks. Without these STANDARD benchmarks that can be repeated, there would be no basis for comparison other than clock cycles, FLOPS, MIPS, texels/second, triangles/second, pixels/second, and other numbers that have little meaning. This is why we have benchmarks. This is why they are not a waste of time. This is why they are to be taken seriously.

Exactly why people shouldn't be flamed for using them.
 
How did you guys get into a discussion about the P4 compared to the XP? Flokster, you just pulled that out of your ass. He was not mentioning anything having to do with that review where the XP got a higher minimum FPS. You somehow associate it with that, and I don't see how. However, minimum framerate is important. This is because after a certain number of frames (about 80-90) the human eye cannot capture any more frames than that. Therefore, it doesn't matter what the average framerate is, as long as it stays at that level or somewhere close to it. However, the minimum framerate does tend to drop below playable levels a lot of the time. If one can maintain the minimum framerate at a playable level, then the entire game will be playable.
Am I the only one here that is astonished @ how ridiculous a 'minimum frame rate' speculation would be? I mean, what would cause one machine to have a 3 fps lower minimum frame rate than another.... 😕 How would 'one' even go about 'controlling' for minimum frame rate?
Why, when talking about minimum framerates, do you always assume the differences are as small as 3FPS? I have seen you do this same thing in another thread. The same reasons the average performance changes are the same reasons the minimum framerates change. Something is bottlenecking the system, just more so. What if the average FPS is only a 3FPS difference? Then no one cares. However, if you can increase the average by 10FPS, it starts to turn some heads. If the minimum FPS can be increased by 10, the result is the same. Increasing the average framerate should increase the minimum framerate as well, through some type of performance enhancement (tweak, hardware upgrade, etc.). Stop pretending minimum framerate doesn't matter just because you want to protect your precious P4 from doing worse than it should in ONE benchmark when compared to an Athlon XP. Don't worry about it; the P4 has plenty of strengths, and one benchmark isn't going to change anyone's mind. Now, if every benchmark yielded the same results, then there might be more to this fluke (I think it is) than meets the eye.
 
Stop pretending minimum framerate doesn't matter just because you want to protect your precious P4 from doing worse than it should in ONE benchmark when compared to an Athlon XP.

.... (taps finger)


I could really give a damn if a P4 has a lower 'minimum' frame rate (ridiculous) than an Athlon XP. The point I am trying to make, in vain, is that once 'one' ESTABLISHES that one 'system' renders a lower minimum frame rate but higher average frame rate than another 'system', then what does one do? How is it corrected? And more importantly.... HOW DOES ONE SYSTEM OFFER A LOWER MINIMUM FRAME RATE THAN ANOTHER even when the average frame rates remain the same???? 😕 (hmmm, reading my post, I find that it's very difficult trying to explain what my problem is) I mean, example....:

1) P4 + 9700pro + all current drivers and default settings running same app = ave fps: 50, high fps: 78, min fps: 4
2) Athy + 9700pro + all current drivers and default settings running same app = ave fps: 48, high fps: 79, min fps: 8

?????? big deal?
 
I could really give a damn if a P4 has a lower 'minimum' frame rate (ridiculous) than an Athlon XP. The point I am trying to make, in vain, is that once 'one' ESTABLISHES that one 'system' renders a lower minimum frame rate but higher average frame rate than another 'system', then what does one do? How is it corrected? and more importantly.... HOW DOES ONE SYSTEM OFFER LOWER MINIMUM FRAME RATE THAN ANOTHER even when the average frame rates remain the same???? 😕
First, I apologize about misspelling your name in my last post... Anyway, all it says is that the game's FPS do not spike as low, but they don't spike as high either. Minimum framerates occur in very intensive situations. There are certain breaking points with various pieces of hardware where the performance begins to drop dramatically. When the L1/L2 caches are filled up on a CPU for example. Or, when the textures are so detailed that the video card can no longer render things in one pass. Whatever the case may be, abnormally low minimum framerates occur when something like this happens.
 
How does one get the most electoral votes while losing the popular vote? Sorry, that was a cheap one 🙂

But the answer is simple... the average of 5 and 7 is 6. The average of 2 and 10 is also 6. If you have a 120-second timedemo doing an average of 60FPS, then a 2-second 'glitch' where the framerate drops to 15FPS isn't going to affect the average by a lot, depending on how it is measured. Only if it drops to 15 for a longer period of time will it have an impact on the final score.
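The arithmetic above is easy to check with a quick sketch (the 120-second run and the 2-second dip are made-up numbers for illustration, not from any real benchmark):

```python
# Assumed per-second FPS samples: 118 seconds at a steady 60 FPS,
# plus a 2-second 'glitch' where the framerate drops to 15 FPS.
normal = [60.0] * 118
dip = [15.0] * 2
samples = normal + dip

average = sum(samples) / len(samples)
minimum = min(samples)

print(f"average: {average:.2f} FPS")  # 59.25 -- barely below 60
print(f"minimum: {minimum:.2f} FPS")  # 15.00 -- the dip dominates this metric
```

So a brief stutter that would be very noticeable in play moves the average by less than one frame per second, while the minimum reflects it fully.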
 
Just to shut fkloster up: a specific system can have a lower minimum frame rate than another system but still have the same average frame rate. How? By having a higher maximum frame rate. Also, there's no way a P4 can have an inherently lower minimum frame rate than an Athlon. The P4 has a larger, faster, and better-optimized L2 cache (check the ScienceMark benchmarks if you don't believe me) as well as more memory bandwidth.
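A toy example makes the same-average, different-minimum point concrete (the sample values are invented, not measured from any system):

```python
# Hypothetical per-second FPS samples for two systems:
system_a = [30, 60, 90]   # lower minimum, but a higher peak
system_b = [40, 60, 80]   # tighter spread around the same average

avg_a = sum(system_a) / len(system_a)
avg_b = sum(system_b) / len(system_b)

print(avg_a, avg_b)                   # both 60.0: identical averages
print(min(system_a) < min(system_b))  # True: system A dips lower
print(max(system_a) > max(system_b))  # True: system A peaks higher
```

The higher peak exactly compensates for the deeper dip, so the average alone can't distinguish the two systems.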
 
It should also be said that measuring the minimum framerate can be a real hassle due to the OS. If something (like anything) happens in a background thread while you are running the benchmark, it can affect the min fps number, making it artificially low. Anyone who has run the HD benchmark HD Tach will know what I mean.
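One way to see why the raw minimum is so fragile: a single OS-induced hitch sets the minimum all by itself, while a low-percentile figure ignores one-off outliers. This is just a sketch with invented numbers, and the 5% cutoff is an arbitrary choice:

```python
# Assumed data: a steady 60 FPS run where one sample out of 100
# is ruined by a background task (e.g. a disk flush).
samples = [60.0] * 99 + [8.0]

raw_min = min(samples)

# Take the value 5% of the way up the sorted samples instead of the
# absolute minimum, so a single outlier cannot dominate the result.
low_5pct = sorted(samples)[int(0.05 * len(samples))]

print(raw_min)   # 8.0  -- set entirely by the single hitch
print(low_5pct)  # 60.0 -- reflects how the run actually felt
```

The percentile figure is a more robust summary of worst-case behavior when the OS can inject noise into any single sample.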
 