
Why do people live and die by 3DMark to bench their systems?

Time is also an important variable to consider. As in the previous example, a computer running for 10 seconds at 5 to 7 fps will yield a 6 fps average. However, a computer running at 5-9 fps that drops down to 3 fps even once or twice during that 10-second span can end up with approximately the same average. My math may be slightly off as I didn't bust out a calculator, but I think you get the idea. I hope I didn't just state the obvious, but I think it adds simplicity to what andreasl was explaining.

Chiz
 
Originally posted by: chizow
Time is also an important variable to consider. As in the previous example, a computer running for 10 seconds at 5 to 7 fps will yield a 6 fps average. However, a computer running at 5-9 fps that drops down to 3 fps even once or twice during that 10-second span can end up with approximately the same average. My math may be slightly off as I didn't bust out a calculator, but I think you get the idea. I hope I didn't just state the obvious, but I think it adds simplicity to what andreasl was explaining.

Chiz
Inconsequential. 3DMark takes the same time on any system, unless the system crashes, hangs, etc.
 
I agree that it is great for stability tests. I couldn't care less about a few hundred marks, but the overall scores help in determining whether a particular hardware upgrade was worth it. I don't know about most people, but seeing my scores rise from less than 8K to over 12K, and feeling the difference in games, is a real-world chart of my hardware's performance.

 
I guess I wasn't clear enough. Time allows people to forget about minimum fps, because the average fps will essentially CONCEAL any big drops in framerate over the course of a run, but it can't trick the eye when you are actually VIEWING the drop as it happens. 😕

2 machines run the same demo, for the same amount of time, and end up with the same average fps.

Let's say, for argument's sake, that machine 1 kept an absolute framerate of 30 fps, and of course ended up with a 30 fps average.

Now, if machine 2 alternates between 20 and 40 fps every time it is polled, it'll end up with 30 fps as well.

If you aren't in a coma from epileptic seizures, I'm pretty sure you'd be convinced that minimum framerates are in fact important but, as someone said, too often overlooked.

Chiz
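Chiz's two-machine example is easy to verify numerically; here's a minimal Python sketch with made-up per-second samples:

```python
# Two hypothetical machines sampled once per second over a 10-second demo.
machine1 = [30] * 10       # rock-steady 30 fps the whole way through
machine2 = [20, 40] * 5    # alternates between 20 and 40 fps on every poll

for name, samples in (("machine1", machine1), ("machine2", machine2)):
    avg = sum(samples) / len(samples)
    print(f"{name}: avg = {avg:.0f} fps, min = {min(samples)} fps")
```

Both machines report a 30 fps average, but machine 2 bottoms out at 20 fps, and that worst case is what you actually see on screen.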
 
One thing I have suggested to FPS-freaks a few times is to reverse your measurement: instead of counting frames per second, measure time per frame 😉 Doing so would fix the problem that very short game stalls don't really show up in the measurement (a half-second pause, when the other half of the second still renders 30 frames at a 60 fps pace, still reads as 30 FPS minimum/average even though it's anything but pleasant). I think a millisecond should be good enough for timing frame drawing: 100 FPS would equal 10 MSPF (ms per frame), 50 FPS would equal 20 MSPF, and so on. The other thing is that to get a better idea of system performance and smoothness, frame rate should be presented as a graph over the whole test scene (like some memory measurement programs do), not just as one or a few numbers. There was one 3DMark-like OpenGL tester named Vulpine GLmark that did so; IMO a very good idea/approach.
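Priit's frame-time idea can be sketched with a toy log (all numbers invented): a half-second stall is nearly invisible in the per-second FPS figure but jumps out in milliseconds per frame.

```python
# Hypothetical frame-time log for one second of gameplay, in milliseconds:
# 29 smooth frames at ~17 ms each (roughly 60 fps pace) plus one 500 ms stall.
frame_times_ms = [17.2] * 29 + [500.0]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)  # frames per second
worst_mspf = max(frame_times_ms)                            # worst single frame

print(f"average: {avg_fps:.0f} fps")        # ~30 fps, which merely looks slow
print(f"worst frame: {worst_mspf:.0f} ms")  # 500 ms, an obvious hitch
```

Graphing `frame_times_ms` over the whole run, as Priit suggests, would make the spike impossible to miss.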

About 3DMark/Sandra scores: I have never cared about them; those are just random numbers to me (a mine-is-bigger-than-yours thingie 😉). I prefer performance figures from real applications instead. IMO people are overreacting with all this FPS-chasing, flogging their hardware just to get some bigger random numbers out of some programs. Instead of endless loops of 3DMark, I'd prefer actual gameplay without even an FPS counter running in the corner (because it's distracting). If a game feels too choppy, drop to a lower resolution, disable some eye candy, or buy faster hardware: as easy as that. Why build yourself a psychological barrier by staring at the FPS counter and making up problems for yourself...

Now, flame me 😉
 
The thing about 3DMark and Sandra is that they're fairly good indicators of relative system performance. I don't agree with the term "Synthetic Benchmarks" - they are both real programs, and they both make a computer do real work.

What is stupid, IMHO, is to say anything beyond "System A has a higher 3DMark than System B" when using data obtained from running these benches. You simply can't extrapolate anything about overall system performance from a single benchmark. Similarly, it's silly to claim that A is a better machine than B because it has a higher min/max/average FPS in UT2003 or anything else.

As far as the idea of measuring MSPF goes (as explained by Priit above), how does that help us in any way? The MSPF is simply the reciprocal of the FPS. You're not producing any new data by measuring like that.

I think the bottom line is that all benchmarks are useful in giving us some idea about the relative speeds of two or more machines. It's silly to expect them to equate with real world performance, but then you can't benchmark real-world performance.
 
Somebody explain this to me: I used 3DMark on a KT133A system with a 1GHz Athlon and scored higher than the same video card and drivers on an i845G with a 2.4GHz P4. Why the higher score on the slower system, hmmm?

This really points out why I don't pay attention to 3DMark scores.
 
I don't care about the 3DMark points myself. As long as the system is stable and plays games the way I want, that's all that matters to me, and once gameplay gets slow, it's time to upgrade some stuff in the system. Although now and then I like to buy a new toy for my system. Just two days ago I bought myself a Toshiba 6X/32X DVD-ROM for $10. I just could not pass up that deal. 😀
 
Somebody explain this to me: I used 3DMark on a KT133A system with a 1GHz Athlon and scored higher than the same video card and drivers on an i845G with a 2.4GHz P4. Why the higher score on the slower system, hmmm?

This is exactly why benchmarks should be used: to test a system to find out if something is wrong, or just to gauge system performance. 3DMark shouldn't be used exclusively, but it is still a good general indicator.

What you described would throw up a red flag on any respectable review site. Either something wasn't configured right or some driver is missing (such as the correct Intel chipset drivers), or there's a bad piece of hardware, such as the video card, motherboard, RAM, etc.

And don't tell me it's because of 3DMark. I have yet to see ONE review site show a lower 3DMark score after increasing the speed of the processor and RAM. So unless every site out on the web is wrong, you must have a problem.

 
Originally posted by: Dug
Somebody explain this to me: I used 3DMark on a KT133A system with a 1GHz Athlon and scored higher than the same video card and drivers on an i845G with a 2.4GHz P4. Why the higher score on the slower system, hmmm?

And don't tell me it's because of 3DMark. I have yet to see ONE review site show a lower 3DMark score after increasing the speed of the processor and RAM. So unless every site out on the web is wrong, you must have a problem.

Dunno. But I can tell you that performance did go up in the games I tried even though the 3DMark score was lower.
 
Originally posted by: kgraeme
Originally posted by: Dug
Somebody explain this to me: I used 3DMark on a KT133A system with a 1GHz Athlon and scored higher than the same video card and drivers on an i845G with a 2.4GHz P4. Why the higher score on the slower system, hmmm?

And don't tell me it's because of 3DMark. I have yet to see ONE review site show a lower 3DMark score after increasing the speed of the processor and RAM. So unless every site out on the web is wrong, you must have a problem.

Dunno. But I can tell you that performance did go up in the games I tried even though the 3DMark score was lower.

I don't buy it... the 3DMark score went down for a REASON!!!! (lol, "...performance did go up in games"...) Jesus, why don't you guys just stop dancing around the issue and spit it out: YOU THINK 3DMARK IS BROKEN... there, doesn't that feel better now? Is that what you want to say?

 

>>Minimum fps has always been important, just rarely are they reported or even considered by reviewers.


>...And the reason for that would be....what?

Because the test isn't built into the game and the reviewers would have to figure a way to do it, which is way beyond their capability.
 
Originally posted by: uncleX
>>Minimum fps has always been important, just rarely are they reported or even considered by reviewers.


>...And the reason for that would be....what?

Because the test isn't built into the game and the reviewers would have to figure a way to do it, which is way beyond their capability.

😕

way beyond whose capability?
 
>way beyond whose capability?

As usual, I don't know what you are talking about. Here we go again: By themselves, the reviewers haven't the remotest idea of how to measure frame rates of games. The people who program the games build in some tests, and those are the tests the reviewers report. If they don't build in a certain kind of test, the reviewers can't do that test.
 
>Oh Jesus, I was starting to wonder when people were going to start bantering this around.
> ONE website posts a benchmark displaying P4s showing lower minimum frame rates than
> Athlons, and now people are coming out of the woodwork ranting about how important minimum

>Am I the only one here that is astonished at how ridiculous a 'minimum frame rate' speculation
> would be? I mean, what would cause one machine to have a 3 fps lower minimum frame rate
> than another....


The 2800+ Athlon has a 31 fps lead over the P4 2.8G, not 3 fps: 73 vs 42. A 74% increase where it counts.

P4
  2.0 GHz   (400 FSB)    41
  2.4 GHz   (533 FSB)    41
  2.53 GHz               40
  2.8 GHz   (PC1066)     42
  3.0 GHz   (HT on)      47
Athlon
  2100+     (1.73 GHz)   61
  2200+                  65
  2600+                  71
  2800+                  73


By extrapolation, a $52 1700+:
  1700+     (1.47 GHz)   49
which still beats the 3GHz P4 operating at twice the clock.
The Ace's Hardware table gives 13 measurements, all showing consistent results.
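The 74% figure checks out, and a naive clock-proportional extrapolation lands near the quoted 1700+ number (the linear-scaling assumption is the poster's, not a measured result):

```python
# Minimum-fps figures quoted above for the Athlon 2800+ and the P4 2.8 GHz.
athlon_min, p4_min = 73, 42
lead_pct = (athlon_min / p4_min - 1) * 100
print(f"Athlon lead: {lead_pct:.0f}%")  # 73 vs 42 is about a 74% lead

# Naive clock-proportional guess for a 1.47 GHz 1700+ from the 1.73 GHz 2100+:
est_1700 = 61 * 1.47 / 1.73
print(f"estimated 1700+ minimum: {est_1700:.0f} fps")  # ~52, near the quoted 49
```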

>How would 'one' even go about 'controlling' for minimum frame rate?
Get yourself a $52 Athlon.
 
3DMark2001SE has helped me countless times to identify the effect of tweaks... It is a great tool for that.

One of the places it really helped me was when I was fooling around with cooling solutions: it gave me a fairly consistent way to compare temps. (e.g., temp was almost always stable after a 6-loop pass, and repeatable!)


Oh, you want to know my 3DMark2001SE scores.... FORGET IT... you simply can't compare one machine to another accurately, especially with the insecure little snots in here who couldn't tell the truth unless in fear of a spanking from their mommies.

Yeah, that 25,000 score is really impressive until you see the blank purple screen that went with it....
 
Originally posted by: fkloster

I don't buy it... the 3DMark score went down for a REASON!!!! (lol, "...performance did go up in games"...) Jesus, why don't you guys just stop dancing around the issue and spit it out: YOU THINK 3DMARK IS BROKEN... there, doesn't that feel better now? Is that what you want to say?

I don't know if 3DMark is broken. I'm just citing one instance where I've seen strange numbers. As was suggested, it could have been something with my system. Dunno.
 