
Say Goodbye to FutureMark.

Page 2

robcy

Senior member
Jun 8, 2003
503
0
0
I really do think that this is a big, powerful company forcing its will on a smaller one. There is a place for 3DMark 03 as part of the overall picture of GPU performance, just as 3DMark 01 still represents a viable portion of the total performance measurement. I do not think any one benchmark should be given absolute authority as the de facto measure of a GPU's power.

I actually feel sad, because what Futuremark tried to do was provide an honest tool with which to measure different video cards by a common standard. While Futuremark has been kind of anal in its pursuit of fairness, they have done it so that their hard work is not used to falsely convince the public to buy a certain card, only to have said part not perform to expectations, with the excuse "but it did so well on 3DMark 03". The only purpose of cheating on 3DMark is to use the inflated numbers to sell more cards. I do not live under the illusion that corporations (ATI and Nvidia) are really concerned with me making a sound purchasing decision, but I do have issues with them tampering with the process by which those decisions are made. The losers in this whole fiasco are us.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,577
146
FutureMock..... I won't miss 'em. The fact that sites are being forced to use real games now is a good thing. Pushing reviewers toward less widely used sections of real games as benchmarks will also make it much harder to optimize specifically to inflate a card's numbers. It requires effort on the reviewer's part, but hey, that's why they call it work ;)
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
When every major game release includes comprehensive benchmarking features, I might be in favor of tossing synthetic benchmarks entirely. But anyone deriding them as a complete waste of time is, IMHO, pushing too far the other way.

The problem with testing with real games is that the results are (generally) far less repeatable, and less predictable. Unless everybody tests everything with exactly the same settings, on the same systems, using the same timedemos, results can, have, and will vary widely. Games with built-in benchmarks alleviate this somewhat, and the differences are often not huge, but it's enough of a problem that it's still nice to see a few synthetic scores to provide a baseline. Another problem is that new graphics features (such as PS/VS 2.0 shaders) take a long time to trickle down into released games, whereas a benchmarking program can include tests of them early on. Of course, this doesn't correlate exactly with how games using those features will perform, but it's better than nothing.
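The repeatability point above can be put in numbers with a toy sketch (the fps figures are invented for illustration, not real benchmark results): when the run-to-run spread of a timedemo is comparable to the gap between two cards' averages, a single pass cannot separate them reliably.

```python
import statistics

# Hypothetical timedemo results (fps) for two cards, three runs each.
# Illustrative numbers only -- not real benchmark data.
card_a_runs = [61.8, 64.5, 59.9]
card_b_runs = [62.7, 60.1, 63.9]

def summarize(runs):
    """Return mean fps, stdev, and coefficient of variation (%)."""
    mean = statistics.mean(runs)
    spread = statistics.stdev(runs)
    return mean, spread, spread / mean * 100

for name, runs in [("Card A", card_a_runs), ("Card B", card_b_runs)]:
    mean, spread, cv = summarize(runs)
    print(f"{name}: mean {mean:.1f} fps, stdev {spread:.1f} ({cv:.1f}% of mean)")

# If the run-to-run spread is comparable to the gap between the two
# means, one timedemo pass can't tell the cards apart -- which is the
# appeal of a fixed, deterministic synthetic test as a baseline.
```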

Basing the entire evaluation of a product on any single benchmark is foolish, but so is refusing to look at synthetic benches, especially when Futuremark has shown they're at least *trying* to prevent cheating.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Measuring game performance was never the right goal for this thing; measuring GPU performance could have been done more realistically. But since it is a synthetic benchmark, they should have allowed custom optimizations, or at least reported both a standard and an optimized result, kind of like SPEC does.

The problem is that the engine sounded poorly coded to begin with, so it didn't represent much of anything real-world. Then you have Futuremark wavering on what counts as a cheat. Then you have reviews like the one at Aceshardware showing a P2-300 beating a P4 2.4GHz, because the P2-300 was paired with a 9800 Pro and the 2.4GHz with a 9600 Pro. They made it out to be a game-performance benchmark when it was nothing but a synthetic GPU benchmark.

I think it would be more useful if they allowed for two results: one standard-compiled and one optimized.

At least it would show how much more you can get out of a GPU when the developer tunes the shaders for it.

I like this idea.
 

Tab

Lifer
Sep 15, 2002
12,145
0
76
Originally posted by: Bucksnort
Futuremark has just become a congregation of ati punk fanboys.

Yea, right.


FutureMark sucks, bottom line. Show me real benchmarks.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The problem is it sounded like the engine was coded poorly to begin with, so it didn't represent much of anything real-world.

I don't know that I would say it was coded 'poorly', since it does exactly what they intended, but it was coded very differently from a game. They went out of their way to slow it down as much as possible, for example by forcing the boards to render back to front. This leads some to the perception that games with the visuals you see in 3DM2K3 will run at the same level of performance, which is far removed from reality. GT3, for example, uses rendering techniques not unlike DooM3's, yet runs at only a fraction of D3's performance, even though D3 relies entirely on integer-level accuracy.

I find this aspect of 3DM2K3 very disappointing, particularly since earlier versions of the bench were designed to run optimally, as game engines are built to run, to give a general indication of what performance you can expect at a given level of visuals. It appears Futuremark couldn't think of a viable way to make the benchmark extremely GPU-limited without resorting to rendering as slowly as possible. I hope that is rectified in future synthetic benchmarks.
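The back-to-front point above can be sketched with a toy model: full-screen layers drawn against a plain depth buffer, where (assuming early-z rejection) a fragment is only shaded if it is closer than what is already stored. The layer counts and the early-z assumption are illustrative, not a model of any real card.

```python
# Toy model of why draw order matters for shading work.
# Each "layer" covers the whole screen at a fixed depth; with early-z
# rejection, a fragment is shaded only if it beats the depth buffer.

def shaded_fragments(layer_depths, pixels=1000):
    """Count fragments shaded when layers are drawn in list order."""
    depth_buffer = [float("inf")] * pixels
    shaded = 0
    for depth in layer_depths:            # draw order = list order
        for i in range(pixels):
            if depth < depth_buffer[i]:   # passes the depth test: shade it
                depth_buffer[i] = depth
                shaded += 1
    return shaded

layers = [1.0, 2.0, 3.0, 4.0]             # four full-screen layers

# Front-to-back: only the nearest layer survives the test per pixel.
front_to_back = shaded_fragments(sorted(layers))
# Back-to-front: every layer beats the previous one, so all are shaded.
back_to_front = shaded_fragments(sorted(layers, reverse=True))
print(front_to_back, back_to_front)
```

Under this sketch, back-to-front shades four times the fragments for the identical final image, which is the sense in which a benchmark can be made "extremely GPU-limited" without the visuals getting any richer.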
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: BenSkywalker
Not sure how that translates to the end of FM, but it's definitely a win for nVidia and their "compiler technology" (read: hand-coded substitutions).

Any proof you could provide to back this up would be nice, I'm still waiting to see some of that. Anything using the 5x.xx drivers would be very good.
FM showed us how they altered their shaders with the v340 patch, basically by reordering ops or changing variable names. How could those alterations break a general optimizer?

And nV themselves said their unified compiler tech was a combination of general optimizations and hand-coded substitutions.

Anyway, 3DM03 has been largely relegated to the sidelines. We'll see how 3DM04 handles DX9(.1?) and the various GPU architectures. I think a synthetic test really has to include screenshots (to be evaluated by the reviewer, not the program) as part of the synthetic results, to force IQ as integral to the numerical result. I prefer benchmarking games, but isn't it obvious that only a synthetic benchmark can be released quickly enough to help predict a new architecture's potential performance? I'm still not convinced that 3DM03 was a failure in terms of predicting comparative "next-gen" performance, but I may be missing the point.
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Why doesn't Futuremark use a "real" game engine for 3DMark instead of their own? If they plan on releasing a 3DMark04/05, I suggest they use the Serious 2 engine, which is looking very impressive.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
FM showed us how they altered their shaders with the v340 patch, basically by reordering ops or changing variable names. How could those alterations break a general optimizer?

Reordering ops can very easily break a compiler (in terms of optimal performance), particularly one as sensitive to scheduling as nV's. Futuremark disabled nV's driver-level compiler with patch 340, and I haven't seen any explanation of why they decided to do that, either.
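How a semantics-preserving reorder can still cost cycles can be sketched with a toy in-order "shader unit": one op issues per cycle, but an op whose inputs aren't ready stalls until they are. The latency value and the op format are invented for the sketch; no real GPU is modeled.

```python
# Toy in-order pipeline: dependent ops stall; independent ops can fill
# the gap. The same four ops in two orders take different cycle counts.

LATENCY = 4  # a dependent op can issue this many cycles after its producer

def cycles(program):
    """program: list of (dest, (srcs...)) ops. Returns total cycles."""
    ready = {}   # register -> cycle at which its value can feed an op
    cycle = 0
    for dest, srcs in program:
        # stall until every input is available
        cycle = max([cycle] + [ready.get(s, 0) for s in srcs])
        cycle += 1                        # issuing takes one cycle
        ready[dest] = cycle + LATENCY - 1
    return max([cycle] + list(ready.values()))

# Same four ops: r2 depends on r1, r4 depends on r3.
dependent_back_to_back = [
    ("r1", ()), ("r2", ("r1",)),          # stalls waiting on r1
    ("r3", ()), ("r4", ("r3",)),          # stalls waiting on r3
]
interleaved = [
    ("r1", ()), ("r3", ()),               # independent ops hide the latency
    ("r2", ("r1",)), ("r4", ("r3",)),
]
print(cycles(dependent_back_to_back), cycles(interleaved))
```

In this sketch the interleaved order finishes several cycles sooner, so a patch that merely reshuffles equivalent ops can undo (or defeat) whatever schedule a driver's compiler had produced, without changing the math at all.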

And nV themselves said their unified compiler tech was a combination of general optimizations and hand-coded substitutions.

I missed them saying they had hand-coded substitutions; actually, I haven't seen anything close to that.

We'll see how 3DM04 handles DX9(.1?)

Good gawd Pete, don't tell me you are starting to buy in to the DX 9.1 rumor..... :p