
Say Goodbye to FutureMark.

Goodbye Futuremark. On a serious note: it is sad to see a company end this way, just because Nvidia wants their $$$. I really hope Futuremark makes a 3DMark version that is not so easily cheated.
 
I really would like to hear what they think counts as cheating and why. Futuremark has been really stupid about this. But I still like to see their 3D benchmarks, not to show performance, but because they look nice, and in any case they do show some type of performance.
 
[ H ] tried to convince people 3DM03 was useless, too. They, too, used rather skimpy logic in explaining why they stopped using the benchmark. Basically, LR will stop using 3DM03 because they can't be bothered to keep retesting it. Not sure how that translates to the end of FM, but it's definitely a win for nVidia and their "compiler technology" (read: hand-coded substitutions).

I'm curious to see in what light 3DM04 puts ATi and nV, and how each company will respond.

Edit: Bah, BBS code again. BTW, the LR article was reasonable, but I thought the conclusion was a cop-out. Unless LR changes their test system with every review, I'm not sure why they'd feel it necessary to continually retest 3DM03 with new drivers; it's becoming obvious that the only way IHVs are achieving big performance improvements is either by inserting hand-coded shaders or by making IQ-reducing filtering alterations, both anathema to 3DM's purpose as a fixed, repeatable benchmark.

But I suppose the one nice thing about this whole 3DM03 furor is that a lot of sites have greatly expanded their test suite of games, allowing for a better view of a card's overall performance.
 
I think 3DM03 is still pretty useful for troubleshooting and such... look how many times we see topics in this forum where ppl ask "my score is so low, what's wrong with my system?" or "I see graphics corruption in 3DMark but not in games, why?" 😉
 
i'm not sure what all the fuss is about. now, assuming that image quality is not sacrificed on specific applications in a manner not representative of the "game" or "benchmark" setting (a cheat), there are only 2 ways i see this:

1. if specific optimizations (regardless of whether it's nvidia or ati doing it) result in increased performance in 3dmark, but are not "carried over" into other games/applications, then how is 3dmark "representative" of games in general, and why would anyone consider it a viable way of measuring "gaming" performance? i mean, if 3dmark is so different from "real" games, then what good is it in the first place? don't dx9 pixel shaders work in 3dmark the same as they do in "real" games? as so often is said, synthetic "benchmarks" are crap.

2. if 3dmark is indeed an accurate "representation" of how 3d games work, then any optimization made for 3dmark would indeed "carry over" to other games/apps, making these "optimizations" completely legitimate, and beneficial to the end user. wouldn't recompiling how a driver handles pixel shader 2.0 code benefit every game that uses pixel shader 2.0? (a toy sketch of that kind of generic recompile follows below.)

am i missing something here? if so, what is it?
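
to make the distinction concrete, here's a toy sketch of the kind of *generic* recompile that would legitimately carry over to every ps 2.0 app: a peephole pass that fuses a MUL and a dependent ADD into a single MAD, no matter which application submitted the shader. none of this is real driver code; the instruction format and names are made up for illustration.

[code]
/* Toy sketch: a *generic* peephole pass that fires on any shader it is
 * fed, so its gains carry over to every PS 2.0 title. All names and the
 * instruction format here are hypothetical; real driver JITs are far
 * more involved. */
#include <stdio.h>

typedef enum { OP_MUL, OP_ADD, OP_MAD, OP_MOV } Opcode;

typedef struct {
    Opcode op;
    int    dst, src0, src1, src2; /* register numbers; -1 = unused */
} Instr;

/* Rewrites  MUL r,a,b ; ADD d,r,c  into  MAD d,a,b,c  in place.
 * Returns the new instruction count. (A real pass would also verify
 * that the MUL result isn't read again later.) */
static int fuse_mul_add(Instr *code, int n) {
    int out = 0;
    for (int i = 0; i < n; i++) {
        if (i + 1 < n &&
            code[i].op == OP_MUL &&
            code[i + 1].op == OP_ADD &&
            code[i + 1].src0 == code[i].dst) {
            Instr mad = { OP_MAD,
                          code[i + 1].dst,
                          code[i].src0, code[i].src1,
                          code[i + 1].src1 };
            code[out++] = mad;
            i++; /* consumed both instructions */
        } else {
            code[out++] = code[i];
        }
    }
    return out;
}

int main(void) {
    Instr shader[] = {
        { OP_MUL, 2, 0, 1, -1 },  /* r2 = r0 * r1 */
        { OP_ADD, 3, 2, 4, -1 },  /* r3 = r2 + r4 */
    };
    int n = fuse_mul_add(shader, 2);
    printf("%d instruction(s) after fusing\n", n); /* prints 1 */
    return 0;
}
[/code]

the contrast is with a pass that only fires when it recognizes one specific shader; that kind of "optimization" never carries over anywhere.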
 
Personally, I like the graphics in 3DM03, but I do not base my decisions on it. I use apps to make my decision, and I love how [ H ] has started benchmarking.
 
Quote

--------------------------------------------------------------------------------
Many Review sites have been doing this
--------------------------------------------------------------------------------


A review site that uses old drivers can't be any good to begin with.
 
They aren't using old drivers there. They just messed up on the date. Check it out: January 26, 2003, LOL.
 
Cainam:

1. Specific optimizations for Q3 don't carry over into UT2K3. Heck, specific optimizations for Q3 probably won't even carry over into games based on the Q3 engine, as game devs probably don't recycle shaders from other devs. So specific optimizations not carrying over from 3DM to games isn't really a negative; it's par for the course.

2. 3DM shouldn't (and I believe doesn't) deactivate those types of optimizations. They're legitimate in synthetic benchmarks because they carry over into any other 3D application. But it's been shown that nV isn't reaping huge speed gains from generic optimizations, but rather from hand-coded replacement shaders, a form of crib notes for benchmarks.
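
For what it's worth, here's a hypothetical sketch of the mechanism: hash the incoming shader bytecode and, on a match against a baked-in table, hand back a pre-tuned substitute. Every hash, table entry, and function name below is invented for illustration; this is not any vendor's actual code.

[code]
/* Hypothetical sketch of the "crib notes" technique: fingerprint an
 * incoming shader by hashing its bytecode, and if it matches a shader
 * seen in a benchmark, silently hand back a hand-tuned replacement. */
#include <stdint.h>
#include <stdio.h>

/* FNV-1a: a simple, well-known byte-stream hash. */
static uint64_t fnv1a(const uint8_t *p, size_t len) {
    uint64_t h = 14695981039346656037ULL;
    while (len--) {
        h ^= *p++;
        h *= 1099511628211ULL;
    }
    return h;
}

typedef struct {
    uint64_t    hash;        /* fingerprint of the app's shader */
    const char *replacement; /* hand-coded substitute to run    */
} SubstEntry;

/* Made-up table: in a real driver this would map benchmark shaders
 * to hand-optimized (or quality-reduced) versions. */
static const SubstEntry table[] = {
    { 0xdeadbeefcafef00dULL, "hand_tuned_water_shader" },
};

const char *pick_shader(const uint8_t *bytecode, size_t len,
                        const char *original) {
    uint64_t h = fnv1a(bytecode, len);
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (table[i].hash == h)
            return table[i].replacement; /* benchmark detected */
    return original; /* everyone else gets the shader they asked for */
}

int main(void) {
    const uint8_t fake_bytecode[] = { 1, 2, 3, 4 };
    puts(pick_shader(fake_bytecode, sizeof fake_bytecode,
                     "original_shader"));
    return 0;
}
[/code]

This also shows why those gains evaporate when the benchmark shuffles its shader code, as the 3DM03 patches did: change one byte and the fingerprint no longer matches, so the driver falls back to running the real shader.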
 
see, but that's just part of the problem... what good is having APIs such as dx9 with specific shader specs, etc., if everyone does their own thing? especially in an app that is supposed to mimic real-world gaming.
 
If the 53.03 drivers were using application detection, wouldn't the score change if you renamed the 3DMark exe to something else?
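
Most likely, if the detection keyed purely off the executable name. A minimal sketch of what name-based detection could look like (Windows-only, purely illustrative, not any vendor's actual code):

[code]
/* Minimal sketch of exe-name application detection (Windows).
 * A driver component could grab the host process's executable path
 * and match it against known benchmark names. The check below is
 * invented for illustration. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

static int looks_like_3dmark(void) {
    char path[MAX_PATH];
    /* Path of the executable of the current process. */
    if (GetModuleFileNameA(NULL, path, sizeof path) == 0)
        return 0;
    CharLowerA(path); /* case-insensitive compare */
    return strstr(path, "3dmark") != NULL;
}

int main(void) {
    puts(looks_like_3dmark()
         ? "benchmark detected: special path engaged"
         : "normal path");
    return 0;
}
[/code]

So if renaming the exe changes the score, that's strong evidence of application detection. If it doesn't, the driver may be fingerprinting something a rename won't touch, such as the shader code itself.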
 
Quote

--------------------------------------------------------------------------------
Not sure how that translates to the end of FM, but it's definitely a win for nVidia and their "compiler technology" (read: hand-coded substitutions).
--------------------------------------------------------------------------------

Any proof you could provide to back this up would be nice; I'm still waiting to see some of that. Anything using the 5x.xx drivers would be very good.
 
Originally posted by: clicknext
Good, I'm getting tired of seeing 3Dmark2003 represent game performance.

Note: I wish LR had done a few screen caps of the 5900U using the older and newer drivers in 3DM for IQ comparison.
 
Game performance is a no-no for this thing. GPU performance would have been a more realistic claim. But being a synthetic benchmark, they should have allowed custom optimizations, or at least offered a standard and an optimized result, kind of like SPEC.

The problem is that the engine sounded poorly coded to begin with, so it didn't represent much of anything real-world. Then you have them wavering on the cheat/no-cheat question. Then you have reviews like the one at Aces Hardware showing a P2-300 beating a P4 2.4GHz, because the P2-300 was paired with a 9800 Pro vs. the 2.4GHz system's 9600 Pro. I mean, they made it out to be a game performance benchmark when it was nothing but a synthetic GPU performance benchmark.

I think it would be more useful if they allowed for two results: one standard-compiled and one optimized.

At least it would be useful to see how much more you can get out of a GPU when the developer tunes the shaders for it.
 