Please lock this thread. No more useful discussion going on.


BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
And the 52.16s were on the approved list without disclaimer when they released the 340 build, then they changed it.
Maybe they performed more tests and realized that the drivers were cheating in areas they didn't realize before. Surely you aren't accusing FutureMark of lying as well?

Changing the approved driver isn't even in the same league as having Derek Perez, an nVidia employee, confirming that the hard-coded cheats for 3DMark exist in the current drivers and also hinting that the trend is to continue to defeat FutureMark's detection routines. Speaking of which, what is your response to Derek's comments?

and you believe the screenshot story. Just post some links to some evidence
This thread has a discussion about some screenshot tests that were done (including a link).

To save rehashing the thread in here, the final conclusion was that FRAPS 2.0 was showing AA in nVidia's screenshot where there was no AA applied onscreen. OTOH FRAPS 2.0 correctly captured the output of ATi's AA and non-AA modes.

I wouldn't class it as concrete evidence since it's not done by an official hardware site, but something is definitely fishy if FRAPS 2.0 is pulling correct AA out of nowhere when nVidia's on-screen image shows none.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Also I'd like to hear what your response is to nVidia's screenshots where not even Fraps 2.0 could produce the correct results.

Because nVidia is combining on scan out when using AA, not prior to that. I don't understand why that issue is worth discussing again; all of the FX parts have had this exact same issue with basic print screen functions since launch and even ATi has some issues with the most basic print screen.

Maybe they performed more tests and realized that the drivers were cheating in areas they didn't realize before. Surely you aren't accusing FutureMark of lying as well?

I find your use of the word cheat to be very amusing, to be certain. This 'cheat' you are talking about now Futuremark couldn't catch, nor could they tell the difference in terms of the output quality, until after they stated it was all good. If nVidia was using prerendered state then, just so you know, FM should easily spot that when doing PS tests (as they are angle-dependent and FM's build of 3DM2K3 isn't 'stuck' on rails). FM should state exactly what is wrong with the test so everyone can be informed.

Changing the approved driver isn't even in the same league as having Derek Perez, an nVidia employee, confirming that the hard-coded cheats for 3DMark exist in the current drivers and also hinting that the trend is to continue to defeat FutureMark's detection routines.

Now that I would call a lie. He said no such thing whatsoever. He stated optimizations would continue. If you believe everything that FM has stated to date has been 100% even and fair, then nVidia has pulled out a significant amount of performance from optimizations that FM doesn't consider questionable. Optimization != cheat.

and BFG and Ben are free to hammer away at each other provided they remain civil (no more kicking puppies! :D ).

Hey, he's the one kicking puppies, not me ;) :p
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Because nVidia is combining on scan out when using AA, not prior to that.
But the problem is not that we aren't seeing AA in the screenshot when there is AA on the screen.

The problem is that we're seeing AA in the screenshot when there is none on the screen.

Do people look at the monitor or at screenshots when they play games? I don't know about you but I've never seen a valid AA method that doesn't do anything on the screen but magically comes into being when a screenshot is taken, have you?

all of the FX parts have had this exact same issue with basic print screen functions since launch and even ATi has some issues with the most basic print screen.
That's true but we're using FRAPS 2.0 which is more sophisticated than a simple print-screen. Also FRAPS 2.0 produces correct output on ATi cards (i.e. it doesn't show AA where there is none on the screen).

This 'cheat' you are talking about now Futuremark couldn't catch, nor could they tell the difference in terms of the output quality, until after they stated it was all good
You seem to have the incorrect idea that a cheat requires a change in IQ.

He stated optimizations would continue.
Come on now, you know exactly what the deal is here. I mean look at the response from the ATi guys and then look at Perez's response. ATi gives you straight non-BS answers while Perez is tip-toeing around the entire issue. I mean he admits that he's not even following the rules that FutureMark laid out and he also admits nVidia is planning to defeat FutureMark's anti-cheat mechanisms:

It would seem that NVIDIA agrees with the tone of the optimisation policy that Futuremark have put in place, but not all, as they don't necessarily tally with the internal policies that NVIDIA have

Does he really think FutureMark are just putting the rules and anti-cheat coding in for fun? Of course not - FutureMark put them in to stop cheating and since Perez said he plans to defeat them in the future, he's basically admitted that it's nVidia's company policy to continue cheating, go against FutureMark's policies and mislead the public.

But of course he's never going to use "cheat". If you asked him if the static clip planes are a cheat do you think he'd admit that?
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
Originally posted by: BFG10K

You seem to have the incorrect idea that a cheat requires a change in IQ.

Then what is the definition of a cheat? Why does it matter how a scene is rendered? Shouldn't the only thing that ever matters be what the scene looks like once it is rendered? If an image looks exactly the same with optimizations as without optimizations, why is it considered cheating to use those optimizations?

*edit*
Would the creator of this thread please change the name back to what it originally was. This has been a very interesting thread that pretty much stayed on topic, and I find it very rude and childish to change the name like you did and ask for it to be deleted.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Then what is the definition of a cheat?
A cheat is a section of code that artificially inflates performance using one of two methods:

(1) Prior knowledge such as application detection, scene detection, benchmark detection or algorithmic detection.
(2) For all possible inputs it purposefully alters one or more outputs to incorrect ones, usually at the expense of rendering quality.

Why does it matter how a scene is rendered?
If you rely on prior knowledge you aren't optimizing, you're simply hard-coding cheats into the drivers. In addition to painting a deceptive picture, the optimizations are brittle and can easily fall over if any of the assumptions they rely on cease to exist.

Build 340 of 3DMark03 is a perfect example of this - nVidia took a nose-dive while everyone else stayed at the same performance levels.
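
To make (1) concrete, here's a minimal sketch of the pattern I'm describing. The names and the detection mechanism are invented for illustration; it's obviously not anyone's actual driver source:

#include <string>

enum class RenderPath { Generic, BenchmarkSpecial };

// Application/benchmark detection: the driver picks a special path purely
// because it recognizes the executable, not because of the work submitted.
// (Hypothetical example - the exe name and the paths are made up.)
RenderPath pick_path(const std::string& exeName) {
    if (exeName == "3DMark03.exe")            // prior knowledge, not optimization
        return RenderPath::BenchmarkSpecial;  // e.g. static clip planes, swapped shaders
    return RenderPath::Generic;               // every other app gets the normal path
}

Which is also why renaming the executable makes the inflated score vanish.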
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The problem is that we're seeing AA in the screenshot when there is none on the screen.

And you kneejerk to the assumption that it has something to do with nVidia? PrintScreen does not show the same results, nor does HyperSnap. When one application is screwing up while other applications are doing the same thing without problem, you don't entertain the possibility that perhaps it is an application bug? Particularly when the build of Fraps being discussed added code to simulate the AA output on nVidia parts? We know Fraps is using a hack to output screenshots on nVidia parts and it is still somehow nVidia doing something wrong.

That's true but we're using FRAPS 2.0 which is more sophisticated than a simple print-screen. Also FRAPS 2.0 produces correct output on ATi cards (i.e. it doesn't show AA where there is none on the screen).

ATi combines samples when it flips, not on scan out. Considering ATi is doing the same thing that has been done in terms of combining samples since 2K, I would hope the apps have figured out how to take a screenshot of them.
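
To spell the difference out for anyone following along, here's a rough sketch of my own (made-up structures, not vendor code): if the samples are averaged into the framebuffer at flip, whatever reads that buffer - PrintScreen, FRAPS, anything - gets the antialiased image; if they are only combined on the way out to the monitor at scan-out, a plain buffer grab gets unresolved samples.

#include <stddef.h>
#include <stdint.h>
#include <vector>

struct MSAABuffer  { int w, h, samples; std::vector<uint32_t> colors; }; // w*h*samples values
struct Framebuffer { int w, h; std::vector<uint32_t> pixels; };          // what a grab reads

// Naive box-filter resolve: average each pixel's sub-samples (packed RGB8).
static Framebuffer resolve(const MSAABuffer& ms) {
    Framebuffer fb{ms.w, ms.h, std::vector<uint32_t>(size_t(ms.w) * ms.h)};
    for (size_t p = 0; p < fb.pixels.size(); ++p) {
        uint32_t r = 0, g = 0, b = 0;
        for (int s = 0; s < ms.samples; ++s) {
            uint32_t c = ms.colors[p * ms.samples + s];
            r += (c >> 16) & 0xFF; g += (c >> 8) & 0xFF; b += c & 0xFF;
        }
        fb.pixels[p] = ((r / ms.samples) << 16) | ((g / ms.samples) << 8) | (b / ms.samples);
    }
    return fb;
}

// Resolve at flip (what is described for ATi above): the buffer a screenshot
// tool reads already contains the antialiased image.
static Framebuffer grab_when_resolved_at_flip(const MSAABuffer& ms) {
    return resolve(ms);
}

// Combine at scan-out (what is described for the FX parts above): the readable
// buffer still holds raw samples; only the display hardware sees the combined
// result, so a plain grab shows no AA even though the screen does.
static Framebuffer grab_when_combined_at_scanout(const MSAABuffer& ms) {
    Framebuffer fb{ms.w, ms.h, std::vector<uint32_t>(size_t(ms.w) * ms.h)};
    for (size_t p = 0; p < fb.pixels.size(); ++p)
        fb.pixels[p] = ms.colors[p * ms.samples];   // just sample 0, unresolved
    return fb;
}

That's the whole reason capture tools need special handling for the FX parts in the first place.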

You seem to have the incorrect idea that a cheat requires a change in IQ.

You certainly have the incorrect idea that app detection = cheat.

Does he really think FutureMark are just putting the rules and anti-cheat coding in for fun? Of course not - FutureMark put them in to stop cheating and since Perez said he plans to defeat them in the future, he's basically admitted that it's nVidia's company policy to continue cheating, go against FutureMark's policies and mislead the public.

And PowerVR is excluded completely from having a valid part by FM's guidelines. This isn't just about nVidia here BFG; despite what you may want to think, app detection is far from new, nor is it a problem unless it causes issues.

If you rely on prior knowledge you aren't optimizing, you're simply hard-coding cheats into the drivers.

DooM3 anyone? Do you realize how much of a hypocrite you are coming across as?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
he isn't being a hypocrite, he is talking about things like adding artificial clip planes to the Mother Nature test.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: TheSnowman
he isn't being a hypocrite, he is talking about things like adding artificial clip planes to the Mother Nature test.

It's called occlusion culling, and it's by no means a "cheat". If the scene looks the same, the rendering methods don't matter, period. You friggin dorks can cry about it all day long but joe shmoe can't tell the difference, and that's what counts. So quit crying about it and come up with better excuses other than NVIDIA uses their driver team to improve performance.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
he isn't being a hypocrite, he is talking about things like adding artificial clip planes to the Mother Nature test.

He has commented that the reason that ATi did so poorly in the DooM3 benches was their drivers weren't ready (to paraphrase). Now, he's taking the stance that any prior knowledge used to increase performance is cheating. For a general purpose optimization you don't need to optimize per application, which would indicate you were using prior knowledge in your drivers. He is blasting nVidia for what he wants ATi to do.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
PrintScreen does not show the same results,
Actually it does. PrintScreen, FRAPS 1.0 and FRAPS 2.0 all showed the same issues.

We know Fraps is using a hack to output screenshots on nVidia parts and it is still somehow nVidia doing something wrong.
I agree that it isn't concrete evidence and it's sketchy at best. Nevertheless it still adds to the growing pile of nVidia inconsistencies that have been going on since the NV30 was released.

You certainly have the incorrect idea that app detection = cheat.
Unless it's used for compatibility reasons then it is a cheat.

And PowerVR is excluded completely from having a valid part by FM's guidelines.
And as a result there are no valid PowerVR drivers listed at FutureMark's website.

DooM3 anyone? Do you realize how much of a hypocrite you are coming across as?
Huh?

He has commented that the reason that ATi did so poorly in the DooM3 benches was their drivers weren't ready (to paraphrase). Now, he's taking the stance that any prior knowledge used to increase performance is cheating. For a general purpose optimization you don't need to optimize per application, which would indicate you were using prior knowledge in your drivers. He is blasting nVidia for what he wants ATi to do.
You're oversimplifying the issue way too much. The issue here is whether a new app exposes driver paths that haven't been exercised much, if at all, in the past, as Doom3 does.

Doom 3 makes heavy use of stencil ops; no other game to date does. Therefore it's unlikely that drivers that have not been tested on Doom3 will have optimal stencil routines.

If an IHV - as a result of profiling the drivers running Doom3 - provides universal performance gains for their driver's stencil operations then that's not cheating, since any game that later comes along and uses stencil operations will benefit. What is a cheat is if the IHV comes along and sticks an "if app == DOOM3 then do this; else do that" into their drivers.
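
As a rough sketch of the legitimate side of that distinction (my own illustration, with invented names and thresholds): the fast path is keyed to what the workload looks like, not to which executable submitted it, so any later stencil-heavy game benefits just as Doom3 does.

struct FrameStats {
    long long stencilOps;   // stencil operations submitted this frame
    long long totalOps;     // all rendering operations this frame
};

// Profiling Doom3 might reveal a stencil bottleneck; the legitimate fix is a
// faster stencil path selected from the workload itself. No "if app == DOOM3".
bool use_fast_stencil_path(const FrameStats& s) {
    if (s.totalOps == 0) return false;
    return s.stencilOps * 10 >= s.totalOps * 3;   // e.g. stencil work is at least 30% of the frame
}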
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It's called occlusion culling, and it's by no means a "cheat".
It is a cheat since such a static form of occlusion culling cannot exist in any real-world situation, because no real-world app knows where the user will move or look.

Tell me, why do you think the entire application screwed up when someone moved the camera to a place the benchmark didn't cover? Do you play games by only running along their benchmarking pathway and never deviating from it? If not then how the heck can you class static clip planes as a valid optimization?
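
To illustrate (again a rough sketch with invented names, not anyone's shipping code): real frustum/occlusion culling is computed from wherever the camera actually is each frame, whereas a static clip plane baked in for a known benchmark path throws geometry away regardless of where the user looks - which is exactly why the scene falls apart the moment the camera leaves the rails.

#include <array>
#include <vector>

struct Vec3  { float x, y, z; };
struct Plane { Vec3 n; float d; };                 // n.p + d >= 0 means "inside"
struct Camera { std::array<Plane, 6> frustum; };   // rebuilt every frame from the actual view

static bool inside(const Plane& pl, const Vec3& p) {
    return pl.n.x * p.x + pl.n.y * p.y + pl.n.z * p.z + pl.d >= 0.0f;
}

// Legitimate culling: derived from the camera the user actually controls,
// so it stays correct no matter where they move or look.
bool visible(const Camera& cam, const Vec3& point) {
    for (const Plane& pl : cam.frustum)
        if (!inside(pl, point)) return false;
    return true;
}

// The static clip plane pattern: planes chosen in advance because the benchmark
// camera is known never to look past them. Correct only on rails; move the
// camera and geometry that should be on screen gets thrown away.
bool visible_static_clip(const std::vector<Plane>& bakedPlanes, const Vec3& point) {
    for (const Plane& pl : bakedPlanes)
        if (!inside(pl, point)) return false;
    return true;
}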

If the scene looks the same, the rendering methods don't matter, period.
That is utter rubbish.

So quit crying about it and come up with better excuses other than NVIDIA uses their driver team to improve performance.
:rolleyes:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
BTW Ben, what are your thoughts on the XGI cheats, cheats which are the same kind as some of nVidia's have been?

Using your previously employed logic I'd expect you to think XGI's application detection is perfectly acceptable and their reduced image quality simple hacks, right? And the fact that a completely different picture is painted by simply renaming the executable is of no consequence too, right?

So in short, you don't believe XGI are cheating at all.

Correct?
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Cries of fanboyism aside, I find it interesting that XGI did a vaguely similar trick to what nV did a few months ago with their budget cards (can't remember if it was the GF4MX or 5200, and if D-L or iXBT noted it). Remember their alternating-frame AA trickery? IIRC, it was substituting 1x2 AA on one frame and 2x1 on the next for 4x AA. I wonder if IHVs still think they can slip this stuff by 3D fansites, or if they just don't think the attention generated matters (in which case, why send out review samples at all)?

Consider these more like bemused rhetorical questions than actual conversation starters. I don't mean to stir up the pot against a particular IHV, I'm just thinking about the industry as a whole. (Thinking of the industry as a whole is part of what got me so upset about nV's actions in 3DM in the first place--fleecing customers hurts everyone in the industry, IMO. At some point, it's got to get more financially attractive to be honest than to attempt to deceive, though perhaps I'm being overly optimistic about the feasibility of advancing 3D tech apace.)
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: BFG10K
BTW Ben, what are your thoughts on the XGI cheats, cheats which are the same kind as some of nVidia's have been?

Using your previously employed logic I'd expect you to think XGI's application detection is perfectly acceptable and their reduced image quality simple hacks, right? And the fact that a completely different picture is painted by simply renaming the executable is of no consequence too, right?

So in short, you don't believe XGI are cheating at all.

Correct?

The IQ drops, retard, therefore it is a cheat.

If you people think optimising is a cheat then you better disable SSE, SSE2, 3DNow, and hyperthreading because **GASP** those are hard-coded application-specific optimisations too.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
lol, Acanthus doesn't understand the difference between sse and static clip planes yet he is calling other people retarded. :D
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Actually it does. PrintScreen, FRAPS 1.0 and FRAPS 2.0 all showed the same issues.

I haven't been able to reproduce it with PS.

I agree that it isn't concrete evidence and it's sketchy at best. Nevertheless it still adds to the growing pile of nVidia inconsistencies that have been going on since the NV30 was released.

There was the 3DM2K3 issue, and the same 3DM2K3 issue. I've still seen nothing outside of that besides speculation on anything.

Unless it's used for compatibility reasons then it is a cheat.

And how many issues do you think they are working around using app detection? It's not just Splinter Cell.

And as a result there are no valid PowerVR drivers listed at FutureMark's website.

Which is complete BS. PowerVR's architecture uses workarounds that don't impact IQ (well, they have their optimizations for things like compressed textures/box filter, but those are user selectable). No negative impact to the end user, it should be perfectly fine.

You're oversimplifying the issue way too much.

Not in any way shape or form.

Doom 3 makes heavy use of stencil ops; no other game to date does. Therefore it's unlikely that drivers that have not been tested on Doom3 will have optimal stencil routines.

Splinter Cell does (for certain) and IIRC so does Deus Ex 2 (haven't checked it out myself, not a fan of the first, although that is what is being widely reported). Also, ATi has had a build of DooM3 for a long time now, this isn't some secret app they are just finding out about. If all they needed was optimized paths for heavy stencil fill then it should have been in the drivers over a year ago and they could have left it in.

What is a cheat is if the IHV comes along and sticks an "if app == DOOM3 then do this; else do that" into their drivers.

But you have stated that it is OK to work around compatibility problems using app detection.

BTW Ben, what are your thoughts on the XGI cheats, cheats which are the same kind as some of nVidia's have been?

I don't jump to the conclusion that they are cheats as of yet. Their drivers are utter crap at the moment, and they need to work out a lot of problems. Running a dual chip board I expect them to have a lot of issues they need to use app specific optimizations for, although right now what they are doing is utterly unacceptable due to the degraded IQ.

Using your previously employed logic I'd expect you to think XGI's application detection is perfectly acceptable and their reduced image quality simple hacks, right?

Depends on what is causing it. I believe in innocent until proven guilty. Even if some of the issues are intentional, if they can get the IQ close to comparable to what the big two are running then I would say it could fall under the hack category. If I didn't, I'd have to say ATi and nVidia cheat on everything using AF.

So in short, you don't believe XGI are cheating at all.

I don't assume they are; I wouldn't want to make an @ss out of myself if their drivers improve and they work out their issues and it ends up being a wrong assumption. Is it possible? Of course. Go ahead and look back to when Quack was the hot topic BFG, I'm not going to jump to conclusions about an IHV cheating no matter who it is. I have made this clear in the past many times: when it was ATi, when it was PowerVR, when it was nVidia, and now that it is XGI. If I see some good evidence then I will look at it on a case-by-case basis and make a rational decision about it. So far I've seen that XGI has lousy drivers that output horrible IQ, and honestly, having to choose between playing a game with horrible IQ or hard locking/blue screening, I'll take the lousy IQ.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: TheSnowman
lol, Acanthus doesn't understand the difference between sse and static clip planes yet he is calling other people retarded. :D

I know the difference, I'm just pointing out your idiotic logic.

If an optimisation gives the same result, and runs faster, WHAT IS THE FRIGGIN DIFFERENCE. I don't care how it's done, I care if it looks the same and works faster. That's what I'm saying. A friggin monkey could grasp this concept, but apparently it is above you TheSnowman.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76

If an optimisation gives the same result, and runs faster
What runs faster? Just the timedemo benchmark that's been hacked -- or the whole game? Big difference, right? If NV can hack a UT3 demo benchmark with prerendered stuff and take the workload off the GPU, they can make things look faster with no IQ difference at all in the demo. But what happens when you run the game -- no prerendered stuff, and performance won't match the benchmark demo because they can't do this for the whole game.

Application detection opens a whole bag of worms. And the Volari review at xbit has just given us a good glimpse of this. NV can ratchet back detail/texture settings for benchmarks and who knows what else. I would think after a few years of cheating on benchmarks NV (with their resources) has got cheating down to a fine art by now. And even if NV can use something like shader replacement or application-specific optimizations to improve the speed of games with little or very hard to notice IQ differences (let's call these legitimate application-specific optimizations) -- how many games can they actually do this for? It's so labor intensive you'd probably be lucky if 2 or 3 games a year will receive this type of treatment. So what about the other 200-300 games that come out every year?

And what do you think NV is doing with these optimizations? Is NV helping the consumer run a few of their games faster, or simply manipulating hardware sites by optimizing for the 10-12 games or so that hardware sites benchmark most with, so their products will look faster than they really are against the competition's? It's a simple task really. Survey the 10 most popular hardware sites and the 10 most popular benchmark programs/games. They have an inordinate amount of influence on the graphics card industry. Target these optimizations at them. Good marketing -- maybe in some people's view. Consumers may not be happy when their cards prove to be not so fast in the other games out there. The 5600 is a good example of this. When you think about it, you'd almost have to be crazy not to target these benchmarks. NV can simply claim they are just helping the consumer run games faster. But with the current situation, I would say it's apparent their real motivation is manipulating hardware reviews. Sure, both ATI and NV try to influence the market and hardware sites, but if you have no moral scruples you manipulate instead of just trying to influence.

Do you think NV is interested in fair play? They're acting like a bunch of scoundrels with 3DMark2003 -- hacking it almost on a monthly basis. Should we think their attitude is any different with regard to the other benchmarks and timedemos out there? They're simply chasing the dollar any way they see fit. A favorable built-in benchmark in a game like HL2 could be worth millions in good publicity and marketing, as hardware sites will readily run this benchmark and plaster the results all over the web. Think about that in context with Gabe Newell's comments at the HL2 presentations.

If NV comes out clean and says we will application-optimize the 5-10 most popular games each year, or even the popular games that really stress a graphics card, to improve performance for consumers -- then fine, put these up on their site and tell us about this optimization program. The games that receive these application-specific optimizations should really be known so they aren't used for misleading benchmarking purposes. This current situation stinks. Or, if they think what they are doing in 3DMark03 is legitimate, then come forward and defend it.


 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I haven't been able to reproduce it with PS.
Maybe the original poster is mistaken. Like I said, I'm not going to treat his findings as concrete evidence as it's quite possible he made a mistake. I will add his findings to the "potential" pile.

And how many issues do you think they are working around using app detection? It's not just Splinter Cell.
If the performance takes a nose-dive by renaming the executable names and/or running custom benchmarks of the most popular benchmarked titles then it's extremely hard to justify such actions as compatibility tweaks. Especially since the IQ tends to increase when the app is renamed, not decrease (e.g. UT2003).

Which is complete BS.
How is it BS? I didn't see any PowerVR drivers listed there. Perhaps you'd like to prove me wrong by showing me otherwise?

No negative impact to the end user, it should be perfectly fine.
There's no negative impact of static clip planes if the user doesn't move the camera, therefore it should be perfectly fine.
Increasing performance only in benchmarks but not in gameplay has no negative impact on the user. Therefore it should be perfectly fine.

Simply looking at IQ differences is a very superficial thing to do.

Not in any way shape or form.
Actually you are. You still don't seem to get the distinction between profiling an application and removing driver bottlenecks vs simply hard-coding driver routines to boost one title. One method provides a boost to all applications that use the driver code in a similar fashion while the other only provides gains for one application.

Also, ATi has had a build of DooM3 for a long time now, this isn't some secret app they are just finding out about.
I fail to see how the length of time an app has been with ATi has anything to do with it. If they weren't expecting the benchmark and Carmack wasn't complaining about something then they wouldn't have given it a priority.

Tell me, why do you think nVidia didn't use their initial FX drivers in the test?

But you have stated that it is OK to work around compatibility problems using app detection.
Yes, sorry, I should've been clearer.

I don't jump to the conclusion that they are cheats as of yet.
I think I counted no fewer than three applications that looked like total ass when they detected the likes of "Halo.exe" and "Aquamark.exe".

If I didn't, I'd have to say ATi and nVidia cheat on everything using AF.
That is completely different.

Go ahead and look back to when Quack was the hot topic BFG,
On the surface it was a cheat, certainly. However a beta tester for ATi later came out and said that the four textures (or so) that were reduced were left over from a testing routine in an old build of the drivers. Tell me, how have the likes of Perez attempted to explain away nVidia's cheating?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The IQ drops, retard, therefore it is a cheat.
Please don't call me a retard.

If you people think optimising is a cheat then you better disable SSE, SSE2, 3DNow, and hyperthreading because **GASP** those are hard-coded application-specific optimisations too.
The issue isn't the application itself (which can obviously do anything it likes), the issue is shared code such as an API or drivers which react to multiple applications hooking into them.

Ironically you bring up SSE - it works for any app. Any app can come along, split its instructions into parallel chunks and ask the processor to handle them more efficiently.

Now imagine if its instructions worked only for the most popular games and it didn't do jack for any of the others. Imagine if SSE had checks like if app == Quake 3 do this; else do something else.

Would you still feel it's a valid SIMD instruction set? Because doing something like that is no different to tailoring a driver to artificially boost performance for one application and leaving the rest alone.
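
To put the analogy in concrete terms, here's a minimal sketch of why SSE is a general optimization - the routine below has no idea which program is calling it. (My own example; any x86 compiler with SSE enabled, e.g. -msse, will build it.)

#include <xmmintrin.h>   // SSE intrinsics

// Add two float arrays four elements at a time. Nothing here knows or cares
// which application is running - that's what makes it a real optimization
// rather than an app-specific special case.
void add_arrays(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
    for (; i < n; ++i)               // scalar tail
        out[i] = a[i] + b[i];
}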

I know the difference, I'm just pointing out your idiotic logic.
Actually it appears that you have no idea what the difference is.

If an optimisation gives the same result, and runs faster, WHAT IS THE FRIGGIN DIFFERENCE.
There can be and there is a big difference depending on how it's done.

A friggin monkey could grasp this concept, but apparently it is above you TheSnowman.
You are the one having trouble grasping the concepts here and you also call everyone else names.
 

reever

Senior member
Oct 4, 2003
451
0
0
Jesus, why do you people even bother trying to sway Ben's opinion on anything. If Dave can't do it, nobody can. He will continue to believe whatever he thinks he is reading across the internet, whether it is true or not, and then present it in a way that suits his own agenda. It won't matter if you talk to game developers every day or work with Ati and Nvidia, he will still be unyielding in his own little opinions. This thread is going *nowhere*; I second the call for a lock.

Mods, please lock this... this battle of "wits" and endless bickering and measuring of penises needs to stop.

BTW the plural of penis is penes.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Remember their alternating-frame AA trickery? IIRC, it was substituting 1x2 AA on one frame and 2x1 on the next for 4x AA.
Very interesting Pete, I didn't know about this.

He will continue to believe whatever he thinks he is reading across the internet,
I don't always agree with Ben but make no mistake, he is a very smart guy when it comes to this sort of thing. :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
sure reever, delusional fanboys will most likely always be delusional fanboys. but if we let their nonsensical arguments go unchallenged their numbers will only continue to grow. i can't just stand by and watch that happen. ;)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
BFG-

If the performance takes a nose-dive by renaming the executable names and/or running custom benchmarks of the most popular benchmarked titles then it's extremely hard to justify such actions as compatibility tweaks. Especially since the IQ tends to increase when the app is renamed, not decrease (e.g. UT2003).

An issue that has been fixed (UT IQ). I'm not saying optimizations are sure-fire solve-everything items. Also, the renaming of executables was only an issue in a very few instances in nV's case, although with XGI it is a more relevant situation.

How is it BS? I didn't see any PowerVR drivers listed there. Perhaps you'd like to prove me wrong by showing me otherwise?

You read that one wrong, bud ;) The fact that they won't certify any PowerVR drivers is what is BS, not your statement that FM doesn't allow it.

Increasing performance only in benchmarks but not in gameplay has no negative impact on the user. Therefore it should be perfectly fine.

Simply looking at IQ differences is a very superficial thing to do.

Talk to some NV3X users who played Halo (just to use an example that has had a decent amount of conversation) with the 4x.xx drivers and then the 5x.xx series. The improvement in benches doesn't indicate how much the actual gameplay improved. The only people I hear talking about nV's performance improvements with their new drivers not being representative of actual gameplay are ATi owners. The gameplay performance has improved with the new drivers, in many cases more so than what the benches indicate.

Actually you are. You still don't seem to get the distinction between profiling an application and removing driver bottlenecks vs simply hard-coding driver routines to boost one title.

If ATi is profiling drivers to remove bottlenecks they seem to put them back in a bit too often. I can actually think of numerous hard-coded optimizations that are a good thing to have (most of them with PVR's parts, but still).

One method provides a boost to all applications that use the driver code in a similar fashion while the other only provides gains for one application.

What happens when you have a case where profiling the driver in one fashion for one group of titles slows another group of titles down? Is enabling a switch in the driver to use the optimized path based on the application still wrong?

I think I counted no fewer than three applications that looked like total ass when they detected the likes of "Halo.exe" and "Aquamark.exe".

Not debating that in the least, but take a look at the card, its architecture, and the performance it is showing. It should have no problem at all outputting significantly higher framerates than we are seeing, even with the optimized/ugly defaults that the drivers are currently using. That, combined with a complete lack of compelling evidence to the contrary, gives me plenty of reason to believe that these very well could be driver bugs they are trying to work out. Something is badly broken with XGI's drivers; I'm not assuming that they are trying to cheat but am currently leaning more towards the driver team trying everything they can think of to get the performance of the part close to where it should be.

On the surface it was a cheat, certainly. However a beta tester for ATi later came out and said that the four textures (or so) that were reduced were left over from a testing routine in an old build of the drivers. Tell me, how have the likes of Perez attempted to explain away nVidia's cheating?

3DM2K3? They stated it was overzealous optimizations by someone on the driver team IIRC (paraphrased). Outside of 3DM2K3 I haven't seen anything to indicate that any other cheats are or have been present. As far as Quack, I never went off about them cheating or any other such nonsense. Didn't think it was a real issue then, still don't now, and I've brought it up a lot more now that the roles have been reversed than I ever did when ATi was in the hot seat, for certain. That one blew over quickly; if this one had done the same I would have taken the same stance as I did then (non-issue).

Reever-

Jesus, why do you people even bother trying to sway Ben's opinion on anything.

If there were facts to back these issues up that were concrete, there wouldn't be any need to sway anything. I'm not going to follow some trend because it bolsters one IHV. These boards are archived for the last ~3.5 years, feel free to check if my stance has changed on any of the issues no matter which IHV it favors or harms.

If Dave can't do it, nobody can.

Big, big difference between Dave and BFG: BFG is a gamer. We used to have an engineer from 3dfx who posted here (his name was Dave too, he was one of the founders of B3D) and he was regularly, adamantly opposed to pretty much everything I stated also (like BFG I had a good deal of respect for him, and if any of us were to run into each other in the real world I'd gladly buy the first round; Dave Barron, Dave Baumann or BFG, just because we argue a lot, it certainly doesn't mean that I have disrespect for them). A lot of our major discussions were around what was important in the industry, which direction it was headed in and what technology would be important. In terms of the internal workings of 3D hardware Dave Barron certainly knows a lot more than I do, but in the end that knowledge didn't change the fact that he didn't see things happening that I thought were obvious (and did happen, BTW).

Reading some of Dave Baumann's most recent articles I see that he is still convinced that DX9 is seeing a very rapid adoption in the market by games; this is the kind of complete disconnect from what is actually happening that makes it easier for me to listen to BFG on a lot of issues than to Dave. Anyone who is a big gamer and knows a reasonable amount about the technology driving the games knows that DX9 to date has been a non-factor almost entirely. A year after DX9's release we have a whopping two titles that take advantage of any 2.0 pixel shaders, and only one of them is worth owning (by any reasonable standard). Neither of those performs very well on any current hardware either. Within the next ~six months it looks like we will see another small handful of games, and that's it. In older DX versions that length of time was the life cycle of the version, and we had a lot of DX6 games in its era, a lot of DX7 games in its era, and a lot of DX8 games in its era. With DX9, we are only seeing that due to the significantly lengthened amount of time between releases.

With BFG there isn't any real disagreement between us about what is happening in terms of the game market at the moment, we both know what's going on right now. There is no need to debate if the sky is blue or not ;)

It won't matter if you talk to game developers every day or work with Ati and Nvidia, he will still be unyielding in his own little opinions.

Is talking to game developers supposed to be a big deal? And forget working with ATi or nVidia, how about for them? I'm not arguing upcoming parts with Dave as I'm absolutely certain that he has far more information than I do on at least one of them, but other than that, what good do you think it is to talk to the IHVs? For developers, let's say that there is selective listening that regularly happens. HL2 info comes out about horrendous performance and rendering errors and it's gospel, despite the game being six months (a year?) away from release. Halo's developers come out and say a shipping game has issues with one of the IHV's parts and it is ignored. I don't buy into either of them; IME developers are easily swayed by IHVs' PR/DevRel (or promo funds), not to mention most devs don't game anywhere near as much as your typical gamer.