

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: BFG10K
The IQ drops, retard, therefore it is a cheat.
Please don't call me a retard.

If you people think optimising is a cheat then you'd better disable SSE, SSE2, 3DNow!, and Hyper-Threading, because **GASP** those are hard-coded, application-specific optimisations too.
The issue isn't the application itself (which can obviously do anything it likes); the issue is shared code such as an API or drivers, which react to multiple applications hooking into them.

Ironically you bring up SSE - it works for any app. Any app can come along, split its instructions into parallel chunks and ask the processor to handle them more efficiently.

Now imagine if its instructions worked only for the most popular games and it didn't do jack for any of the others. Imagine if SSE had checks like "if app == Quake 3 do this; else do something else".

Would you still feel it's a valid SIMD instruction set? Because doing something like that is no different to tailoring a driver to artificially boost performance for one application and leaving the rest alone.

I know the difference, I'm just pointing out your idiotic logic.
Actually it appears that you have no idea what the difference is.

If an optimisation gives the same result, and runs faster, WHAT IS THE FRIGGIN DIFFERENCE.
There can be and there is a big difference depending on how it's done.

A friggin monkey could grasp this concept, but apparently it is above you, TheSnowman.
You are the one having trouble grasping the concepts here and you also call everyone else names.

So you read what I'm saying but you still don't get it. I am saying that if SSE had specific code that said "if app == Quake 3 do this; else do something else" and it made the app run faster with the same result, that would be fine. It is an application-specific optimisation. What you are saying is that because things are done differently it's not "fair" to the other manufacturers; better occlusion detection is not a problem. What XGI does (dramatic loss in IQ in certain apps) definitely is a problem. If you can't tell the difference, why does it matter? That's what it all comes down to.

The playing field isn't supposed to be level; I know exactly what a static clip plane is. It's code that basically says "under no circumstances do you render this", and if at no point in the demo you see that piece, it looks identical. Solution? Don't use crappy synthetic benchmarks; play real games and use average fps. Cameras on rail paths are no way to benchmark, because hard-coded clip planes and hundreds of other things can be done to inflate performance. Application detection isn't the issue, Futuremark's POS benchmark is. If they used a freeform camera, hard-coded clip planes would obviously degrade the IQ, but they don't, so it doesn't.
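Roughly what I mean by a static clip plane, in a made-up sketch (the plane values and names are invented, obviously not the actual benchmark or driver code):

```c
/* made-up illustration of a hard-coded static clip plane */
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

/* plane ax + by + cz + d = 0; anything on the far side is never drawn.
   The values only "work" because the rail camera never looks past it. */
static const float plane[4] = { 0.0f, 0.0f, 1.0f, -50.0f };

static int beyond_static_clip_plane(vec3 p)
{
    return plane[0]*p.x + plane[1]*p.y + plane[2]*p.z + plane[3] > 0.0f;
}

static void draw_object(vec3 c)
{
    if (beyond_static_clip_plane(c)) {   /* "under no circumstances
                                             do you render this"      */
        printf("skipped  (%.0f, %.0f, %.0f)\n", c.x, c.y, c.z);
        return;
    }
    printf("rendered (%.0f, %.0f, %.0f)\n", c.x, c.y, c.z);
}

int main(void)
{
    vec3 in_view    = { 0, 0, 10 };   /* where the camera path looks   */
    vec3 never_seen = { 0, 0, 80 };   /* geometry the path never shows */
    draw_object(in_view);
    draw_object(never_seen);
    return 0;
}
```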
 

DAPUNISHER

Super Moderator, CPU Forum Mod and Elite Member
Aug 22, 2001
32,039
32,525
146
Originally posted by: rimshaker
This sure is a lot of discussion going on for no more useful discussions, heh.
:D
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
An issue that has been fixed (UT IQ).
Well it's actually mutated into all Direct3D apps instead of UT2003. Also the driver detection strings are still left inside the drivers.

The fact that they won't certify any PowerVR drivers is what is BS, not what you stated about FM not allowing it.
Judging by your comments about how PowerVR's drivers work, I'd have to say that FutureMark have extremely good reasons to exclude them. I mean they basically violate all of their rules for heaven's sake.

Talk to some NV3X users who played Halo (just to use an example that has had a decent amount of conversation) with the 4x.xx drivers and then the 5x.xx series.
Well I agree that there are some legitimate optimizations going on. I certainly don't believe that all of nVidia's performance gains are cheats as that would simply be a ludicrous stance to take.

If ATi is profiling drivers to remove bottlenecks they seem to put them back in a bit too often. I can actually think of numerous hard-coded optimizations that are a good thing to have (most of them with PVR's parts, but still).
Well, it's a tough game, writing and optimizing drivers that work uniformly well in all applications. I mean nVidia have problems in almost every single release too (I think the latest 5xxx drivers cause hardlocks and fog problems in ET).

What happens when you have a case where profiling the driver in one fashion for one group of titles slows another group of titles down?
I can't answer that exactly as I'm not sure how a driver developer would deal with that (I do write code but I don't write drivers). I'd imagine they'd try to strike the best balance between the two groups of titles. I mean it's often the case where certain drivers run certain games better than others do - I've certainly seen this many times with a wide range of vendors.

Enabling a switch in the driver to use the optimized path based on the application is still wrong?
Yes, it is. If you start down that path you open a can of worms that ultimately ends up with the horrendous mess that XGI have given us. Hard-coding drivers around applications might give you a temporary advantage because of the extra performance but it's simply not sustainable and in the long run generic optimizations will come out the winners.

You can be sure other forms of shared code such as DirectX or MFC don't have hard-coded optimizations that revolve around certain applications. The goal of any shared code should be to attain universally good performance across the board without employing brittle pathways that fall over if you even look at them funny.

It should have no problem at all outputting significantly higher framerates than we are seeing even with the optimized/ugly defaults that the drivers are currently using.
That's correct, and I believe the results we're seeing are no accident for this very reason. I mean a few problems in one or two titles I could believe (a la Quack), but not this. If renaming the executables makes such a large and unacceptable change, it's simply no accident that it's happening.

3DM2K3? They stated it was overzealous optimizations by someone on the driver team IIRC (paraphrased).
Yes and they apologized and removed it. Perez certainly hasn't done anything like that and in fact he's stated that the fiasco will continue indefinitely.

Look at ATi's comments and nVidia's comments on the issue and then ask yourself: which company is trying to give their customers the best possible gaming experience overall, and which company is only trying to lengthen the benchmark bars in the most popular benchmarked software?

Yes, I know it's PR but even here the two sides have a very different stance on the issue.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
So you read what I'm saying but you still don't get it.
I get it perfectly; you're still wrong.

I am saying that if SSE had specific code that said "if app == Quake 3 do this; else do something else" and it made the app run faster with the same result, that would be fine.
And again I'll point out that your position is completely and utterly wrong.

Do you really expect Intel to slap extra logic gates onto their SSE unit so that it only kicks in when it detects certain executable strings of the most popular benchmarks? Likewise, do you expect Microsoft to build hard-coding into their Windows APIs that boosts certain apps but not others?

Such a stance is simply ludicrous. No shared code or instruction set should ever do something like that.
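To make the distinction concrete, here's a rough toy sketch (the executable name and functions are invented purely for illustration; this isn't taken from any real driver or instruction set). A generic optimization keys off what the workload asks for, while app-detection keys off who is asking:

```c
/* toy illustration only - invented names, not real driver code */
#include <stdio.h>
#include <string.h>

/* generic path: any application that requests the feature gets it */
static void dispatch_generic(int wants_parallel_path)
{
    printf(wants_parallel_path ? "use SIMD path\n" : "use scalar path\n");
}

/* hard-coded path: only a recognised executable gets the fast route */
static void dispatch_by_app(const char *exe_name)
{
    if (strcmp(exe_name, "quake3.exe") == 0)   /* hypothetical check */
        printf("use hand-tuned fast path\n");
    else
        printf("use default path\n");          /* everyone else: nothing */
}

int main(void)
{
    dispatch_generic(1);                  /* any app benefits            */
    dispatch_by_app("quake3.exe");        /* the popular benchmark wins  */
    dispatch_by_app("someothergame.exe"); /* the rest are left alone     */
    return 0;
}
```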

What you are saying is that because things are done differently it's not "fair" to the other manufacturers; better occlusion detection is not a problem.
It's not "better" at all. It can only exist inside a certain benchmark and it can never exist in valid gameplay, and for that reason it's a blatant cheat.

Again I'll ask, do you only ever watch the same benchmark running over and over again or do you actually play the games you purchase?

If you can't tell the difference, why does it matter?
Because it cannot exist in the real world and hence it only serves to artificially inflate performance, performance which the end-user will never see. For that reason it's both invalid and cheating.

I know exactly what a static clip plane is.
I very much doubt that.

It's code that basically says "under no circumstances do you render this", and if at no point in the demo you see that piece, it looks identical.
And that's exactly why it's invalid.

Solution? Don't use crappy synthetic benchmarks;
The type of benchmark is irrelevant. What is relevant is that it is a benchmark and, by definition, it does the same thing over and over again. Because of this you can use prior knowledge to artificially inflate performance, and that prior knowledge can never transfer into a real gaming situation.

Don't use benchmarks as a scapegoat for cheating; the problem is the cheating, not the benchmarks.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
OK, so write NVIDIA a letter, because no one seems to care except about 2000 geeks across various bulletin boards.

I'm done beating this to death.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: reever
Jesus, why do you people even bother trying to sway Ben's opinion on anything? If Dave can't do it, nobody can. He will continue to believe whatever he thinks he is reading across the internet, whether it is true or not, and then present it in a way that suits his own agenda. It won't matter if you talk to game developers every day or work with ATi and nVidia, he will still be unyielding in his own little opinions. This thread is going *nowhere*; I second the call for a lock.

Mods, please lock this... this battle of "wits" and endless bickering and measuring of penises needs to stop.

BTW the plural of penis is penes.

It can be spelled either way.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Well it's actually mutated into all Direct3D apps instead of UT2003. Also the driver detection strings are still left inside the drivers.

The IQ issue has been fixed. The overall D3D optimization is now actually superior to how ATi is 'cheating', both in how close it is to what it should be and in how good it looks. Brilinear isn't a great example to use of nV 'cheating', as ATi is doing much worse (straight bilinear when trilinear + AF is requested, except in a small handful of apps).

Judging by your comments about how PowerVR's drivers work, I'd have to say that FutureMark have extremely good reasons to exclude them. I mean they basically violate all of their rules for heaven's sake.

'Their rules' are a joke. Why not have a rule where drivers that hard lock games are ruled out? Any game, auto disqualification. That is an issue of much, much greater concern to end users than whether a company is using per-app optimizations. PowerVR's optimizations overall work extremely well, and when you do run into issues they get back to you on how to fix them very quickly (nV takes quite a while; I've been waiting almost a year now for ATi to tell me how to fix my R9500 Pro ;) ).

I can't answer that exactly as I'm not sure how a driver developer would deal with that (I do write code but I don't write drivers). I'd imagine they'd try to strike the best balance between the two groups of titles. I mean it's often the case where certain drivers run certain games better than others do - I've certainly seen this many times with a wide range of vendors.

You would be happy taking a 10% hit in a large group of games over having them all running at full speed? That seems ludicrous to me. This is actually what PowerVR does with their app-specific optimizations. A lot of what they do is work around issues due to the nature of TBR, but a great deal of their per-app optimizations are pulled from a group of common settings. Certain groups of titles need to use the W-buffer instead of Z, as an example (due to depth accuracy on a tile level running into issues with the narrower Z-buffer), but the W-buffer does have a slight performance hit so it shouldn't be used when not necessary. I do not, in any way, see anything wrong with using app detection to make adjustments for such issues. If you think ATi is immune from doing odd things with certain apps, take a screenshot in Halo in one of the shader-intensive areas without AF and then with 16x quality AF.
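As a rough sketch of the kind of per-app adjustment I mean (the table and names are entirely made up, not PowerVR source), the app detection only selects a workaround profile; the rendered output isn't changed:

```c
/* invented example of a per-app settings profile - not real driver code */
#include <stdio.h>
#include <string.h>

enum depth_mode { DEPTH_Z, DEPTH_W };

struct app_profile {
    const char     *exe_name;   /* hypothetical executable names */
    enum depth_mode depth;
};

/* titles known to hit tile-level depth precision problems get W-buffer */
static const struct app_profile profiles[] = {
    { "depthhungrygame.exe", DEPTH_W },
    { "oldsim.exe",          DEPTH_W },
};

static enum depth_mode pick_depth_mode(const char *exe)
{
    size_t i;
    for (i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return profiles[i].depth;
    return DEPTH_Z;   /* default: Z, since W carries a small speed hit */
}

int main(void)
{
    printf("depthhungrygame.exe -> %s\n",
           pick_depth_mode("depthhungrygame.exe") == DEPTH_W ? "W-buffer" : "Z-buffer");
    printf("randomtitle.exe     -> %s\n",
           pick_depth_mode("randomtitle.exe") == DEPTH_W ? "W-buffer" : "Z-buffer");
    return 0;
}
```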

Another way to think of it, what if ATi came out with a DooM3 specific optimization that increased the performance of their parts by ~30% while outputting identical quality, would that be a problem? I certainly wouldn't think so. I would think that you would be in favor of that even more so than I, as you are the one that is going to have to deal with the performance of the R3x0 cores in that title (unless you plan on stepping up to the next gen).

Yes and they apologized and removed it. Perez certainly hasn't done anything like that and in fact he's stated that the fiasco will continue indefinitely.

IIRC, Derek is the one I was paraphrasing. He has also stated that optimizations will continue, which I see no problem at all with (actual optimizations, not cheats).

Look at ATi's comments and nVidia's comments on the issue and then ask yourself: which company is trying to give their customers the best possible gaming experience overall, and which company is only trying to lengthen the benchmark bars in the most popular benchmarked software?

In terms of actions? nVidia, by a wide margin. If ATi was interested in more than PR they would stop handing out millions of dollars to have devs spread FUD and spend that money on making sure that their drivers worked, at the very least, on the most popular titles. The people griping the most about ATi's drivers are the people who have actually spent a decent amount of time using them, if you'll notice, while the people griping about nV's 'cheats' haven't spent any time with the NV3X in their rig.

ATi's current PR stance is much like 3dfx's old PR: make it seem like you are part of the community, that you are one of them, that you care about what they think. Meanwhile they have failed to devote the resources required to ensure that gamers can play all their games on their parts.

(I think the latest 5xxx drivers cause hardlocks and fog problems in ET).

I wanted to ask this separately: how do you get that to actually happen? I've been wanting to see one of these nV based driver bugs hard lock a system :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BenSkywalker
Brilinear isn't a great example to use of nV 'cheating', as ATi is doing much worse (straight bilinear when trilinear + AF is requested, except in a small handful of apps).

are you just making that up or can you even name one app where this is the case?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
are you just making that up or can you even name one app where this is the case?

I was going to write a really big list but I'll summarize instead. If the game doesn't have AF in its options, you don't get AF + trilinear in game; you get AF + straight bilinear when you force AF in the control panel. This covers well over 95% of all games. This is not some big secret either; it has been widely known for some time, although since it is ATi 'cheating' and not nVidia it doesn't get quite the same level of press ;)

For nV, you get brilinear (which is a lot better than bilinear; most people can't tell the difference between brilinear and trilinear even when looking for it), and that is all the time with AF under D3D, but under OpenGL you get AF and full trilinear in any game (well, if you are using AF anyway). Both of them are hacks and I'm not impressed with either, but they are both 'cheating' by the popular definition amongst those who like to throw the word around; ATi is just doing it with more games and in a more severe fashion in the ones it does it in. There are only a very small handful of titles that ATi runs both AF + trilinear on that nV doesn't, while there are quite a few more going the other way.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
you don't have your facts straight, Ben. when you force quality af in ati's d3d control panel you never get straight bilinear. if a d3d app does not have a texture filtering preference you will still get trilinear for the first mipmap stage and bilinear for the rest. this was not an issue in earlier drivers, but ati obviously felt compelled to follow nvidia's lead and lower the standards of their quality af in order to remain competitive in benchmarks. while they are both cheap hacks and unimpressive to say the least, neither is cheating as long as the app does not request trilinear, and the quality of one solution over the other is highly debatable.

furthermore, just like nvidia, ati also always provides full trilinear with quality af in opengl, so your claim that ati uses their hacks in more games is completely unfounded. however, back to d3d: if the application does request trilinear then it gets full trilinear all the way through, even while af is forced by ati's d3d control panel. on the other hand, with nvidia's control panel forcing af, the application's request for trilinear is denied and you get merely brilinear filtering instead. that is why anyone but a delusional fanboy understands that in this issue nvidia is cheating while ati is not.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: TheSnowman
you don't have your facts straight, Ben. when you force quality af in ati's d3d control panel you never get straight bilinear. if a d3d app does not have a texture filtering preference you will still get trilinear for the first mipmap stage and bilinear for the rest. this was not an issue in earlier drivers, but ati obviously felt compelled to follow nvidia's lead and lower the standards of their quality af in order to remain competitive in benchmarks. while they are both cheap hacks and unimpressive to say the least, neither is cheating as long as the app does not request trilinear, and the quality of one solution over the other is highly debatable.

furthermore, just like nvidia, ati also always provides full trilinear with quality af in opengl, so your claim that ati uses their hacks in more games is completely unfounded. however, back to d3d: if the application does request trilinear then it gets full trilinear all the way through, even while af is forced by ati's d3d control panel. on the other hand, with nvidia's control panel forcing af, the application's request for trilinear is denied and you get merely brilinear filtering instead. that is why anyone but a delusional fanboy understands that in this issue nvidia is cheating while ati is not.

Prove it.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I want to see some image quality shots that compare 8X AF with bilinear filtering to 8XAF with trilinear filtering :D
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Ask and ye shall receive.

Rig:
Intel Pentium 4 2.6C @ 3.15GHz (230FSB)
Asus P4P8X i865 motherboard
1GB OCZ PC4200
Asus FX5600 (non-ultra)

Game: Call of Duty

Settings: AA off, 8xAniso via driver settings

Graphics Settings:
Resolution 1024x768 @ full screen
Character Textures: High
General Textures: High
Texture filter: Bi in one ss, Tri in other
Texture Quality: 32 bit
NVIDIA Distance Fog: Yes

System Settings:
Wall Marks: On
Ejecting Brass: On
Dynamic Lights: Everything
World Dynamic Lighting Quality: Nicest
Model Detail: Maximum
Sync Every Frame: Yes
Corpses: Insane

bilinear

trilinear

Other than a noticeable performance hit, the screenshots look the same (the camera moved very slightly).

Screenshots taken using fraps.

Edit: My ISP really sucks, moving links.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Interesting... no noticeable difference... are you SURE trilinear is really activated and nVidia isn't cheating ;)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Acanthus

Prove it.

sure, just tell me where you live and i'll come by and show you. granted, the only reason i would go through the trouble is because while i'm there i'll give you a solid beating for being such a jackass. :D
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: TheSnowman
Originally posted by: Acanthus

Prove it.

sure, just tell me where you live and i'll come by and show you. granted, the only reason i would go through the trouble is because while i'm there i'll give you a solid beating for being such a jackass. :D

nice proof.

Edit: Wouldn't the performance hit in Call of Duty when trilinear and 8x aniso are enabled completely disprove your blatant lie? And the REALLY funny part is, trilinear doesn't look any better.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
The difference between bi and tri is only really visible in motion, not in screenshots. Obviously the type of graphics displayed affects the obviousness of the MIP-maps, too (large, flat floors will show banding more easily than a foliage-covered hill).
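A quick toy model of why the band shows up in motion (numbers made up, nothing to do with either vendor's hardware): bilinear snaps to a single mip level, so there's a hard step where the level changes, while trilinear blends the two nearest levels so the transition is smooth:

```c
/* toy model of mip selection - illustrative only */
#include <stdio.h>
#include <math.h>

/* pretend each mip level has one "brightness" so the step is obvious */
static float mip_value(int level) { return 1.0f / (float)(1 << level); }

static float sample_bilinear(float lod)      /* snaps to the nearest mip */
{
    return mip_value((int)(lod + 0.5f));
}

static float sample_trilinear(float lod)     /* blends the two nearest mips */
{
    int   lo = (int)floorf(lod);
    float t  = lod - (float)lo;
    return mip_value(lo) * (1.0f - t) + mip_value(lo + 1) * t;
}

int main(void)
{
    float lod;
    /* walk a floor away from the camera: bilinear jumps at lod 0.5, 1.5...
       (the visible band), trilinear changes gradually */
    for (lod = 0.0f; lod <= 2.01f; lod += 0.25f)
        printf("lod %.2f  bilinear %.3f  trilinear %.3f\n",
               lod, sample_bilinear(lod), sample_trilinear(lod));
    return 0;
}
```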
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
sure enough, and a big white mess of snow isn't exactly the optimal texture to show the difference either. regardless, cod is an opengl game, and as i stated above, both ati and nvidia do full trilinear in opengl. so no, Acanthus, you haven't proven anything but the fact that you don't even understand what you are talking about. :D
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
if a d3d app does not have a texture filtering preference you will still get trilinear for the first mipmap stage and bilinear for the rest.

My mistake on that part; you are correct about the first mip transition being trilinear. The rest isn't accurate unless ATi has changed how their AF works in the latest drivers.

this was not an issue in earlier drivers, but ati obviously felt compelled to follow nvidia's lead and lower the standards of their quality af in order to remain competitive in benchmarks.

Their texture filtering has never been that good. Too much aliasing, too many angles they don't touch.

furthermore, just like nvidia, ati also always provides full trilinear with quality af in opengl, so your claim that ati uses their hacks in more games is completely unfounded.

Only if they changed this in the latest drivers.

if the application does request trilinear then it gets full trilinear all the way through, even while af is forced by ati's d3d control panel.

Only if the app has selections for AF and trilinear. If you request trilinear and force AF you still get the hack.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
(straight bilinear when trilinear + AF is requested, except in a small handful of apps).
ATi doesn't do straight bilinear; they do trilinear on the first mip-map at the very least. And with the maximum setting, ATi's first mip-map transition starts further back than nVidia's does, and it's also so far back that you basically need colormips to spot any difference. Heck, I use bilinear AF all the time and I'm more than happy with the IQ.

'Their rules' are a joke. Why not have a rule where drivers that hard lock games are ruled out?
FutureMark's goal is to create a consistent synthetic benchmark whose results can be reasonably compared across the board. They are not concerned about policing the entire driver industry.

You would be happy taking a 10% hit in a large group of games over having them all running at full speed?
Well, it depends on which games. The newer and most popular titles will probably get a favourable bias over the older and less popular titles. I mean the older titles can often get by with brute force if they have to.

Another way to think of it, what if ATi came out with a DooM3 specific optimization that increased the performance of their parts by ~30% while outputting identical quality, would that be a problem?
It depends on how they do it.

He has also stated that optimizations will continue, which I see no problem at all with (actual optimizations, not cheats).
FutureMark does class them as cheats.

If ATi was interested in more than PR they would stop handing out millions of dollars to have devs spread FUD and spend that money on making sure that their drivers worked, at the very least, on the most popular titles.
As opposed to "works best on nVidia" logos plastering the game splash screens, games where ATi is often the leader?

I wanted to ask this separately: how do you get that to actually happen?
I didn't, but a few people have been complaining about hard locking. Also, some have been having fog/haze problems in ET and MOHAA.

I've been wanting to see one of these nV based driver bugs hard lock a system :)
One of the official 4xxx drivers could consistently hardlock Serious Sam 2 on my Ti4600. I've also had one crash on my 9700 Pro in Combat Flight Simulator 3.