Please lock this thread. No more useful discussion going on.


Acanthus

Lifer
Aug 28, 2001
Originally posted by: TheSnowman
sure enough, and a big white mess of snow isn't exactly the optimal texture to show the difference either. regardless, cod is an opengl game and as i stated above both ati and nvidia do full trilinear in opengl. so no Acanthus, you haven't proven anything but the fact that you don't even understand what you are talking about. :D

What game do you suggest that has explicit bilinear and trilinear settings? Out of all the games I have, Call of Duty was the only one with the option.
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker
My mistake on that part, you are correct on the first mip transition being trilinear, the rest isn't accurate unless ATi has changed how their AF works with the latest drivers.

the only changes they have made to af on their dx9 cards are using bilinear on the 2+ mipmap stages in d3d, and fixing quality af to force trilinear in opengl regardless of application settings. the former came after nvidia went to brilinear and the latter was way back in the 2.3 catalysts or so.

Originally posted by: BenSkywalker

Their texture filtering has never been that good. Too much aliasing, too many angles they don't touch.

lol, i know you never liked it and i will freely admit that the af is better on the fx cards at certain angles and the geforce3's and geforce4's af looks better yet, but many people would argue with your comment that it "has never been that good". personally i think it is quite good and what image quality is lost is well worth the smaller performance hit. regardless, the af touches all angles, maybe not enough to suit you but to say that they "don't touch" certain angles is flat out wrong.

Originally posted by: BenSkywalker


Only if they changed this in the latest drivers.

that is not the case. full trilinear in opengl has always been an option as long as the app requests trilinear, and has been working in apps that don't for well over a year. see the full trilinear af in opengl shots at beyond3d.


Originally posted by: BenSkywalker
Only if the app has selections for AF and trilinear. If you request trilinear and force AF you still get the hack.

doh, my mistake here. i had been using rtool to force proper trilinear with af in all d3d apps since ati started using their hack, but i went to test things out and sure enough my memory was flawed on this point.

Originally posted by: Acanthus
What game do you suggest that has explicit bilinear and trilinear settings? Out of all the games I have, Call of Duty was the only one with the option.

well ut2003 has a usetrilinear=true/false in the .ini to let you toggle between bilinear and trilinear and a levelofanisotropy= setting as well, but whether you turn up the in-game settings or force quality af in the control panel you will never get trilinear, only brilinear. if you don't have ut2003 or just want to read up about the situation more, i dug up this article here:

http://www.3dcenter.de/artikel/ati_nvidia_treiberoptimierungen/index_e.php
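Since the terms keep coming up: bilinear filters within one mip level, trilinear blends between two adjacent mip levels, and "brilinear" only blends in a narrow band around the transition and falls back to bilinear elsewhere. Here is a rough sketch of the idea in C++; it is only an illustration, not anyone's actual driver code, and the width of the blend window is an assumption made up for the example.

#include <cmath>

// Rough illustration of the blend weight between mip level n and n+1 for a
// given level-of-detail "lod" (n = floor(lod)). Not real driver code; the
// brilinear band width below is an assumption for the example.
float mipBlendWeight(float lod, bool trilinear, bool brilinear) {
    float frac = lod - std::floor(lod);
    if (!trilinear)                         // plain bilinear: snap to the nearest mip,
        return frac < 0.5f ? 0.0f : 1.0f;   // the hard switch is what shows up as banding
    if (!brilinear)
        return frac;                        // full trilinear: smooth blend across the whole range
    // "brilinear": bilinear most of the way, with a short blend window around
    // the point where the mip switch would otherwise pop into view.
    const float band = 0.15f;               // assumed half-width of the window
    if (frac < 0.5f - band) return 0.0f;
    if (frac > 0.5f + band) return 1.0f;
    return (frac - (0.5f - band)) / (2.0f * band);
}

The practical upshot is that brilinear keeps most of bilinear's speed while hiding the worst of the mip banding, which is why it is hard to spot on some textures (like a big white mess of snow) and obvious on others.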
 

BenSkywalker

Diamond Member
Oct 9, 1999
BFG-

Heck, I use bilinear AF all the time and I'm more than happy with the IQ.

I'd suggest an optometrist ;)

FutureMark's goal is to create a consistent synthetic benchmark whose results can be reasonably compared across the board.

And they have this lovely bridge for sale I heard.

FutureMark does class them as cheats.

This ties in with your last quote. Futuremark in the case of PowerVR is doing everything in their power to make certain that they can not be compared in any realistic sense with anyone else. If PowerVR does the exact same thing for 3DM that they do for every game, why the hell shouldn't it be considered valid? They do not care about being fair, they do not care about creating a level playing field for the sake of comparison. What their agenda is I'm not sure, but it quite clearly has nothing to do with making a synthetic bench that can be used to reasonably compare the IHVs.

Well it depends what games. The newer and most popular titles will probably have a favourable bias over the older and non-popular titles. I mean the older titles can often get by with brute force if they have to.

DooM3 10% slower and HL2 10% slower or both at full speed, which would you want? In order to get both to full speed they need to use app detection in this scenario. You really going to try and tell me that you wouldn't want full speed?

It depends on how they do it.

By using a custom driver profile for DooM3. You would be willing to give up ~30% performance (for this hypothetical) for no reason other than that it requires a driver profile for the game? With D3 in particular the odds are quite high that you will own a decent-sized pile of games based on the engine over the next few years (I'd bet on it actually), and the D3 profile will almost certainly offer benefits to any of the games based on the engine also.
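For anyone wondering what a "driver profile" mechanically is in this argument: the driver looks at which executable loaded it and picks a set of tuning knobs for that title. A hypothetical Win32 sketch of the idea follows; the profile contents and the games checked for are made up for the example, and this is not meant to represent nVidia's or ATi's actual driver code.

#include <windows.h>
#include <cctype>
#include <string>
#include <algorithm>

struct DriverProfile {
    bool allowBrilinear;   // hypothetical tuning knobs
    int  shaderPathId;
};

// Pick a per-application profile based on the host executable's name.
// Purely illustrative; a real driver keys off far more than the filename.
DriverProfile selectProfile() {
    char path[MAX_PATH] = {};
    GetModuleFileNameA(nullptr, path, MAX_PATH);   // e.g. "C:\\Games\\Doom3\\Doom3.exe"
    std::string exe(path);
    std::transform(exe.begin(), exe.end(), exe.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });

    if (exe.find("doom3.exe") != std::string::npos)
        return { false, 3 };   // assumed Doom3-tuned settings
    if (exe.find("ut2003.exe") != std::string::npos)
        return { true, 1 };    // assumed UT2003-tuned settings
    return { false, 0 };       // generic defaults for everything else
}

Whether you call that a profile or a cheat comes down to whether the per-title settings change what gets rendered, which is exactly what the rest of this thread is arguing about.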

As opposed to "works best on nVidia" logos plastering the game splash screens, in games where ATi is often the leader?

If I was having hard locks with nV's drivers you can bet your @ss I'd be blasting TWIMTBP.

I didn't but a few people have been complaining about hard locking. Also some have been having fog/haze problems in ET and MOHAA.

Do you recall which forum it was on? I'd like to see one of these hard lock nV bugs. The fog issue I have heard from several people, they also have a problem with current drivers and shadows/flickering in RiseOfNations, but no hard locks.

One of the official 4xxx drivers could consistently hardlock Serious Sam 2 on my Ti4600. I've also had one crash on my 9700 Pro in Combat Flight Simulator 3.

Do you recall which 4x.xx it was? I think I have all of the official ones on my HD(a good chunk of the betas also), I'll install them and give SS:SE another play through(fun game and short).

TheSnowman-

lol, i know you never liked it and i will freely admit that the af is better on the fx cards at certain angles and the geforce3's and geforce4's af looks better yet, but many people would argue with your comment that it "has never been that good".

For 'has never been that good', BFG is happy with bilinear<shrugs>. I've always been anal about texture filtering, that's why I'm not running a FX right now and one of the reasons I dumped the R300 board I had(the slew of driver issues was the main factor however).

regardless, the af touches all angles, maybe not enough to suit you but to say that they "don't touch" certain angles is flat out wrong.

There is no AF applied at certain angles at all, it is straight isotropic filtering. I think it was XBit that had a review up of S3's new part, check out the half pipe filtering test they have and look at ATi's images.
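For readers who haven't followed the angle-dependence argument: the complaint is that the hardware scales back the degree of anisotropy for surfaces that aren't near the 0/90-degree axes, so an oblique wall can end up with far fewer samples than a floor. A toy model of the effect is below; the cosine falloff is invented purely to make the behaviour visible and is not ATi's (or anyone's) actual hardware formula.

#include <cmath>
#include <algorithm>

// Toy model: reduce the allowed anisotropy degree as the surface angle moves
// away from the 0/90-degree axes. The cos(2*angle) falloff is an assumption
// chosen for illustration, not a real hardware formula.
int effectiveMaxAniso(float surfaceAngleDegrees, int requestedMaxAniso) {
    float a = surfaceAngleDegrees * 3.14159265f / 180.0f;
    float factor = std::fabs(std::cos(2.0f * a));   // 1.0 at 0/90 degrees, near 0 at 45 degrees
    int degree = static_cast<int>(std::round(requestedMaxAniso * factor));
    return std::max(1, degree);   // clamps at 1, i.e. plain isotropic filtering
}

In a model like this the 45-degree case bottoms out at ordinary isotropic filtering, which is the "don't touch certain angles" reading, while anything short of that still gets some anisotropy, which is the other reading; the half-pipe test images show which one a given card actually does.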

that is not the case. full trilinear in opengl has always been an option as long as the app requests trilinear, and has been working in apps that don't for well over a year. see the full trilinear af in opengl shots at beyond3d.

They have AF selected in the application there. SeriousSam(both of them) have a menu option for AF which does allow you to get AF+full trilinear.

well ut2003 has a usetrilinear=true/false in the .ini to let you toggle between bilinear and trilinear and a levelofanisotropy= setting as well, but whether you turn up the in-game settings or force quality af in the control panel you will never get trilinear, only brilinear. if you don't have ut2003 or just want to read up about the situation more, i dug up this article here:

That's not just UT2K3, that's every D3D app with nV right now. Running AF under D3D you don't get trilinear as of now, only brilinear. I need a bit more hands on time with it before I pass full judgement, but a lot of people couldn't pick out PVR's box filter and it jumped out of the screen and slapped me across the face......hard ;)
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker
There is no AF applied at certain angles at all, it is straight isotropic filtering. I think it was XBit that had a review up of S3's new part, check out the half pipe filtering test they have and look at ATi's images.

scroll down the page on that b3d link and you will see images from Xmas's test app, and there is anisotropy at all angles.


Originally posted by: BenSkywalker
They have AF selected in the application there. SeriousSam(both of them) have a menu option for AF which does allow you to get AF+full trilinear.

but i wasn't referring to serious sam, again scroll down the page a bit.


Originally posted by: BenSkywalker
That's not just UT2K3, that's every D3D app with nV right now. Running AF under D3D you don't get trilinear as of now, only brilinear. I need a bit more hands on time with it before I pass full judgement, but a lot of people couldn't pick out PVR's box filter and it jumped out of the screen and slapped me across the face......hard ;)

Acanthus asked for a game where he could see what is going on, so i gave him a game. are you just speed reading looking for arguments or something?


oh and your "futuremark considers pvr to be cheating" argument is bunk. they have no reason to have anything against tile-based rendering; futuremark just isn't going to waste their time certifying drivers for a dx7-level card to be used in a dx9-level benchmark.
 

BFG10K

Lifer
Aug 14, 2000
If PowerVR does the exact same thing for 3DM that they do for every game, why the hell shouldn't it be considered valid?
So FutureMark is supposed to accept cheating drivers because they cheat in other games too?

FutureMark's rule is "no cheating in 3DMark allowed". The rule isn't "no cheating in 3DMark allowed unless you also cheat everywhere else too".

You really going to try and tell me that you wouldn't want full speed?
I want full speed but I don't want the problems caused by application detection. Say a new version of Doom 3 is released and the optimizations fall over. What then? I can wait until the next drivers come and hopefully put performance back to how it was or I can use the old version of Doom3 and be forced to endure the problems that have been fixed in the newer version.

Now if ATi managed a 5% gain without app detection, that would be better in the long run since it wouldn't matter about future Doom3 patches and it wouldn't require the driver devs redoing the application detection each time the game was patched.

You would be willing to give up ~30% performance(for this hypothetical) for no reason other then it requires a driver profile for the game?
You're making the flawed assumption that it's not possible to match or better such a speed gain by legitimate means.

With D3 in particular odds are quite high that you will own a decent sized pile of games based on the engine over the next few years(I'd bet on it actually), and the D3 profile will almost certainly offer benefits to any of the games based on the engine also.
If it benefits more than Doom3 then it's more than just simple application detection.

Do you recall which forum it was on?
Yes, it was at Ars.

Do you recall which 4x.xx it was?
I think it was 44.43 but I'm really not sure as it was almost two years ago now.

I'll install them and give SS:SE another play through(fun game and short).
SS:SE isn't that short. In fact it's quite a decent length, especially if you explore all the secret areas and play it on the hardest difficulty level.
 

BenSkywalker

Diamond Member
Oct 9, 1999
scroll down the page on that b3d link and you will see images from Xmas's test app, and there is anisotropy at all angles.

Uhhh, download that app and check it out for yourself and then compare it to what XBit was looking at.

but i wasn't referring to serious sam, again scroll down the page a bit.

And they enable AF through the test app there too.

BFG-

So FutureMark is supposed to accept cheating drivers because they cheat in other games too?

FutureMark's rule is "no cheating in 3DMark allowed". The rule isn't "no cheating in 3DMark allowed unless you also cheat everywhere else too".

If you would stop kicking dogs this conversation would be easier :) You have nothing remotely resembling any proof that PVR is cheating in anything, and yet you are still jumping to that conclusion defying all logic to the contrary.

Say a new version of Doom 3 is released and the optimizations fall over. What then?

Cite me an example. Show me a single time that that has happened and I'll start applying weight to it. PowerVR has been doing app-specific optimizations since they started in the PC sector; you should have no problem whatsoever listing a slew of examples of this if it were a real issue. It certainly can't be any worse than ATi's "let's break a game per driver release" (this month- Tron 2.0) :p

You're making the flawed assumption that it's not possible to match or better such a speed gain by legitimate means.

I'm stating that the optimal driver path for D3 is not going to be the same as the optimal driver path for SeriousSam3 and you can quote me on that :)

SS:SE isn't that short. In fact it's quite a decent length, especially if you explore all the secret areas and play it on the hardest difficulty level.

My first play through the game took me ~4.5 hours on one notch above Normal. Currently I'm playing Morrowind GOTY(PC version, already have the original XBox version although that doesn't have Tribunal and Bloodmoon), at the rate I'm going I should finish it off in just a shade under 200 hours(not a typo, two hundred). SS:SE is damn short. Anything under 20 hours is short, under 10 hours is way too short for a $50 game(although for $20 I can live with it). Get away from shooters some time and you'll realize how sadly lacking they are in length ;)
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker


Uhhh, download that app and check it out for yourself and then compare it to what XBit was looking at.

if you want me to look you should post a link.

Originally posted by: BenSkywalker


And they enable AF through the test app there too.

no, the app works off of control panel settings; download it and try it for yourself if you like.
 

BenSkywalker

Diamond Member
Oct 9, 1999
if you want me to look you should post a link.

You posted the link ;) The page that has the AF info on it @B3D has a download link for the file, it's real small too (less than 100K).

no, the app works off of control panel settings; download it and try it for yourself if you like.

I have the app.
 

kylebisme

Diamond Member
Mar 25, 2000
i mean the link to "what xbit was looking at", xbit looks at a lot of things. as for Xmas's aa tester, since you have it you should be able to quickly see for yourself that it goes by control panel settings and doesn't give you any af when the control panel is set to application preference.
 

BFG10K

Lifer
Aug 14, 2000
You have nothing remotely resembling any proof that PVR is cheating in anything,
They use application detection don't they?

Cite me an example. Show me a single time that that has happened and I'll start applying weight to it.
Build 340 of 3DMark03.

I'm stating that the optimal driver path for D3 is not going to be the same as the optimal driver path for SeriousSam3 and you can quote me on that :)
That's true, and it'll probably be the case that ATi will put the bias onto Doom3 by the time it rolls out.

My first play through the game took me ~4.5 hours on one notch above Normal.
4.5 hours? It took me 12-15 on the hardest difficulty.

Did you just race through the entire game or did you actually take the time to explore everything? How many secret levels and areas did you find? Did you kill all of the hidden monsters?

Get away from shooters some time and you'll realize how sadly lacking they are in length
I tend to finish shooters at least five times and the most I've finished a game is Quake (eleven times). How many times do you finish your games?
 

Blastman

Golden Member
Oct 21, 1999
Another way to think of it, what if ATi came out with a DooM3-specific optimization that increased the performance of their parts by ~30% while outputting identical quality, would that be a problem? I certainly wouldn't think so. I would think that you would be in favor of that even more so than I, as you are the one that is going to have to deal with the performance on the R3x0 cores in that title (unless you plan on stepping up to the next gen).

That's true if they are legitimate optimizations. And I could possibly see ATI eventually being forced into application-specific optimizations simply to equal out whatever NV is doing. But the real question here is how NV is achieving these gains. You assume NV's application detections are inserting legitimate optimizations. But if NV has legitimate application-specific optimizations they can use -- why did they resort to clip planes in 3Dmark2003? At this point I'm going to view NV's application detection as hacks and with great skepticism unless they can convince me otherwise.
 

BenSkywalker

Diamond Member
Oct 9, 1999
as for Xmas's aa tester, since you have it you should be able to quickly see for yourself that it goes by control panel settings and doesn't give you any af when the control panel is set to application preference.

When I set app preference in the control panel I get whatever level of AF I select in the app. Big slider, top says 'Max Aniso', does it for me(although I know my drivers are working properly).

BFG-

They use application detection don't they?

So?

Build 340 of 3DMark03.

A slight performance hit is the dire consequence you are so afraid of....?

That's true, and it'll probably be the case that ATi will put the bias onto Doom3 by the time it rolls out.

And you would be opposed to having a switch that loads the optimal profile for SS3 built in? Instead you would rather dance around to different drivers for optimal performance?

4.5 hours? It took me 12-15 on the hardest difficulty.

Did you just race through the entire game or did you actually take the time to explore everything? How many secret levels and areas did you find? Did you kill all of the hidden monsters?

I found probably about half of the secrets (first play through), still haven't found them all in that one. Remember, at the time that came out I was reviewing games, keeping GB going and working my 9-5; I wasn't going to quadruple my time with the game when I had so many others waiting (I didn't write the review for that one BTW).

I tend to finish shooters at least five times and the most I've finished a game is Quake (eleven times). How many times do you finish your games?

Depends on the game greatly. Max Payne I played through exactly once, and I wouldn't do it again if you paid me the $50 I paid for the game. Disgustingly poor piece of trash. Take the original Zelda and I'd say it's easily over a hundred (I got so I could go through the game and play the second and then repeating quests, each one taking me about 3-4 hours without a single death). Back in my younger days when the NES was around I didn't have the disposable income for games I do now, and I had more time. Right now I have eight or nine games I haven't touched yet; that will likely grow to over a dozen by the time I finish Morrowind (my wife and kids all game too) and DooM3 and HL2 are looming. More than likely, I won't play through any of them more than once for some time (most of them I'll never play through more than once actually) until we hit a lull in terms of decent games hitting, and then I'll play back through the ones I really like. Occasionally this gets screwed up; last vacation I took I had a stack of games and started with KoTOR (XBox, when it first shipped) intending to play through all the games I had backed up. Beat KoTOR three times, hardly touched the others.

For shooters unless they are really good I play through them once or twice. There are exceptions, HL I have played through probably a dozen times, Halo half a dozen, SS and SS:SE probably four times each(not including getting bored and firing up the final level- on either one, for some quick reflex fun). Exploring them to find everything isn't something I tend to do, not on a shooter anyway. I don't have enough time and rarely is it worth the effort(a few of the SS/SS:SE secrets I really liked- the movie set and Santa I really got a kick out of).

Blastman-

You assume NV's application detections are inserting legitimate optimizations.

Why wouldn't I?

But if NV has legitimate application-specific optimizations they can use -- why did they resort to clip planes in 3Dmark2003?

If nV can figure out how to put clip planes in every game without it being visible, all the power in the world to them. ATi and nVidia both publicize the fact that they try and reach that goal (HSR, HZ, etc.) and PowerVR's entire architecture is designed around doing exactly that. Not drawing what isn't seen in real time is ideal, not something you try and avoid. The 3DM2K3 cheat that they did use was an overzealous optimization done by someone who shouldn't have been doing it. Since that time, if you look at where nV's parts are performing under FM's officially supported testing methods (approved drivers/patch etc.), NV's performance is up significantly from prior to the static clip planes cheat. There are an awful lot of ways of optimizing without cheating, even on a per-app basis.
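For context, the static clip plane trick being referred to works like this: the driver inserts fixed clipping planes that discard geometry the benchmark's scripted camera never shows. A minimal legacy-OpenGL sketch of the general technique is below; the plane values are invented for the example, and the point is only to show why it falls apart the moment the viewpoint leaves the expected path.

#include <GL/gl.h>

// Enable a fixed clip plane that throws away everything beyond a hand-picked
// boundary. Legitimate when the application knows its own scene; a cheat when
// a driver injects it for one benchmark's known camera path, because moving
// the camera off that path exposes the missing geometry.
void enableStaticClipPlane() {
    // Plane equation A*x + B*y + C*z + D >= 0 keeps a point; with an identity
    // modelview the camera looks down -z, so this clips everything more than
    // 50 units in front of it. The coefficients are made up for the example.
    const GLdouble plane[4] = { 0.0, 0.0, 1.0, 50.0 };
    glClipPlane(GL_CLIP_PLANE0, plane);
    glEnable(GL_CLIP_PLANE0);
}

The difference from HSR/HyperZ-style culling is that those decide per frame what is actually hidden, while a static plane just assumes the camera will never look there.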
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker
When I set app preference in the control panel I get whatever level of AF I select in the app. Big slider, top says 'Max Aniso', does it for me(although I know my drivers are working properly).

right, but if you do set it in the control panel and leave the slider alone you get your control panel settings, and for ati's quality setting that is full trilinear.
 

BFG10K

Lifer
Aug 14, 2000
What does it say in FutureMark's rules about detecting 3DMark? A great big don't do it.

A slight performance hit is the dire consequence you are so afraid of....?
20% isn't slight. Besides, that's not really what this is about. This is about how fragile application detection and optimizations are.

And you would be opposed to having a switch that loads the optimal profile for SS3 built in?
And where are the switches for nVidia's profile that allows users to control it? Detecting "UT2003.exe" is not a profile or a switch, it's a cheat.

you would rather dance around to different drivers for optimal performance?
I wouldn't dance at all; I'd use the latest version.

Why wouldn't I?
Because all evidence points to the contrary.

Most of the fiasco since the NV30 has been dodgy at best and blatant cheating at worst, and at this stage nVidia has almost zero credibility. I mean you really have to wonder when their PR puts up optimization guidelines and the very next driver release breaks all of them.
 

BenSkywalker

Diamond Member
Oct 9, 1999
What does it say in FutureMark's rules about detecting 3DMark? A great big don't do it.

So Futuremark is now a deity you worship? Think for yourself and rationalize the logic, or complete lack thereof in this case, behind them excluding PVR.

This is about how fragile application detection and optimizations are.

FM went so far as to recode their shaders to work around nV's compiler to slow down the FX with the last patch (feel free to check that out for yourself; they reordered code, as nV's optimizations worked with all apps that used the same code). Why don't you find a single game that this has happened to? If these optimizations are one hundredth as fragile as you make them out to be you should be able to list dozens of games easily. I've already pointed you to the parts you should focus on; I've done everything for your end of this discussion except find any evidence of app-specific optimizations being fragile, with the sole exception of when a company tries to break them. Normal application updates will not have this sort of impact, and I have several years, a slew of drivers and hundreds of games to back me up on this. You have FM.
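The reordering point deserves a concrete picture. If a driver recognizes a specific shader by matching or hashing its exact text and swaps in a hand-tuned replacement, then shuffling two independent instructions changes the fingerprint and the replacement silently stops firing, even though the shader still computes the same thing. A hypothetical sketch of that lookup pattern follows; it is not FutureMark's or nVidia's code, just the general shape of the technique being argued about.

#include <cstddef>
#include <string>
#include <unordered_map>

// Hypothetical shader-replacement table keyed on a hash of the shader source.
// Reordering two independent instructions in the app's shader changes the hash,
// so the hand-tuned replacement is no longer found and the generic path runs.
static const std::unordered_map<std::size_t, std::string> kReplacements = {
    // { std::hash<std::string>{}("...original shader text..."), "...hand-tuned version..." },
};

std::string compileShader(const std::string& source) {
    std::size_t key = std::hash<std::string>{}(source);
    auto it = kReplacements.find(key);
    if (it != kReplacements.end())
        return it->second;   // recognized shader: use the pre-optimized replacement
    return source;           // anything else falls through to the normal compiler path
}

A compiler that genuinely optimizes whatever it is handed would not care about the reordering, which is why both sides treat the patch result as evidence for their position.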

And where are the switches for nVidia's profile that allows users to control it? Detecting "UT2003.exe" is not a profile or a switch, it's a cheat.

The UT2K3 specific optimization is long gone. And yes, detecting UT2K3 would be a profile(by definition actually). You still have not given a single good reason why this should be considered a cheat except that FM has told you how you will think and you follow them blindly, or talk of fragile optimizations that break so easily without giving a single example of this happening unless the developers attempt to do exactly that explicitly.

Because all evidence points to the contrary.

Except it doesn't. Even using FutureMark solely, who seem to tell you how to think, nVidia's performance is up ~30%-40% without clip planes.

Most of fiasco since the NV30 has been dodgy at best and blatant cheating at worst and at this stage nVidia has almost zero credibility.

Has almost zero credibility to you. Do you realize that nVidia has increased their marketshare a decent amount since the launch of the NV30? They have grown in strength in the industry in reality, very far removed from what you think is happening. They have lost credibility with you and people who think like you but that is really a very short term issue. As soon as the situation changes and the IHV you favor is doing the same thing it will be fine.

One other point I want to bring up to you. Do you realize that in the professional market having a driver profile that is entirely based around what application you are using is a major feature? Not talking about $200-$500 consumer parts, but the $2K-$5K professional parts. Exactly what you are lamenting is not only tolerated, but people pay quite a bit extra to make sure they get it. For the record, ATi does it right along with anyone else. You want to start blasting E&S, SGI, HP, Sun et al for doing what you are showing such opposition to? Please start blasting ATi for the same thing in all of your posts from now on, or explain in detail why you don't think they should be. And for the record, ATi focuses their app-specific profiles on the same applications that are most commonly used in pro 3D benches.
 

VIAN

Diamond Member
Aug 22, 2003
Originally posted by: VIAN
I expect nVIDIA to pull ahead in this next battle. I feel that they are being tight lipped for a very good reason.



blind fanboyism?
The KEY word there was EXPECT.

lol Ben, sure it is quite possible that nvidia could take the lead, but best i can tell VIAN saying that he thinks they will without any supporting evidence is nothing more than wishful thinking. also, if you are going with the opinion that the r420 is a "refresh" and your idea of history leads you to presume the performance increase will be similar to the r350 and r360 refreshes, you are greatly underestimating the situation.
Question, do you have any supporting evidence for what you just said? You know, you sound a lot like an ATIboy to me. Blind fanboyism? I am just as blind as you are; we are just going to have to wait and see.

na, the r420 is confirmed to be .13µ, so it is clearly more than a refresh of the .15µ r3xx core. it is more of a bastard child of the r3xx, being basically a souped-up double rv360, with the rv3xx being a half r3xx redesigned for the .13µ process. regardless, how the r420 will compare to the nv40 is a matter of idle speculation for us until we start to see some solid confirmation of specs.
Well, I remember AMD releasing a .13µ part named the Thoroughbred which was the same core as the .15µ Palomino. New manufacturing process, that's all you got. Not enough info for your thought. Keep daydreaming.

About the PowerVR, I have heard that they were releasing a new chip, but I have never actually dealt with one besides the Dreamcast GPU, which was awesome. Great graphics, great power; I'd rather look at that than the Graphics Synthesizer.

i didn't say liar, i said moron. it was completely idiotic to believe that ati would undercut their own product line by releasing a midrange card that keeps up with their top-shelf offerings, even if it is true that someone from ati privately made the claim. more than likely the issue was nothing more than a matter of miscommunication or confusion, which seems to come up more and more often in the articles here at anandtech. regardless, ati never publicly made such a claim; it was nothing more than hearsay.
How is the 9700 Pro top shelf when they just finished releasing their new 9800 Pro? The 9700 Pro is old stuff.

Every time someone buys an ATi card you get so excited you go out and kick a puppy. No one can disprove this to me, therefore it obviously contains some truth, right?
L O L

Quote:
Think about it- Futuremark releases a patch and approves them for use in all drivers saying that they have eliminated all invalid optimizations.


Except that didn't happen. Build 340 was always accompanied with a list of valid drivers that it was tested on.
Valid drivers? Futuremark validated 52.16, but says that it can only be compared to other Nvidia cards. What kind of BS is that? Was there a purpose to not validating other drivers then, or even to looking for Nvidia cheats? Futuremark is BS.

To save rehashing of the thread in here, the final conclusion was that FRAPS 2.0 was showing AA in nVidia's screenshot where there was no AA applied onscreen. OTOH FRAPS 2.0 correctly captured the output of the ATi AA and non-AA modes.

I wouldn't class it as concrete evidence since it's not done by an official hardware site, but something is definitely fishy if FRAPS 2.0 is pulling correct AA out of nowhere when nVidia's on-screen image shows none.
Yeah, well, that time that ATI misplaced a shadow in Splinter Cell, I believe that it was a cheat to get better performance. -Couldn't it be a driver issue? Nvidia has many as does ATI.

BTW Ben, what are your thoughts on the XGI cheats, cheats which are the same kind as some of nVidia's have been?

Using your previously employed logic I'd expect you to think XGI's application detection to be perfectly acceptable and their reduced image quality as simple hacks, right? And the fact that a completely different picture is painted by simply renaming the executable is of no consequence too, right?

So in short, you don't believe XGI are cheating at all.

Correct?
Obviously, XGI thought that their IQ was so high that they needed to lower it so that it may be comparable to the competition. How is that a cheat? You know what, it's a bad cheat. Go to your room. Even prior knowledge and such is not cheating. What about when developers optimize code for a piece of hardware, is that cheating, unfair play? I think that shouldn't be done either. The P4 sucked until certain benchmarks and apps started coding for it, oops, Intel cheats now. It's not handling things in real time, because the application is detecting the chip.

Cries of fanboyism aside, I find it interesting that XGI did a vaguely similar trick to what nV did a few months ago with their budget cards (can't remember if it was the GF4MX or 5200, and if D-L or iXBT noted it). Remember their alternating-frame AA trickery? IIRC, it was substituting 1x2 AA on one frame and 2x1 on the next for 4x AA. I wonder if IHVs still think they can slip this stuff by 3D fansites, or if they just don't think the attention generated matters (in which case, why send out review samples at all)?

Consider these more like bemused rhetorical questions than actual conversation starters. I don't mean to stir up the pot against a particular IHV, I'm just thinking about the industry as a whole. (Thinking of the industry as a whole is part of what got me so upset about nV's actions in 3DM in the first place--fleecing customers hurts everyone in the industry, IMO. At some point, it's got to get more financially attractive to be honest than to attempt to deceive, though perhaps I'm being overly optimistic about the feasibility of advancing 3D tech apace.)
Could it at all be possible that they were trying out different ways to create the same image with better performance? Now, we won't call NTSC cheaters for implementing interlacing, will we? They could just have been seeing the reaction from the public, and hey, for such a POS card, it's better to have something than nothing at all.

Quote:
No negative impact to the end user, it should be perfectly fine.


There's no negative impact of static clip planes if the user doesn't move the camera, therefore it should be perfectly fine.
Increasing performance only in benchmarks but not in gameplay has no negative impact on the user. Therefore it should be perfectly fine.
You know why the argument that it has a negative impact is stupid? The competition can do it too. Why is it called cheating anyway, both can do it and then we will ultimately see who is faster. You can't cheat on this. There is no unfair advantage if anyone can do it.

 

BFG10K

Lifer
Aug 14, 2000
So Futuremark is now a deity unto which you worship?
If you can't follow an argument then don't ask questions about it. I am sick and tired of your constant dancing around issues and ignoring facts.

PowerVR has violated one of FutureMark's rules. Therefore PowerVR is not certified. Put any spin on it you like, dance all you like, but that isn't going to change that fact.

FM went so far as to recode their shaders to work around nV's compiler to slow down the FX with the last patch (feel free to check that out for yourself; they reordered code, as nV's optimizations worked with all apps that used the same code).
They did no such thing. They made trivial changes which should not have broken any genuine optimizations. In fact we had this discussion before and you made the ludicrous claim that it's quite normal for any compiler to make such ridiculous assumptions.

Why don't you find a single game that this has happened to?
Excuse me? Were you asleep when the links to Unwinder's anti-detection tests were posted? Were you also asleep when custom benchmark results were posted showing nVidia taking a nose-dive in benchmarks they usually dominated?

No, you weren't asleep. The reason why you didn't see it is because it's not possible to post any evidence to someone who thinks nVidia can do no wrong.

The UT2K3 specific optimization is long gone.
Ah, so it's an optimization now is it? Not even a hack? It's been elevated by yourself yet again to make nVidia look good?

And yes, detecting UT2K3 would be a profile(by definition actually).
Really? I thought you said it was a hack? Now it's a profile? Interesting.

The next time we have this discussion I suspect you'll elevate it to a generic optimization, claim that there was no application detection to begin with and that in fact some ATi PR monkey was responsible for spreading the myth (possibly even Gabe!).

Except it doesn't.
It most certainly does with you being the sole exception.

Do you realize that nVidia has increased their marketshare a decent amount since the launch of the NV30?
(1) So have ATi.
(2) And?

As soon as the situation changes and the IHV you favor is doing the same thing it will be fine.
Says who?

Do you realize that in the professional market having a driver profile that is entirely based around what application you are using is a major feature?
Yes and you can usually control the profile in the control panel in case it doesn't work. Now show me how to control nVidia's "profiles" (and no, renaming the executable isn't "controlling" them).

Exactly what you are lamenting is not only tollerated but people pay quite a bit extra to make sure they get it.
And that's exactly why it happens. Now tell me, who in the consumer space is paying for nVidia to cherry pick their applications and code them so they make their hardware look better than it really is? Having a dozen or so professional applications is a lot different to dealing with 365+ games a year.

I'm done with this, I really am. I can see there is simply no way to ever change your pro-nVidia bias. I don't know what level of interest you have in the company but there is simply no convincing you otherwise, not even in the face of overwhelming public evidence.

So post a reply if you like but like I said, I'm done with this because there is simply nothing more to be achieved here.
 

BFG10K

Lifer
Aug 14, 2000
Valid drivers? Futuremark validated 52.16, but says that it can only be compared to other Nvidia cards.
And that changes nVidia's cheating how exactly?

I admit though, it is a little strange that FutureMark is listing those drivers as valid. I'm guessing if they didn't then there would be no certified drivers at all from nVidia so they've chosen the ones that cheat the least.

Yeah, well, that time that ATI misplaced a shadow in Splinter Cell, I believe that it was a cheat to get better performance. -Couldn't it be a driver issue? Nvidia has many as does ATI.
Driver issues that magically cause screenshot programs to produce AA when there is none?

Obviously, XGI thought that their IQ was so high that they needed to lower it so that it may be comparable to the competition. How is that a cheat?
That is total and utter garbage.

The P4 sucked until certain benchmarks and apps started coding for it, oops, Intel cheats now. It's not handling things in real time, because the application is detecting the chip.
How is Intel cheating if developers choose to optimize their applications for their architecture? Do you even understand the relationships between code that shares something vs code that is shared?

Could it at all be possible that they were trying out different ways to create the same image with better performance?
Yes, it would be possible if the person lacked even basic understanding of the issues at hand.

Now, we won't call NTSC cheaters for implementing interlacing, will we?
Except NTSC has precisely nothing to do with anything being discussed here.

You know why the argument that it has a negative impact is stupid? The competition can do it too. Why is it called cheating anyway, both can do it and then we will ultimately see who is faster. You can't cheat on this. There is no unfair advantage if anyone can do it.
:rolleyes:
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BFG10K
Valid drivers? Futuremark validated 52.16, but says that it can only be compared to other Nvidia cards.
And that changes nVidia's cheating how exactly?

I admit though, it is a little strange that FutureMark is listing those drivers as valid. I'm guessing if they didn't then there would be no certified drivers at all from nVidia so they've chosen the ones that cheat the least.

also, there is the fact that the 340 patch disabled all the cheats aside from the ps2.0 shader test, and the ps2.0 shader test doesn't affect the overall score.
 

reever

Senior member
Oct 4, 2003
Do you realize that nVidia has increased their marketshare a decent amount since the launch of the NV30?

Actually that is false. They have done nothing but give up marketshare to ATI in all but one sector. Their overall marketshare numbers are less than 4 percent apart
 

VIAN

Diamond Member
Aug 22, 2003
I've given up trying to figure out FM already.

Quote:
The P4 sucked until certain benchmarks and apps started coding for it, oops, Intel cheats now. It's not handling things in real time, because the application is detecting the chip.


How is Intel cheating if developers choose to optimize their applications for their architecture? Do you even understand the relationships between code that shares something vs code that is shared?
The developers are cheating AMD, because it is unfair to them. Cheating is something that only one person can do, making it unfair to the competition.

cheat
v. cheated, cheating, cheats
v. tr.
To deceive by trickery; swindle: cheated customers by overcharging them for purchases.
To deprive by trickery; defraud: cheated them of their land.
To mislead; fool: illusions that cheat the eye.
To elude; escape: cheat death.

v. intr.
To act dishonestly; practice fraud.
To violate rules deliberately, as in a game: was accused of cheating at cards.
Informal. To be sexually unfaithful: cheat on a spouse.

n.
An act of cheating; a fraud or swindle.
One who cheats; a swindler.
Law. Fraudulent acquisition of another's property.
Botany. An annual European species of brome grass (Bromus secalinus) widely naturalized in temperate regions.

Now tell me that ATI doesn't fit into one of these definitions.

Quote:
Could it at all be possible that they were trying out different ways to create the same image with better performance?


Yes, it would be possible if the person lacked even basic understanding of the issues at hand.
What did I misunderstand?

Quote:
Now, we won't call NTSC cheaters for implementing interlacing, will we?


Except NTSC has precisely nothing to do with anything being discussed here.
I was relating NTSC's interlacing as a cheat used to give the illusion that there are more frames than there are. Nvidia using that AA in one frame and not in the other, if the frame rate is fast enough, may not be noticeable and may appear to look smoother than none at all, just like interlacing showing half the image on one frame and the rest on another.
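To make the analogy concrete: the trick described earlier was roughly to render a cheap 2-sample pattern every frame but flip its orientation each frame, counting on a high frame rate to average it into something that reads as 4x in motion. A hedged sketch of the frame-parity selection is below; the sample offsets are invented for the illustration and this is not the vendor's actual implementation.

#include <cstdint>
#include <utility>
#include <vector>

// Pick a 2-sample AA pattern whose orientation flips every frame: 1x2 on even
// frames, 2x1 on odd frames. Averaged over consecutive frames it resembles a
// 4-sample pattern, but any single frame (or screenshot) only has 2 samples.
// Offsets are in fractions of a pixel and are made up for the example.
std::vector<std::pair<float, float>> samplePattern(std::uint64_t frameIndex) {
    if (frameIndex % 2 == 0)
        return { {0.0f, -0.25f}, {0.0f, 0.25f} };    // "1x2": two samples stacked vertically
    return { {-0.25f, 0.0f}, {0.25f, 0.0f} };        // "2x1": two samples side by side
}

That is also why it shows up in still screenshots and edge-crawl tests even when it looks passable in motion, much like interlacing artifacts on a paused frame.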
 

BenSkywalker

Diamond Member
Oct 9, 1999
I'm done with this, I really am. I can see there is simply no way to ever change your pro-nVidia bias.

You blast XGI without merit, I defend them.

You blast PowerVR without merit, I defend them.

You blast NV without merit, I defend them.

This equals pro-nV bias in your eyes, interesting.

They did no such thing.

They reordered the shader instructions. Go over to B3D and check it out, they have FM coders on the boards over there.

Excuse me? Were you asleep when the links to Unwinder's anti-detection tests were posted?

And when did Unwinder start releasing game patches? Your assertion has been that a game update breaking the optimizations is the reason why they aren't any good. Give an example of it.

The next time we have this discussion I suspect you'll elevate it to a generic optimization, claim that there was no application detection to begin with and that in fact some ATi PR monkey was responsible for spreading the myth (possibly even Gabe!).

You start calling ATi's AF a hack and I'll start using the term again for nV. NV's UT2K3 'cheat' was still doing more filtering than ATi's AF does on the overwhelming majority of games (although not UT2K3 in particular).

Now tell me, who in the consumer space is paying for nVidia to cherry pick their applications and code them so they make their hardware look better than it really is?

Everyone is. nV pulls it off with the most popular titles as does ATi(compare the 9800 to the Ti4200 in NWN).

Having a dozen or so professional applications is a lot different to dealing with 365+ games a year.

And you take the 12 most popular game engines and you cover ~90% of actual game unit sales.

Yes and you can usually control the profile in the control panel in case it doesn't work.

Show me where this has happened from a game update. You keep talking about this like it has happened, or as if the fact that you can deliberately break it proves anything. Why not go through and deoptimize the graphics engine of the game; that makes as much sense.

I don't know what level of interest you have the company but there is simply no convincing you otherwise, not even in the face of overwhelming public evidence.

It's not one company, as you seem to forget quite easily. Your stance is actually far more damaging to PowerVR than it is to nVidia, and I've been defending both of them against your attacks.

So post a reply if you like but like I said, I'm done with this because there is simply nothing more to be achieved here.

I've told you already that this was going to be a pointless conversation. You want to blast everyone but ATi; I'm not going to change the stance I've held for years, including when ATi was on the hot seat, just because they are the popular bandwagon now.

Reever-

Actually that is false. They have done nothing but give up marketshare to ATI in all but one sector. Their overall marketshare numbers are less than 4 percent apart

Check the full market numbers and ATi is still in third place, with a ways to go to number two. nVidia just managed to get into first place a couple of quarters ago.