BFG10K
That's not my problem, because no one seems to care except for about 2000 geeks across various bulletin boards.
Originally posted by: TheSnowman
sure enough, and a big white mess of snow isn't exactly the optimal texture to show the difference either. regardless, cod is an opengl game and as i stated above both ati and nvidia do full trilinear in opengl. so no Acanthus, you haven't proven anything but the fact that you don't even understand what you are talking about.
Originally posted by: BenSkywalker
My mistake on that part: you are correct about the first mip transition being trilinear, but the rest isn't accurate unless ATi has changed how their AF works in the latest drivers.
the only changes they have made to af on their dx9 cards are using bilinear on the 2+ mipmap stages in d3d, and fixing quality af to force trilinear in opengl regardless of application settings. the former came after nvidia went to brilinear and the latter was way back in the 2.3s or so.
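(For readers who haven't followed the filtering argument: the difference between bilinear, trilinear and "brilinear" is simply how, if at all, the two nearest mip levels get blended. The sketch below is a deliberately simplified, hypothetical illustration of that idea only; real hardware computes LOD per pixel and takes bilinear taps from each level, and this is not any vendor's actual filtering code.)

```python
import math

def trilinear_weight(lod):
    # Full trilinear: the blend toward the next mip level ramps linearly
    # across the whole LOD range, so mip transitions are invisible.
    return lod - math.floor(lod)

def brilinear_weight(lod, band=0.25):
    # "Brilinear": mostly bilinear (weight 0), with only a narrow blend
    # band just before each mip transition. Cheaper, but the transition
    # can show up as a visible band in the distance.
    frac = lod - math.floor(lod)
    if frac < 1.0 - band:
        return 0.0
    return (frac - (1.0 - band)) / band

def filter_sample(mip_colors, lod, weight_fn):
    # mip_colors holds one representative value per mip level, standing in
    # for the bilinear sample that would normally be taken from each level.
    base = min(int(math.floor(lod)), len(mip_colors) - 2)  # keep a next level to blend with
    w = weight_fn(lod)
    return (1.0 - w) * mip_colors[base] + w * mip_colors[base + 1]

# Plain bilinear would simply return mip_colors[base] (a blend weight of 0 everywhere).
if __name__ == "__main__":
    mips = [1.0, 0.5, 0.25]  # toy per-level "colors"
    for lod in (0.0, 0.6, 0.9, 1.3):
        print(lod,
              round(filter_sample(mips, lod, trilinear_weight), 3),
              round(filter_sample(mips, lod, brilinear_weight), 3))
```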
Originally posted by: BenSkywalker
Their texture filtering has never been that good. Too much aliasing, too many angles they don't touch.
lol, i know you never liked it and i will freely admit that the af is better on the fx cards at certain angles and the geforce3's and geforce4's af looks better still, but many people would argue with your comment that it "has never been that good". personally i think it is quite good, and what image quality is lost is well worth the smaller performance hit. regardless, the af touches all angles, maybe not enough to suit you but to say that they "don't touch" certain angles is flat out wrong.
Originally posted by: BenSkywalker
Only if they changed this in the latest drivers.
that is not the case. full trilinear in opengl has always been an option as long as the app requests trilinear, and forcing it in apps that don't has been working for well over a year. see the full trilinear af in opengl shots at beyond3d.
Originally posted by: BenSkywalker
Only if the app has selections for AF and trilinear. If you request trilinear and force AF you still get the hack.
doh, my mistake here. i had been using rtool to force proper trilinear with af in all d3d apps since ati started using their hack, but i went to test things out and sure enough my memory was flawed on this point.
Originally posted by: Acanthus
What game do you suggest that has explicit bilinear and trilinear settings? Out of all the games I have, Call of Duty was the only one with the option.
well ut2003 has a usetrilinear=true/false setting in the .ini to let you toggle between bilinear and trilinear, and a levelofanisotropy= setting as well, but whether you turn up the in-game settings or force quality af in the control panel you will never get trilinear, only brilinear. if you don't have ut2003 or just want to read up about the situation more, i dug up this article here:
http://www.3dcenter.de/artikel/ati_nvidia_treiberoptimierungen/index_e.php
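(Since the post above describes toggling these by hand, here is a minimal, hypothetical sketch of flipping the two UT2003.ini keys it names. The install path and the stock values of UseTrilinear and LevelOfAnisotropy are assumptions; check and back up your own file first.)

```python
from pathlib import Path

# Assumed install path; adjust for your own system and back the file up first.
ini_path = Path(r"C:\UT2003\System\UT2003.ini")

text = ini_path.read_text()

# Keys named in the post: UseTrilinear toggles bilinear/trilinear,
# LevelOfAnisotropy is the AF degree the game requests.
# Stock values of False and 1 are assumptions; check your own file.
text = text.replace("UseTrilinear=False", "UseTrilinear=True")
text = text.replace("LevelOfAnisotropy=1", "LevelOfAnisotropy=8")

ini_path.write_text(text)
print("Updated", ini_path)
```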
Heck, I use bilinear AF all the time and I'm more than happy with the IQ.
FutureMark's goal is to create a consistent synthetic benchmark whose results can be reasonably compared across the board.
FutureMark does class them as cheats.
Well, it depends on the games. The newer and most popular titles will probably get a favourable bias over the older and less popular titles; I mean, the older titles can often get by with brute force if they have to.
It depends on how they do it.
As opposed to "works best on nVidia" logos plastering the splash screens of games where ATi is often the leader?
I didn't, but a few people have been complaining about hard locking. Also, some have been having fog/haze problems in ET and MOHAA.
One of the official 4xxx drivers could consistently hardlock Serious Sam 2 on my Ti4600. I've also had one crash on my 9700 Pro in Combat Flight Simulator 3.
Originally posted by: TheSnowman
regardless, the af touches all angles, maybe not enough to suit you but to say that they "don't touch" certain angles is flat out wrong.
Originally posted by: BenSkywalker
There is no AF applied at certain angles at all, it is straight isotropic filtering. I think it was XBit that had a review up of S3's new part; check out the half-pipe filtering test they have and look at ATi's images.
Originally posted by: TheSnowman
that is not the case. full trilinear in opengl has always been an option as long as the app requests trilinear, and forcing it in apps that don't has been working for well over a year. see the full trilinear af in opengl shots at beyond3d.
Originally posted by: BenSkywalker
They have AF selected in the application there. SeriousSam (both of them) have a menu option for AF which does allow you to get AF + full trilinear.
Originally posted by: TheSnowman
well ut2003 has a usetrilinear=true/false setting in the .ini to let you toggle between bilinear and trilinear, and a levelofanisotropy= setting as well, but whether you turn up the in-game settings or force quality af in the control panel you will never get trilinear, only brilinear.
Originally posted by: BenSkywalker
That's not just UT2K3, that's every D3D app with nV right now. Running AF under D3D you don't get trilinear as of now, only brilinear. I need a bit more hands-on time with it before I pass full judgement, but a lot of people couldn't pick out PVR's box filter and it jumped out of the screen and slapped me across the face... hard.
Originally posted by: BenSkywalker
If PowerVR does the exact same thing for 3DM that they do for every game, why the hell shouldn't it be considered valid?
So FutureMark is supposed to accept cheating drivers because they cheat in other games too? FutureMark's rule is "no cheating in 3DMark allowed". The rule isn't "no cheating in 3DMark allowed unless you also cheat everywhere else too".
Originally posted by: BenSkywalker
You really going to try and tell me that you wouldn't want full speed?
I want full speed but I don't want the problems caused by application detection. Say a new version of Doom 3 is released and the optimizations fall over. What then? I can wait until the next drivers come and hopefully put performance back to how it was, or I can use the old version of Doom 3 and be forced to endure the problems that have been fixed in the newer version.
Originally posted by: BenSkywalker
You would be willing to give up ~30% performance (for this hypothetical) for no reason other than it requires a driver profile for the game?
You're making the flawed assumption that it's not possible to match or better such a speed gain by legitimate means.
Originally posted by: BenSkywalker
With D3 in particular odds are quite high that you will own a decent sized pile of games based on the engine over the next few years (I'd bet on it actually), and the D3 profile will almost certainly offer benefits to any of the games based on the engine also.
If it benefits more than Doom 3 then it's more than just simple application detection.
Originally posted by: BenSkywalker
Do you recall which forum it was on?
Yes, it was at Ars.
Originally posted by: BenSkywalker
Do you recall which 4x.xx it was?
I think it was 44.43 but I'm really not sure as it was almost two years ago now.
Originally posted by: BenSkywalker
I'll install them and give SS:SE another play through (fun game and short).
SS:SE isn't that short. In fact it's quite a decent length, especially if you explore all the secret areas and play it on the hardest difficulty level.
scroll down the page on that b3d link and you will see images from Xmas's test app, and there is anisotropy at all angles.
but i wasn't referring to serious sam; again, scroll down the page a bit.
Originally posted by: BenSkywalker
Uhhh, download that app and check it out for yourself and then compare it to what XBit was looking at.
Originally posted by: BenSkywalker
And they enable AF through the test app there too.
if you want me to look you should post a link.
no, the app works off of control panel settings. download it and try it for yourself if you like.
Originally posted by: BenSkywalker
You have nothing remotely resembling any proof that PVR is cheating in anything.
They use application detection, don't they?
Originally posted by: BenSkywalker
Cite me an example. Show me a single time that that has happened and I'll start applying weight to it.
Build 340 of 3DMark03.
Originally posted by: BenSkywalker
I'm stating that the optimal driver path for D3 is not going to be the same as the optimal driver path for SeriousSam3 and you can quote me on that.
That's true and it'll probably be the case that ATi will put the bias onto Doom 3 by the time it rolls out.
Originally posted by: BenSkywalker
My first play through the game took me ~4.5 hours on one notch above Normal.
4.5 hours? It took me 12-15 on the hardest difficulty. Did you just race through the entire game or did you actually take the time to explore everything? How many secret levels and areas did you find? Did you kill all of the hidden monsters?
Originally posted by: BenSkywalker
Get away from shooters some time and you'll realize how sadly lacking they are in length.
I tend to finish shooters at least five times and the most I've finished a game is Quake (eleven times). How many times do you finish your games?
Originally posted by: BenSkywalker
Another way to think of it, what if ATi came out with a DooM3 specific optimization that increased the performance of their parts by ~30% while outputting identical quality, would that be a problem? I certainly wouldn't think so. I would think that you would be in favor of that even more so than I, as you are the one that is going to have to deal with the performance on the R3x0 cores in that title (unless you plan on stepping up to the next gen).
as for Xmas's aa tester, since you have it you should be able to quickly see for yourself that it goes by control panel settings and doesn't give you any af when the control panel is set to application preference.
You assume NV's application detections are inserting legitimate optimizations.
But if NV has legitimate application specific optimizations they can use -- why did they resort to clip planes in 3Dmark2003?????
Originally posted by: BenSkywalker
When I set app preference in the control panel I get whatever level of AF I select in the app. Big slider, top says 'Max Aniso', does it for me(although I know my drivers are working properly).
right, but if you do set it in the control panel and leave the slider alone, you get your control panel settings, and for ati's quality af that is full trilinear.
What does it say in FutureMark's rules about detecting 3DMark? A great big don't do it.
Originally posted by: BenSkywalker
A slight performance hit is the dire consequence you are so afraid of?
20% isn't slight. Besides, that's not really what this is about. This is about how fragile application detection and optimizations are.
Originally posted by: BenSkywalker
And you would be opposed to having a switch that loads the optimal profile for SS3 built in?
And where are the switches for nVidia's profile that allow users to control it? Detecting "UT2003.exe" is not a profile or a switch, it's a cheat.
Originally posted by: BenSkywalker
You would rather dance around to different drivers for optimal performance?
I wouldn't dance at all; I'd use the latest version.
Originally posted by: BenSkywalker
Why wouldn't I?
Because all evidence points to the contrary.
Most of the fiasco since the NV30 has been dodgy at best and blatant cheating at worst, and at this stage nVidia has almost zero credibility.
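(To make the "renaming the executable" complaint concrete: if a driver keys its behaviour purely on the process name, the special-cased behaviour appears and disappears with the filename, which is also why such optimizations can fall over the moment a game is patched or its executable changes. The following is only a hypothetical sketch of that kind of lookup, not any vendor's actual driver code; all names and settings in it are made up.)

```python
# Hypothetical per-application "profile" lookup keyed only on the exe name.
# All names and settings here are made up for illustration.
PROFILES = {
    "ut2003.exe": {"texture_filtering": "brilinear"},   # substituted filtering
    "3dmark03.exe": {"shader_path": "hand_tuned"},      # benchmark-specific path
}

DEFAULTS = {"texture_filtering": "trilinear", "shader_path": "generic"}

def settings_for(process_name: str) -> dict:
    # Renaming UT2003.exe to anything else falls straight through to the
    # defaults, which is exactly what renamed-executable tests exploit.
    return {**DEFAULTS, **PROFILES.get(process_name.lower(), {})}

if __name__ == "__main__":
    print(settings_for("UT2003.exe"))   # gets the special-cased behaviour
    print(settings_for("UT2004.exe"))   # a renamed or newer exe silently loses it
```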
The KEY word there was EXPECT.
Quote
--------------------------------------------------------------------------------
Originally posted by: VIAN
I expect nVIDIA to pull ahead in this next battle. I feel that they are being tight lipped for a very good reason.
--------------------------------------------------------------------------------
blind fanboyism?
Question, do you have any supporting evidence for what you just said? You know, you sound a lot like an ATIboy to me. Blind fanboyism? I am just as blind as you are; we are just going to have to wait and see.
lol Ben, sure, it is quite possible that nvidia could take the lead, but best i can tell, VIAN saying that he thinks they will without any supporting evidence is nothing more than wishful thinking. also, if you are going with the opinion that the r420 is a "refresh" and your idea of history leads you to presume the performance increase will be similar to the r350 and r360 refreshes, you are greatly underestimating the situation.
Well, I remember AMD releasing a .13u part named the Thoroughbred which was the same core as the .15u Palomino. New manufacturing process, that's all you got. Not enough info for your thought. Keep daydreaming.
na, the r420 is confirmed to be .13u, so it is clearly more than a refresh of the .15u r3xx core. it is more of a bastard child of the r3xx, basically a souped-up double rv360, with the rv3xx being a half r3xx redesigned for the .13u process. regardless, how the r420 will compare to the nv40 is a matter of idle speculation for us until we start to see some solid confirmation of specs.
How is the 9700 Pro top shelf when they just finished releasing their new 9800 Pro? The 9700 Pro is old stuff.
i didn't say liar, i said moron. it was completely idiotic to believe that ati would undercut their own product line by releasing a midrange card that keeps up with their top shelf offerings, even if it is true that someone from ati privately made the claim. more than likely the issue was nothing more than a matter of miscommunication or confusion, which seems to come up more and more often in the articles here at anandtech. regardless, ati never publicly made such a claim; it was nothing more than hearsay.
L O L
Every time someone buys an ATi card you get so excited you go out and kick a puppy. No one can disprove this to me, therefore it obviously contains some truth, right?
Valid drivers? Futuremark validated 52.16, but says that it can only be compared to other Nvidia cards. What kind of BS is that? Was there a purpose to not validate other drivers then, or even to look for Nvidia cheats? Futuremark is BS.
Quote
--------------------------------------------------------------------------------
Think about it- Futuremark releases a patch and approves them for use in all drivers saying that they have eliminated all invalid optimizations.
--------------------------------------------------------------------------------
Except that didn't happen. Build 340 was always accompanied with a list of valid drivers that it was tested on.
Yeah, well, that time that ATI misplaced a shadow in Splinter Cell, I believe that it was a cheat to get better performance. Couldn't it be a driver issue? Nvidia has many, as does ATI.
To save rehashing of the thread in here, the final conclusion was that FRAPS 2.0 was showing AA in nVidia's screenshot where there was no AA applied onscreen. OTOH FRAPS 2.0 correctly captured the output of the ATi AA and non-AA modes.
I wouldn't class it as concrete evidence since it's not done by an official hardware site, but something is definitely fishy if FRAPS 2.0 is pulling correct AA out of nowhere when nVidia's on-screen image shows none.
Obviously, XGI thought that their IQ was so high that they needed to lower it so that it may be comparable to the competition. How is that a cheat? You know what, it's a bad cheat. Go to your room. Even prior knowledge and such is not cheating. What about when developers optimize code for a piece of hardware? Is that cheating, unfair play? I think that shouldn't be done either. The P4 sucked until certain benchmarks and apps started coding for it; oops, Intel cheats now. It's not handling things in real time, because the application is detecting the chip.
BTW Ben, what are your thoughts on the XGI cheats, cheats which are the same kind as some of nVidia's have been?
Using your previously employed logic I'd expect you to think XGI's application detection is perfectly acceptable and their reduced image quality just simple hacks, right? And the fact that a completely different picture is painted by simply renaming the executable is of no consequence too, right?
So in short, you don't believe XGI are cheating at all.
Correct?
Could it at all be possible that they were trying out different ways to create the same image with better performance? Now, we won't call NTSC cheaters for implementing interlacing, will we? They could just have been seeing the reaction from the public, and hey, for such a POS card, it's better to have something than nothing at all.
Cries of fanboyism aside, I find it interesting that XGI did a vaguely similar trick to what nV did a few months ago with their budget cards (can't remember if it was the GF4MX or 5200, and if D-L or iXBT noted it). Remember their alternating-frame AA trickery? IIRC, it was substituting 1x2 AA on one frame and 2x1 on the next for 4x AA. I wonder if IHVs still think they can slip this stuff by 3D fansites, or if they just don't think the attention generated matters (in which case, why send out review samples at all)?
Consider these more like bemused rhetorical questions than actual conversation starters. I don't mean to stir up the pot against a particular IHV, I'm just thinking about the industry as a whole. (Thinking of the industry as a whole is part of what got me so upset about nV's actions in 3DM in the first place--fleecing customers hurts everyone in the industry, IMO. At some point, it's got to get more financially attractive to be honest than to attempt to deceive, though perhaps I'm being overly optimistic about the feasibility of advancing 3D tech apace.)
You know why the argument that it has a negative impact is stupid: the competition can do it too. Why is it called cheating anyway? Both can do it and then we will ultimately see who is faster. You can't cheat on this. There is no unfair advantage if anyone can do it.
Quote
--------------------------------------------------------------------------------
No negative impact to the end user, it should be perfectly fine.
--------------------------------------------------------------------------------
There's no negative impact of static clip planes if the user doesn't move the camera, therefore it should be perfectly fine.
Increasing performance only in benchmarks but not in gameplay has no negative impact on the user. Therefore it should be perfectly fine.
Originally posted by: BenSkywalker
So Futuremark is now a deity unto which you worship?
If you can't follow an argument then don't ask questions about it. I am sick and tired of your constant dancing around issues and ignoring facts.
Originally posted by: BenSkywalker
FM went so far as to recode their shaders to work around nV's compiler to slow down the FX with the last patch (feel free to check that out for yourself, they reordered code as nV's optimizations worked with all apps that used the same code).
They did no such thing. They made trivial changes which should not have made any difference to genuine optimizations. In fact we had this discussion before and you made the ludicrous claim that it's quite normal for any compiler to make such ridiculous assumptions.
Originally posted by: BenSkywalker
Why don't you find a single game that this has happened to?
Excuse me? Were you asleep when the links to Unwinder's anti-detection tests were posted? Were you also asleep when custom benchmark results were posted showing nVidia taking a nose-dive in benchmarks they usually dominated?
Originally posted by: BenSkywalker
The UT2K3 specific optimization is long gone.
Ah, so it's an optimization now, is it? Not even a hack? It's been elevated by yourself yet again to make nVidia look good?
Originally posted by: BenSkywalker
And yes, detecting UT2K3 would be a profile (by definition actually).
Really? I thought you said it was a hack? Now it's a profile? Interesting.
Originally posted by: BenSkywalker
Except it doesn't.
It most certainly does, with you being the sole exception.
Originally posted by: BenSkywalker
Do you realize that nVidia has increased their marketshare a decent amount since the launch of the NV30?
So have ATi.
Originally posted by: BenSkywalker
As soon as the situation changes and the IHV you favor is doing the same thing it will be fine.
Says who?
Originally posted by: BenSkywalker
Do you realize that in the professional market having a driver profile that is entirely based around what application you are using is a major feature?
Yes, and you can usually control the profile in the control panel in case it doesn't work. Now show me how to control nVidia's "profiles" (and no, renaming the executable isn't "controlling" them).
Originally posted by: BenSkywalker
Exactly what you are lamenting is not only tolerated but people pay quite a bit extra to make sure they get it.
And that's exactly why it happens. Now tell me, who in the consumer space is paying for nVidia to cherry pick their applications and code them so they make their hardware look better than it really is? Having a dozen or so professional applications is a lot different to dealing with 365+ games a year.
Originally posted by: VIAN
Valid drivers? Futuremark validated 52.16, but says that it can only be compared to other Nvidia cards.
And that changes nVidia's cheating how exactly?
Originally posted by: VIAN
Yeah, well, that time that ATI misplaced a shadow in Splinter Cell, I believe that it was a cheat to get better performance. Couldn't it be a driver issue? Nvidia has many, as does ATI.
Driver issues that magically cause screenshot programs to produce AA when there is none?
Originally posted by: VIAN
Obviously, XGI thought that their IQ was so high that they needed to lower it so that it may be comparable to the competition. How is that a cheat?
That is total and utter garbage.
Originally posted by: VIAN
The P4 sucked until certain benchmarks and apps started coding for it; oops, Intel cheats now. It's not handling things in real time, because the application is detecting the chip.
How is Intel cheating if developers choose to optimize their applications for their architecture? Do you even understand the relationship between code that shares something and code that is shared?
Originally posted by: VIAN
Could it at all be possible that they were trying out different ways to create the same image with better performance?
Yes, it would be possible if the person lacked even basic understanding of the issues at hand.
Originally posted by: VIAN
Now, we won't call NTSC cheaters for implementing interlacing, will we?
Except NTSC has precisely nothing to do with anything being discussed here.
Originally posted by: BFG10K
And that changes nVidia's cheating how exactly? I admit though, it is a little strange that FutureMark is listing those drivers as valid. I'm guessing if they didn't then there would be no certified drivers at all from nVidia, so they've chosen the ones that cheat the least.
Originally posted by: BFG10K
How is Intel cheating if developers choose to optimize their applications for their architecture? Do you even understand the relationship between code that shares something and code that is shared?
The developers are cheating AMD, because it is unfair to them. Cheating is something that only one person can do, making it unfair to the competition.
Originally posted by: BFG10K
Yes, it would be possible if the person lacked even basic understanding of the issues at hand.
What did I misunderstand?
Originally posted by: BFG10K
Except NTSC has precisely nothing to do with anything being discussed here.
I was relating NTSC's interlacing as a cheat intended to give the illusion that there are more frames than there are. Nvidia using that AA in one frame and not in the other, if the frames are fast enough, may not be noticeable and may appear to look smoother than none at all, just like interlacing showing half the image in one frame and the rest in another.
I'm done with this, I really am. I can see there is simply no way to ever change your pro-nVidia bias.
The next time we have this discussion I suspect you'll elevate it to a generic optimization, claim that there was no application detection to begin with and that in fact some ATi PR monkey was responsible for spreading the myth (possibly even Gabe!).
I don't know what level of interest you have in the company but there is simply no convincing you otherwise, not even in the face of overwhelming public evidence.
So post a reply if you like but like I said, I'm done with this because there is simply nothing more to be achieved here.
Originally posted by: BenSkywalker
Do you realize that nVidia has increased their marketshare a decent amount since the launch of the NV30?
Actually that is false. They have done nothing but give up marketshare to ATI in all but one sector; their overall marketshare numbers are less than 4 percent apart.