somehow OK when you used to flame nVidia to Hell and back for this sort of thing?
But then how could you flame Ati to hell and back when you were the one defending Nvidia for this sort of thing?
Doh. If I had my FTP server up, I'd set you up with an account; unfortunately it's out with a bad case of the dead hard drives.
Originally posted by: CaiNaM
i did a 15-20 sec movie capture in fraps which shows this. it doesn't stand out quite like it does in the game since the image is sized down, but you can definitely see what i'm talking about. unfortunately, the .avi is 75MB zipped (over 100MB uncompressed), and dubbing to divx blends the texture in such a way that it's useless as an example.
Yeah, it's pretty obvious there's a line there even without your highlighting it for the viewer's benefit. The issue at the heart of this is showing there was degradation, period. I for one would like to see an option to force trilinear just so we can do the quality comparisons and see if that banding in the screenshot you provided is a result of their "trylinear" mode or something else.
edit: i did put up a screenshot where you can see the line where the texture detail changes. again, in a still shot it's not easily noticed, but it's obvious when you're moving, as the line moves directly in front of you (there are actually 3 lines @ 4xaa/16xaf). it's a 2.8MB bmp, but if you have broadband you can view it here.
here are some actual avi files you can download, from a guy named tEd. He also knows how to disable try-linear, but won't tell how.
Very obvious mipmap banding has been shown by two separate people at B3D with videos of MP2 and FC. IIRC, they said it wasn't obvious in all games all the time, but obviously there are at least two proven visible defects. Let's see if ATi can correct these "edge cases," or just insert a switch to disable trylinear.
Originally posted by: sandorski
It's not an issue of whether he is honest or not. It's all too common for people to attribute one thing as the reason for an issue they may be experiencing. I know I have done it, and many threads posted regularly place blame on the wrong things all the time. I'm sorry I am not as accepting of it as others, but as I said ad nauseam already, someone needs to Prove it.
Originally posted by: rbV5
here are some actual avi files you can download, from a guy named tEd. He also knows how to disable try-linear, but won't tell how.
Why won't he tell?
The drivers are the ones analyzing the situation and they make the decision when to activate the adaptive trilinear.
It does do preset algorithms that are detected, then applied. the drivers are not "smart".
Are you sure that just isn't a symptom of low quality/resolution videos?
I can clearly see shimmering around the mipmap boundaries in both.
Originally posted by: CaiNaM
Originally posted by: BFG10K
The drivers are the ones analyzing the situation and they make the decision when to activate the adaptive trilinear.
It does do preset algorithms that are detected, then applied. the drivers are not "smart".
Are you sure that just isn't a symptom of low quality/resolution videos?
I can clearly see shimmering around the mipmap boundaries in both.
yup, it's clearly visible when moving, and it's clearly visible in the screenshot i linked to earlier. r420 does a very poor job of smoothing the mip transitions compared to nvidia, and even compared to my 9800pro.
it's my hope this situation can be rectified via drivers if ati receives enough "heat" from this issue.
2. The situation with anisotropic filtering is more difficult. First, ATI plays an unpleasant trick when anisotropy is forced from the drivers. Filtering is then based only on bilinear samples, so despite clearly increased sharpness, the borders between MIP levels are unpleasantly obvious. While NVIDIA smooths these borders with partial trilinear filtering (see the pictures with coloured levels), ATI shows clear borders in all modes, including Quality. But when the filtering degree is set from the application rather than forced from the drivers, things get back to normal. So what's going on? Such problems have been repeatedly observed in game applications. If an application can't manage anisotropy itself, it can be forced from the driver settings and everything will be OK. If an application can manage it, then we disable anisotropy in the drivers and activate it from the application. But if anisotropy is activated in both the drivers and the application, we get sharp borders between MIP levels, destroying the visual quality gains of anisotropic filtering.
Alexey Barkovoy's note: It is the control panel that is to blame for this in ATI's case. If you move the Performance/Quality slider on the 3D tab to the right and thus select AF/AA forcing, trilinear anisotropic filtering will always be selected for the first texture stage, and bilinear anisotropic filtering for the others. But after you "play" a bit with the slider (select "use custom settings" again and then set AF separately), everything will work all right and trilinear filtering won't disappear when anisotropy is forced.
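To make the reported control-panel behaviour concrete, here is a toy Python sketch (my own illustration, not driver code): forcing AF from the panel is said to give trilinear on texture stage 0 only, with bilinear on the remaining stages, while application-controlled AF keeps trilinear on every stage.

```python
# Toy model of the control-panel bug described above (as reported by
# Alexey Barkovoy) - hypothetical, not actual ATI driver behaviour code.

def stage_filters(num_stages, forced_from_panel):
    """Return the filter reportedly applied to each texture stage."""
    if forced_from_panel:
        # Panel forcing: only stage 0 gets trilinear AF.
        return ["trilinear" if s == 0 else "bilinear"
                for s in range(num_stages)]
    # Application-controlled AF: trilinear on all stages.
    return ["trilinear"] * num_stages

assert stage_filters(3, forced_from_panel=True) == \
    ["trilinear", "bilinear", "bilinear"]
assert stage_filters(3, forced_from_panel=False) == ["trilinear"] * 3
```

If accurate, this would explain why games that layer detail or lightmap textures on stages above 0 show obvious MIP borders only when AF is forced from the drivers.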
Originally posted by: Matthias99
Originally posted by: CaiNaM
texture quality is set to full, and af is set to 16x.
But did you force it in the drivers, thus possibly overriding it with a trilinear first stage and bilinear on every other texture stage?
Originally posted by: CaiNaM
Originally posted by: Matthias99
Originally posted by: CaiNaM
texture quality is set to full, and af is set to 16x.
But did you force it in the drivers, thus possibly overriding it with a trilinear first stage and bilinear on every other texture stage?
it's not "adjustable" from within the game....
Originally posted by: Shamrock
round 2 for all the naysayers
http://www.ixbt.com/video2/nv40-rx800-5-p1.shtml It has comparison videos, and PLAINLY shows the differences... ATI's are DEFINITELY lowering IQ.
The fact of the matter is that they did try to hide it - it doesn't just look that way. Hopefully, everyone can agree on this by now. In addition, an owner of the card has stated that they do see IQ degradation during gameplay. Whether or not this should have an effect on your decision to go w/ ATi or nVidia remains a different story and will probably include other factors, such as price/availability.
Originally posted by: Robor
I agree ATI should've announced this feature so it wouldn't have looked like they were trying to hide something, but IMO if it improves speed without degrading IQ then it's a feature, not a cheat. And as far as benchmarks go, if Nvidia can't produce equal or better IQ without trilinear enabled then they need to have it enabled when they benchmark.
I don't want to suck them down on a low-bandwidth pipe.
Check 'em out yourself, they're pretty obvious.
I'm almost certain the next drivers will include a setting to disable it, as there are reports of registry keys that can control it in the current beta drivers.
The fact that they exist, though, is an issue that should prompt at least the ability to disable the optimizations.
Okay.
yup, it's clearly visible when moving, and it's clearly visible in the screenshot i linked to earlier.
Originally posted by: Matthias99
Originally posted by: CaiNaM
Originally posted by: Matthias99
Originally posted by: CaiNaM
texture quality is set to full, and af is set to 16x.
But did you force it in the drivers, thus possibly overriding it with a trilinear first stage and bilinear on every other texture stage?
it's not "adjustable" from within the game....
Hrmmm. Well, there's a registry hack that's supposed to disable the trilinear optimizations with the 4.6 beta drivers... could you give that a shot and see if it helps? It'd be nice to know if this is specifically what's causing that IQ problem, or if it's some other issue.
Also, does this show up everywhere in DAoC, or just in certain areas? The problems reported in other games (such as Max Payne 2 and Far Cry) seem to be isolated to particular areas/textures...
Originally posted by: nitromullet
CaiNaM: If you're still reading this thread, I have a request...
Will you give the registry tweak to enable full trilinear a try?
http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1327006&enterthread=y
I would be curious to know if the full trilinear registry change fixes the IQ degradation that you have seen in game play. By the way, I have an XT on preorder and I appreciate your honesty in regards to your Pro. You certainly are taking quite a bit of heat for paying $400+ for a video card and being blunt about its shortcomings.:beer:
Edit: looks like Matthias99 already requested this...
Originally posted by: CaiNaM
Originally posted by: Matthias99
Also, does this show up everywhere in DAoC, or just in certain areas? The problems reported in other games (such as Max Payne 2 and Far Cry) seem to be isolated to particular areas/textures...
i wouldn't say particular areas as much as i would say particular textures... there are certain ground textures that don't have a lot of detail, and it does not stand out in those areas - only in areas where the ground has a lot of texture detail, like where it's made up of pebbles/rocks or something similar.
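That observation makes sense if you consider what the mip transition line actually is: it only shows up where adjacent mip levels differ from each other. A high-frequency texture (pebbles) loses detail when downsampled, so its mip levels differ a lot; a flat dirt texture barely changes. A toy 1-D illustration in Python (my own example, not tied to DAoC's actual textures):

```python
# Toy illustration of why the mip switch line shows on detailed ground
# textures but not on flat ones. Textures are 1-D lists of texel values.

def downsample(tex):
    """Halve a 1-D 'texture' by box-filtering pairs of texels,
    the way a mip level is typically built from the one above it."""
    return [(tex[i] + tex[i + 1]) / 2 for i in range(0, len(tex), 2)]

def mip_difference(tex):
    """Mean absolute difference between a texture and its next mip
    level (upsampled back by nearest-neighbour for comparison).
    This approximates how visible a hard switch between them is."""
    mip = downsample(tex)
    up = [mip[i // 2] for i in range(len(tex))]
    return sum(abs(a - b) for a, b in zip(tex, up)) / len(tex)

flat = [0.5] * 8            # uniform, low-detail ground texture
pebbles = [0.0, 1.0] * 4    # alternating high-frequency detail

assert mip_difference(flat) == 0.0      # mip switch is invisible
assert mip_difference(pebbles) == 0.5   # mip switch is clearly visible
```

So a filtering shortcut that sharpens mip transitions can look fine on most of a game's ground and still band badly on pebble or gravel textures, which matches what CaiNaM describes.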