StanTheMan
Senior member
sorry to interrupt mate, this may be quite off topic. nvidia has just updated the latest driver to 71.84, but I can't find the "refresh rate override" option in nvidia's newest driver. Anybody know where it is?
Originally posted by: THUGSROOK
for me its quite the opposite....
i get a higher framerate using HQ off off off than i do with Q off off off.
Q on on on is the fastest tho.
maybe its just a CSS thing?
🙂
Originally posted by: VIAN
The first set are in the exact same position. Although the second set aren't, I assure you that the same thing is still there. You're just in denial. Anyway, you can go test it for yourself. Got COD? Go to the rocket area with Quality enabled and Anisotropic Filtering at 8x and see for yourself. Then change it to High Quality and notice the clean, beautiful surface that you paid so much money for. If you read the above carefully, it only appears with a defined pattern, so it'd be really hard to find in some games that don't have textures like those, but many games do have textures like those, maybe not everywhere, but they do.
oh and i take your screen shots with a grain of salt, since you are in a different position in all of them. if you wanna do a comparison you need to be in the exact same position to make it count
I can attest to that. But I still believe that ATI has better default IQ than Nvidia. But if you set Nvidia to High Quality, it will have less (nearly unnoticeable) texture shimmering and better overall image quality, if Trilinear is working.
You will certainly see more contrast with an ATi part - their texture filtering accuracy is so low it gives the appearance of increased detail, although it is aliasing artifacts more than anything. I've been running my R9800Pro for over a year now and ATi's texture filtering is extremely poor at best. Not saying that the current nV parts are a lot better; they have been on a downward trend since the last part that actually did real anisotropic filtering (the NV2x core parts).
It's missing.
hey, where's the "refresh rate override" option in nvidia's newest driver (71.84)?
yes, but Q off off off is consistently slower than HQ off off off.
Originally posted by: TheSnowman
Originally posted by: THUGSROOK
for me its quite the opposite....
i get a higher framerate using HQ off off off than i do with Q off off off.
Q on on on is the fastest tho.
maybe its just a CSS thing?
🙂
Interesting, how much difference was there? Was it within a reasonable margin of error?
Originally posted by: ifesfor
hmmmmm just UT 2004...
Quality http://pics.bbzzdd.com/users/ifesfor/UT200466.JPG
High quality http://pics.bbzzdd.com/users/ifesfor/UT200485.JPG
I've got the same boost in WoW.
WOOZA
Word.
Originally posted by: Insomniak
This thread is LOLable in the extreme.
Originally posted by: VIAN
*UPDATED 3/6/05
This thread is to prove that ATI has better default IQ than Nvidia (although that's not happening right now with no ATI pics), resulting in unfair benchmarking with Anisotropic Filtering enabled. And if that's not good enough for ya, consider it a challenge as well.
Nvidia states the difference between Quality and High Quality in this 10MB pdf:
Accompanied by a graph that looks like this.
Quality is the default setting that results in optimal image quality for your applications.
High Quality results in the best image quality for your applications. This setting is not necessary for average users who run game applications. It is designed for more advanced users to generate images that do not take advantage of the programming capability of the texture filtering hardware.
There is and there isn't a difference between those settings, and any difference will be much more noticeable when you are in motion. If you don't enable Anisotropic Filtering, you will see no difference, although there might be some subtle differences in certain situations, like what I've seen in Call of Duty in the Ship level in multiplayer; differences like these are hard to spot. In this level you will notice it from a distance in the shadow of the railing of the ship.
If you turn on Anisotropic Filtering, however, you will be able to see it only on certain types of textures. Sometimes you won't be able to see it at all, sometimes you'll be able to see it a bit in motion, and sometimes it will stick out like a black eye. Take COD for instance. The textures on the wall are nice, but never as defined as when there are patterns on the floor. With Quality enabled you will notice artifacts that look like a crapload of texture aliasing, a.k.a. texture shimmering, with those patterns. It'll be visible in the pictures below (only on the patterned part). I'm pretty sure ATI doesn't have this problem, as I never noticed it playing COD before. It's also noticeable on the floorboards in the ship level. Painfully noticeable. If you look along the floorboards, you don't see anything, but as soon as you look at them diagonally or horizontally, then you have a serious problem.
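To make the shimmering complaint concrete, here's a toy sketch of why it happens - nothing from the actual drivers, the checker texture, footprint size, and sample counts are all made up for illustration. When a patterned texture is sampled below its detail rate without proper filtering, the result flips between values as the view shifts slightly (shimmer in motion); a properly mip-averaged lookup stays stable:

```python
# Toy model of texture shimmering: sampling a high-frequency checker
# pattern below its Nyquist rate flickers as the view shifts slightly,
# while a properly filtered (mip-averaged) lookup stays stable.

def checker(u):
    """1-D checkerboard texture: alternates 0 and 1 every texel."""
    return int(u) % 2

def point_sample(u):
    """Unfiltered lookup of the base (highest-detail) mip."""
    return checker(u)

def filtered_sample(u, footprint):
    """Box-average over the pixel's footprint, roughly what sampling
    the correct mip level would return."""
    n = 16
    return sum(checker(u + footprint * i / n) for i in range(n)) / n

# One screen pixel covers 3.7 texels (minification), and the camera
# slides a quarter of a texel each frame.
footprint = 3.7
point_vals = [point_sample(0.3 + frame * 0.25) for frame in range(8)]
filt_vals = [filtered_sample(0.3 + frame * 0.25, footprint) for frame in range(8)]

print("point sampled:", point_vals)                        # flips 0/1 -> shimmer
print("filtered:     ", [round(v, 2) for v in filt_vals])  # stays near 0.5
```

That flip-flopping between frames is exactly what reads as "crawling" on the patterned floor when you move.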
Even at High Quality, there is just something weird about those textures that I noticed coming from an ATI card. For starters, HQ doesn't always do Trilinear - you may have to force it.
Being that Quality is default, benchmark numbers reflect this loss in Image Quality when they bench with Anisotropic Filtering, which is unfair to the competition that uses optimizations with NO Image Quality loss.
Quality.JPG VS High_Quality.JPG
In the High Quality picture above, you will see that near the wall there is a bilinear transition. Trilinear is set in the game, however.
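For anyone wondering what a "bilinear transition" band actually is, here's a toy sketch (the per-mip brightness values are made up, nothing measured from the drivers): bilinear-only mip selection snaps to one mip level, so the sampled color jumps where the level-of-detail crosses the halfway point, while trilinear blends the two nearest mips into a smooth fade:

```python
# Bilinear-only mip selection snaps to the nearest mip level, so the
# sampled color jumps where the LOD crosses 0.5 (the visible band near
# the wall); trilinear blends the two nearest mips, so it fades smoothly.

mip_color = {0: 1.0, 1: 0.6}  # hypothetical average brightness per mip

def bilinear_mip(lod):
    return mip_color[round(lod)]          # nearest mip only -> sudden jump

def trilinear(lod):
    lo = int(lod)
    frac = lod - lo
    hi = min(lo + 1, 1)
    return mip_color[lo] * (1 - frac) + mip_color[hi] * frac  # lerp mips

lods = [i / 10 for i in range(11)]        # walking away from the wall
band = [bilinear_mip(l) for l in lods]    # 1.0 ... 0.6 with one big step
smooth = [trilinear(l) for l in lods]     # eases from 1.0 down to 0.6
```

The one big step in `band` is the seam you can see on the floor; `smooth` is what forcing Trilinear should give you.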
Quality_2.JPG VS High_Quality_2.JPG
Again, in the High Quality pictures you are able to see some bilinear transitions; one is a bit beyond the lower, far corner of the door.
If you are one of those dudes that was complaining about the second set not being in the same position. Here you go:
Q.JPG VS HQ.JPG
These were taken using the 71.84 BETA drivers, though. There is an additional feature in 71.84 named Negative LOD BIAS; it was changed from the default to Clamp.
If anyone can post an ATI pic to confirm or dispute that ATI has better default driver image quality with 8x Anisotropic Filtering enabled, that would be great, since I don't have an ATI card. I'm looking into buying a cheap 9600 to prove this, but I don't know how long that will take.
SPECS
Athlon XP 2700
1GB Corsair Value Select PC2700
A7N8X-X
eVGA 6600GT - Forceware 66.93
Other than changing Image Settings in the drivers, Vsync was off and Anisotropic Filtering was set to 8x. Everything else left at default.
For Optimal Image Quality using drivers 71.84 without Anisotropic Filtering leave drivers at default settings. If you want Anisotropic Filtering, change the following:
Image Settings: High Quality
Force Mipmaps: Trilinear
Negative LOD BIAS: Clamp
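A rough sketch of what that Clamp setting does - this is my own toy model of mip selection, the real driver and hardware math is more involved. A negative LOD bias forces a sharper (lower) mip than the pixel footprint calls for, which sharpens still shots but shimmers in motion; Clamp simply refuses to let the bias go below zero:

```python
# Toy model of "Negative LOD BIAS: Clamp": a negative bias picks a
# sharper mip than the footprint warrants (sharp but aliased); Clamp
# limits the bias so it can never go negative.
import math

def base_lod(texels_per_pixel):
    """Standard mip selection: log2 of the pixel's texture footprint."""
    return max(0.0, math.log2(texels_per_pixel))

def lod_with_bias(texels_per_pixel, bias, clamp=False):
    if clamp:
        bias = max(bias, 0.0)  # Clamp: never allow a negative bias
    return max(0.0, base_lod(texels_per_pixel) + bias)

# A pixel covering 4 texels wants mip 2; a -1.0 bias forces mip 1
# (sharper but undersampled -> shimmer); Clamp restores mip 2.
print(lod_with_bias(4.0, -1.0))              # 1.0
print(lod_with_bias(4.0, -1.0, clamp=True))  # 2.0
```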
PICS IN OTHER GAMES
HL2Q.JPG VS HL2HQ.JPG - Highest Settings, no Vsync
FarCryQ.JPG VS FarCryHQ.JPG - High Settings, AF set in driver, no Vsync
Originally posted by: VIAN
*UPDATED 3/6/05
PICS IN OTHER GAMES
HL2Q.JPG VS HL2HQ.JPG - Highest Settings, no Vsync
FarCryQ.JPG VS FarCryHQ.JPG - High Settings, AF set in driver, no Vsync
Originally posted by: McArra
Nvidia drivers sux... That's why I use XG, the IQ is much better.
I didn't see any such thing in your link.
Originally posted by: otispunkmeyer
Originally posted by: McArra
Nvidia drivers sux... That's why I use XG, the IQ is much better.
yes they are, im loving these XG 71.84s
Originally posted by: BenSkywalker
I didn't see any such thing in your link.
You can see it in every 3D PC game outside of Tron 2.0 that I can think of. I posted the link in response as I figured you would try and deny there were any issues (as we have gone over this before), and I figured people who wanted the truth might be interested.
No it's not, and I made sure that you would be able to see the artifacts at any angle, distance, whatever. I showed you the same places in the COD screen shot and it held true, so now I still try to go to the same place, but I'm not as worried if I screw up. I could be playing Splinter Cell: Pandora Tomorrow right now; I don't need to be wasting my time.
VIAN... not to be picky, but your new screenshots still aren't taken in the exact same spot... save the game at a point and don't touch the mouse at all so you get the EXACT same view. Otherwise it makes it pretty much impossible to compare textures off in the distance and the quality of AF and trilinear or bilinear filtering.
If it was about me, then I would, but other people are being duped, in my opinion, with these benchmarks.
Nvidia drivers sux... That's why I use XG, the IQ is much better.
Trust me, it only looks harmless in the picture. If you hate aliasing, you will hate it. In fact, even when you can't tell the difference within a picture, in real life when moving it will still contribute to more aliasing vs High Quality. Negative LOD helps sharpen textures without Anisotropic Filtering. If you enable Anisotropic Filtering, Negative LOD turns off automatically because it looks bad. I haven't seen it, but it's good that it's automatically controlled.
thats better, i think HQ just about has the edge there, but its not something id notice playing. what does the Negative LOD do? does it help the filtering problems you're documenting?
Are you sure you are viewing the picture at its original size?
well id choose the quality if i was deciding from them pics, i just cant tell the difference and id rather have the higher FPS
Originally posted by: VIAN
Again, this ATI filtering thing you will probably not notice, especially if you just play the game. But if you take time away from your FPS and look at the graphics and the beauty, you will see it. But you will also see Nvidia's, although Nvidia's is cleaner. I don't think you'll be able to spot the difference in a screenshot.
I used to see the ATI filtering thing in COD in the grassy hills next to this town, can't think of the name at the moment.
Originally posted by: Robor
Originally posted by: otispunkmeyer
Originally posted by: McArra
Nvidia drivers sux... That's why I use XG, the IQ is much better.
yes they are, im loving these XG 71.84s
What are the advantages?