Many thanks to ken and the others for the pictures and the hosting.
Originally posted by: Praxis1452
what about the "needle in the eyes" shimmering. OMFG don't turn it into a flamewar. UGH I swear you and joker are the same person with 2 opposite personalities....
Oh, it's there:
http://www.newegg.com/Product/ProductLi...ption=&srchInDesc=&minPrice=&maxPrice=
HardOCP gives ATi the image quality crown and talks about the various parts of IQ. What I like most about Hard is that they go more in-depth and use settings most others do not. Gone are the canned benchmarks with no mention of IQ.
What Hard said about AA:
When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1), and we saw no difference with our own eyes in-game.
Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is that at 2X AA ATI is still ever so slightly better; the colors just seem more blended and less "harsh" than NVIDIA's. However, at 4X AA they looked damn near identical.
Basically, it's a draw if you use gamma-corrected AA. Unless, of course, you use 8xAA on NV cards, which then looks better. However, no game was ever playable with it for me, sadly, not even with SLI, so it was a worthless option for me.
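If you're wondering what "Gamma Correct AA" actually changes: edge samples have to be averaged in linear light, not as the gamma-encoded values the framebuffer stores, or blended edges come out darker than they should. Here's a rough Python sketch of the idea; the plain 2.2 exponent is my own stand-in for the real sRGB curve, not anything NV or ATi have published:

    GAMMA = 2.2

    def to_linear(c):   # decode a 0..1 gamma-encoded value to linear light
        return c ** GAMMA

    def to_gamma(c):    # re-encode linear light for the display
        return c ** (1.0 / GAMMA)

    # A 4x AA edge pixel: half the samples hit white (1.0), half hit black (0.0)
    samples = [1.0, 1.0, 0.0, 0.0]

    naive = sum(samples) / len(samples)   # averaging the stored (gamma) values
    correct = to_gamma(sum(to_linear(s) for s in samples) / len(samples))

    print(f"gamma-space resolve:   {naive:.3f}")    # 0.500 -> displays darker than half intensity
    print(f"gamma-correct resolve: {correct:.3f}")  # ~0.729 -> actually half the light

That too-dark blend is the kind of "harsh" edge Hard is describing; enabling gamma correction is what brought NV's edges in line with ATi's.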
What they said about AF:
While regular AF was definitely identical in-game, turning on ATI's special "High Quality" AF turned the tables big-time. ATI gives you a wonderful option of being able to have a less angle-dependent form of AF enabled. What this means is that at and around the 45 degree angles on textures ATI can do more filtering than NV can. Any texture that is on a steeply angled surface will receive better AF.
This made a huge difference in Oblivion, with its outdoor scenes having rolling hills and steep terrain angles. This also helped in Half Life 2: Episode 1. It didn't help much in FEAR, which is a rather dark game with mostly 90 degree angles, nor in Age of Empires III, which also doesn't have any steep angles. It did, though, help immensely in World of Warcraft, which has large outdoor scenes.
Any way you slice it, ATI has the upper hand in AF with the option to enable "High Quality" AF.
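The "less angle dependent" part is the whole trick. A pixel's footprint in texture space is an ellipse, and ideal AF takes enough samples to cover the ellipse's long axis no matter which way it points. A cheap estimate that only measures the footprint along the texture's u and v axes goes blind near 45 degrees. Here's a toy Python model of that; the actual heuristic inside NV's hardware isn't public, so treat this as an illustration of the math, not their algorithm:

    import numpy as np

    def true_aniso(J):
        """Ideal anisotropy: major/minor axis ratio of the pixel footprint."""
        s = np.linalg.svd(J, compute_uv=False)
        return s[0] / s[1]

    def axis_aligned_aniso(J):
        """Cheap estimate: compare the footprint's extent along u and v only."""
        len_u = np.hypot(J[0, 0], J[0, 1])
        len_v = np.hypot(J[1, 0], J[1, 1])
        return max(len_u, len_v) / min(len_u, len_v)

    def footprint(angle_deg, stretch=4.0):
        """Jacobian of a footprint stretched 4x along angle_deg in (u,v) space."""
        a = np.radians(angle_deg)
        R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
        return R @ np.diag([stretch, 1.0])

    for angle in (0.0, 22.5, 45.0):
        J = footprint(angle)
        print(f"{angle:4.1f} deg: true aniso {true_aniso(J):.2f}, "
              f"axis-aligned estimate {axis_aligned_aniso(J):.2f}")

At 0 degrees both agree (4.00), but at 45 degrees the cheap estimate collapses to about 1.00, i.e. it requests almost no extra filtering exactly where Hard saw ATi's "High Quality" mode pull ahead.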
What they said about shimmering:
Ok, let's just get right to it, shall we? NVIDIA has worse texture crawling and moiré with the default driver settings than ATI does. Notice I said worse, because ATI is not entirely out of the woods; they have it as well, just to a lesser extent.
World of Warcraft has to be the worst game for texture crawling that we have come across. It is just downright annoying. You notice it mostly outdoors on cobblestone or dirt paths. You can also see it on the grass texture, but since it is a dark color it is harder to spot.
Texture crawling, as we are calling it, is basically where it looks like there are little marching ants crawling on your texture as you move away from and toward it. For example, walking down a path in WoW you see a wavy or shimmering pattern on the ground that moves with you. You can read all about moiré patterns here. It is really bad on NVIDIA GPUs with default driver settings in WoW. ATI also suffers from it, but not as badly as NVIDIA GPUs do.
Another place we saw this was in Half Life 2 and Half Life 2: Episode 1. We didn't notice it in the Citadel maps in Episode 1, but once we got outside into City 17 we started seeing it on the ground. Moiré was also very noticeable outdoors in HL2: EP1.
Texture crawling and moiré are not something you can see in a screenshot; they can only be represented with movement. In the above screenshot of Half Life 2: Episode 1 we outlined a portion of road in this map. This is one area where we saw texture crawling and moiré as we moved down the road. It was visible on both ATI and NVIDIA hardware, but worse on NV hardware. Anywhere with detailed textures like this road, you can spot texture crawling.
The great news is that this can be reduced on NVIDIA hardware; the bad news is that it costs performance. You can manually turn off all the filtering optimizations in the advanced driver control panel as well as clamp the LOD bias. This greatly reduces it, though it doesn't entirely do away with it, and it takes a performance hit. Still, if you want the best texture quality you will have no choice but to take the hit.
Overall, ATI has the better texture quality, with less texture crawling by default and the option of "High Quality" AF.
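The "clamp the LOD bias" option makes more sense once you see how mip selection works: the LOD is roughly log2 of how many texels one pixel step crosses, plus a bias. Games and drivers sometimes apply a negative bias to look sharper, which picks a mip too detailed for the pixel spacing, and that undersampling is the crawling. A back-of-the-envelope Python sketch; the numbers are invented, but the relationship is the real one:

    import math

    def mip_level(texels_per_pixel, lod_bias=0.0, clamp_bias=False):
        """Texture LOD ~ log2(texel footprint per pixel) + bias."""
        if clamp_bias:
            lod_bias = max(lod_bias, 0.0)   # the driver's "Clamp" setting
        return max(0.0, math.log2(texels_per_pixel) + lod_bias)

    # A ground texture in the distance: each screen pixel spans 4 texels.
    print(mip_level(4.0))                                  # 2.0 -> correctly filtered mip
    print(mip_level(4.0, lod_bias=-1.5))                   # 0.5 -> too sharp, shimmers in motion
    print(mip_level(4.0, lod_bias=-1.5, clamp_bias=True))  # 2.0 -> crawl mostly gone

That is also why the fix costs performance: the clamped, correct mip means more texels actually get filtered instead of skipped.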
Overall image quality:
We took an in-depth look at image quality in this evaluation. We had two 30" LCDs side-by-side and were able to look at IQ in-game. Putting together everything we learned, we can confidently say that the ATI Radeon X1900 XTX CrossFire platform offered the best image quality in games.
We have all the proof to back this up. With the Radeon X1900 XTX and CrossFire platform, antialiasing plus HDR is possible in Oblivion: ATI has the ability to do multisampling plus floating point blending HDR. ATI has a "High Quality" anisotropic filtering option which has real, tangible image quality benefits in games with little to no performance hit. In large outdoor games this is a huge benefit. It also helps in games that have very high quality and detailed textures, like Half Life 2: Episode 1 and Ghost Recon. Having these two displays side-by-side proved that texture crawling and moiré are worse on NVIDIA hardware at default driver settings compared to ATI hardware. We especially noticed this in World of Warcraft, Half Life 2: Episode 1, and Battlefield 2. When you add all of this up it becomes clear that ATI still holds the image quality crown.
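One note on the AA-plus-HDR point, since it's the headline Oblivion feature: the reason float multisampling matters is that the AA resolve has to average the raw HDR sample values before tone mapping. If bright values get clamped into a displayable 0-1 range first, which is roughly the trade-off when float multisampling isn't available, blended edges against bright skies lose their gradation. A hedged Python sketch, using a simple Reinhard-style tone map purely as a stand-in for whatever Oblivion really does:

    def tonemap(x):   # Reinhard operator: maps [0, inf) into [0, 1)
        return x / (1.0 + x)

    # Edge pixel between a bright sky (HDR value 6.0) and a dark roof (0.1)
    samples = [6.0, 6.0, 0.1, 0.1]

    hdr_resolve = tonemap(sum(samples) / len(samples))                       # float MSAA resolve
    ldr_resolve = tonemap(sum(min(s, 1.0) for s in samples) / len(samples))  # clamped first

    print(f"float resolve:   {hdr_resolve:.3f}")  # ~0.753, a bright, smooth edge
    print(f"clamped resolve: {ldr_resolve:.3f}")  # ~0.355, the highlight is crushed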
From my personal experience, ATi has the IQ crown, mainly for two reasons. First, on my setup NV shimmers much, much worse than ATi; so much that it's downright distracting. Not everyone has hardware that shows it as badly as mine, so it won't be a problem for most people. Second, HQ AF is for real, and I can easily see the difference. AA quality at 2x and 4x looks the same to me while playing games; it's a wash. 8x on NV looks better than 6x on ATi, but since it wasn't playable for me (at 1920x1200), it was a useless option.
Most people here have not seen, or have not had, both NV and ATi high-end hardware in their personal PC, and still shots only show so much. It's something you really have to see in person. Everyone has their own opinion; mine is that ATi has better IQ overall. That was also HardOCP's conclusion when they had both LCDs side by side, one with NV and the other with ATi: they said ATi looked better as well.