I want to get one of these 8800 GTX cards, but I can't seem to find out if Nvidia has learned how to do HDR+AA yet.
I found this in AT's 8800 GTX Review:
With NVIDIA's new method of acquiring a more detailed blur via CSAA, angle independent anisotropic filtering, and high performance with Transparency AA, potential image quality is improved over G70 and R580. The new architecture is capable of floating point frame buffer blends and antialiasing of floating point data. ATI has continually called this ability HDR+AA, and while it is better to be able to use full floating point for HDR, this isn't the only solution to the problem. There are some rendering techniques that employ MRTs (Multiple Render Targets) that will still not allow AA to be performed on them alongside HDR. There are also HDR techniques that allow antialiasing to be performed along with HDR without the need for AA + floating point (like games based on Valve's Source engine).
In any case, we've already covered the major differences in AA and AF modes and we even looked at how the optimizations affect image quality. For this section, we'll take a look at three different cases in which we employ the non-AA graphics settings we will be using in our performance tests. We are looking for differences in alpha blending, effective AF level in a game, and shader rendering. We didn't see anything that stood out, but feel free to take a look for yourselves.
All this did was make my head hurt. I'm a simple man and don't understand all this technical stuff. Are they saying that the card can't do HDR+AA specifically, but can achieve the same result through some other method?
My only concern is that upgrading to this card will cost me the HDR+AA capability I currently love in Oblivion.
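For anyone else puzzling over the same passage, the question boils down to a concrete capability check. Below is a minimal sketch, assuming a Direct3D 9 application (the FP16 format and the 4x sample count are illustrative choices, not anything Oblivion is confirmed to use): a game that wants HDR+AA asks the runtime whether it can render to a floating-point surface, blend into it, and multisample it. The 7-series GeForces pass the first two checks but fail the third, which is why Oblivion greys out AA when HDR is enabled on them; the review is saying the 8800 GTX, like the R5x0 Radeons, passes all three.

```cpp
// Hypothetical HDR+AA capability probe for Direct3D 9.
// Asks the runtime three questions about a 64-bit FP16 surface:
// can we render to it, blend into it, and antialias it?
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // 1. Can the hardware render to an FP16 surface at all?
    HRESULT fp16Target = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,            // current display format
        D3DUSAGE_RENDERTARGET,
        D3DRTYPE_SURFACE,
        D3DFMT_A16B16G16R16F);      // FP16 buffer typically used for HDR

    // 2. Can it blend into that surface (needed to accumulate HDR lighting)?
    HRESULT fp16Blend = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_SURFACE,
        D3DFMT_A16B16G16R16F);

    // 3. Can it multisample that same surface? This is the "HDR+AA"
    //    check that G70/G71 fail and that R580 and G80 pass.
    DWORD quality = 0;
    HRESULT fp16Msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,
        TRUE,                       // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,   // 4x MSAA, illustrative
        &quality);

    printf("FP16 render target: %s\n", SUCCEEDED(fp16Target) ? "yes" : "no");
    printf("FP16 blending:      %s\n", SUCCEEDED(fp16Blend)  ? "yes" : "no");
    printf("FP16 + 4x MSAA:     %s\n", SUCCEEDED(fp16Msaa)   ? "yes" : "no");

    d3d->Release();
    return 0;
}
```

The review's point about Valve's Source engine is that there are also HDR techniques which tone-map before the final buffer, so AA can be applied to an ordinary integer surface and the third check above never comes into play.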