nVIDIA Still Unable to do HDR+AA?

giantpinkbunnyhead

Diamond Member
Dec 7, 2005
3,251
1
0
I want to get one of these 8800 GTX cards, but I can't seem to find out if Nvidia has learned how to do HDR+AA yet.

I found this in AT's 8800 GTX Review:


With NVIDIA's new method of acquiring a more detailed blur via CSAA, angle independent anisotropic filtering, and high performance with Transparency AA, potential image quality is improved over G70 and R580. The new architecture is capable of floating point frame buffer blends and antialiasing of floating point data. ATI has continually called this ability HDR+AA, and while it is better to be able to use full floating point for HDR, this isn't the only solution to the problem. There are some rendering techniques that employ MRTs (Multiple Render Targets) that will still not allow AA to be performed on them alongside HDR. There are also HDR techniques that allow antialiasing to be performed along with HDR without the need for AA + floating point (like games based on Valve's Source engine).

In any case, we've already covered the major differences in AA and AF modes and we even looked at how the optimizations affect image quality. For this section, we'll take a look at three different cases in which we employ the non-AA graphics settings we will be using in our performance tests. We are looking for differences in alpha blending, effective AF level in a game, and shader rendering. We didn't see anything that stood out, but feel free to take a look for yourselves.


All this did was make my head hurt. I'm a simple man and don't understand all this technical stuff. Are they saying that they can't do HDR+AA specifically, but achieve the same result through some other method?

My only concern is that upgrading to this card will cost me the HDR+AA capability that I currently love in Oblivion.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
It can do it. Quite well actually.

Click

Oblivion 16xAA+HDR / 16xAF (fps)

XFX 8800GTX
- 1280x1024 -- 95.921
- 1600x1200 -- 86.899
- 1920x1200 -- 70.591
- 2048x1536 -- 54.103

Asus 8800GTX
- 1280x1024 -- 95.838
- 1600x1200 -- 86.988
- 1920x1200 -- 70.941
- 2048x1536 -- 54.006

EVGA 8800GTS
- 1280x1024 -- 82.566
- 1600x1200 -- 68.585
- 1920x1200 -- 55.360
- 2048x1536 -- 41.117

Oblivion 16xQAA+HDR / 16xAF (fps)

XFX 8800GTX
- 1280x1024 -- 70.278
- 1600x1200 -- 47.045
- 1920x1200 -- 38.784
- 2048x1536 -- 30.360

Asus 8800GTX
- 1280x1024 -- 70.184
- 1600x1200 -- 46.995
- 1920x1200 -- 38.156
- 2048x1536 -- 30.303

EVGA 8800GTS
- 1280x1024 -- 51.778
- 1600x1200 -- 33.739
- 1920x1200 -- 28.488
- 2048x1536 -- 21.998
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Kromis
What's QAA?

It means quality AA. E.g. 8xQAA means true 8x MSAA, taking 8 samples, etc. nVIDIA sure does hold the IQ crown with an iron grip.
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
Originally posted by: Cookie Monster
Originally posted by: Kromis
What's QAA?

It means quality AA. E.g. 8xQAA means true 8x MSAA, taking 8 samples, etc. nVIDIA sure does hold the IQ crown with an iron grip.

Ahh, okay. Thanks for clearing that up! I was guessing quality AA...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
A repost of mine from lopri's G80 speculation thread.

Just a quick post with two purposes in mind. First, confirmation that G80's ROPs do indeed take 4 samples per cycle (all previous ROPs took only 2). This means single-cycle 4x AA (or 4x AA for the price of last gen's 2x AA).

Secondly, I want to firmly bury Josh's HDR + AA bogey-man / doubts.

This is taken from the B3D G80 article:
it's able to perform 8x multisampling using rotated or jittered subsamples laid over a 4-bit subpixel grid, looping through the ROP taking 4 multisamples per cycle.

It can multisample from all backbuffer formats too, NVIDIA providing full orthogonality, including sampling from pixels maintained in a non-linear colour space or in floating point surface formats. Thus the misunderstood holy grail of "HDR+AA" is achieved by the hardware with no real developer effort.
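
In API terms (a quick hedged sketch of my own, not something from the B3D article), the whole "HDR+AA" question boils down to whether the driver exposes multisampling on floating point surface formats, which any D3D9 app can check with the standard caps query:

```cpp
// Hedged sketch: querying whether 4x MSAA is available on an FP16 render
// target, i.e. the capability people shorthand as "HDR+AA".
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // FP16 surface, the usual "HDR" format
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &qualityLevels);

    std::printf("4x MSAA on FP16 surfaces: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}
```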
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gstanfor
Secondly, I want to firmly bury Josh's HDR + AA bogey-man / doubts.

No need for you to do that. Several benches, notably the Gamepyre one, did that several hours ago.

Besides, it's not like I was the only one with last-second doubts before it launched:
Originally posted by: Gstanfor
It's quite possible nvidia will choose not to expose HDR+AA on DX9 at all, but wait for DX10. We'll just have to wait and see.

I base that comment upon FiringSquad's DX10 article and this slide contained therein, btw.

Honestly, we didn't know what was going to happen with the HDR+AA feature up until today. In fact, not every article even addressed it today, and it's one of the biggest improvements, IMO.

nVidia can do it, and it can do it better than ATi's current offerings and certainly better than nVidia's own previous GPUs. :p
 
Jun 14, 2003
10,442
0
0
Yeah, it can do it.

In Oblivion, Far Cry, Splinter Cell 3 and SS2 it can definitely do it.

The only game I know of where no card can do AA and HDR is GRAW, which uses multiple render targets; no card, not the X1900 nor the 8800, can do HDR and AA in that game. It's just the way it's been implemented: the deferred renderer writes the scene into several render targets, and DX9 gives you no way to antialias those buffers and then read them back per sample (see the sketch below). Nvidia does have an answer to that though, an Edge AA setting, which is a kind of loose AA that tries to get rid of the jaggies on edges. It's not great, but it's better than nothing.
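
For anyone curious why the MRT path rules it out, here's a rough, hedged D3D9-style sketch (illustrative only, not GRAW's actual code): a deferred G-buffer has to be built from textures that the lighting pass can sample, and D3D9 textures cannot be created multisampled, so every target ends up at one sample per pixel and there is nothing for MSAA to resolve.

```cpp
// Illustrative sketch of a GRAW-style deferred G-buffer setup on D3D9.
// CreateTexture has no multisample parameter, so every render target that
// later gets sampled as a texture is forced to a single sample per pixel.
#include <d3d9.h>

void CreateGBuffer(IDirect3DDevice9* device, UINT width, UINT height) {
    IDirect3DTexture9 *albedo = NULL, *normals = NULL, *depth = NULL;
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8,      D3DPOOL_DEFAULT, &albedo,  NULL);
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT, &normals, NULL);
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_R32F,          D3DPOOL_DEFAULT, &depth,   NULL);

    // Bind the three targets as MRTs for the geometry pass. Since none of
    // them can be multisampled, the lighting pass never sees AA'd edges.
    IDirect3DSurface9 *s0 = NULL, *s1 = NULL, *s2 = NULL;
    albedo->GetSurfaceLevel(0, &s0);   device->SetRenderTarget(0, s0);
    normals->GetSurfaceLevel(0, &s1);  device->SetRenderTarget(1, s1);
    depth->GetSurfaceLevel(0, &s2);    device->SetRenderTarget(2, s2);
    // (Error checking and Release calls omitted for brevity.)
}
```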
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Dang! It is getting really hard not to jump on one of these beauties.
I really want to wait until there is some competition, but damn, Nvidia did one hell of a job.
 

hardwareking

Senior member
May 19, 2006
618
0
0
Off topic: they should put this thing in the PS3.
Back on topic:
This is the first single card that can play Oblivion with all in-game settings turned up and with a whopping 16xAA.
It's fast, it's furious, and it's really expensive.
One question: is there a big difference between FP16 HDR and FP32 HDR, in terms of quality and appearance?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanfor
Secondly, I want to firmly bury Josh's HDR + AA bogey-man / doubts.

Josh ALREADY said "It can do it. Quite well actually." and posted not only a link, but resolution/FPS figures three posts above yours in this very thread.

Quit trying to deliberately stir up trouble.
 

giantpinkbunnyhead

Diamond Member
Dec 7, 2005
3,251
1
0
One more thing... this card is GDDR3. Didn't somebody already cut loose a GDDR4 card? Or am I getting mixed up on something? Is there much of a difference between 3 and 4?
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: giantpinkbunnyhead
One more thing... this card is GDDR3. Didn't somebody already cut loose a GDDR4 card? Or am I getting mixed up on something? Is there much of a difference between 3 and 4?

The X1950XTX and X1950XT have GDDR4.

It's faster, but obviously more expensive as well.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
The X1950XTX has GDDR4 and the G80s have GDDR3; however, the GDDR4 used on the X1950XTX is pretty much the slowest GDDR4 around. While it is faster than GDDR3, GDDR3 has lower latency, and with the G80's high memory frequency and wider bus, its bandwidth in GB/sec actually comes out greater than the X1950's.
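
A rough back-of-the-envelope, assuming the commonly quoted launch specs (numbers from memory, so treat them as approximate): the GTX pairs a 384-bit bus with ~900MHz (1800MT/s effective) GDDR3, while the X1950XTX pairs a 256-bit bus with ~1000MHz (2000MT/s effective) GDDR4. That works out to 48 bytes/transfer × 1.8 GT/s ≈ 86.4 GB/sec for the GTX versus 32 bytes/transfer × 2.0 GT/s ≈ 64 GB/sec for the X1950XTX.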

The X1950XTX is nice, but if I were buying high-end to ultra-high-end, I'd go with an 8800GTS or 8800GTX, no question about it.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
There will be revisions of the 8800GTX with GDDR4. I would post a link (I think this was stated in the AT article), but I don't have time.