Physically Impossible to do HDR+AA?

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
I have heard all the talk about HDR+AA being impossible because of the way the G70's core is designed and sh!t like that.

I just want to know WHY it's physically impossible for it to do HDR+AA. I want an explanation, so don't say "because it just is" or "because of the way the core is"; try to make it detailed, even if I might not understand it :p

EDIT: and why would nvidia make something that can't do HDR+AA?
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: wizboy11


EDIT: and why would nvidia make something that can't do HDR+AA?
Maybe because of the massive frame rate hit of using HDR with AA. Only SLI rigs could cope, I reckon.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
No, it can't do it. The patch works, but the HDR+AA doesn't. It's a design limitation, what else do you want, wizboy? It's like asking why a Radeon 8500 can't render DX9...
Also, HDR (OpenEXR, that is) takes a huge performance hit, so AA is pointless in most situations (although it's nice to have the option, and it is a step in the right direction).
Valve's HDR can be used with AA at nice performance, and the G70 handles that fine. It's only the OpenEXR format that the G70 can't do with AA (the only full-precision HDR, if I remember correctly).
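To make the distinction concrete, here is a rough Direct3D 9 sketch (my own, not from anyone in this thread) of how an engine can ask at runtime whether the hardware supports a floating-point "OpenEXR style" render target, and whether it will also multisample that format. The G70 answers yes to the first question and no to the second, while the X1800/X1900 answer yes to both.

```cpp
#include <cstdio>
#include <d3d9.h>

// Rough sketch: query whether the GPU exposes an FP16 render target at all,
// and whether it will also multisample that format. Link against d3d9.lib.
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Step 1: can we create a 64-bit FP16 ("OpenEXR style") render target?
    HRESULT fp16Target = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);

    // Step 2: does the hardware also allow 4x multisampling on that format?
    // This is the combination at issue: NV4x/G70 report no, R5xx reports yes.
    HRESULT fp16Msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    printf("FP16 render target supported: %s\n", SUCCEEDED(fp16Target) ? "yes" : "no");
    printf("FP16 + 4x MSAA supported:     %s\n", SUCCEEDED(fp16Msaa) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```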
 

pibrahim

Member
Jan 13, 2006
48
0
0
Perhaps this is overly simplistic, but for those with SLI rigs, why cannot card 1 render HDR while card 2 deals with AA?
 

F1shF4t

Golden Member
Oct 18, 2005
1,583
1
71
Originally posted by: pibrahim
Perhaps this is overly simplistic, but for those with SLI rigs, why cannot card 1 render HDR while card 2 deals with AA?


Each card renders half the screen. And AA and HDR aren't effects you can bolt on afterwards; the image has to be rendered with them.

EDIT: well, unless you want half your screen in HDR and the other half with AA :p
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: wizboy11
EDIT: and why would nvidia make something that can't do HDR+AA?

Because when Nvidia designed the G70, they couldn't figure out a way to do HDR+AA. I seem to remember a quote from an Nvidia engineer where they said it wasn't possible with current graphics technology. Lo and behold, ATI was able to figure out a way to make it work.

Now the question becomes, assuming that future games will use HDR (safe assumption) and will support the ability to do AA+HDR (pretty safe as well), why would you buy any video card that doesn't do AA+HDR? AA is obviously a very important IQ enhancing feature. HDR, while some may debate its merits, seems to be the hot thing now for giving a more immersive game experience. Why would you pick a video card that makes you choose between these two features when you can have both?
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: pibrahim
Perhaps this is overly simplistic, but for those with SLI rigs, why cannot card 1 render HDR while card 2 deals with AA?


Simplistic answer: because HDR changes what ends up on the screen. AA would be wasted calculation if it knows nothing about what the HDR tone mapping will do to those pixels.
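A toy numerical sketch (mine, not Janooo's) of why the order matters: if the hardware averages the raw HDR samples at an edge before tone mapping happens, it gets a different, and usually worse, result than averaging after tone mapping, which is what a correct HDR+AA pipeline has to arrange.

```cpp
#include <cstdio>

// Simple Reinhard-style operator mapping [0, inf) HDR values into [0, 1).
static float tonemap(float x) { return x / (1.0f + x); }

int main() {
    // Two samples straddling an edge: a dim sky texel next to a very bright sun texel.
    float a = 0.1f, b = 20.0f;

    // Hardware resolve first (average raw HDR values), tone map once afterwards.
    float resolveThenTonemap = tonemap((a + b) * 0.5f);            // ~0.91, edge blows out toward white

    // Tone map each sample, then average the display-ready values.
    float tonemapThenResolve = (tonemap(a) + tonemap(b)) * 0.5f;   // ~0.52, a smoother edge

    printf("resolve then tonemap: %.3f\n", resolveThenTonemap);
    printf("tonemap then resolve: %.3f\n", tonemapThenResolve);
    return 0;
}
```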
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: M0RPH
Originally posted by: wizboy11
EDIT: and why would nvidia make something that can't do HDR+AA?

Because when Nvidia designed the G70, they couldn't figure out a way to do HDR+AA. I seem to remember a quote from an Nvidia engineer where they said it wasn't possible with current graphics technology. Lo and behold, ATI was able to figure out a way to make it work.

Now the question becomes, assuming that future games will use HDR (safe assumption) and will support the ability to do AA+HDR (pretty safe as well), why would you buy any video card that doesn't do AA+HDR? AA is obviously a very important IQ enhancing feature. HDR, while some may debate its merits, seems to be the hot thing now for giving a more immersive game experience. Why would you pick a video card that makes you choose between these two features when you can have both?

Nice one Morph... you just confirmed to me that you are an absolute ATI back door burglar; your posts no longer exist to me.

That's just pure BS... nvidia couldn't figure it out? Yeah, because nvidia don't employ real engineers :confused::disgust:
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Here's what the chief scientist at Nvidia said about this (from here):

For those of you with super-duper graphics cards, you will have come across a problem: you can't use Anti-Aliasing when using HDR lighting, for example in Far Cry. In these cases, it's a situation where you have to choose one or the other. Why is this, and when is the problem going to get solved?

"OK, so the problem is this. With a conventional rendering pipeline, you render straight into the final buffer - so the whole scene is rendered straight into the frame buffer and you can apply the AA to the scene right there."

"But with HDR, you render individual components from a scene and then composite them into a final buffer. It's more like the way films work, where objects on the screen are rendered separately and then composited together. Because they're rendered separately, it's hard to apply FSAA (note the full-screen prefix, not composited-image AA! -Ed) So traditional AA doesn't make sense here."

So if it can't be done in existing hardware, why not create a new hardware feature of the graphics card that will do both?

"It would be expensive for us to try and do it in hardware, and it wouldn't really make sense - it doesn't make sense, going into the future, for us to keep applying AA at the hardware level. What will happen is that as games are created for HDR, AA will be done in-engine according to the specification of the developer.

"Maybe at some point, that process will be accelerated in hardware, but that's not in the immediate future."

But if the problem is the size of the frame buffer, wouldn't the new range of 512MB cards help this?

"With more frame buffer size, yes, you could possibly get closer. But you're talking more like 2GB than 512MB."

Now ask yourself: if it was as difficult as he says, why was ATI able to do it?

Spunkmeyer, if you're going to call what I'm saying BS, at least be informed enough to say why you think it's BS.
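For what it's worth, the frame buffer arithmetic behind that "2GB rather than 512MB" remark is easy to sketch, even if we don't know exactly which buffers Kirk was counting. The resolution, sample count and number of FP16 intermediates below are assumptions of mine, purely to show how quickly the numbers grow.

```cpp
#include <cstdio>

int main() {
    // Assumed numbers, not Kirk's: 1600x1200, 4 bytes/pixel for a normal 32-bit
    // colour buffer, 8 bytes/pixel for an FP16 (64-bit) HDR buffer, 4x multisampling.
    const long long w = 1600, h = 1200;
    const long long ldrBytes = 4, fp16Bytes = 8, samples = 4;

    long long ldrAA   = w * h * ldrBytes  * samples;   // conventional 4xAA colour buffer
    long long hdrAA   = w * h * fp16Bytes * samples;   // the same buffer, but FP16
    long long hdrComp = hdrAA * 4;                     // plus a few FP16 intermediates for
                                                       // the compositing passes he describes

    printf("32-bit colour, 4xAA:        %lld MB\n", ldrAA   >> 20);
    printf("FP16 colour, 4xAA:          %lld MB\n", hdrAA   >> 20);
    printf("FP16 4xAA + 3 extra passes: %lld MB\n", hdrComp >> 20);
    return 0;
}
```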
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: otispunkmeyer


Nice one Morph... you just confirmed to me that you are an absolute ATI back door burglar; your posts no longer exist to me.

That's just pure BS... nvidia couldn't figure it out? Yeah, because nvidia don't employ real engineers :confused::disgust:

LMAO!!! Back door burglar!!!! That just wailed on my funny bone! Thanks otis. :)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
IMO, the main reason Nv did not make HDR+AA possible is that it would have cost too much and possibly would have delayed the launch. Despite what many ppl like to believe, the G70 was not a whole new architecture compared to the NV40; it was just an evolutionary step forward with some tweaks and added shader capabilities. The NV40 did not support HDR+AA, and enabling it on the G70 might have required too many changes to the GPU from its original design.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: M0RPH
Here's what the chief scientist at Nvidia said about this (from here):

For those of you with super-duper graphics cards, you will have come across a problem: you can't use Anti-Aliasing when using HDR lighting, for example in Far Cry. In these cases, it's a situation where you have to choose one or the other. Why is this, and when is the problem going to get solved?

"OK, so the problem is this. With a conventional rendering pipeline, you render straight into the final buffer - so the whole scene is rendered straight into the frame buffer and you can apply the AA to the scene right there."

"But with HDR, you render individual components from a scene and then composite them into a final buffer. It's more like the way films work, where objects on the screen are rendered separately and then composited together. Because they're rendered separately, it's hard to apply FSAA (note the full-screen prefix, not composited-image AA! -Ed) So traditional AA doesn't make sense here."

So if it can't be done in existing hardware, why not create a new hardware feature of the graphics card that will do both?

"It would be expensive for us to try and do it in hardware, and it wouldn't really make sense - it doesn't make sense, going into the future, for us to keep applying AA at the hardware level. What will happen is that as games are created for HDR, AA will be done in-engine according to the specification of the developer.

"Maybe at some point, that process will be accelerated in hardware, but that's not in the immediate future."

But if the problem is the size of the frame buffer, wouldn't the new range of 512MB cards help this?

"With more frame buffer size, yes, you could possibly get closer. But you're talking more like 2GB than 512MB."

Now ask yourself: if it was as difficult as he says, why was ATI able to do it?

Spunkmeyer, if you're going to call what I'm saying BS, at least be informed enough to say why you think it's BS.

FYI, you are the only one who used the word "difficult". The engineer said it would be expensive. And until you fix your link, we won't know the date of this article.

Ah, you fixed it, thank you. Published: 11th July 2005 by Wil Harris
Almost seven months ago. I would think things have changed in the last seven months.
If ATI could "whip up" Crossfire as fast as they did, albeit new and buggy, I think Nv can "whip up" HDR+AA even if new, and buggy.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: keysplayr2003
FYI, you are the only one who used the word "difficult". The engineer said it would be expensive. And until you fix your link, we won't know the date of this article.

Ah, you fixed it, thank you. Published: 11th July 2005 by Wil Harris
Almost seven months ago. I would think things have changed in the last seven months.
If ATI could "whip up" Crossfire as fast as they did, albeit new and buggy, I think Nv can "whip up" HDR+AA even if new, and buggy.

What difference does it make if the article was written in July? We were talking about the G70 and why it can't do HDR+AA, which is pretty applicable. Whether the G71 will support this is unknown; I don't think it's beyond NVIDIA's capabilities, obviously, but whether it's still too expensive to accomplish with the G71 remains to be seen.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ayabe
Originally posted by: keysplayr2003
FYI, you are the only one who used the word "difficult". The engineer said it would be expensive. And until you fix your link, we won't know the date of this article.

Ah, you fixed it, thank you. Published: 11th July 2005 by Wil Harris
Almost seven months ago. I would think things have changed in the last seven months.
If ATI could "whip up" Crossfire as fast as they did, albeit new and buggy, I think Nv can "whip up" HDR+AA even if new, and buggy.

What difference does it make if the article was written in July? We were talking about the G70 and why it can't do HDR+AA, which is pretty applicable. Whether the G71 will support this is unknown; I don't think it's beyond NVIDIA's capabilities, obviously, but whether it's still too expensive to accomplish with the G71 remains to be seen.

It makes a lot of difference in the PC world. 7 months is a long time in this industry; the views of the engineers can change drastically in that amount of time. I agree that it remains to be seen if HDR+AA will be available on the G71. At the time this article was published, the G70s were already in production, so they were what they were; too late to change it. As for the expensive part, 7 months is a long time to ponder ways of reducing costs and getting the job done, if HDR+AA is in fact going to be implemented on the G71.

 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,389
8,547
126
Originally posted by: keysplayr2003
Originally posted by: ayabe
Originally posted by: keysplayr2003
FYI, you are the only one who used the word "difficult". The engineer said it would be expensive. And until you fix your link, we won't know the date of this article.

Ah, you fixed it, thank you. Published: 11th July 2005 by Wil Harris
Almost seven months ago. I would think things have changed in the last seven months.
If ATI could "whip up" Crossfire as fast as they did, albeit new and buggy, I think Nv can "whip up" HDR+AA even if new, and buggy.

What difference does it make if the article was written in July? We were talking about the G70 and why it can't do HDR+AA, which is pretty applicable. Whether the G71 will support this is unknown; I don't think it's beyond NVIDIA's capabilities, obviously, but whether it's still too expensive to accomplish with the G71 remains to be seen.

It makes a lot of difference in the PC world. 7 months is a long time in this industry; the views of the engineers can change drastically in that amount of time. I agree that it remains to be seen if HDR+AA will be available on the G71. At the time this article was published, the G70s were already in production, so they were what they were; too late to change it. As for the expensive part, 7 months is a long time to ponder ways of reducing costs and getting the job done, if HDR+AA is in fact going to be implemented on the G71.

and yet we're still talking about G70 and that is nvidia's explanation of why G70 doesn't do HDR+AA.

god some people around here just can't read.
 

John Reynolds

Member
Dec 6, 2005
119
0
0
Originally posted by: keysplayr2003
It makes a lot of difference in the PC world. 7 months is a long time in this industry; the views of the engineers can change drastically in that amount of time. I agree that it remains to be seen if HDR+AA will be available on the G71. At the time this article was published, the G70s were already in production, so they were what they were; too late to change it. As for the expensive part, 7 months is a long time to ponder ways of reducing costs and getting the job done, if HDR+AA is in fact going to be implemented on the G71.

No, 7 months is NOT a long time for yanking out your ROPs and redesigning them to allow multi-sampling to be applied to float render targets. I'd love it if NVIDIA could pull it off for their 7900s, but I seriously doubt it'll happen.
 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
ATI wins this one. I hope this will put the VTF argument to rest, because there are things the 7800 can't do as well.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: tvdang7
That's why I like ATI :). Looks great on the new Source maps, Nuke and Militia.

Ok, to clarify things here:

nVidia CAN do HDR + AA. What nVidia can't do is openEXR HDR with AA, like the guy many posts above said (wise guy, listen to him).

The practical HDR is the one implemented by Valve. They were the first to introduce it, and their HDR works differently from, and BETTER than, the other ones available, such as the one in Far Cry (openEXR). Valve's HDR can be rendered by nVidia and ATI cards with AA simultaneously.

When I say practical, I say it because Valve's HDR affects performance very little, as it uses shaders to render HDR. Once again, as of now, ATI has the ball in their hands, because their cards handle SM 3.0 like butter (thanks to the 48 pixel shader units). But we can't speak the truth until we see nVidia's 32/32 solution (at least that's what the rumours say) with the G71.


So stop whining that nVidia doesn't do HDR + AA. It does, and Valve's HDR is the HDR that will be used most in games, because it is the most efficient. No one gives a fu-ck about openEXR, no one including nVidia.
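A toy sketch (my own simplification, not Valve's actual technique) of the difference being described: the openEXR approach writes raw linear HDR values into a float render target and tone maps later, while the shader-based approach folds exposure into the pixel shader so the value written out already fits an ordinary 8-bit target the hardware can multisample.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// "openEXR style": the shader outputs raw linear HDR; it needs an FP16 render
// target because the value can go well past 1.0, and tone mapping happens later.
float shadeOpenExrStyle(float sceneLuminance) {
    return sceneLuminance;
}

// "Shader-based style": exposure/tone mapping is done inside the pixel shader,
// so the output already fits a normal 8-bit-per-channel target.
uint8_t shadeShaderStyle(float sceneLuminance, float exposure) {
    float mapped = 1.0f - std::exp(-sceneLuminance * exposure);   // simple exposure curve
    return static_cast<uint8_t>(std::min(mapped, 1.0f) * 255.0f + 0.5f);
}

int main() {
    printf("openEXR-style output: %.2f (needs a float render target)\n",
           shadeOpenExrStyle(6.0f));
    printf("shader-style output:  %d (fits an RGBA8 target)\n",
           (int)shadeShaderStyle(6.0f, 0.5f));
    return 0;
}
```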