8800 GTX next month, right? Will SM3.0 be properly supported then?

undeclared

Senior member
Oct 24, 2005
498
0
86
My question is...

will it be able to run SM3.0 HDR + FSAA in Oblivion, like the 7900 GTX should have been able to already? ;)

I wonder.
 

akshayt

Banned
Feb 13, 2004
2,227
0
0
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: akshayt
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.

Except that it falls flat on its face when running shaders with dynamic branching - which was the big feature of SM3 in the first place. But that's all history now, cuz I hear the g80 will support SM4 (lol), so get ready for the onslaught of posts claiming SM4 support will make the card future proof.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: akshayt
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.

:laugh::laugh::laugh::laugh:

Dear God, akshayt, and you're taking computer engineering classes and studying like crazy?

What is "16FPAA" or "12FPAA"?

Stop posting already.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Dethfrumbelo
The only way to be 'future proof' is to literally stop time.

I think the most future-proof card ever was the NV30... 4 years have gone by and people still mention it at every new release :laugh:
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: munky
Originally posted by: akshayt
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.

Except that it falls flat on its face when running shaders with dynamic branching - which was the big feature of SM3 in the first place. But that's all history now, cuz I hear the g80 will support SM4 (lol), so get ready for the onslaught of posts claiming SM4 support will make the card future proof.

Well, isn't SM4 the only option if you are programming a native DirectX 10 game? I would say that it's somewhat important for future compatibility.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: aka1nas
Originally posted by: munky
Originally posted by: akshayt
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.

Except that it falls flat on its face when running shaders with dynamic branching - which was the big feature of SM3 in the first place. But that's all history now, cuz I hear the g80 will support SM4 (lol), so get ready for the onslaught of posts claiming SM4 support will make the card future proof.

Well, isn't SM4 the only option if you are programming a native DirectX 10 game? I would say that it's somewhat important for future compatibility.

Well, yes, in that regard SM4 will be a more important feature than SM3 was two years ago. But then the questions remain: how many games, if any, will utilize DX10 before the next refresh cycle of those cards is released, how much visual improvement will the feature bring, and how well will games run with such features enabled?
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
SM3 has nothing to do with HDR support.

All of the GeForce 6 and 7 cards support full SM3. The Radeon X8 series and lower support SM2.0b, which is basically an ATI-made middle ground.

HDR, on the other hand, is completely different. You don't need SM3 for HDR. It also depends on which implementation you are talking about:
  1. OpenEXR uses FP (floating-point) precision. Right now Nvidia uses 16-bit FP (IIRC) and ATI actually uses something else (which escapes me right now) that allows them to use AA along with HDR. (See the sketch after this list.)
  2. Half-Life 2 uses an implementation that is lower precision, but all the calculations are done in the pixel shaders. The advantage is that you don't need support for floating point; however, you sacrifice some IQ.
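For the curious, here's a minimal sketch in C++ against the Direct3D 9 API (my own illustration, not code from any shipping game) of how an engine might probe for the FP16 render-target support the OpenEXR-style path needs, and fall back to an integer target for an HL2-style path when it's absent:

#include <d3d9.h>   // link against d3d9.lib
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask whether a 64-bit FP16 RGBA surface can be used as a render
    // target on a standard 32-bit desktop mode.
    HRESULT fp16rt = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);

    if (SUCCEEDED(fp16rt))
        printf("FP16 render target available: OpenEXR-style HDR is possible\n");
    else
        printf("No FP16 render target: fall back to an integer (HL2-style) path\n");

    d3d->Release();
    return 0;
}

The same CheckDeviceFormat call with different usage flags covers the filtering and blending queries that come up later in this thread.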

As for DX10: Microsoft claims they recompiled all the DX9 code and optimized it significantly. So while DX10 will run fine, DX9 will actually run inherently better on cards and software that use DX10.

-Kevin

(Oh, and akshayt, you have no idea what you are talking about. First off, what is "FPAA"? The 16 you are thinking of is Nvidia's new AA detail level (16x). FP has nothing to do with AA.)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: ShadowOfMyself
Originally posted by: Dethfrumbelo
The only way to be 'future proof' is to literally stop time.

I think the most future-proof card ever was the NV30... 4 years have gone by and people still mention it at every new release :laugh:

LOL. Good one... :laugh:
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: Gamingphreek
OpenEXR uses FP (floating-point) precision. Right now Nvidia uses 16-bit FP (IIRC) and ATI actually uses something else (which escapes me right now) that allows them to use AA along with HDR.
Half-Life 2 uses an implementation that is lower precision, but all the calculations are done in the pixel shaders. The advantage is that you don't need support for floating point; however, you sacrifice some IQ.

The ATI X1000 series and above use FP for OpenEXR. The HDR implementation is not native to a specific brand of video card (i.e., Nvidia and ATI run the same precision HDR in, say, Far Cry, but HDR in Far Cry is not the same as HDR in HL2); it's native to a specific game, like the Half-Life 2 example you mentioned.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: Gamingphreek
Right now Nvidia uses 16-bit FP (IIRC) and ATI actually uses something else (which escapes me right now) that allows them to use AA along with HDR.
No, ATi uses FP16 too, but their hardware allows AA to be applied to a floating-point framebuffer.
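To make that concrete, here's a minimal sketch (C++, Direct3D 9, my own illustration) of the query an engine issues to find out whether MSAA can be applied to an FP16 framebuffer; GeForce 6/7 hardware fails this check, while the Radeon X1K parts pass it:

#include <d3d9.h>   // link against d3d9.lib
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    // Can 4x multisampling be applied to an FP16 surface in fullscreen mode?
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,       // the floating-point framebuffer format
        FALSE,                      // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,   // 4x MSAA
        &quality);

    if (SUCCEEDED(hr))
        printf("HDR + AA: 4x MSAA on FP16 works (%lu quality levels)\n", quality);
    else
        printf("HDR + AA: no MSAA on FP16 surfaces\n");

    d3d->Release();
    return 0;
}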
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Was it the ATI X800 series that only supported FX10/FX12/FX16 HDR? Or am I thinking of the only HDR modes that NVIDIA can support with AA?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The r300 and r400 series support only integer-format blending, so any of the FX## HDR formats will work, but they don't look nearly as good as FP16. In addition, the r300 and its derivatives also support floating-point textures and render buffers, so the only thing missing to implement FP16 HDR is FP blending.
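Here's a minimal sketch (C++, Direct3D 9, illustration only) of how an application queries the exact capability being described: D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING asks whether the hardware can blend into a given format, and for D3DFMT_A16B16G16R16F it's the check the r300/r400 parts fail:

#include <d3d9.h>   // link against d3d9.lib
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // "Can the hardware alpha-blend into an FP16 texture?" -- the one
    // missing piece for FP16 HDR on r300/r400 class hardware.
    HRESULT blendFp16 = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    printf("FP16 blending: %s\n", SUCCEEDED(blendFp16) ? "yes" : "no");

    d3d->Release();
    return 0;
}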
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: munky
Originally posted by: akshayt
AFAIK Geforce 7900 already supports SM 3.0 *properly* except that you can't use HDR + AA together. Anyway, it will support 16FPAA + HDR instead of just 12FPAA.

Except that it falls flat on its face when running shaders with dynamic branching - which was the big feature of SM3 in the first place. But that's all history now, cuz I hear the g80 will support SM4 (lol), so get ready for the onslaught of posts claiming SM4 support will make the card future proof.


R600 won't support SM4? Will it support SM3.0b then? ;)