G80/84 already support DX10.1 and Shader model 4.1?

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
"Nvidia said its G80 and G84 chips are already prepared for DirectX 10.1 and Shader Model 4.1; when Microsoft releases DirectX 10.1 and Shader Model 4.1, Nvidia's drivers will provide support, and the problem will be resolved smoothly."

Quote from article was translated by Google Language Tools.

Linkage
 

JustaGeek

Platinum Member
Jan 27, 2007
2,827
0
71
Originally posted by: dclive
Originally posted by: JustaGeek
Fearmongering....?

From what I've read here at AT, the 8000 series card will not be able to support 10.1.

But please, correct me if I'm wrong.

In other words - if I buy the 8800GTS or GTX today, will this card support the new code paths associated with DirectX 10.1?

No, the 8800 series cards don't support 10.1 new additions.

Did your ATI X800 XT PE become useless when DirectX 9.0c shipped? No, of course not. Those apps that used DirectX 9.0c and SM3.0 had another code path for the cards (the vast, vast, vast majority) that could not do SM3 in hardware. Problem solved.

You're being silly. And yes, that's fearmongering. You're acting as if the cards are useless (your words!) and that's completely and totally incorrect.

Based on the above statement by dclive, they don't.

It is a part of the thread discussing the use of >4GB of RAM...

http://forums.anandtech.com/me...AR_FORUMVIEWTMP=Linear
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Nvidia's drivers will provide support, and the problem will be resolved smoothly
Heh, I love how they use the words "nVidia drivers" and "smoothly" in the same sentence. :p
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Yeah, the context of the translation should not be taken literally.

@ Justageek: Who is dclive, and why should we put credence into his post there? Is he an nvidia or MS rep or something?
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: keysplayr2003
Yeah, the context of the translation should not be taken literally.

@ Justageek: Who is dclive, and why should we put credence into his post there? Is he an nvidia or MS rep or something?

"There" is "here" (it's an AT thread he has linked), and dclive is one of our fellow AT Elites. He is simply replying in that thread to the ridiculous assertion that somehow DX10-class hardware will become obsolete with MS's minor 10.1 update.

I don't believe any current hardware fully supports all DX10.1 features, despite what the poorly translated HKEPC article "may" clumsily be trying to state. I also don't believe it makes any difference.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I knew it was another AT thread. "There" meant "other thread". Anyway, I was just curious about it. What would make them state that G8x supports DX10.1 and Shader Model 4.1? Where do they get this stuff from? It would be very kewl if there were a new law of physics and ethics by which anyone typing a statement grabbed out of their butt would be non-lethally shocked by their computer. hehe.
That would be so "nifty", and would severely cut down on the net FUD.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: keysplayr2003
I knew it was another AT thread. "There" meant "other thread". Anyway, I was just curious about it. What would make them state that G8x supports DX10.1 and Shader Model 4.1? Where do they get this stuff from? It would be very kewl if there were a new law of physics and ethics by which anyone typing a statement grabbed out of their butt would be non-lethally shocked by their computer. hehe.
That would be so "nifty", and would severely cut down on the net FUD.

I'm guessing they are simply misconstruing DX10.1's backward compatibility with DX10 hardware. DX10 hardware will simply run the DX10 path if DX10.1 hardware features are implemented in the application, just like with other DX revisions.

DX10.1 is simply there to support future hardware; developers likely won't be implementing any of the specific 10.1 features for a couple of years. DX10 being exclusive to Vista just provided plenty of FUD fertilizer, is all.
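The fallback rbV5 describes - an application shipping multiple code paths and the hardware getting the highest one it can actually run - can be sketched like this. This is a hypothetical illustration in Python, not any real Direct3D API; the names `FeatureLevel` and `choose_render_path` are made up for the example:

```python
from enum import IntEnum

class FeatureLevel(IntEnum):
    """Illustrative ladder of hardware capability levels."""
    DX9_SM2 = 0
    DX9_SM3 = 1
    DX10_0 = 2
    DX10_1 = 3

def choose_render_path(hardware_level, app_paths):
    """Pick the highest code path the app ships that the hardware supports."""
    supported = [p for p in app_paths if p <= hardware_level]
    if not supported:
        raise RuntimeError("No compatible render path for this hardware")
    return max(supported)

# A DX10.1-aware app still runs on DX10.0 hardware; it just takes the 10.0 path,
# which is why a 10.1-only title would be the exception, not the rule.
app = [FeatureLevel.DX9_SM3, FeatureLevel.DX10_0, FeatureLevel.DX10_1]
print(choose_render_path(FeatureLevel.DX10_0, app).name)  # DX10_0
print(choose_render_path(FeatureLevel.DX10_1, app).name)  # DX10_1
```

So a G80 owner only loses out if a developer ships a 10.1-exclusive path with no 10.0 fallback, which for the reasons above would make little commercial sense for years.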


 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: rbV5
Originally posted by: keysplayr2003
I knew it was another AT thread. "There" meant "other thread". Anyway, I was just curious about it. What would make them state that G8x supports DX10.1 and Shader Model 4.1? Where do they get this stuff from? It would be very kewl if there were a new law of physics and ethics by which anyone typing a statement grabbed out of their butt would be non-lethally shocked by their computer. hehe.
That would be so "nifty", and would severely cut down on the net FUD.

I'm guessing they are simply misconstruing DX10.1's backward compatibility with DX10 hardware. DX10 hardware will simply run the DX10 path if DX10.1 hardware features are implemented in the application, just like with other DX revisions.

DX10.1 is simply there to support future hardware; developers likely won't be implementing any of the specific 10.1 features for a couple of years. DX10 being exclusive to Vista just provided plenty of FUD fertilizer, is all.

So if the only additions are a requirement for 4x AA and 32-bit floating point... what about that is impossible for current cards to do? Is there something I missed?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I'm pretty sure that the 2900XT from ATI has more features than the 8800 cards from nVidia. They have some sort of 'tessellator' or something that may be part of DX10.1.

I'm 99% positive that all current cards are *not* DX10.1 compatible. The 2900XT is probably halfway there. They need a higher precision when it comes to color depth IIRC, among other things.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: SickBeast
I'm pretty sure that the 2900XT from ATI has more features than the 8800 cards from nVidia. They have some sort of 'tessellator' or something that may be part of DX10.1.

I'm 99% positive that all current cards are *not* DX10.1 compatible. The 2900XT is probably halfway there. They need a higher precision when it comes to color depth IIRC, among other things.

Actually, the Radeon HD 2900XT has the most complete DX10 implementation to date; pity that it doesn't have the performance of the 8800 Ultra. "Maybe" in DX10 titles in the future. But as a workstation 3D rendering card, even the Radeon HD 2600XT can outperform the 8800-based Quadros in that discipline; the power under the hood of the R6x0 series is quite weird.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: evolucion8
Originally posted by: SickBeast
I'm pretty sure that the 2900XT from ATI has more features than the 8800 cards from nVidia. They have some sort of 'tessellator' or something that may be part of DX10.1.

I'm 99% positive that all current cards are *not* DX10.1 compatible. The 2900XT is probably halfway there. They need a higher precision when it comes to color depth IIRC, among other things.

Actually, the Radeon HD 2900XT has the most complete DX10 implementation to date; pity that it doesn't have the performance of the 8800 Ultra. "Maybe" in DX10 titles in the future. But as a workstation 3D rendering card, even the Radeon HD 2600XT can outperform the 8800-based Quadros in that discipline; the power under the hood of the R6x0 series is quite weird.

With a few driver tweaks it may be possible for the HD2900XT to outperform the 8800 series in DX10 games. Not the DX10 junk we have now, but true DX10 software like Crysis.

It's unknown, but what you mentioned is a possibility. I often wonder if Nvidia built the 8800 series with DX9 in mind, knowing that DX10 won't be a factor until they release their next card. And did ATI go the other route, with lackluster DX9 performance because the card was built for DX10 games yet to be released? *shrug*

P.S. BioShock, Call of Juarez, etc. are not what I would consider DX10 titles, not yet :D
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
From what I've read, the R6x0 superscalar architecture has to be at peak efficiency to really shine, which I would assume is what AMD is trying to achieve with driver updates. The latest Cat 7.8s have increased temps on some people's setups by as much as 5 degrees or more, so I guess you could conclude the card is doing more *work*, hence the temp increase.

It would be cool if my GTX supported SM4.1 + DX10.1... I'd reinstall it for that, but alas, I'm afraid that is highly unlikely.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Sylvanas
From what I've read, the R6x0 superscalar architecture has to be at peak efficiency to really shine, which I would assume is what AMD is trying to achieve with driver updates. The latest Cat 7.8s have increased temps on some people's setups by as much as 5 degrees or more, so I guess you could conclude the card is doing more *work*, hence the temp increase.

It would be cool if my GTX supported SM4.1 + DX10.1... I'd reinstall it for that, but alas, I'm afraid that is highly unlikely.

I can't wait to see what type of performance Crysis gets. That's gonna be the determining factor.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I imagine it's not too important, as current cards will likely be a distant memory by the time we have games that really use even DX10, let alone the .1 or whatever.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: cmdrdredd
Originally posted by: rbV5
Originally posted by: keysplayr2003
I knew it was another AT thread. "There" meant "other thread". Anyway, I was just curious about it. What would make them state that G8x supports DX10.1 and Shader Model 4.1? Where do they get this stuff from? It would be very kewl if there were a new law of physics and ethics by which anyone typing a statement grabbed out of their butt would be non-lethally shocked by their computer. hehe.
That would be so "nifty", and would severely cut down on the net FUD.

I'm guessing they are simply misconstruing DX10.1's backward compatibility with DX10 hardware. DX10 hardware will simply run the DX10 path if DX10.1 hardware features are implemented in the application, just like with other DX revisions.

DX10.1 is simply there to support future hardware; developers likely won't be implementing any of the specific 10.1 features for a couple of years. DX10 being exclusive to Vista just provided plenty of FUD fertilizer, is all.

So if the only additions are a requirement for 4x AA and 32-bit floating point... what about that is impossible for current cards to do? Is there something I missed?
There are some specific non-DX things in "10.1". They relate to threading: Microsoft is requiring GPU builders to make their cards capable of finer-grained threading for better multitasking use of the GPU.