
ATI to beat nVidia for Pixel Shader 3.0 support?

draggoon01

Senior member
http://babelfish.altavista.com/babelfish/urltrurl?lp=ja_en&url=http://pc.watch.impress.co.jp/docs/2003/1003/kaigai029.htm


...Because of that, there is a strong possibility that ATI's R420/423, due next spring, will be the first Shader 3.0 chip.

Assuming the NV40 does not support Shader 3.0 (its development started earlier), there is a strong chance it will not be ready in time...

wow, it looks like things could actually get worse for nvidia. it's like sitting in a stalled car on train tracks. the problem this generation was poor performance, but how bad will it be when there's no support at all? maybe if nothing takes advantage of 3.0 it won't matter, but HL2 came out of nowhere and is probably the biggest factor changing perception of nvidia.

EDIT:
when's longhorn coming out?
 
Are we already getting DX10? Or is this DX9.5? DX9 is barely starting to be used, in most cases not even fully.

-Por
 
agrees with GT,

The biggest titles are just coming out (HL2 & D3). I don't think there will be another big release for a while that would take advantage of 3.0. It would be a better idea for ATI to release a much faster 2.0 shader card than to take that step. Unless some serious games come out that take advantage of it, I don't see 3.0 cards going anywhere for now!
 
Originally posted by: Goose77
agrees with GT,

The biggest titles are just coming out (HL2 & D3). I don't think there will be another big release for a while that would take advantage of 3.0. It would be a better idea for ATI to release a much faster 2.0 shader card than to take that step. Unless some serious games come out that take advantage of it, I don't see 3.0 cards going anywhere for now!

but a 3.0 card could kick a 2.0 card's arse
 
Originally posted by: cmdrdredd


but a 3.0 card could kick a 2.0 card's arse

Ok. This is getting silly! Given that there IS no hardware for shader 3.0, how in the WORLD do you think you can say this??? Hey, it might look prettier, but it could also tank any frame rates. That would be POR...
 
Originally posted by: Snooper
Originally posted by: cmdrdredd


but a 3.0 card could kick a 2.0 card's arse

Ok. This is getting silly! Given that there IS no hardware for shader 3.0, how in the WORLD do you think you can say this??? Hey, it might look prettier, but it could also tank any frame rates. That would be POR...

uh...I'm talking about the Shader 3.0 card running 2.0 code faster. Not 3.0 code running faster than 2.0 code
 
It's warranted speculation IMO. Nvidia needs to learn their lesson that Cg is not good for the graphics industry as a whole. If you look at things historically, ATi has been a big supporter of DirectX whereas Nvidia has always been fastest in OpenGL (not "always", but for the past few years). Will this come into play with future generations? Quite possibly. Add to this the fact that playing catch-up in the graphics industry is nearly impossible...again, speculation... 🙂
 
Originally posted by: SickBeast
It's warranted speculation IMO. Nvidia needs to learn their lesson that Cg is not good for the graphics industry as a whole. If you look at things historically, ATi has been a big supporter of DirectX whereas Nvidia has always been fastest in OpenGL (not "always", but for the past few years). Will this come into play with future generations? Quite possibly. Add to this the fact that playing catch-up in the graphics industry is nearly impossible...again, speculation... 🙂

Actually, back in the day, when Glide still had support, nVidia mentioned that D3D was their "native" API.
Of course, they had bar none the best OpenGL drivers among consumer-level cards as well, but seeing as 3Dfx had no real OpenGL drivers and ATi was barely in the 3D market at all, this says more about nVidia's competition than about nVidia.
 
The real question is who gives a flying fvck if one of them has it before the other does? It'll still take developers 9 to 18 months to start using it. Does it really matter if you can run the Dawn rendering demo 50 frames faster than your friend down the street?

Thorin
 
Originally posted by: thorin
The real question is who gives a flying fvck if one of them has it before the other does? It'll still take developers 9 to 18 months to start using it. Does it really matter if you can run the Dawn rendering demo 50 frames faster than your friend down the street?

Thorin

But she's SOOO hot!!! 😛
 