Originally posted by: OmegaRedd
Not surprised at the slow DX9 performance in HL2. Carmack claims any game using DX9 as the min. spec should also run slow! See? Oh, it's true :brokenheart::frown:
Originally posted by: Finnkc
I said this already in another thread:
"I am just gonna wait for Nvidia to release a pure DX9 card ... no biggy really."
Everyone is going nuts over a game that is running on a platform that is brand spanking new. Woooo, ATI is better... and if my judgment serves me right, that will last a few months at most. Given the rate these new cards and chips are being released at, I expect a new Nvidia FX version in about a month, max, that will match or even surpass ATI's mark.
Originally posted by: Goi
I guess Nvidia either just didn't have the resources to make a full-fledged DX9 part, or didn't think it necessary to do so. However, that didn't stop them from proclaiming full DX9 support, even though its DX9 performance, and PS 2.0 shaders in particular, isn't fast. This marketing move was meant to entice more people to buy it on the pretext that it was fully DX9-capable and "future-proof". By the time the public finds out about its shortcomings (which is now), it's too late, since many will have already bought it. While it leaves a slightly bitter aftertaste on many consumers' tongues, they're banking on their next release to sweeten things up and win back their reputation, and I have little doubt that it will. Business as usual...
Originally posted by: Acanthus
Originally posted by: OmegaRedd
Not surprised at the slow DX9 performance in HL2. Carmack claims any game using DX9 as the min. spec should also run slow! See? Oh, it's true :brokenheart::frown:
Exactly.
Carmack is saying that if you build a card that only meets the MINIMUM DX9 spec, it's not going to run DX9 very well. NVIDIA did exactly that: the FX series simply supported DX9 but flew on DX8. The NV40 will be the first true DX9 card, imho.
But that's all speculation.
What ATi did was design a powerful shader platform the first time around, one that had no trouble running DX9 on the default codepath (ARB2?).
Originally posted by: Alkali
Are you the author of that article on the linked page, OmegaRedd?
Originally posted by: NFS4
LOLOLOL!
http://www.nvnews.net/vbulletin/showthread.php?s=685b970858a77b05398a687a6fe38f8e&threadid=17823
Originally posted by: Budman
Originally posted by: NFS4
LOLOLOL!
http://www.nvnews.net/vbulletin/showthread.php?s=685b970858a77b05398a687a6fe38f8e&threadid=17823
haha, that thread's funny. I particularly liked that picture.
http://www.iinet.net.au/~jetha/misc/5900.jpg
Originally posted by: NFS4
Originally posted by: Alkali
Are you the author of that article on the linked page, OmegaRedd?
No, he is the resident NVIDIA master #2, behind Nebor.
Originally posted by: Nebor
Originally posted by: NFS4
Originally posted by: Alkali
Are you the author of that article on the linked page, OmegaRedd?
No, he is the resident NVIDIA master #2, behind Nebor.
I have absolutely nothing to do with OmegaRedd. Notice that I speak English...
Uh, wtf... Wasn't Carmack the one bitching for 128-bit color and other crazy stuff? Or is this just because Doom III is designed around GF4-style hardware? I can never keep track of this guy. The precision doesn't really matter to Doom...