lordtyranus
Banned
Which titles are those? Certainly not Farcry and HL2.

I can think of a small handful, and even then almost half of them are as fast, or quite close, on nV3x hardware as they are on R3x0. The shader hype so far has been just that.
Originally posted by: TheSnowman
Originally posted by: BenSkywalker
Even quite a bit worse than the R3x0 parts, which is saying a lot (the R3x0 parts are unusable with any real shader load already; I can almost hit 10FPS with my R9800Pro 😛 ). It goes from completely and utterly useless with heavy shader loads (the R3x0) to completely, utterly, and totally useless with heavy shader loads (the NV3X).
Omg, an old part with lower clockspeeds and half the pipelines of newer chips can't keep up in a benchmark made to stress the latest graphics cards? What the hell was Ati trying to pull!?! Then there is my XT-PE that can't even break 30fps in any of the tests; I suppose it is trash too, eh? :roll:
Originally posted by: Marsumane
Actually, the reason is different drivers. ATI released drivers specifically to take advantage of 3DMark and raised their top cards' scores by around 15-20%. I suspect it may have something to do with Catalyst AI (I think it was in those newer drivers) and its general optimizations (NOT app-specific, because they said they wouldn't do that for 3DMark), and maybe a few others. ATI is hurting for a good win and needed people to think that the NV4x architecture isn't as good in some situations (like NV did with Doom 3). So basically I call it a tie and say it's a driver war, and who cares, because it's just 3DMark; buy the cheaper card.
Is that how it is? Could you please explain why ATI's PCI-Express cards don't need the new drivers to get their high scores, then?
Originally posted by: Gamingphreek
Will drivers do anything? I mean, people are paying like $130 for 9600XTs and I paid $198 for my 5900XT, and it's getting the crap beat out of it. Is there anything that can happen BESIDES me upgrading? I was going to hold out until like Q3 of next year, but I don't know if I'll be able to.
-Kevin
Originally posted by: BenSkywalker
Which titles are those? Certainly not Farcry and HL2.
I'll even spot you TombRaider: AoD to go along with FarCry. Which other released game are you talking about? I've been listening to the shader hype from people for two years now; quite telling when the staggering list of released titles is so huge. A massive shader revolution, to be sure.
Oh yeah, Halo and DooM3.
Originally posted by: FuFighterStan
Don't forget certain leaked Nvidia drivers provide quite the hefty boost over the "official" ones on 3dmark05 scores as well
Originally posted by: Rage187
Originally posted by: FuFighterStan
Don't forget certain leaked Nvidia drivers provide quite the hefty boost over the "official" ones on 3dmark05 scores as well
Those arent leaked, those are approved by Futuremark to use w/ 3dmark05.
Originally posted by: FuFighterStan
Don't forget certain leaked Nvidia drivers provide quite the hefty boost over the "official" ones on 3dmark05 scores as well
Originally posted by: Rage187
....
Colin McRae 2004 makes pretty extensive use of shaders.
Ben, Hardware.fr's X700XT review shows the 5900XT in a less-than-flattering light with their games selection. The 5900XT averages 70% of the performance of a 9800P across 1024x768 4xAA/8xAF, 1600x1200 noAA/noAF, and 1600x1200 4xAA/8xAF.
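That 70% figure is just a mean of per-setting performance ratios. A minimal sketch of the arithmetic, using made-up fps numbers (not Hardware.fr's actual data), just to show how such an average is formed:

```python
# Hypothetical frame rates (fps) for illustration only; NOT Hardware.fr's
# measured numbers, just values that land near a 70% average.
settings = ["10x7 4xAA/8xAF", "16x12 noAA/noAF", "16x12 4xAA/8xAF"]
fx5900xt = [45.0, 50.0, 28.0]   # hypothetical GeForce FX 5900XT fps
r9800pro = [62.0, 71.0, 42.0]   # hypothetical Radeon 9800 Pro fps

# Per-setting ratio of 5900XT to 9800Pro, then a plain arithmetic mean.
ratios = [a / b for a, b in zip(fx5900xt, r9800pro)]
average = sum(ratios) / len(ratios)

for name, r in zip(settings, ratios):
    print(f"{name}: {r:.0%}")
print(f"average relative performance: {average:.0%}")  # prints "average relative performance: 70%"
```

Note that averaging the three per-setting ratios weights each setting equally, which is the usual convention in review roundups of this era.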
I suspect just about every game being released these days has a "fair" amount of shaders.
The much stronger DX9 shader engine on the R300 cards likely helps it run better in heavier DX8 shader situations too.
The DX8 path looks pretty good in HL2. On my 9600XT, depending on how the game runs, I might run it in DX8 and use 8xAF, rather than DX9 and no AF. Same for Farcry: what's the point of having nice shader-rendered water when it gets blurry after 10 ft because of no AF?
If by "fair amount" you mean minuscule, then perhaps. You can count the number of shaders used in 99% of games being released right now on your fingers, and this is two years after all of the PR BS stating how huge they were going to be. It hasn't happened yet, and it won't for some time.
NV cards have a separate shader path for Halo, and still aren't rendering the PS2.0 effects in that game.
You can't compare the shader load between ATI and NV cards in Halo.
You're just skirting the issue.
You don't think a GPU with a much stronger shader engine is going to be better equipped for the new games coming out in the next year? Give me a break.
"In this part of the game something else jumps out. The character's invisibility effect seems to work correctly on the 9600XT; for anyone who played Halo on the Xbox, the effect looks the same. But in the PC version the effect requires PS2.0, and on the FX5700U the effect does not work, even if you force the game to use PS2.0."
Don't kid yourself. There were more reasons than DX9 to buy a 9xxx card over a 5xxx. Your "all cards suck at shaders" comments are rather foolish, considering the XT-PE is the best we have at running them. When will cards be good enough for you to run shaders? 2008?

I could pull this same quote from a year ago, or two years ago for that matter. You and the other PR followers have been saying the same sh!t for two years now, and I'll ask you the same question all of them have refused to answer: where are all these games? Two fvcking years. It got old a long time ago; now it's pathetic.
If 5xxx owners did not already regret their decision beforehand, they would certainly have regretted it this January when the Farcry demo was released. That was 6 months before the NV40/R420.

Remember your nV naming. The 5900XT was only 66% of the price of the R9800Pro. Besides that, I'm talking about how useless DX9 shader hardware has been despite the hype.