Originally posted by: DAPUNISHER
These 51.75s are total crap. I tend to agree with the conspiracy theorists who assert it was done for the sole purpose of jacking up scores in this benchmark. I have to say, though, it is a clever gambit and I enjoy a good grift :evil:
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?
Of course you won't have problems if the only games you play are Free Cell and Teletubbies Adventure Storybook.
Originally posted by: sandorski
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?
I'd be quite pleased, no crappy drivers here!!
Exactly... I've seen the trailers and all I can say is drool...
Originally posted by: 1ManArmY
Valve doesn't need ATI to sell HL2; the product speaks for itself and will sell accordingly.
Um... pot, kettle, black...
Originally posted by: nemesismk2
You can tell when you're on Anandtech because it's got the highest number of moaners and whingers on the net. Nvidia doesn't owe you anything at all because they have given you no personal guarantee that products based on their GPUs are faster or better than ATI's. Seriously, some of you people really need to get out more into the real world!!
You can tell when you're on Anandtech because it's got the highest number of moaners and whingers on the net.
Originally posted by: DefRef
Of course you won't have problems if the only games you play are Free Cell and Teletubbies Adventure Storybook.
Originally posted by: sandorski
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?
I'd be quite pleased, no crappy drivers here!!
Serious point: In all the hoohaw about full-precision DX9 calculations, why is everyone ignoring the fact that ATI is calculating 24 bits while Nvidia uses 32 bits? Isn't the higher bit count responsible for slower performance on its own, and couldn't it be said that ATI is sacrificing precision for speed? (Not that the drooling Fanboys who pop a chubbie at the sound of the number "9800" would admit it.) If ATI converted all textures to 16-bit and ran faster, the Fanboys would hoot and holler and proclaim that this was proof that Nvidia hardware was inferior, right?
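For reference on the precision point above, here is a minimal sketch (an editor's illustration, not from any poster) of what the mantissa widths usually quoted for these shader formats imply, assuming FP16 carries 10 mantissa bits (partial precision), ATI's FP24 carries 16, and Nvidia's FP32 carries 23.

```cpp
// Rough comparison of the pixel-shader float formats argued about above.
// Assumed mantissa widths: FP16 = 10 bits, FP24 = 16 bits, FP32 = 23 bits.
// Relative precision is roughly 2^-mantissa_bits.
#include <cmath>
#include <cstdio>

int main() {
    struct Format { const char* name; int mantissa_bits; };
    const Format formats[] = {
        {"FP16 (partial precision)",     10},
        {"FP24 (ATI full precision)",    16},
        {"FP32 (Nvidia full precision)", 23},
    };
    for (const Format& f : formats) {
        // Smallest relative step the format can represent (epsilon-style bound).
        double eps = std::ldexp(1.0, -f.mantissa_bits);
        std::printf("%-30s ~%.1e relative precision\n", f.name, eps);
    }
    return 0;
}
```

Under those assumptions FP32 carries roughly seven more mantissa bits than FP24, which is the extra per-pixel work the speed-versus-precision argument is about.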
5800ultra leafblower
Originally posted by: Jeff7181
Originally posted by: DAPUNISHER
These 51.75s are total crap. I tend to agree with the conspiracy theorists who assert it was done for the sole purpose of jacking up scores in this benchmark. I have to say, though, it is a clever gambit and I enjoy a good grift :evil:
Which nVidia card did you try them on?
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers that the 8-million-dollar check ATI wrote him bought.
This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old bison try when they optimized for NV3x?!
John Carmack appears to be doing it just fine and getting good results from the NV3x cards. So what is wrong with Valve? I can't honestly believe, after all this ruckus, that they really tried to get Nvidia cards up and running to their potential.
Originally posted by: Jeff7181
This has gone off topic... bottom line, the ENTIRE GeForce FX line of video cards supports DX9. Supporting DX9 has nothing to do with how well the hardware runs DX9 software. Now if nVidia were advertising their FX5200 with Half-Life 2 and using their "the way it's meant to be played" logo... I would have a problem with that. But taking the DX9 label off the low-end FX cards is stupid... they DO support DX9... there are no ifs, ands, buts, or maybes about it... it's a fact... it's not up for debate, end of story.
Originally posted by: Jeff7181
You don't understand... it has everything that's required to be called DX9 compatible. I just told you I would have a problem with it if they sold the FX5200 with Half-Life 2 sporting the "the way it's meant to be played" logo. I'm not arguing that the FX5200 is too slow to play HL2. But that's not what "DX9 compatibility" means. Why can't you understand that?
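To make the "supports DX9" distinction concrete, here is a minimal sketch (an editor's illustration, assuming the standard Direct3D 9 headers and d3d9.lib) of the kind of capability check a game can perform. A card that reports the Shader Model 2.0 caps is DX9-capable in this sense regardless of how slowly it runs DX9 shaders.

```cpp
// Minimal Direct3D 9 capability check: does the primary HAL device expose
// Shader Model 2.0, the DX9 shader baseline? (Says nothing about performance.)
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 runtime not available\n");
        return 1;
    }

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        bool sm2 = caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0) &&
                   caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
        std::printf("Shader Model 2.0 exposed: %s\n", sm2 ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}
```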
Originally posted by: spam
Of course Jeff is right; I am not contesting the point of DX9 compatibility! What I am saying is that I do not think it is in Nvidia's interest to call it DX9 compatible. At least specify that the 5600 and the 5200 are not capable of running DX9 titles in DX9 mode with shaders. Otherwise it will continue to erode consumer confidence.
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.
