Take DX9 labeling off of FX chips?


Rogodin2

Banned
Jul 2, 2003
3,219
0
0
ATI offers a PCI box that has a TV tuner with VIVO; it's $80.

I'd sell your 5600, buy a 9600 Pro and a TV Wonder.

rogo
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,116
32,691
146
These 51.75s are total crap; I tend to agree with the conspiracy theorists who assert the release was done for the sole purpose of jacking up scores in this benchmark. I have to say, though, it is a clever gambit and I enjoy a good grift :evil:
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: DAPUNISHER
These 51.75s are total crap; I tend to agree with the conspiracy theorists who assert the release was done for the sole purpose of jacking up scores in this benchmark. I have to say, though, it is a clever gambit and I enjoy a good grift :evil:

Which nVidia card did you try them on?
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
You can tell when you're on Anandtech because it has the highest number of moaners and whingers on the net. Nvidia doesn't owe you anything at all, because they have given you no personal guarantee that products based on their GPUs are faster or better than ATI's. Seriously, some of you people really need to get out more into the real world!
:rolleyes:
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?

 

sandorski

No Lifer
Oct 10, 1999
70,803
6,360
126
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?

I'd be quite pleased, no crappy drivers here! ;)
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Originally posted by: sandorski
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?

I'd be quite pleased, no crappy drivers here! ;)
Of course you won't have problems if the only games you play are Free Cell and Teletubbies Adventure Storybook.

Serious point: In all the hoohaw about full-precision DX9 calculations, why is everyone ignoring the fact that ATI is calculating at 24 bits while Nvidia uses 32? Isn't the higher bit count responsible for slower performance on its own, and couldn't it be said that ATI is sacrificing precision for speed? (Not that the drooling Fanboys who pop a chubbie at the sound of the number "9800" would admit it.) If ATI converted all textures to 16-bit and ran faster, the Fanboys would hoot and holler and proclaim that this was proof that Nvidia hardware was inferior, right?

 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
Originally posted by: 1ManArmY
Valve doesn't need ATI to sell HL2; the product speaks for itself and will sell accordingly.
Exactly... I've seen the trailers and all I can say is drool...

 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
Originally posted by: nemesismk2
You can tell when you're on Anandtech because it has the highest number of moaners and whingers on the net. Nvidia doesn't owe you anything at all, because they have given you no personal guarantee that products based on their GPUs are faster or better than ATI's. Seriously, some of you people really need to get out more into the real world!
:rolleyes:
Um... pot, kettle, black...
:rolleyes:
 

spam

Member
Jul 3, 2003
141
0
0
Hey nemesismk2,

You can tell when you're on Anandtech because it has the highest number of moaners and whingers on the net.

Would you stop your whining!


:D It is really ridiculous for us to have to listen to a whiner whining about all us whiners! The point of the thread is: how does Nvidia regain consumer trust and confidence? Nvidia has shot itself in the foot, and the sad thing is that it seems to be continuing to shoot itself in the foot. Nvidia keeps hurting itself; this is not ATI's fault, Nvidia is doing it to itself and to its customers!
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: DefRef
Originally posted by: sandorski
Originally posted by: nemesismk2
Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

Do you think ATI would gain customer trust if they put a sticker on their products saying "not guaranteed to run your software correctly due to our crappy drivers!"?

I'd be quite pleased, no crappy drivers here! ;)
Of course you won't have problems if the only games you play are Free Cell and Teletubbies Adventure Storybook.

Serious point: In all the hoohaw about full-precision DX9 calculations, why is everyone ignoring the fact that ATI is calculating at 24 bits while Nvidia uses 32? Isn't the higher bit count responsible for slower performance on its own, and couldn't it be said that ATI is sacrificing precision for speed? (Not that the drooling Fanboys who pop a chubbie at the sound of the number "9800" would admit it.) If ATI converted all textures to 16-bit and ran faster, the Fanboys would hoot and holler and proclaim that this was proof that Nvidia hardware was inferior, right?

You're wrong. The DX9 spec calls for 24-bit precision, and Nvidia doesn't support it. Moreover, Half-Life 2 uses 16-bit precision on Nvidia hardware because 32-bit performs even worse, so the ATI hardware is actually running at higher precision, and the same happens in Doom III. Just have a look at any of the HL2 performance reviews and see for yourself; they explain it better than I do ;)
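
For rough context, here's a back-of-the-envelope sketch of what those precision figures mean, assuming the commonly quoted formats (Nvidia's fp16 as s10e5, ATI's fp24 as s16e7, fp32 as standard IEEE single, s23e8); the relative rounding error is set by the mantissa width:

fp16: 2^-10 ≈ 0.001 (roughly 3 decimal digits)
fp24: 2^-16 ≈ 0.000015 (roughly 5 decimal digits)
fp32: 2^-23 ≈ 0.00000012 (roughly 7 decimal digits)

So fp24 sits between the two, and as I understand it the DX9 ps_2_0 baseline treats 24-bit as "full" precision, with fp16 allowed only where the shader uses the partial-precision hint.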
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,116
32,691
146
Originally posted by: Jeff7181
Originally posted by: DAPUNISHER
These 51.75s are total crap; I tend to agree with the conspiracy theorists who assert the release was done for the sole purpose of jacking up scores in this benchmark. I have to say, though, it is a clever gambit and I enjoy a good grift :evil:

Which nVidia card did you try them on?
5800 Ultra leafblower
 

Johnbear007

Diamond Member
Jul 1, 2002
4,570
0
0
Originally posted by: Genx87
The only thing Gabe Newell has had enough of is the cheeseburgers that the 8-million-dollar check ATI wrote out to him bought.

This is kind of sad if Valve really did put their game up for auction. And if they did, and ATI paid 8 million for it plus something like 1000 copies, do you honestly think Valve gave it the old bison try when they optimized for the NV3.x?!?!?!?!?!?!?

John Carmack appears to be doing it just fine and getting good results from the NV3.x cards. So what is wrong with Valve? I can't honestly believe that after all this ruckus they really tried to get Nvidia cards up and running to their potential.


Why is that sad? That's business. A good game on a good card. ATI cards are not bad cards, so if you want HL2, get one and sell your FX on eBay.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
This has gone off topic... bottom line, the ENTIRE GeForce FX line of video cards supports DX9. Supporting DX9 has nothing to do with how well the hardware runs DX9 software. Now, if nVidia were advertising their FX5200 with Half-Life 2 and using their "the way it's meant to be played" logo... I would have a problem with that. But taking the DX9 label off the low-end FX cards is stupid... they DO support DX9... there are no ifs, ands, buts, or maybes about it... it's a fact... it's not up for debate, end of story.
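
For what it's worth, here's a rough sketch of what "supports DX9" boils down to in practice. This is just something I knocked together (not anything Nvidia or Valve ships): ask the Direct3D 9 runtime for the adapter's caps and check whether it reports shader model 2.0. It's a yes/no capability report; it says nothing about speed.

// Minimal sketch: ask the D3D9 runtime whether the primary adapter
// exposes shader model 2.0 (the DX9-class feature level). This reports
// capability only; it says nothing about how fast those shaders run.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 runtime not available\n"); return 1; }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        bool sm2 = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) &&
                   caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
        std::printf("Shader model 2.0 (DX9-class) support: %s\n", sm2 ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}

An FX5200 and a 9800 Pro both answer "yes" to that check; whether the yes comes with playable frame rates is exactly the separate question spam keeps raising.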
 

spam

Member
Jul 3, 2003
141
0
0
Originally posted by: Jeff7181
This has gone off topic... bottom line, the ENTIRE GeForce FX line of video cards supports DX9. Supporting DX9 has nothing to do with how well the hardware runs DX9 software. Now, if nVidia were advertising their FX5200 with Half-Life 2 and using their "the way it's meant to be played" logo... I would have a problem with that. But taking the DX9 label off the low-end FX cards is stupid... they DO support DX9... there are no ifs, ands, buts, or maybes about it... it's a fact... it's not up for debate, end of story.

So Jeff, if the 5200 runs DX9 at, say, 1 frame per 10 seconds, is that okay with you? Is it a DX9 compatible part? Is that the way it's meant to be played? Do you think your perspective is going to be the common view on the matter? Would you tell a customer "this is a DX9 compatible part, go ahead and buy it"?

I really hope not! It would be more accurate to call it DX8.1; at least it would be functional. I really do not think Nvidia is going to gain any credibility by saying it is DX9 compatible. :frown: Don't you think that shows at least a lack of accuracy, and at worst a lack of integrity?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
You don't understand... it has everything that's required to be called DX9 compatible. I just told you I would have a problem with it if they sold the FX5200 with Half-Life 2 sporting the "the way it's meant to be played" logo. I'm not arguing about whether the FX5200 is too slow to play HL2. But that's not what "DX9 compatibility" means. Why can't you understand that?

*EDIT* Is it misleading to Billy-Bob who walks into Joe Blow's Computer Parts 'n Stuff to buy one of "them there new-fangled vidia card whatcha-call-its" to upgrade his "comploota" so he can play Half-Life 2 and Doom 3? Most definitely. Are they wrong in saying it's DX9 compatible? Absolutely NOT.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: Jeff7181
You don't understand... it has everything that's required to be called DX9 compatible. I just told you I would have a problem with it if they sold the FX5200 with Half-Life 2 sporting the "the way it's meant to be played" logo. I'm not arguing about whether the FX5200 is too slow to play HL2. But that's not what "DX9 compatibility" means. Why can't you understand that?

Jeff is completely right.
 

spam

Member
Jul 3, 2003
141
0
0
Of course Jeff is right, and I am not contesting the point of DX9 compatibility! What I am saying is that I do not think it is in Nvidia's interest to call it DX9 compatible. At least specify that the 5600 and the 5200 are not capable of running DX9 titles in DX9 with shaders. :D Otherwise it will continue to erode consumer confidence.
 

KGB1

Platinum Member
Dec 29, 2001
2,998
0
0
IIRC, the GF/FX was missing a bump mapping process used in DirectX 9. Though it is a moot point and really has nothing to do with the shaders and pipelines, it is part of how things are calculated in DirectX. I for one am not convinced that the GF/FX is DirectX 9 compliant at all. My GF4 Ti4200 is looking mighty good compared to the new models. nVidia just can't use parts of DirectX 9 and then generalize that the WHOLE GPU is compliant with it.

If I had money... I'd invest in a good Radeon 9800 Pro and wait it out until PCI Express is kicking some serious arse next year.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: spam
Of course Jeff is right, and I am not contesting the point of DX9 compatibility! What I am saying is that I do not think it is in Nvidia's interest to call it DX9 compatible. At least specify that the 5600 and the 5200 are not capable of running DX9 titles in DX9 with shaders. :D Otherwise it will continue to erode consumer confidence.

Originally posted by: spam
Do you think Nvidia would regain some customer trust if it corrected the labelling of the 5200 and 5600 as DX9 compatible? It should be changed to DX8.

No... it should not... they ARE DX9 compatible. Either you're changing your argument, or you still don't understand. THE FX5200 AND FX5600 ARE DX9 COMPATIBLE NO MATTER WHAT THEIR PERFORMANCE IS IN DX9 GAMES OR APPLICATIONS.

They are not advertising that the FX5200 will allow you to play Half-Life 2... they are advertising that the FX5200 is capable of running DX9 instructions. It makes no difference how fast it can run them... the fact that it can makes it DX9 compatible.

This is such a stupid argument... it's like saying AMD can't use multimedia performance as part of advertising their products because P4s do it better. That's insane.
 

Alkali

Senior member
Aug 14, 2002
483
0
0
Jeff is right.

Listen to him.

He is the Guru on this matter.

I have decided.

You can stop reading now.

Thanks.

Seeya Laters.

(Yeah keep reading the next post now thanks)

Look we are all done here...

Stop it, and go and read the next post...
 

spam

Member
Jul 3, 2003
141
0
0
A quote from the movie Matilda fits here:

"I'm big, you're small; I'm smart, you're dumb; I'm right, you're wrong" - So there!!!!!!!!!!!!!
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
How can you argue it? There's no question about it... they support DX9, end of discussion. It doesn't matter how they perform, they support DX9... hence the "DX9 compatible" label. Anyone with an IQ over 100 should be able to understand that.