ATI to make low cost DX9 products.


SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: McArra
But they'll code in DX9 and DX8 so any card can perform decently in the game. And those budget cards will HAVE TO USE DX8, as they can't run DX9 of any kind fast enough.

I agree, it's all marketing jargon to sell graphics cards to people who know nothing about computers.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Genx87
Would you rather the game developers code for the low end DX8 parts or the low end DX9 parts?

Get past the obvious dislike for Nvidia and see what the 5200 is really all about.

You mean marketing and money-making, because they say DX9 and the market jumps.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Genx87
ATI as usual is playing catch

You mean like the 6 months when nVidia had no DX9 product and ATi had the R300 core, and nVidia had nothing that would even touch it in performance (especially with AA/AF)?

You mean that kind of ATi always playing catch-up?

Good one.

Oh, and then there was the 5800 launch, which wasn't too great, given how short-lived the product was.

Yeah, ATi always plays catch-up.

ATi had the first DX9 product on the market, and then you say they're playing catch-up. Good one.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: rbV5
I beg to differ. I have never heard of 9500 Pros being failed 9700s. Either way, it is a bad decision to put a GPU that size in a card that fits the sub-$200 market. ATI admitted this by releasing the smaller 9600 Pro. Even worse was the 9500 non-Pro, which is the same GPU as the 9700 but was selling in the sub-$150 market. It isn't about eating 9700 sales; it is the costs involved in making a 9500 Pro. That is why it was a bad idea and obviously rushed.

Whether you heard it or not makes no difference. The reason the 9700 could run at 325 MHz on a 150 nm process was that the cores were hand-binned for speed. Do you really think they put the extra pipelines and 256-bit memory path on the 9500 Pro/9500 cards and then disabled them just so enthusiasts could "discover" them? Yeah, right. I'm not sure why you think it's a bad decision to increase your yield by saving cores you would otherwise have to toss, by disabling half the memory path and/or pipelines. That's called smart in my book.

The 9600 was produced not because they admitted the expensive 9500 was a mistake; it's because they were smart and went with the 130 nm process on their "value" card instead of betting the bank on a new process with their flagship like NV did. We all see the result. From what I hear, the R420 will borrow heavily from the 9600 (something like 3× the 9600's pipelines on a single 0.13 µm core). Sure looks like ATI has a plan to me.

To the first quoted poster: why was the Ti4600 so much more expensive than the Ti4200 at release? Because it cost so much more to make? Hardly, considering the differences were just core and memory clock speeds; the cards were basically 90% the same.

Do you know that the GPU isn't all that expensive to make, and with good yields it's even cheaper?
The PCB/memory paths are a fairly hefty cost, as can be the RAM (higher speeds cost more), so using a high-yield (R300) core on a simpler PCB with cheaper RAM gives you a lower-end product that's also a fair bit cheaper.

Are you going to say next that using a lower-clocked RV350 on a 64-bit memory bus PCB is stupid?
The difference between that possible low-end product and their mid-range 9600 cards is going to be basically the same as the difference between the 9700s and 9500s.

Oh, and markups too: they may not have had such good margins on the 9500s, but with good volume sales you still make a lot of money.
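To put some rough numbers on the binning argument: if some dies come off the wafer with defects only in half the pipelines or memory path, selling them as a cut-down part turns would-be scrap into revenue. A minimal sketch, with all yield figures invented purely for illustration (none of these are real ATI numbers):

```python
# Hypothetical salvage-binning arithmetic -- every number here is made up
# for illustration, not actual R300 yield data.
dies_per_wafer = 200
full_good_rate = 0.45    # dies fully functional -> sold as 9700-class parts
half_good_rate = 0.25    # dies with defects only in half the pipes/bus

full_parts = int(dies_per_wafer * full_good_rate)
salvage_parts = int(dies_per_wafer * half_good_rate)  # sold as 9500 with half disabled

sellable_without_binning = full_parts                  # salvage dies get tossed
sellable_with_binning = full_parts + salvage_parts     # salvage dies become 9500s

print(sellable_without_binning, sellable_with_binning)  # 90 vs 140 sellable dies
```

Under these assumed rates, binning turns 90 sellable dies per wafer into 140, which is the "increase your yield by saving cores you would otherwise toss" point in concrete terms.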

rbV5 is totally right.

So STFU, kthxbye.


Apparently, the average selling price of a current-generation high-end graphics processor over its lifetime is $18, so whether a graphics card maker gets the RADEON 9800 PRO or the GeForce FX 5800 Ultra, it pays roughly $18 per chip, depending on the timeframe. I believe that high-end graphics processors from NVIDIA and ATI are priced more or less equally and do not account for a substantial part of graphics cards' costs.

Given that the $399 per card includes the retailer's profit, taxes, and transport, as well as the graphics card manufacturer's margin, we can figure out that it does not really cost too much money to manufacture a high-end graphics card. Maybe about $120 or $150, but definitely not more than $200, at least as long as it is not something really exclusive or, maybe, the first batch of products with low yields and so on. Considering that graphics chips only cost about $18, we can figure out that there are two more components of a high-end graphics card that may cost a substantial amount of money: the PCB and the memory. Based on various reports I can say that fast DDR SDRAM or DDR-II SDRAM costs more than the GPU or even the PCB; that is basically why NVIDIA decided not to utilise DDR-II memory with its NV35: the very high price of these advanced memory products.

Simplify the PCB = cheaper to make, slower RAM = cheaper to make. GPU = fairly cheap.
Hence the R300 wasn't a bad thing to use on the 9500 cards.
Just in case you wanted some backing up. Quote taken from Xbitlabs (Don't have the exact link)
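A back-of-envelope version of that quote's build-cost estimate. Only the $18 GPU figure comes from the article; the memory, PCB, and "other" line items below are guesses added for illustration:

```python
# Rough BOM sketch for a high-end card. Only the GPU price ($18) comes
# from the quoted Xbitlabs piece -- the other line items are assumptions.
bom = {
    "gpu": 18,       # quoted average selling price per high-end chip
    "memory": 60,    # fast DDR reportedly costs more than the GPU or PCB
    "pcb": 30,       # complex 256-bit board (assumed)
    "other": 20,     # cooler, components, assembly (assumed)
}
build_cost = sum(bom.values())
print(build_cost)  # 128 -- inside the article's $120-$150 estimate
```

Against a $399 retail price, that leaves a large gap for retailer profit, taxes, shipping, and margin, which is exactly the article's point: the GPU itself is the cheap part, so reusing a high-yield core on a simpler board makes a budget card viable.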
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
But how would you feel if the R300 had been a Radeon 8500 with DX9 features? Same performance but more features, hardly usable.
When a company brings out a new generation of a product, it should outperform the previous one, shouldn't it?


If the R300 had been the 8500 performance-wise with DX9 features, I would have laughed my ass off, because the R300 is a top-end part. Now, if ATI had released a DX9 part at the low end, I would have thought it a wise move, rather than trying to keep selling off the many, many warehouses of R200s that flopped. And who told you the 5200 isn't usable?! Just because it doesn't run HL2 at 60 FPS doesn't mean it isn't usable. Somebody who is purchasing a 5200 is not doing it to get a top-end video card. They are getting it to toy around on the desktop and do some casual gaming. Hell, the 5200 still runs Quake III at ~50-60 FPS. That means the card should play 90% of the games out there just fine. That to me seems very usable.

It comes down to a simple fact. Nvidia is pushing the envelope to get DX9 parts to the masses. ATI, as usual, is playing catch-up and won't even get their part out until 2004. Just like ATI goofed up the R300 launch by not having a card that could compete in the $80-220 market. They rushed out the 9500/Pro, which was a bad idea for them, as it was a big core and cost a lot for the intended market. They are goofing up this launch. They should have had a DX9 part at least by the time Nvidia showed up with the FX cards in April.


Genx87 - you're such an nvidia fanboy it borders on trolling. I can't even choose where to begin - these quoted paragraphs speak volumes unto themselves. Why do you kiss the ground beneath nVidia's feet? You don't seem to make a single admission that ATI has done anything good in their history, while nVidia are the godly pioneers of video.

According to you, ATI goofed up their R300 launch, yet that single release boosted them up from constant underdog to performance leader. So, ATI screwed up with their R300 launch while nVidia took another 8 months just to release a rushed DX9 part to compete?

Not trying to keep selling off the many many warehouses of R200s that flopped.
Funny, last I checked, nVidia's top retail sellers were still the Ti4200-8X parts. Correct me if I'm wrong, but those are *gasp* DX8 parts, just like R200 and R250.

Hell the 5200 still runs QuakeIII@ ~50-60 FPS

That's excellent, seeing as back in the GeForce 3 days that game could be run at over a hundred FPS already.

What exactly is your problem, Genx87? Can you not concede that ATI makes excellent video cards, just like nVidia does? Can you not concede that ATI engineered a superior part in the R300 (Radeon 9700) and caught nVidia completely by surprise? That ATI held the undisputed performance lead until nVidia pushed the FX 5800 out the gates?

Why don't you just bring up ATI's fabled "horrible drivers" while you're at it? And why stop there; you can orchestrate a full-on boycott of their evil Canadian products while you're at it!

There are two sides to every argument. For a change, next time you decide to post something, consider the other side!
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Re: ATi to make low cost DX9 products.

~$100 128MB 9600 = low-cost DX9 product, IMO, and one far superior to a 5200 and favorably comparable to a 4200.
 

MDE

Lifer
Jul 17, 2003
13,199
1
81
The whole reason (IMHO) for dirt-cheap DX9 cards is compliance with Longhorn, which will use DX9 acceleration in the GUI. I'm sure an FX 5200, or whatever low-end card ATI puts out, will be able to handle that kind of work.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Wow...flame wars II...ATi strikes back.

I wish I had money to buy up some ATi stock. Maybe if I had some Nvidia holdings, I could sell them off and buy up some Canuck graphics. It's kinda strange how north of the border, 2D image quality goes up and profit margins go down :)