ATI to make low-cost DX9 products.

McArra

Diamond Member
May 21, 2003
3,295
0
0
I've just read it in a Spanish online computer magazine. It seems they're set to compete with the FX 5200 series. They'll be 0.13 micron, and it looks like it's the RV350 core redesigned to run at 300MHz core and 500/600 memory, which might be 64-bit wide!! I hope it's 128-bit. Damn, 64-bit is crap!! It's supposed to launch in 2004.

I just hope it performs better than the 5200 series. There's too much crap out there already.
 

blazer78

Senior member
Feb 26, 2003
436
0
0
lol, i hope so.... cuz i might need a new dx9 card considering what i got in 3DMark03 with my r9100.... lol but then again... i'm in the majority... =)
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: nemesismk2
I thought ATI would release a budget dx9 video card eventually especially after they saw how well the fx5200 was selling compared to their rubbish r9200.

The 5200 isn't really much better than the 9200, except that it has DX9 support, so the consumer thinks this makes it better. It does in terms of features, but in terms of raw speed the 5200 and 9200 are fairly evenly matched. The only "rubbish" thing about the 9200 is the lack of DX9, not the speed. If ATi released a card that was the same speed as the 9200 but had DX9 support, it would probably sell better than the 9200 simply because of the DX9 support, nothing to do with speed.

And the 9600 non-Pro isn't all that expensive anyway; a cheaper version would easily be within the 5200's price range.


Xbitlabs article on the new ATi DX9 budget cards

ATI Technologies is set to start sampling its code-named RV351 chip shortly in order to mass-produce it in late Q4 2003 or, more probably, in Q1 2004. The chip will have 4 rendering pipelines, DirectX 9.0 and AGP 8x support, and other features of the RADEON 9600 (aka RV350) product line. ATI will redesign the VPU a bit in order to reduce manufacturing costs. Most likely, the main trump of the whole RV350/RV360 VPU family, their high core speed, will not be inherited by the RV351 due to price constraints. Obviously, you should expect some 275-325MHz VPU, 400-500MHz memory and a simple PCB with a 64- or 128-bit memory bus.

Performance of the part with a 128-bit memory bus should be higher than the RADEON 9200 PRO's, but lower than the RADEON 9600 PRO's. Nevertheless, in case NVIDIA Corporation, the main rival of the Canadian graphics processors' developer, offers something more powerful for the entry-level market, ATI will have to boost the performance of the RV351, I believe.

The GeForce FX 5200 Ultra performs slightly above the level of a GeForce4 MX 460 in situations where no AA or aniso is enabled.
Not all that great, and without AA/AF it's pretty much the same as the 9000 Pro (and thus the 9200 Pro). With AA/AF it's better, but then again, not all budget users are going to want AA/AF.
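Since the bus-width question keeps coming up: here's a quick back-of-the-envelope on what 64-bit vs 128-bit means for peak memory bandwidth, using the 500MHz effective memory clock rumored above (my own arithmetic, not from the article):

```python
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak theoretical memory bandwidth in GB/s (1 GB = 1e9 bytes).

    bandwidth = (bus width in bytes) * (effective memory clock in Hz)
    """
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# 64-bit vs 128-bit bus at the rumored 500MHz effective memory clock
print(bandwidth_gbs(64, 500))   # 4.0 GB/s
print(bandwidth_gbs(128, 500))  # 8.0 GB/s
```

Halving the bus width halves peak bandwidth, which is why a 64-bit RV351 would likely be bandwidth-starved the same way the 64-bit FX 5200 boards are.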
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
a 9000 pro level dx9 card would be great, and if they sold it for like 60 bucks or less AWW YEA!
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: McArra
I've just read it in a Spanish online computer magazine. It seems they're set to compete with the FX 5200 series. They'll be 0.13 micron, and it looks like it's the RV350 core redesigned to run at 300MHz core and 500/600 memory, which might be 64-bit wide!! I hope it's 128-bit. Damn, 64-bit is crap!! It's supposed to launch in 2004.

I just hope it performs better than the 5200 series. There's too much crap out there already.

Honestly, how pointless is that? You can't have your cake and eat it too. Want a low-cost DX9 card? Get a 9600. How much better-performing than the 5200 can you get before you essentially have a Radeon 9600 or an FX 5600? They don't want to cannibalize their own market share, now do they?
 

modedepe

Diamond Member
May 11, 2003
3,474
0
0
It would be nice if it would perform better than the fx5200, but I don't see that happening while keeping it as a budget card. It will probably perform pretty much on par with the 5200, maybe a little better if the memory is 128 bit. Buying any budget dx9 card for a while is going to be a waste, but those who don't research before buying will get the dx9 cards anyway, thinking they're better.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: modedepe
It would be nice if it would perform better than the fx5200, but I don't see that happening while keeping it as a budget card. It will probably perform pretty much on par with the 5200, maybe a little better if the memory is 128 bit. Buying any budget dx9 card for a while is going to be a waste, but those who don't research before buying will get the dx9 cards anyway, thinking they're better.

That's it
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
While you people piss and moan about the 5200, you obviously miss the point of the card. It introduces DX9 support to the largest-volume arena. While you whine about Nvidia's PS 2.0 support and DX9 performance, ask yourself: why would Nvidia be trying to get DX9 cards into the majority of machines sold? The faster the market gets taken over by DX9 cards, the faster DX9 games will be developed using the features of DX9. My guesstimate is that the DX9 complaints are being fueled by fanboi sites who will do anything to smear Nvidia. See AMD vs Intel back in the day...

Again, we obviously can't wait for ATI to bring the standards to the industry, as they seem content to leave an aging dog in the R200 for that market.

 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I think unusable DX9 is a mistake. It would have been much better if NVidia had left the GF4Ti as the budget card, as it performs incredibly well and isn't that expensive. An FX5200 is like a car with an incredible tuning look and a 30HP engine. 100km/h at best..... A huge piece of crap. The GF4Ti is more like a GTi: compact looks but a powerful 170HP engine.

So you're telling me you'd rather have features that can't even be used than a less-featured but powerful card?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I think unusable DX9 is a mistake. It would have been much better if NVidia had left the GF4Ti as the budget card, as it performs incredibly well and isn't that expensive. An FX5200 is like a car with an incredible tuning look and a 30HP engine. 100km/h at best..... A huge piece of crap. The GF4Ti is more like a GTi: compact looks but a powerful 170HP engine.

Who said it isn't usable?! And the FX5200 probably costs less than the GF4 from a manufacturing point of view.

So you're telling me you'd rather have features that can't even be used than a less-featured but powerful card?


You are missing the point. When developers look at the cards on the market while designing a game and see a DX7/8 part at the low end, that is what they will develop for. When they see a DX9 part, they will start using DX9 features in their games. They always want to go with the lowest common denominator so they know their game will hit the largest possible market. If Nvidia and ATI only sold DX9 parts in the middle-to-high-end markets, it would take a lot longer to get DX9 titles out the door, as game companies wouldn't want to miss out on the largest portion of the market.

Just because the 5200 might not be very good at playing DX9 games (apparently even the 9800s and 5900s aren't too hot) doesn't mean it is a complete waste. Nobody is buying a DX9 5200 to play the latest, greatest game at 200 FPS. But the game companies see all markets being covered by DX9 parts and start to code for it. If the 5200 were a DX8 part, we would be waiting much longer for DX9 titles.

 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I'm sure you've seen the latest Tomb Raider DX9 benchmarks on FX5200 cards. If that is playable...... Sure, it brings faster DX9 adoption to the market, but those DX9 games will have to include a DX8 path to run on the FX5200 at playable framerates. So what's the point of being DX9? It's a marketing product... damn, even a crappy GF4MX runs as fast as it does. It is a new-generation card that can't outperform a DX7 card!
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I'm sure you've seen the latest Tomb Raider DX9 benchmarks on FX5200 cards. If that is playable...... Sure, it brings faster DX9 adoption to the market, but those DX9 games will have to include a DX8 path to run on the FX5200 at playable framerates. So what's the point of being DX9? It's a marketing product... damn, even a crappy GF4MX runs as fast as it does. It is a new-generation card that can't outperform a DX7 card!


Oh god, how many times do I need to explain this?!

DX9 low end == DX9 titles coming out faster.

And I saw the Tomb Raider demo. It isn't pretty even for the 9800 Pros/5900 Ultras.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
It isn't pretty for NVidia, and yes, I understand what you said about DX9 games. But in practice that support will not be used by budget DX9 cards, as they aren't powerful enough to run it, and backward-compatible code paths will have to be used, so there's no real benefit.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Would you rather the game developers code for the low end DX8 parts or the low end DX9 parts?

Get past the obvious dislike for Nvidia and see what the 5200 is really all about.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
It's not that I dislike NVidia; in fact, I was critical of ATI's budget part at the beginning of this thread. It's true it speeds up DX9 adoption, but DX9 won't really be usable on such cards. The only strong point I see in them is the possibility of running the upcoming Windows, as Pixel Shader 2.0 will be required. But how would you feel if the R300 had been a Radeon 8500 with DX9 features? Same performance but more features, hardly usable.
When a company brings out a new generation of a product for market X, it should outperform the previous one, shouldn't it?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
But how would you feel if the R300 had been a Radeon 8500 with DX9 features? Same performance but more features, hardly usable.
When a company brings out a new generation of a product for market X, it should outperform the previous one, shouldn't it?


If the R300 had been the 8500 performance-wise with DX9 features, I would have laughed my ass off, because the R300 is a top-end part. Now, if ATI had released a DX9 part to the low end, I would have thought it a wise move, not trying to keep selling off the many, many warehouses of R200s that flopped. And who told you the 5200 isn't usable?! Just because it doesn't run HL2 at 60 FPS doesn't mean it isn't usable. Somebody who is purchasing a 5200 is not doing it to get a top-end video card. They are getting it to toy around on the desktop and do some casual gaming. Hell, the 5200 still runs Quake III at ~50-60 FPS. That means the card should play 90% of the games out there just fine. That, to me, seems very usable.

It comes down to a simple fact. Nvidia is pushing the envelope to get DX9 parts to the masses. ATI, as usual, is playing catch-up and won't even get their part out until 2004. Just like ATI goofed up the R300 launch by not having a card that could compete in the $80-220 market. They rushed out the 9500/Pro, which was a bad idea for them as it was a big core and cost a lot for the intended market. They are goofing up this launch too. They should have had a DX9 part out at least by the time Nvidia showed up with the FX cards in April.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
"Low cost" as in good products, or just some piece of junk that will likely never be able to run a full directx 9 game in any sort of glory?

My experience with the low end of graphics cards is that you have to search the value bin for year-old games that you can run on them.

Nvidia already brought out their budget directx 9 card, and it can't even beat a R8500 in directx 8!
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: SickBeast
"Low cost" as in good products, or just some piece of junk that will likely never be able to run a full directx 9 game in any sort of glory?

My experience with the low end of graphics cards is that you have to search the value bin for year-old games that you can run on them.

Nvidia already brought out their budget directx 9 card, and it can't even beat a R8500 in directx 8!

I'm referring to this. It is supposed to be faster than the previous generation, or at least the previous budget part, and it isn't.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
They rushed out the 9500/Pro, which was a bad idea for them as it was a big core and cost a lot for the intended market.

Bad idea my a$$. The 9500 was definitely not rushed; it simply used cores intended for the 9700 that didn't pass speed binning, which ultimately made it possible to maximize yields and deliver hand-picked cores for 9700 cards on the larger process that everyone said wasn't possible... hello. The 9500 was a short-lived coup that will be in demand until it's gone, just like the MUCH larger-production ti4200 (which definitely cost NV a lot more in lost ti4600 sales than the 9500 will cost ATI in lost 9700 sales).

The 5200 is nothing more than an OEM's ticket to "DX9" in da box.

 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
"Low cost" as in good products, or just some piece of junk that will likely never be able to run a full directx 9 game in any sort of glory?

My experience with the low end of graphics cards is that you have to search the value bin for year-old games that you can run on them.

Nvidia already brought out their budget directx 9 card, and it can't even beat a R8500 in directx 8!


It is just hilarious reading the same old argument over and over.

Again would you rather developers code for the low end DX8 card or the low end DX9 card?

Bad idea my a$$. The 9500 was definitely not rushed; it simply used cores intended for the 9700 that didn't pass speed binning, which ultimately made it possible to maximize yields and deliver hand-picked cores for 9700 cards on the larger process that everyone said wasn't possible... hello. The 9500 was a short-lived coup that will be in demand until it's gone, just like the MUCH larger-production ti4200 (which definitely cost NV a lot more in lost ti4600 sales than the 9500 will cost ATI in lost 9700 sales).


I beg to differ. I have never heard of 9500 Pros being failed 9700s. Either way, it is a bad decision to put a GPU that size in a card for the sub-$200 market. ATI admitted this by releasing the smaller 9600 Pro. Even worse was the 9500 non-Pro, which is the same GPU as the 9700 but was selling in the sub-$150 market. It isn't about eating 9700 sales; it is the costs involved in making a 9500 Pro. That is why it was a bad idea and obviously rushed.

The 5200 is nothing more than an OEM's ticket to "DX9" in da box.

Exactly, and when those developers see that many DX9 cards in the market, they will start using DX9 features.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
But they'll code for both DX9 and DX8 so any card can perform decently in the game. And those budget cards will HAVE TO USE DX8, as they can't run DX9 of any kind fast enough.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I beg to differ. I have never heard of 9500 Pros being failed 9700s. Either way, it is a bad decision to put a GPU that size in a card for the sub-$200 market. ATI admitted this by releasing the smaller 9600 Pro. Even worse was the 9500 non-Pro, which is the same GPU as the 9700 but was selling in the sub-$150 market. It isn't about eating 9700 sales; it is the costs involved in making a 9500 Pro. That is why it was a bad idea and obviously rushed.

Whether you heard it or not makes no difference. The reason the 9700 could run at 325MHz on a 150nm process was because the cores were hand-binned for speed. Do you really think they put the extra pipelines and 256-bit memory path on the 9500 Pro/9500 cards and then disabled them just so enthusiasts could "discover" them... yeah right. I'm not sure why you think it's a bad decision to increase your yield by saving cores you would otherwise have to toss by disabling half the memory path and/or pipelines... It's called smart in my book.

The 9600 was produced not because they admitted the 9500 was an expensive mistake; it's because they were smart and went with the 130nm process on their "value" card instead of betting the bank on a new process for their flagship like NV did. We all see the result. From what I hear, the R420 will borrow heavily from the 9600 (like 3x the 9600's pipelines on a single 0.13 core); sure looks like ATI has a plan to me.