R250: ATI to Announce RADEON 8500XT and RV250 in March


NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< "Pariah, is it just me, or do you have a deep hate for nVidia?"

Nah, I don't hate any company. It's computer hardware, not a religion, though some here would claim otherwise. It's just tiring seeing this company given credit for everything when they deserve it for almost nothing. Their business methods and tactics are no better than a Creative, Intel, or MS, all roundly hated around here, but for some reason they seem to have acquired an almost cult-like following because they are able to get 15fps more in Quake3 or 200 points more in MadOnion benchmarks. There is far more to a video card than benchmarks. I'm not like some of the people on the other side of the fence either; I've owned five NVidia based cards, as well as using numerous others on other people's computers, and I have always been disappointed with them. I've also owned a PowerGraph 64 card, 2 Matrox cards, 5 3dfx cards, and one ATI card, so I'm not basing my opinion on 10 pages of benchmarks from THG or Anandtech like some people here. A card's usability can't be judged by benchmarks.

It's also extremely disheartening to see this company run one company after another out of business because they fooled the lemmings into thinking framerates are everything. The truly innovative companies are all falling by the wayside, and all we are left with is a marketing giant that sells video cards on the side.
>>


It's the American way;)

Lead, follow, or get the %#&@ out of the way. I think we know which category NVIDIA, Microsoft, and Intel fall into. If you don't know what it takes to compete, then you don't deserve to stay alive...3dfx screwed themselves by delaying products and not delivering new features in a timely fashion.

I've used the following graphics accelerators in my system:

Diamond Stealth II S220 (Rendition V2100)
Canopus Pure3D (6MB Voodoo1)
Elsa Erazor II (TNT)
Voodoo3 3000
Gigabyte 32MB SDR GeForce256
Asus V7700 64MB DDR GeForce2
PowerVR Kyro II
ATI AIW Radeon
NVIDIA Reference GF3 Ti200

I have been satisfied with ALL of the cards. The only one that I ever had problems with was the Kyro II.
 

Akaz1976

Platinum Member
Jun 5, 2000
2,810
0
71
When I said that ATi has come a long way from the days of putting up the Rage Fury MAXX against the Geforce 256, what I meant was that their impression and following at these boards has improved significantly, and people who are members of these forums act as a leading indicator of what Joe Blow is gonna be buying 6 months to a year from now.

When the Geforce came out, these boards and most of the tech community immediately hailed Nvidia as the market leader, but it took Nvidia a good six months to convert that superior product into a market share lead. So as ATi gains more acceptance on these boards, I would expect to see its market share (currently around 20-25% of the desktop discrete card market) rebound in the actual market 6 months down the road. Though ATi is unlikely to regain its market share of yore (70%+), it is likely to improve from its current levels and bite into Nvidia's market (which is at 66%, I believe).

Akaz
 

Bushwicktrini

Senior member
Jan 8, 2002
756
2
81


<<

<< "Pariah, is it just me, or do you have a deep hate for nVidia?"

Nah, I don't hate any company. It's computer hardware, not a religion, though some here would claim otherwise. It's just tiring seeing this company given credit for everything when they deserve it for almost nothing. Their business methods and tactics are no better than a Creative, Intel, or MS, all roundly hated around here, but for some reason they seem to have acquired an almost cult-like following because they are able to get 15fps more in Quake3 or 200 points more in MadOnion benchmarks. There is far more to a video card than benchmarks. I'm not like some of the people on the other side of the fence either; I've owned five NVidia based cards, as well as using numerous others on other people's computers, and I have always been disappointed with them. I've also owned a PowerGraph 64 card, 2 Matrox cards, 5 3dfx cards, and one ATI card, so I'm not basing my opinion on 10 pages of benchmarks from THG or Anandtech like some people here. A card's usability can't be judged by benchmarks.

It's also extremely disheartening to see this company run one company after another out of business because they fooled the lemmings into thinking framerates are everything. The truly innovative companies are all falling by the wayside, and all we are left with is a marketing giant that sells video cards on the side.
>>


It's the American way;)

Lead, follow, or get the %#&@ out of the way. I think we know which category NVIDIA, Microsoft, and Intel fall into. If you don't know what it takes to compete, then you don't deserve to stay alive...3dfx screwed themselves by delaying products and not delivering new features in a timely fashion.

And let's not forget the Diamond Viper II with the broken T&L. I may not have as much experience with all of the video cards as some members, but I have to say that disliking a company for taking advantage of what the "LEMMINGS" want is a little silly. I mean come on, all of the big companies do it, some to a differing degree than others, but all do it. IBM has bought a lot of tech companies instead of inventing the tech themselves, but I don't see the hate sent at them the way people hate MS, Intel et al.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"I think we know which category NVIDIA, Microsoft, and Intel fall into."

Nvidia, despite their tactics, doesn't even deserve to be mentioned in the same breath as Intel and MS. Both of those companies created industries; there was no PC to speak of before Intel and MS. These two companies brought the PC to the masses. NVidia did no such thing. 3dfx brought 3D gaming to the masses.

"3dfx screwed themselves by delaying products and not delivering new features in a timely fashion."

3dfx screwed themselves long before they started delaying products. They had engineers that were the best in the industry, and morons making decisions on the direction of the company. 3dfx was dead the day they bought STB and decided to market their cards exclusively to the public. Everyone and their grandmother could see that was an idiotic idea from the word go.

"nVidia has a cult following because they over-engineer their cards with excellent drivers for the most part."

2 things on this. One, Nvidia is not #1 because of "overengineered" cards. They are #1 because they are #1 in benchmarks. If the Radeon 8500 was 25-30% faster than the Ti500, the majority of people would have overlooked the issues with ATi's drivers, which for most people weren't issues anyway.

2, the only reason why their cards can be considered overengineered is that they have been working on the same card for 3 years and have released it under about 25 different variations. I would hope by this point they would have most of the bugs in hardware and drivers worked out, because they sure have had enough time to work on them.

This conservative development cycle is also the reason they have attained the dominant position. Yes, conservative: people think that because Nvidia is releasing a new series every 6 months, they are some sort of development wunderkinds. Bull, Nvidia has been releasing the same card for 3 years, a la Creative and the Live series. The other companies, while trying to innovate, certainly could not keep up with a 6 month product cycle. Any company releasing truly new products could never keep up with a 6 month cycle. Nvidia simply flooded the market with one slightly faster card after another that brought nothing new to the table but was always able to just eke out in front on benchmarks. Nvidia's incremental speed increase strategy has stifled true advancements in the industry and really done a disservice to gamers. Nothing here is going to change either, unless ATi pulls a miracle out of their hat, or some startup company comes along with something so revolutionary that it takes the industry by storm (BitBoys, I'll tell you where you can store that press release until we see a card on a retail shelf).
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Frankly, I strongly dislike nVidia as a company; some of their tactics are disgusting.
Their whole "disgrace the Kyro" fiasco was appalling, and all the little PR games they played with Hypothermia/HardOCP a while back were less than polite. They were one of the first to start the benchmark cheating parade with their WinBench98 'tweaks'.

That said, I admire their success. They've come from nowhere to bringing down the industry giant 3dfx and cornering a large portion of the enthusiast market... I just really dislike some of their less reputable methods of doing so.

Not that any of that has ever stopped me from purchasing nVidia graphics cards, and I've used them/recommended them and sold them and I'm sure I will do more of it in the future.






<< hmm, the GPU? Who came up with that, oh yeah, nVidia. What about GTS? First with the GeForce 2 >>



GPU: A fancy name meaning Graphics Processing Unit; they made it up to signify their chips had T&L... I don't applaud them for coining a name :p
GTS: Another fancy name meaning Giga-Texel Shader... the GTS was the first chip with a peak theoretical texel fillrate of 1 GT/s (1000 MT/s); the arithmetic behind that number is sketched below. Again, no new technology... just a fancy name.
They weren't the first with 32-bit color, they weren't the first with T&L... hell, T&L had been used for years in the professional market before nVidia had it.
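
For what it's worth, that "Giga-Texel" figure is just core clock x pixel pipelines x texture units per pipeline. A minimal sketch of the arithmetic, assuming the commonly quoted GeForce2 GTS specs (200 MHz core, 4 pipelines, 2 TMUs each), which aren't stated anywhere in this thread:

[code]
def texel_fillrate(core_mhz, pipelines, tmus_per_pipe):
    """Peak theoretical texel fill rate, in megatexels per second."""
    return core_mhz * pipelines * tmus_per_pipe

# Assumed GeForce2 GTS specs: 200 MHz core, 4 pixel pipelines, 2 TMUs per pipeline.
gts = texel_fillrate(core_mhz=200, pipelines=4, tmus_per_pipe=2)
print(f"GeForce2 GTS: ~{gts} MT/s (~{gts / 1000:.1f} GT/s)")  # ~1600 MT/s, i.e. past the 1 GT/s mark
[/code]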

On this point I'm in agreement with Pariah, they've never really been the first to have anything. They've been the first to bring certain things to the desktop market, but all of those things have always been used elsewhere long before nVidia ever had them.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Nvidia, despite their tactics, doesn't even deserve to be mentioned in the same breath as Intel and MS. Both of those companies created industries; there was no PC to speak of before Intel and MS. These two companies brought the PC to the masses. NVidia did no such thing. 3dfx brought 3D gaming to the masses. >>


Then 3dfx doesn't either, b/c they are fuggin DEAD! ;) NVIDIA is more of a mini-Intel, and that is a fact. Just take a look at how much clout they have and how much market share they command.


<< Bull, Nvidia has been releasing the same card for 3 years >>


I guess ATI has too?? That, along with most of what you have said in this thread, has been anti-NVIDIA all the way around. It's pretty obvious from reading your posts that you have NO respect for them whatsoever and that you can't give them credit for ANYTHING. You make it seem like NVIDIA has become some sort of devil or something just for being successful.

That's all I have to say about that.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81


<< Bull, Nvidia has been releasing the same card for 3 years, a la Creative and the Live series. >>



3dfx was releasing the same card for 4 years. The Voodoo 5 is nothing more than an advanced Voodoo1 with twin pipelines and higher clock speeds. On the other hand, there are significant differences between the TNT, GeForce, and GeForce 3 series.

ATI relied on POS Rage 3d cards until 2000, when they suddenly realized that their market share had gone to the crapper and they had lost 60 million in Q3 alone. From then on, ATI cards followed the same evolutionary approach as Nvidia cards.

PowerVR, again, same evolutionary approach, but with a "chunking" architecture. Neon, Kyro, Kyro II, and Kyro II Ultra: faster clock speeds, mostly the same basic design. The Kyro III (if it ever ships) will add DDR memory and a DX8 feature set.

This is not the 1970s, when engineering teams created new CPU cores every three months and supercomputers were designed from scratch. What matters today is the ability to execute and slowly improve a design.

Leon
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<<

<< Bull, Nvidia has been releasing the same card for 3 years, a la Creative and the Live series. >>



3dfx was releasing the same card for 4 years. The Voodoo 5 is nothing more than an advanced Voodoo1. On the other hand, there are significant differences between the TNT, GeForce, and GeForce 3 series.

ATI relied on POS Rage 3d cards until 2000, when they suddenly realized that their market share had gone to the crapper and they had lost 60 million in Q3 alone. From then on, ATI cards followed the same evolutionary approach as Nvidia cards.

PowerVR, again, same evolutionary approach, but with a "chunking" architecture. Neon, Kyro, Kyro II, and Kyro II Ultra: faster clock speeds, mostly the same basic design. The Kyro III (if it ever ships) will add DDR memory and a DX8 feature set.

This is not the 1970s, when engineering teams created new CPU cores every three months and supercomputers were designed from scratch. What matters today is the ability to execute and slowly improve a design.

Leon
>>


THANK YOU!
 

Rand

Lifer
Oct 11, 1999
11,071
1
81


<< Bull, Nvidia has been releasing the same card for 3 years >>




How so?
The GF2 was little more than a modified GF1, and the GF4 is little more than a modified GF3. But the GF2 and GF3 I would classify as fairly different, and the GF1 and TNT1/2 are reasonably different.
If anyone re-released the same card over and over, I'd say 3dfx did that.
Even the V3 was little more than an overclocked and modified Voodoo1.

I don't think nVidia's done any more re-releasing of the same card than anyone else has.
They just have some rather misleading marketing... i.e., the GF4 MX is little more than a hyped-up GF2. And WAY too many revisions.
There must have been about 10 different GF2 cores they released.

I like ATi's naming scheme.
7XXX/8XXX
First digit denotes the DirectX revision features supported in hardware, and the last three digits denote the relative performance.
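
A toy decoder for that scheme, based on my reading of it rather than anything official from ATi:

[code]
def decode_radeon(model):
    """Split a Radeon model number like '8500' into (DX generation, performance tier)."""
    dx_gen = int(model[0])       # 7xxx -> DX7-class hardware, 8xxx -> DX8-class
    perf_tier = int(model[1:])   # higher trailing digits -> faster part within that generation
    return dx_gen, perf_tier

for m in ("7500", "8500"):
    gen, tier = decode_radeon(m)
    print(f"Radeon {m}: DX{gen}-class hardware, performance tier {tier}")
[/code]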



<< Voodoo 5 is nothing more than advanced Voodoo1 with twin pipelines and higher clock speed. >>



I disagree; personally I'd say it was the V3 and prior that were very close to the V1. The V5/VSA100 was quite different, IMHO.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0


<< w/TV Wonder VE. The VE was a POS! Now I'm using a Hauppauge WinTV card and it rocks >>



I agree. It's time ATI released a TV card based on the Rage Theater chip; then maybe I could consider a card other than the AIW.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"Voodoo 5 is nothing more than advanced Voodoo1 with twin pipelines and higher clock speed"

Voodoo5 was based on a significantly different architecture compared to previous Voodoo lines. I agree completely that 1-3 were basically the same architecture sped up, but the Voodoo2 line did implement SLI, which was something completely different from the competition and actually, truly benefited the user in the games they wanted to play then. Nvidia has brought nothing of this nature to the industry.

"On the other hand, there are significal differences between TNT, GeForce, and GeForce 3 series."

I'm not going back to the TNT; there were major differences between the TNT and Geforce. Nothing significant has been changed since the original Geforce, just more of the same. Increasing memory speed and adding pipelines isn't exactly revolutionary.

"From then, ATI cards followed the same evolutionary approach as Nvidia cards."

ATi has gone through the Rage128, FuryMaxx (SLI on a card), and Radeon architecture which are all completely different architectures in the time Nvidia has been stuck on the Geforce line.

You want to argue that Nvidia isn't releasing the same card every 6 months. Then answer me this, what compelling reason has there been to upgrade from a Geforce1 to 2, or 2 to 3, etc besides faster benchmarks? Has there been even one feature introduced since the Geforce that everyone thought, wow, I want that, that will change the way we play games? Is it coincidental that the industry hasn't really seen anything new since the Geforce started dominating the market? I think not. Nvidia killed off innovation and turned the industry into a benchmark race.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81


<< but the Voodoo2 line did implement SLI, which was something completely different from the competition and actually, truly benefited the user in the games they wanted to play then. >>



Benefited the user in terms of higher speed. Higher speed = higher benchmarks, in your own terms.



<< Nothing significant has been changed since the original Geforce, just more of the same. Increasing memory speed and adding pipelines isn't exactly revolutionary.
>>



The first fully programmable shader architecture, with major improvements in terms of features and speed. Many software developers consider this a major step forward.



<< Voodoo5 was based on a significantly different architecture compared to previous Voodoo lines. >>



Yes, slap two TNT2 err I mean VSA100 cores and add 32bit color (supported by G200 for two years) with FSAA capability.



<< ATi has gone through the Rage128, FuryMaxx (SLI on a card), and Radeon architecture which are all completely different architectures in the time Nvidia has been stuck on the Geforce line.
>>



Rage 128 ---- Fury Maxx

Slap two outdated Rage 128 cores together in a desperate attempt to compete with the GeForce DDR. Then write a nice excuse e-mail telling customers that the product does not support Windows 2000.

Innovation at its best!


I think you should learn what the term "innovation" really means.

 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Pariah, nVidia introduced T&L to the mainstream, then they introduced P&V shaders with the GF3.

NV4 - 32 bit color, hi-res textures, etc
NV5 - Upgraded NV4
NV10 - T&L
NV15 - Upgraded NV10
NV20 - P&V Shaders
NV25 - Upgraded NV20

See the pattern?
Evolution: with every new core (TNT, GF1, GF3), they introduce entirely new features, then they refine that core (TNT2, GF2, GF4).

I don't remember if it was Carmack or Sweeney who got the question "What hardware are you looking forward to the most?", but whoever it was, the answer was "The NV20, it's gonna change everything".

So I guess not everyone agrees with you that nVidia never changes anything.

You really do hate nVidia, don't you?
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0


<< ATi has gone through the Rage128, FuryMaxx (SLI on a card), and Radeon architecture which are all completely different architectures in the time Nvidia has been stuck on the Geforce line. >>

Rage128 was released in late 1998, the Geforce was released in (IIRC) late 1999... I hardly call that the same time frame. In that time, ATi has gone through two distinct architectures (Rage128 & Radeon; I hardly call the FuryMaxx a distinct architecture) whereas NVidia has gone through 2.5 (the trailing end of the TNT, the GeForce, and the GeForce3). Perhaps you can call the Radeon and Radeon 8500 separate architectures, but essentially ATi's and NVidia's generational architectural changes are paced equally (though somewhat staggered).
 

AA0

Golden Member
Sep 5, 2001
1,422
0
0


<< This thread sure shows how much ground ATi has made since the days when they had Rage Fury MAXX competing against Geforce 256. And sure is good for all of us here :)

Akaz
>>



I still love my Rage Fury MAXX :)
It was a good card for the price when it came out, but it can't run under win2k.... which is the only part that sucks. Drivers are solid on it though.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"Benifiited the user in terms of higher speed. Higher speed = higher benchmarks, in your own terms."

SLI also allowed gamers for the first time to play at 1024x768, which wasn't merely slow before; it simply wasn't possible. Most gamers still play at this res today. The difference between 800x600 and 1024x768 was a major step forward. To this day there are no games that require a res higher than 1024.
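
Rough fill-rate math behind that point; the Voodoo2 figures below (~90 Mpixels/s per board, SLI roughly doubling it) and the overdraw factor are my ballpark assumptions, not exact specs:

[code]
def required_fillrate(width, height, fps, overdraw=2.5):
    """Approximate pixel fill rate (pixels/s) needed to sustain a given mode."""
    return width * height * fps * overdraw

VOODOO2_SINGLE = 90e6             # ~90 Mpixels/s for one board (ballpark)
VOODOO2_SLI = 2 * VOODOO2_SINGLE  # scan-line interleave roughly doubles throughput

for w, h in ((800, 600), (1024, 768)):
    need = required_fillrate(w, h, fps=60)
    print(f"{w}x{h} @ 60fps needs ~{need / 1e6:.0f} Mpixels/s -> "
          f"single: {'ok' if VOODOO2_SINGLE >= need else 'short'}, "
          f"SLI: {'ok' if VOODOO2_SLI >= need else 'short'}")
[/code]

Framebuffer memory per board was the other half of the story, but even the fill-rate side alone shows why 1024x768 only became comfortable with two boards.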

As much as you joke about the innovation of the other companies, NVidia can't even claim anything comparable to those accomplishments.

"Evolution, with every new core(TNT, GF1, GF3), they introduce entirely new features, then they refine that core(TNT2, GF2, GF4)."

Once again, I am starting from the Geforce1, NV10 in your chart. You left out generations as well. The 6 month schedule was Geforce, Geforce Pro, Geforce 2, Geforce2 Ultra, Geforce 3, Geforce Ti, Geforce 4. So basically what you told us is that in 3 years (6 generations of cards) Nvidia released the Geforce 1 with T&L, the Geforce 3 with P&V shaders, and 4 generations of cards that added nothing. How does that disprove my point?

As for P&V, where is this revolutionary feature benefiting us? I haven't seen it anywhere. Carmack has made numerous comments to the effect that this new feature is going to change the industry, and none of them have come true yet. In fact, I even remember him making a comment that Matrox was going to change the industry with the release of the G800. We know how that turned out. When P&V actually does change the industry, let me know. As I recall, T&L in the Geforce was supposed to revolutionize the industry. We still haven't reached that point 3 years later. Feel sorry for the people who bought a Geforce1 thinking they were going to be playing revolutionary games on their new card.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81


<< Then answer me this, what compelling reason has there been to upgrade from a Geforce1 to 2, or 2 to 3, etc besides faster benchmarks? Has there been even one feature introduced since the Geforce that everyone thought, wow, I want that, that will change the way we play games? >>



Programmable pixel and vertex shaders. It's nothing "new," as others had it long before nVidia in other markets, though of course it's new to the desktop market. It means nothing now, as no games really utilize the effects you could design, but it is a TREMENDOUS step in terms of what can be done with it.
It's not a feature I'd go out and buy a card for, as it won't be truly utilized to any wide extent for quite some time, but for software developers it's a huge step... and many things are easily possible that could never have been done before.



<< Yes, slap two TNT2 err I mean VSA100 cores and add 32bit color (supported by G200 for two years) with FSAA capability >>



Sure it added some new features, but if you look at the architectural design the VSA100 was very different from the V3 or any previous 3dfx designs.



<< ATi has gone through the Rage128, FuryMaxx (SLI on a card), and Radeon architecture which are all completely different architectures in the time Nvidia has been stuck on the Geforce line. >>



The Rage 128 and FuryMaxx are hardly two "completely different architectures"; the Maxx is just two Rage 128 cores on one card. I'd say that's little more of an advancement than the GF1 to GF2 was.
The Radeon (R100) is certainly very different architecturally, and the R200 is reasonably different from the R100 core.
In nVidia's case the GF1/2 are similar, but the GF3/4 are fairly different designs. They share a decent number of similarities in the basic idea behind them, but it's been totally revamped in many ways.

Regarding the MAXX, many people have never forgiven ATi for initially promising driver support for NT prior to release and then never being able to deliver.
You can bet they'll never make a mistake like that again; it garnered them an awful lot of negative press.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"Rage128 was released in late 1998, the Geforce was released in (IIRC) late 1999..."

We're nit-picking now. The Rage 128 Pro was released in late 1999, which added better performance, texture compression, AGP support, the Rage Theater chip, and flat panel support, among other things. Regardless, proving the other guys aren't innovative does not prove your point that NVidia is innovative. As I have said, Nvidia has dragged everyone else into the benchmarking race, as it is clear that for some reason that is all most people seem to be concerned with. If you want to remain in the industry, you have to mirror what the leader is doing. All NVidia cares about is faster Quake3 benchmarks, and everyone else has to follow suit; with the 6 month cycle NVidia is on, there is not enough development time for anything truly new to come out from anyone.
 

AGodspeed

Diamond Member
Jul 26, 2001
3,353
0
0
Regardless, proving the other guys aren't innovative does not prove your point that NVidia is innovative.

Then what's your point? If ATi, Matrox, STMicro and the rest are just as innovative as nVidia, then what's the point of specifically singling out nVidia?

And no, nVidia has not been "releasing the same card for 3 years," as most here have already mentioned. Btw, why would you not include the TNT?
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81


<< As for P&V, where is this revolutionary feature benefiting us? I haven't seen it anywhere. >>

I believe Comanche 4 (arcade game for the PC) uses P&V and supposedly it adds something special to the visuals even if the game itself stinks. Linkage to screenshots.
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0


<< Once again, I am starting from the Geforce1, NV10 in your chart. You left out generations as well. The 6 month schedule was Geforce, Geforce Pro, Geforce 2, Geforce2 Ultra, Geforce 3, Geforce Ti, Geforce 4. So basically what you told us is that in 3 years (6 generations of cards) Nvidia released the Geforce 1 with T&L, the Geforce 3 with P&V shaders, and 4 generations of cards that added nothing. How does that disprove my point? >>

I'll nit-pick again :). It's been a little over two years since the introduction of the GeForce. How does that prove your point? From the TNT (spring 1998?) to the GeForce 3 (spring 2001), NVidia has introduced a new architecture every 1.5 years (TNT -> GeForce 1 -> GeForce 3). From the Rage128 (late 1998) to the Radeon 8500 (late 2001), ATi has introduced a new architecture every 1.5 years (Rage128 -> Radeon -> Radeon 8500). As ATi's comparable generational architecture usually comes out around 6 months after NVidia's, it usually has a slightly more robust feature set. I still don't see what you're griping about.
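
A quick back-of-the-envelope check on that cadence, using the rough dates mentioned in this thread (the mid-2000 date I plugged in for the original Radeon is my own estimate):

[code]
from datetime import date

# Approximate launch windows only; these are rough estimates, not exact launch dates.
nvidia = [("TNT", date(1998, 4, 1)), ("GeForce", date(1999, 10, 1)), ("GeForce 3", date(2001, 3, 1))]
ati = [("Rage128", date(1998, 10, 1)), ("Radeon", date(2000, 7, 1)), ("Radeon 8500", date(2001, 10, 1))]

def avg_gap_years(timeline):
    """Average number of years between consecutive architecture launches."""
    gaps = [(later[1] - earlier[1]).days / 365.25
            for earlier, later in zip(timeline, timeline[1:])]
    return sum(gaps) / len(gaps)

print(f"NVidia: ~{avg_gap_years(nvidia):.1f} years per new architecture")
print(f"ATi:    ~{avg_gap_years(ati):.1f} years per new architecture")
[/code]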
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
As for the GeForce --> GeForce 3 architecture, we went from something as simple as a tree demo with a couple of million polygons and simple terrain to extremely complex outdoor engines, with hundreds of millions of triangles and realistic water reflections done with pixel shaders.

Another good example is Dinosaur Isle, which is actually a fully playable demo.

 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
The GF2 GTS and the Pro (that's GeForce 2 Pro, not GeForce Pro) are not different "generations"; they're like the V3-2K and the V3-3K.

And frankly, I don't consider the GF2 Ultra a new generation card either; it was a desperate move, made so they'd have something to show when the time for the refresh came and the NV20 wasn't ready.

Compare cores, not variations of the same cards.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Regardless, proving the other guys aren't innovative does not prove your point that NVidia is innovative.

Then what's your point? If ATi, Matrox, STMicro and the rest are just as innovative as nVidia, then what's the point of specifically singling out nVidia?

And no, nVidia has not been "releasing the same card for 3 years," as most here have already mentioned. Btw, why would you not include the TNT?
>>


It's not that hard to see it. He can't say anything positive about NVIDIA and he singles them out in everything.
 

Agent004

Senior member
Mar 22, 2001
492
0
0
I don't like where this is going; there were fair arguments up until a certain point.



<< It's pretty obvious that we are dealing with a STRICTLY anti-NVIDIA troll here. It's not that hard to see it. He can't say anything positive about NVIDIA and he singles them out in everything. >>



Obviously, from what NFS4 said, we just can't say anything bad about nVidia either.