Hi, just joined the forum..

...and read through all 50+ posts. ^^
I think it's good that we're actually having this debate in the first place. It's just like a while back, when it was 3dfx vs Rendition (the V2100/V2200 series still had the best 3D graphics quality to date... until the Radeon showed up), and then 3dfx vs nVidia. It's been a long time since anybody even raised the question of which company's product is superior (nVidia, of course, is still superior!).
nVidia has come a long way, but ATi has always been there. They still hold the biggest market share, especially owning over half the OEM market (per Q1 industry reports).
People tend to get so driven they forget the fundamentals... as customers, you get what you pay for. Paying over $350 for a video card oughta get you top-notch performance and quality, and in the case of the GF3, it does. However, paying $70 for a Radeon LE should, in a business sense, get you a piece of crap... but it doesn't. It gets you a video card that, once properly set up, gives a GF2 a run for its money.
nVidia took a strong stance a few years back, claiming that there's a point where such high FPS isn't needed and the focus should be on image quality. Compared to a $70 Radeon LE, the GF2 Ultra and even the GF3 (without DX8 support in today's games) seem to fail utterly in graphics quality. This is exactly like the TNT vs Voodoo2 debate, where the V2 could push polygons way faster, but the TNT looked better.
I'm sure when the GF3 is properly used, its graphics will be outstanding. But then again, if the Radeon's programmable pixel shader were used properly, it would look amazing as well. It's a fact that it won't be used, though. This isn't because game developers think it's inferior; it's because nVidia and Microsoft have a contract to support each other due to the XBox fiasco. As it stands, feature-wise, nVidia with MS support will always be one step ahead. But not for long: the Radeon 2 will be fully DX8 compliant...
As for why overclocking the Radeon won't give a huge performance yield: it simply only has 2 rendering pipelines. Most current games use only 2 textures per pixel, so the Radeon's 3rd texture unit per pipe is wasted, making it only half as effective as a GF2. The GF2, with its 4 pipelines, can deliver a lot more. However, as new games come out that use 3 textures per pixel, you will see the GF2/GF3 being crippled: if a game needs 3 textures per pixel, the GF2's 4 pipes (the 1st pipe renders 2 textures, the 2nd pipe renders 1 extra, with 1 remaining texture unit wasted) are only equal to the Radeon's 2 pipes.
Let's face it, graphics will only become more and more complex. Why would developers use only 2 textures when they can use more to make things realistic? Especially now that there won't be a performance loss.
The Radeon 2 will boast (I think, from the latest article I read) 4 pipes with 3 texture units each. This will be huge compared to the GF3; this alone will kick it in the nuts... if what the white paper says is true. It's up to ATi to deliver, and we shall wait.
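For anyone who wants to play with the pipes-vs-textures math above, here's a quick back-of-envelope sketch. Fair warning: this is my own simplification, not anything from a spec sheet. It assumes single-pass multitexturing and ignores memory bandwidth, TMU loopback tricks, and driver magic, so treat the numbers as illustrative only.

```python
# Rough clock-for-clock pixel throughput when every pixel needs N textures.
# Simplified model (my assumption, not a vendor spec): a pipe applies at
# most tmus_per_pipe textures per pixel per clock, and if a pixel needs
# more textures, throughput is capped by total texture units available.

def pixels_per_clock(pipes, tmus_per_pipe, textures_per_pixel):
    """Estimate pixels rendered per clock at a given texture load."""
    total_tmus = pipes * tmus_per_pipe
    # Can't do more pixels than pipes, and can't apply more textures
    # than the total number of texture units per clock.
    return min(pipes, total_tmus // textures_per_pixel)

cards = [
    ("Radeon   (2 pipes x 3 TMUs)", 2, 3),
    ("GF2/GF3  (4 pipes x 2 TMUs)", 4, 2),
    ("Radeon 2 (4 pipes x 3 TMUs, rumored)", 4, 3),
]

for name, pipes, tmus in cards:
    for n in (2, 3):
        rate = pixels_per_clock(pipes, tmus, n)
        print(f"{name}: {rate} pixels/clock at {n} textures/pixel")
```

Under this toy model the GF2 does 4 pixels/clock at 2 textures but drops to 2 at 3 textures, matching the Radeon exactly, while a rumored 4x3 Radeon 2 would hold 4 pixels/clock even at 3 textures per pixel.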
Btw, I don't dislike nVidia; in fact I think they were great (I used a TNT1, TNT2, and GF2)... until they decided to overcharge consumers for their products. Without companies like ATi and ST Micro, we'd see the day nVidia retails their mainstream video cards for $1,000, just as they originally planned to sell the GF3 for $500... bloody ridiculous!! But what's worse is that now they've joined forces with the "other" monopoly, MS... they've got a plan for worldwide domination, I bet ya!
Can't wait for the GF3 Ultra vs Radeon 2 vs Kyro 3 debate in a few months!!

Hot damnit, I wish Rendition were still around... *sob*
