Hey Robotech, let me make these points clear.
Robotech wrote
>
>I find 3 things very interesting about all these previews I'm reading:
>
>1) Same fillrate and memory throughput as the Ultra
>
This is not correct. Although the theoretical fill rate is almost the same between the two, the actual fill rate of the GF3, especially at 32-bit and with FSAA, is much better than the Ultra's (around an 80% performance gain over the Ultra). I think this is what boosts game performance at 32-bit with FSAA (1600x1200x32 with Quincunx FSAA at 71 fps).
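That 1600x1200x32 + Quincunx figure can be sanity-checked with back-of-envelope arithmetic. This is only a rough sketch: the per-sample byte costs, the 3x overdraw factor, and treating Quincunx as two samples per pixel are my assumptions, and it ignores texture traffic entirely. The point is that the raw frame-buffer traffic comes out *above* the board's raw bandwidth, which is exactly why the GF3's bandwidth-saving memory architecture (rather than raw fill rate) would explain the gain:

```python
# Back-of-envelope: frame-buffer traffic for 1600x1200x32 with 2-sample
# (Quincunx) FSAA at 71 fps, versus raw board bandwidth. Per-sample byte
# costs and the overdraw factor are rough assumptions, not measured data.

width, height, fps = 1600, 1200, 71
samples = 2              # Quincunx renders roughly 2 samples per pixel
bytes_color = 4          # 32-bit color write per sample
bytes_z = 8              # Z read + write per sample (32-bit depth)
overdraw = 3.0           # assumed average scene overdraw

pixels_per_sec = width * height * fps
traffic = pixels_per_sec * samples * (bytes_color + bytes_z) * overdraw
print(f"frame-buffer traffic: {traffic / 1e9:.1f} GB/s")

# GeForce3 board bandwidth: 128-bit DDR at 230 MHz
bandwidth = 230e6 * 2 * 16   # transfers/s * bytes per 128-bit transfer
print(f"raw memory bandwidth: {bandwidth / 1e9:.1f} GB/s")
```

With these assumptions the naive traffic (~9.8 GB/s) exceeds the ~7.4 GB/s the bus can deliver, so without Z-compression and a smarter memory controller that 71 fps would not be reachable.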
>2) FSAA, a feature that was once "downplayed" is now being hyped
>
I don't agree with you guys about the FSAA thing. I still like to play one of my favorite old F1 games, which only supports 640x480, and FSAA makes the rendering quality significantly better than without it (on a Radeon). As I said, high-contrast games (like the ones that take place in open environments) benefit a lot from FSAA. Even a 1x2 mode looks far better, since while playing a game you don't stop and analyze the scene to check whether the card does good anti-aliasing.

I think with this card Nvidia did what they should have done a year ago, by sacrificing their so-called 6-month product cycle. If they had kept up that aggressive cycle, all we would get is more marketing hype: a T&L engine that doesn't work so well, a poor FSAA implementation, a bad, bad, bad memory architecture, etc. Maybe we have to thank Microsoft, since these guys were finally forced to produce a really decent chip.
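To illustrate why even a cheap 1x2 mode helps: the card renders at twice the vertical resolution and averages each vertical pair of samples down to one pixel, so hard edges get intermediate shades instead of stair-steps. A tiny sketch (the `shade` function and its hard diagonal edge are a made-up stand-in for a real scene, not how any particular card works internally):

```python
# Tiny sketch of 1x2 supersampling FSAA: render at twice the vertical
# resolution, then average each vertical pair of samples into one pixel.
# The "scene" is just a hard black/white diagonal edge.

W, H = 8, 4  # final resolution; the supersampled buffer is W x 2H

def shade(x, y):
    # hypothetical scene: 1.0 above a diagonal edge, 0.0 below
    return 1.0 if y < x else 0.0

# render at W x 2H, sampling at pixel centers of the tall buffer
hi = [[shade(x + 0.5, (y + 0.5) / 2) for x in range(W)] for y in range(2 * H)]

# downsample: each output pixel averages its two vertical samples
lo = [[(hi[2 * y][x] + hi[2 * y + 1][x]) / 2 for x in range(W)] for y in range(H)]

# edge pixels now take intermediate values (0.5) instead of snapping to 0/1
print(lo[1])
```

Along the edge the output row contains a 0.5 between the 0.0 and 1.0 runs, which is the smoothing you notice in motion even without stopping to pixel-peep.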
>3) Every review is finally admitting the REAL reason why almost no games used the >GeForce's T&L engine - cuz it was lame
>
Totally agree... Lots of marketing hype over a 10-15 fps difference on low-end machines (I admit I benefit, though, since I still have a P-II 350)... At least they are on the right track now.
>the more I read, the less impressed I am.
Actually, I am far less impressed by the old GF2/GF products. These guys came up with fill-rate numbers like 1.6 Gtexel/s, which they can never, ever reach, especially in 32-bit. I read lots of articles blaming ATI for their drivers, but in real-world benchmarks their Radeon chipset scores on par (and sometimes better) above 1024x768x32, despite a maximum fill rate of only about 1 Gtexel/s. Until now Nvidia has always delivered products with lots of marketing hype, and this is the first time I see something really innovative coming from them. I wish 3dfx were still on the market, and I hope ATI comes up with another decent chip, so that the competition isn't over for at least a few more years.
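The "can never reach it in 32-bit" point can be shown numerically. Assuming the commonly quoted GeForce2 GTS figures (200 MHz core, 4 pipelines with 2 texture units each, 166 MHz DDR on a 128-bit bus) and a rough 12 bytes of frame-buffer traffic per pixel (my assumption: color write plus Z read/write, no texture traffic counted), the memory bus caps the sustained pixel rate well below the theoretical number:

```python
# Why the GeForce2's quoted 1.6 Gtexel/s is out of reach in 32-bit:
# the memory bus cannot feed the pipelines. The per-pixel byte cost
# (32-bit color write + Z read/write) is a rough assumption.

core_mhz, pipes, texels_per_pipe = 200, 4, 2
theoretical_texels = core_mhz * 1e6 * pipes * texels_per_pipe
theoretical_pixels = core_mhz * 1e6 * pipes

mem_bandwidth = 166e6 * 2 * 16        # 128-bit DDR at 166 MHz, bytes/s
bytes_per_pixel = 4 + 8               # 32-bit color write + Z read/write
bandwidth_limited_pixels = mem_bandwidth / bytes_per_pixel

print(f"theoretical: {theoretical_pixels / 1e6:.0f} Mpixel/s "
      f"({theoretical_texels / 1e9:.1f} Gtexel/s)")
print(f"32-bit bandwidth limit: ~{bandwidth_limited_pixels / 1e6:.0f} Mpixel/s")
```

Under these assumptions the 800 Mpixel/s (1.6 Gtexel/s) theoretical rate collapses to roughly 440 Mpixel/s at 32-bit, which is why the Radeon's lower paper number doesn't hurt it in real benchmarks.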
PS: I totally agree with John Carmack. This is the first time since the Voodoo2 that I am really impressed by the capabilities of a chip. And shame on 3dfx, because what John Carmack wants from a 3D chip (increased internal/external precision, like 64/128-bit color) was implemented in Rampage (remember the appearance of Spectre on their web page), most probably along with loads of other new features that will be missing in the GF3. 🙁
Best