Old Fart, I'm impressed, you've not only read what I've said but remembered.
My current stance may seem hypocritical on the face of it, I imagine. Here's why it's not:
Last year, when you downgraded from a 256-bit PS 2.0 9800P to a 128-bit PS 1.4 5800U, you posted that it didn't matter what the technology was as long as the cards had about the same performance.
You've mixed a few things together here, but I'll address each:
a. I went from a 9700Pro to a 5800NU that I clocked at Ultra level without a problem. There's much less difference going from a 9700Pro to a 5800U; they perform approximately equally. One has a little higher memory bandwidth, the other has a higher fill rate. I found that I wasn't hitting the memory bandwidth limitation at the settings I ran. Upon going from a 9800Pro to an actual 5800U this year, I noticed AA/AF didn't seem as good, but overall I'm still loving the 5800U. (It's no slouch; it still performs at ~9700P level.)
b. PS 2.0 wasn't an issue last year. The only game that used it at all was Wallet Raider: Angel of Sloppy Code, and no one really played that.
This year, you are all about having a newer core, and are against ATI's "2 year old core", even though the cards perform ~the same.
Personal preference. I've bought that core feature set twice for $400; there's a new core/feature set to buy now. (I also bought two usable 5800s, so it seems fair.) If the situation were reversed and ATI had the new tech with no overriding reason not to buy it, I'd buy ATI.
Last year, when the visual differences between PS 2.0 and 1.4 were shown, you were not impressed. "Ooooooh shiny water.....oooohh shiny pipes....spank spank" or something like that. Your opinion was that the PS 2.0 vs. 1.4 visuals were not important anyway.
That was my opinion; I didn't think the shinier water of the famous HL2 comparison screen was that big a deal. (The pipes did look better, but not enough to make me buy a video card based on that.) Remember, there were no games with PS 2.0 effects last year, so they were easy to dismiss. I think we'll see much more of SM 3.0 in the next 12 months than we saw of PS 2.0 in the last 12.
This year, SM 3.0 is something that can't be done without. Are you now impressed with the PS 2.0/SM 3.0 visuals? No silly comments about the lighting effects shown here?
I haven't seen enough comparisons to say. For me, my hunch that nVidia will be anxious for developers to use their new card's now-exclusive functionality, coupled with my lack of desire to buy a faster version of a card I've already bought twice, points me toward nVidia this round. (At least when I don't have to monitor inventory and pay $100 over MSRP :roll: )
Last year, you didn't have any gripe with nVidia's brilinear or other optimizations.
This year, you are outraged because ATI has a similar optimization.
Two reasons:
1. nVidia gives you the option to turn off their optimizations and run true trilinear.
2. I'm giving those who said they would never buy from a company that built image-degrading optimizations into their drivers and hid the fact a chance to not be hypocrites and say the same about ATI. Not only did ATI do the exact same thing, but they made it worse by insisting reviewers turn off nVidia's optimizations for an accurate comparison to their "true trilinear". It's funny how a lot of the same people who stuffed that down my throat like Archie Bunker choking on a brat are now saying, "It doesn't matter, ATI's brilinear has a better algorithm, so who cares?" even though IQ is degraded.