Old Fart:
I guess I'll have to forget about calling you "Agent Mulder" and call you "Nostradamus":
They did. It's the R420. You would find some excuse not to like it no matter what. Even if it had newer tech, you would find it "boring" and find an excuse to go nVidia anyway.
Behold, ladies and gentlemen, a man who has never met me, yet can see into my future and tell me what I "would" do.
Makes sense to me. Old Fart, can you pick some stocks for me?
And did those cards just evaporate this year? People still own them. Have any new games come out?
I consider the useful life of a card one year. If you think using two year old cards is good, enjoy your sacrifices.
Again. From the guy who said not to buy a card based on future games when ATi had a leg up. By your old logic, it doesn't matter since new cards will be out by then anyway. Now all of a sudden your card has to be future proof.
Based on the information I had at the time, I didn't think there would be many PS2 games out within a year of the release of the 9800/5900 generation of cards. Time proved me right; it wasn't an issue with those cards for that year.
I also said both those cards' shaders were far too weak for PS2 games, and that by the time there were PS2 games, those cards' PS2 performance would seem like a joke. Time proved me right. Check out some Far Cry benchmarks of the 9800/5900 vs. the X800/6800 if you don't believe me.
It's called an R420. Even calling it R300 is your little way of putting in a dig. R300 = 9700. R420 = X800. At least use the right terminology. Funny, when nVidia was behind the technology curve with nV30, you went on about how the core tech didn't matter as long as you got the same performance (nV3x vs. R3xx). Now that nVidia has a leg up, you conveniently reverse that to suit the pro-nVidia agenda.
It's called the 9800U as far as I'm concerned. A tweaked memory controller and more pixel pipelines is not some GPU renaissance.
The nV30 was more advanced than the R300 in some ways, less in others. Carmack couldn't even use an R300 on Doom3 due to its feeble instruction limits. These days, no one can use an X800 to develop, because why would you code in crappy partial-precision DX9b when you could use DX9c/SM3 and do your job much easier? Yet again, ATI leaves developers hanging with their primitive 2002 tech, while nVidia gives them tools that can produce games that will be relevant in the next couple of years. Might have something to do with why "TWIMTBP" is on most game boxes and I've yet to see a GITG logo?
This is not relevant to this discussion, but most of them, with the exception of RTCW, were pretty lame SP games. No story, just decent shooters.
I like to play shooters? It's relevant to the topic because the D3 engine is one of the reasons I prefer nVidia this generation. If the 9800Ultra were the card to have for Doom3, it's likely I'd own one, and it's likely you would still be calling me an nVidia fanboy.
You are "solidly in the nVidia camp". You can deny it all you like; your posts clearly prove otherwise.
Err, sure thing, Old Fart. Actually, I am solidly behind the nV40 this gen, as I don't see any reason anyone would prefer an X800 to it. I'll be back in ATI's camp again when they make something new, instead of trying to re-sell the same stuff I bought from them the last two years.
I'm not the only one around here that has made this observation.
I'm not the only one around here who has made the observation that I just like new tech video cards, no matter who makes them? So the fact that some other people agree with you is supposed to prove something?
LOL - I forget the name of this logical fallacy, but the gist of it is that many people can share a common misperception. (I.e., if the Three Stooges all think gravity will stop working tomorrow, and I disagree, they're not right just because all three of them think so.)