Klixxer,
The next generation wasn't the 4400's or even the 5500's, it was the Voodoo 3000's.
The 3000 was basically the same speed as an SLI'ed V2, so while I may have skipped it by mistake, my point still stands: 3dfx obviously tried to do without SLI ever since the V2. There must have been a reason for that, and for why nV didn't bother with SLI all this time.
Besides, SLI on one card is SGPU tech, not at all comparable to SLI tech; you need to get a grip on the tech before arguing these things.
SGPU tech? Never heard of it. Are you inventing a new term, "single GPU tech"?
You may have the generations right, but not the concepts. AFAIK, the main reason for 3dfx's SLI was because they were behind on process tech (either their engineers just weren't as good as nV's, or they weren't willing to take as many risks, or they couldn't do what they wanted on a single [reasonably sized] die). nV resurrecting SLI a generation after they hit a limit fabbing the 5800U doesn't seem like a coincidence, though the timing may be right for more than just the gaming market.
Do I still need to get a grip on the tech?
Rollo,
The people SLI will appeal to aren't the ones thinking about the cost of a PSU.
True, which is why I said SLI is great for the high-end. It basically offers you next-gen high-end performance a generation early. But that doesn't really apply to people upgrading $200-400 at a time, as I said.
As far as the 128MB on an NU SLI setup being a limiter goes, I wonder if it will be? Intuitively, you would think rendering half the scene would require half the memory?
I don't think they've solved that yet. If you think of two SLI'ed cards as just a single one with a bridge chip, you'd think it'd be possible, if not feasible. But even when the screen is split, each card still needs its own copy of the textures and geometry, so memory doesn't simply halve. nV's new SLI can apparently do AFR as well, though, and that definitely requires each card to hold the whole scene.
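To put some rough numbers on it, here's a back-of-the-envelope sketch (the megabyte figures are invented for illustration, not measured from any game) of why a pair of 128MB cards doesn't act like one 256MB card: textures and geometry get duplicated on both boards, and only the per-frame pixel work can be split.

```python
# Back-of-the-envelope per-card memory estimate for the two SLI modes.
# All figures are invented for illustration.
def per_card_mb(textures_mb, geometry_mb, framebuffer_mb, mode):
    if mode == "SFR":
        # Split-frame rendering: only the framebuffer/AA work divides.
        return textures_mb + geometry_mb + framebuffer_mb / 2
    if mode == "AFR":
        # Alternate-frame rendering: each card draws entire frames.
        return textures_mb + geometry_mb + framebuffer_mb
    raise ValueError(mode)

print(per_card_mb(90, 10, 28, "SFR"))  # 114.0 -- barely below 128
print(per_card_mb(90, 10, 28, "AFR"))  # 128   -- no savings at all
```

Obviously the split in real drivers is more complicated than that, but the point stands: the duplicated stuff dominates.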
My point is that for a year and a half ATI said PS2 was the defining factor in buying a video card.
It's certainly a valid factor when your competitor is only as fast as you in PS1 and very uncompetitive in PS2. It was the defining factor in terms of longevity, but ATi also offered nicer AA and faster AF. Look, ATi had a lot of cards to play with the 9700P. I'm not sure why you begrudge PS2 performance, as it was valid. Every single PS2 benchmark showed ATi ahead, even the questionable ones. So why shouldn't they tout it? Just like SM3.0, it's better to have it than not to have it, no?
That they are not comparable is my point exactly. Here we are two years after the launch of the "Gotta have PS2" 9700P. How many PS2 games can I go to the store and buy?
Enough that it was worth considering. Don't you think Far Cry alone is enough to justify SM2.0? It's sort of like SM3.0--not necessary, but nice to have. And decently fast PS2.0 support turned out to be nice for the 9700P, at least nicer than for 5800/5900 owners.
At least with PS3, Far Cry has shown us some benefit, and it actually runs well on the PS3 cards. Like I said, the next year will tell us a lot more about the necessity/utility of PS3, and whether TWIMTBP developers will retro-code to give ATI users the same performance.
And Far Cry has shown some benefit with the 9700P's PS2 performance when compared with the 5900. And we were all saying the same thing about PS2 back then, weren't we? "The next year will tell us a lot more about the nec/util of [PS2]." And the benefits we saw with SM3 in Far Cry weren't all because of SM3. A lot of it was Crytek more fully exploiting PS2 to incorporate more lights per pass.
1. ATI lied to me and everyone else about their trilinear, while intentionally trying to deceive the press to make their parts look better.
Not the same as 3DM03, as most couldn't tell the difference in real life, and it applied to all games. nV's 3DM03 cheats applied to a single application, and a benchmark at that. Dude, trylinear was in ATi parts since the fecking 9600 and NO ONE NOTICED. (Actually, one person did. Want to guess who? Yeah, that ATI apologist, Wavey.) You're not going to get magical DX9 improvements with FX cards unless the dev or nV switches to mostly DX8. Is the IQ difference huge? Doesn't seem so. But nV was lying outright when they twisted 3DM03 to show similar DX9 performance. Was ATi forcing your eyeballs to lie?
2. When caught, they re-defined trilinear so they could say "We didn't lie". (Must have been watching Clinton's impeachment when he explained how he didn't have sex with Monica)
Granted, but this isn't any worse than 3DM03, IMO.
3. Brought out the same damn part three calendar years in a row. I won't buy it next year either, if they try to trot it out yet again.
Yeah, uh, that didn't stop you from liking nVidia in their GF1>GF2>GF2U or GF3>GF3Ti>GF4 eras, did it? In the end, they performed at the top of their field, and that's all that matters. No matter how much you or I want ATi and nV to release new tech every six months, it ain't gonna happen when most people buy $80 video cards.
4. Lost edge on features.
5. Didn't offer an answer to SLI.
6. Brought nothing to the table for my favorite game in the last few years, Doom3. Will likely lag at its licenses, which I will buy.
Granted. The last three I buy, but the first three are mostly BS, and you know it.
Anand says lots of people bought SLI.
Seriously, Rollo? You're going to use Anand's quote that "quite a few people" SLI'ed their $300 V2 ten years ago as a rebuttal to fewer people SLI'ing their $300-500 6800s, when the latter will require a new MB and probably a new PSU? Honestly, what's the point of debating something with you when you trot out an Anand quote as proof in a 3D card discussion? He's a smart guy, but he's not exactly the messiah when it comes to video cards. Show me figures, not a throwaway line in an internet article, if you want to back up that rolleyes.
If you had been doing the same thing during the cheerleading (particularly in the early days) of SM2, I wouldn't see any problem with it, but it is coming off as one-sided when you choose to defend/apologize for ATi and tend not to do the same with nV.
I was fooled once with SM2, so I think it's understandable that I'm not as excited about SM3, no? But I also haven't read anything about SM3 that'll translate to in-game superiority in terms of either IQ or performance, and that's an equal reason for my cool reaction.
Fvck nVidia and their hype. This type of 'SLI' offers flexibility and significantly more power; how is either of those things remotely close to bad?
Agreed, fvck hype, but, again, I haven't knocked SLI's greater power. I am questioning SLI's flexibility in the $200 card segment, but how is that remotely close to saying it's bad? I'm undecided as to its benefit in the midrange, and fvcking impressed with its benefits in the high end. And thus my irritation when people start labelling me as an nV hater because I don't accept nV's SLI as perfect on every level. It's perfect on one, the high end; the rest, I'm unsure about. Is that really reason enough to call me anything worse than less than frothing at the mouth?
You're better than that, Ben.
Clauzii,
By the time most people can afford the current dual-graphics motherboards, they'll be obsolete!!!!
Actually, SLI will be much more attractive when dual-PEG MBs are standard (thus removing that initial entry barrier). As it is, the barrier stands, thus my ambivalence toward SLI for $200-300 video cards.
Gamingphreek,
So if you can run at a max of 800x600 on one card, then if you get two you should be able to run at 1600x1200, because each card does half the screen. There is no combining things.
1600x1200 is 4x(800x600). You meant 1600x600, or maybe 1152x768. Yes, you can call me Mathphreek. It's preferable to nitpicker, anyway.
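If you want to check the math yourself, here's a throwaway snippet (nothing GPU-specific, just multiplication):

```python
# Pixel counts: doubling both dimensions quadruples the pixels per frame.
def pixels(width, height):
    return width * height

print(pixels(800, 600))    # 480,000
print(pixels(1600, 1200))  # 1,920,000 -> 4x 800x600
print(pixels(1600, 600))   # 960,000   -> exactly 2x 800x600
```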
And, finally, back to Rollo.
The 9700Pro smoked 4600s for six whole months before 5800Us came out and offered comparable if not better performance at all playable settings. Then the 5900s came out and evened the playing field again. There was no "Golden Era of 9700Pro Domination" unless you call the six months the 5800U was delayed due to TSMC's failure an "era".
The 9700P dominated for as long as the 4600 did, about a generation, which is really all a card needs to dominate before its replacement comes along. And you're the only one using the term "golden era."