Rollo's 6600GT SLI benches

Originally posted by: otispunkmeyer
Originally posted by: malak
Originally posted by: HardWarrior
Originally posted by: malak
I'm sorry, but SLI has one marketable feature. Buy one really good card now that can handle everything, then 2 years from now when it can't play things on high settings, buy another one for a very low price and you now can have all that performance again.

This may not be possible, at least not with the current state of SLI. The cards have to be identical in every respect. Two years is a long time in GPU years, meaning you might not be able to find an exact twin so far down the road.

People are still selling geforce3's, so I'm pretty sure you can find a geforce6 in 2 years.


People are still selling TNT2's, so yes, I'm sure you'll be able to.

What if you had to find a specific revision of a TNT2?
 
Originally posted by: malak
In that case, there is no good use for SLI left.

Not true. Like Rollo and SB said, if you want a smoking-fast rendering subsystem, and the bragging rights that go with it, RIGHT NOW, SLI is it. BTW, it is okay to look at more than one aspect of a thing all at once, malak. 😉

 
Originally posted by: James3shin
once again rollo, your desire for that new edge in technology came through huh? 😛 Thanks for the benchies, SLI does indeed sound like a very nice option for those that want to expand later on at a fraction of the cost. Oh and I expect you will be on the ATI version of SLI when that time comes around as well 😀 :beer:

I have bought ATI's and nVidia's flagship products for many years; I would imagine the R520 and ATI SLI will pique my curiosity as well. 😉
 
Originally posted by: Rollo
Thanks for the welcome, link to the "OC around the clock" party? I don't know, but they might with their paper launch of the R520 next month?

Link I doubt if it is the r520 preview, likely just a very fast x850 super cooled, but also possibly an introduction to a dual gpu setup. More expense and probable driver problems?

 
Originally posted by: ronnn
Originally posted by: Rollo
Thanks for the welcome, link to the "OC around the clock" party? I don't know, but they might with their paper launch of the R520 next month?

Link I doubt if it is the r520 preview, likely just a very fast x850 super cooled, but also possibly an introduction to a dual gpu setup. More expense and probable driver problems?

Wow. I can't believe they're doing that, but I suppose it's their own bottom line they're subtracting from by encouraging this, so who am I to judge?

What a change from when I emailed them a few years ago and they said "Our products are clocked as high as they can safely be run. Any increase would void your warranty".

I'm guessing that the release of SLI and overall success of the 6600GT, 6800NU, 6800GT lines have ATI looking for ways to court the enthusiast market after a year of dismal yields and no available product. They probably want to re-kindle interest in their line and are doing so by playing to the "something for nothing" crowd.
 
Certainly is for publicity, but likely not aimed at the enthusiast market, as they easily sell all the top of the line cards they make. More likely aimed at the mid-range store shelf market. Gives the sales guy a nice hook. Anyway, they must be pretty confident or they wouldn't do all this hoopla. I can't see it being the r520, as that would immediately kill the market for high end stuff. I have assumed that both companies have learned their lesson and launches will become more realistic. I hope it is not another clumsy dual gpu thing, as practical applications seem limited at this time.
 
Originally posted by: HardWarrior
Originally posted by: malak
In that case, there is no good use for SLI left.

Not true. Like Rollo and SB said, if you want a smoking-fast rendering subsystem, and the bragging rights that go with it, RIGHT NOW, SLI is it. BTW, it is okay to look at more than one aspect of a thing all at once, malak. 😉

I've looked at it from every aspect, and it is found lacking in each one.
 
Originally posted by: ronnn
Certainly is for publicity, but likely not aimed at the enthusiast market, as they easily sell all the top of the line cards they make. More likely aimed at the mid-range store shelf market. Gives the sales guy a nice hook. Anyway, they must be pretty confident or they wouldn't do all this hoopla. I can't see it being the r520, as that would immediately kill the market for high end stuff. I have assumed that both companies have learned their lesson and launches will become more realistic. I hope it is not another clumsy dual gpu thing, as practical applications seem limited at this time.

I don't know, I loved my MAXX for the same reasons I love SLI. Unusual, performance oriented hardware to tinker with, without voiding any warranties. 🙂
 
Originally posted by: HardWarrior
Based on some of the things you've said here I find that hard to believe. But okay... 😉

In all my posts I've looked at performance of SLI in today's games, use of SLI being a future upgrade path, and also how long SLI will last compared to just buying one card. In each aspect, SLI is a waste of money, which I have shown.
 
Originally posted by: malak
Originally posted by: HardWarrior
Based on some of the things you've said here I find that hard to believe. But okay... 😉

In all my posts I've looked at performance of SLI in today's games, use of SLI being a future upgrade path, and also how long SLI will last compared to just buying one card. In each aspect, SLI is a waste of money, which I have shown.

The voice of reason. 😉

Mind telling us what you're using, since you've deemed SLI such a heinous crime for the rest of us?
 
Originally posted by: SickBeast
The technology is in its infancy and I'm sure the bugs will get worked out over time if nVidia dedicates a long-term commitment to it. That said, I don't know why they don't just develop multi-cored GPUs. The first of the two companies to do so will have the next R300 on their hands if they pull it off correctly.

3DLabs has been making "multi-cored GPUs" for years now. Look what has happened to them. 🙁
 
Originally posted by: SickBeast
What if you had to find a specific revision of a TNT2?
The same thing was largely true in the days of the V2 SLI cards - if you had a non-reference-design V2 card, then you would have to find an exact identical match, in order to get a functional SLI setup, I believe.
 
Originally posted by: VirtualLarry
Originally posted by: SickBeast
What if you had to find a specific revision of a TNT2?
The same thing was largely true in the days of the V2 SLI cards - if you had a non-reference-design V2 card, then you would have to find an exact identical match, in order to get a functional SLI setup, I believe.

You are correct. Some V2s matched, some did not.
 
Originally posted by: ronnn
Originally posted by: Rollo
Thanks for the welcome, link to the "OC around the clock" party? I don't know, but they might with their paper launch of the R520 next month?

Link I doubt if it is the r520 preview, likely just a very fast x850 super cooled, but also possibly an introduction to a dual gpu setup. More expense and probable driver problems?

No. ATI has already announced that it will be a completely new architecture. Especially since they need 32bit FP support in order to achieve SM3.0 certification.

-Kevin
 
ATI has also hinted at a dual gpu solution. I am not doubting the eventual release of the r520; I just don't see them releasing it early at an OC around the clock event. I just wonder what they will use to catch the paper release of the dual gpu 6800 Ultra record.
 
Malak:
And 2 budget cards in SLI that can't match the performance of the one card you already had, yeah that's not a waste of money.

Hmmm, well, you bought an X850 XT PE, which has only 20MHz more on the core and 100MHz more on the RAM than an X800 XT, yet probably cost you a good deal more.

So apparently you don't mind "wasting money" either, and we only differ on the amount we're willing to "waste"? 😉

I already have an X800XT PE, so I know what that is. What gamer wouldn't want to try SLI in those circumstances? That you choose not to is no reason to question my choice to do so.
 
What is up with all the hate towards SLI? It seems that if it was good enough for NV, ATi deemed it good enough for them too, since ATi has stated they will have their own version of SLI.
 
Originally posted by: James3shin
What is up with all the hate towards SLI? It seems that if it was good enough for NV, ATi deemed it good enough for them too, since ATi has stated they will have their own version of SLI.

I don't think there is any hatred towards SLI as a feature, more the implementation. The fact that Nvidia has to individually support every game makes it almost worthless, IMO. Gamers who are willing to spend the type of money SLI requires probably play all types of games. If your fancy 6600GT SLI rig effectively becomes one 6600GT for many of your games, what's the point? One 6600GT gets absolutely crushed by any 6800GT/X800 level card.

The second problem is that two 6600GT's in SLI make very little sense. As shown in benchmarks on AnandTech and other sites, two 6600GT's get outperformed by one 6800GT, sometimes significantly, when using AA/AF and high resolutions. I'm going to assume anyone spending $400+ on a video card setup is going to use AA/AF and high resolutions - making the two 6600GT's a pretty poor option. SLI motherboards come at a $50-60 premium - you're paying more money for less performance even in SLI compatible games, while at the same time getting poor performance in games that don't support SLI.

If you want to get a PCIe card, the X800XL makes the most sense now. It's faster than two 6600GT's, and even though retailers are still price gouging, you can get it for $370. Two 6600GT's plus the $50-60 extra for the SLI motherboard will run you more than that.
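As a rough sketch of the price arithmetic: the ~$370 X800XL price and the $50-60 board premium come from this thread, but the per-card 6600GT price below is purely an assumed figure for illustration.

```python
# Illustrative street prices. Only the X800XL price and the board
# premium are cited in the thread; the 6600GT price is an assumption.
price_6600gt = 190        # assumed per-card price
sli_board_premium = 55    # midpoint of the $50-60 premium cited above
price_x800xl = 370        # cited above

sli_total = 2 * price_6600gt + sli_board_premium
print(f"dual 6600GT setup: ${sli_total} vs. one X800XL: ${price_x800xl}")
```

With those assumed numbers the dual-card route costs $65 more for what the benchmarks above suggest is often less performance.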

SLI really only makes sense if you get two 6800GT's or 6800 Ultras, where you will still get good performance in games that don't support SLI.
 
Originally posted by: VirtualLarry
Originally posted by: SickBeast
The technology is in its infancy and I'm sure the bugs will get worked out over time if nVidia dedicates a long-term commitment to it. That said, I don't know why they don't just develop multi-cored GPUs. The first of the two companies to do so will have the next R300 on their hands if they pull it off correctly.

3DLabs has been making "multi-cored GPUs" for years now. Look what has happened to them. 🙁
They pulled the trigger on dual-core a little too fast then. 😉

It seems that only now are the CPU makers "hitting the wall" in terms of single-core designs. Based on the fact that high-end graphics cards now require massive dual-slot coolers and reach enormous temperatures, I'm thinking that they are nearing this threshold as well.

3DLabs still makes the Realizm workstation cards, which are really the cat's meow for CAD monkeys like me, you know. I've never heard of their multi-cored products. Did they really perform all that poorly?

I guess the other option is to go multi GPU, which has been done in the past by both 3DFX and ATI, and now there's a 6600GT that does it also (gigabyte?). I don't think it's quite as elegant a solution as going dual-core though; each GPU requires its own memory bank, which in turn means that a 256mb card isn't *really* a 256mb card. Correct me if I'm wrong please. 🙂
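The memory point can be sketched in a few lines. This is a simplified model, assuming SLI-style rendering mirrors textures and geometry across GPUs rather than pooling memory:

```python
def effective_vram(per_gpu_mb: int, num_gpus: int, mirrored: bool = True) -> int:
    """Usable graphics memory as seen by an application.

    In SLI-style multi-GPU rendering each GPU keeps its own full copy
    of textures and geometry, so memory is mirrored, not pooled.
    """
    return per_gpu_mb if mirrored else per_gpu_mb * num_gpus

# Two 128 MB GPUs on one "256 MB" board still behave like ~128 MB.
print(effective_vram(128, 2))         # mirrored: 128
print(effective_vram(128, 2, False))  # hypothetical pooled case: 256
```

So a dual-GPU board advertised by its total memory effectively offers only one bank's worth to the application.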
 
Why not support SLI? It is not a technology that is right for me at this moment, but it apparently is for others. Some of the tech is starting to top out. It seems the computer industry is moving to parallelism on the desktop to achieve more performance, rather than radical new tech, similar to server models. If SLI is a mainstream alternative, programmers will design for it.

Upgrading: You can say that it will be hard to find a match to your video card down the road. Recall the recent move to dual channel memory. I had no problems at all finding a match to my 256mb stick at the time. Same when I had a VP6; gathering up matched cpus was no great chore.

Downgrading: When my VP6 was relegated to a lesser use machine, I popped out the 2nd cpu and a stick of the ram to partially fund my upgrade. I would have loved to be able to split the video card as well. In Rollo's case, he passed on an X850 when he got the SLIs. As a parent of teenage game-playing rats, I like my kids to be able to enjoy the finer things in life too. A 6600GT would serve them just as well at the max res of their monitor as 1/2 of an X850, but that cannot be done without modularity.

GPUs are already at 200+ million transistors and only going to grow. Would it not be easier to manufacture two smaller cores at, say, 60-70% of the size without defects? i.e. more cores to speed bin and match, fewer huge defective cores to toss?
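The yield question above can be put into a quick back-of-the-envelope sketch using the classic Poisson defect-density yield model; the die areas and defect density below are purely illustrative assumptions, not figures from any real process.

```python
import math

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of dice that come out defect-free under a Poisson defect model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Illustrative assumptions: a hypothetical 3.0 cm^2 monolithic GPU
# vs. two dice at 65% of that area each, at 0.5 defects/cm^2.
d0 = 0.5
big_area, small_area = 3.0, 3.0 * 0.65

big = poisson_yield(big_area, d0)      # yield of the single large die
small = poisson_yield(small_area, d0)  # yield of one smaller die

# Good GPUs per unit of wafer area (a pair of small dice = one GPU).
gpus_large = big / big_area
gpus_pair = (small / small_area) / 2

print(f"large-die yield: {big:.1%}, small-die yield: {small:.1%}")
print(f"relative GPUs per wafer (pair vs. monolithic): {gpus_pair / gpus_large:.2f}x")
```

Even though a "GPU" now needs two good dice, the smaller dice yield so much better that, under these assumed numbers, you get roughly 30% more working GPUs out of the same wafer area.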
 
Upgrading: You can say that it will be hard to find a match to your video card down the road. Recall the recent move to dual channel memory. I had no problems at all finding a match to my 256mb stick at the time. Same when I had a VP6; gathering up matched cpus was no great chore.

You don't need to match RAM for DC memory. Any two sticks the same size that run the same speed *should* work. Most dual-CPU boards are the same; even if the chips are a different stepping, it should work as long as they're the same speed. OTOH, I'm not certain why you have to perfectly match your video cards, but apparently you do.

Downgrading: When my VP6 was relegated to a lesser use machine, I popped out the 2nd cpu and a stick of the ram to partially fund my upgrade. I would have loved to be able to split the video card as well. In Rollo's case, he passed on an X850 when he got the SLIs. As a parent of teenage game-playing rats, I like my kids to be able to enjoy the finer things in life too. A 6600GT would serve them just as well at the max res of their monitor as 1/2 of an X850, but that cannot be done without modularity.

Well, I haven't heard any plans for modular video cards, but the idea does get tossed around every now and then, and if they ever stop completely overhauling their designs every 12-18 months, this could become more feasible.

GPUs are already at 200+ million transistors and only going to grow. Would it not be easier to manufacture two smaller cores at, say, 60-70% of the size without defects? i.e. more cores to speed bin and match, fewer huge defective cores to toss?

I personally think they're going to go in this direction (either multi-core, multiple GPUs on one card, or both). Building monolithic 32-pipeline chips is going to be a nightmare in terms of yields, and they'll be huge even at 90nm.

SLI is a good technology for people who need more speed than a single high-end card can provide (consumers with money to burn, CAD/CAM professionals for whom another $500 to nearly double the GPU's speed is a damn good deal). If you somehow knew that graphics cards wouldn't get dramatically faster or add any new features for 2-3 years, it would be a decent upgrade path. But as long as manufacturers are putting out a new core design and doubling overall speed every 2 years (or less, sometimes!), it seems a little silly to lock yourself into an old architecture, especially when you're paying a premium for it now. IMO, anyway.
 
As it turned out Mathias, you could mix and match ram and cpus (to a certain degree) in most cases, but the manufacturers were urging you to match (for a better chance of compatibility, I'm sure).

To clarify the X850 modularity statement: two 6600s can be used in two different systems at the same time if necessary. A single card cannot. So, for example, assuming multiple video card solutions become as commonplace as multiple sticks of DDR are today, you have much greater flexibility. However, if perfectly matching cards remains a requirement, I do not believe success is on the horizon.
 
You people do realize that Rollo got SLI 6600GT's so he can "play" with them, right? Not for performance; he does have an X800XT in the house for play/performance as well. But to get back on topic about SLI performance: the performance gains of SLI are substantial enough for enthusiasts to take the plunge after looking at benchmarks from AT, especially in Doom and Far Cry if I recall. SLI is targeted at "enthusiasts", but there is that indirect target of the budget minded who want to upgrade at a cheaper price later on.
 