First glance at 3870X2 Crossfire X results.


nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: LOUISSSSS
If the 8800 Ultra is going to be praised for being the fastest single GPU card until further notice, why don't people still praise the FX-57 as the fastest single core CPU for the past 4 years or whatever it is?

Looks like the FX-57 still holds the crown as the fastest single core CPU, but the Celeron 430 (Conroe-L) is nipping at its heels. :)

http://www.xtremesystems.org/f...howthread.php?t=148382

 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: LOUISSSSS
If the 8800 Ultra is going to be praised for being the fastest single GPU card until further notice, why don't people still praise the FX-57 as the fastest single core CPU for the past 4 years or whatever it is?

Who cares if it's 2 GPUs? It's on one PCB, takes one PCIe slot, works on any PCIe motherboard, etc., just like any single GPU card.

The point is that these multi GPU cards can't do everything that a single card does. I would praise the 8800U because it's the fastest card that works with vsync in everything and can also output a 400MHz video signal. :p
 

NinjaJedi

Senior member
Jan 31, 2008
286
0
0
Originally posted by: nRollo
Originally posted by: Sylvanas
Who cares about a single GPU successor? People aren't buying a high end card 'because it has one GPU'; they are buying the package, and the 3870X2 is the highest performing *card* that you stick in a PCI-E slot and it runs... The fact it has two GPUs onboard is moot considering there is no differentiation by the driver or OS, and it's seen as a 'single' card solution; the G80 era has ended.

The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method, AFR, as a fallback if a game is not profiled.

So why is this a problem? If the AFR fallback doesn't work, and a game is not profiled, a 3870X2 performs worse than a 3870. The only way around that is to disable Catalyst AI, which also disables all fixes and optimizations for a game.

So the era of high end single GPU solutions will not be at an end, because there is a big difference between a solution like this and a single GPU. (made bigger by this limitation of Catalyst drivers)



Edit:
We don't know what the GX2 will be of course, but if it follows the same path as other NVIDIA drivers, it will allow users to pick AFR1, AFR2, SFR, or single card as the fall back for non profiled games, and let the user create or edit profiles. This would limit the problems associated with multi gpu. Of course ATi may offer similar at some point which would make your argument closer to correct.
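The fallback flexibility described above can be sketched in a few lines. This is purely illustrative pseudo-driver logic; the profile table, mode names, and function are made up for the example and are not NVIDIA's or ATI's actual driver code:

```python
# Illustrative sketch: how a multi-GPU driver *could* pick a render mode.
# The profile table and function names are hypothetical.

GAME_PROFILES = {          # hypothetical per-game profiles
    "crysis.exe": "AFR2",
    "ut3.exe": "SFR",
}

VALID_MODES = {"AFR1", "AFR2", "SFR", "SINGLE"}

def pick_render_mode(game_exe, user_override=None, global_fallback="AFR1"):
    """Priority: explicit user override > per-game profile > global fallback.

    A driver that only ever falls back to a single AFR mode (the Crossfire
    behavior described above) would effectively skip the first two steps
    for any unprofiled game."""
    if user_override is not None:
        if user_override not in VALID_MODES:
            raise ValueError("unknown mode: " + user_override)
        return user_override
    return GAME_PROFILES.get(game_exe, global_fallback)
```

With user-editable profiles, an unprofiled game can still be forced to SFR or single-GPU mode instead of being stuck with a broken AFR fallback.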

I don't think Sylvanas saying the G80 is dead means single GPU cards are dead. He is saying people are buying a card because it is the best at the time, regardless of the number of GPUs. (correct me if I got it wrong Sylvanas) We get it, you like NV, whoopee. Your blind devotion and unwillingness to admit the 3870X2 is the better card for the time being (whether NV's or ATI's next card tops it) is no better than Belichick walking out before the game was over.

Edit:
I did not realize Rollo actually worked for NV, so I guess he has no choice in having blind devotion. Sorry, I thought the sig was just an attempt at boosting the ego.

No hard feelings Rollo
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Originally posted by: CP5670
Originally posted by: LOUISSSSS
If the 8800 Ultra is going to be praised for being the fastest single GPU card until further notice, why don't people still praise the FX-57 as the fastest single core CPU for the past 4 years or whatever it is?

Who cares if it's 2 GPUs? It's on one PCB, takes one PCIe slot, works on any PCIe motherboard, etc., just like any single GPU card.

The point is that these multi GPU cards can't do everything that a single card does. I would praise the 8800U because it's the fastest card that works with vsync in everything and can also output a 400MHz video signal. :p

Like Louisssss said, when dual cores were released we had some situations where a single core would beat a dual core, especially in games and single-threaded applications.

Dual = future

The 3870X2 is a single card with two GPUs; it only takes one PCIe slot and overall it beats the best Nvidia has to offer. Take it like a man and stop crying, geeezuuss. I'm an Nvidia guy myself and I want to get 2 8800 GTs as my M/B supports SLI. If I could I would buy one 3870X2 now and another later, but it's just too expensive.
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: Gikaseixas
Like Louisssss said, when dual cores were released we had some situations where a single core would beat a dual core, especially in games and single-threaded applications.

Dual = future

The 3870X2 is a single card with two GPUs; it only takes one PCIe slot and overall it beats the best Nvidia has to offer. Take it like a man and stop crying, geeezuuss. I'm an Nvidia guy myself and I want to get 2 8800 GTs as my M/B supports SLI. If I could I would buy one 3870X2 now and another later, but it's just too expensive.

I have no preference between Nvidia and ATI; I don't like either company's current implementations of the dual GPU concept. As I said, they don't do basic things properly that a single card can do fine.

As for "dual = future", Nvidia's next gen GT200 is supposed to be a single chip card.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: CP5670
Originally posted by: Gikaseixas
Like Louisssss said, when dual cores were released we had some situations where a single core would beat a dual core, especially in games and single-threaded applications.

Dual = future

The 3870X2 is a single card with two GPUs; it only takes one PCIe slot and overall it beats the best Nvidia has to offer. Take it like a man and stop crying, geeezuuss. I'm an Nvidia guy myself and I want to get 2 8800 GTs as my M/B supports SLI. If I could I would buy one 3870X2 now and another later, but it's just too expensive.

I have no preference between Nvidia and ATI; I don't like either company's current implementations of the dual GPU concept. As I said, they don't do basic things properly that a single card can do fine.

As for "dual = future", Nvidia's next gen GT200 is supposed to be a single chip card.

What 'basic' things can't a multi-GPU setup do that a single GPU can?
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: CP5670
Originally posted by: Gikaseixas
Like Louisssss said, when dual cores were released we had some situations where a single core would beat a dual core, especially in games and single-threaded applications.

Dual = future

The 3870X2 is a single card with two GPUs; it only takes one PCIe slot and overall it beats the best Nvidia has to offer. Take it like a man and stop crying, geeezuuss. I'm an Nvidia guy myself and I want to get 2 8800 GTs as my M/B supports SLI. If I could I would buy one 3870X2 now and another later, but it's just too expensive.

I have no preference between Nvidia and ATI; I don't like either company's current implementations of the dual GPU concept. As I said, they don't do basic things properly that a single card can do fine.

As for "dual = future", Nvidia's next gen GT200 is supposed to be a single chip card.

GT200 may be a single chip, but the days of single chip high-end cards are numbered. G92 is already 324mm^2, imagine how big GT200 will be on 65nm. It's simply not feasible to continue increasing graphics performance at the current rate on a single die; the number of transistors is increasing too fast for process technology to keep up.
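The die-size argument above can be put as rough arithmetic. The numbers below are illustrative only, not real roadmap data: if the transistor budget doubles each generation but a process shrink cuts area by less than half, the die grows every generation.

```python
def die_area_mm2(base_area, transistor_growth, area_scale_per_gen, generations):
    """Toy model of die area over generations.

    Each generation, area multiplies by (transistor growth x area scaling
    from the process shrink). Inputs are illustrative assumptions, not
    actual GPU roadmap figures."""
    area = base_area
    for _ in range(generations):
        area *= transistor_growth * area_scale_per_gen
    return area

# Doubling transistors with a full-node shrink (0.5x area) holds area flat;
# doubling transistors with only a ~0.7x shrink grows the die ~40% per generation.
```

Under those assumptions, starting from a G92-sized 324 mm^2 die, even one generation of 2x transistors on a partial shrink lands well above 450 mm^2, which is the feasibility problem being described.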
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
What 'basic' things can't a multi-GPU setup do that a single GPU can?

The things I mentioned earlier are some examples. SLI doesn't work with vsync and triple buffering in a number of games, and Crossfire can't output video signals at 400MHz, which I would need to max out my monitor. This is not what you want to see on top dollar hardware considering that even a cheap 7300 or X1300 can do these things fine. There are also several odd game-specific glitches that come up on these setups.

GT200 may be a single chip, but the days of single chip high-end cards are numbered. G92 is already 324mm^2, imagine how big GT200 will be on 65nm. It's simply not feasible to continue increasing graphics performance at the current rate on a single die; the number of transistors is increasing too fast for process technology to keep up.

That may be the case eventually, but it's still at least one generation off. Actually, it might not be such a bad thing if everything becomes dual GPU, as the companies will then put more resources into working the bugs out of the drivers. I don't have so much a problem with the dual GPU concept itself as the way it's currently implemented.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
Originally posted by: CP5670
What 'basic' things can't a multi-GPU setup do that a single GPU can?

The things I mentioned earlier are some examples. SLI doesn't work with vsync and triple buffering in a number of games, and Crossfire can't output video signals at 400MHz, which I would need to max out my monitor. This is not what you want to see on top dollar hardware considering that even a cheap 7300 or X1300 can do these things fine. There are also several odd game-specific glitches that come up on these setups.

Can you list them again?
What are the things that a dual GPU can't do as well as a single GPU?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: thilan29
Originally posted by: nRollo
The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method, AFR, as a fallback if a game is not profiled.

It doesn't scale well in all games but in most reviews I've seen, Crossfire scales better than SLI. And I'm pretty certain SLI doesn't scale well in ALL games.

And from those Crossfire X results it seems ATI has done a decent job of getting the scaling to work even with 4 GPUs.

But he wasn't comparing it to nvidia, he was comparing dual GPU to one GPU...

Originally posted by: ShadowOfMyself
Good thing they dropped the previous nomenclature, or else we would be seeing X1900XTX Crossfire X ... I mean, how many X do you really need?

Good stuff though, certainly better looking than the 7950GX2 fiasco

Xfire, not Crossfire :p That's another X in there...

So it could have ended up as the X3870X2 XTX Xfire X (Xtreme edition for an OC or extra ram version :p)

All in all the new Xfire X shows amazing scaling... AMD is definitely pushing for superior drivers to compensate for lackluster hardware. Considering the reason I switched over to Nvidia was their (back then) superior drivers (with hardware that was less bang for the buck, but that was worth it for the drivers)...

I might end up in the AMD camp for video cards soon...


The CCC might report it as a single GPU but it is not a completely transparent solution... it's just a covered up solution. Nowhere is it mentioned to be two GPUs, but the drivers ARE treating it as a Crossfire solution... with some games actually losing performance compared to one card due to lack of driver profiling. All in all it depends on driver support, and it will only be good for so long. (I bet you there won't be any good drivers for the X2 for Windows 2009 or whatever it's called)

http://www.anandtech.com/video/showdoc.aspx?i=3219&p=2
Check the last result... the X2 performs WORSE than a single card due to lack of driver profiling; this can ONLY happen on a Crossfire solution that is treated like Crossfire. The only "OS transparency" I see is that they cover up the fact that it's two separate GPUs. Not that it is a bad thing really, this stuff should NOT be bothering the user anyway. (I see no benefit to having the OS and driver treat it like two cards)

Single GPUs still have a lot in them, but now finally we have a case where going multi is not a horrible waste of money, but a valid alternative. But it's still just that... a valid alternative; it depends on your needs and desires. Just like dual vs quad core, there is no "this one is absolutely superior". It used to be that single GPU was absolutely superior. With the X2, single GPU and dual GPU are now both valid choices depending on your needs. And maybe in a few years buying a single GPU card will not make any sense...

One thing that worries me (aside from the insane power consumption) is that with 4 GPUs in AFR you get into the realm of QUAD buffering... triple buffering (the max in DX9) already introduces noticeable input lag. Quad buffering, now possible with DX10, means even more severe reaction lag. You might end up with a smooth as silk frame rate but predictable and consistent input lag. (I play with vsync forced on in drivers, and with triple buffering OFF)
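The lag concern above can be estimated with back-of-the-envelope arithmetic. This assumes roughly one frame time of added latency per queued frame, which is a simplification of real driver queueing behavior:

```python
def worst_case_input_lag_ms(queued_frames, fps):
    """Rough worst-case input-to-display latency.

    Simplifying assumption: with a full queue, each buffered/queued frame
    adds about one frame time before the newest input reaches the screen."""
    frame_time_ms = 1000.0 / fps
    return queued_frames * frame_time_ms

# At a vsynced 60fps, roughly:
#   double buffering (2 frames) ~ 33 ms
#   triple buffering (3 frames) ~ 50 ms
#   quad buffering   (4 frames) ~ 67 ms
```

So going from triple to quad buffering at 60fps adds on the order of another 17 ms of worst-case lag, even while the frame rate itself stays silky smooth.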
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: LOUISSSSS
Originally posted by: CP5670
What 'basic' things can't a multi-GPU setup do that a single GPU can?

The things I mentioned earlier are some examples. SLI doesn't work with vsync and triple buffering in a number of games, and Crossfire can't output video signals at 400MHz, which I would need to max out my monitor. This is not what you want to see on top dollar hardware considering that even a cheap 7300 or X1300 can do these things fine. There are also several odd game-specific glitches that come up on these setups.

Can you list them again?
What are the things that a dual GPU can't do as well as a single GPU?

:confused:

I just did in the post you quoted. :p

The CCC might report it as a single GPU but it is not a completely transparent solution... it's just a covered up solution. Nowhere is it mentioned to be two GPUs, but the drivers ARE treating it as a Crossfire solution... with some games actually losing performance compared to one card due to lack of driver profiling. All in all it depends on driver support, and it will only be good for so long. (I bet you there won't be any good drivers for the X2 for Windows 2009 or whatever it's called)

Yes, I think the claims of it being treated as a single card by the drivers are exaggerated. From what I have seen in the reviews, it seems that it's operating just like a pair of cards in Crossfire but simply trying to hide that from the user.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: taltamir
Originally posted by: thilan29
Originally posted by: nRollo
The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method, AFR, as a fallback if a game is not profiled.

It doesn't scale well in all games but in most reviews I've seen, Crossfire scales better than SLI. And I'm pretty certain SLI doesn't scale well in ALL games.

And from those Crossfire X results it seems ATI has done a decent job of getting the scaling to work even with 4 GPUs.

But he wasn't comparing it to nvidia, he was comparing dual GPU to one GPU...

Of course he was... the bolded part he has mentioned in several threads and compared to SLI, which can apparently do 3 different modes.

Here's the last bit of his post I quoted:

Originally posted by: nRollo
Edit:
We don't know what the GX2 will be of course, but if it follows the same path as other NVIDIA drivers, it will allow users to pick AFR1, AFR2, SFR, or single card as the fall back for non profiled games, and let the user create or edit profiles. This would limit the problems associated with multi gpu. Of course ATi may offer similar at some point which would make your argument closer to correct.

You see?? I'm definitely seeing comparisons there. Did you read his full post or just my quote?...
 

JayDeeJohn

Junior Member
Aug 1, 2006
8
0
0
I don't reply here often, but all fanboyishness aside, we are all going to have to get used to the idea that multiple GPUs on 1 card IS the future, and only having 1 per card or system will be archaic. Since GPUs are so parallel, this is the way of tomorrow, so don't go knocking the future, lest you like being stuck in the past.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Originally posted by: CP5670
Originally posted by: LOUISSSSS
Originally posted by: CP5670
What 'basic' things can't a multi-GPU setup do that a single GPU can?

The things I mentioned earlier are some examples. SLI doesn't work with vsync and triple buffering in a number of games, and Crossfire can't output video signals at 400MHz, which I would need to max out my monitor. This is not what you want to see on top dollar hardware considering that even a cheap 7300 or X1300 can do these things fine. There are also several odd game-specific glitches that come up on these setups.

Can you list them again?
What are the things that a dual GPU can't do as well as a single GPU?

:confused:

I just did in the post you quoted. :p

The CCC might report it as a single GPU but it is not a completely transparent solution... it's just a covered up solution. Nowhere is it mentioned to be two GPUs, but the drivers ARE treating it as a Crossfire solution... with some games actually losing performance compared to one card due to lack of driver profiling. All in all it depends on driver support, and it will only be good for so long. (I bet you there won't be any good drivers for the X2 for Windows 2009 or whatever it's called)

Yes, I think the claims of it being treated as a single card by the drivers are exaggerated. From what I have seen in the reviews, it seems that it's operating just like a pair of cards in Crossfire but simply trying to hide that from the user.

It usually performs better than two 3870 cards in Xfire due to the PCIe bridge chip on the card and also higher core speeds. This card is not perfect; sometimes it won't excel at certain games due to drivers, but that's only here and there. For the majority of games it trounces the competition, and that's why sites like Anandtech gave it the title of the best single card out there.
Competition is good, I'm loving it. We need another GTX or 9700 Pro; this is not one, but it might ignite ATI or Nvidia to make one.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: JayDeeJohn
I don't reply here often, but all fanboyishness aside, we are all going to have to get used to the idea that multiple GPUs on 1 card IS the future, and only having 1 per card or system will be archaic. Since GPUs are so parallel, this is the way of tomorrow, so don't go knocking the future, lest you like being stuck in the past.

And tomorrow it might have a non-sucky implementation. It HAS been getting better with each generation of cards, but it's still not good yet. There are still serious issues to solve.
This implementation already solves SEVERAL serious issues, to the point where it becomes a valid alternative. One day it will solve them all.
But buying crappy technology today because something good might come out based on the same tech tomorrow is pointless; your card isn't gonna suck any less when the good implementation comes about.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Originally posted by: taltamir
Originally posted by: JayDeeJohn
I don't reply here often, but all fanboyishness aside, we are all going to have to get used to the idea that multiple GPUs on 1 card IS the future, and only having 1 per card or system will be archaic. Since GPUs are so parallel, this is the way of tomorrow, so don't go knocking the future, lest you like being stuck in the past.

And tomorrow it might have a non-sucky implementation. It HAS been getting better with each generation of cards, but it's still not good yet. There are still serious issues to solve.
This implementation already solves SEVERAL serious issues, to the point where it becomes a valid alternative. One day it will solve them all.
But buying crappy technology today because something good might come out based on the same tech tomorrow is pointless; your card isn't gonna suck any less when the good implementation comes about.

Can you name the few serious issues? The only one I see is driver support in some games, but if you know of others please let me know.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That's not serious enough? You literally get WORSE performance using a 3870X2 than a 3870 card when playing Universe at War, for a card that costs twice as much!

Want others?
1. Extra bugs in games that don't exist on single GPU setups. (check the release notes of any driver from Nvidia or ATI to see the list of known and fixed bugs... notice MOST of them are on dual GPU solutions only; in fact they even separate the two)
http://www2.ati.com/relnotes/c...ease_notes.html#253654
Half of these bugs will NOT apply to you if you have a single GPU.

2. It wastes half the RAM on one card, or 3/4 of the RAM if you have two 3870X2 cards... leaving you with 2GB of RAM but only 500MB of it usable.

3. Multi monitor issues.

4. It devalues like crazy compared to a single card (which is more flexible) and thus sells for much less when you try to upgrade. But it devalues for a REASON. It's a power and heat hog with issues, and it cannot compare to a next gen single GPU because of its inefficiencies.
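The memory point in item 2 follows from AFR-style rendering mirroring every texture and buffer on every GPU. A quick illustrative calculation, assuming full mirroring and 512MB per GPU (the 3870X2's configuration):

```python
def vram_usage_mb(per_gpu_mb, num_gpus):
    """Physical vs duplicated VRAM under full AFR mirroring.

    Assumption: each GPU holds a complete copy of all resources, so
    usable VRAM stays at one GPU's worth regardless of GPU count."""
    physical_mb = per_gpu_mb * num_gpus
    duplicated_mb = physical_mb - per_gpu_mb  # copies beyond the first
    return physical_mb, duplicated_mb

# Two 3870X2s = four 512MB GPUs: 2048MB physical, 1536MB duplicated,
# leaving roughly one GPU's 512MB effectively usable.
```

The same arithmetic is why a "1GB" dual-GPU card behaves like a 512MB card as far as game resources are concerned.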

This actually solves the problem of the lack of a universal multi-video-card solution, meaning you can put it in any motherboard (rather than choosing Nvidia or ATI when buying the mobo, and having to replace the mobo to switch, like in SLI or Xfire). BUT it comes at the cost of reduced performance compared to buying two separate 3870 cards...

I think there were a few other problems that I can't recall right now.


The whole thing is, GPUs are already highly parallel; there are hundreds of stream processors and other calculation units in the whole GPU. Rather than expanding the architecture they are just throwing two identical GPUs together with wasteful duplication of unneeded parts like memory crossbars, etc. It is a cop out.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
It is a cop out.

it is the *future*
-nvidia is also doing it and the "issues" you mention are less than when G80 was first released ... way less than when SLI/xfire were new
-both companies are committed to it and will solve these problems

---get used to it :p
 

NinjaJedi

Senior member
Jan 31, 2008
286
0
0
Getting a game to work with a certain card or card configuration (SLI/Xfire) should be done by game developers first. Blaming the performance of a card in different games solely on the GPU developer is wrong. Certain cards or card configurations need to be supported by the game developer. That is why there are lists of *supported cards* for games. It takes a lot of time and money to get a game to work with a wide variety of cards. There has to be a balance; what works well for one will not for another.

Also, games are developed to function best with the drivers out at the time of the game's release. Anyone who has programmed knows that when you fix one thing, it's likely 5 new things will *break*. That is why sometimes a driver update will hurt performance, and why some game patches are put out to correct the problem.

SLI and Xfire account for a small percentage of the game market and in most cases don't get the attention they deserve during a game's development. If multi GPU cards become the norm, game developers will be forced to optimize their games to function in those situations. Most of the time driver updates help a game's performance, but if the game developers optimize the game to function under SLI or Xfire, it makes driver updates easier and more likely to improve performance.
 

Thor86

Diamond Member
May 3, 2001
7,888
7
81
When are these guys going to start developing multi-core GPUs, like CPUs?

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
It is a cop out.

it is the *future*
-nvidia is also doing it and the "issues" you mention are less than when G80 was first released ... way less than when SLI/xfire were new
-both companies are committed to it and will solve these problems

---get used to it :p

I don't have to get used to it. I will just switch when the problems are solved and the thing is stable. I am not morally opposed to multiple GPUs, nor do I see them as the bane of my existence. I WILL GLADLY use multiple GPUs when and if said problems are solved.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: thilan29
Originally posted by: taltamir
Originally posted by: thilan29
Originally posted by: nRollo
The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method, AFR, as a fallback if a game is not profiled.

It doesn't scale well in all games but in most reviews I've seen, Crossfire scales better than SLI. And I'm pretty certain SLI doesn't scale well in ALL games.

And from those Crossfire X results it seems ATI has done a decent job of getting the scaling to work even with 4 GPUs.

But he wasn't comparing it to nvidia, he was comparing dual GPU to one GPU...

Of course he was... the bolded part he has mentioned in several threads and compared to SLI, which can apparently do 3 different modes.

Here's the last bit of his post I quoted:

Originally posted by: nRollo
Edit:
We don't know what the GX2 will be of course, but if it follows the same path as other NVIDIA drivers, it will allow users to pick AFR1, AFR2, SFR, or single card as the fall back for non profiled games, and let the user create or edit profiles. This would limit the problems associated with multi gpu. Of course ATi may offer similar at some point which would make your argument closer to correct.

You see?? I'm definitely seeing comparisons there. Did you read his full post or just my quote?...

You and taltamir are both right.

In the main body of my post I was only thinking of 3870X2 vs single GPU; in the edit I added later I was thinking of 3870X2 vs GX2 and how the GX2's multi-card limitations are lessened by the additional driver flexibility. The GX2 would still have some multi-card limitations though, so a single card offering similar performance to it would be "better" and/or worth more money.

I see this as the main problem with R6XX and ATI's stance from the beginning that multi GPU is how they would combat the 8800U and 8800GTX. Multi GPU has inherent limitations, more so with Crossfire, even though it has come light years from where it was just last year.

To me there are two reasons to go multiGPU.

Best: To get a level of performance unavailable with single cards, and for this you accept there will be variable scaling, more tweaking, more expensive hardware.

Second: You can't afford that level of performance, but realize adding a second card down the road you pick up used or discounted may be cheaper than selling your old card at big loss, buying new high end at launch price. (in short- flexibility)

A single 3870X2 doesn't offer consistently (appreciably) higher performance than an 8800U or OC'd GTX, so it's more a hobbyist's solution. An interesting alternative.

This of course may all change when the drivers for Quadfire launch as the one set of benches I've seen offer good scaling, at which point it may be a more attractive high end dual card solution.

My $.02
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Originally posted by: taltamir
That's not serious enough? You literally get WORSE performance using a 3870X2 than a 3870 card when playing Universe at War, for a card that costs twice as much!

Want others?
1. Extra bugs in games that don't exist on single GPU setups. (check the release notes of any driver from Nvidia or ATI to see the list of known and fixed bugs... notice MOST of them are on dual GPU solutions only; in fact they even separate the two)
http://www2.ati.com/relnotes/c...ease_notes.html#253654
Half of these bugs will NOT apply to you if you have a single GPU.

2. It wastes half the RAM on one card, or 3/4 of the RAM if you have two 3870X2 cards... leaving you with 2GB of RAM but only 500MB of it usable.

3. Multi monitor issues.

4. It devalues like crazy compared to a single card (which is more flexible) and thus sells for much less when you try to upgrade. But it devalues for a REASON. It's a power and heat hog with issues, and it cannot compare to a next gen single GPU because of its inefficiencies.

This actually solves the problem of the lack of a universal multi-video-card solution, meaning you can put it in any motherboard (rather than choosing Nvidia or ATI when buying the mobo, and having to replace the mobo to switch, like in SLI or Xfire). BUT it comes at the cost of reduced performance compared to buying two separate 3870 cards...

I think there were a few other problems that I can't recall right now.


The whole thing is, GPUs are already highly parallel; there are hundreds of stream processors and other calculation units in the whole GPU. Rather than expanding the architecture they are just throwing two identical GPUs together with wasteful duplication of unneeded parts like memory crossbars, etc. It is a cop out.

Extra bugs in games??? WOW
This should be blamed on the game developer, not on the card itself. In general this card does well in 80% of the games and benchmarks out there, so the other 20% of games just need more tweaking, don't you think? I bet game developers will pay more attention to dual GPUs and we'll start to see games taking advantage of the hardware.

Resale value is that important to you? We're not talking about cars. This card will age and lose value as fast as any other card. You only have one reference, and that's the previous Nvidia dual GPU solution, a card that was not the fastest solution as often as this one, so what makes you think the same will happen?

Both Nvidia and ATI are shifting to dual GPUs for a reason: they've reached the speed limits of these 65/55nm processes, and in order to go faster, dual is the key.