Lucid Hydra Benchmarks Finally!


Sureshot324

Diamond Member
Feb 4, 2003
What if you mix two cards with different amounts of texture memory, say a 512MB card and a 1GB card? Will it only use 512MB on the 1GB card? If a new game requires 1GB of texture memory at max settings, then keeping that old 512MB card isn't going to be much good unless they can find a solution for this.

This is one of the big reasons I don't go for multi-GPU setups. I could get a second 8800 GT on eBay now for about $50 and have the graphics horsepower of a modern card, but I still wouldn't have the texture memory of a modern card, so what's the point?
 

cbn

Lifer
Mar 27, 2009
What if you mix two cards with different amounts of texture memory, say a 512MB card and a 1GB card? Will it only use 512MB on the 1GB card?

Good question.

If the memory could be "pooled" instead of "mirrored" it would be a huge step up from the current situation.

How about four tiny modular GPUs on a stick sharing the same memory... and it could be faster to market than something of equivalent power built on a single massive die.
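The "mirrored" vs. "pooled" distinction comes down to simple arithmetic. A toy sketch (the function and the pooled mode are hypothetical; no multi-GPU driver of the era actually pooled VRAM this way):

```python
# Toy illustration, not actual driver behavior: with AFR-style SLI/CrossFire,
# each GPU keeps a full copy of the working set, so usable memory is capped
# by the smallest card. A hypothetical "pooled" design could sum the cards.

def effective_vram(cards_mb, mode="mirrored"):
    """cards_mb: per-card memory sizes in MB; returns usable MB."""
    if mode == "mirrored":
        return min(cards_mb)   # every card mirrors the same data
    if mode == "pooled":
        return sum(cards_mb)   # data split across cards (hypothetical)
    raise ValueError(mode)

# The 512MB + 1GB pairing from the thread:
print(effective_vram([512, 1024], "mirrored"))  # 512
print(effective_vram([512, 1024], "pooled"))    # 1536
```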
 

faxon

Platinum Member
May 23, 2008
How is that too bad? It benefits the consumer in that they don't have to pay $5 for something that comes with the board they buy, regardless of who makes it. It's even been proven that there's nothing preventing SLI from working on earlier Intel chipsets besides Nvidia not giving out the code for it, since it would cut into their S775 chipset business.
 

cbn

Lifer
Mar 27, 2009
How is that too bad? It benefits the consumer in that they don't have to pay $5 for something that comes with the board they buy, regardless of who makes it. It's even been proven that there's nothing preventing SLI from working on earlier Intel chipsets besides Nvidia not giving out the code for it, since it would cut into their S775 chipset business.

If ATI makes something worth $5 they should be selling it for $5. (especially since they don't have deep pockets)

Otherwise (the way I see it) we will be paying $25 instead of $5 license fees to Nvidia for SLI (after ATI ceases to exist).

But I am sure ATI has a valid reason for what they are doing. I just can't understand the reason.
 

EarthwormJim

Diamond Member
Oct 15, 2003
If ATI makes something worth $5 they should be selling it for $5. (especially since they don't have deep pockets)

Otherwise (the way I see it) we will be paying $25 instead of $5 license fees to Nvidia for SLI (after ATI ceases to exist).

But I am sure ATI has a valid reason for what they are doing. I just can't understand the reason.

Since they don't charge for CrossFire, more boards support it than support SLI (in fact almost any modern motherboard supports CrossFire). What they gain from that is a greater likelihood of someone buying two ATI video cards instead of one.

I really don't understand why Nvidia wants to charge for SLI motherboard compatibility. If someone wants to use SLI, aren't they already winning by selling two (or more) video cards to the same person?
 

cbn

Lifer
Mar 27, 2009
I really don't understand why Nvidia wants to charge for SLI motherboard compatibility.


Because lots of people want to pay them the $$$.

This is why I think it is a shame AMD doesn't charge a license fee also. (They certainly need the money).
 

bryanW1995

Lifer
May 22, 2007
Obviously you don't remember the performance sli/xfire provided when they were released. They were lucky to get this kind of performance boost. When drivers mature we will see the 90% performance (hopefully).

no way will we see 90%. well, unless nvidia/ati get to 99% maybe...

again, lucid is at a huge disadvantage in writing drivers. it will be worse when they try to combine ati/nvidia cards b/c they are so different architecturally. Even if lucid does end up giving us a solid product at some point, they will get gobbled up by ati, nvidia, or even (gulp) intel.
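For concreteness, this is all the scaling percentages in the thread mean (hypothetical frame rates, just the arithmetic):

```python
# Toy arithmetic with hypothetical numbers: what "90% scaling" means for
# a dual-GPU setup relative to a single card.

def dual_gpu_fps(single_fps, scaling):
    """scaling: fraction of the second card's throughput actually realized."""
    return single_fps * (1 + scaling)

print(dual_gpu_fps(60, 0.9))  # 90% scaling on a 60 fps card: about 114 fps
print(dual_gpu_fps(60, 0.5))  # 50% scaling, closer to early SLI/CrossFire
```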
 

deimos3428

Senior member
Mar 6, 2009
The best scaling obviously comes from mixing vendors; it seems that if either card is better at a particular game, it takes over and is supplemented by the other, so that performance more than doubles that of the weaker card. It's like covering the weaknesses of each architecture.

Yeah, makes sense. Now hurry up and do it for CPUs too, guys.