
N00b multi-gpu question

Tweak155

Lifer
I've never done mGPU but I am considering going Skylake and at least want to understand what features I'm buying in the mobo.

I thought motherboards no longer came as either xfire-only or SLI-only... but I see quite a few that only list xfire (example: http://www.gigabyte.com/products/product-page.aspx?pid=5489#sp), and that bridges aren't necessarily needed any more (only for 4-way?).

Is it still possible to limit my options with my motherboard purchase? Or is it that as long as it has at least two PCIe x16/x8 slots I should be good to go?

Simply put, how does mGPU work now? lol.

Thanks for any input.
 

Nvidia charges the motherboard maker a fee to enable SLI in the BIOS, and AMD does not. That's why you'll see almost any motherboard with multiple PCIe slots labeled as xfire compatible, but not necessarily SLI. Most gaming-oriented boards pay the Nvidia fee and get SLI support, and that cost is passed on to you. Also, Nvidia requires at least an x8 link for SLI, whereas xfire will work over an x4 link.

Bridges are no longer used on AMD GCN 1.1 cards and newer (R9 290/290X and forward); however, they are still used on Nvidia GPUs. Nvidia's new Pascal GPUs natively support only 2-way SLI and need an "enthusiast key" from Nvidia to unlock 3- or 4-way, which aren't officially supported any more.

Support has not been great in the market lately, and scaling beyond 2 cards has always been pretty bad. With the shift to close-to-metal APIs like Vulkan and DX12, the responsibility for supporting multi-GPU falls on the developer rather than on the driver as it has in the past. This allows for new and potentially better-performing multi-GPU, but also the possibility of many games not supporting it at all if the developer doesn't want to put the time in.
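To make that last point concrete, here's a toy sketch (plain Python, not a real graphics API) of what "explicit" multi-GPU means: under DX12/Vulkan the application decides how work is divided across adapters, instead of the driver doing it behind the scenes. The device names and the scanline split here are made up purely for illustration:

```python
def split_frame(scanlines, devices):
    """Split a frame's scanlines into one contiguous chunk per device.

    The app, not the driver, chooses this split -- that's the core idea
    behind explicit multi-GPU in DX12/Vulkan.
    """
    chunk = scanlines // len(devices)
    work = {}
    start = 0
    for i, dev in enumerate(devices):
        # last device picks up any remainder
        end = scanlines if i == len(devices) - 1 else start + chunk
        work[dev] = range(start, end)
        start = end
    return work

# hypothetical two-adapter system rendering a 1080-line frame
work = split_frame(1080, ["gpu0", "gpu1"])
```

If the developer never writes code like this, the second GPU simply sits idle, which is why support now depends entirely on the game.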
 

Man, what a great summary. Thanks!
 
Never heard of the branding fee. Even if there is one, it would be minimal; definitely not a deal breaker.

The main reason such motherboards don't support SLI is the x8 restriction set by Nvidia. Another stupid Nvidia move: tests have shown over and over that x4 is more than sufficient, with insignificant loss of performance.
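For a rough sense of the numbers behind that x4-vs-x8 argument, the back-of-envelope math is simple (assuming PCIe 3.0: 8 GT/s per lane with 128b/130b encoding):

```python
# Per-lane usable bandwidth on PCIe 3.0:
# 8 GT/s * (128/130 encoding efficiency) / 8 bits-per-byte ~= 0.985 GB/s
GTS = 8.0
ENCODING = 128 / 130
per_lane_gbs = GTS * ENCODING / 8  # gigabits -> gigabytes

# Bandwidth for the common slot widths
bandwidth = {lanes: lanes * per_lane_gbs for lanes in (4, 8, 16)}
# x4 ~= 3.9 GB/s, x8 ~= 7.9 GB/s, x16 ~= 15.8 GB/s
```

So an x4 slot still moves close to 4 GB/s, which is why the performance loss in practice tends to be small.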



For 2 GPUs or fewer, go with whichever color you prefer, even though the red team is a bit ahead.

For 3 GPUs or more on the latest cards, the green team is garbage: still using the dinosaur SLI bridge, not to mention worse scaling and the infamous microstutter. So definitely the red team all the way; well, at least until Nvidia gets on board with bridging over PCIe.
 

excellent summary, /thread
 
Sorry to resurrect a dying thread, but you guys seem to be knowledgeable about crossfiring. If I were to take a 1 GB 6950 and a 2 GB 6970 and crossfire them, the RAM would stack, right? I'm confused because the AMD website states that when two different compatible cards are paired, the RAM and GPU clock speeds will be notched down to the slower one. That's the same as how dual-channel RAM works, isn't it? Some people are saying it would only use the RAM from one video card, and that doesn't really make sense to me, because crossfiring seems to perform better at higher resolutions (about twice as nice), and it would cause a huge bottleneck if the two GPUs were to work faster without additional RAM. Please help.
 

NM, but posting the same question in different threads isn't going to change anything.
 

My understanding is that you would go down to 1 GB of total usable video memory in this scenario.
 

Both GPUs use their own VRAM; however, if one has a lesser amount, the other will be limited to match it. The VRAM does not stack, because each GPU needs its own local copy of the data, at full local bandwidth, to render. I would not suggest crossfiring a 1 GB card, as you are going to be VRAM-limited by that card.
 
So worst case scenario, my crossfire setup would use two of the three GB of available VRAM? I have sources saying that the VRAM stacks, but I'm not sure about different sizes.
 
When you crossfire a 6950 w/ 1 GB with a 6970 w/ 2 GB, the 6970 w/ 2 GB essentially becomes a 6950 w/ 1 GB.

The faster GPU slows down to match the slower GPU; you now have a 6950 x2.

The larger VRAM is reduced to match the smaller VRAM; you now have 1 GB of VRAM. Note: VRAM does not stack, it is mirrored.

For playable and enjoyable gaming, 1 GB of VRAM is still plenty for a 6950 x2.
 

Having had 6950s in CF, I can say that they do not downclock to match the slower card; they simply get held back by it. You'll see higher usage on the slower card, but the clocks remain at whatever they're set to run at.

Essentially it comes down to this: each card renders its own frames independently, and they alternate which frames they render. This means you can only go as fast as your slowest card (the faster card will occasionally sit idle waiting for the slow one), and each card has to be able to render its own frame, using its own VRAM to create it.
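You can see the "only as fast as your slowest card" effect with a toy alternate-frame-rendering model. The frame times below are made-up numbers, and the model ignores vsync and microstutter, but it shows the pacing:

```python
def afr_fps(frame_times_ms, n_frames=100):
    """Wall-clock FPS when frames alternate round-robin across cards.

    Frames must be presented in order, so the overall finish time only
    ever moves forward -- every other frame ends up waiting on the
    slower card.
    """
    free_at = [0.0] * len(frame_times_ms)  # when each card finishes its queue
    finish = 0.0
    for i in range(n_frames):
        card = i % len(frame_times_ms)
        # a card starts its next frame as soon as it is free
        free_at[card] += frame_times_ms[card]
        finish = max(finish, free_at[card])
    return n_frames / (finish / 1000.0)

mixed = afr_fps([10.0, 20.0])         # one fast (10 ms) + one slow (20 ms) card
matched_slow = afr_fps([20.0, 20.0])  # two slow cards
matched_fast = afr_fps([10.0, 10.0])  # two fast cards
```

In this model the mixed pair ends up no faster than two slow cards, which matches the point above: the faster card just waits.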
 
That sounds about right. So what I'm thinking is that I get 2 x 6950 1GB = 13900 2GB

No, that's incorrect: VRAM is mirrored, not stacked. What you essentially end up with is a 6950 x2 with 1 GB of usable VRAM. In your notation, it is "13900 1GB."

As for the clocks, bystander36 is correct: the faster GPU does not actually downclock. Instead, it ends up doing less work while waiting on the slower GPU, so its overall average throughput matches the slower card's.
 