The future of multi-GPU


Eymar

Golden Member
Aug 30, 2001
Lastly, XDMA CF is near single-GPU fluidity/quality. Night and day compared to SLI and older CF.

Definitely agree with this, and it was a big surprise for me as I went from 7970 XF (terrible stuttering; effective frame rate had to be 25% or worse) > Titan SLI (much better than 7970 XF, but still not as smooth as a single card) > 290X XF (had to turn on GPU-usage overlays to make sure it wasn't running in single-card mode).
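To put a rough number on that "effective frame rate" feeling: one common way to quantify microstutter is to judge smoothness by the worst frame times rather than the average FPS. A quick sketch, with made-up frame times purely for illustration:

```python
# One rough way to quantify "effective frame rate": judge by the worst frame
# times instead of the average, since slow frames dominate perceived smoothness.
# The frame times below are made up purely to illustrate a microstutter pattern.
frame_times_ms = [10, 11, 10, 45, 10, 12, 11, 48, 10, 11]

average_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
effective_fps = 1000 / max(frame_times_ms)  # paced by the slowest frame

print(f"average ~{average_fps:.0f} FPS, but the hitches pace it like ~{effective_fps:.0f} FPS")
```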
 

DooKey

Golden Member
Nov 9, 2005
Not currently; they're supposed to in a future driver release. And credit where it's due: AMD did great with XDMA, and NVIDIA needs to get on that train sooner rather than later.

Pascal, if I'm not mistaken. NVLink.
 

Eymar

Golden Member
Aug 30, 2001
XDMA is boss. Nvidia needs to catch up on that real quick.

I thought Nvidia did with the 980s, as SLI looks single-card smooth (it doesn't have the old SLI or pre-XDMA XF microstutter). However, I'm using G-Sync. I just did a test and turned G-Sync off (V-Sync also off), and sure enough my 100 FPS in Grid Autosport feels like 60 FPS or worse (just not as smooth as a single card would be). I'd assume V-Sync off would just have added tearing, but it looks like G-Sync adds some frame-pacing-type help to SLI. I don't have the 290X XF anymore, but I think the two are equal: XF XDMA = SLI with G-Sync, whereas XF XDMA >> SLI without G-Sync.
 

UaVaj

Golden Member
Nov 16, 2012
Which leads me to my next question. At present I have GTX 970 SLI, but with the VRAM, ROP, and L2 cache fiasco I am debating returning them for a refund or credit if I'm able, and using that money towards buying the fastest single-GPU card I can get to handle 1440p max quality at 60 FPS in the latest titles.

As for the second question: unless you are planning on adding another 970 in the near future and/or increasing your current resolution beyond 2560x1440, chances are this spec change will not affect you.
 

Headfoot

Diamond Member
Feb 28, 2008
I honestly would definitely consider AMD at this point, but not for crossfire. AMD's power draw and DX11 driver overhead are just too high, and crossfire amplifies that.

Power consumption is a junk argument. 7-17% is literally completely unnoticeable unless you have your machine hooked up to a UPS... If you want to save 30 watts, change a single light bulb to an LED for $20.

[Image: power consumption comparison chart]
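As a rough sanity check on the scale of that argument (hours of gaming per day and electricity price here are assumptions, not figures from the thread):

```python
# Back-of-the-envelope yearly cost of a ~30 W power-draw difference while gaming.
# Hours per day and price per kWh are assumed values; adjust for your own situation.
delta_watts = 30
hours_per_day = 3
price_per_kwh = 0.12  # USD, assumed

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.2f}/year")
```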
 

Carfax83

Diamond Member
Nov 1, 2010
Power consumption is a junk argument. 7-17% is literally completely unnoticeable unless you have your machine hooked up to a UPS... If you want to save 30 watts, change a single light bulb to an LED for $20.

1) I'm on a UPS.

2) Tomb Raider is hardly GPU intensive. You may as well have posted the Bioshock Infinite power draw graph instead. When you really stress the components, the gulf widens:

[Images: power consumption charts under heavier GPU load]
 

Carfax83

Diamond Member
Nov 1, 2010
As for the second question: unless you are planning on adding another 970 in the near future and/or increasing your current resolution beyond 2560x1440, chances are this spec change will not affect you.

Yeah, it doesn't really affect me. To be honest, my G1 GTX 970s have performed superbly, and if anything, performance has been increasing and not decreasing.

I don't care about 4K performance, as I won't be moving to 4K until NVidia has their die shrink.

I think I will keep them for now, but as soon as I see the performance tests for GM200 and R9 300 parts, I'll make the decision to sell them and go single card again.

GM200 and R9 300 should both be fast enough to handle 1440p at very high settings, unlike the GM204 and R9 200 cards.
 

antihelten

Golden Member
Feb 2, 2012
The arrival of VR (hopefully soon) should provide a huge boost for multi-GPU setups.

With the way the Oculus Rift renders frames (two half-frames that are then stitched together), it should be possible to get almost perfect scaling and zero stutter (at least no more stutter than a single GPU), whilst also having perfect compatibility (assuming the game supports VR in the first place, of course).
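A minimal sketch of why per-eye rendering maps so naturally onto two GPUs (conceptual Python only, not any real VR or graphics API; the render call is a stand-in):

```python
# Conceptual sketch only (no real VR/graphics API): each eye's half-frame is an
# independent workload, so GPU 0 can render the left eye while GPU 1 renders the
# right eye of the *same* frame. Unlike AFR there is no frame-to-frame dependency,
# so scaling can be near-ideal and there is no alternate-frame stutter to pace away.
from concurrent.futures import ThreadPoolExecutor

def render_eye(gpu_id: int, eye: str, head_yaw: float) -> str:
    # Stand-in for a per-GPU render submission; a real engine would build a
    # per-eye view matrix from the head pose and submit command lists to gpu_id.
    return f"{eye} eye rendered on GPU{gpu_id} at yaw {head_yaw:.1f}"

def render_vr_frame(head_yaw: float) -> tuple:
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, 0, "left", head_yaw)
        right = pool.submit(render_eye, 1, "right", head_yaw)
    # Both halves belong to the same frame, so stitching them is trivial as long
    # as the two eye workloads stay balanced.
    return left.result(), right.result()

print(render_vr_frame(12.5))
```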
 

kasakka

Senior member
Mar 16, 2013
DSR definitely works; I use it. MFAA is not an option with SLI, though.

My bad, DSR does work with SLI, but it does not work with SLI + a G-Sync monitor for some reason. With just a G-Sync monitor connected, I never seem to get the DSR option in the control panel if SLI is enabled.
 

BFG10K

Lifer
Aug 14, 2000
Multi-GPU has been a dud for years. It's only relevant in the niche circle where the best cards are combined to run 4K resolutions that single cards can't manage. As far as I'm concerned, both vendors just keep up appearances with the tech so neither gets an upper hand.

But an even bigger dud is 3D. I remember the claims that nVidia's glasses were going to transform the world, yet now we don't hear a peep out of either vendor about it. Even new games have multiple rendering issues with it.
 

Carfax83

Diamond Member
Nov 1, 2010
I would far prefer SFR rendering; throw in FreeSync/G-Sync and that would be perfect. Unfortunately, only Civ: BE uses it; hopefully Firaxis will use it in all their games going forward.

http://www.pcper.com/reviews/Graphi...mance-Maxwell-vs-Hawaii-DX11-vs-Mantle/2560x1

I had no idea Civ BE used SFR with Mantle. Looks like they got some nice results with it as well :thumbsup: Not as high a frame rate as AFR, but very nice frame times.

Perhaps SFR can be used with Unreal Engine 4 after all.
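For anyone unfamiliar with the difference being discussed, here is a toy contrast of how AFR and SFR hand work to the GPUs (illustrative Python, not Mantle or any driver API):

```python
# Toy contrast of multi-GPU work distribution (illustration only, no real API).
# AFR: whole frames alternate between GPUs -> highest throughput, but frames
#      must be paced against each other, which is where microstutter comes from.
# SFR: every frame is split across the GPUs -> lower peak FPS, but each displayed
#      frame is produced cooperatively, so frame times stay far more even.

def afr_assignment(frame_index: int, gpu_count: int = 2) -> list:
    return [(frame_index % gpu_count, "full frame")]

def sfr_assignment(frame_index: int, gpu_count: int = 2) -> list:
    return [(gpu, f"slice {gpu + 1}/{gpu_count}") for gpu in range(gpu_count)]

for frame in range(4):
    print(f"frame {frame}:  AFR -> {afr_assignment(frame)}   SFR -> {sfr_assignment(frame)}")
```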
 

Carfax83

Diamond Member
Nov 1, 2010
But an even bigger dud is 3D. I remember the claims that nVidia's glasses were going to transform the world, yet now we don't hear a peep out of either vendor about it. Even new games have multiple rendering issues with it.

3D is likely going to take a backseat now that VR is the next big thing for PC gaming, I'd wager.
 

guskline

Diamond Member
Apr 17, 2006
Very happy with my 290s in CF. Watercooling sure took care of the heat problem.
 

night.fox

Member
Mar 23, 2014
I think that multi-GPU for AMD cards will be better due to XDMA. Bear in mind that only the R9 290s (including the 295X2) have this feature, and so far it gives the best scaling for multi-GPU.

I could imagine that AMD cards will utilize more bandwidth on the PCIe lanes with the upcoming cards, and PCIe 4.0 is already in development and will be introduced before long.

Of course, the hardware is there. It's just that game developers need to step up to make use of such powerful hardware.
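For a sense of scale on the PCIe-bandwidth point (rough numbers only; real XDMA traffic isn't simply one finished frame per displayed frame):

```python
# Rough scale check: bandwidth needed to move finished frames between GPUs over
# PCIe (the XDMA approach) versus what a PCIe 3.0 x16 slot provides. Real traffic
# patterns differ; this only shows the orders of magnitude involved.
PCIE3_X16_GBPS = 15.75  # approximate usable GB/s for a 16-lane gen-3 link

def frame_traffic_gbps(width: int, height: int, fps: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * fps / 1e9

for name, (w, h, fps) in {"1440p @ 60 Hz": (2560, 1440, 60),
                          "4K @ 60 Hz": (3840, 2160, 60),
                          "4K @ 144 Hz": (3840, 2160, 144)}.items():
    print(f"{name}: ~{frame_traffic_gbps(w, h, fps):.1f} GB/s of ~{PCIE3_X16_GBPS} GB/s available")
```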
 

3DVagabond

Lifer
Aug 10, 2009
I think that multi-GPU for AMD cards will be better due to XDMA. Bear in mind that only the R9 290s (including the 295X2) have this feature, and so far it gives the best scaling for multi-GPU.

I could imagine that AMD cards will utilize more bandwidth on the PCIe lanes with the upcoming cards, and PCIe 4.0 is already in development and will be introduced before long.

Of course, the hardware is there. It's just that game developers need to step up to make use of such powerful hardware.

Tonga has it too (R9 285).
 

Annisman*

Golden Member
Aug 20, 2010
I was an SLI/CF fiend for years, until I started to wonder whether the added latency was actually worth the frame-rate boost. For me, I decided it was not worth it, not to mention the occasional hassle of unsupported games or junk support. Now, if I were gaming at 4K or something, the extra cards would be a no-brainer.
 

cmdrdredd

Lifer
Dec 12, 2001
My bad, DSR does work with SLI, but it does not work with SLI + a G-Sync monitor for some reason. With just a G-Sync monitor connected, I never seem to get the DSR option in the control panel if SLI is enabled.

Well, that sucks. If I were going for G-Sync I'd go 1440p or higher anyway, personally. Still, I find some games (older titles) play well at 4K.

I was an SLI/CF fiend for years, until I started to wonder whether the added latency was actually worth the frame-rate boost. For me, I decided it was not worth it, not to mention the occasional hassle of unsupported games or junk support. Now, if I were gaming at 4K or something, the extra cards would be a no-brainer.

I still find single cards to be not enough for me at 1440p. I suppose everyone has a different expectation of performance.
 

Maximilian

Lifer
Feb 8, 2004
I've been telling myself "I'll go SLI/CF" since ~2004. I was going to go with 6600GT SLI, but never bothered... X1900XT CF, didn't do that either.

It's just never been worth it when a single next-gen card typically beats or comes close to a dual-GPU last-gen setup. Never mind the issues/bugs and general dicking about that apparently come with dual GPU.
 

Headfoot

Diamond Member
Feb 28, 2008
1) I'm on a UPS.

2) Tomb Raider is hardly GPU intensive. You may as well have posted the Bioshock Infinite power draw graph instead. When you really stress the components, the gulf widens:

[Images: power consumption charts under heavier GPU load]

Literally nowhere but [H] shows power consumption deltas that big. Are they still using stock 290s running at low fan speeds with high stock voltages, hitting 90°C all the time? You can't even buy stock 290(X)s new anymore... Aftermarket ones run at lower voltage, lower temperature, lower noise, and lower power consumption...
 