Dual GPU Maxwell - your thoughts?

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
With Maxwell's perf/watt, seems a dual GPU card is begging to be launched. Interested in seeing one? What's possible? Two fully enabled GM204 cores? Max TDP, clock speeds? Limited to ref PCB/cooling?
 

III-V

Senior member
Oct 12, 2014
678
1
41
With Maxwell's perf/watt, seems a dual GPU card is begging to be launched. Interested in seeing one? What's possible? Two fully enabled GM204 cores? Max TDP, clock speeds? Limited to ref PCB/cooling?
Max TDP would be the usual 300W. At that TDP, double GM204 would be able to achieve max clock speeds and be fully enabled, without much binning effort.

Frankly, I don't care much for dual GPU cards, but there's no doubt that 2xGM204 would make for a killer offering.
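Rough numbers behind the 300W idea, as a back-of-the-envelope sketch in Python (the per-chip TDPs are the published GTX 980/165W and GTX 970/145W figures; treating board power as simply twice the single-GPU TDP, and the 300W budget itself, are assumptions):

```python
# Back-of-the-envelope TDP check for a hypothetical dual-GM204 card.
# Published single-card TDPs: GTX 980 = 165 W, GTX 970 = 145 W.
# Assumes board power ~= 2x single-GPU TDP, which ignores shared PCB overhead.

BOARD_BUDGET_W = 300  # the "usual" dual-card target mentioned above

for name, tdp_w in [("GTX 980", 165), ("GTX 970", 145)]:
    naive_total = 2 * tdp_w
    headroom = BOARD_BUDGET_W - naive_total
    print(f"2x GM204 at {name} clocks: ~{naive_total} W, "
          f"{headroom:+d} W vs a {BOARD_BUDGET_W} W budget")
```

On paper, two fully enabled GM204s at 980 clocks land only slightly over 300W, a far smaller gap than the 690 had to close with its two ~195W GK104s, so a small clock or voltage trim would likely cover it.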
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If GM200 is coming soon, there will likely not be a dual GM204 unless Nvidia makes it a short run.

I personally would not get a dual GPU card, but it'd be nice to see Nvidia offer dual GM204s and GM200s.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
My prediction is that it will have incomplete driver support and give unsatisfactory performance in a wide variety of games. Just like every other dual GPU setup.
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Max TDP would be the usual 300W. At that TDP, double GM204 would be able to achieve max clock speeds and be fully enabled, without much binning effort.

That's what I was thinking. Seems those who want more GPU on a card are thinking big Maxwell.
 
Sep 27, 2014
92
0
0
My prediction is that it will have incomplete driver support and give unsatisfactory performance in a wide variety of games. Just like every other dual GPU setup.

I was under the impression that Dual GPU cards do not have the issues that most SLI/Xfire setups run into, or am I just crazy?

I just want Nvidia to release GM200 already, or at least give some idea of its performance. I certainly wouldn't mind a dual GM204 card, but I would definitely need to see what GM200/dual GM200 is capable of first.
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
I was under the impression that Dual GPU cards do not have the issues that most SLI/Xfire setups run into, or am I just crazy?

Nah, a dual GPU is still a dual GPU. It's still two processors running in SLI, with split memory pools, but instead of using an SLI bridge to communicate they use an on-card connection. In terms of game compatibility they're no better than two cards sat side by side.
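One way to see the "split memory pools" point for yourself: a dual-GPU board just enumerates as two separate devices, each reporting its own memory. A minimal sketch using the NVML Python bindings (the nvidia-ml-py / pynvml package; nothing here is specific to dual-GPU cards, it simply lists whatever devices the driver exposes):

```python
# List GPUs via NVML: a dual-GPU card such as a GTX 690 shows up as two
# devices, each with its own separate memory pool, not one combined pool.
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}, {mem.total / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```

A 690, for instance, would show up as two 2 GiB GPUs rather than one 4 GiB device, because each chip only ever renders out of its own half.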
 
Last edited:
Sep 27, 2014
92
0
0
Nah, a dual GPU is still a dual GPU. It's still two processors running in SLI, with split memory pools, but instead of using an SLI bridge to communicate they use an on-card connection. In terms of game compatibility they're no better than two cards sat side by side.

I did not know that! So with something involving VR like the upcoming Oculus Rift, you would likely be using a single GPU regardless because of stutter...
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
My GTX 690 has been a workhorse, but I'd love two 970s on a single PCB. I could use the extra VRAM.
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
My GTX 690 has been a workhorse, but I'd love two 970s on a single PCB. I could use the extra VRAM.

I've never owned a dual-GPU card, but came close to buying a 690. It was very impressive compared to 680 SLI, pretty much on par. I'd like to see if nV could repeat that.
 

Wild Thing

Member
Apr 9, 2014
155
0
0
I'd love to see two 970 GPUs on a single board. That would have some serious horsepower.
It seems to run cool enough and would still be fairly reasonable on power.
Make it so.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I did not know that! So with something involving VR like the upcoming Oculus Rift, you would likely be using a single GPU regardless because of stutter...

One thing I think is kind of interesting about G-Sync is that most people (myself included) report that games which still stuttered with SLI are now effectively fixed with G-Sync enabled. Far Cry 3, for example, has been one of those games that always ran with a lot of stutter regardless of frame rate; it seemed inherent to the engine, and adding SLI just made the situation worse. It's smooth under G-Sync. VR is definitely going to want a very high frame rate, diminishing the impact of stutter, and ideally would use async technology as well. Then SLI will just bring faster performance and the cadence of the screen will be a non-issue.
 
Sep 27, 2014
92
0
0
One thing I think is kind of interesting about G-Sync is that most people (myself included) report that games which still stuttered with SLI are now effectively fixed with G-Sync enabled. Far Cry 3, for example, has been one of those games that always ran with a lot of stutter regardless of frame rate; it seemed inherent to the engine, and adding SLI just made the situation worse. It's smooth under G-Sync. VR is definitely going to want a very high frame rate, diminishing the impact of stutter, and ideally would use async technology as well. Then SLI will just bring faster performance and the cadence of the screen will be a non-issue.

If I remember correctly, latency was also an issue with multi-GPU setups on the earlier versions of the Rift, but I'm not sure.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think they need to fix SLI first since it looks pretty bad atm. Then do a dual card.

http://www.guru3d.com/articles-pages/geforce-gtx-980-sli-review,9.html

PCPer did a 3-way/4-way SLI review, but no FCAT or FRAPS. I thought they really, really cared about frame times, dropped frames, etc.? Have they already done it elsewhere? Has anyone except Guru3D done any frame analysis?

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GTX-980-3-Way-and-4-Way-SLI-Performance
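For anyone who wants a rough FRAPS-style look at frame pacing without FCAT hardware, here's a minimal sketch: given a list of per-frame render times in milliseconds, it reports average FPS, the 99th-percentile frame time, and the worst frame-to-frame jump. The example numbers are made up purely for illustration:

```python
# Quick-and-dirty frame pacing summary from per-frame times in milliseconds.
# A high 99th-percentile time or big frame-to-frame jumps are what read
# subjectively as stutter, even when the average FPS looks fine.

def frame_pacing_summary(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]
    worst_jump = max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return avg_fps, p99, worst_jump

# Made-up example: mostly ~16.7 ms frames with a few spikes.
times = [16.7] * 95 + [33.0, 40.0, 16.7, 35.0, 16.7]
avg_fps, p99, worst_jump = frame_pacing_summary(times)
print(f"avg {avg_fps:.1f} fps, 99th percentile {p99:.1f} ms, worst jump {worst_jump:.1f} ms")
```

Averages alone hide exactly the behaviour that FCAT and frame time reviews are meant to catch.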
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
Considering the state of SLI and XFire and their actual support in apps and games, and considering the high cost of dual-chip cards, isn't it better for NV and AMD to just do a small batch of a double-sized chip, with the batch sized based on research, surveys, and projections of current dual-chip card sales and of people who run SLI and XFire setups?

Wouldn't that be cheaper, in a few ways, and better, in every way, in the long run?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
More options are welcome in the market. If the costs of launching GM200 are too high for now, it could be beneficial to launch a dual-GPU GM204 and sell even more GM204 chips. It's just business.

I just hope that if NV does this, they put 16GB on the card, because while the 690 is still a powerhouse, it's undermined by only 2GB of VRAM per GPU. For $1000, I would prefer a 990 with dual 970s and 16GB of VRAM rather than a 990 with dual 980s and 8GB, so the 690 situation is not repeated.

As far as SLI/CF goes for dual-GPU setups, PCPer, LinusTechTips, HardOCP, and TechReport have all said that the experience with two GPUs is good. It's when you get into 3-4 GPU setups that scaling and stuttering become problematic. Of course, if GM200 is not far from launching, then that's preferable.
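On the VRAM point: in alternate-frame rendering (the usual SLI/CF mode), each GPU holds its own full copy of the working set, so the usable pool is the per-GPU memory, not the total printed on the box. A tiny sketch of that arithmetic (the configurations are just illustrative, including the hypothetical 16GB card):

```python
# In alternate-frame rendering, VRAM is mirrored: usable memory per frame
# is the per-GPU amount, not the sum across both GPUs.
configs = [
    ("GTX 690 (2 x 2 GB GK104)", 2, 2),
    ("hypothetical 990 with dual 980s (2 x 4 GB)", 2, 4),
    ("hypothetical 16 GB dual card (2 x 8 GB)", 2, 8),
]

for name, gpus, per_gpu_gb in configs:
    marketed = gpus * per_gpu_gb
    print(f"{name}: {marketed} GB on the box, ~{per_gpu_gb} GB usable per frame")
```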
 
Last edited:

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
I've been asking if there was going to be a GM106 GeForce GTX 750 Ti Boost w/ SLI support.

A dual GM204 GeForce GTX 990 4GB is likely coming next spring, or whenever AMD releases the Radeon R9 390/X cards.
 
Sep 27, 2014
92
0
0
More options are welcome in the market. If the costs of launching GM200 are too high for now, it could be beneficial to launch a dual-GPU GM204 and sell even more GM204 chips. It's just business.

I just hope that if NV does this, they put 16GB on the card, because while the 690 is still a powerhouse, it's undermined by only 2GB of VRAM per GPU. For $1000, I would prefer a 990 with dual 970s and 16GB of VRAM rather than a 990 with dual 980s and 8GB, so the 690 situation is not repeated.

As far as SLI/CF goes for dual-GPU setups, PCPer, LinusTechTips, HardOCP, and TechReport have all said that the experience with two GPUs is good. It's when you get into 3-4 GPU setups that scaling and stuttering become problematic. Of course, if GM200 is not far from launching, then that's preferable.

I agree, 16GB would be preferable, but one shudders to think of the price tag. Honestly, the 980 really seems like a stop-gap measure before GM200. I would guess (hope) GM200 is closer to launch than a 990.

I've been asking if there was going to be a GM106 GeForce GTX 750 Ti Boost w/ SLI support.

A dual GM204 GeForce GTX 990 4GB is likely coming next spring, or whenever AMD releases the Radeon R9 390/X cards.

Perhaps, but you would think GM200 would be the single-GPU competitor to the 390X, whereas a 990 would be targeting (hopefully) the $1000 dual-GPU price point that's been dominated by the 295X2... who knows, though...
 
Last edited: