Primarily Gaming Purposes - GTX 970 vs GTX 980 vs GTX 780Ti - 2/3-Way SLI?

Page 3 - AnandTech Forums

GTX 980 vs GTX 970 vs GTX 780Ti - 2/3-Way SLI?

  • GTX 980 2 Way SLI

  • GTX 980 3 Way SLI

  • GTX 780 2 Way SLI

  • GTX 780 3 Way SLI

  • GTX 970 3 Way SLI


Results are only viewable after voting.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
For a value set-up I don't doubt the utility of cheaper cards.

My inquiry is purely an academic one: How much of the poor scaling on 980 GTX is simply a result of the resolution not being challenging enough? How would triple SLI 980 GTX scale with triple 4K monitors as an extreme example?

No doubt reviews running tri-SLI @ low res are worthless. An exception would be competitive FPS, where people are trying to maintain 120fps. In that case high detail settings aren't as important, and a single card can likely still do it, assuming not too much CPU overhead.

I still want a site to take the dropped frames into account and show us observed FPS. Given how important FCAT testing is supposed to be, why aren't any of the FCAT proponents asking for it now, even after we've seen there's a problem?

I'm going to put forward that there's actually a concerted effort by the press to obscure the issue.
[Chart: GTX-980-SLI-66.jpg]


When capturing output frames in real-time, there are a number of eccentricities which wouldn’t normally be picked up by FRAPS but are nonetheless important to take into account. For example, some graphics solutions can either partially display a frame or drop it altogether. While both situations may sound horrible, these so-called “runts” and dropped frames will be completely invisible to someone sitting in front of a monitor. However, since these are counted by the software as full frames, FRAPS tends to factor them into the equation nonetheless, potentially giving results that don’t reflect what’s actually being displayed.

With certain frame types being non-threatening to the overall gaming experience, we’re presented with a simple question: should the fine-grain details of these invisible runts and dropped frames be displayed outright, or should we show a more realistic representation of what you’ll see on the screen? Since Hardware Canucks is striving to evaluate cards based upon an end-user experience rather than from a purely scientific standpoint, we decided on the latter of these two methods.

With this in mind, we’ve used the FCAT tools to add the timing of partially rendered frames to the latency of successive frames. Dropped frames meanwhile are ignored as their value is zero. This provides a more realistic snapshot of visible fluidity.
We actually have HWC obscuring the dropped frames by adding them to the latency of the next frame rather than reporting them. They actually say they don't matter. WTF!?!
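For clarity, here is a minimal Python sketch of the processing HWC describes (the function name and the runt threshold are my own assumptions, not their actual tooling):

```python
def merge_runts_and_drops(frame_times_ms, runt_threshold_ms=1.0):
    """Post-process FCAT per-frame display times the way HWC describes:
    runts (frames shown only partially) have their on-screen time folded
    into the next full frame's latency; dropped frames (0 ms on screen)
    are discarded outright."""
    cleaned = []
    carry = 0.0
    for t in frame_times_ms:
        if t == 0.0:
            continue  # dropped frame: never displayed, so it is ignored
        if t < runt_threshold_ms:
            carry += t  # runt: its timing is added to the successive frame
            continue
        cleaned.append(t + carry)
        carry = 0.0
    return cleaned
```

Note how a trace full of zeros produces the same clean-looking output as one with none, which is exactly the complaint here.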
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Isn't it <30 ms that you actually see dropped frames? If so, what is the problem with the above chart?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
No doubt reviews running tri-SLI @ low res are worthless. An exception would be competitive FPS, where people are trying to maintain 120fps. In that case high detail settings aren't as important, and a single card can likely still do it, assuming not too much CPU overhead.

I still want a site to take the dropped frames into account and show us observed FPS. Given how important FCAT testing is supposed to be, why aren't any of the FCAT proponents asking for it now, even after we've seen there's a problem?

I'm going to put forward that there's actually a concerted effort by the press to obscure the issue.

We actually have HWC obscuring the dropped frames by adding them to the latency of the next frame rather than reporting them. They actually say they don't matter. WTF!?!

Wow, that is crazy considering the heat AMD took over their CrossFire issues (which they since fixed) was largely about runt/dropped frames. I guess HWC is stacking the deck in nvidia's favour here, for whatever reason.
 

sidrockrulz

Member
Sep 26, 2014
103
0
0
Here is the GTX 980 2-way vs. 3-way SLI comparison from the Guru3D review Wanderer linked:

[benchmark chart]


34% increase in FPS with a 50% increase in GPU. (Not bad, and I would imagine the scaling would be even closer to 50% with triple 1440p monitors compared to a single 4K monitor)

[benchmark chart]


9% scaling from 50% increase in GPU. Not so good, but would also improve at triple 1440p.

[benchmark chart]


24% scaling from 50% increase in GPU.

[benchmark chart]


3% scaling from 50% increase in GPU. Not good.

[benchmark chart]


2% scaling from 50% increase in GPU.

[benchmark chart]


39% scaling from 50% increase in GPU. Pretty good scaling here already at 4K.

[benchmark chart]


44% scaling from 50% increase in GPU. I'd imagine this would approach perfect scaling on triple 1440p.


[benchmark chart]


22% scaling from 50% increase in GPU.

Overall, it looks like only 2 or 3 of those games might not scale well (using 3x GTX 980) with an increase in resolution to triple 1440p... but I could be mistaken.


Overall, I would think that's a solid increase!

This makes me happy ;) :p

I guess the games that did not really show an increase are probably limited somewhere other than the GPU?
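The scaling percentages quoted above are just FPS ratios between the 2-way and 3-way results. As a sketch (the helper name is mine):

```python
def scaling_percent(fps_baseline, fps_upgraded):
    """FPS gain of the upgraded setup over the baseline, in percent.
    Going from 2-way to 3-way SLI adds 50% more GPU hardware,
    so ideal scaling would show +50% here."""
    return (fps_upgraded / fps_baseline - 1.0) * 100.0
```

For example, a 2-way result of 100 FPS and a 3-way result of 134 FPS gives the "34% increase" case above.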
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Isn't it <30 ms that you actually see dropped frames? If so, what is the problem with the above chart?

The problem with the above chart is they are manipulating the results to conceal the dropped frames. According to HWC they don't matter at all. (What???) Dropped frames are dropped frames. They need to be subtracted from the avg FPS graphs because those frames aren't there. Also, whether or not you notice the dropped frames depends on how fast you are rendering. Recall that before we had FCAT, good review sites that actually played games rather than just running 30-second canned benchmarks, [H] for example, would report that they needed higher FPS to get smooth gameplay from CrossFire than from SLI. [H] has most recently reported the opposite: that CrossFire appears smoother.
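An "observed FPS" of the kind being asked for falls straight out of an FCAT trace, where dropped frames show up as 0 ms of display time. A sketch, with names of my own choosing:

```python
def fps_metrics(frame_times_ms):
    """Return (naive_fps, observed_fps) for a capture.
    naive_fps counts every frame the game submitted (FRAPS-style);
    observed_fps counts only frames that actually reached the screen."""
    total_ms = sum(frame_times_ms)
    shown = sum(1 for t in frame_times_ms if t > 0.0)
    naive_fps = len(frame_times_ms) * 1000.0 / total_ms
    observed_fps = shown * 1000.0 / total_ms
    return naive_fps, observed_fps
```

With 2 of every 7 frames dropped, the two numbers diverge sharply, and that gap is what the average-FPS graphs hide.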

The fact that sites aren't reporting it (PCPer did an SLI review where they didn't even report the FCAT results) makes me believe they are purposely overlooking it. Consider that they were the ones who quietly got the FCAT equipment from nVidia and reported results with no transparency at all. Now they aren't even using it on nVidia cards where others have reported dropped frames? If you look at some of their FRAPS graphs, they are pretty bad. Also notice there are no AMD cards for comparison. I haven't seen any site compare CrossFire vs. SLI measured with FRAPS or FCAT in any Maxwell review. There might be one out there, but I haven't seen it. If someone knows of one, please post the link for me.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GTX-980-3-Way-and-4-Way-SLI-Performance

AMD was lambasted over the 7970. We were told avg. FPS wasn't what we should base our buying decisions on; it was outdated and not a relevant means of comparing GPU performance. Slower and smoother was more important than raw FPS. Now it's avg. FPS at lower, CPU-bound resolutions and power consumption that we should base our entire buying decision on. We see Mantle offer much higher minimums, and that doesn't matter. We see far superior perf/$, especially in high-res multi-card setups, and that doesn't matter. We see dropped frames in SLI, and that doesn't matter. I expect that from a company's fan base (us here); it's human nature to overlook flaws in things we are fond of. I expect more from the tech sites we rely on for our info, especially when the well-being of corporations and their employees is affected.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
The problem with the above chart is they are manipulating the results to conceal the dropped frames. According to HWC they don't matter at all. (What???) Dropped frames are dropped frames. They need to be subtracted from the avg FPS graphs because those frames aren't there. Also, whether or not you notice the dropped frames depends on how fast you are rendering. Recall that before we had FCAT, good review sites that actually played games rather than just running 30-second canned benchmarks, [H] for example, would report that they needed higher FPS to get smooth gameplay from CrossFire than from SLI. [H] has most recently reported the opposite: that CrossFire appears smoother.

The fact that sites aren't reporting it (PCPer did an SLI review where they didn't even report the FCAT results) makes me believe they are purposely overlooking it. Consider that they were the ones who quietly got the FCAT equipment from nVidia and reported results with no transparency at all. Now they aren't even using it on nVidia cards where others have reported dropped frames? If you look at some of their FRAPS graphs, they are pretty bad. Also notice there are no AMD cards for comparison. I haven't seen any site compare CrossFire vs. SLI measured with FRAPS or FCAT in any Maxwell review. There might be one out there, but I haven't seen it. If someone knows of one, please post the link for me.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GTX-980-3-Way-and-4-Way-SLI-Performance

AMD was lambasted over the 7970. We were told avg. FPS wasn't what we should base our buying decisions on; it was outdated and not a relevant means of comparing GPU performance. Slower and smoother was more important than raw FPS. Now it's avg. FPS at lower, CPU-bound resolutions and power consumption that we should base our entire buying decision on. We see Mantle offer much higher minimums, and that doesn't matter. We see far superior perf/$, especially in high-res multi-card setups, and that doesn't matter. We see dropped frames in SLI, and that doesn't matter. I expect that from a company's fan base (us here); it's human nature to overlook flaws in things we are fond of. I expect more from the tech sites we rely on for our info, especially when the well-being of corporations and their employees is affected.

I agree, average FPS isn't the best measure of GPU performance. But from the charts above, it doesn't look as though the games weren't smooth. The 7970 had a blur of thick lines going anywhere from 25-55ms, which showed unsmooth gameplay.

Are we getting reports of unplayable SLI, or just that CF is better? Everyone on the forum knows your preferred manufacturer, but aren't you making mountains out of molehills here, really?
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I agree, average FPS isn't the best measure of GPU performance. But from the charts above, it doesn't look as though the games weren't smooth. The 7970 had a blur of thick lines going anywhere from 25-55ms, which showed unsmooth gameplay.

Are we getting reports of unplayable SLI, or just that CF is better? Everyone on the forum knows your preferred manufacturer, but aren't you making mountains out of molehills here, really?

So you think the coverage is balanced? I think we all have our preferred brands. ;)


Edit: Did you look at the FRAPS results from the PCPer link? Pretty blurry.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
The fact that sites aren't reporting it (PCPer did an SLI review where they didn't even report the FCAT results) makes me believe they are purposely overlooking it. Consider that they were the ones who quietly got the FCAT equipment from nVidia and reported results with no transparency at all. Now they aren't even using it on nVidia cards where others have reported dropped frames? If you look at some of their FRAPS graphs, they are pretty bad. Also notice there are no AMD cards for comparison. I haven't seen any site compare CrossFire vs. SLI measured with FRAPS or FCAT in any Maxwell review. There might be one out there, but I haven't seen it. If someone knows of one, please post the link for me.

What? They compared frame times against AMD cards in their original 980 review...

http://www.pcper.com/reviews/Graphi...70-GM204-Review-Power-and-Efficiency/Crysis-0

I didn't see any glaring issues. There are cases where the R9s are smoother, but that goes for the 980s as well. Even when the 980s are behind 290 CF, it's nowhere near the level the 7970s used to be at.

Tri-SLI and beyond is where the issues seem to be; frame times are a lot worse there. Maybe that's what you are referring to, but it wasn't clear.
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
So you think the coverage is balanced? I think we all have our preferred brands. ;)

Edit: Did you look at the FRAPS results from the PCPer link? Pretty blurry.

Well said. I am quite sure that the emphasis on frametimes in SLI has fallen, or at least is not being pushed with the same vigour as when the HD 7970 issues were highlighted. HardOCP was the most vocal in recent times that XDMA CF is superior to SLI.

http://www.hardocp.com/article/2014...r9_290x_4gb_overclocked_at_4k/11#.VE3ZCXutHhA

To give you a comparison, look at the BF4 frametimes at 4K on the R9 295X2; the GTX 980 3-way and 4-way SLI results are compared here. The 4-way scaling is broken for SLI while XDMA CF scales almost perfectly.

http://www.pcper.com/reviews/Graphi...e-4K-Quad-Hawaii-GPU-Powerhouse/Battlefield-4

"Our first set of results come from Battlefield 4 running in DirectX mode. At 4K and running on the Ultra preset, we see an impressive scaling rate of 81% by adding in the second pair of GPUs to the mix. Frame time variance does increase a bit with the quad GPU configuration but it doesn't become a bother really until you hit the 95th percentile of frame times when you exceed 4ms. You can see at the 50 second mark in our test where there is some stutter occurring, but with a frame rate exceeding 80 FPS at 4K, it's hard not to walk away impressed."

http://www.pcper.com/reviews/Graphi...3-Way-and-4-Way-SLI-Performance/Battlefield-4

"BF4 is our first indication that scaling beyond two graphics cards in SLI is going to be problematic. Here you can see that we jump up from 58 to 94 FPS on average going from a single GTX 980 to a pair of them, but the move to a third card only scales to 110 FPS; but the story is much worse than that. The Frame Times graph indicates that the average FPS doesn't mean much, as the frame time consistency is very poor, resulting in wildly fluctuating frame rates. The exact same thing occurs with 4 GPUs as well. A look at the Frame Variance graph tells the story from another angle: both 3-Way and 4-Way SLI are seeing more than 3ms of frame time variance (from frame to frame) for more than 20% of the total frames being rendered!

The results at 4K are much the same as the 2560x1440 results above: 2-Way SLI works very well but both 3-Way and 4-Way are poor experiences."

http://www.pcper.com/reviews/Graphi...70-GM204-Review-Power-and-Efficiency/Battlefi

"At 4K the relative performance delta actually grows, with the new GM204 GPU bringing in an average frame rate that is 13% faster than the R9 290X from AMD."

http://www.pcper.com/reviews/Graphi...70-GM204-Review-Power-and-Efficiency/Battle-0

"Our GTX 980 results show solid and consistent scaling in BF4 with minimal frame time variance and no stutter. For NVIDIA, the experience is actually slightly more erratic at 4K than with Hawaii, indicated by the slightly more variable orange line in our Frame Times graph."

So the GTX 980 starts out faster at single GPU with similar smoothness. At 2-way, GTX 980 SLI is slightly ahead in raw fps and slightly behind in frametime variance. Finally, at 4-way, GTX 980 SLI is crushed by R9 295X2 CF in fps and frametimes. Ironically, there was no comparison of GTX 980 tri and quad SLI against R9 295X2 + R9 290X tri-CF or dual R9 295X2 quad-CF in PCPer's review. Nvidia's solutions need to be benchmarked against the competition and not against themselves.
 
Last edited:

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Well said. I am quite sure that the emphasis on frametimes in SLI has fallen, or at least is not being pushed with the same vigour as when the HD 7970 issues were highlighted. HardOCP was the most vocal in recent times that XDMA CF is superior to SLI.

Well, that's probably because the issues only seem to affect 3-way and 4-way SLI. Sites are going to cater to their reader base, and far fewer people are using GPU setups beyond two cards. I'm not saying the issue should be overlooked, but I think it's partially down to priorities.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What? They compared frame times against AMD cards in their original 980 review...

http://www.pcper.com/reviews/Graphi...70-GM204-Review-Power-and-Efficiency/Crysis-0

I didn't see any glaring issues. There are cases where the R9s are smoother, but that goes for the 980s as well. Even when the 980s are behind 290 CF, it's nowhere near the level the 7970s used to be at.

Tri-SLI and beyond is where the issues seem to be; frame times are a lot worse there. Maybe that's what you are referring to, but it wasn't clear.

Thanks for linking that. Still no FCAT, though, which I thought was the only definitive test because it actually measures the output from the card, whereas FRAPS doesn't.

When comparing to the 7970 remember that the fps are much higher with these cards, so the frame times will naturally be lower. Nobody complained about crossfire either at frame rates as high as they are in these reviews.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Thank you...

But as user cbn says, does the scaling only apply to 4K?

He's merely speculating, and it does apply to all resolutions, not just 4K (notice how he doesn't have any sources). I haven't seen anything to indicate that going beyond 4K would increase scaling, and it's already very poor scaling with 3x or 4x SLI even in demanding games. Even if it goes up a few percent at 3x 1440p, it's still very poor value. It won't change the situation: you'll still pay a lot for smaller gains with the 3rd and 4th card. If you are fine with that, it's your choice.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
What? They compared frame times against AMD cards in their original 980 review...

http://www.pcper.com/reviews/Graphi...70-GM204-Review-Power-and-Efficiency/Crysis-0

I didn't see any glaring issues. There are cases where the R9s are smoother, but that goes for the 980s as well. Even when the 980s are behind 290 CF, it's nowhere near the level the 7970s used to be at.

Tri-SLI and beyond is where the issues seem to be; frame times are a lot worse there. Maybe that's what you are referring to, but it wasn't clear.

Aren't these glaring issues with dual-card SLI? Check out the FCAT analysis. Isn't this precisely what was being targeted with the 7970, when everyone was suggesting the slower and more expensive 680s due to dropped frames? Why is it any different now?

http://www.guru3d.com/articles-pages/geforce-gtx-980-sli-review,9.html

I am surprised more sites don't investigate this thoroughly, especially the sites which investigated 7970 CrossFire so well. I guess they are giving NV time to fix it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well that's probably because the issues only seem to be affecting 3-Way and 4-Way SLI. Sites are going to cater to their reader base, and far less people are using GPU setups beyond two cards. I'm not saying the issue should be overlooked, but I think it's partially down to priorities.


These are 2-way SLI results for both the 980 and the 970 from Guru3D. This is enough for me to not be able to recommend Maxwell SLI until it's fixed, especially if you account for the price premium over the 290/290X.
[Six FCAT frame-time charts from Guru3D: GTX 980 SLI and GTX 970 SLI]


These are FCAT charts, not doctored, unlike what Hardware Canucks did to theirs. Those downward drops to 0 are where nothing is being sent to the screen when something should be.

Now, they also pass it off as not visible, which technically is true. Since the frames are dropped and not rendered, there isn't anything to see (pretty tricky, eh?). It does cause visible stuttering if the FPS drops enough. None of these benches run slowly enough to stutter, with the exception of Thief. They did notice stutters in Thief, but they said it wasn't due to the dropped frames. They gave no explanation of what was causing it, though, except that "there was a lot going on on screen". What that has to do with stuttering, I don't know.

Keep in mind these are also only 30-second benchmarks, not the entire game. I wouldn't bet $700 to $1100 that everything will run smoothly over actual hours of gaming. This is also consistent across all the games they tested, not just an outlier in a single game, so it's definitely not the fault of the games. There is a driver or hardware issue. Likely driver, but nVidia claims "superior hardware frame metering", and seeing this also makes me question those claims.

Now, when this was pointed out on AMD cards AMD admitted it and vowed to correct it. For the most part they did too. I'm waiting for something from nVidia.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
The BioShock frame times over at Guru3D are similar to PCPer's results, too. But the other games PCPer tested show no issues like that, so it seems to be game-specific or maybe engine-specific. Notably, Crysis 3 shows no issues even at 4K, and it's one of the most demanding games out there. BioShock was UE3, and so was Thief; not sure about Tomb Raider?

I'm not trying to downplay the issue, it should be looked into - but comparing it to the 7970 situation seems a bit overblown to me, that would suggest Nvidia's frame metering is completely broken which it doesn't seem to be. I had a 7970 CF setup and games consistently ran terrible on it when I started pushing the cards below 60 FPS.

It would be good if one of the sites re-ran the tests on those games with the new driver set.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
He's merely speculating, and it does apply to all resolutions, not just 4K (notice how he doesn't have any sources). I haven't seen anything to indicate that going beyond 4K would increase scaling, and it's already very poor scaling with 3x or 4x SLI even in demanding games. Even if it goes up a few percent at 3x 1440p, it's still very poor value. It won't change the situation: you'll still pay a lot for smaller gains with the 3rd and 4th card. If you are fine with that, it's your choice.

If a resolution and/or detail settings are not challenging enough adding in extra GPU power is not going to increase FPS very much (and maybe in some cases not at all). That is just a general rule we all know about and it applies to single cards as well.

How do we know this is not the case with 980 GTX Triple SLI at 4K?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The BioShock frame times over at Guru are similar to PCPer's results, too. But the other games PCPer tested show no issues like that, so it seems to be game-specific or maybe engine specific. And notably, Crysis 3 shows no issues even at 4K, which is one of the most demanding games out there. BioShock was UE3 - so was Thief, not sure about Tomb Raider?

I'm not trying to downplay the issue, it should be looked into - but comparing it to the 7970 situation seems a bit overblown to me, that would suggest Nvidia's frame metering is completely broken which it doesn't seem to be. I had a 7970 CF setup and games consistently ran terrible on it when I started pushing the cards below 60 FPS.

It would be good if one of the sites re-ran the tests on those games with the new driver set.

I would agree that we need more sites to run these tests. For some unexplained reason, sites that typically do aren't doing so with Maxwell. That actually gives me more reason to pause.

As for other sites showing these issues: if they aren't running FCAT, they can't show them. FCAT measures the actual output from the card; FRAPS does not. FRAPS can show you that something is up, but it doesn't show you the final output, where a problem could be fixed or something new could show up.
 

sidrockrulz

Member
Sep 26, 2014
103
0
0
You think 980 SLI will provide better gameplay than 970 SLI at 3x 1440p over 4 years? The extra 13-17% performance is going to barely make a dent at such resolutions in modern games. As for 980 SLI outliving 970 SLI over 4 years, that's not going to happen either. Every 3 years GPU performance increases about 2-2.3x, with an annualized growth rate of about 30-33%. Think of it this way:

970 SLI = 88%
980 SLI = 100%

In 3 years = Insert Card ABCD SLI = 200-233%
In 4 years = Insert Card ABCD SLI = 260-303%

That means in roughly 2 years there will be a card ~60-66% faster than the 980 for $550. That shouldn't be surprising, since that's about the time NV's Pascal should launch.
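As a quick sanity check on that compounding assumption (my own arithmetic, not benchmark data), a 2-2.3x gain over three years corresponds to roughly 26-32% per year:

```python
def annualized_growth(total_factor, years):
    """Equivalent compound annual growth rate, in percent,
    for a total performance multiple reached over `years` years."""
    return (total_factor ** (1.0 / years) - 1.0) * 100.0
```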

For this reason, I would not buy any $1100 GPU setup and keep it for 4+ years (unless some factor like bitcoin mining made the cards free). By far the better strategy, unless you are in the top 1% or upgrade frequently, is to buy NV's 2nd best and upgrade more often. The 480/580/680 didn't last any longer than the 470/570/670. Would you say a $500 580 is a fast card today? No, you wouldn't, because a $200 R9 280X is 40% faster just 4 years later.

GPU progress is just too fast, and prices drop too rapidly, to justify overspending $300-400 for a 7-10% gain in performance, when that $300-400 nets you 50% more performance in 2 years, when it comes time to resell your old card(s).

Even if you forget the 970 for a second, 780Ti is $360-380 on many occasions:
http://www.ncixus.com/products/?sku=96778&promoid=1413

Think about it:

Dual 780Tis are going to be $750 or less.
Dual 980s are going to be $1100 at least.
Performance gain is 5-10%, maybe 20% if you get poor 780Ti overclocking.

As I already told you, your strategy of buying flagship cards and keeping them without upgrading for 4 years is a poor one, to put it mildly. It's far better to buy dual 970s/780Tis, sell them in 24 months, and get something faster with the money saved plus the resale value, instead of dumping $1100 into 980s today. The people on our forum who buy 980 SLI will upgrade in 2-2.5 years because they know that if they keep such a card too long, it'll drop to $175 in resale value by the 4-year mark. Since you won't be gaming on a single 1080p monitor and are considering a 3x 1440p setup, you will want a pretty fast GPU setup over the next 4-5 years, not just over the first 2 of those years. For that reason, while 980 SLI will be good for the first half of your timeframe, it will become far too slow for the remainder. But if you buy 970 SLI/780Ti SLI, you will get most of that performance in the first half of your ownership and then, at the halfway mark, be able to sell those cards and get a lot more of the performance you'll need. Another way to put it: 980 SLI will hardly provide better playability than the 970 SLI/780Ti setups, other than a few small settings or bumps in AA, but by the time your 970 SLI becomes too slow, 980 SLI will be just as unplayable, forcing you to upgrade or sacrifice IQ settings by a large degree.

It's odd to me that you can't decide between the 4790K and the 5820K to keep for 4 years, when the cost difference between the platforms is $150-170 at MC, yet you are so eager to spend $350 extra for just the 5-10% more performance 980 SLI will offer over 780Ti SLI. And if you want to add a 3rd card down the line, it would be better to sell the 980s and buy dual GM200s than to buy a 3rd 980 in 2 years. Not to mention that going to a 3rd card requires a more expensive Z97 PLX board, which destroys the value proposition of the 4790K over the 5820K in the first place.

Now if you go dual 970s or dual 780Ti, you don't even have to worry about Tri-SLI scaling or frame stutter issues since you will be ready to upgrade in 2-2.5 years to another pair of fast single GPUs. This way in your 3rd and 4th year, you will get 100-200% more performance than 980 SLI, exactly when you'll want more performance for 2017-2018 games.



I am actually very disappointed that so many of our forum members recommended such an overpriced setup given the recent adjustments in pricing. Looking at the benchmarks of 780Ti SLI vs. 980 SLI, unless one can't sleep at night knowing their GPU isn't the best, it's a huge waste of money to get $1100 980 SLI over $720 780Ti SLI or $660 970 SLI in the context of not upgrading the GPU setup at all over a 4-year period.

This forum has really changed since I joined. People now complain about 100-200W of extra power usage for GPUs due to electricity costs and heat, and yet are willing to recommend someone spend 50% more for 5-10% more performance, ignoring $$$ entirely. It's a pure contradiction. It's almost as if the money we earn/save for a future upgrade path to buy newer products is the least relevant aspect now, behind features, perf/watt, etc. At that point, why aren't all of us gaming on quad-SLI 980s with a 5960X? I thought the whole point of a PC forum was to help people find the best balance of $ spent on good components to get a good gaming experience, and not overspend so much that nothing is left for the next upgrade, rather than to recommend that someone simply buy the best. If someone just wanted to buy the best components all the time, what's even the point of a forum other than bragging? Even during 470 vs. 480 or 570 vs. 580, the price disparity was more reasonable and the performance delta was greater than between a $360 780Ti and a $550 980.

I don't need someone to tell me that quad-SLI 980s and a 5960X, all water-cooled, is the best. If a gamer can't afford quad-SLI 980s and a 5960X, then obviously money is a factor, and we have to discuss how to spend it effectively. To each his own, but I'll just keep a mental note of how fast GPUs get by October 25, 2018 and what it costs to buy a card with the 980's performance at that time. Normally I would have no problem recommending 980 SLI if the gamer is willing to upgrade in 2-2.5 years, but over 4 years I cannot recommend such a setup. It's simply a waste of money because it's not an efficient way to keep your gaming PC up to date. It's basically suggesting that the GPU future-proofing model is superior to upgrading more frequently, an idea which has failed to hold true for the last 30 years.


This is exactly what I was thinking...

I am staying away from the 780Ti as it's not available in a reference design anymore, so watercooling is out... and I would like to...

So... you would suggest GTX 970 2-way SLI, and upgrading to the 200 architecture when it comes out?


Also, the reason I would consider a PLX board is that if I upgrade the processor to a Broadwell K, it may support enough lanes for a 3-way setup in the future, maybe?
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Take a look at the 4K [H] 980 SLI review. It's pretty brutal: the only thing 980 SLI can win at 4K is power consumption. Imagine a higher resolution; it will get worse, since 980 SLI is stronger at low resolution and drops off as resolution goes up, whereas the 290/290X are slower at low resolution (vs. the 970/980) and become stronger competitors at higher resolutions. The 980 SLI setup is 83% more expensive yet can't even offer a superior experience.

You may want to consider the 290/290X. Are they not a consideration? (Sorry if I missed it.)
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
[benchmark chart]


2% scaling from 50% increase in GPU.

Regarding the GTX 980 SLI scaling in the above graph, it obviously did not do well with tri-SLI.

But how do I know this isn't a CPU bottleneck rather than poor scaling of the GTX 980?
 

sidrockrulz

Member
Sep 26, 2014
103
0
0
Nvidia always had poor scaling above 2-way SLI configurations. I remember when I had the Valley thread at OCN; those were my scaling percentages with GK104 and GK110. Nothing much has changed with Maxwell, apparently.

Thank you for the graphs... But now that we have the 290X in the picture, let me ask: I don't really see 290X CF performing any better than 970 SLI... and the 970 has better features and lower temps (which allow a better OC).

So... which would you put your money on? Why?