[PCPER] GTX 970 SLI (3.5GB effect)

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
It does look brutal once you start to utilize that slow 512 MB of VRAM.

[Image: BF4_3840x2160_PLOT_0.png]


Compare 980 SLI to 970 SLI

[Image: 980-BF4_3840x2160_STUT_0.png]


[Image: BF4_3840x2160_STUT_0.png]



http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GTX-970-Memory-Issued-Tested-SLI

NVIDIA has also dropped the ball on SLI; those frametimes are terrible, especially coming after the huge SLI smoothness campaign. :eek: The 980 SLI isn't even very good in those games, and the 970 SLI is terrible.

Note that this is coming from the site which first published FCAT results (tarnishing CrossFire while touting SLI, all while presenting themselves as independent) and hid the fact that the tool came directly from NVIDIA, which means they were basically doing NV's PR. They later admitted that FCAT was indeed provided by NVIDIA, which puts the site's objectivity in question. The smoothness situation seems to have reversed, with XDMA now being the superior solution overall.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
[Image: WIkk9w0.jpg]


Surprised to see such terrible frame times in BF4 & COD: AW; those games don't have the ultra textures to truly load the segmented VRAM partitions.

[Image: UNGhrzp.png]



If they did FCAT on SLI in Mordor..
[Image: mhOevfQ.png]


http://www.computerbase.de/2015-01/geforce-gtx-970-vram-speicher-benchmarks/2/

Holy mother of gawd, that's just unacceptable. Multiple SECONDS of stutter.
[Image: hEFOnOG.jpg]

To be fair, they are using settings that probably require more than 4GB of VRAM, so the stutters are understandable.

That being said, the 970s seem to stutter more even at lower settings. A prime example is BF4 at 140%: the 980s look relatively smooth throughout, the 970s do not.
 

Riceninja

Golden Member
May 21, 2008
1,841
3
81
This is truly damning. I know most 970 owners aren't currently affected and think all this whining is overblown, but when the 2015+ games coming out hit the 3.5GB wall (either from high-res textures or just lazy coding), there are going to be a lot of very unhappy people.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
To be fair, they are using settings that probably require more than 4GB of VRAM, so the stutters are understandable.

That being said, the 970s seem to stutter more even at lower settings. A prime example is BF4 at 140%: the 980s look relatively smooth throughout, the 970s do not.

The 980s are worse than the 970s at 140%.

[Image: 980-BF4_3840x2160_PLOT_0.png]


The 970s have greater variance, but the 980s have big spikes.

The 980s also stutter like mad at 150%.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Hopefully PCPer will re-run these benchmarks when NVIDIA releases its remedy driver, whatever impact it may have.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
To be fair, they are using settings that probably require more than 4GB of VRAM, so the stutters are understandable.
Actually, it appears only the 980 exceeds 4GB in their BF4 test. The 970 stays below 4GB.

[Image: bf4mem.png]




That being said, the 970s seem to stutter more even at lower settings. A prime example is BF4 at 140%: the 980s look relatively smooth throughout, the 970s do not.
 

pablopg69

Junior Member
Feb 3, 2015
1
0
6
Hopefully PCPer will re-run these benchmarks when NVIDIA releases its remedy driver, whatever impact it may have.


Didn't they just say that there is no driver incoming? Also, a driver update wouldn't help much, since the issue with the 970 is on the hardware side.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Not defending NVIDIA... it seems like they pushed the cards beyond their limits. Looking at the FPS charts, even the baseline makes me think so; there's pretty much no grunt left anyway. I'd like to see other cards subjected to the same testing before making a final judgment.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Nice that they are investigating this in actual gameplay. I'd like to see other sites do this same type of real-world evaluation.

We know that the 970 is an oddball card, and we know some information about the oddity. We know a bit about the nuts and bolts thanks to sites like AnandTech, and we know what nVidia says about it.

Picking up where that left off, it's nice that PCPer and a few users are actually testing how the 970's VRAM configuration holds up once the rubber hits the road.

SLI 970s are not a good buy now that this info is out. A single 970 looks much more solid for the uses it will actually be put to. I'm reading a lot about significant stutters on 970s (including single cards) once the card is pushed above 3.5GB. That's what kills gameplay, and it's what I'd like to see more of from different reviews.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Pretty much everybody who wants to play at 4k with some eye candy on top.



Is this a trick question?

Are people actually PLAYING it at that resolution with those settings? Just because that is how it is benched does not mean that is how it is used.

It seems a majority of gamers use SLI/CrossFire to achieve playable ultra settings at 4K, or they drop settings down. I wouldn't play at a 30 fps average (a stutterfest is guaranteed whenever you dip to 20 fps or fewer), no matter how gorgeous it looked; it'd get me killed, so nope. I'd drop the settings until it was playable, as would almost everybody.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
It appears to take some effort to bench it, since NV has aggressively tried to ensure you cannot use that slow VRAM. Adding it just for the marketing numbers (and then outright lying about the specs for months) appears to have bitten them badly. First there was outright denial, then NV admitted to the misinformation and wrong specs, and now there is hard proof that using that 512 MB of VRAM is clearly detrimental.

Why the deflection? Now that sites are investigating, and in this very clear example with both the 970 and the 980 in SLI, the 970 is getting hammered. It will only become more prevalent as games require more VRAM; SLI and high-resolution users will get the worst side effects.

The point -> 970 + 0.5GB slow VRAM = massive potential frametime spikes once it's in use!
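For anyone wondering just how lopsided the two pools are, here is a rough back-of-the-envelope sketch based on the corrected specs NVIDIA disclosed (seven 32-bit memory controllers behind the fast 3.5GB segment, a single one behind the last 0.5GB); the figures are approximations, not measurements:

```python
# Rough sketch of the GTX 970's segmented memory (approximate figures derived
# from NVIDIA's corrected specs, not measurements).
GBPS_PER_CONTROLLER = 7.0 * 32 / 8   # 7 Gbps effective per pin on a 32-bit controller = 28 GB/s

fast_segment_gbps = 7 * GBPS_PER_CONTROLLER  # 3.5GB segment: seven controllers striped together
slow_segment_gbps = 1 * GBPS_PER_CONTROLLER  # 0.5GB segment: a single controller on its own

print(f"3.5GB segment: ~{fast_segment_gbps:.0f} GB/s")   # ~196 GB/s
print(f"0.5GB segment: ~{slow_segment_gbps:.0f} GB/s")   # ~28 GB/s
print(f"Slow segment bandwidth: ~1/{fast_segment_gbps / slow_segment_gbps:.0f} of the fast segment")
# Which is why the driver tries so hard to keep allocations inside the first
# 3.5GB, and why frametimes can spike once a game spills past it.
```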
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
The purpose of these benches is to show the cards' limitations, but most users don't even have 4K monitors, and the ones that do usually run CF or SLI configs.

The only thing GTX 970 owners need to be concerned about is 2016, as we might start seeing games demanding 4GB at lower resolutions.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
It appears to take some effort to bench it, since NV has aggressively tried to ensure you cannot use that slow VRAM. Adding it just for the marketing numbers (and then outright lying about the specs for months) appears to have bitten them badly. First there was outright denial, then NV admitted to the misinformation and wrong specs, and now there is hard proof that using that 512 MB of VRAM is clearly detrimental.

Why the deflection? Now that sites are investigating, and in this very clear example with both the 970 and the 980 in SLI, the 970 is getting hammered. It will only become more prevalent as games require more VRAM; SLI and high-resolution users will get the worst side effects.

The point -> 970 + 0.5GB slow VRAM = massive potential frametime spikes once it's in use!

I can see the point of testing this way... I just can't see the point of expecting to be able to play at these settings anyway. Make sense?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Actually, it appears only the 980 exceeds 4GB in their BF4 test. The 970 stays below 4GB.

[Image: bf4mem.png]

I'm confused. So this is 4K, which would already be 200% resolution scaling vis-à-vis 1080p. Then they are applying even more scaling on top, which at 150% would be something like 6K. This is a setting I suppose only 0.1% of users would try.

From my own use I know a GTX 980 runs out of steam around 160% resolution scaling at 1080p. It starts to dip below 60 fps a lot, purely from lack of GPU power, not from running out of VRAM. Maybe SLI setups would allow some of the higher DSR/resolution-scaling settings; then I suppose VRAM would become the bottleneck.
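To put numbers on that, a quick sketch of what BF4's resolution-scale slider works out to from a 3840x2160 base, assuming (as the thread and the PCPer article imply) that the percentage scales each axis linearly:

```python
# What BF4's resolution-scale slider renders at a 3840x2160 base, assuming the
# percentage scales each axis linearly (an assumption, not a verified spec).
BASE_W, BASE_H = 3840, 2160

for scale in (100, 120, 140, 150, 160):
    w, h = round(BASE_W * scale / 100), round(BASE_H * scale / 100)
    print(f"{scale:>3}%: {w}x{h}  (~{w * h / 1e6:.1f} MPix, "
          f"{w * h / (1920 * 1080):.1f}x the pixels of 1080p)")
# 150% works out to 5760x3240, roughly the "6K" figure mentioned above.
```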

EDIT: I see that is SLI. I'd like to see some fps plots on that setup.
 

RocketPuppy

Junior Member
Jan 14, 2015
3
0
0
I can see the point of testing this way... I just can't see the point of expecting to be able to play at these settings anyway. Make sense?

Yeah, I'm a tad confused by the insistence on testing these cards at 4K to check how well they work, when the cards were never great 4K performers. SLI GTX 970s were never a good 4K solution; early benchmarks and reviews never gave any evidence that the 970 was a good choice for 4K. The lower pixel fill rate could just as easily be the reason for the frame spikes as the slower 0.5GB of memory.

I'm not trying to sweep any wrongdoing by Nvidia under the rug, I just want to see a few more realistic tests at resolutions the cards can actually push.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Are people actually PLAYING it at that resolution with those settings? Just because that is how it is benched does not mean that is how it is used.

It seems a majority of gamers use SLI/CrossFire to achieve playable ultra settings at 4K, or they drop settings down. I wouldn't play at a 30 fps average (a stutterfest is guaranteed whenever you dip to 20 fps or fewer), no matter how gorgeous it looked; it'd get me killed, so nope. I'd drop the settings until it was playable, as would almost everybody.

A 30 fps average is indeed unplayable. However, in a game like Ryse: Son of Rome I can't get 60 fps at 1080p, but I can get a solid 30 fps with DSR 4x, which is 4K. Given that the game runs at 30 fps on consoles, and that it is basically a movie rather than a game, 30 fps works there.

Similar story with Watch Dogs, except that I prefer 60 fps in that one, even with occasional stuttering. The 30 fps setting is solid, but it just doesn't feel right. I wonder how console users deal with that; maybe their motion blur is better. TBH, Destiny at 30 fps feels fine to me.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
The PCPer article says it's roughly 6K at the higher resolution-scale percentages they tested.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
A 30 fps average is indeed unplayable. However, in a game like Ryse: Son of Rome I can't get 60 fps at 1080p, but I can get a solid 30 fps with DSR 4x, which is 4K. Given that the game runs at 30 fps on consoles, and that it is basically a movie rather than a game, 30 fps works there.

Similar story with Watch Dogs, except that I prefer 60 fps in that one, even with occasional stuttering. The 30 fps setting is solid, but it just doesn't feel right. I wonder how console users deal with that; maybe their motion blur is better. TBH, Destiny at 30 fps feels fine to me.

That's not a fair comparison, because while consoles RUN at 30fps, it is not a true 30 fps average. Console games, when limited to 30 fps, are capped at that framerate. It should be very rare that they dip below 30 fps, and much of the time they could render at 45 fps or higher if uncapped. But swinging up and down between 30 and 45 fps is noticeable to many, so keeping it capped at 30 means it is almost always smooth.

If you can only muster a 30 fps average, that means you are [almost] surely below 30 fps about as often as you are above it.
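A toy illustration of that point, with invented frametimes: two runs can both average 30 fps, yet the uncapped one spends a large share of its frames below 30 fps while the capped one never does:

```python
# Toy illustration (frametime numbers invented): a 30 fps *average* is not the
# same thing as a 30 fps *cap*. Frametimes are in milliseconds.
capped   = [33.3] * 8                                          # locked to ~30 fps every frame
uncapped = [20.0, 22.0, 25.0, 30.0, 40.0, 45.0, 50.0, 34.4]    # swings around the same average

def avg_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def pct_slower_than_30fps(frametimes_ms):
    return 100 * sum(t > 1000 / 30 for t in frametimes_ms) / len(frametimes_ms)

for name, run in (("capped", capped), ("uncapped", uncapped)):
    print(f"{name}: {avg_fps(run):.1f} fps average, "
          f"{pct_slower_than_30fps(run):.0f}% of frames slower than 30 fps")
```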

And you are right, for some game types it does not matter nearly as much. For other game types it can be annoying, but it does not impact your gameplay and chances of success.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
ROPs are the key, and what many people forget the importance of...

Well, you can make a big deal out of ROP count, but it's pointless; those extra ROPs would be completely wasted. With fewer SMMs, the GPU can only produce so many pixels per clock. Even with the revealed accurate specs there are still excess ROPs, which means the card can drive more pixels per clock than it can actually create in the rendering pipeline. Thus, there is no limitation due to ROPs.
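A quick sanity check of that argument with the corrected GTX 970 figures (13 active SMMs, each able to hand at most 4 pixels per clock to the ROPs, 56 usable ROPs); treat it as a rough approximation that ignores the raster engines and other limits:

```python
# Back-of-the-envelope check of the ROP argument (GTX 970 corrected specs;
# a rough approximation, ignoring raster engines and other throughput limits).
active_smms = 13          # 13 of 16 SMMs enabled on the 970
pixels_per_smm_clk = 4    # each Maxwell SMM can feed at most 4 pixels/clock to the ROPs
usable_rops = 56          # corrected figure (64 was advertised)

smm_limit = active_smms * pixels_per_smm_clk   # pixels/clock the SMMs can produce
rop_limit = usable_rops                        # pixels/clock the ROPs could consume

print(f"SMM output limit: {smm_limit} px/clk, ROP limit: {rop_limit} px/clk")
print("Bottleneck:", "SMMs" if smm_limit < rop_limit else "ROPs")
# -> the SMMs cap pixel throughput at 52 px/clock, so the 'missing' ROPs are
#    not what limits the card.
```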

I'm not defending the act of misleading, but the point is, what they did to the ROP count is not limiting the card. The memory is in some situations, but the ROP count is not.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Just curious, who here plays BF4 with an average FPS below 30?

No one.

That said, it is obvious Nvidia wanted to market the card as 4GB even though there was really no way it was going to be able to use it all effectively... which was a major PR flub, but not one the company can't rebound from.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Well, you can make a big deal out of ROP count, but it's pointless; those extra ROPs would be completely wasted. With fewer SMMs, the GPU can only produce so many pixels per clock. Even with the revealed accurate specs there are still excess ROPs, which means the card can drive more pixels per clock than it can actually create in the rendering pipeline. Thus, there is no limitation due to ROPs.

I'm not defending the act of misleading, but the point is, what they did to the ROP count is not limiting the card. The memory is in some situations, but the ROP count is not.

Explain why:
GTX 770 has 1536 cores, 256-bit, and 32 ROPs.
GTX 760 has 1152 cores, 256-bit, and 32 ROPs.
According to your theory, the 760 shouldn't really need those 32 ROPs (unless Nvidia had other plans to support better management of bandwidth-intensive games, imo).

ROPs are needed once you hit the GPU with higher resolutions and high VRAM requirements, for example over 3.5GB at 3K/4K, which has been the typical testing of the GTX 970 ever since the news broke.

Second, the 7th L2 cache slice is dealing with two memory controllers at the same time. That's far from ideal, and it's why I think some scenarios may see latency/stuttering, which might be reduced with different memory management through improved drivers.
 
Last edited: