[PCPER] GTX 970 SLI (3.5GB effect)

Page 2

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Explain why:
The GTX 770 has 1536 cores, a 256-bit bus, and 32 ROPs.
The GTX 760 has 1152 cores, a 256-bit bus, and 32 ROPs.
According to your theory the 760 shouldn't really need those 32 ROPs. (Unless Nvidia had other plans to support better management of bandwidth-intensive games, imo...)

ROPs become a factor once you hit the GPU with higher resolutions and high VRAM requirements. For example, over 3.5GB at 3K/4K, which has been the typical test scenario for the GTX 970 ever since the news broke.

Second, the seventh L2 cache block is dealing with two memory controllers at the same time. That's far from ideal, and it's why I think some scenarios may see some latency/stuttering, which may be reduced with different memory management through improved drivers.


Starting with the ROPs, while NVIDIA’s original incorrect specification is unfortunate, from a practical perspective it’s really just annoying. As originally (and correctly) pointed out by The Tech Report and Hardware.fr, when it comes to fillrates the GTX 970 is already bottlenecked elsewhere. With a peak pixel rate of 4 pixels per clock per SMM, the GTX 970’s 13 SMMs inherently limit the card to 52px/clock, versus the 56px/clock rate for the card’s 56 ROPs.

http://www.anandtech.com/show/8935/...cting-the-specs-exploring-memory-allocation/4
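The fillrate arithmetic in that quote is easy to sanity-check. A minimal sketch using only the figures given there (4 pixels/clock per SMM, 13 active SMMs, 56 ROPs):

```python
# GTX 970 peak pixel throughput, using the figures from the quoted article.
PX_PER_CLOCK_PER_SMM = 4   # Maxwell SMM pixel output rate
ACTIVE_SMMS = 13           # GTX 970 ships with 13 of 16 SMMs enabled
ROPS = 56                  # actual (corrected) ROP count

smm_limit = PX_PER_CLOCK_PER_SMM * ACTIVE_SMMS  # 52 px/clock
rop_limit = ROPS                                # 56 px/clock

# The card's fillrate is capped by whichever stage is slower.
bottleneck = min(smm_limit, rop_limit)
print(smm_limit, rop_limit, bottleneck)  # 52 56 52
```

So the SMMs, not the ROPs, are the fillrate ceiling, which is the article's point.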
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Just curious, who here plays BF4 with an average FPS below 30?

Not the point. They are showing that above 3.5GB of VRAM the card stutters where it shouldn't.

Are people actually PLAYING it at that resolution and those settings? Just because that is how it is benched does not imply that is how it is used.

It seems a majority of gamers utilize SLI/Crossfire to achieve playable ultra settings at 4K, or they drop settings down. I wouldn't play at 30fps average (a guaranteed stutterfest in the many moments you dip to 20 or fewer fps), no matter how gorgeous it looked; it'd be getting me killed, so nope. I'd drop the settings until it was playable, as would almost everybody.

This tangent continues and it will just get the thread derailed.
 

flash-gordon

Member
May 3, 2014
123
34
101
Most people don't buy GPUs to use for only 6-12 months. I believe every benchmark should take any GPU, in every aspect, to its limit; that's the only way of having a minimum sense of future-proofing...

All that has happened these days shows there's little need to know whether a GPU can play COD at a hundred fps at 1080p or 1600p with half its VRAM loaded, when there's a clear trend of games asking for more.

So I don't understand people passively asking what's the point of testing with these settings. We need to see how far a setup can get in the future.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Explain why:
The GTX 770 has 1536 cores, a 256-bit bus, and 32 ROPs.
The GTX 760 has 1152 cores, a 256-bit bus, and 32 ROPs.
According to your theory the 760 shouldn't really need those 32 ROPs. (Unless Nvidia had other plans to support better management of bandwidth-intensive games, imo...)

ROPs become a factor once you hit the GPU with higher resolutions and high VRAM requirements. For example, over 3.5GB at 3K/4K, which has been the typical test scenario for the GTX 970 ever since the news broke.

Second, the seventh L2 cache block is dealing with two memory controllers at the same time. That's far from ideal, and it's why I think some scenarios may see some latency/stuttering, which may be reduced with different memory management through improved drivers.

I reckon that they don't always disable ROPs just because they aren't needed. How they are disabled is determined by how the chip itself gets cut down, and what gets disabled by removing shader cores is entirely dependent upon the architecture of the chip itself.

For some cut-downs, the ROP count may be left completely untouched, simply for engineering simplicity.

This may all be wrong; however, as discussed, the SMMs can only pump out 52 pixels per clock, while the ROPs can handle 56.

I'm not debating the L2 and memory controller issue.

edit: Ahh, I see SlowSpyder beat me to it, and addressed this better by including a reference.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Both the 980 and the 970 SLI look terrible in these extreme cases. Not sure why the thread is so focused on the 970 with segmented RAM, because the 980 is looking just as bad.

It's obvious that the settings are too high; the cards are pushed past their limits.

It would be interesting to see how 290 CrossFire scales in a similar brutal test.

I honestly think that the 512-bit bus should be helpful, but some of the tests I have seen show the 290X stuttering and hitching worse than the 970 in extreme conditions.

Let's look at this relative to other cards. Picking out the 970 by itself is completely not useful, not if you want to look at this objectively. But if all you want to do is bash a GPU, have at it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
To me the GTX 970 SLI looks markedly worse than the GTX 980 SLI. The frame variance for the GTX 970 is not nearly as good, especially when pushed at 1.50x scaling.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
[Image: 980-BF4_3840x2160_PLOT_0.png]

It does look brutal once you start to utilize that slow 512 MB of VRAM.

[Image: BF4_3840x2160_PLOT_0.png]


Compare 980 SLI to 970 SLI

[Image: 980-BF4_3840x2160_STUT_0.png]

[Image: BF4_3840x2160_STUT_0.png]



http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GTX-970-Memory-Issued-Tested-SLI

NVidia has also dropped the ball on SLI; those frametimes are terrible, especially coming from the company behind the huge SLI smoothness campaign. :eek: The 980 SLI isn't even very good in those games, and the 970 SLI is terrible.

Note that this is coming from the site which first published FCAT results (tarnishing Crossfire while touting SLI, all while pretending to be independent), hiding the fact that the tool came directly from NVidia; they were basically doing NV's PR. Later on they admitted that FCAT was indeed provided by NVidia, which puts the objectivity of the site in question. The smoothness situation seems to have reversed since, with XDMA being the superior solution overall.

Both the 980 and the 970 SLI look terrible in these extreme cases. Not sure why the thread is so focused on the 970 with segmented RAM, because the 980 is looking just as bad.

It's obvious that the settings are too high; the cards are pushed past their limits.

It would be interesting to see how 290 CrossFire scales in a similar brutal test.

I honestly think that the 512-bit bus should be helpful, but some of the tests I have seen show the 290X stuttering and hitching worse than the 970 in extreme conditions.

Let's look at this relative to other cards. Picking out the 970 by itself is completely not useful, not if you want to look at this objectively. But if all you want to do is bash a GPU, have at it.

Take another look, there is a substantial difference. The frame time variance is substantially worse on the 970. The green frametimes at 1.5x scaling are looking like tall weeds on the 970.

The point, again, is to compare the 980 SLI with the 970 SLI and see what difference the slow VRAM segment (which is said to contribute to stuttering) makes.

Take another look at those pictures. Sure the 980 doesn't look good either, but the 970 is way worse. If comparing the 970 to the 980 isn't a good objective comparison then what is? :eek:

To me the GTX 970 SLI looks markedly worse than the GTX 980 SLI. The frame variance for the GTX 970 is not nearly as good, especially when pushed at 1.50x scaling.

It's blatantly obvious, I don't understand the deflection by some.
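For anyone unsure what "frame time variance" in those plots actually measures, here's a toy illustration (made-up sample numbers, not PCPer's actual FCAT pipeline): two traces with the same average frame time can differ wildly in variance, which is exactly why average-fps charts hide stutter.

```python
# Toy illustration of frame-time variance. Two made-up traces (ms per frame):
# one smooth, one stuttery, tuned to have the same average frame time.
smooth  = [16.7, 16.9, 16.5, 16.8, 16.6, 16.7]
stutter = [30.0, 3.4, 30.0, 3.4, 16.7, 16.7]

def avg(xs):
    return sum(xs) / len(xs)

def variance(frame_times):
    # Population variance: mean squared deviation from the average frame time.
    m = avg(frame_times)
    return avg([(t - m) ** 2 for t in frame_times])

# Same average (~16.7 ms, i.e. roughly 60fps on paper)...
print(round(avg(smooth), 1), round(avg(stutter), 1))    # 16.7 16.7
# ...wildly different variance, which is what the stutter plots capture.
print(round(variance(smooth), 2), round(variance(stutter), 2))  # 0.02 117.93
```

Both traces would report the same fps number, but only the first one feels smooth.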
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Let's look at this relative to other cards. Picking out the 970 by itself is completely not useful. Not if you want to look at this subjectively. But if all you want to do is bash a GPU, have at it
What are you talking about?
The whole point of the thread (& test) is to bring more proof that there is a very real issue for a SPECIFIC line of cards, namely the 970, that wasn't found until recently, and that people were misled (lied to) by nvidia, who gave out the wrong specs for the 970.

This has nothing to do with any other card, or how they perform.
 
Last edited:

kasakka

Senior member
Mar 16, 2013
334
1
81
I tried CoD Advanced Warfare with similar settings on my 970 SLI, and based on running through a few single-player levels it ran just fine at 1440p with 2x upscaling (so essentially 5K), SMAA 1X and G-Sync. Memory use was about 3700-3800MB on one card and 3500MB on the other.

Personally I don't see myself using upscaling much (it didn't seem to have much of an effect on image quality at this resolution), and high AA was never my thing. With a pair of 980s costing me about 430 euros more to upgrade, I think I'll stick with my 970s unless further developments arise.
 

positivedoppler

Golden Member
Apr 30, 2012
1,148
256
136
Can someone explain why they think the 970 was designed like this? To cut cost? Not enough time to engineer a better solution? Purposely designed to enhance performance in one area while degrading it in another?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
This thread is too funny :)

The bickering back and forth is amusing.

The settings used are unrealistic for real-world use of the card(s). Maybe not for those that drank the Kool-Aid.

The test pretty much starts unplayable and gets more unplayable on both cards.

I can somewhat see both sides of the argument tho.

I figure most users would rather have useable settings that give greater gameplay experience.

Not defending NVIDIA at all on this; the whole issue isn't cool at all. It seems to me that the card pretty much needs to be pushed way beyond its usable limit to expose the downside of its memory weakness.

Of course I'll need to do some more testing to see if 970 SLI works for me. I doubt I'll purchase a monitor higher than 1440p within the usable lifespan of the cards in question.

It is what it is till further notice.

For the record, these are my first NVIDIA products purchased after bumpgate... maybe they'll be my last?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Can someone explain why they think the 970 was designed like this? To cut cost? Not enough time to engineer a better solution? Purposely designed to enhance performance in one area while degrading it in another?

For the money. More salvageable dies.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Can someone explain why they think the 970 was designed like this? To cut cost? Not enough time to engineer a better solution? Purposely designed to enhance performance in one area while degrading it in another?

Marketing.
Since AMD had a 4GB card, they really, really wanted the 970 to also be a 4GB card (which it is), even though it can't run at full speed when using the last 500MB.
Remember, the 970 is a 980 die that had defects, so, as per usual, they disabled the 'bad part', but this still left them with a usable 500MB left over.
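A rough way to picture the penalty of that last segment (a back-of-envelope sketch: the 28GB/s per-controller figure follows from 7Gbps GDDR5 on a 32-bit controller, the fast/slow 196/28 GB/s split is the commonly cited breakdown of the 970's memory layout, and the 256MB spill is a hypothetical number):

```python
# GTX 970 memory segments: seven controllers back the fast 3.5GB segment,
# one controller backs the slow 0.5GB segment (commonly cited figures).
GBPS_PER_CONTROLLER = 28  # GB/s for one 32-bit GDDR5 controller at 7Gbps

fast_bw = 7 * GBPS_PER_CONTROLLER   # 196 GB/s for the 3.5GB segment
slow_bw = 1 * GBPS_PER_CONTROLLER   # 28 GB/s for the 0.5GB segment

# If a frame's working set spills into the slow segment, the time spent
# streaming those bytes balloons: time = bytes / bandwidth.
spill_mb = 256  # hypothetical: 256MB of hot data landing in the slow segment
ms_fast = spill_mb / 1024 / fast_bw * 1000   # if it had stayed in fast VRAM
ms_slow = spill_mb / 1024 / slow_bw * 1000
print(round(ms_fast, 2), round(ms_slow, 2))  # 1.28 8.93
```

Streaming the same data takes 7x longer from the slow segment, which at ~16ms frame budgets is easily enough to show up as stutter.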
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
It's mainly a real-world issue right now for SLI, surround and high-res users. However, who knows about games in the next year or two. I'd feel more comfortable buying a card with a full 4GB of high-speed memory if I weren't planning to buy another card in the next few years. For those who want compensation due to deceptive advertising, the question is whether the 970 behaves like the full-speed 4GB VRAM card it's advertised as, or like a 3.5GB card. The only way to know is to do tests like this.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
These unplayable settings are there just to prove a point. Future games will use more resources (has that trend ever reversed?) and won't need these far fetched settings to show what's going on in the OP and the following posts. Right now 970 SLI is the first to get hit by all this, who knows when it'll show up on a single 970.

The *longevity* of the 970 has already been shown to be down the drain. That isn't a problem for those who change hardware regularly; for those who buy to keep and enjoy for some time, it's a matter of *when* the 970's formidable performance will go off a cliff and be replaced with stuttering.



The only positive thing to say about all this (and that's a stretch) is that it'll be a learning experience for nvidia: to improve on Maxwell's halfway chip-cutting capabilities and refine them to avoid a 3.5GB+512MB fiasco on Pascal next year, if they decide to cut down as granularly as they did with the 970.
 
Feb 19, 2009
10,457
10
76
You guys are dismissing this because of 4K, as if the problem only occurs at 4K...

1440p:
[Image: WIkk9w0.jpg]


[Image: mhOevfQ.png]


Mordor also stutters like crazy on SLI 970s with Ultra textures at 1080p. Same for Skyrim with texture mods. Dying Light SLI users also report many more texture pop-ins and stuttering at normal resolutions.

The point is it will be messed up as soon as games demand and need above 3.5GB. It will happen at 4K for older games (BF4), or it may happen at 1080/1440/1600p in newer games. Look at the bench from Computerbase.de: they tested ACU at 1600p and found very bad SLI frame times on the 970 which did not occur on the 980.

Are those playable settings? Heck yes; if it weren't for the stuttering, it's perfectly playable. View their videos:
http://www.computerbase.de/2015-01/geforce-gtx-970-vram-speicher-benchmarks/3/

It will be interesting as the next wave of "console ports" hits, especially GTA V, if GTA IV is anything to go by (remember it pushing the 2GB VRAM limit all those years ago?).
 
Feb 19, 2009
10,457
10
76
These unplayable settings are there just to prove a point. Future games will use more resources (has that trend ever reversed?) and won't need these far fetched settings to show what's going on in the OP and the following posts. Right now 970 SLI is the first to get hit by all this, who knows when it'll show up on a single 970.

The *longevity* of the 970 has already been shown to be down the drain. That isn't a problem for those who change hardware regularly; for those who buy to keep and enjoy for some time, it's a matter of *when* the 970's formidable performance will go off a cliff and be replaced with stuttering.

As we've seen, it's not "unplayable settings" alone that reveal the problem; it's just a matter of loading the VRAM above 3.5GB, which is possible on older titles with texture mods. Users have been reporting terrible stutter with 970 SLI in Skyrim & Arma 3 for a long time, but were ignored until the recent findings.

Indeed the trend for cross-platform AAA titles seems to be to cram as much into VRAM as possible. When gamedevs design "ultra settings" they take into account current & future hardware: 4GB is present on the top-end 970/980 and R9 290/290X, future top cards should be 4GB+, and with consoles in mind, pushing 4GB of VRAM is likely. So I fully agree the longevity or "future proofing" of 970s just went out the door.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Didn't they just say that there was no driver incoming? Also, a driver update wouldn't help, since the issue with the 970s is on the hardware side.

No, SLI issues definitely get cleaned up with new drivers. They said a specific driver to change the behavior of the 970 wasn't coming, not that they will never release a new driver for it at all. The 980 apparently has some SLI issues as well, so it's not the 970 hardware. You can clearly see that there are times when both the 970 and 980 are well under 3.5GB usage, and both have some frames that deviate from the smooth line you ideally want in the graphs.

As others have said, this test is worthless for getting any real-world understanding; it's unplayable on both setups no matter what. 4K with 1.5x scaling is beyond stupid. Bring it back down to reality and run 1440p with some AA and max in-game detail settings, and then we can see what's going on with settings people can actually use.
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
[Image: 980-BF4_3840x2160_PLOT_0.png]

Take another look, there is a substantial difference. The frame time variance is substantially worse on the 970. The green frametimes at 1.5x scaling are looking like tall weeds on the 970.

The point, again, is to compare the 980 SLI with the 970 SLI and see what difference the slow VRAM segment (which is said to contribute to stuttering) makes.

Take another look at those pictures. Sure the 980 doesn't look good either, but the 970 is way worse. If comparing the 970 to the 980 isn't a good objective comparison then what is? :eek:



It's blatantly obvious, I don't understand the deflection by some.

I have returned my 970 and been very open about that. I am not trying to deflect anything. I am just saying it's only reasonable to look at the results with a point of reference. In the examples posted before my post, the 980 SLI isn't pretty at all either at those settings. Why would anyone expect the 970 SLI to look better, or even as "good"? If the 980 is struggling, the 970 will struggle that much more. It's been cut down; it's common sense.

The charts Silverforce posted are the best ones so far. They show the 980 not struggling, and in that case there is an argument.

But you guys have to remember: the 970 is cut down. The 980 is 20% or more powerful. Any case that causes a 980 to struggle will be a nightmare for the 970. Are you really gonna reject that? Talk about trying hard to deflect.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
It's weird though, I haven't noticed any stuttering in games. I played Shadow of Mordor start to finish with 970 SLI with high textures and thought it was smooth.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Assuming you're using the G-Sync monitor in your sig, G-Sync helps smooth out SLI frame times.
 
Feb 19, 2009
10,457
10
76
It's weird though, I haven't noticed any stuttering in games. I played Shadow of Mordor start to finish with 970 SLI with high textures and thought it was smooth.

When I had CF on older AMD cards I didn't notice the stuttering or lack of smoothness... but apparently lots of people saw it, and FCAT proved it.

Ultimately, smoothness is subjective. ;)

ps. High textures are the default that comes with the game; you mean Ultra textures, which are an optional download. On Steam, if you own the game you can see the DLC, one of which is the HD content pack.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
It's weird though, I haven't noticed any stuttering in games. I played Shadow of Mordor start to finish with 970 SLI with high textures and thought it was smooth.

And you won't. The article clearly states they were only able to get unusual, out-of-the-ordinary differences between GTX 970 SLI and GTX 980 SLI when running 4K/2160p at 150% scaling, which is to say 6K-class rendering (3240p).

According to Steam, this is how many people run that kind of res (or anywhere near it):

3840 x 2160 0.04%
5040 x 1050 0.01%
5760 x 1080 0.06%

5760x1080 is triple 1080p monitors.

The total is 0.11%. So 99.89% of gamers can't be affected, because they can't run this kind of res. Of the 0.11% that do, only those with a GTX 970 SLI rig would be affected. The GTX 970 currently has 1.8% of the total GPU market.

My bet would be that >80% of those running 1440p and 2160p are running 27" iMacs with mobile GPUs....

This is really a big non-issue for any but the most extreme niche cases.
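Taking the post's numbers at face value (and assuming resolution share and GPU ownership are independent, which they certainly aren't in practice), the back-of-envelope upper bound works out like this:

```python
# Share of gamers who could even hit this scenario, using the Steam
# survey numbers quoted above (all values in percent).
res_share = {"3840x2160": 0.04, "5040x1050": 0.01, "5760x1080": 0.06}
high_res_total = round(sum(res_share.values()), 2)   # 0.11% run these resolutions

gtx970_share = 1.8  # percent of total GPU market, per the post
# Crude upper bound: high-res users who also happen to own a GTX 970
# (SLI owners are a further subset of this, so the real figure is smaller).
affected_upper_bound = round(high_res_total * gtx970_share / 100, 4)
print(high_res_total, affected_upper_bound)  # 0.11 0.002
```

In other words, roughly two in every hundred thousand surveyed users at most, by this (very rough) estimate.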
 
Feb 19, 2009
10,457
10
76
And you won't. The article clearly states they were only able to get unusual, out-of-the-ordinary differences between GTX 970 SLI and GTX 980 SLI when running 4K/2160p at 150% scaling, which is to say 6K-class rendering (3240p).

In Battlefield 4.

Get with the program.