[PCPER] GTX 970 SLI (3.5GB effect)


shady28

Platinum Member
Apr 11, 2004
In Battlefield 4.

Get with the program.

Yeah, you have to run 4K / 2160p @ 150% scaling (3240p) in Battlefield 4 specifically with 970s in SLI. Or alternatively you can run at 1440p with 2x / 200% scaling (2880p) with 970s in SLI.

And of course, neither SLI rig is playable at those settings, but w/e.
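For reference, the scaling arithmetic behind those figures works out like this (a rough sketch of my own, assuming the in-game resolution scale multiplies each axis; that assumption is how the 3240p and 2880p numbers above are derived):

```python
# Rough illustration: how a resolution scale percentage maps to the
# effective render resolution, assuming per-axis scaling.
def scaled_resolution(width, height, scale_percent):
    factor = scale_percent / 100.0
    return int(width * factor), int(height * factor)

print(scaled_resolution(3840, 2160, 150))  # (5760, 3240) -> "3240p"
print(scaled_resolution(2560, 1440, 200))  # (5120, 2880) -> "2880p"
```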
 

96Firebird

Diamond Member
Nov 8, 2010
You guys are dismissing this because of the 4K testing, as if the problem only occurs at 4K...

1440p:
[Image: WIkk9w0.jpg]

Not the best graph to use to prove your point, as the 980 SLI Ultra stutters just as bad in that game...

Awful colors to use for that graph as well.
 
Feb 19, 2009
Not the best graph to use to prove your point, as the 980 SLI Ultra stutters just as bad in that game...

Awful colors to use for that graph as well.

Yes, the colors for the graph are badly chosen, and yes, the 980 SLI stutters badly, but the 970 SLI stutters worse. It actually proves another point: Maxwell frame times are pretty bad in SLI; I thought NV had that fixed with Kepler. Strange that all the typical FCAT-pushing sites stopped using it for Maxwell.

SLI users should demand better smoothness.
 

shady28

Platinum Member
Apr 11, 2004
Yes, the colors for the graph are badly chosen, and yes, the 980 SLI stutters badly, but the 970 SLI stutters worse. It actually proves another point: Maxwell frame times are pretty bad in SLI; I thought NV had that fixed with Kepler. Strange that all the typical FCAT-pushing sites stopped using it for Maxwell.

SLI users should demand better smoothness.

Frame time issues have always plagued SLI; this is nothing new. The 970 SLI will stutter worse than the 980 SLI, so why would one expect anything different?

GTX 970 SLI has lower and more consistent frame times than CrossFire 290s:

[Image: SOM-4K-Ultra-Preset-Frametimes.jpg]
 

Carfax83

Diamond Member
Nov 1, 2010
Yes, the colors for the graph are badly chosen, and yes, the 980 SLI stutters badly, but the 970 SLI stutters worse. It actually proves another point: Maxwell frame times are pretty bad in SLI; I thought NV had that fixed with Kepler. Strange that all the typical FCAT-pushing sites stopped using it for Maxwell.

SLI users should demand better smoothness.

That graph is misleading, though. The frame time spikes were due to them enabling 2x supersampling, which is basically 5K resolution. At normal 1440p, both cards are very playable.

Any GPU can have its memory system overwhelmed if you choose the right settings, so this doesn't really prove anything other than that the GTX 970 isn't as fast and as capable as the significantly more expensive GTX 980.

Going to extremes to prove that the GTX 970's memory subsystem is crippled is just nonsensical.
 
Feb 19, 2009
The point is about what counts as playable settings, since some here are saying these tests are not at playable settings.

>45 fps is playable, wouldn't you agree?

In that COD example, the frame time hovers around a 16-18ms median with wild variations. That's roughly 55-62 fps. It would be playable if it were smoother; the 970 suffers more spikes than the 980.
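For the conversion, the instantaneous frame rate is just 1000 divided by the frame time in milliseconds. A quick sketch below (the trace values are made up for illustration, not read off the graphs):

```python
# Convert a single frame time (ms) to the frame rate it implies.
def ms_to_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

print(ms_to_fps(16))  # 62.5 fps
print(ms_to_fps(18))  # ~55.6 fps

# A run that averages 16-18 ms can still feel rough if individual frames
# spike far past the median (hypothetical trace):
trace_ms = [16, 17, 18, 16, 45, 17, 16, 60, 18, 17]
spikes = [t for t in trace_ms if t > 2 * 17]  # frames worse than 2x the rough median
print(f"{len(spikes)} spike frames out of {len(trace_ms)}")
```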

In the SoM example, it's 45 fps for an open-world RPG. It would be playable without the major stutters.

In the computerbase.de video, ACU at 1600p would be perfectly playable without the major stutters (the side-by-side comparison shows the 980, which did not stutter).

The point is that multi-GPU users like to push IQ; otherwise they would just stick with a single GPU and run things on medium/high rather than ultra.

Ultimately the 970 is a 3.5GB card, because it gets into trouble when VRAM use goes above that. That's all there is to it. How important that is to you depends on how likely you think future games are to push VRAM above 3.5GB and, in turn, how willing you are to lower texture settings, resolution, or AA. That's it, pretty simple. I mean, if you are fine with running textures on High instead of Ultra, or using FXAA/MLAA instead of MSAA or SSAA, 3.5GB will be fine for a LONG time.
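To give a rough sense of why heavier resolutions and AA modes eat into that 3.5GB budget, here is a back-of-the-envelope sketch of my own (render targets only; the buffer count is an assumption, and textures, shadow maps and driver overhead, which usually dominate, are deliberately not modelled):

```python
# Rough illustration: colour render-target memory grows with resolution
# and MSAA sample count. Real VRAM use is mostly textures and other
# allocations, which this ignores.
def render_target_mib(width, height, bytes_per_pixel=4, msaa_samples=1, buffers=3):
    return width * height * bytes_per_pixel * msaa_samples * buffers / 2**20

print(render_target_mib(2560, 1440))                  # ~42 MiB  (1440p, no MSAA)
print(render_target_mib(3840, 2160, msaa_samples=4))  # ~380 MiB (4K, 4x MSAA)
print(render_target_mib(5120, 2880, msaa_samples=4))  # ~675 MiB ("5K" via 2x SSAA of 1440p, 4x MSAA)
```

Whether those extra hundreds of megabytes tip a game past 3.5GB obviously depends on how much the textures and everything else already occupy.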
 
Last edited:

netxzero64

Senior member
May 16, 2009
everyone is still crying about the 970 issue?

Warning issued for Trolling.

-Rvenger
 
Last edited by a moderator:

Subyman

Moderator, VC&G Forum
Mar 18, 2005
When I had CF on older AMD cards I didn't notice the stuttering or lack of smoothness... but apparently lots of people saw it, and FCAT proved it...

Ultimately, smoothness is subjective. ;)

PS: High textures are the default that comes with the game; you mean ultra textures, which are an optional download. On Steam, if you own the game, you can see the DLC, one of which is the HD content pack.

I used high textures, I noted that because Ultra textures did stutter.

Assuming you're using the G-Sync monitor in your sig, G-Sync helps smooth out SLI frame times.

That's true. I'd think large stutters would still be recognizable. G-sync certainly shows the cards in their best light though, so that is good to point out.
 

cmdrdredd

Lifer
Dec 12, 2001
I used high textures, I noted that because Ultra textures did stutter.



That's true. I'd think large stutters would still be recognizable. G-sync certainly shows the cards in their best light though, so that is good to point out.

Did you notice a difference with ultra textures in Shadow of Mordor? I went back and forth a few times and really couldn't spot the difference but that could be my display. It's not calibrated and is starting to exhibit some trouble spots in the panel.
 

Enigmoid

Platinum Member
Sep 27, 2012
Yes, the colors for the graph are badly chosen, and yes, the 980 SLI stutters badly, but the 970 SLI stutters worse. It actually proves another point: Maxwell frame times are pretty bad in SLI; I thought NV had that fixed with Kepler. Strange that all the typical FCAT-pushing sites stopped using it for Maxwell.

SLI users should demand better smoothness.

Statistically there is little difference in the stuttering; you are looking at just one run.

Likewise, the stuttering is terrible on the 980, so it matters little. Even with 4GB the 970 would perhaps stutter less, but it would still stutter too much to be comfortably playable.
 

bigboxes

Lifer
Apr 6, 2002
It appears to take some effort to bench it, since NV has aggressively tried to ensure you cannot use that slow VRAM. Why they even bothered to add it for marketing (and even outright lied for months) appears to have bitten them badly. First there was outright denial, then NV admitted to the misinformation and wrong specs, and now there is hard proof that using that 512MB of VRAM is clearly detrimental.

Why the deflection? Now that sites are investigating, and in this very clear example with both the 970 and 980 in SLI, the 970 is getting hammered. It will only become more prevalent as games require more VRAM. SLI and high-resolution users will get the worst side effects.

The point -> 970 + 0.5GB slow VRAM = massive potential frame time spikes once it's in use!

This marketing lie is very important to me. I was just about to pull the trigger on a 970, with the real possibility of purchasing a second one for SLI in the future. I can't justify the cost of a 980. The 290X intrigues me. However, I'll most likely wait to see what the refresh brings on these cards. I don't think 4K is going away, even though it's far from mature.
 

Final8ty

Golden Member
Jun 13, 2007
Frame time issues have always plagued SLI; this is nothing new. The 970 SLI will stutter worse than the 980 SLI, so why would one expect anything different?

GTX 970 SLI has lower and more consistent frame times than CrossFire 290s:

[Image: SOM-4K-Ultra-Preset-Frametimes.jpg]

Sorry, you need a timeline to show consistency; otherwise that just represents frame rates.

SLI and CrossFire Smoothness

This is an important topic when talking about SLI and CrossFire. In the past, AMD was the one in the hot seat with terrible CrossFire scaling, performance and smoothness. A lot has changed over the last couple of years, AMD now has a frame pacing technology as well as its new XDMA technology. These technologies combined have turned the tables and now CrossFire feels smoother than SLI. It still feels smoother even when compared to NVIDIA's new GTX 980 cards in SLI. While overclocked GTX 980 SLI is faster in framerate, the actual feeling and smoothness in games feels better on AMD Radeon R9 290X. AMD currently has the upper hand in this arena.
http://www.hardocp.com/article/2014...x_980_sli_overclocked_gpu_review#.VGNA5PmsV8F


A straight line at 20ms is no less consistent than a straight line at 5ms; it just means that the frame rate is lower at 20ms.
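One way to make that point concrete: a smoothness metric such as frame-to-frame variation is independent of where the line sits on the time axis. A small sketch with made-up traces:

```python
# Illustration with hypothetical traces: consistency is about variation
# between consecutive frames, not the absolute frame time level.
def frame_to_frame_variation(trace_ms):
    deltas = [abs(b - a) for a, b in zip(trace_ms, trace_ms[1:])]
    return sum(deltas) / len(deltas)

flat_20ms = [20.0] * 100       # steady 50 fps
flat_5ms = [5.0] * 100         # steady 200 fps
spiky = [16.0, 40.0] * 50      # ~36 fps average, visibly stuttery

print(frame_to_frame_variation(flat_20ms))  # 0.0 -> perfectly consistent
print(frame_to_frame_variation(flat_5ms))   # 0.0 -> equally consistent, just faster
print(frame_to_frame_variation(spiky))      # 24.0 -> stutter despite a decent average
```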
 
Last edited:

garagisti

Senior member
Aug 7, 2007
What has me scared is this post elsewhere

http://semiaccurate.com/forums/showpost.php?p=228888&postcount=203

This is in direct reference to DX12. I fear it is something that won't end well at all. I mentioned it elsewhere, but it seems like this is not being taken seriously by most. The source is fairly reliable, and is a contributor to a tech site of some repute.

Is there anyone who can shed more light on this? Should there be another thread with this as the subject?
 
Feb 19, 2009
What has me scared is this post elsewhere

http://semiaccurate.com/forums/showpost.php?p=228888&postcount=203

This is in direct reference to DX12. I fear it is something that won't end well at all. I mentioned it elsewhere, but it seems like this is not being taken seriously by most. The source is fairly reliable, and is a contributor to a tech site of some repute.

Is there anyone who can shed more light on this? Should there be another thread with this as the subject?

There's NO WAY NV execs would willingly promote DX12 compatibility and fail to deliver. That would be a crapstorm of epic proportions. The scenario is beyond stupid, so let's not give it more thought.
 
Last edited:

destrekor

Lifer
Nov 18, 2005
There's NO WAY NV execs would willingly promote DX12 compatibility and fail to deliver. That would be a crapstorm of epic proportions. So, it's a bit too premature to discuss something that's just so random and lacking in proof.

I think there's a good chance that, while Maxwell will have DX12 compatibility, full feature set compatibility is not guaranteed.
 

tential

Diamond Member
May 13, 2008
I think there's a good chance that, while Maxwell will have DX12 compatibility, full feature set compatibility is not guaranteed.

That would KILL some of our users here who harped on DX12 capability of the GTX 900.

That's a lot of backlash all at once for Nvidia. If AMD actually delivers the R9 380x in a reasonable timeframe (lol.... k so probably not) there will be a lot of salty owners willing to try some AMD kool-aid.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
Did you notice a difference with ultra textures in Shadow of Mordor? I went back and forth a few times and really couldn't spot the difference but that could be my display. It's not calibrated and is starting to exhibit some trouble spots in the panel.

No difference at all. I ran it on my Swift and my VP2770-LED and couldn't tell anything. I was mostly looking at the terrain and such, so maybe it was character faces? It didn't matter; the game looked great on high.
 

rgallant

Golden Member
Apr 14, 2007
These unplayable settings are there just to prove a point. Future games will use more resources (has that trend ever reversed?) and won't need these far-fetched settings to show what's going on in the OP and the following posts. Right now 970 SLI is the first to get hit by all this; who knows when it'll show up on a single 970.

The *longevity* of the 970 has already been proven to be down the drain, which isn't a problem for those who change hardware regularly. For those who buy to keep for some time and enjoy... it's a matter of *when* the 970's formidable performance will go off a cliff and be replaced with stuttering.

The only positive thing to say about all this (and that's a stretch) is that it'll be a learning experience for Nvidia: to improve on Maxwell's halfway chip-cutting capabilities and refine them further to avoid another 3.5GB+512MB fiasco, if possible, on Pascal next year, should they decide to cut down as granularly as they did with the 970.

The GM200 could have two cut-down cards, but maybe they will have enough VRAM to just use the fast portion (6GB vs 5GB).
If so, NV will be doing a lot of hard thinking over those cut-downs today, for sure.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
GM200 Titan is rumored to be a 12GB card. The GeForce variant will probably be a 6GB card, then; the original Titan had 6GB and the 780/780 Ti came with 3GB, after all.

If the Titan/780/780 Ti Kepler release is any indication of how they'll take these little steps to get rid of harvested GM200s on the way up to the complete one in the "1080TI"... let's hope they don't decide to gimp the cards in ways similar to the 970. That's all there is to the matter.

There's NO WAY NV execs would willingly promote DX12 compatibility and fail to deliver. That would be a crapstorm of epic proportions. The scenario is beyond stupid, so let's not give it more thought.

I wouldn't dismiss the possibility of such a scenario. Does anyone even know what DX12 capabilities Maxwell has? Every time I've searched for this, I never came across a definitive answer. I remember reading here in this very forum, back when I wasn't registered, that the launch-day boxes for the 980 and 970 had DX11 printed on them and it was changed later on to DX12. That raises some doubts, doesn't it?

It is known that DX12 will run on GCN and Fermi/Kepler/Maxwell; at the very least these architectures will benefit from the lower-overhead part... and then we have stuff like this. I don't recall Maxwell being hardware DX12 compliant as a major point in the reviews I've read... and that ExtremeTech article isn't written to convey certainty; it talks in potentials ("sounds like", "strongly implied", etc.).

Anyway, April should be a fun month if that guy's not trolling.
 
Last edited:

MeldarthX

Golden Member
May 8, 2010
GM200 Titan is rumored to be a 12GB card. The GeForce variant will probably be a 6GB card, then; the original Titan had 6GB and the 780/780 Ti came with 3GB, after all.

If the Titan/780/780 Ti Kepler release is any indication of how they'll take these little steps to get rid of harvested GM200s on the way up to the complete one in the "1080TI"... let's hope they don't decide to gimp the cards in ways similar to the 970. That's all there is to the matter.

I wouldn't dismiss the possibility of such a scenario. Does anyone even know what DX12 capabilities Maxwell has? Every time I've searched for this, I never came across a definitive answer. I remember reading here in this very forum, back when I wasn't registered, that the launch-day boxes for the 980 and 970 had DX11 printed on them and it was changed later on to DX12. That raises some doubts, doesn't it?

It is known that DX12 will run on GCN and Fermi/Kepler/Maxwell; at the very least these architectures will benefit from the lower-overhead part... and then we have stuff like this. I don't recall Maxwell being hardware DX12 compliant as a major point in the reviews I've read... and that ExtremeTech article isn't written to convey certainty; it talks in potentials ("sounds like", "strongly implied", etc.).

Anyway, April should be a fun month if that guy's not trolling.

Let's just say he's an engine and game developer, among other things. He won't post something unless he knows what's what, and he has backed up what other developers have said about Mantle and DX12... He knows his stuff, and in all the time I've seen him post he's never trolled.
 

.vodka

Golden Member
Dec 5, 2014
Let's just say he's an engine and game developer, among other things. He won't post something unless he knows what's what, and he has backed up what other developers have said about Mantle and DX12... He knows his stuff, and in all the time I've seen him post he's never trolled.

Thanks for the insight! I'm looking forward to what happens after April, then.
 

AnandThenMan

Diamond Member
Nov 11, 2004
There's NO WAY NV execs would willingly promote DX12 compatibility and fail to deliver. That would be a crapstorm of epic proportions. The scenario is beyond stupid, so let's not give it more thought.

Nvidia never fully supported DX10 (until very late, when it didn't matter), so maybe the same thing will happen with DX12 on their current GPUs. Nvidia will simply say "we support what matters", etc.
 
Last edited: