WCCFtech: Memory allocation problem with GTX 970 [UPDATE] PCPer: Nvidia response

Status
Not open for further replies.

Vesku

Diamond Member
Aug 25, 2005
They didn't go into the driver and set programming for every game and every mod. That's kind of impossible.

You aren't getting it. If this memory scheme has problems in a future high-res-texture game, you will need to trust that Nvidia can "fix" it or be kept from playing at settings other true 4GB cards (980/290/290X) can handle. That is something people need to decide on, and they didn't even know they needed to make that decision until people discovered the 3.5+0.5GB memory scheme.

Nvidia should have been upfront about this.
 

cmdrdredd

Lifer
Dec 12, 2001
You aren't getting it. If this memory scheme has problems in a future high-res-texture game, you will need to trust that Nvidia can "fix" it or be kept from playing at settings other true 4GB cards (980/290/290X) can handle. That is something people need to decide on, and they didn't even know they needed to make that decision until people discovered the 3.5+0.5GB memory scheme.

Nvidia should have been upfront about this.

You are also assuming it's a problem at all. We haven't found any games that show it's a problem, so it might not even be an issue to worry about.

We're kind of going in circles, but I understand your point.
 

iiiankiii

Senior member
Apr 4, 2008
The thing is, though, games still run great. That's why people buy these cards, is it not? The "issue" is overblown... to the point I'm surprised I don't see memes popping up everywhere like with the Xbox 360 RROD.

Kinda like the whole frametime thing, right? The fact is that there is a noticeable performance hit from splitting up the VRAM.
 

wanderer27

Platinum Member
Aug 6, 2005
Nvidia's response:

The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.

We understand there have been some questions about how the GTX 970 will perform when it accesses the 0.5GB memory segment. The best way to test that is to look at game performance. Compare a GTX 980 to a 970 on a game that uses less than 3.5GB. Then turn up the settings so the game needs more than 3.5GB and compare 980 and 970 performance again.

Here’s an example of some performance data:

Game / Setting                                             GTX 980          GTX 970
Shadow of Mordor
  <3.5GB setting = 2688x1512 Very High                     72 FPS           60 FPS
  >3.5GB setting = 3456x1944                               55 FPS (-24%)    45 FPS (-25%)
Battlefield 4
  <3.5GB setting = 3840x2160 2xMSAA                        36 FPS           30 FPS
  >3.5GB setting = 3840x2160 135% res                      19 FPS (-47%)    15 FPS (-50%)
Call of Duty: Advanced Warfare
  <3.5GB setting = 3840x2160 FSMAA T2x, Supersampling off  82 FPS           71 FPS
  >3.5GB setting = 3840x2160 FSMAA T2x, Supersampling on   48 FPS (-41%)    40 FPS (-44%)

On Shadow of Mordor, the drop is about 24% on GTX 980 and 25% on GTX 970, a 1% difference. On Battlefield 4, the drop is 47% on GTX 980 and 50% on GTX 970, a 3% difference. On CoD: AW, the drop is 41% on GTX 980 and 44% on GTX 970, a 3% difference. As you can see, there is very little change in the performance of the GTX 970 relative to GTX 980 on these games when it is using the 0.5GB segment.


The table data and more are presented better in the actual article that just went up:

http://www.pcper.com/news/Graphics-Cards/NVIDIA-Responds-GTX-970-35GB-Memory-Issue
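
For anyone curious how the kind of tool that started all this pokes at that last 0.5GB, here is a rough sketch of the idea. To be clear, this is not Nai's benchmark or anything from NVIDIA, and the 256 MiB chunk size and the copy-based bandwidth measurement are my own assumptions. It just keeps allocating CUDA device memory in chunks until the driver refuses, then times a device-to-device copy inside each chunk, so a slower segment near the top of the 4GB would show up as lower GB/s on the last chunks.

    // Hypothetical VRAM probe -- a sketch, not NVIDIA's or Nai's actual code.
    // Chunk size and method are assumptions. Build with: nvcc vram_probe.cu -o vram_probe
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    int main() {
        const size_t chunk = 256u * 1024u * 1024u;   // 256 MiB per step (assumption)
        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);
        printf("Reported VRAM: %zu MiB free / %zu MiB total\n", freeB >> 20, totalB >> 20);

        // Allocate until the driver refuses; on a 970 the last chunks should
        // land in (or beyond) the slower 0.5GB segment.
        std::vector<char*> chunks;
        char* p = nullptr;
        while (cudaMalloc(reinterpret_cast<void**>(&p), chunk) == cudaSuccess)
            chunks.push_back(p);
        cudaGetLastError();                          // clear the expected out-of-memory error

        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        for (size_t i = 0; i < chunks.size(); ++i) {
            // Time a copy of the first half of the chunk onto its second half.
            cudaEventRecord(start);
            cudaMemcpy(chunks[i] + chunk / 2, chunks[i], chunk / 2, cudaMemcpyDeviceToDevice);
            cudaEventRecord(stop);
            cudaEventSynchronize(stop);
            float ms = 0.0f;
            cudaEventElapsedTime(&ms, start, stop);
            double gbps = (double)chunk / (ms / 1000.0) / 1e9;   // chunk/2 read + chunk/2 written
            printf("Chunk %2zu (offset %4zu MiB): %.1f GB/s\n", i, (i * chunk) >> 20, gbps);
        }

        for (char* q : chunks) cudaFree(q);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return 0;
    }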




 

Vesku

Diamond Member
Aug 25, 2005
You are also assuming it's a problem at all. We haven't found any games that show it's a problem, so it might not even be an issue to worry about.

We're kind of going in circles, but I understand your point.

I respect your decision to trust that Nvidia has it handled, i.e. that this won't affect the gaming experience. I don't like that we went from up-front disclosure of memory quirks with things like the 550 Ti to Nvidia not bothering to mention it until an internet teacup storm brewed up.
 

Abwx

Lifer
Apr 2, 2011
They also said right there that there is little difference for single-GPU setups.

For multi-GPU, FCAT is a must.

They didn't say this; their wording is that of someone who wants to deliberately turn a blind eye:

We sometimes use a tool called FCAT to capture exactly when each frame was delivered to the display, but that's usually not necessary in order to get good data with single-GPU setups.
"Usually not necessary" doesn't mean little difference; they just didn't check. There's a big probability that Nvidia forbade the use of their viral marketing tools with recent GPUs, as they have the right to do since they own the system.

Besides, they carefully avoided answering a member who was pointing out discrepancies and uneven game usage in the tests:

http://techreport.com/discussion/27702/nvidia-geforce-gtx-960-graphics-card-reviewed?post=877869
 

iiiankiii

Senior member
Apr 4, 2008
Obviously not noticeable enough or it would have been noted everywhere on day one.

Lol. People have been noticing it. Why do you think this whole issue came up? People noticed something funky happening when more than 3.5GB was used. That is how this whole thing started.
 

sontin

Diamond Member
Sep 12, 2011
No, they noticed that these tools only showed 3.5GB in use instead of the 4GB shown on a GTX 980. There weren't any performance problems.
 

Vesku

Diamond Member
Aug 25, 2005
Oh, also consider that some people would have upgraded to a 980 if Nvidia had made this memory configuration public at launch.
 

cmdrdredd

Lifer
Dec 12, 2001
Lol. People have been noticing it. Why do you think this whole issue came up? People noticed something funky happening when more than 3.5GB was used. That is how this whole thing started.

It came up where? From a quirky benchmark app that tests CUDA and not gaming usage? No notable sites mentioned performance loss in games. Again, if the performance penalty were as severe as claimed, the framerate would tank and everyone would have reported on that fact. Not every review site used the same games either, so you can't say it's a specific selection of games that won't show it. I've even personally tested some older titles that aren't used in any reviews at 4K, and at some settings I am already using a lot more than 3.5GB. The FPS didn't tank and many times it was still semi-playable. Though after my own testing I think 4K gaming isn't really playable all the time on SLI 970s, but I have more demanding standards of performance.
 

iiiankiii

Senior member
Apr 4, 2008
It came up where? From a quirky benchmark app that tests CUDA and not gaming usage? No notable sites mentioned performance loss in games.

You're right. The software (Afterburner) wasn't reporting the correct memory usage due to Nvidia's funky memory segmentation. That started everything.
 

cmdrdredd

Lifer
Dec 12, 2001
You're right. The software (Afterburner) wasn't reporting the correct memory usage due to Nvidia's funky memory segmentation. That started everything.

It has always reported pretty accurately for me when I looked at it, as noted in the second post in this thread, where I loaded up Crysis 3 at 4K with AA applied to hit 4GB.
 

wand3r3r

Diamond Member
May 16, 2008
The question is what it means, and so far nobody has actually determined the effect, if any. Everyone seems to claim it does/doesn't matter while none of us know yet.

It feels like NV tried to sweep it under the rug but couldn't, so they gave a vague explanation and even tried to pass it off with some average FPS data. Why wouldn't they also use FCAT, which they themselves developed for the purpose of determining whether there is something wrong that cannot be measured with conventional applications? That omission is a huge red flag that there may be something wrong there.
 

cmdrdredd

Lifer
Dec 12, 2001
The OP was right in noticing the behaviour, but a bit off in attributing the cause even though his original theory was plausible at the time. Noticing the behaviour here and in other places led to further investigation and better understanding of the issue.

The OP doesn't need to apologize. If anything, it's the Nvidia defense force that needs to apologize to the OP, but you can already see them changing from head-in-the-sand denial to trying to paint the memory allocation problem as a non-issue.

Sure, sure...all the OP's claims have been debunked. Yet somehow he's been validated?
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
Reopening thread, with a warning that anyone who tries turning this into a flamewar is going to be sorry that they ever did.

Keep this discussion civil, focused on the technical issue and getting to the truth of the matter, whatever it is, and keep your cheap shots and callouts to yourself.

-- stahlhart
 

garagisti

Senior member
Aug 7, 2007
The question is what it means, and so far nobody has actually determined the effect, if any. Everyone seems to claim it does/doesn't matter while none of us know yet.

It feels like NV tried to sweep it under the rug but couldn't, so they gave a vague explanation and even tried to pass it off with some average FPS data. Why wouldn't they also use FCAT, which they themselves developed for the purpose of determining whether there is something wrong that cannot be measured with conventional applications? That omission is a huge red flag that there may be something wrong there.
Your second paragraph has me properly wondering. This was one of the first Nvidia cards in a while that I could recommend to most people I know. Well, most people I know also use their systems as media centers, and HD audio bitstreaming support on the Nvidia side was a bit shaky. Even with the lack of clarity on support, the new cards performed well enough. Not that it has changed anything at certain resolutions, but any limitations should have been outlined by the manufacturer prior to sale, not when some random dudes (no offense meant to anyone) on the internet find a flaw.

What is properly shocking to me is reading an analysis that the memory architecture is reportedly working as designed. Great, but then should the same analyst not ponder why this information was left out at launch, or later, until someone brought it up? IMHO it should have been disclosed, and the analysts who so casually reference it in passing are perhaps not to be trusted to be neutral.
 

cmdrdredd

Lifer
Dec 12, 2001
Your second paragraph has me properly wondering. This was one of the first Nvidia cards in a while that I could recommend to most people I know. Well, most people I know also use their systems as media centers, and HD audio bitstreaming support on the Nvidia side was a bit shaky. Even with the lack of clarity on support, the new cards performed well enough. Not that it has changed anything at certain resolutions, but any limitations should have been outlined by the manufacturer prior to sale, not when some random dudes (no offense meant to anyone) on the internet find a flaw.

What is properly shocking to me is reading an analysis that the memory architecture is reportedly working as designed. Great, but then should the same analyst not ponder why this information was left out at launch, or later, until someone brought it up? IMHO it should have been disclosed, and the analysts who so casually reference it in passing are perhaps not to be trusted to be neutral.

One takeaway could be that there was no need to reveal this, as it didn't change the way the card performs for its intended usage in games. Speaking generally, people don't care how it works or why it works, just that it works. All things gaming-related appear to be just fine. Any gaming-related problems would have been reported by members on here for sure. Forums like this one seem to have a higher number of people who would have spotted an issue right away. If stuttering were a problem, people would have complained, just like they have previously when playing UE3-based games and Skyrim, which are notorious for their stuttering. I didn't see any reports of noticeable stuttering, which I would say is the most important thing: real-world usage seems to not be a problem.
 

RampantAndroid

Diamond Member
Jun 27, 2004
Correct. If this tool is trying to allocate 4GB but only 3.5GB of physical VRAM is available for any given reason, then the last 512MB would need to spill out. It's clearly not blocked by physical VRAM size, as otherwise the program would hard fail on cards smaller than 4GB*.

* There are ways in CUDA to disallow a program from spilling over. NVIDIA's BandwidthTest sample does this, for example.

Then can we get this "test" recompiled to force only physical VRAM?

Also, can someone please run this test using no monitor on the box itself, and an RDP session in to the machine? Just humor me.
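
In case it helps, here is one way such a test could be capped to physical VRAM. This is a sketch under my own assumptions (the chunk and margin sizes included), not necessarily how NVIDIA's BandwidthTest sample does it: ask the driver how much device memory it reports as free before each allocation and stop while there is still headroom, so nothing can spill into system memory.

    // Sketch: only allocate while the driver still reports enough free device
    // memory, so the test never asks for more than physical VRAM.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    int main() {
        const size_t chunk  = 128u * 1024u * 1024u;   // 128 MiB steps (assumption)
        const size_t margin = 256u * 1024u * 1024u;   // headroom left for the display/driver (assumption)
        std::vector<void*> blocks;

        for (;;) {
            size_t freeB = 0, totalB = 0;
            cudaMemGetInfo(&freeB, &totalB);
            if (freeB < chunk + margin)               // stop before exhausting reported VRAM
                break;
            void* p = nullptr;
            if (cudaMalloc(&p, chunk) != cudaSuccess)
                break;
            blocks.push_back(p);
        }

        printf("Allocated %zu MiB without exceeding reported physical VRAM\n",
               (blocks.size() * chunk) >> 20);
        for (void* p : blocks) cudaFree(p);
        return 0;
    }

Running it headless over RDP, as asked above, should mainly change how much free memory the driver reports up front.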
 

cmdrdredd

Lifer
Dec 12, 2001
Then can we get this "test" recompiled to force only physical VRAM?

Also, can someone please run this test using no monitor on the box itself, and an RDP session in to the machine? Just humor me.

I have no idea how to do that lol. I know you asked that earlier too.
 

garagisti

Senior member
Aug 7, 2007
One takeaway could be that there was no need to reveal this, as it didn't change the way the card performs for its intended usage in games. Speaking generally, people don't care how it works or why it works, just that it works. All things gaming-related appear to be just fine. Any gaming-related problems would have been reported by members on here for sure. Forums like this one seem to have a higher number of people who would have spotted an issue right away. If stuttering were a problem, people would have complained, just like they have previously when playing UE3-based games and Skyrim, which are notorious for their stuttering. I didn't see any reports of noticeable stuttering, which I would say is the most important thing: real-world usage seems to not be a problem.
Wrong. People not only here but elsewhere are discussing, and have been discussing, how the tech works, not just that it works. More importantly, it may not meet the advertising standards of certain countries and may be considered fraudulent advertising/behaviour.

For what it is worth, I agree that gaming performance is what it is and would remain unaffected in the benchmarks as they were run at launch. Had this limitation been more public, there could have been more investigation into its limits, and performance against the competition could have been evaluated in light of that information. It may also have affected the perceived value of the product. So yes, dodgy behaviour is being reported for what it is. You don't care? Well, good for you, but then do not speak for everyone as you just tried to. Several other owners of the cards here and elsewhere are rather bemused, ranging to dismayed.
 

amenx

Diamond Member
Dec 17, 2004
The question is what it means, and so far nobody has actually determined the effect, if any. Everyone seems to claim it does/doesn't matter while none of us know yet.
To date I've probably had the best GPU/gaming experiences with the 970 in games like FC4, Crysis 3, Metro Redux, BL3, COD: AW, W: TNO, Grid 2 and others, all maxed as far as playability allows (i.e. game settings maxed but variable levels of AA). I have been able to recreate the issue in FC4 by upping MSAA to x8 @ 1440p, and lo and behold the game came to a crawl :awe:. And it just so happened to be over 3.5GB of VRAM. So some would like to conclude it's due to that, and that FC4 should otherwise run smoothly at MSAA x8 @ 1440p on ultra settings. Sounds cool to me :D.
 