WCCftech: Memory allocation problem with GTX 970 [UPDATE] PCPer: NVidia response


Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
I don't think this will stop me from getting a 970 in the next few days, but it's still bad for nVidia.
The way I look at it, the R9 290 is the performance-per-$ king right now. If you are not bothered by $ per fps, you might as well get a 980 for being a better-designed and overall faster card. There are 980 models that are not only faster but also more power-efficient than some 970s (e.g. Asus 980 Strix).
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
This will stop me from considering the 970 as an upgrade.

Still a solid card, but not as solid as I thought. Glad the information came to light.


It's a 3.5GB + 512MB card. I do not want to rely on nVidia to keep supporting this aspect of the 970 through drivers once their next gen comes out.

*cough* GTX 770 now
*cough* GTX 780 now


I'll be steering towards a 980 now vs. the new AMD offerings. If the 970 comes down to $269, then I'd be back to considering it more fully.




Bottom line on this: the memory shenanigans with the 970 and its ROP count should have been pointed out in the initial reviews. There is a reason this information was concealed from reviewers, and for crying out loud it wasn't because nVidia "umm, we didn't know". If you believe nVidia here, you are being naive. Not a huge deal, but it matters.

Folks don't want a 3.5GB card with 512MB tacked on that needs special consideration. They want the real-deal-Holyfield 4GB cards. The 970 is posing as a 4GB card next to its peers.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Ryan is investigating.

Now make no mistake, NVIDIA right now is in full damage control mode due to the negative press this issue has garnered and the pain that’s going to follow. When NVIDIA is getting Senior VPs like Jonah Alben on the phone with us on a weekend night to talk architecture and answer questions, this isn’t normal operating procedure for the company. But at the same time it’s a positive sign for how serious NVIDIA is taking our concerns, and meanwhile an NVIDIA under pressure is an NVIDIA that is more likely to answer our deepest technical questions, giving us more insight than ever before into GM204.
As part of our discussion with NVIDIA, they laid out the fact that the original published specifications for the GTX 970 were wrong, and as a result the “unusual” behavior that users had been seeing from the GTX 970 was in fact expected behavior for a card configured as the GTX 970 was. To get straight to the point then, NVIDIA’s original publication of the ROP/memory controller subsystem was wrong; GTX 970 has a 256-bit memory bus, but 1 of the 4 ROP/memory controller partitions was partially disabled, not fully enabled like we were originally told. As a result GTX 970 only has 56 of 64 ROPs and 1.75MB of 2MB of L2 cache enabled. The memory controllers themselves remain unchanged, with all four controllers active and driving 4GB of VRAM over a combined 256-bit memory bus.

GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.
To me this is the worst part of the card! I'd rather have a fast (i.e. normal) 3.5 GB than 4 GB that ends up slower because of the 0.5 GB of slow VRAM.
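To put some rough numbers on that, here's a quick back-of-envelope sketch I threw together (just plugging in the figures quoted above and assuming a simple time-sharing model, nothing NVIDIA has published):

```cpp
#include <cstdio>

// Figures from the quote above: the 3.5GB segment peaks at ~196 GB/s
// (7 GHz * 7 ports * 32 bits) and the 512MB segment at ~28 GB/s, and the
// crossbar can only service one segment per cycle (a true XOR).
int main() {
    const double fast_bw = 196.0;  // GB/s, 3.5GB segment
    const double slow_bw = 28.0;   // GB/s, 0.5GB segment

    // f = fraction of crossbar cycles spent servicing the 512MB segment
    for (double f = 0.0; f <= 0.201; f += 0.05) {
        double from_fast = (1.0 - f) * fast_bw;      // what the 3.5GB pool still delivers
        double total     = from_fast + f * slow_bw;  // combined effective throughput
        printf("%3.0f%% of cycles on the slow pool: %5.1f GB/s from the 3.5GB pool, %5.1f GB/s total\n",
               f * 100.0, from_fast, total);
    }
    return 0;
}
```

Even 10% of cycles going to the slow pool already pulls the fast pool down to ~176 GB/s in that simple model, which is why keeping allocations under 3.5GB matters so much here.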

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

Now that the GTX 970 has been confirmed as a 3.5GB/0.5GB card, the investigation begins as to the performance effects this brings.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
The biggest issue for me around the 3.5GB vs. 4GB memory on the 970 is in relation to development and drivers.

Because NV did not disclose this up-front, users had no idea of the potential performance trade-offs used in the 970 configuration. That's the most shady thing from the NV perspective...
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
So... how will the 7/8GB 970 models look (if any)? 8 gigs on paper, 7 in reality, you reckon?
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
The biggest issue for me around the 3.5GB vs. 4GB memory on the 970 is in relation to development and drivers.

Because NV did not disclose this up-front, users had no idea of the potential performance trade-offs used in the 970 configuration. That's the most shady thing from the NV perspective...

Well stated. A few weeks ago, before the disclosure, I used my Christmas gift cards to replace my two GTX 670 FTWs with a Gigabyte GTX 970 G1 Gaming. It would have been nice to know this before I made the switch; I might have waited.
 

amenx

Diamond Member
Dec 17, 2004
4,524
2,859
136
I guess we should now start seeing several sites do thorough testing on 970s with comparable cards to go against. Hope AT does at least.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I guess we should now start seeing several sites do thorough testing on 970s with comparable cards to go against. Hope AT does at least.

Hate to burst your bubble, but I HIGHLY doubt AT will do in-depth testing before we see it done by every other site out there.
AT is usually late to the review party at this point.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I've been looking at some other forums on this issue, and for the most part frame times are good on the GTX 970. But when you're at that 3.5GB mark and you go over it, you get a spike. Then, when you come back below the 3.5GB mark, you get another spike. Then another when you go above again.

It seems like there is quite an issue with moving around that 3.5GB mark.

Here is just one example although there are plenty more:
[image: frametime graph showing spikes when crossing the 3.5GB mark]


This isn't a "massive" issue but it's still one I'd never want to deal with.
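If anyone wants to put actual numbers on those spikes instead of eyeballing graphs, a crude sketch like this would do it (my own, and the input format is just an assumption: one frametime in milliseconds per line, exported from whatever capture tool you use):

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Reads one frametime (in milliseconds) per line from stdin and flags frames
// that take more than 2x the median of the run -- a crude way to count the
// kind of spikes shown in the graph above.
int main() {
    std::vector<double> ft;
    double ms;
    while (scanf("%lf", &ms) == 1)
        ft.push_back(ms);
    if (ft.empty())
        return 0;

    std::vector<double> sorted = ft;
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];

    int spikes = 0;
    for (size_t i = 0; i < ft.size(); ++i) {
        if (ft[i] > 2.0 * median) {
            ++spikes;
            printf("spike at frame %zu: %.2f ms (median %.2f ms)\n", i, ft[i], median);
        }
    }
    printf("%d spikes out of %zu frames\n", spikes, ft.size());
    return 0;
}
```

Doubling the median is an arbitrary threshold; the point is just to count how often the card falls off a cliff once it crosses 3.5GB.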
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
It may have started out as an honest mistake, but I believe someone in the organization knew sooner or later but chose not to mention it.

Yes, while it's delving down into the realm of conspiracy to claim that Nvidia knowingly lied to the press, it's pure naivety to think that of the multitude of people who worked on the card and architecture a great number didn't read the launch reviews and immediately notice the mistake, and that none of them reported the error to management.

On a side note, I think a lot of posters in this thread now owe the OP a serious apology, given the way he has been ridiculed; let's see if they have the humility to do so. While the GTX 970 is technically a 256-bit card, effectively it isn't, and it certainly does not behave as one. In fact, with the additional information that the GTX 970 has some of its ROPs and L2 cache fused off, the situation is worse than what the OP originally proposed.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
So less L2 than 980 as well? That's probably even more concerning to some people.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
DirectCompute is handled via DirectX.

I would bet you every single harvested part works the same way, including AMD before they decoupled the ROPs.

So you think every cut-down nVidia card operates like this? Like Titan and the 780 (non-Ti), for example? Should be easy to check. Someone out there has to have one and can run the .exe to find out.
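For anyone without that .exe handy, the idea behind it is simple enough to sketch yourself against the CUDA runtime API. This is my own rough version, not Nai's actual benchmark: it fills the card chunk by chunk and times a device-to-device copy inside each chunk, so a slow tail segment should show up as a bandwidth drop on the last chunks.

```cpp
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Rough VRAM probe: allocate 128MB chunks until the card is full, then time
// a device-to-device copy within each chunk. On a card with a slow memory
// segment, the last chunks should report noticeably lower bandwidth.
int main() {
    const size_t chunk = 128ull * 1024 * 1024;   // 128 MiB per allocation
    std::vector<void*> bufs;

    void* p = nullptr;
    while (cudaMalloc(&p, chunk) == cudaSuccess)
        bufs.push_back(p);
    cudaGetLastError();  // clear the error left by the final failed allocation

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (size_t i = 0; i < bufs.size(); ++i) {
        char* base = static_cast<char*>(bufs[i]);
        // Copy the first half of the chunk onto its second half and time it.
        cudaEventRecord(start);
        cudaMemcpy(base + chunk / 2, base, chunk / 2, cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // chunk/2 bytes read + chunk/2 bytes written = chunk bytes moved.
        double gbps = (chunk / 1e9) / (ms / 1e3);
        printf("chunk %2zu (%4zu MB in): %6.1f GB/s\n",
               i, i * chunk / (1024 * 1024), gbps);
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    for (void* b : bufs)
        cudaFree(b);
    return 0;
}
```

I haven't run this on a 970 myself, and the driver may well keep the last 512MB out of the way unless something forces it, so treat it as a starting point rather than proof either way.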
 

waffleironhead

Diamond Member
Aug 10, 2005
7,061
570
136
Hate to burst your bubble, but I HIGHLY doubt AT will do in-depth testing before we see it done by every other site out there.
AT is usually late to the review party at this point.

On the contrary, I'd bet this is the kind of stuff that Ryan would set other things aside and focus on. Whether he is first out with some new data or not, I would guess that it is something he is spending time looking into.
 

NomanA

Member
May 15, 2014
134
46
101
So you think every cut-down nVidia card operates like this? Like Titan and the 780 (non-Ti), for example? Should be easy to check. Someone out there has to have one and can run the .exe to find out.

Even the 980M doesn't show this problem. It has all 64 ROPs and the full 2048KB of L2 cache. 768 tests (dGPU, without display) show no issues either.

This is quite a unique issue, specific to the GTX 970. Perhaps they got a lot of 980 dies where at least 8 L2 cache blocks couldn't be salvaged and decided to go for this particular ROP/L2-cache setup. We can only speculate, since it isn't known why the 980Ms keep all 64 ROPs and the 970s don't.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Hardware.fr commented on the announcement. From their diagram we can see that the 980M was reserved the dies where the remaining clusters are fully functional: a single cluster has been disabled outright, which sets the bus at 192-bit with a single memory pool. The 970/970M are harvested dies where the defects are not so conveniently distributed, with faulty SMMs in three of the four clusters on the 970 and two of three on the 970M; the latter doesn't need two RAM pools, though, so its behaviour will differ from the much-debated 970.


http://www.hardware.fr/focus/106/gtx-970-3-5-go-224-bit-lieu-4-go-256-bit.html

Even 980M doesn't show this problem. It has all 64 ROPs and the 2048KB L2 cache. 768 tests (DGPU without display) show no issues either.

Check the link above: there are ROPs that are useless anyway on the 980M, since a full cluster is disabled; the real number of ROPs is 48 if we only count the ones that are functional.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,957
7,666
136
Why on earth would they? The only legal recourse that would conceivably make sense in this situation would be a class action on behalf of consumers who purchased GTX 970 graphics cards.

For the same reason POM sued Minute Maid for false advertising over their pomegranate cocktail. Nvidia's false advertising cost AMD the sale of an R9 290X in my case.
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
Bought a 3rd 970 to replace the 280 in my son's rig. Screw you, Nvidia; it's going back and I'm getting a Tri-X 290 instead... At least that one has a full 4GB of FULL-SPEED RAM. Too bad that with my SLI setup I'll eat over $150 if I sell, once I factor in fleabay/PayPal/sales tax losses!
 