Should you be compensated for the GTX 970 issues and spec changes?

Page 4

Do you feel you're owed compensation for the misrepresented GTX 970?

  • Yes

  • No

  • Undecided



EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,022
136
*snip*

Lol you guys keep saying you only have 3.5GB, do I need to provide screenshots of 3.7+GB in use again? Sheesh

There's definitely 4GB on the card. But Nvidia themselves just admitted that only 3.5GB is usable at full speed. The other 512MB is a separate, low-performance segment, controlled by the driver.
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
That doesn't even make sense. You can see the motor by lifting the hood.



So you looked at the SMM count and number of ROPs? No I don't believe that.

This is what Nvidia would like you to believe is under the hood.

2015-hyundai-genesis-v8-engine.jpg


Only now do you actually get to see what's under it...

IMG_2892_628opt.jpg
 

Spanners

Senior member
Mar 16, 2014
325
1
0
How many people use ROP count and L2 cache as purchase criteria?

Many get excited about those data points when we are dealing with rumours and trying to predict the performance of future products. Other than that I just don't see anyone making decisions on those factors.

Nvidia should offer some explanation. Beyond that I'm rather meh on it. Everyone knows/knew the performance and other important data. They didn't buy it cause it had sexy L2 cache or awesome ROPs.

Maybe, but people (with limited tech knowledge) definitely use RAM capacity as purchasing criteria; it's akin to megapixels in cameras. This isn't a 4GB card in the sense that any card previously has been. To continue the camera analogy (seeing as we are doing them), it would be like purchasing a 20 megapixel camera only to find out it's actually one 17.5 megapixel sensor and another 2.5 megapixel sensor that only works up to ISO 400, not 3200 like the main sensor. Sure it still takes great photos, but it's not what I expected/paid for.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Maybe, but people (with limited tech knowledge) definitely use RAM capacity as purchasing criteria; it's akin to megapixels in cameras. This isn't a 4GB card in the sense that any card previously has been. To continue the camera analogy (seeing as we are doing them), it would be like purchasing a 20 megapixel camera only to find out it's actually one 17.5 megapixel sensor and another 2.5 megapixel sensor that only works up to ISO 400, not 3200 like the main sensor. Sure it still takes great photos, but it's not what I expected/paid for.

This is the most accurate analogy so far, if you also add that certain fringe scenarios could theoretically cause the camera to perform worse than a 17.5 megapixel camera without the extra sensor (e.g. if Nvidia's drivers fail to communicate to an application not to cache data into the 512MB portion).
 
Last edited:

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
If the card was listed as a 2GB model would you have bought it?

Perhaps I wouldn't have considered it as an option in the first place, because on the face of it, a 2GB card would've seemed like a downgrade to my 7950 3GB. In that sense, yes, specs matter.

However, if reviews and tests had unanimously shown that given only 2GB of VRAM, the GPU performs exceptionally well (i.e. equally well as the GTX 970 we actually have) at high resolutions, AA levels, tessellation levels etc., then it probably would've caught my eye and I would've bought it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I'm torn on it. I think it would be good of Nvidia to offer something, and maybe even the option to return if you really feel wronged by the false specs.

On one hand, people saw the performance and price and pulled the trigger and got one, and they still have that. But on the other hand, this very well could have an effect on the card down the road with games that are bound to be more vram heavy. People with a lot of pixels to push may get lower performance than anticipated from the lower ROP count (though Nvidia says the ROP thing isn't an issue as there are still enough to handle what the CUDA cores can feed, but I don't know if that's damage control or truth).

So I don't know, I feel Nvidia should offer something, but I don't know what really. I suppose the option of returning the card for a full refund for those who really feel they got screwed over wouldn't hurt, and probably is the 'right' thing to do. Beyond that, maybe a Steam voucher or if they have an upcoming game bundle they could offer that.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I hope you guys are confident NV's driver team will not leave Maxwell in the dust once Pascal is out, else your SLI setups will be unable to handle games with 4K textures. You'd also hope more games don't fall into that "worst case" scenario in pushing the limits of VRAM. With more console ports and 4-6GB of memory for consoles to abuse, it will be a damn shame if you can't run 970 SLI at 1080p and enjoy ultra textures or AA.

More console blaming. When will the nonsense end? Console games are not using 4-6GB of memory for the VRAM. There is a set amount (I think 2GB) reserved for the OS from the 8GB. The rest is used for the entire game, including the netcode and everything else. There is no 6GB of VRAM available. As for Shadow of Mordor's ultra textures (I didn't quote that part), have you used them? I can't even tell the difference when they're on. They claim it requires 6GB of VRAM to use them, but when I do a side by side I can't tell it's even working; seems like a complete waste of resources.

Perhaps I wouldn't have considered it as an option in the first place, because on the face of it, a 2GB card would've seemed like a downgrade to my 7950 3GB. In that sense, yes, specs matter.

However, if reviews and tests had unanimously shown that given only 2GB of VRAM, the GPU performs exceptionally well (i.e. equally well as the GTX 970 we actually have) at high resolutions, AA levels, tessellation levels etc., then it probably would've caught my eye and I would've bought it.

Pretty much. It's not the specs, it's the performance in games I care about at the resolution I need that matters.
 
Last edited:
Feb 19, 2009
10,457
10
76
You guys don't need to come up with silly analogies. The problem is CLEAR. Read AT's article:

"The worst case scenario on the other hand would be to have the NVIDIA heuristics fail, or alternatively ending up with a workload where no great solution exists, and over 3.5GB of resources must be repeatedly and heavily accessed**. In this case there is certainly the potential for performance to crumple, especially if accessing resources in the slow segment is a blocking action."

Currently in FC4 and Frostbite engine games (which AT went into), they allocate VRAM above what they need, for dynamic LOD. Thus, in these games, it's not actually pushing at the 4GB barrier at all, since they do not require it.

The problem will exist for games that push to 4GB AND NEED it.

In simple layman's terms, NV is saying: we'll try to hide the issue by offloading data that's rarely accessed into the last 0.5GB segment. When we can't do that... oops.
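That layman's description amounts to a hot/cold partitioning heuristic: frequently accessed resources claim the fast segment, the coldest data gets pushed into the slow one. A hypothetical sketch (the function name, tuple layout, and sorting policy are assumptions, not NV's actual algorithm):

```python
# Sketch of the heuristic described above: keep hot resources in the
# fast 3.5GB segment, push the coldest into the 0.5GB segment.
# Purely illustrative; the real heuristic is NVIDIA's and undocumented.
def partition_resources(resources, fast_cap_mb=3584, slow_cap_mb=512):
    """resources: list of (name, size_mb, accesses_per_frame)."""
    fast, slow = [], []
    fast_used = slow_used = 0
    # Hottest first, so frequently accessed data claims the fast segment.
    for name, size, hits in sorted(resources, key=lambda r: -r[2]):
        if fast_used + size <= fast_cap_mb:
            fast.append(name)
            fast_used += size
        elif slow_used + size <= slow_cap_mb:
            slow.append(name)
            slow_used += size
        # else: the "oops" case -- more than 3.5GB of hot data,
        # and performance crumples.
    return fast, slow
```

The failure mode in AT's quote is exactly the commented-out `else` branch: when more than 3.5GB of data is genuinely hot, no placement saves you.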
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Maybe, but people (with limited tech knowledge) definitely use RAM capacity as purchasing criteria; it's akin to megapixels in cameras. This isn't a 4GB card in the sense that any card previously has been. To continue the camera analogy (seeing as we are doing them), it would be like purchasing a 20 megapixel camera only to find out it's actually one 17.5 megapixel sensor and another 2.5 megapixel sensor that only works up to ISO 400, not 3200 like the main sensor. Sure it still takes great photos, but it's not what I expected/paid for.

Finally, someone who is accurate here. You can't look at the camera and tell. Car analogies just don't work.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
However, if reviews and tests had unanimously shown that given only 2GB of VRAM, the GPU performs exceptionally well (i.e. equally well as the GTX 970 we actually have) at high resolutions, AA levels, tessellation levels etc., then it probably would've caught my eye and I would've bought it.
You are putting all your faith in how the drivers handle memory. Going forward it is very possible (likely even) that more demanding games will show issues, especially since Nvidia has shown they are not all that interested in driver optimization for their previous generation cards.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
This is what Nvidia would like you to believe is under the hood.

2015-hyundai-genesis-v8-engine.jpg


Only now do you actually get to see what's under it...

IMG_2892_628opt.jpg
I thought you could just pop the hood and magically know what was there regardless of what those emblems said. But wait, who cares what those emblems say as long as it goes like they said.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You guys are missing the point about the GTX 970 defenders.

The vast majority of GTX 970 people who don't care about this issue are happy because the performance is exactly as reviewers advertised. They don't care what the specs say, as long as performance is in line with what reviews said it would be.

People like "Pneumothorax" who returned a GTX 970 for a R9 290x are few and far between.

This becomes especially apparent when you move to other forums where there is nothing good said about AMD. Go to a big casual forum like NeoGAF and see how many people have nice things to say about the R9 290X. The general consensus on casual forums is that AMD graphics cards are cheap (for a reason), have bad driver support, and are generally an inferior product.

So you have a combination of the GTX 970 performing as expected, along with 90%+ of purchasers of the GTX 970 not seeing a viable alternative to the card anyway, and there is very little backlash.

If this had been deception in the form of PS4 vs. Xbox One, where the PS4 had lied about its specs, there would be far more backlash, as people are far more likely to cross-shop PS4 vs. Xbox One than they are AMD vs. Nvidia at this point.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I thought you could just pop the hood and magically know what was there regardless of that those emblems said, but wait, who cares what those emblems say as long as it goes like they said.

Such ignorance. You can tell what type of motor it is even with the shroud.

The only analogy that works is the camera. You can't visually see anything incorrect about it. Everything is internal.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Such ignorance. You can tell what type of motor it is even with the shroud.

The only analogy that works is the camera. You can't visually see anything incorrect about it. Everything is internal.

It's quite amazing how you've danced around answering the actual analogy.
You know the situation he is trying to portray, but instead of addressing it you made the issue about how all car buyers should be able to tell a V6 from a V8 engine.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It's quite amazing how you've danced around answering the actual analogy.
You know the situation he is trying to portray, but instead of addressing it you made the issue about how all car buyers should be able to tell a V6 from a V8 engine.

The issue is the analogy doesn't work. The end
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
There's a clear situation where the reality that this is effectively a 3.5GB card matters, despite the fact that people are getting the 1080p/1440p performance they thought they would: going SLI to game at 4K.

The GTX 970 has now been outed as a fairly poor choice for such setups, and almost no one could have known that based on the launch-day reviews, which reported that this was a 4GB card and benched it almost exclusively below 4K and in single-card setups. Yes, you could find 4K SLI reviews (e.g., http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html), but the truth is that most people aren't gaming at 4K yet, and we don't know how future games will perform at 4K. What we do know is that VRAM matters, and this card is lacking in that crucial area.

I don't care about the ROPs or cache, but this is a 4GB card like the 660 Ti was a 2GB card, i.e., it's literally true but effectively false. Did it or does it matter to most people? No. Did it need to be divulged like it was in the case of the 660 Ti (see http://www.anandtech.com/show/6159/the-geforce-gtx-660-ti-review)?

Absolutely.

Note - I'm not voting in the poll as I don't own a 970, but I was considering one for my 4K setup and am no longer interested.
 
Last edited:

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
You are putting all your faith in how the drivers handle memory. Going forward it is very possible (likely even) that more demanding games will show issues, especially since Nvidia has shown they are not all that interested in driver optimization for their previous generation cards.

Are you talking about the hypothetical 2GB card here, or the "3.5GB" card we actually have?

If the former... Yes, good point. But this point would've come up in reviews or in threads discussing the GPU or its reviews. Don't assume I wouldn't have researched things thoroughly before making the purchase.

If you're talking about the actual GTX 970, I'm not putting any faith in anything. I bought the card based on the performance it showed, with the knowledge that I wouldn't be downgrading from my previous 7950 3GB in terms of the card's ability to handle high resolutions, high res textures, high MSAA levels etc. This hasn't changed. A single GTX970 running a game that actually needs over 3.5GB VRAM almost certainly isn't fast enough to fulfill my framerate requirements anyway.
 
Last edited:

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I voted no, the performance hasn't changed from that of the original review. Besides which, I didn't see any polls for refunds with the AMD 7xxx CF debacle that still doesn't support DX9 or 4K.
If anything, buyers should be allowed to return the card for the price paid.
 

michal1980

Diamond Member
Mar 7, 2003
8,019
43
91
[redacted]

Infraction issued for obscenity. Six days off for accumulation of points.
-- stahlhart
 
Last edited by a moderator:

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
I don't own a 970 but what would interest me the most is looking at benchmarks like this:
https://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed

Which have frame timing information. Don't forget that even launch-day(ish) benchmarks are still valid, because this has always existed on the 970; it's just that now we know about it. We may look at those benchmarks and place more emphasis on the frame timing portions based on what we know now, but it doesn't change the fact that they're still valid. None of the games tested in the TechReport article show that what would be perceived as stuttering is any worse than on the other cards tested. The 970 is worse than the 980, but that's to be expected. How much of this is because of the memory configuration I'm not sure, but it's still competitive in every game they tested.

That being said, I'm not really sure how I feel about this. I don't own a 970 so I can't relate, but nevertheless corner cases should be investigated (the AnandTech article said they couldn't find any).
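For anyone unfamiliar with how figures like TechReport's are produced: a "99th percentile frame time" is just a nearest-rank percentile over per-frame render times, and it catches stutter that averages hide. A minimal sketch with made-up numbers (not TechReport's data):

```python
# Nearest-rank percentile over per-frame render times: the value that
# pct% of frames beat or match. Numbers below are invented for
# illustration, not taken from any review.
def percentile_frame_time(frame_times_ms, pct=99):
    ordered = sorted(frame_times_ms)
    idx = max(0, round(len(ordered) * pct / 100) - 1)
    return ordered[idx]

# 98 smooth frames plus two spikes: the average barely moves,
# but the 99th percentile exposes the stutter.
times = [16.7] * 98 + [50.0, 80.0]
```

This is why a card can post a healthy average FPS while still feeling choppy, and why frame-time data matters for judging the 970's slow segment.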
 
Last edited:

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,022
136
You guys are missing the point about the GTX 970 defenders.

The vast majority of GTX 970 people who don't care about this issue are happy because the performance is exactly as reviewers advertised. They don't care what the specs say, as long as performance is in line with what reviews said it would be.

*snip*

This is why I started the thread; it seems that the majority of owners do indeed care about the issue and do feel slighted by it.

For me, the R9 290 is very much an option; Nvidia's response and the costs (if any) for me to return the card will determine if I switch.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
There's a clear situation where the reality that this is effectively a 3.5GB card matters, despite the fact that people are getting the 1080p/1440p performance they thought they would: going SLI to game at 4K.

The GTX 970 has now been outed as a fairly poor choice for such setups, and almost no one could have known that based on the launch-day reviews, which reported that this was a 4GB card and benched it almost exclusively below 4K and in single-card setups. Yes, you could find 4K SLI reviews (e.g., http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html), but the truth is that most people aren't gaming at 4K yet, and we don't know how future games will perform at 4K. What we do know is that VRAM matters, and this card is lacking in that crucial area.

I don't care about the ROPs or cache, but this is a 4GB card like the 660 Ti was a 2GB card, i.e., it's literally true but effectively false. Did it or does it matter to most people? No. Did it need to be divulged like it was in the case of the 660 Ti (see http://www.anandtech.com/show/6159/the-geforce-gtx-660-ti-review)?

Absolutely.

Note - I'm not voting in the poll as I don't own a 970, but I think 970 owners should be concerned.

Fair enough, but by the time 4K gaming becomes a major thing, even a GTX 980 would be EOL. I'll replace these long before I have a 4K setup, and I'd probably need a complete rebuild with 3 or 4 cards to get performance in new games that I consider playable at 4K resolutions if I had a display for it today. Right now anything newer than 3 years or so performs pretty poorly on SLI 970s at 4K in my DSR testing. Older games perform great though.

I don't own a 970 but what would interest me the most is looking at benchmarks like this:
https://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed

Which have frame timing information. Don't forget that even launch-day(ish) benchmarks are still valid, because this has always existed on the 970; it's just that now we know about it. We may look at those benchmarks and place more emphasis on the frame timing portions based on what we know now, but it doesn't change the fact that they're still valid. None of the games tested in the TechReport article show that what would be perceived as stuttering is any worse than on the other cards tested. The 970 is worse than the 980, but that's to be expected. How much of this is because of the memory configuration I'm not sure, but it's still competitive in every game they tested.

That being said, I'm not really sure how I feel about this. I don't own a 970 so I can't relate, but nevertheless corner cases should be investigated (the AnandTech article said they couldn't find any).

Those frametimes and 99th percentile results show good things for the 970 though. Guru3D did FCAT tests and saw some dropped frames in SLI but even the 980 had dropped frames. After new drivers were released they did 3 way SLI testing and the dropped frames were alleviated. Was it a 2 way SLI problem or did drivers take care of it? We dunno, but single cards didn't drop frames on their test setup.
 
Last edited:
Feb 19, 2009
10,457
10
76
There's a clear situation where the reality that this is effectively a 3.5GB card matters, despite the fact that people are getting the 1080p/1440p performance they thought they would: going SLI to game at 4K.

It's not limited to 4K. We'll see it as games come with 4K textures for ultra settings. So far, user frame time measurements in SoM at 1080p with ultra textures show very bad latency AS SOON as VRAM use goes above 3.5GB. The latency smooths out as soon as VRAM drops below that amount.

It is a problem, but it depends on the game: whether it allocates extra VRAM for dynamic purposes or actually NEEDS it. If a game needs above 3.5GB then, as per AT's and PCPer's articles, you will definitely get gimped performance.
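If you want to check this pattern in your own logs, splitting frame times by whether VRAM use was above 3.5GB at capture time is enough to see it. A rough sketch (the `(vram_mb, frame_time_ms)` tuple layout is an assumption; adapt it to whatever your capture tool exports):

```python
# Split logged frame times by whether VRAM use was over 3.5GB when the
# frame was captured. The (vram_mb, frame_time_ms) sample layout is an
# assumption for this sketch, not any tool's actual format.
def split_by_vram(samples, threshold_mb=3584):
    below = [t for vram, t in samples if vram <= threshold_mb]
    above = [t for vram, t in samples if vram > threshold_mb]
    return below, above
```

Comparing the average or 99th percentile frame time of the two lists should make any over-3.5GB latency spikes obvious if they are there.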