Why Doesn't Anyone Get to the Bottom of the Aging of Kepler and GCN?

Feb 19, 2009
10,457
10
76
Yeah, you are correct. Video cards are never the battle of good and evil that some make them out to be; I was just clarifying in context. My bad if it seemed out of place.

I mean, at the end of the day it doesn't matter why Kepler tanked from a purely predictive standpoint, because past performance is no guarantee of future results. I think making an arbitrary assumption like "Nvidia will drop Maxwell like Kepler," or that future Nvidia GPUs can't overcome their deficiencies, is folly. And I don't know if I can accept that future AMD GPUs will get the same console advantage going forward, seeing as how the newish GCN GPU Fiji is a basket case when it comes to performance.

[Benchmark chart: 1080p Very High]


It will be fun to see for sure, but I think the retrospective discussion is relevant today.

Just be aware with Rise of the Tomb Raider benches that the pre-release version is not representative, because Nixxes shipped a release-day update that improved performance on AMD across all SKUs. AMD also released a driver 3 days later to fix Fiji performance and stutter.

This was the release build.

[Benchmark charts: 2560x1440 and 3840x2160]




But Fiji definitely needs special driver treatment compared to other GCN parts. I suspect it's the HBM.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
The rapid degeneration of GPUs in general makes it hard to respect any of them, IMO. When I hear people say, "That game requires a super high end beast of a GPU," I now tell myself that no such card exists, nor will it ever. The best cards of today are only doing their best to keep up with the mid-range cards right around the corner.
It holds for CPUs though, so at least there is that. It also usually holds for cooling solutions, like a good custom water setup. It holds for a good mechanical keyboard; SSDs last a long time and perform well over time; monitors hold for a long time and can remain beastly; speakers, headphones, mice and other things as well. The only component that is great today and sucks a year from now is the GPU, Nvidia GPUs to be specific. So at least an enthusiast still has things to be enthused about which will hold their respect over time. Luckily, not everything about my rig is like an Nvidia GPU.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
It seems new games love GCN.

It's a little late though; most who bought the higher-end GTX 780 Ti/970/980/290/290X are waiting for new cards in 3 months or later this year.

Soon they will optimize for Fury X and Polaris, and the 280/290/390 cards will suffer, just like Kepler compared to Maxwell.
 
Feb 19, 2009
10,457
10
76
Soon they will optimize for Fury X and Polaris, and the 280/290/390 cards will suffer, just like Kepler compared to Maxwell.

Doubt it, unless it's a DX12 game with uarch-specific optimization; then yes, GCN 1 will be left behind.

It'll be a significant change when the next-gen consoles arrive, if they use a new uarch.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The rapid degeneration of GPUs in general makes it hard to respect any of them, IMO. When I hear people say, "That game requires a super high end beast of a GPU," I now tell myself that no such card exists, nor will it ever. The best cards of today are only doing their best to keep up with the mid-range cards right around the corner.
It holds for CPUs though, so at least there is that. It also usually holds for cooling solutions, like a good custom water setup. It holds for a good mechanical keyboard; SSDs last a long time and perform well over time; monitors hold for a long time and can remain beastly; speakers, headphones, mice and other things as well. The only component that is great today and sucks a year from now is the GPU, Nvidia GPUs to be specific. So at least an enthusiast still has things to be enthused about which will hold their respect over time. Luckily, not everything about my rig is like an Nvidia GPU.

When AC Unity came out, I knew there wouldn't be a GPU good enough to play it until at least 2016. Most of the games that you guys have all played, I have not. I don't believe today's GPUs can really handle them. Polaris/Vega will be when I play most of the games that have come out these past couple of years.

And what about games that don't support Crossfire/SLI? You DEFINITELY need new GPUs for those games, lol.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
It's a little late though; most who bought the higher-end GTX 780 Ti/970/980/290/290X are waiting for new cards in 3 months or later this year.

Soon they will optimize for Fury X and Polaris, and the 280/290/390 cards will suffer, just like Kepler compared to Maxwell.

What's your reasoning for why Tahiti and Hawaii will suffer when Polaris comes out? What, architecturally, will keep them from aging better than Kepler did against Maxwell?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
From a gamer's perspective, Kepler being slow is a bad thing and it makes me less confident in their current and future products. But if I owned substantial stock in Nvidia, I would be pissed off every time I heard of a single software engineer wasting time on products no longer being sold. If I owned Nvidia, Kepler would be ignored beyond belief. Two gens old? GTFO not on my dime bishtces.

I hope I'm never unfortunate enough to buy anything you make.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I suppose they are testing what is available to buy new at retail. Granted, there are used cards from previous generations for sale, but I don't know what part of the market that is.

And it would increase testing sites' workload severalfold to test cards from 3 or 4 generations from each maker across a lot of different games. That is just the way it is; pretty much every testing magazine in any field tests the "latest and greatest".

The main reason for including older cards is to show people who own them what the new cards offer in relationship to them.

As far as them doing some kind of deep exposé on the performance of the last gen relative to before and after the next gen dropped? Well, they aren't going to bite the hand that feeds them. The only reason the AIBs give cards to the sites to test is to generate sales. If sites did a review whose only purpose is to expose the IHVs gimping old hardware, what are the odds they keep getting cards?

Review sites are an extension of the hardware companies' marketing. They serve no other purpose. They are purely self-serving. They sure as hell aren't there to serve the readership.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
By the way, saying bad things about AMD can also get you yanked from the review list. TechReport's investigation into frame pacing did not make it any friends at AMD, and it was eventually denied a Nano for review. Ironically it did lead to Scott Wasson leaving the site to work for AMD.

TechReport's initial testing was with the 7950, IIRC. There were a lot of review samples sent between then and the Nano, so I don't think the evidence points to that as the reason it was denied one. Now, IIRC, it was the sites that were using Project CARS in their benchmark suites, and that wouldn't drop it even after it was apparent the game wasn't representative of performance between the two brands, that got the axe. The straw that broke the camel's back, so to speak.

The game wasn't even particularly popular, and it badly skewed results. It was obvious to AMD that there was only one reason to include it: to make AMD cards look bad.

nVidia plays a lot of politics and AMD is just terribad at it. I could give you many examples but that's not actually the topic of this thread.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
So you guys are implying that Nvidia is intentionally gimping Kepler to make people upgrade to newer cards? That they are purposely not providing driver improvements for new games on Kepler and only focusing on Maxwell? Well, from a business perspective it makes sense: more people upgrading to newer-gen cards sooner, rather than holding on to their old card, means more money for Nvidia. So what Nvidia is doing is what's "best for business".
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Gimping, no; not making updates, yes.

Any company would do this, and AMD have done the same for ages. 2013 Richland users especially got hit hard.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It's a little late though; most who bought the higher-end GTX 780 Ti/970/980/290/290X are waiting for new cards in 3 months or later this year.

Soon they will optimize for Fury X and Polaris, and the 280/290/390 cards will suffer, just like Kepler compared to Maxwell.

I think GCN 1.1 will do relatively well, at least on console ports. But something like Fiji will get hit twice: not only the lack of driver updates but also the 2 VRAM engineers. Tonga users already look abandoned.

Otherwise, yes, anyone not on GCN 1.1, GCN 1.3, or Pascal loses one way or the other.
 

amenx

Diamond Member
Dec 17, 2004
4,527
2,863
136
So you guys are implying that Nvidia is intentionally gimping Kepler to make people upgrade to newer cards? That they are purposely not providing driver improvements for new games on Kepler and only focusing on Maxwell? Well, from a business perspective it makes sense: more people upgrading to newer-gen cards sooner, rather than holding on to their old card, means more money for Nvidia. So what Nvidia is doing is what's "best for business".
No. Doing so is actually bad for business. If my GPU degrades in performance vs. the competition over time, I most surely will not buy from the same manufacturer again. I have a Maxwell (970) GPU and I suspect its performance is already beginning to fall off. I will not be buying Nvidia next time around because of this. Pretty sure that applies to anyone seeking some reasonable measure of longevity in their hardware.

But I don't think Nv did this intentionally with Kepler. Their hardware is just not as capable as GCN cards at dealing with games developed for the next-gen consoles. Their driver teams may also be overwhelmed trying to optimize for both Maxwell and Kepler, so they prioritize. In the end, when customers see they need to upgrade more frequently just to keep up, they will switch sides. Simple as that. And I think Nvidia is nervous about this.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
No. Doing so is actually bad for business. If my GPU degrades in performance vs. the competition over time, I most surely will not buy from the same manufacturer again. I have a Maxwell (970) GPU and I suspect its performance is already beginning to fall off. I will not be buying Nvidia next time around because of this. Pretty sure that applies to anyone seeking some reasonable measure of longevity in their hardware.

But I don't think Nv did this intentionally with Kepler. Their hardware is just not as capable as GCN cards at dealing with games developed for the next-gen consoles. Their driver teams may also be overwhelmed trying to optimize for both Maxwell and Kepler, so they prioritize. In the end, when customers see they need to upgrade more frequently just to keep up, they will switch sides. Simple as that. And I think Nvidia is nervous about this.
The next gen of consoles is also gonna be from AMD; better to switch sides and stay with them for as long as you can.

http://forums.anandtech.com/showthread.php?t=2467679
 
Feb 19, 2009
10,457
10
76
I think GCN 1.1 will do relatively well, at least on console ports. But something like Fiji will get hit twice: not only the lack of driver updates but also the 2 VRAM engineers. Tonga users already look abandoned.

Otherwise, yes, anyone not on GCN 1.1, GCN 1.3, or Pascal loses one way or the other.

That assumes devs don't take a few days to optimize for the different uarchs; we've seen how easily the GoW devs fixed Tonga and Fiji and improved the rest of the lineup in a week. If they actually tried, it wouldn't even be an issue.
 
Feb 19, 2009
10,457
10
76
I know how much you like to ensure citations are correct and proper. I am assuming you made a mistake by citing imgur for this slide. =]

Are you serious? That's a slide (page 50) from the official presentation. -_-

[Slide 50 from the official presentation]


Take a few seconds and test your googling skills; you'll find the official source easily enough.

At least you recognize this:

[image]


-_-
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
The rapid degeneration of GPUs in general makes it hard to respect any of them, IMO. ....<snip> .... The only component that is great today and sucks a year from now is the GPU, Nvidia GPUs to be specific. So at least an enthusiast still has things to be enthused about which will hold their respect over time. Luckily, not everything about my rig is like an Nvidia GPU.
You're nuts.

There is a constant improvement of base technology. Every year we learn how to put more transistors on a square millimeter. Every year we gain more knowledge on how to design chips. We get better software to help design chips.

The GPU industry succeeds in using that overall progress to manufacture better products. The CPU industry does not. They have better base material to work with, but they do not succeed in converting those better opportunities into a better end result for the customers. IMNSHO, the GPU industry is the successful one here, and the CPU industry is the failing one.


Unless you wanted us all to still be using slide rules. And horses and carriages. Humanity makes technological progress. If you don't like that, maybe you should move back to the Dark Ages. Tbh, now that I think of it, I have no idea what you are doing on AT's technical forums.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
You're nuts.

There is a constant improvement of base technology. Every year we learn how to put more transistors on a square millimeter. Every year we gain more knowledge on how to design chips. We get better software to help design chips.

The GPU industry succeeds in using that overall progress to manufacture better products. The CPU industry does not. They have better base material to work with, but they do not succeed in converting those better opportunities into a better end result for the customers. IMNSHO, the GPU industry is the successful one here, and the CPU industry is the failing one.


Unless you wanted us all to still be using slide rules. And horses and carriages. Humanity makes technological progress. If you don't like that, maybe you should move back to the Dark Ages. Tbh, now that I think of it, I have no idea what you are doing on AT's technical forums.
All that is good, but I'd be pissed if resale value dropped on these cards. Most Nvidia buyers simply sell their old cards, and for whatever reason prices are stable. As a customer, the lack of support will piss me off, given the 780 Ti wasn't cheap, nor was the Titan or Titan Z.

It's funny that people are complaining about the 78xx series and lower not getting DX12 support. GCN was introduced back in 2012, and all GCN cards will get the various DX12 functionality, and that's a good thing. Well, all the things that AMD promised, anyway.
 


Goatsecks

Senior member
May 7, 2012
210
7
76
Take a few seconds and test your googling skills, you'll find the official source easy enough.

For someone who cares so much about proper sources, I am surprised that you are so nonchalant about this. The onus of finding correct citations is not on the reader. Don't make out like I'm the one being lazy.

Piers Daniell, "Vulkan on NVIDIA GPUs," SIGGRAPH 2015 presentation: http://www.nvidia.com/object/siggraph2015-best-gtc.html

You can download the PDF.

That's the source for Vulkan being supported on Fermi.

Thank you.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
As for Kepler, GCN and Maxwell...

It has to do with compute utilization...


Just like Kepler's SMX, each of Maxwell's SMMs has four warp schedulers, but what changed between SMX and SMM is that each SMM's CUDA cores are assigned to a particular scheduler, so there are fewer shared units. This simplifies scheduling, as each of the SMM's warp schedulers issues to a dedicated set of CUDA cores equal to the warp width (warps are 32 threads wide, and each scheduler issues its warps to 32 CUDA cores). You can still dual-issue, like with Kepler, but a single issue results in full CUDA core utilization. This means you have fewer idling CUDA cores. (There's also Maxwell's dedicated 64KB per SM versus Kepler's 16KB + 48KB split design.)

Why is this important? Because console titles are being optimized for GCN. Optimizing for GCN means using wavefronts (not warps). Wavefronts are 64 threads wide (mapping directly to two warps). Since a Maxwell SMM is composed of 4x32 CUDA core partitions, a wavefront occupies 2x32 CUDA core partitions (half an SMM). With Kepler, you had 192 CUDA cores per SMX; try mapping wavefronts to that and you need 3 wavefronts to fill it. If you only have a single wavefront, then you're utilizing 50% of a Maxwell SMM but only 33.3% of an SMX. That's a lot of unused compute resources.

With NVIDIA's architecture, only kernels belonging to the same program can be executed on the same SM. So with SMX, that's 66.6% of compute resources not being utilized. That's a huge loss.
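To put concrete numbers on that, here's a minimal Python sketch of the single-wavefront arithmetic above. The per-SM core counts are the widely published figures for these architectures, and the lone-wavefront scenario is just this post's worked example, not a model of real workloads:

```python
# Minimal sketch of the single-wavefront utilization arithmetic above.
# Core counts are the commonly published per-SM/CU figures; a lone
# 64-thread wavefront is this post's worked example, not a real workload.

WAVEFRONT = 64  # GCN wavefront width (threads); maps to two 32-thread warps

sm_widths = {
    "Kepler SMX (192 CUDA cores)": 192,
    "Maxwell SMM (4 x 32 CUDA cores)": 128,
    "GCN CU (4 x 16-wide SIMDs)": 64,
}

for name, cores in sm_widths.items():
    # Fraction of the SM's lanes one 64-thread wavefront can occupy
    utilization = min(WAVEFRONT / cores, 1.0)
    print(f"{name}: {utilization:.1%} occupied by one wavefront")

# Prints ~33.3% for SMX, 50.0% for SMM, and 100.0% for a GCN CU,
# matching the figures quoted in the post.
```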

So what has happened is that:
1. The ratio of render:compute is tilting higher towards compute than a few years ago when Kepler was introduced.
2. The console effect is pushing developers into using more compute resources in order to extract as much performance as possible from the consoles GCN APUs.
3. Console titles are being optimized for GCN.

This has pushed GCN performance upwards, as ALL GCN-based GPUs (GCN 1/2/3) use Compute Units that map directly to a wavefront (4 x 16-wide SIMDs = 64 lanes).

The end result is higher compute utilization on GCN, fewer wasted resources on GCN, good utilization on Maxwell, and very crappy utilization on Kepler.

NVIDIA is evolving towards a more GCN-like architecture while AMD are refining GCN. GCN is more advanced than any NVIDIA architecture. People who claim that GCN is "old" simply don't understand GPU architectures.

It's the console effect.
 

Timmah!

Golden Member
Jul 24, 2010
1,571
935
136
It's pretty clear Nvidia doesn't do much Kepler-specific driver work anymore. With The Witcher 3, performance with the Game Ready driver was identical to a year-old driver. Then, after a lot of complaining, Nvidia did some Kepler fixes too.

But most games aren't popular enough to get enough people to complain.

This. Games are too reliant on specific driver support; if it's gone, performance plummets.

Look at this: the performance of the GTX 980 vs the 780 Ti in a GPGPU situation, Octane Render:

https://render.otoy.com/octanebench/results.php

780Ti - 103 points
980 - 98 points

Pretty much tied for performance. How much faster is the 980 in games?
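Worked out, that score gap is only about 5%. A minimal sketch using just the two numbers quoted above:

```python
# OctaneBench scores quoted above (higher is better).
scores = {"GTX 780 Ti": 103, "GTX 980": 98}

# Relative GPGPU throughput: the 780 Ti comes out roughly 5% ahead,
# i.e. effectively a tie when no game-specific driver work is involved.
ratio = scores["GTX 780 Ti"] / scores["GTX 980"]
print(f"780 Ti / 980 = {ratio:.2f}")  # ~1.05
```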