
Mahigan

Senior member
Aug 22, 2015
573
0
0
We can't really discuss the data due to the website it was posted on, and I can't really take the data seriously. That said, Hardware Canucks had a similar article here:
http://www.hardwarecanucks.com/reviews/video_cards/gtx-780-ti-vs-r9-290x-the-rematch/

What neither of these articles answers is the reasoning behind the data, or the trend. Both sites use older games, for the most part, and don't separate game titles by generation.

If you separate by generation, you see that in newer-gen titles, Kepler struggles a lot. The reasoning for this has already been explained (I explained it in another thread recently).
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Maybe the GTX 780 Ti has reached EOL, while with the 290X there is still room for improving performance. Who knows.

Edit: In Rise of the Tomb Raider (DX12), despite having only 3GB, the GTX 780 Ti is faster than the 290X? Isn't this game memory intensive?
 
Last edited:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
If you separate by generation, you see that in newer-gen titles, Kepler struggles a lot. The reasoning for this has already been explained (I explained it in another thread recently).

Which thread out of curiosity? I think I must have missed it.

My personal suspicion, especially after seeing some recent games where Maxwell falls behind more than Kepler does, is that NV has been releasing some seriously short-sighted architectures, and AMD has been left to control the direction of future development.
 
Feb 19, 2009
10,457
10
76
[Nope]

You need to read the OP's comment about shill sites. You obviously didn't listen.

-Rvenger
 
Last edited by a moderator:

Kris194

Member
Mar 16, 2016
112
0
0
In a few months, Nvidia is going to release Pascal, which is an improved Maxwell, so it's not strange that they don't care that much about Kepler anymore.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Both sites use older games, for the most part, and don't separate game titles by generation.

If you separate by generation, you see that in newer-gen titles, Kepler struggles a lot

I believe the article I linked does separate by generation and shows Kepler struggling in the latest games. The test you linked is 8 months old.

[Benchmark chart: 780 Ti vs 290X vs Titan]
 
Last edited:

Mercennarius

Senior member
Oct 28, 2015
466
84
91
The game industry has been increasingly moving towards programming games that are better suited for AMD's architecture for some time now. A large part of this is due to the consoles utilizing similar GPU tech to what's seen in AMD's GPUs. AMD also seems to have gotten much better at putting out regular driver updates that actually increase performance on both current and legacy GPUs over the last few years.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
The first thing I noticed: why does the original Titan trail the 780 Ti by that much in some games? The worst offender is Wolfenstein: The New Blood, with 33 fps versus 56 fps average!

The second thing: there are only a few games in there where one card can be considered playable at 1080p while another cannot. Hawaii stumbles in Wolfenstein, Project CARS and Fallout 4, whereas Kepler stumbles in Far Cry Primal and The Witcher 3.

The third is that Hawaii scales better at high resolutions, but all of these cards lack the grunt to pull that off gracefully anyway.

So, nothing too exciting to see here in my opinion.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
A large part of this is due to the consoles utilizing similar GPU tech to what's seen in AMD's GPUs.

I think a large part of this is the fact that we were stuck on 28nm for so long and AMD decided to rebrand their cards, so they were forced to rely on driver improvements.

Nvidia, on the other hand, made an architectural change with Maxwell, focused on performance per watt, and made drivers that mostly improved their latest cards.

Nvidia has always worked better with game devs, but lately I see AMD finally gets it and has been trying to work more closely with devs. I would guess this is good for all of us.
 

biostud

Lifer
Feb 27, 2003
19,741
6,823
136
Nvidia cards deliver full performance at launch, while AMD needs to do lots of driver tweaking to reach full potential?

Also, Tomb Raider performs worse in DX12 than in DX11, so it's probably not a good benchmark atm.
 

tg2708

Senior member
May 23, 2013
687
20
81
So, in other words, GCN goes from slightly worse to on par in some games, and from on par to even better performance over time. One architecture is built for the here and now but deteriorates, while the other was built with the future in mind and loses less performance over time. It begs the question: who has the better architecture in this case?
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
So, in other words, GCN goes from slightly worse to on par in some games, and from on par to even better performance over time. One architecture is built for the here and now but deteriorates, while the other was built with the future in mind and loses less performance over time. It begs the question: who has the better architecture in this case?


Based on these statistics, it's not really a question.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
It begs the question: who has the better architecture in this case?

Since that question really can't be answered without a heavy dose of opinion, I have a different question.

Do you believe AMD meant to have the 7970/280X/380X, or do you think they would have refined on the 20nm node (which never happened) and made drivers to optimize their 20nm architecture?

I don't think we should be trying to see Nvidia in a bad light because they moved to a better, more power-efficient GPU before AMD.
Do you?

How long should you FULLY support an older, outdated card? 5 years?
I think Kepler is, what, 4 years old? Notice I said FULLY support.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
There is no doubt in my mind that Nvidia neglects Kepler, and Kepler needs a bit of work to perform properly. With The Witcher 3, enough people complained to get Nvidia to fix it after 2 weeks, but most games aren't that popular.

My 100-euro 670 still performs fine, but if I had bought a 780 Ti I'd be slightly pissed that it now performs at 970 level, or not, if I'd bought a 980 Ti right at its launch :p.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Based on these statistics, it's not really a question.

Yes, you're right, it's not a question, it's an opinion. :thumbsup:
The real question was: did Nvidia purposely cripple Kepler,
or did they simply move on to Maxwell?

That was the answer I was looking for.

It's my opinion that if you bought a video card in 2012 and expected it to still be FULLY optimized in 2016, you would be delusional. In fact, I'm not a hardcore enthusiast, but I don't think I remember ever keeping a card for more than 2 or 3 years at the most.

I think the 20nm node skip played into this more than people think.
 
Last edited:

BlitzWulf

Member
Mar 3, 2016
165
73
101
Since that question really can't be answered without a heavy dose of opinion, I have a different question.

Do you believe AMD meant to have the 7970/280X/380X, or do you think they would have refined on the 20nm node (which never happened) and made drivers to optimize their 20nm architecture?

I don't think we should be trying to see Nvidia in a bad light because they moved to a better, more power-efficient GPU before AMD.
Do you?

How long should you FULLY support an older, outdated card? 5 years?
I think Kepler is, what, 4 years old? Notice I said FULLY support.

Are you implying that if we had seen a node change, AMD would have stopped optimizing for GCN 1.0? They have released two architecture updates since then, and even one to replace the 280X (the 380X is Tonga, GCN 1.2), yet the 280X continues to pull further ahead of every card it competed with to this day.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
It was the opinion of some that Nvidia was purposely crippling Kepler and that Maxwell cards would receive the same treatment.
I don't see crippling, do you?

Well, I never really supported that opinion. But I will also note that the titles in which Kepler trails the pack are almost all high profile. That has probably skewed the average benches on websites a bit, supporting those who voice the "Kepler lost out" argument more than it should.
 

Kris194

Member
Mar 16, 2016
112
0
0
Why would Nvidia cripple Kepler GPUs? They focused on Maxwell cards, their latest architecture. It's been like that for years.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
I believe the article I linked does separate by generation and shows Kepler struggling in the latest games. The test you linked is 8 months old.

[Benchmark chart: 780 Ti vs 290X vs Titan]

Let me re-word what I said so that my meaning is understood. They primarily tested older games and threw in a few new titles.

Their conclusions are based on looking at the older games, for the most part, and mentioning the newer games in passing.

What's holding Kepler back is explained here: http://forums.anandtech.com/showthread.php?t=2467773&page=4

Basically it has to do with compute utilization. Most newer games were designed for console GCN architectures. That means they were designed for 64-thread wide wavefronts as opposed to 32-thread wide warps.

Kepler has 4 warp schedulers sharing a pool of 192 CUDA cores per SMX. That's only 3 wavefronts' worth of lanes (3 × 64), so the 4th scheduler is left without work.

NVIDIA took note of this when they designed Maxwell. Maxwell has 4 groups of 32 CUDA cores, each with its own warp scheduler, for 128 CUDA cores per SMM. That maps directly onto two wavefronts broken into four warps: no idling schedulers.

Kepler is falling behind not because of drivers, but because of the console effect. You don't see either BabelTech or Hardware Canucks mentioning this.
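
To put rough numbers on that, here's a minimal sketch of the lane arithmetic (Python, purely illustrative: the 192-core SMX, 128-core SMM and 4-scheduler figures come from the post above, while the mapping rules are just my simplification of the claim, not how the real hardware schedulers work):

```python
# Toy sketch of the wavefront/warp arithmetic above -- it restates the
# post's claim, not how the hardware actually schedules work.

WAVEFRONT = 64   # GCN wavefront width that console-first engines are tuned for
WARP = 32        # NVIDIA warp width

def kepler_smx(cuda_cores=192, schedulers=4):
    """Claim: the SMX cores form one shared pool, so 64-thread wavefronts
    carve it into cuda_cores // 64 groups, each driven by one scheduler."""
    busy = min(schedulers, cuda_cores // WAVEFRONT)
    return busy, schedulers - busy          # (busy, idle)

def maxwell_smm(cuda_cores=128, schedulers=4):
    """Claim: the SMM is split into 32-core partitions, so each 64-thread
    wavefront breaks cleanly into two 32-wide warps, one per scheduler."""
    warps = (cuda_cores // WAVEFRONT) * (WAVEFRONT // WARP)
    busy = min(schedulers, warps)
    return busy, schedulers - busy          # (busy, idle)

print("Kepler SMX :", kepler_smx())    # -> (3, 1): one scheduler left idle
print("Maxwell SMM:", maxwell_smm())   # -> (4, 0): no idle schedulers
```

Real occupancy obviously depends on a lot more than this, but it shows why 32-core partitions line up neatly with 64-wide wavefronts while a shared 192-core pool leaves a scheduler short.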
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Are you implying that if we had seen a node change, AMD would have stopped optimizing for GCN 1.0? They have released two architecture updates since then, and even one to replace the 280X (the 380X is Tonga, GCN 1.2), yet the 280X continues to pull further ahead of every card it competed with to this day.

Yes, I am.
Do you think that if you optimize for a 7970 in a driver, it would also optimize the 280X and 380X? Yes, it would.
That's not a real architectural update; that's what I call a refinement.

If you optimize for Kepler, would the same optimizations work for Maxwell?
No.
If you optimize for Maxwell, do Kepler cards see increases? No.

Pretty simple to see and, I must add, a good way to sell cards and move technology forward.
 