against gimping kepler

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
http://www.bytemedev.com/the-gtx-780-ti-sli-end-of-life-driver-performance-analysis/

Conclusion (as of 7/3/2015):
The transition to the 35x.xx driver family has caused a certain amount of performance penalty for some Kepler users, for example a loss of 4.05% in GPU score in Fire Strike and 3.88% in Fire Strike Extreme. A loss, though, should be read in context: the loss in performance has existed since at least driver 347.25. I don’t think there is a huge conspiracy to “withhold” performance as people speculate, but there is enough data to merit a review by nVidia. I don’t expect major performance gains anymore, now that we are EOL, but we shouldn’t be going down in performance in a benchmark that even predates this video card.

The best driver for Kepler in terms of 3DMark performance that I have tested thus far is 347.25. If you must use a 35x.xx driver, I highly recommend 350.12 or 352.86. I have updated the charts and added the actual numbers.

Update:
The numbers since 340.52 show only a 3.96% loss of performance in the GPU score, or 2.09% in the overall Fire Strike score. The latest hotfix driver, 353.38, suggests that nVidia is taking the issues seriously; it managed to resolve not only the performance issues but also the vast majority of the TDR crashing that was occurring.

Anecdotal Comments:
I have greatly enjoyed my time with the GTX 780 Tis, even though they are not good overclockers. Out of all my nVidia experiences, they have been my favorite cards. nVidia driver stability has usually been good for me; that being said, 353.30 is the worst driver I have ever used from nVidia, and I have had Quad SLI and GTX 590s before.

I have been using the 780 Tis since November 2013. I did my testing in my configuration because this has been my experience with Kepler GPUs in SLI. To me there hasn’t been a mass reduction in performance; I have never felt this crushing, ominous crippling of performance that so many people are claiming is happening. I don’t doubt there are issues, issues which seem to vary based on each user’s configuration (including driver issues), but as far as I can tell no one has taken the time to prove many of these claims. I recommend that everyone, if they can, use the 347.25 driver; it is a solid performer and very stable. Maxwell also predates this driver, which refutes the whole “nerfing Kepler to make Maxwell shine” theory. Looking at the trend up to the 347.25 driver, performance has been consistent. I personally do not have the TDR bug in Windows 8.1; I did have it in Windows 10, but it does look like they are finally working on a fix.

The tests here are for a narrow premise and not all-encompassing, I realize that. Again, the point of this article is to encourage neutral, fact-based discussion but also to provide a reference of repeatable and reproducible tests. All my test results are under the user RagingCain at 3DMark Results, if you want/need that extra proof.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Drops in performance wouldn't be noticed subjectively. If the judgement comes from having fps displayed and monitored all the time, sure. But where these things have significance is in review benchmarks; a few fps of difference decides purchases.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
From the links I've seen that followed Fire Strike performance, which is really a pretty narrow topic, performance showed growth until recently; now it is pretty much up and down with every new driver, but it really is not falling. It's just hovering at this point, which is what you should expect once a card has been out long enough.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
If nvidia adds any general fix, there is no guarantee that it won't reduce some specific benchmark score, especially if the fix is to correct incorrect (but faster) behavior.

If I write a square root function that always returns 42 it will run faster than calculating the correct value.
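
To make that concrete, here is a trivial sketch in Python (the function names are made up for illustration and have nothing to do with any real driver code): skipping the real work is always "faster", it's just wrong.

```python
import math
import timeit

def fake_sqrt(x):
    # "Optimized": skips the calculation entirely and returns a wrong answer.
    return 42.0

def real_sqrt(x):
    # Does the actual work.
    return math.sqrt(x)

# The wrong-but-fast version always wins a raw speed comparison.
print(timeit.timeit("fake_sqrt(123456.0)", globals=globals(), number=1_000_000))
print(timeit.timeit("real_sqrt(123456.0)", globals=globals(), number=1_000_000))
```

A driver fix that replaces a shortcut like this with correct behavior shows up as a "regression" in any benchmark that only times the result.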
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
The problem is a suspected lack of optimisation in new games, which you can't really test. All you can do is compare against Maxwell and the GCN 1.0, 1.1 and 1.2 cards.

When The Witcher 3 launched there was a small shitstorm, and nvidia put some Kepler-specific fixes in the next driver.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Sadly, it wouldn't be surprising if Nvidia neglected Kepler optimizations in the new games, prioritizing Maxwell.

I do think it's probable, though, that Nvidia is encouraging developers to use effects that exploit Maxwell's strengths, even if it's something that'd hurt Kepler's performance.
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
6,283
5
81
I think Kepler is hitting the 3GB VRAM limit in most instances. If the 280X ~= GTX 780, then it is a safe bet that VRAM limits are coming into play with current releases.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I think Kepler is hitting the 3GB VRAM limit in most instances. If the 280X ~= GTX 780, then it is a safe bet that VRAM limits are coming into play with current releases.

Considering how many people should have 3GB VRAM cards on this forum right now, this should be PAINFULLY easy to test.

In which benchmarks do you think Kepler is hitting the 3GB VRAM limit?
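
For anyone who actually wants to test it, a minimal sketch (assuming nvidia-smi is installed with the driver and on the PATH; the sampling window and interval are arbitrary) that logs peak VRAM use while the game or benchmark runs in the foreground:

```python
import subprocess
import time

# Arbitrary sampling window and interval; run this in the background
# while the game/benchmark is running in the foreground.
DURATION_S = 300
INTERVAL_S = 1

peak_mib = 0
end = time.time() + DURATION_S
while time.time() < end:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; take the busiest one (covers SLI setups).
    used = max(int(line) for line in out.splitlines() if line.strip())
    peak_mib = max(peak_mib, used)
    time.sleep(INTERVAL_S)

print(f"Peak VRAM used: {peak_mib} MiB (a 3GB card tops out at 3072 MiB)")
```

Keep in mind that memory.used reports what is allocated, not what the game strictly needs, so a number near 3072 MiB only hints at a limit; stutter and collapsing minimum fps are the real tell.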
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
3DMark optimizations are pretty irrelevant in the end... it would be interesting to analyze actual games.

also, why not compare Fermi vs Kepler in 2012-2013 drivers?
or GCN vs VLIW4 during the same period?

yes the support will be prioritized for the newer stuff, and the new architecture is more efficient anyway.
 
Dec 30, 2004
12,554
2
76
Sadly, it wouldn't be surprising if Nvidia neglected Kepler optimizations in the new games, prioritizing Maxwell.

I do think it's probable, though, that Nvidia is encouraging developers to use effects that exploit Maxwell's strengths, even if it's something that'd hurt Kepler's performance.

from a business perspective, especially when you're ahead like this, it doesn't make sense to do anything but.

[anything but profit off the blind brand loyalty by abusing your position of power - doing things like supporting new titles and making sure they run correctly, but only including optimizations for the latest generation of cards.]
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Back into the circle we go. At least this article didn't spin it as "Gameworks kills my performance!" What is this, thread number 12? (Mods should just make a master thread at this point, ahah.)

Does anyone remember the optimizations the HD 6K got AFTER the HD 7K launched? Do they not remember how TressFX gave the 7K an above-average advantage? Or how Global Illumination almost doubled HD 7K performance over HD 6K?

Yeah, it seems when both these companies switch uarchs they sort of abandon the old one faster. But no one was starting threads over AMD basically leaving VLIW4/5 users to flounder. Did we already forget the "Never Settle" drivers that gave GCN users a nice performance increase while VLIW4/5 users got jack?

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/24.html

Our testing also confirms that these performance benefits are only for Radeon HD 7000 cards since the HD 6970 in our test group saw no improvements

I hope the Fury Nano is a huge success; perhaps then the forum can move on to new subjects.

EDIT: Just take a stroll down memory lane:
http://forums.anandtech.com/showthread.php?t=2278109
 
Last edited:

Atreidin

Senior member
Mar 31, 2011
464
27
86
If one company has a lot more customers than another, one might expect that to be reflected in forum posts, which would include people who are complaining.

I don't see a lot of complaints about S3 graphics, therefore you guys must all be S3 shills.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
If one company has a lot more customers than another, one might expect that to be reflected in forum posts, which would include people who are complaining.

Agreed, but now take a survey on this forum of the most vocal complainers and you'll find something interesting.

I don't see a lot of complaints about S3 graphics, therefore you guys must all be S3 shills.

MATROX FOR LIFE!!!!
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
The only game that struggles a bit would probably be TW3. Other than that, my GTX 780 does perfectly fine at 1440p. The driver updates have helped quite a bit.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Back into the circle we go. At least this article didn't spin it as "Gameworks kills my performance!" What is this, thread number 12? (Mods should just make a master thread at this point, ahah.)

Does anyone remember the optimizations the HD 6K got AFTER the HD 7K launched? Do they not remember how TressFX gave the 7K an above-average advantage? Or how Global Illumination almost doubled HD 7K performance over HD 6K?

Yeah, it seems when both these companies switch uarchs they sort of abandon the old one faster. But no one was starting threads over AMD basically leaving VLIW4/5 users to flounder. Did we already forget the "Never Settle" drivers that gave GCN users a nice performance increase while VLIW4/5 users got jack?

AFAIK those differences were down to compute. If there is an actual hardware reason then sure; if not, then people wonder. This article compares the same architecture across driver updates, so that's pretty much not relevant.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
MATROX FOR LIFE!!!!

Bahahhahaha! I remember that I bought a Matrox G500 or G200 once - was that a thing? Either way, I just remember being really, really let down. My memory isn't what it used to be, but I either came from or went to an ATI Fury Maxx. This was the mid-to-late 90s, so there were lots of things happening then.

EDIT: Yup, G200. LOL
http://www.anandtech.com/show/189/5
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
Bahahhahaha! I remember that I bought a Matrox G500 or G200 once - was that a thing? Either way, I just remember being really, really let down. My memory isn't what it used to be, but I either came from or went to an ATI Fury Maxx. This was the mid-to-late 90s, so there were lots of things happening then.

EDIT: Yup, G200. LOL
http://www.anandtech.com/show/189/5
Yeah, wtf happened back then - I bought a G400. Lol. I probably didn't game, and apparently didn't read reviews at that time either, I suppose, looking at the crappy numbers today.
Perhaps it was all beer, girls and hifi. But who knows for sure...
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
3D gfx drivers at that time, though, prove some things do get better. Lol.
- only the Windows 95 beta was worse
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
There hasn't been a performance "loss"; they simply haven't optimized for newer games. I noticed this trend starting with Far Cry 4. It was around that time that three-year-old AMD GPUs (280X) slowly started catching up with the 780 in newer games, and the 780 then even got matched or passed by the 960 in some games. I've never seen another GPU series fall off like that.
 
Last edited:
Oct 27, 2012
114
0
0
I agree with hawtdawg. While I don't believe nvidia purposefully took Kepler performance away, it's still weird to see a 680 slower than a 960 in some games, and that is why it's sparking this conversation.