Little bit disappointed with the 980 Ti performance


TestKing123

Senior member
Sep 9, 2007
204
15
81
I went from dual overclocked 670s to a single 980 Ti as well. While I agree the benchmarks are outstanding, I'm more into playing the games than looking at numbers, and as it stands, I'm not able to upgrade many of the settings from where they were on my SLI 670s. Last Light I'm running at the exact same settings, though I am maintaining a constant 60FPS even in the outside areas and tunnel firefights. I am happy with my purchase (especially since I was able to get this card at a laughably cheap price), but I'm just not wowed by the performance like people were reporting.

Then again, I did get some pretty good overclocks on my 670s, so I was able to run most games at max or near-max settings at 1440p.

I'm going to try overclocking my card today and see if that doesn't give me some better performance.

I'd say your CPU and 1600MHz memory are slowing you down a bit.

I run a Haswell i7 with 16GB of 2400MHz memory, and Dying Light, Far Cry 4, and The Witcher 3 are all pegged at 60fps on my 980 Ti.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
I'm running 1440p and G-Sync. With a 29% overclock I can hit 120+fps in BF4 on Ultra with 4xMSAA.

Because of G-Sync I no longer care about sub-60fps dips.

I sold my 780 SLI setup, which performed about the same as my 980 Ti but was plagued with SLI issues.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Uber Sampling was more or less full-scene anti-aliasing: a very inefficient form of AA that renders the scene multiple times per frame. The traditional name for it is SSAA, or sometimes FSAA. I'm not sure if UberSampling was a 4xSSAA mode or some hybrid, but it's a very intensive setting. Imagine taking a 1440p image, rendering it at 2-4x the pixel count, and downsampling it to your monitor's native resolution. It's no wonder it stresses even a modern GPU.
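
To put rough numbers on that, here's a quick sketch. UberSampling's real sample pattern isn't documented, so this assumes a plain ordered-grid supersample at 1440p:

```python
# Back-of-the-envelope SSAA cost: pixels shaded per frame.
# Assumes an ordered-grid supersample (render at N x N samples per
# pixel, then downsample); UberSampling's exact pattern is a guess.

NATIVE = (2560, 1440)  # 1440p

def ssaa_pixels(width, height, per_axis_factor):
    """Pixels shaded per frame when supersampling by `per_axis_factor` per axis."""
    return (width * per_axis_factor) * (height * per_axis_factor)

native_px = NATIVE[0] * NATIVE[1]
for factor in (1, 2):  # 2x per axis == 4 samples per pixel ("4xSSAA")
    px = ssaa_pixels(*NATIVE, factor)
    print(f"{factor}x per axis: {px:>12,} px ({px / native_px:.0f}x native)")
```

So "4xSSAA" at 1440p means shading roughly 14.7 million pixels per frame instead of 3.7 million, which is why the setting is so brutal.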

Some people actually find that it blurs things in TW3 while making other things sharper.

Alternatively, some players have found that you can set the 'sharpening filter' to 0 (basically off) in the .ini config file, under the 'Rendering' section. This makes the game blurrier, but some liked it better and would run with Uber Sampling off to get above 60fps. (The .ini config file is found in My Documents > Witcher 2.)
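
As a sketch of that tweak in script form; the file name and key spelling here are guesses based on the description above, so check your own Config folder before trusting either:

```python
import re
from pathlib import Path

# Hypothetical file name; the post only says the .ini lives under
# My Documents > Witcher 2 in a 'Config' location.
cfg_path = Path.home() / "Documents" / "Witcher 2" / "Config" / "User.ini"

text = cfg_path.read_text()
# Set whatever value is there to 0 (off). The key name is a guess based
# on the post; look at the real file for the exact spelling first.
text = re.sub(r"(?im)^(\s*SharpeningFilter\s*=\s*)\S+", r"\g<1>0", text)
cfg_path.write_text(text)
```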

There are other alternatives, such as using SweetFX, as some gamers found Uber Sampling made the colours dull in TW2:
http://www.nexusmods.com/witcher2/mods/630/?

Someone like BFG would be able to correct my explanation and add a lot more detail.

You are right that game developers should provide a better explanation with a small preview/picture comparisons to aid in setting selection.

Why not just use VSR then? I found VSR at a perfect multiple of my screen resolution was amazing for BioShock 2. It made my 720p HDTV go from meh to great to play on. I want to see the same with 1080p/4K on my projector with VSR, but AMD is holding me back! It's either get a Fury (which I'm NOT doing; the performance isn't worth the price) or be stuck at 1800p with the R9 290. I just tried to play The Witcher 2, gorgeous game, but the FOV is killing me badly. It's so hard to play with such a bad FOV on my projector.
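
The "perfect multiple" part is just divisibility. A minimal sketch of what makes a VSR resolution downsample cleanly (the function is mine, not an AMD API):

```python
def is_perfect_multiple(render, native):
    """True when each native pixel maps to an exact k x k block of rendered pixels."""
    rw, rh = render
    nw, nh = native
    return rw % nw == 0 and rh % nh == 0 and rw // nw == rh // nh

print(is_perfect_multiple((2560, 1440), (1280, 720)))   # True: clean 2x2 average per pixel
print(is_perfect_multiple((3200, 1800), (1920, 1080)))  # False: 1800p doesn't divide into 1080p
```

That second case is exactly the 1800p-on-a-1080p-panel situation: the downsample can't map whole blocks of rendered pixels to each output pixel, so it looks softer.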

I should probably pay more attention to my OC and get more out of my HD 7950. I need to read up and see what I should expect. I just set the Overdrive numbers to an okay OC and never bothered to see if I could go higher. I probably can now that my Fractal Design case is cleared of my 8 HDDs lol.

Edit: Could someone tweet at AMD or something and ask whether FreeSync is monitor-only or can work on TVs too?
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I'd say your CPU and 1600MHz memory are slowing you down a bit.

I run a Haswell i7 with 16GB of 2400MHz memory, and Dying Light, Far Cry 4, and The Witcher 3 are all pegged at 60fps on my 980 Ti.

That's false. His CPU is fine. No way is his overclocked CPU a limiting factor.
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
Witcher 2, Mordor, and Metro all have SSAA. Turning it on is like running at 2x or 4x your resolution.

Just played Metro all maxed at 1080p with 2x SSAA and a solid 60 fps on my 980 Ti ACX2.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Agree with RS that there has been stagnation since Crysis 3 and Ryse. I played Ryse start to finish, despite it not being a ton of fun, because it was so technically impressive. Hopefully Crytek gets through their financial issues, since they seem to be the ones we have to count on for serious visual improvements.

I was also holding out hope that the former engine dev from Crytek who took over Carmack's spot at id would bear some fruit in the upcoming Doom 4 and Bethesda games...
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I went from dual overclocked 670s to a single 980 Ti as well. While I agree the benchmarks are outstanding, I'm more into playing the games than looking at numbers, and as it stands, I'm not able to upgrade many of the settings from where they were on my SLI 670s. Last Light I'm running at the exact same settings, though I am maintaining a constant 60FPS even in the outside areas and tunnel firefights. I am happy with my purchase (especially since I was able to get this card at a laughably cheap price), but I'm just not wowed by the performance like people were reporting.

Then again, I did get some pretty good overclocks on my 670s, so I was able to run most games at max or near-max settings at 1440p.

I'm going to try overclocking my card today and see if that doesn't give me some better performance.

I didn't go to a single 980 Ti. I got two of them with EK water blocks @ 1400+ MHz. Trust me when I say I am not disappointed. They are stupid fast and I play games with them @ 144Hz. I don't stare at benchmarks all day.
Is your card OC'd? These things OC like crazy. It's enough to actually make it worth it. I don't normally bother OCing my GPUs, but these demand an OC. I don't use the OC for games like BF4 though. It's already pegged at 144Hz most of the time anyway.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,948
1,256
126
Part of the problem is feature creep though, isn't it? They could make a single GPU 3x more powerful than a 980 Ti, and within a couple of years developers would find a way to bog it down with "features" that aren't really even noticeable.
 
Feb 19, 2009
10,457
10
76
Part of the problem is feature creep though, isn't it? They could make a single GPU 3x more powerful than a 980 Ti, and within a couple of years developers would find a way to bog it down with "features" that aren't really even noticeable.

Indeed.

Plus I tend to notice that games which push visuals as a selling point end up with really crap gameplay.

With the next node looking to be even more expensive, I think I'll just console-peasant it if GPU prices keep soaring while visual gains taper off.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Part of the problem is feature creep though, isn't it? They could make a single GPU 3x more powerful than a 980 Ti, and within a couple of years developers would find a way to bog it down with "features" that aren't really even noticeable.

I've wondered about this myself. If I think about the games I played on my 8800 GTs, how good they looked and how fun they were, today's games aren't improved enough to justify or explain the extreme performance gap between the hardware.
Two 980 Tis vs two 8800 GTs... the difference is extreme, to put it lightly. Today's games SOMEHOW manage to use that power, but I get this tinfoil-hat idea that they're using the power for the sake of driving GPU sales. A 980 Ti is like a million times as powerful as an 8800 GT, but games aren't a million times better looking.
I'd say for every 100% increase in GPU power, visual fidelity in games increases 5%.
 

Stg-Flame

Diamond Member
Mar 10, 2007
3,648
588
126
Well, I don't have a water block or any type of cooling other than really good airflow (and my central air is set pretty low all day long, despite living in the desert), and I was a bit hesitant to go crazy overclocking this card.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487139

That's the card I have, so I was unsure what I should be shooting for without decent cooling, especially after reading so many reviews about how hot this thing gets under full load.
 
Feb 19, 2009
10,457
10
76
A 980 Ti is like a million times as powerful as an 8800 GT, but games aren't a million times better looking.
I'd say for every 100% increase in GPU power, visual fidelity in games increases 5%.

Diminishing returns. Next use of GPU processing increase will be VR, then much later, holodecks. ;)
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Developers are not stuffing games with useless "features" for the lulz. The average GPU on the Anandtech forums =/= the average GPU for most PC gamers.

Look at the top 10 dGPUs in Steam's latest survey. The most powerful card is the 970. Beyond that, you've got dGPUs like the 660, 750 Ti, etc. Do you not think developers are aware of that?

Also, folks have to understand that the most rapid progress always happens from a low base. In the year 2000, graphics were just shades of beige.
Further, CPU progress was much faster then as well; that's something a lot of people forget. A lot of features in games are CPU-bound.

If the same pace of Moore's law had held from 2010 to 2017 (7 years, because I count Crysis 1 as the biggest milestone of the first decade in terms of pure GPU performance) for both GPUs and CPUs, we would still have seen a smaller direct impact on graphical fidelity, because we are at a higher base. But it would nevertheless be faster than what we are seeing now.

Lastly, compare a game like Ethan Carter, which runs surprisingly well on even mid-range GPUs, to most mainstream games from 2010. Of course there's been progress, and plenty of it. But a lot of devs just want to brute-force everything. Alien: Isolation and MGS: GZ both proved that you can get very good effects on modest hardware if you spend time optimising the game, something which for some reason seems to be treated as a luxury these days.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Developers are not stuffing games with useless "features" for the lulz. The average GPU on the Anandtech forums =/= the average GPU for most PC gamers.

Look at the top 10 dGPUs in Steam's latest survey. The most powerful card is the 970. Beyond that, you've got dGPUs like the 660, 750 Ti, etc. Do you not think developers are aware of that?

Also, folks have to understand that the most rapid progress always happens from a low base. In the year 2000, graphics were just shades of beige.
Further, CPU progress was much faster then as well; that's something a lot of people forget. A lot of features in games are CPU-bound.

If the same pace of Moore's law had held from 2010 to 2017 (7 years, because I count Crysis 1 as the biggest milestone of the first decade in terms of pure GPU performance) for both GPUs and CPUs, we would still have seen a smaller direct impact on graphical fidelity, because we are at a higher base. But it would nevertheless be faster than what we are seeing now.

Lastly, compare a game like Ethan Carter, which runs surprisingly well on even mid-range GPUs, to most mainstream games from 2010. Of course there's been progress, and plenty of it. But a lot of devs just want to brute-force everything. Alien: Isolation and MGS: GZ both proved that you can get very good effects on modest hardware if you spend time optimising the game, something which for some reason seems to be treated as a luxury these days.

Simply put: time is money, friend.

Just look at Batman: Arkham Knight. The game got pawned off to a small studio already working on a few other projects to be ported, while Rocksteady polished off the console versions.
 
Feb 19, 2009
10,457
10
76
Lastly, compare a game like Ethan Carter, which runs surprisingly well on even mid-range GPUs, to most mainstream games from 2010. Of course there's been progress, and plenty of it. But a lot of devs just want to brute-force everything. Alien: Isolation and MGS: GZ both proved that you can get very good effects on modest hardware if you spend time optimising the game, something which for some reason seems to be treated as a luxury these days.

Yeah, but if you make a very optimized game that looks awesome with low minimum requirements, it won't be used in GPU benches cos it's not GPU-killing enough... ;) People see 100-200 fps on modest hardware and the plebs be thinking "the graphics can't be good!"

I fully agree with you though; Vanishing of EC & AI (I'm too scared to finish it, heh) are drop-dead gorgeous games that run extremely well.
 

NTMBK

Lifer
Nov 14, 2011
10,402
5,639
136
OP, what sort of hard drive/SSD are you using? Your system could be bogging down loading assets into the GPU.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Specs in signature.

From the hype around the card, I was expecting to be able to at least max a few of the newer games (Metro Last Light, Shadow of Mordor, etc.) at 1440p, but it seems some settings still kill the FPS. I'm aware it could be poor optimization, but I even turned all the settings to max on The Witcher 2: Assassins of Kings, and while it plays at around a constant 50FPS, it does dip down into the 40s every so often. This is somewhat disheartening, but maybe I just had exaggerated expectations for this card replacing my SLI 670s.

I was planning on going SLI 980s down the line after the price had dropped a bit, so maybe once I get a second card in my system I will see the drastic improvements I was expecting.
Just a thought: ever consider bumping your system RAM to 16GB? I think on some of these new games it might help.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
1080p is still safest for maxing everything. You're lucky it wasn't 4K; then you'd really be sad. 1440p is about 1.8x the pixels of 1080p if I remember right. That revelation is pretty much why I am planning to avoid it for now.

Buying a high-end card just to turn down settings because of pixel count doesn't seem right.
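
For the record, the pixel math is plain multiplication; a quick sketch:

```python
# Pixels per frame at each resolution.
px_1080p = 1920 * 1080   # 2,073,600
px_1440p = 2560 * 1440   # 3,686,400
px_4k    = 3840 * 2160   # 8,294,400

print(px_1440p / px_1080p)  # ~1.78x 1080p
print(px_4k / px_1080p)     # 4.0x 1080p
```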


Nice.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Yeah, but if you make a very optimized game that looks awesome with low minimum requirements, it won't be used in GPU benches cos it's not GPU-killing enough... ;) People see 100-200 fps on modest hardware and the plebs be thinking "the graphics can't be good!"

I fully agree with you though; Vanishing of EC & AI (I'm too scared to finish it, heh) are drop-dead gorgeous games that run extremely well.

There is nothing stopping devs from polishing a game with extremely high-end settings that still push high-end GPUs.

Of course, most people around the forums think the FPS you get is a direct reflection of how well optimized the game is.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
This is why I think G-Sync/FreeSync is so important. Brute forcing (dual GPUs, etc.) is a wasteful, inefficient, and expensive way to game at the highest details unless you really want 4K.

I just upgraded from a 27.5" TN 60Hz 1200p monitor to a 27" 1440p IPS monitor with FreeSync. I used to have dual 290s, but I sold one off and went with just one 290X (flashed 290). With the 290X @ 1440p I have no issues playing games like TW3 basically maxed out, thanks to FreeSync. This has also allowed me to cram everything into an ITX case that's fairly quiet and uses a modest amount of power. Win, win.

If I want higher frame rates for competitive gaming, I just turn the graphics down anyway. No competitive gamer plays with high or extreme detail settings; the graphics just get in the way. You don't need dual graphics cards for this.

OP, just get a G-Sync monitor and crank up the details.

I mostly agree with RS that we are being held back by consoles, but this has always been the case, since the market is bigger there. I disagree about the grass in TW3 looking worse than in Crysis, though. TW3 is one of the best looking games I've ever seen, and on PC it's clearly better looking than the PS4 or Xbox One version.

Looks are subjective though :)
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I've wondered about this myself. If I think about the games I played on my 8800 GTs, how good they looked and how fun they were, today's games aren't improved enough to justify or explain the extreme performance gap between the hardware.
Two 980 Tis vs two 8800 GTs... the difference is extreme, to put it lightly. Today's games SOMEHOW manage to use that power, but I get this tinfoil-hat idea that they're using the power for the sake of driving GPU sales. A 980 Ti is like a million times as powerful as an 8800 GT, but games aren't a million times better looking.
I'd say for every 100% increase in GPU power, visual fidelity in games increases 5%.

I expect DX12 to give us an actually noticeable increase in fidelity, because game devs will finally be able to increase static mesh and object counts dramatically. Having more "stuff" in game worlds will do wonders for immersion, IMO. Due to the CPU overhead of DX11 and the weakness of the old consoles, object count stagnated for a decade. New consoles plus a new low-level API on PC = more things in game for XBONE/PS4/PC DX12-only titles. DX11 won't be out of the mainstream for a while though, so we won't see this fully fledged for a while yet :/. I'm sure we'll see increasing numbers of object density sliders, with the higher tiers only on DX12.

Also, I expect VR to be exciting and have some really impressive visuals too.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Well, I don't have a water block or any type of cooling other than really good airflow (and my central air is set pretty low all day long, despite living in the desert), and I was a bit hesitant to go crazy overclocking this card.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487139

That's the card I have, so I was unsure what I should be shooting for without decent cooling, especially after reading so many reviews about how hot this thing gets under full load.

When I went from a 970 to a 980, I was surprised at first by how little the difference felt. The 15-20% you see in the graphs is really hard to feel and notice in real life.

The 670s overclocked would be at least 80% as fast as a single 980 Ti on average, as a rough guess. There are games that favor Maxwell where the gap will be bigger, but still, that is a minor bump in speed. AMD gained at least that much with driver improvements on GCN. So it's just not something that will feel huge; no wow factor. The numbers aren't there, sorry.

Overclocking my 980 helped me; it made my sidegrade a little more like a step forward. Overclocking your 980 Ti will help you as well. If you are ~20% faster now, overclocking can get you closer to 40% faster, which is noticeable.

My suggestion is to slide your power/temp target up to the max (125%), leave the fan on auto, and try +200 on the core.
If that doesn't work out, back down some.

I believe 200 is a good starting point. I don't have a Ti, but on my reference 980 I can do this without noticing any extra noise or any issues. I inched mine up to a 1450MHz boost and run there with no issues. I have run Heaven and a few games at 1500MHz with no problems, but since I have my PC on all the time, with the GPU OC'd all the time, and my son using it way more than me... I just leave it at 1450MHz. Set it and forget it.
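
Sketched as pseudocode, that trial-and-error approach is just a downward walk from an optimistic starting offset. The stability check below is a hypothetical stand-in for applying the offset in Afterburner/Precision and looping Heaven or a game yourself; there's no real one-call API for it:

```python
def run_stress_test(core_offset_mhz: int) -> bool:
    """Hypothetical stand-in: apply the offset, run Heaven or a demanding
    game, and report True if no crashes or artifacts showed up."""
    raise NotImplementedError("apply the offset and test by hand")

def find_stable_offset(start_mhz: int = 200, step_mhz: int = 25) -> int:
    """Walk the core clock offset down from `start_mhz` until a run passes."""
    offset = start_mhz
    while offset > 0:
        if run_stress_test(offset):
            return offset
        offset -= step_mhz  # "back down some", as above
    return 0
```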
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I compared my dual 670s to a single 980 Ti. The 670s had lower mins, had hitching issues, and felt sluggish and crappy. The 980 Ti had higher mins and no hitching, so despite a similar average FPS the 980 Ti was much better.
A single GPU is always better, but if you want performance that doesn't exist yet, you have to buy two cards.
Also, regarding competitive players and game detail settings: I don't know what you guys mean by competitive. Maybe a paid professional in the Korean Olympics? If that's the case, then who cares what that one guy does. I enjoy my competitive games with full details.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Also, regarding competitive players and game detail settings: I don't know what you guys mean by competitive. Maybe a paid professional in the Korean Olympics? If that's the case, then who cares what that one guy does. I enjoy my competitive games with full details.

I should clarify: it's really game-dependent. In some games, cranked details hinder the ability to play competitively. For example, Battlefield 4 with details set to low is much easier to play online and get a decent score with than with high or ultra settings, and not just because you get a better frame rate (although that plays a big part of it). Visual distractions make it harder to spot enemies, etc. I find this true of many modern games; Call of Duty is another game where I have to turn down the graphics to compete. Eye candy is great for single-player or MMO games, but it can really hinder your ability to climb the leaderboard in some games.