GTX 980Ti finally launched - MSRP $649 - Reviews


Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
You can easily imagine a case where a gamer could have bought HD7950 for $280 and then an R9 290 for $375 in the last 3.5 years. That's way better than buying a $550 HD7970 and holding on to it. If we consider resale of 7950, it makes investing into the HD7970 even worse.

That is literally exactly what I did. I had a 7950 I got on sale at the time, then sold it for $400 during the mining craze and was able to get a used 290 for that same price. Much better off than wasting $550 on a launch-day 7970 and holding onto it.
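As a sanity check on the upgrade-path math above, here's a toy net-cost calculation using the prices quoted in the posts (the $400 resale is the figure from the post; this is just arithmetic, not market data):

```python
# Toy cost comparison of the two upgrade paths discussed above.
# Prices are the figures quoted in the thread.

def total_cost(purchases, resales=()):
    """Net spend: sum of purchase prices minus sum of resale proceeds."""
    return sum(purchases) - sum(resales)

# Path A: HD 7950 at $280, later sold for $400, then a used R9 290 at $375.
path_a = total_cost(purchases=[280, 375], resales=[400])

# Path B: launch-day HD 7970 at $550, kept the whole time.
path_b = total_cost(purchases=[550])

print(path_a, path_b)  # 255 550
```

Even before counting resale of the 290 itself, the staggered path nets out to roughly $255 versus $550, which is the point being made.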
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Lol... the 290x was destroyed for its power consumption, which is only 20 watts more than the 980 Ti, which is now a cool/quiet card?

goalposts...

Because the 980Ti is about 50% faster, uses less power, and makes about the same noise?

LOL. I cannot believe I am even seriously answering this question. :p
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I've never had one of the fancy metal coolers, so I am looking forward to it. They are being delivered today. I will fondle them and smell them. You know you like that new video card smell too, so don't look at me funny.
Also, it seems that Maxwell will live and die on 28nm? I didn't expect that a while back. I thought it was going to be the main feature of the next shrink. Seems like a lot of R&D for just a short-lived stopgap product. I bet with a shrink, some major gains could be had with this architecture.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
So you are betting that 4GB of VRAM will become a bottleneck in 2.5 years or less at 1080p-1440p?

No official info is known about Fiji XT because it hasn't been officially confirmed/announced. You still went ahead and bought 980Ti SLI despite trying to come off as objective/impartial for months. Yeah, right. No one objective does that unless they are specifically tied to a G-Sync monitor, need some CUDA-specific software or NVIDIA 3D Vision gaming, or have insider info that Fiji is worse for 100% fact. That's the whole point of the discussion that you seem to be evading at all costs. You never intended to buy Fiji even if it had 8 or 16GB of HBM, because you didn't even bother waiting for reviews, its price, or a confirmation of its HBM capacity.

I guarantee that if Fiji beats the 980Ti in performance at 1440p and 4K, you'll have some excuses lined up about how in 2018 it might run into a VRAM bottleneck. If a 980Ti OC beats a Fiji XT OC and they cost similarly, of course the 980Ti is a better choice due to its 6GB of VRAM. But we won't know any of that until Fiji Pro and Fiji XT launch. You didn't wait, which means you never even considered buying it; that's the point.

I'm not "betting" anything; as has been explained to you (are we up to 6 times now?), it's simply not a chance I'm willing to take. If you listened even half as much as you talked, you wouldn't need things repeated so frequently.

Lots of interesting products are coming within the next year or less. The two I'm interested in are DX12 and the Oculus Rift, and I'm not going to chance being "stuck" with a 4GB card. I'm going to absolve myself of that worry.

Fiji might very well beat the 980Ti at 4K today, but I'm more concerned with performance tanking if 4GB isn't enough than I am with having a few FPS more (or less) in situations they can both handle. Again, if you did half as much listening as talking, this would already be ingrained into your mind.

That said, I actually canceled my Amazon order because I realized it was an impulse buy, and I like some of these aftermarket coolers and the factory OCs they're achieving. The most demanding game I was playing was FC4, which I'm done with. I have AC: Unity, which could certainly take advantage, but I'm going to play through Uncharted 3 (PS3) and Arkham Origins (which I have no problem maxing out) first, so I can afford to wait a little while and get a more complete lay of the land.

If the 390X is 4GB, it's still not a contender for my next purchase. I get that you're deeply hurt by this, and it will preclude us from being friends, but you'll just need to get over it.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I bet the AMD card doesn't even have 4GB. I bet it comes with 8GB. I just don't think they would release a flagship with 4GB. They just wouldn't do it, IMO. That's a no-go.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I've never had one of the fancy metal coolers, so I am looking forward to it. They are being delivered today. I will fondle them and smell them. You know you like that new video card smell too, so don't look at me funny.
Also, it seems that Maxwell will live and die on 28nm? I didn't expect that a while back. I thought it was going to be the main feature of the next shrink. Seems like a lot of R&D for just a short-lived stopgap product. I bet with a shrink, some major gains could be had with this architecture.

I'm wondering if I should call the Video Card Protective Services. This is quite disturbing...

In all seriousness, you should start doing some benches on your 670s and post a comparison when you install your 980Tis.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
I have a question for all of you.

If I buy the GTX 980Ti, assuming the Fury is not as good as it was meant to be, does that mean the GTX 980Ti is the ultimate 1440p GPU that can last me, say, 5 years down the line? Meaning, will I be able to play all games from here through the next five years, including DX12 titles, at ultra settings with 60fps or more at 1440p?
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I have a question for all of you.

If I buy the GTX 980Ti, assuming the Fury is not as good as it was meant to be, does that mean the GTX 980Ti is the ultimate 1440p GPU that can last me, say, 5 years down the line? Meaning, will I be able to play all games from here through the next five years, including DX12 titles, at ultra settings with 60fps or more at 1440p?

Of course not.

No one could say that.

How could anyone have any idea?
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I'm not "betting" anything; as has been explained to you (are we up to 6 times now?), it's simply not a chance I'm willing to take.

Taking the risk of a wrong purchase just to avoid waiting a few days, maybe a week, is a fool's bet. One really wrong bet would take a huge number of generations to make good on. It's a good call cancelling the cards.
 

flopper

Senior member
Dec 16, 2005
739
19
76
I have a question for all of you.

If I buy the GTX 980Ti, assuming the Fury is not as good as it was meant to be, does that mean the GTX 980Ti is the ultimate 1440p GPU that can last me, say, 5 years down the line? Meaning, will I be able to play all games from here through the next five years, including DX12 titles, at ultra settings with 60fps or more at 1440p?

No.
The Witcher 3 shows the Titan X can't sustain 60fps at 1080p at max settings, so being "able to play all games" at those resolutions and settings isn't going to happen.
DX12 changes things, so who knows what NVIDIA can do there; I'd rather trust AMD, who made the baseline code for it.

You end up having to either lower settings or run SLI/Crossfire.
No single card will cut it for at least 2 more generations, if you ask me.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I'm wondering if I should call the Video Card Protective Services. This is quite disturbing...

In all seriousness, you should start doing some benches on your 670s and post a comparison when you install your 980Tis.

When I get home tonight I'll do some benches and post the difference for anyone who wants to see my real-world example. It might be useful for people still using Sandy Bridge and considering a GPU upgrade, or what have you.

I have a question for all of you.

If I buy the GTX 980Ti, assuming the Fury is not as good as it was meant to be, does that mean the GTX 980Ti is the ultimate 1440p GPU that can last me, say, 5 years down the line? Meaning, will I be able to play all games from here through the next five years, including DX12 titles, at ultra settings with 60fps or more at 1440p?

Sorry, but no chance of that happening. Absolutely zero chance. In 5 years it will be a hot, slow, power hungry and stuttering mess of a GPU. It can't even play all games out now maxed at 1440p @ 60fps.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
How about this. Can the GTX 980ti handle DX12 games?

Considering it's a DX12.1 GPU, yes. (Note: all the GPUs in the chart below can actually run DX12 games.)

[attached image: DX12 feature-support comparison chart]
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
How about this. Can the GTX 980ti handle DX12 games?

All current cards can handle DX12 games.

There are some feature set differences, but all current NV and AMD cards are DX12 API compliant.

DX12 should make it easier on the graphics cards, iirc.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
All current cards can handle DX12 games.

There are some feature set differences, but all current NV and AMD cards are DX12 API compliant.

DX12 should make it easier on the graphics cards, iirc.

DX12 reduces CPU overhead, not GPU (everything else being equal), and one of its advantages is that it allows more unique objects on screen at any given time. If anything, it will push GPUs harder, not less.
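A toy model of that CPU-overhead point: with a fixed CPU frame budget, cutting the per-draw-call submission cost multiplies how many unique objects you can submit, which is exactly why the GPU ends up with more work. The per-call microsecond costs below are invented round numbers for illustration, not measured DX11/DX12 figures:

```python
# Toy model: how many draw calls fit in one frame's CPU budget given a
# fixed per-call submission cost. The per-call costs are invented round
# numbers for illustration only, not measured API overheads.

FRAME_BUDGET_US = 16_700  # ~60 fps frame budget, in microseconds

def max_draw_calls(per_call_cost_us):
    """Draw calls that fit in one frame's CPU budget."""
    return FRAME_BUDGET_US // per_call_cost_us

high_overhead = max_draw_calls(per_call_cost_us=20)  # "DX11-like" assumption
low_overhead = max_draw_calls(per_call_cost_us=2)    # "DX12-like" assumption

print(high_overhead, low_overhead)  # 835 8350
```

Ten times cheaper submission means ten times the draw calls in the same frame, and every one of those extra objects still has to be rendered by the GPU.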
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
When I get home tonight I'll do some benches and post the difference for anyone who wants to see my real-world example. It might be useful for people still using Sandy Bridge and considering a GPU upgrade, or what have you.



Sorry, but no chance of that happening. Absolutely zero chance. In 5 years it will be a hot, slow, power hungry and stuttering mess of a GPU. It can't even play all games out now maxed at 1440p @ 60fps.

Agree.

I can really only think of 2 GPUs in the last 10-13 years that fit this bill, and even that was a stretch. The 9700 Pro and the 8800GTX, if you bought right away, were very viable for a LONG time.

These are rare occurrences though...
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
When I get home tonight I'll do some benches and post the difference for anyone who wants to see my real-world example. It might be useful for people still using Sandy Bridge and considering a GPU upgrade, or what have you.



Sorry, but no chance of that happening. Absolutely zero chance. In 5 years it will be a hot, slow, power hungry and stuttering mess of a GPU. It can't even play all games out now maxed at 1440p @ 60fps.

Wait... what is going on here? 7 years ago, when I bought my HD 5850, I could max all games way past 60fps. I was getting 90 to 120fps on max settings in every single game.

You're telling me that the #1 GPU right now can't even handle ultra settings at a minimum of 60fps at 1440p for all games? I did read and look over the benchmarks, and I'm pretty disappointed in this generation of GPUs. I know I'm comparing 1440p to 1080p, but this is 7 years apart, so 1440p in my opinion is the new standard.

I'm not going to delay my upgrade, because by the time Pascal releases along with better-priced 4K monitors, my financial situation will be so much better it won't matter. But still, paying $650 for a GPU that can't even max everything out is disappointing. I used to think gaming at 60fps was the bare minimum.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,083
136
With a process leap to 14nm or 16nm (or whatever the next node for GPUs turns out to be) coming in the next year or two, I don't think buying for "five years" is a good idea.

You'd be better off spending $200-350 now, if that runs everything you want at good FPS, and spending another $300+ a few years down the road when you can get 100% more performance.

Unless you don't mind the cost of buying flagship level cards.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Wait... what is going on here? 7 years ago, when I bought my HD 5850, I could max all games way past 60fps. I was getting 90 to 120fps on max settings in every single game.

You're telling me that the #1 GPU right now can't even handle ultra settings at a minimum of 60fps at 1440p for all games? I did read and look over the benchmarks, and I'm pretty disappointed in this generation of GPUs. I know I'm comparing 1440p to 1080p, but this is 7 years apart, so 1440p in my opinion is the new standard.

I'm not going to delay my upgrade, because by the time Pascal releases along with better-priced 4K monitors, my financial situation will be so much better it won't matter. But still, paying $650 for a GPU that can't even max everything out is disappointing. I used to think gaming at 60fps was the bare minimum.

This is what happens when resolutions grow faster than performance.

I think it's a good thing, though. Without 1440p and 4K displays becoming more common, there isn't much else to drive GPU hardware improvements.

We collectively spent 10 years at approximately the same resolution; it's overdue that we start to make higher-res options the norm.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
In the long run it will be smarter to spend $250 on a 290 now and $400 on a 14/16nm GPU when that comes out, vs. $650 now, if you've only got $650 total to spend.
 

Samwell

Senior member
May 10, 2015
225
47
101
Wait... what is going on here? 7 years ago, when I bought my HD 5850, I could max all games way past 60fps. I was getting 90 to 120fps on max settings in every single game.

You had to be using a lower resolution than Full HD, or you didn't play GPU-stressing games, because games like Crysis or Stalker totally killed the 5850 and pushed it under 30fps.
 

biostud

Lifer
Feb 27, 2003
19,845
6,932
136
In the long run it will be smarter to spend $250 on a 290 now and $400 on a 14/16nm GPU when that comes out, vs. $650 now, if you've only got $650 total to spend.

Except that you don't get to play with GTX 980 Ti levels of GPU power for the next 1.5 years.
 

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
You can easily imagine a case where a gamer could have bought HD7950 for $280 and then an R9 290 for $375 in the last 3.5 years. That's way better than buying a $550 HD7970 and holding on to it. If we consider resale of 7950, it makes investing into the HD7970 even worse. You know how long it took before HD7950 was $280? By summer 2012.

A gamer can save even more if they scoop up top tier cards right around the time the next gen is being released. Your example is almost precisely what I do except I paid $180 for a 7950 in early 2013 and $280 for a 780 late 2014. Resale values hold up better too, depreciation seems to hit first gen cards the hardest once the next gen is released. Less so once a card is already a couple generations old. Added bonus: You can cherry pick cards that are highly rated and proven reliable.

Keep up the good work, RussianSensation; your analysis and input are outstanding.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
A gamer can save even more if they scoop up top tier cards right around the time the next gen is being released. Your example is almost precisely what I do except I paid $180 for a 7950 in early 2013 and $280 for a 780 late 2014. Resale values hold up better too, depreciation seems to hit first gen cards the hardest once the next gen is released. Less so once a card is already a couple generations old. Added bonus: You can cherry pick cards that are highly rated and proven reliable.

Keep up the good work, RussianSensation; your analysis and input are outstanding.

Agree.

I have recouped 66-80% of my costs on the 5870 and 670 cards. They held their value relatively well. Bought both at launch...