GTX 980Ti finally launched - MSRP $649 - Reviews


tential

Diamond Member
May 13, 2008
7,355
642
121
Yes, it's shared, but the PS4 has a total of 8GB; some people say 6GB is available for games, while others say it's 4.5-5GB. Either way you look at it, it's more than 4GB, which is one reason why 4GB is a major point of contention, especially for a top-end part.

First, computers aren't consoles. Second, that's 8GB of shared RAM. Quoting the max RAM usage of a console for games as the minimum needed for VRAM makes zero sense in that regard.
Really, most of what has been posted here drawing conclusions from consoles has been just lulz-worthy. If you want to base your RAM usage off consoles, then technically 4GB of system RAM and 4GB of VRAM should be more than enough....

Lots of just-lol posts in this thread. I guess GTX 970 owners should throw away their useless 3.5GB cards....
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
First, computers aren't consoles. Second, that's 8GB of shared RAM. Quoting the max RAM usage of a console for games as the minimum needed for VRAM makes zero sense in that regard.
Really, most of what has been posted here drawing conclusions from consoles has been just lulz-worthy. If you want to base your RAM usage off consoles, then technically 4GB of system RAM and 4GB of VRAM should be more than enough....

Lots of just-lol posts in this thread. I guess GTX 970 owners should throw away their useless 3.5GB cards....

Interesting comment. You agree that computers should have more system RAM but don't think it's necessary to have more VRAM? Nice goalpost switch.

You're right though, consoles aren't PCs. PC games are typically less optimized and have higher-quality assets, making more RAM (VRAM and system RAM) more desirable. Thanks for pointing that out, though it doesn't appear to have helped your side of the argument at all.

No one said anything about 3.5GB being "useless"; those are your own words, in a desperate attempt to defend the 390X.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Lots of just-lol posts in this thread. I guess GTX 970 owners should throw away their useless 3.5GB cards....

They probably should be rid of them by the time Pascal drops. That 3.5 GB setup is begging for a return of Kepler syndrome.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
When people use the term "aggressive pricing" for $650, you know there's a problem. The fact that NVIDIA is regularly pricing its flagships above $600 now and that's a-okay, and mid-range cards like the 980 are $500+... it really makes you wonder what the pricing would look like if AMD wasn't around. Obviously they aren't having much of an impact on NVIDIA's pricing now. It would be nice if people sent them a message and went Fiji this time around, but why do that when you can get 4X Titan Xs at $1000 a pop.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Interesting comment. You agree that computers should have more system RAM but don't think it's necessary to have more VRAM? Nice goalpost switch.

You're right though, consoles aren't PCs. PC games are typically less optimized and have higher-quality assets, making more RAM (VRAM and system RAM) more desirable. Thanks for pointing that out, though it doesn't appear to have helped your side of the argument at all.

No one said anything about 3.5GB being "useless"; those are your own words, in a desperate attempt to defend the 390X.

The 390X has 8GB of RAM....
I never said that PCs should have more system RAM.

Lol at me convincing you to try to buy an AMD card. You've been dead set against Fiji for a while now. Dunno why you pretend you aren't; it's whatever. I won't join the multitude of posters baited into debating your already-made-up decision before we've even seen the competition.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
It boggles my mind that people think a card at Titan X/980 Ti level of performance with 4GB is the same as a 970 with 4GB...

The 970 hardly had a chance to be playable with anything that used more than 4GB of VRAM, unless you have more than one card. The new top cards have a chance of being playable at settings that use >4GB VRAM.

Is that so hard to understand?
 

SimianR

Senior member
Mar 10, 2011
609
16
81
It boggles my mind that people think a card at Titan X/980 Ti level of performance with 4GB is the same as a 970 with 4GB...

The 970 hardly had a chance to be playable with anything that used more than 4GB of VRAM, unless you have more than one card. The new top cards have a chance of being playable at settings that use >4GB VRAM.

Is that so hard to understand?

I think people do understand that, but they're also curious if there are any actual real world situations, even at 4K, where having more than 4GB right now is making a significant impact on performance. I'm not sure that there are - that's not to say that there won't be down the road. Even on Fiji/TitanX/980 Ti - it's likely that if you start pushing VRAM usage that high the settings are unplayable anyway.

In Crossfire/SLI on the other hand, especially if they're planning to hold on to the cards long term, I can see having 4GB+ making a bigger impact.
 

Sohaltang

Senior member
Apr 13, 2013
854
0
0
Gaming at 1440p, does this offer much value over a 780 Ti Classified OC'd at 1300-1350MHz? Maybe I should wait for the AMD offering.
 

Owls

Senior member
Feb 22, 2006
735
0
76
Man, this is still a tough sell to upgrade my 780 Ti. What's the next big Nvidia GPU to be released after this one, and when?

There's zero reason to upgrade from a 780 Ti to a 980 Ti. I'm running 780 Tis in SLI and I won't upgrade until the next Ti model comes out. 1080 Ti? lol

Besides, we haven't seen what AMD has in store.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I just think there's a lot of effort going into spamming this thread.

If you want to talk about how 6GB is a waste, make a new thread.

This isn't the place to discuss all that.

Back on topic:
How is the availability of the 980 Ti? Many seem to be rushing to buy one, from what I'm reading across various forums.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
The 390X has 8GB of RAM....
I never said that PCs should have more system RAM.

Lol at me convincing you to try to buy an AMD card. You've been dead set against Fiji for a while now. Dunno why you pretend you aren't; it's whatever. I won't join the multitude of posters baited into debating your already-made-up decision before we've even seen the competition.

So you think PCs should only have as much RAM as consoles? A card that costs more than an entire console, having only as much memory? And that's assuming the memory is a 50/50 split, like you're doing here, which it isn't.

If you want to base your RAM usage off consoles, then technically 4GB of system RAM and 4GB of VRAM should be more than enough....

You're confused. I'm saying PCs should have MORE than consoles. Let's also not forget, the PS4 is rendering at 1080p and sometimes less.

Also, please show me the info you have for an 8GB 390X.
 

B-Riz

Golden Member
Feb 15, 2011
1,529
676
136
It boggles my mind that people think a card at Titan X/980 Ti level of performance with 4GB is the same as a 970 with 4GB...

The 970 hardly had a chance to be playable with anything that used more than 4GB of VRAM, unless you have more than one card. The new top cards have a chance of being playable at settings that use >4GB VRAM.

Is that so hard to understand?

I am of the mindset that more VRAM is needed for higher resolutions and the incoming VR explosion, and one should buy accordingly. (Dunno if 12GB of VRAM is needed, as that's recommended to be paired with 24GB of system memory...)

Dismissing a new product that will have a minimum of 4GB, and may have more (we don't know), doesn't make sense.

I currently just want to take advantage of my 144Hz 1080p LCD; moar FPS, and I'll turn down eye candy to get it...
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I think people do understand that, but they're also curious if there are any actual real world situations, even at 4K, where having more than 4GB right now is making a significant impact on performance. I'm not sure that there are - that's not to say that there won't be down the road. Even on Fiji/TitanX/980 Ti - it's likely that if you start pushing VRAM usage that high the settings are unplayable anyway.

In Crossfire/SLI on the other hand, especially if they're planning to hold on to the cards long term, I can see having 4GB+ making a bigger impact.

This was the EXACT same argument made for 2GB cards. It turned out to be nothing more than a convenient way to delude oneself. I know, I've been there.

If a PS4, with its APU that is FAR weaker than the cards you mentioned, can take advantage of it, these cards surely can. The only thing that would make pushing >4GB of assets unplayable on any of these cards is if they only have 4GB of VRAM.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Gaming at 1440p, does this offer much value over a 780 Ti Classified OC'd at 1300-1350MHz? Maybe I should wait for the AMD offering.

It's faster than two 780 Tis in a lot of recent games, so yeah, it's pretty good. The OC will probably be better too. I would suggest waiting for custom PCBs and AMD's offerings before making the jump, unless you need it now.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Gaming at 1440p, does this offer much value over a 780 Ti Classified OC'd at 1300-1350MHz? Maybe I should wait for the AMD offering.

Depends on the games you're playing. Like you, I'm gaming at 1440p. I've been playing The Witcher 3 pretty much exclusively lately, and the performance is less than desirable with my GTX 780 OC'd to 1267MHz+. The new NVIDIA driver helps Kepler's performance, but it's still pretty weak sauce.

Just look here in the Guru3D review - Ultra settings w/AA, no HairWorks. The GTX 980 Ti is twice as fast as a stock GTX 780 Ti. Now, yours is overclocked, so add 5-8 FPS to that.

 

Stormflux

Member
Jul 21, 2010
140
26
91
The biggest open-world games this season, Witcher 3 and GTA V, use 1842MB and 4221MB respectively, according to Guru3D.
http://www.guru3d.com/articles-pages/gta-v-pc-graphics-performance-review,9.html
http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,9.html

Battlefield Hardline uses 3667MB at 4k.
http://www.guru3d.com/articles_pages/battlefield_hardline_vga_graphics_performance_review,8.html

But looking at those graphs, stepping down to the next resolution cuts almost 500MB to 1GB of video RAM usage.

Is there a technical breakdown of what video RAM is actually being used for at game time? People keep saying "loading assets" will fill up the memory, but I find that hard to believe unless you download 8K texture packs or games are filled with million-polygon objects (tessellation).

It seems to have more to do with sampling. And if it is sampling, isn't that a function of bandwidth?

What I mean is: a 2K texture at 1080p is the same 2K texture at 4K, just as a 40k-tri model is the same model at either resolution.
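Some back-of-the-envelope math on those Guru3D deltas: full-resolution render targets alone only explain part of the 500MB-1GB drop between resolutions, which suggests the rest comes from engines streaming higher texture mips at higher resolutions. A rough sketch of the render-target portion (the buffer count and pixel format here are my own illustrative assumptions, not from any engine or review):

```python
# Hypothetical estimate: memory used by full-resolution render targets,
# assuming ~8 RGBA8 buffers (G-buffer layers, depth, HDR color,
# post-processing targets). Counts and formats are illustrative only.

def render_target_mib(width, height, buffers=8, bytes_per_pixel=4):
    """Total size of full-resolution render targets, in MiB."""
    return width * height * bytes_per_pixel * buffers / 1024 ** 2

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160))]:
    print(f"{name}: ~{render_target_mib(w, h):.0f} MiB")
```

Even under these generous assumptions, 4K adds only ~190 MiB over 1080p in render targets (about 253 MiB vs 63 MiB), so the remaining several hundred MB of the observed gap would have to come from resolution-dependent asset streaming rather than the framebuffer itself.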
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Depends on the games you're playing. Like you, I'm gaming at 1440p. I've been playing The Witcher 3 pretty much exclusively lately, and the performance is less than desirable with my GTX 780 OC'd to 1267MHz+. The new NVIDIA driver helps Kepler's performance, but it's still pretty weak sauce.

Just look here in the Guru3D review - Ultra settings w/AA, no HairWorks. The GTX 980 Ti is twice as fast as a stock GTX 780 Ti. Now, yours is overclocked, so add 5-8 FPS to that.


I'd be interested to see these benchmarks using the drivers that were just released yesterday which are said to improve Kepler performance.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I'd be interested to see these benchmarks using the drivers that were just released yesterday which are said to improve Kepler performance.

Me too.

Here is some more info on the newest drivers 353.06

Kepler Performance Optimizations

Following end user reports of lower-than-expected performance in The Witcher 3: Wild Hunt when using GeForce GTX 600 and 700 Series Kepler GPUs, we have identified and fixed three bugs that were limiting performance not only in The Witcher 3, but also Far Cry 4 and Project Cars. With the new GeForce Game Ready drivers installed, frame rates are increased in each title, improving and optimizing your experience.

We would like to thank the community for their feedback on this issue. These reports and benchmarks helped greatly in rapidly identifying and resolving the issue. If you have any further feedback following the installation of the new driver, please post it in this thread.

Some users are reporting a large increase in FPS, and others are saying no performance increase at all. I'll have a chance to test it tonight.

EDIT: After seeing many users with positive performance increases for GTA V, FC 4, and TW3, I may have to recant my "weak sauce" statement.
 

moonbogg

Lifer
Jan 8, 2011
10,636
3,095
136
Me too.

Here is some more info on the newest drivers 353.06



Some users are reporting a large increase in FPS, and others are saying no performance increase at all. I'll have a chance to test it tonight.

EDIT: After seeing many positive performance increases with GTA V and TW3, I may have to recant my "weak sauce" statement.

So they're calling it a "bug", right? I swear, in all honesty, I do not believe them when they say that. I want to see some new numbers for the fix.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
So they're calling it a "bug", right? I swear, in all honesty, I do not believe them when they say that. I want to see some new numbers for the fix.

We discovered a bug in Witcher 3 that was compromising performance and it has been fixed. For more detail see below.

bug description: We completely forgot about Kepler once we released Maxwell. Buy more Maxwell cards, please.


But seriously, the 980 Ti seems like a great card. I would just be somewhat nervous about its performance once Pascal arrives, but I guess the target market would just grab a new card anyway.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
We discovered a bug in Witcher 3 that was compromising performance and it has been fixed. For more detail see below.

bug description: We completely forgot about Kepler once we released Maxwell. Buy more Maxwell cards, please.


LOL :biggrin:
 

B-Riz

Golden Member
Feb 15, 2011
1,529
676
136
We discovered a bug in Witcher 3 that was compromising performance and it has been fixed. For more detail see below.

bug description: We completely forgot about Kepler once we released Maxwell. Buy more Maxwell cards, please.


But seriously, the 980 Ti seems like a great card. I would just be somewhat nervous about its performance once Pascal arrives, but I guess the target market would just grab a new card anyway.

Lol.

980 Ti OC perf, :thumbsup:

Price :'(
 

Sohaltang

Senior member
Apr 13, 2013
854
0
0
Depends on the games you're playing. Like you, I'm gaming at 1440p. I've been playing The Witcher 3 pretty much exclusively lately, and the performance is less than desirable with my GTX 780 OC'd to 1267MHz+. The new NVIDIA driver helps Kepler's performance, but it's still pretty weak sauce.

Just look here in the Guru3D review - Ultra settings w/AA, no HairWorks. The GTX 980 Ti is twice as fast as a stock GTX 780 Ti. Now, yours is overclocked, so add 5-8 FPS to that.



Obviously it varies by game, and Valley is only a synthetic benchmark. My 780 Ti Classy benches (at 1080p, I know) above any 980 or 290X in our local benchmarking thread. https://docs.google.com/spreadsheet...q0CGV7dCfS8/pub?single=true&gid=0&output=html

Haven't played Witcher yet, just too busy. Someone (hint hint) should do a 1440p and 4K bench thread.