***Official Reviews Thread*** Nvidia Geforce GTX Titan - Launched Feb. 21, 2013

Page 30

AlbertaOilField

Junior Member
Feb 23, 2013
1
0
0
If you wait another year or so, Maxwell will give you better performance at 1/3 of the price and 1/3 of the watts.

Spending $1000 on this is quite idiotic. It's a nice statement piece though.


$1k card / $130k = 0.0077 ... 0.77% of income; less than 1% of total income
$500 card / $130k = 0.0038 ... 0.38% of income; less than 0.5% of total income

$1k card / $50k = 0.02 ... 2% of total income
$500 card / $50k = 0.01 ... 1% of total income

$1k card / $30k = 0.033 ... 3.3%
$500 card / $30k = 0.017 ... 1.7%
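If you'd rather let the computer do the division, here's a quick Python sketch of the same math (the income brackets are just the examples above):

```python
# Share of annual income spent on a graphics card, using the examples above.
def income_share(card_price, income):
    return card_price / income * 100  # percent

for income in (130_000, 50_000, 30_000):
    for card in (1000, 500):
        print(f"${card} card on ${income:,}: {income_share(card, income):.2f}% of income")
```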
 
Idiotic? I think the guy who doesn't make a whole lot of money and spends $500 on a card is more of an idiot than a guy who makes more and buys a Titan.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
I disagree with you, and your personal weighing of words is akin to semantics. You can't change history or the knowledge base already out there that has identified frame time issues. Most articles point out that the issue can happen with single cards as well. Not surprisingly, this isn't a recent problem. It almost comes down to a matter of quality control and/or a design error in the driver.
That's my opinion.

Fair enough. I think we agree much more than we disagree - I was just noting that if we simply refer people to a Wikipedia definition of microstutter, they'll dismiss it as nothing new when in fact the method of testing for microstutter is very new indeed.

And yes, as Keysplayer said, it is still a work in progress.
 

BoFox

Senior member
May 10, 2008
689
0
0
Alright, here's what microstuttering looks like (from HardwareCanucks):

GTX-TITAN-91.jpg

It's SOLID blue all over the place, purely up and down with each and every frame, as if every other frame is almost ("basically") skipped on the GTX 690.

Here's the HD 7970 (although Apoppin recently said that the latest beta driver vastly improved the stuttering/jittering issues while also improving the fps count):
GTX-TITAN-84.jpg

It's not purely SOLID all over the place (though I would still call it just as bad as AFR microstuttering); you can see the gaps in between the ups and downs. Note: the GTX 690 still has actual microstuttering, although the "slow" frame times are not as high, with frames alternating between 10-25 ms on average.

The higher the spikes go, the worse it gets:
GTX-TITAN-90.jpg

TechReport proved with its videos that FRAPS was at least giving a pretty good picture of the problem. Anyway, it's amazing if AMD has already fixed it with 13.2 beta 6 while also increasing fps!
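If you want to check your own logs instead of staring at screenshots, here's a rough Python sketch that flags this kind of alternation in a FRAPS-style frame time dump (the file name and the 50% cutoff are just my assumptions, not any site's official method):

```python
# Rough microstutter check on a FRAPS-style frame time log (one ms value per line).
# File name and cutoff are assumptions, not any review site's actual methodology.
def frame_times(path="frametimes.txt"):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def stutter_report(times_ms):
    # AFR microstutter shows up as frame-to-frame alternation: consecutive
    # frame times that differ by a large fraction of the average frame time.
    deltas = [abs(b - a) for a, b in zip(times_ms, times_ms[1:])]
    avg = sum(times_ms) / len(times_ms)
    jittery = sum(d > 0.5 * avg for d in deltas)  # arbitrary 50%-of-average cutoff
    print(f"avg frame time: {avg:.1f} ms ({1000 / avg:.0f} fps)")
    print(f"frame pairs alternating by >50% of avg: {jittery}/{len(deltas)}")

stutter_report(frame_times())
```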
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
I guess my question would be where the cutoff point is. How low would the frame times have to be, compared to a single card, for a multi-GPU setup with its higher variation to be a better experience? I guess that's different for each person.

This is assuming there are no spikes and settings don't get reduced.
That's what I try to account for by giving multi-GPU setups a "modest" 15% penalty for microstuttering (and for the greater likelihood of incompatibility with less popular games that aren't covered by 70% of the review sites out there).

It'd be great if all review sites out there could cover at least 20 games, with at least 5 of them being unique (not covered by other sites), with exhaustive frame time analysis.

But honestly, microstuttering just needs to go. It's been around for long enough. Ignorance has been bliss for these companies for long enough.

I'd still call the GTX 690 a "faster" card overall than Titan despite such microstuttering, when averaging all of the results. But if one doesn't want to risk such problems ("Oh crap, I'm seeing severe microstuttering in my new favorite game, where it performs no better than a single GPU")... Titan could be a less frustrating experience.
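To show what that 15% penalty actually does to the "faster" claim, a trivial worked example (the fps numbers here are made up):

```python
# Made-up numbers: apply the "modest" 15% microstutter penalty to a dual-GPU result.
titan_fps = 60.0
gtx690_fps = 75.0                      # raw average fps, AFR
gtx690_effective = gtx690_fps * 0.85   # the 15% penalty from the post above
print(f"GTX 690 effective: {gtx690_effective:.1f} fps vs Titan {titan_fps:.1f} fps")
# 63.8 vs 60.0 -- still "faster" on paper, but the gap shrinks a lot.
```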
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Oh, Crossfire? We're talking about this, right:
fr-3.png


Or that:
fc3-7970cf.gif


Or here:
Anno_hd7970cf.png


Damn, what a wonderful gaming experience you get with AMD.
 

Majcric

Golden Member
May 3, 2011
1,370
37
91
The Titan definitely needs a few driver revisions. It's very much lacking at its price point.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
I thought that benchmarks trying to measure the user experience while the screen is tearing had been totally debunked already.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yes, because playing at 30 FPS in Crysis 3 is the next level of gaming experience on a Crossfire system.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Oh, Crossfire? We're talking about this, right:
fr-3.png


Or that:
fc3-7970cf.gif


Or here:
Anno_hd7970cf.png


Damn, what a wonderful gaming experience you get with AMD.


$1000 and the ultimate holy grail metric, frametimes?

Wait, the Titan suffers from microstutter too; funny you only mention AMD.
GTX-TITAN-86.jpg

GTX-TITAN-87.jpg


Even the NV 690 has it; apparently it's not only AMD that has microstutter? Nice try, shill.
GTX-TITAN-91.jpg
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
crysis3_evil_2560smaa.png

http://pclab.pl/art52489-5.html

With numbers like that, I don't see a problem with the price of TITAN.

Nice cherry picking. There are many reviews which show the Titan is 35-40% faster than the HD 7970 GHz in Crysis 3.

http://www.pcper.com/reviews/Graphi...mance-Review-and-Frame-Rating-Update/Crysis-3-

http://www.hardwarecanucks.com/foru...orce-gtx-titan-6gb-performance-review-10.html

http://hothardware.com/Reviews/GeForce-GTX-Titan-Performance-Yes-It-CAN-Play-Crysis-3/?page=9

http://www.pcgameshardware.de/Grafikkarten-Hardware-97980/Tests/Test-Geforce-GTX-Titan-1056659/5/

By now almost every major tech site has shown that Titan is on average 30-35% faster than the HD 7970 GHz, so stop justifying the price. There are websites that have called out Nvidia directly over the ridiculous price.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/35.html

"With the GeForce GTX 680 and HD 7970 GHz Edition being available at around $450, I find it extremely hard to justify spending more than twice that amount of money for around 30% more performance. A more reasonable price for the GTX Titan would be between $600 and $700, but it looks like NVIDIA doesn't feel like they have to sell a ton of these cards. The GTX Titan is clearly positioned as a premium product, and like the GTX 690, I expect it to remain at a premium price point for as long as there is no real competition."

"Sure, the statement "NVIDIA has the fastest single GPU" holds true, but "NVIDIA has the most overpriced single-GPU card in 25 years of VGA history" is also equally true."

Titan is a good product with bad pricing. But Nvidia knows it can milk the market until AMD brings out a product which comes closer to it at half the price. Even an HD 8970 with a modest 15-20% improvement at USD 500 would put pressure on Nvidia to reduce Titan pricing. In fact, a GTX 780 based on a GK114 with 1920 CUDA cores and 15-20% more performance at USD 500 would make the Titan look silly at its current pricing. So eventually Titan will make its way down to USD 700, maybe in a different form: a 1/24 DP version with 3 GB of GDDR5. In fact, if they had done that now, they would have had a knockout product.
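Put it in perf-per-dollar terms with the figures quoted above (rough numbers, obviously):

```python
# Perf-per-dollar using the quoted figures: Titan ~30% faster at ~$1000
# vs a GTX 680 / HD 7970 GHz at ~$450 (prices from the TechPowerUp quote).
base_price, titan_price = 450, 1000
base_perf, titan_perf = 1.00, 1.30
print(f"$ per unit of performance, baseline: {base_price / base_perf:.0f}")
print(f"$ per unit of performance, Titan:    {titan_price / titan_perf:.0f}")
# 450 vs 769 -- you pay roughly 1.7x as much per frame for the fastest single GPU.
```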

So please spare us your competitive analysis. :biggrin:
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
Not to mention the many sites that are now beginning to report performance degradation once Titan has had a chance to heat up. In those cases I would estimate that Titan is less than 30% faster than the GTX 680 and 7970.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Not to mention the many sites that are now beginning to report performance degradation once Titan has had a chance to heat up. In those cases I would estimate that Titan is less than 30% faster than the GTX 680 and 7970.

What "many sites"? What degraded performance? Have you seen a graph of AMD's turbo boost implementation?

There are also sites noting that three stacked cards in tri-SLI never topped 79C in continuous testing, which is below the turbo-throttling threshold.
From a link posted above:
Temperature and Power Consumption
In terms of power consumption and temperatures, we have to hand it to Nvidia; during the week or so of testing that we did, the cards never went above 79C. That is, even when all three cards were stacked next to each other and running games at high settings. In addition to that, these cards ran incredibly quiet and cool, with the idle being somewhere around 25C, which is impressive for most graphics cards, let alone a reference design.
In terms of power consumption, this card at its very max only consumed 237W, which isn't bad at all when you consider the fact that the GTX 680 consumed 186W. Take into account that the Titan has twice the number of transistors of the GTX 680 and only draws 50 more watts; that's a pretty big deal for Nvidia.
We'll come back to you once we've had more time with this card and have had a chance to find the one that overclocks best out of our three. We believe that we can probably get this card to overclock somewhere near 1.2GHz, since we were able to do 1.3GHz with our GTX 680.
http://www.brightsideofnews.com/new...and-battlefield-3-vs-hd-7970-ghz-edition.aspx


Also from the AnandTech review article: these settings are user-adjustable in EVGA Precision, just as a user can adjust the power limit to +20% in CCC.

http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled
The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.
First and foremost of course is that the temperature slider controls what the target temperature is for Titan. By default for Titan this is 80C, and it may be turned all the way up to 95C. The higher the temperature setting the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking just like the power target adjustment was on GTX 680.
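Nvidia hasn't published the actual boost algorithm, but conceptually the temperature target behaves something like this toy sketch (every number and name here is my guess, not Nvidia's code; the 80C default and 95C max are from the AnandTech quote above):

```python
# Toy model of a temperature-target boost controller; NOT Nvidia's real algorithm.
# Default target 80C, user-adjustable up to 95C, per the AnandTech quote above.
def next_boost_bin(current_mhz, temp_c, target_c=80,
                   base_mhz=837, max_mhz=993, step=13):
    if temp_c < target_c and current_mhz < max_mhz:
        return current_mhz + step   # thermal headroom: step up one boost bin
    if temp_c > target_c and current_mhz > base_mhz:
        return current_mhz - step   # over target: step down to cool off
    return current_mhz              # sitting at the target: hold

clock = 837
for temp in (65, 70, 74, 78, 81, 83, 82, 80):
    clock = next_boost_bin(clock, temp)
    print(f"{temp}C -> {clock} MHz")
```

The point being: a higher temperature target (or better case cooling) just lets the card sit in its upper boost bins longer, which is why the slider acts like a weak overclock.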
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Not to mention the many sites that are now beginning to report performance degradation once Titan has had a chance to heat up. In those cases I would estimate that Titan is less than 30% faster than the GTX 680 and 7970.

Any link to this?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
What "many sites"? What degraded performance? Have you seen a graph of AMD's turbo boost implementation?

There are also sites noting that three stacked cards in tri-SLI never topped 79C in continuous testing, which is below any turbo-throttling threshold.
From a link posted above.

http://www.brightsideofnews.com/new...and-battlefield-3-vs-hd-7970-ghz-edition.aspx

I haven't read the link, but isn't Titan's GPU Boost limited by temperature? I think when running in SLI the cards won't go above a certain temperature; they would just boost less. Correct me if I'm wrong.

EDIT: They are putting Titan in SFF PCs; I'm guessing it won't perform as well there as in a case with better cooling. Conversely, the water-cooled ones would perform better. I don't think it's comparable to AMD's solution, which is tied to PowerTune, not temperature.
 
Last edited:

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
A couple of tidbits; go read the linked post below for more info.

Allegedly the clocks are "boosted" while the card is cooler, so benchmarks can be skewed; it takes e.g. 20 minutes for the card to settle to the true clocks that will stay.

PCGH's benchmarks show a 5-10% difference in average fps between "dynamic boost" (open-air rig) and "28 degrees Celsius" at 2560x1600 (except for Skyrim, which shows a 19% difference). Sometimes the 28C result is slower than the fixed 876MHz result, and sometimes it is faster.

Computerbase.de is saying the same thing (translated):

This explains Computerbase.de's relatively low scores for Titan (default, not "MAX") compared to most other review sites.

The same goes for Hardware.fr (translated page explaining such) and its "relatively" low scores for Titan.

http://forums.anandtech.com/showpost.php?p=34664254&postcount=54
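The practical upshot for anyone benchmarking one of these: log clocks over a long run, not a quick one. A hypothetical sketch using nvidia-smi (the query fields and loop timing are my assumptions; check your own driver's tooling):

```python
# Log GPU clock and temperature every 30s for ~25 minutes while a game runs,
# to catch the boost clocks settling as the card heats up.
# Assumes nvidia-smi supports these query fields; verify with your driver.
import subprocess, time

for sample in range(50):  # 50 samples x 30s = 25 minutes
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.sm,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True).stdout.strip()
    print(f"t+{sample * 30:>4}s  {out}")
    time.sleep(30)
```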
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I haven't read the link, but isn't Titan's GPU Boost limited by temperature? I think when running in SLI the cards won't go above a certain temperature; they would just boost less. Correct me if I'm wrong.

EDIT: They are putting Titan in SFF PCs; I'm guessing it won't perform as well there as in a case with better cooling. Conversely, the water-cooled ones would perform better. I don't think it's comparable to AMD's solution, which is tied to PowerTune, not temperature.
I added to my post above. You can adjust the temperature and TDP targets from the stock settings. This fear mongering is more a case of not telling the whole story. If you were to put this card in an SFF PC, it could probably run better than lower-TDP cards because of its advanced monitoring/sensor controls.