***Official Reviews Thread*** Nvidia GeForce GTX Titan - Launched Feb. 21, 2013


Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
The amount of fanboyism in this thread makes me want to go somewhere else. Geez, I can't believe some are defending Titan against a 7970 GE. $400 vs. $1000. I don't care how you slice it, that card is a ripoff. I can get 70-75% of the performance from a 7970 GE and narrow the gap even further if I overclock. Don't get me wrong, Titan is a great card, but the price puts it in highway robbery status and you can't even fully overclock the thing! The $550 AMD launch price was nothing compared to this, and I would not even complain if AMD released the 8970 at $550, because it would still be better price/performance if the increase were only a modest 15%.


Once I can find a GTX 690 at $750-800 this year, that will be my next card.
/end rant
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I'm not going to bother reading through all the fanboy QQ'ing, but has anyone seen numbers on Titan's bitcoin hashrate?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Oh well, price/performance-conscious PC upgraders will skip this overpriced card and wait for 20nm for a reasonable jump in price/performance.

20nm cards will likely have pretty bad price/perf increases just like the first batch of 28nm cards, thanks to Apple and co. eating up so much TSMC 20nm capacity.

On the other hand, do we HAVE to upgrade? Have you seen the specs on the PS4? I don't think anyone with a 79xx or 680/70 needs to upgrade anytime soon. Consolification has stunted PC game development for years, and I don't think Crysis 3 can reverse that trend all by itself. Worse, now we have mobile gaming competing for dollars as well, and a lot of talented devs are working on the next Farmville or whatever instead of the next Half Life 3. (Don't take that literally. But you know what I mean.) I may end up using my 79xx for a LOOOOT longer than I ever thought, unless I get 120Hz screens or something. Even 5760x1080 resolution is pretty good on an oc'd 7970 for a lot of older games, which is mostly what I play anyway.
 

Deltaechoe

Member
Feb 18, 2013
113
0
0
OK, I'm getting tired of being called an nVidia fanboy (yes, some have called me out in private), so let me clear the air here. I have been using AMD cards for the past 4 years for computing, bitcoin mining and gaming. The Titan makes things easier for me because:
1. Single GPU. I'm tired of CrossFire and want a single GPU to fix all the microstutter I get (it's a lot).
2. CUDA optimizations for computing performance. I do have computing needs and have needed them for a long time. The only problem is that 99% of the work I do is CUDA-optimized, so my AMD cards are pretty gimped compared to the other people on the projects.
3. SLI prospects and high-resolution gaming. With 6 GB of VRAM, I'm not worried about running things across my 3 monitors while still making it look pretty, especially when I buy a second one to drive it better.

For anything OpenCL, and when it comes to bitcoin mining (mining is largely not worth it anymore, though), I would still say go with AMD cards, but if you relate to the reasons above, the Titan isn't such a bad deal.

Also, if you are in it purely for gaming and don't intend to push resolutions to the absolutely ridiculous, then go with the GTX 690.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
The amount of fanboyism in this thread makes me want to go somewhere else. Geez, I can't believe some are defending Titan against a 7970 GE. $400 vs. $1000. I don't care how you slice it, that card is a ripoff. I can get 70-75% of the performance from a 7970 GE and narrow the gap even further if I overclock. Don't get me wrong, Titan is a great card, but the price puts it in highway robbery status and you can't even fully overclock the thing! The $550 AMD launch price was nothing compared to this, and I would not even complain if AMD released the 8970 at $550, because it would still be better price/performance if the increase were only a modest 15%.


Once I can find a GTX 690 at $750-800 this year, that will be my next card.
/end rant

I have to agree with this, although this memory-delay frame stutter crap needs more investigation... but if you need anything more than a 7970 GE, what more is there? Certainly not CF...
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Can someone calmly explain to me how in the following [H] graph, the 7970's performance is "laggy" and "choppy", as opposed to Titan's which, according to Kyle, is perfectly playable?

[Image: 1361407369LgJkN5z5XL_5_3_l.gif - HardOCP framerate-over-time graph, 7970 vs. Titan]



I mean, aside from the 10% average fps advantage, neither of the two roller-coaster lines looks any more ... rollery ... coastery than the other (yes, that's a real word - look it up).
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Can someone calmly explain to me how in the following [H] graph, the 7970's performance is "laggy" and "choppy", as opposed to Titan's which, according to Kyle, is perfectly playable?

Where have you been for the last few months? I would recommend starting with this article and then googling for others:

http://techreport.com/review/24051/geforce-versus-radeon-captured-on-high-speed-video

FPS is not a good metric because a) most of the time people are citing FRAPS fps, which is iffy because it's not what your eyes actually see on the screen (FRAPS measurements take place earlier in the imaging chain than the actual output on the monitor and don't take into account things the GPU may do to smooth out your gaming experience); and b) our eyes can perceive differences down to tiny fractions of a second, so measuring in whole frames per second is far too coarse. You can have high fps but a jagged playing experience if the time between frames - the so-called "frame time" - jumps around a lot.

I've given PCPer some flak in the past, but they are currently at or near the front of the pack when it comes to measuring actual gaming experience vs. simple, flawed fps numbers: http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat
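
To put some numbers behind the "high fps but jagged" point, here's a rough sketch (my own toy illustration, not FRAPS's or any review site's actual tooling; the frame-time lists are made up) of how two runs with the same average FPS can feel completely different once you look at the individual frame times:

```python
import statistics

def summarize(frame_times_ms, label):
    """Print average FPS plus the frame-time spread for a list of per-frame times (ms)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    p99 = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]
    print(f"{label}: {avg_fps:.1f} avg fps, "
          f"mean frame time {statistics.mean(frame_times_ms):.1f} ms, "
          f"99th percentile {p99:.1f} ms, worst {max(frame_times_ms):.1f} ms")

# 60 frames delivered perfectly evenly over ~one second...
smooth = [16.7] * 60
# ...vs. 60 frames where every fourth frame hitches badly.
stuttery = [10.0, 10.0, 10.0, 36.7] * 15

summarize(smooth, "smooth")      # ~60 avg fps, worst frame ~17 ms
summarize(stuttery, "stuttery")  # also ~60 avg fps, but worst frame ~37 ms
```

Both runs report essentially the same average FPS; only the frame-time spread exposes the stutter.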
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
For all the forum RAGE against AMD's MS woes... storm in a teacup. Heck, RadeonPro smooths frame latencies on AMD cards to become "smoother" than NV. Free program with a nice onscreen monitor; takes a few minutes to set up and leave it on auto.

I strongly disagree. RadeonPro can MITIGATE the microstutter from CrossFire, but not to a level where it's as smooth as a single Radeon card.

I agree that reviews URGENTLY need to start looking systematically at frame times and not just frame rates.

I say this as a happy 7970 owner who thinks the Titan is a rip-off, btw.

RS: I love reading your posts, but IMHO you too need to start looking beyond mere FPS as a metric of performance.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
For all the forum RAGE against AMD's MS woes... storm in a teacup. Heck, RadeonPro smooths frame latencies on AMD cards to become "smoother" than NV. Free program with a nice onscreen monitor; takes a few minutes to set up and leave it on auto.

Yes, we've seen RadeonPro smoothing.
Rikard (I think) posted a Max Payne 3 gameplay video, with a log of frametimes that looked lovely indeed.

Pretty sure the game was actually frame skipping. So much for RadeonPro "smoothing".


Actually - those are 3 frames. :|

The fun part is that even the single 7970 GHz is showing fewer frames with the Frame Rating capture by PCPer.com. So we should wait for the whole report, but it looks like AMD is boosting the FPS with a few extra useless frames...

FFS...
I missed that tiny narrow line.

And I thought you had been exaggerating about cheating, but it really looks like AMD is pulling the leg of CF users, and reviewers tbh.

[Image: fr_cf_1.jpg - PCPer Frame Rating capture of CrossFire output, with the narrow sliver frame visible]
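
For anyone trying to picture what the capture-based approach is doing with shots like the one above, here's a very loose sketch of the runt-frame idea (my own toy illustration, not PCPer's actual Frame Rating pipeline; the scanline counts and the cutoff are assumptions): a tool like FRAPS counts every sliver as a frame, while a capture-based count would throw the slivers out before computing an "observed" frame rate.

```python
# Assumed values for illustration only - not official figures from any tool.
SCREEN_LINES = 1080                  # vertical resolution of the captured output
RUNT_CUTOFF = 0.2 * SCREEN_LINES     # frames occupying fewer lines count as runts

def observed_fps(scanlines_per_frame, capture_seconds):
    """Count only frames that occupied a meaningful slice of the screen."""
    shown = [h for h in scanlines_per_frame if h >= RUNT_CUTOFF]
    return len(shown) / capture_seconds

# Hypothetical one-second capture: 8 frames were presented, but two of them
# were only a few scanlines tall (the "tiny narrow line" in the screenshot).
frames = [540, 530, 12, 560, 545, 8, 555, 540]
print("FRAPS-style fps:", len(frames))                 # 8
print("Observed fps:   ", observed_fps(frames, 1.0))   # 6.0
```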
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Where have you been for the last few months? I would recommend starting with this article and then googling for others:

http://techreport.com/review/24051/geforce-versus-radeon-captured-on-high-speed-video

FPS is not a good metric because a) most of the time people are citing FRAPS fps, which is iffy because it's not what your eyes actually see on the screen (FRAPS measurements take place earlier in the imaging chain than the actual output on the monitor and don't take into account things the GPU may do to smooth out your gaming experience); and b) our eyes can perceive differences down to tiny fractions of a second, so measuring in whole frames per second is far too coarse. You can have high fps but a jagged playing experience if the time between frames - the so-called "frame time" - jumps around a lot.

I've given PCPer some flak in the past, but they are currently at or near the front of the pack when it comes to measuring actual gaming experience vs. simple, flawed fps numbers: http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat


Yeah, I know about the frame time thing, but I wasn't talking about average FPS, I was referring to the graph with rugged-y lines (what do you call those, by the way?)

[Image: 1361407369LgJkN5z5XL_5_3_l.gif - HardOCP framerate-over-time graph, 7970 vs. Titan]



This is practically a frame-by-frame graph, right? Or is the horizontal scale measured in seconds instead of frames? There's no mention of that, so I can't tell for sure.

edit: I guess it doesn't really make sense if the horizontal scale is measured in frames, so seconds it is then. But even so, that only means we get an accurate, real-time representation of the game's performance ... and the graph does not show any more jitter from the Radeon than from the Titan...
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Yeah, I know about the frame time thing, but I wasn't talking about average FPS, I was referring to the graph with rugged-y lines (what do you call those, by the way?)

[Image: 1361407369LgJkN5z5XL_5_3_l.gif - HardOCP framerate-over-time graph, 7970 vs. Titan]



This is practically a frame-by-frame graph, right? Or is the horizontal scale measured in seconds instead of frames? There's no mention of that, so I can't tell for sure.

That's a framerate graph, with the average framerate calculated for each one-second interval. It is not a frame-by-frame representation of performance, which would be measured in milliseconds per frame rather than frames per second.
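
If it helps to see what that kind of per-second averaging actually does, here's a small sketch (my own illustration of the averaging described above, with made-up frame times): frames get bucketed into whole seconds and counted, so a single 50 ms hitch inside a second simply disappears into that second's average.

```python
def fps_per_second(frame_times_ms):
    """Return one FPS value per elapsed second of the run, like the graph above."""
    buckets = []
    elapsed_ms, count, second_edge = 0.0, 0, 1000.0
    for ft in frame_times_ms:
        elapsed_ms += ft
        count += 1
        if elapsed_ms >= second_edge:   # a full second has elapsed
            buckets.append(count)
            count = 0
            second_edge += 1000.0
    return buckets

# Two seconds of play: the second one contains a nasty 50 ms spike,
# yet both buckets come out looking perfectly healthy.
run = [16.7] * 60 + [50.0] + [15.0] * 64
print(fps_per_second(run))  # [60, 65] - the 50 ms hitch never shows up
```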
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76

That's simply a chart of fps over ENTIRE SECONDS of time. That's not useful. What we need is millisecond-level resolution if you are going to make a graph like that, or alternatively frame times, because we are interested in how the cadence of frames changes. Better yet is to do like PCPer and look at exactly what a "frame" looks like.

In case you missed it the first time: frames per second is not a good way to measure real-life gaming experience. You can have high fps and yet still have a very stuttery gaming experience that feels unpleasant.

This is why almost all video card review sites suck when it comes to measuring real-life gaming experience, including AnandTech, because Ryan Smith spent his time updating some outdated GPU bench instead of making something competitive with PCPer and Tech Report. Only Tech Report, PCPer, and to a limited extent HardOCP (by reporting their subjective feelings about smoothness and not just fps) are worth reading when it comes to judging video card performance as it relates to actual gaming experience.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Yes, we've seen RadeonPro smoothing.
Rikard (I think) posted a Max Payne 3 gameplay video, with a log of frametimes that looked lovely indeed.

Pretty sure the game was actually frame skipping. So much for RadeonPro "smoothing".




FFS...
I missed that tiny narrow line.

And I thought you had been exaggerating about cheating, but it really looks like AMD is pulling the leg of CF users, and reviewers tbh.

[Image: fr_cf_1.jpg - PCPer Frame Rating capture of CrossFire output, with the narrow sliver frame visible]
Is it AMD's fault that FRAPS counts that as 2 frames?
Are you also saying it's deliberate?
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Vsync, anyone? Frame limiting, anyone?

Looks like PCPer's method of measurement is going to be useless too.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
PCPer can run it with Vsync and see if that changes anything. Hardly useless. And WAAAAAAAAAAAAAY more useful than 99% of review sites which are still using only the outdated fps metric.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
That's simply a chart of fps over ENTIRE SECONDS of time. That's not useful. What we need is millisecond-level resolution if you are going to make a graph like that, or alternatively frame times, because we are interested in how the cadence of frames changes. Better yet is to do like PCPer and look at exactly what a "frame" looks like.

In case you missed it the first time: frames per second is not a good way to measure real-life gaming experience. You can have high fps and yet still have a very stuttery gaming experience that feels unpleasant.

This is why almost all video card review sites suck when it comes to measuring real-life gaming experience, including AnandTech, because Ryan Smith spent his time updating some outdated GPU bench instead of making something competitive with PCPer and Tech Report. Only Tech Report, PCPer, and to a limited extent HardOCP (by reporting their subjective feelings about smoothness and not just fps) are worth reading when it comes to judging video card performance as it relates to actual gaming experience.


You can add hardwarecanucks!

http://www.hardwarecanucks.com/foru...orce-gtx-titan-6gb-performance-review-15.html
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
That's simply a chart of fps over ENTIRE SECONDS of time. That's not useful. What we need is millisecond-level resolution if you are going to make a graph like that, or alternatively frame times, because we are interested in how the cadence of frames changes. Better yet is to do like PCPer and look at exactly what a "frame" looks like.

In case you missed it the first time: frames per second is not a good way to measure real-life gaming experience. You can have high fps and yet still have a very stuttery gaming experience that feels unpleasant.

This is why almost all video card review sites suck when it comes to measuring real-life gaming experience, including AnandTech, because Ryan Smith spent his time updating some outdated GPU bench instead of making something competitive with PCPer and Tech Report. Only Tech Report, PCPer, and to a limited extent HardOCP (by reporting their subjective feelings about smoothness and not just fps) are worth reading when it comes to judging video card performance as it relates to actual gaming experience.

Right... I suppose that makes sense, because the graph still represents averages - they're just averages from second to second - and there can still be jerkiness within that interval, however small.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
This method of measuring frame latency is useless, since no one looking for good visuals would let their screen tear. And both GTX 680 SLI and HD 7970 CF tear without Vsync on in BF3.

So you don't like your screen to stutter, but you will let it tear for the sake of measuring visuals in an induced, awful scenario.

This is the same as that TechReport guy benching Skyrim without Vsync when it messes up the physics. Pure geniuses dealing with this issue!