***Official Reviews Thread*** Nvidia GeForce GTX Titan - Launched Feb. 21, 2013


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think you're off a bit on this one. The Titan isn't made for millionaires. The Titan is for hardcore PC lovers that have a decent amount of disposable income and like to get really nice stuff. When you think about it $1000 isn't a tremendous amount of money.

But who are these hardcore PC gamers who are willing to spend $1000 on a GPU but aren't already rocking GTX680 SLI / HD7970 GE CF / GTX690? That's my point. Chances are most of the hardcore PC gamers you describe are either 2560x1600 users or triple-monitor users. Those guys are already running $800-1000 worth of GPUs. For them to upgrade to Titan legitimately means spending $2000, not $1000. OK, fine, they can recoup some of the cost of their existing GPUs, but the upgrade cost will still be more than $1000. How many people will deem it worthwhile to upgrade from GTX680 SLI OC / HD7970 CF OC to 2 Titans? I am guessing not that many. That leaves us with people who don't follow PC hardware at all and might buy a $5000 PC gaming system from Falcon Northwest every 4-5 years. Why did NV push so hard for Tri-SLI testing in reviews? Why did NV talk about expensive boutique PC builders as their partners for Titan? This doesn't sound at all like a typical GPU launch. It sounds to me like NV is targeting the wealthiest PC gamers, not just the hardcore ones.

"4) Buy a GeForce GTX Titan if you have a trio of 1920x1080/2560x1440/2560x1600 screens and fully intend to use two or three cards in SLI. In the most demanding titles, two GK110s scale much more linearly than four GK104s (dual GeForce GTX 690s). Three Titan cards are just Ludicrous Gibs! Gaming at 5760x1200 is sort of the jumping-off point where one GeForce GTX 690 starts to look questionable. Of course, then you’re talking about $2,000 worth of graphics hardware to either go four-way GK104s or two-way GK110s. To me, the choice is easy: a pair of GeForce GTX Titans is more elegant, lower-power, and set up to accept a third card down the road, should you hit the lottery." ~ Source

See, it's not really a $1000 upgrade for most high-end PC gamers on our boards, it's more!

He's just totally ignoring the issue like the AMD PR guy did with Titan. They must have gotten the same memo! :hmm:

Are you saying Ryan Smith is employed by AMD? :$
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Here we go again... attack the messenger.

That shot is with Vsync!

http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Battlef-0
[Image: bf3settings.jpg]


Seriously...
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Everyone can appreciate smoother gameplay. Smoothness is not the same as visuals. Some people are so dense that they equate visuals with playability. And they think Vsync is a cure-all when it is NOT. Frame times still matter.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Playing with Vsync doesn't resolve the problem. Either you play with 60-30-60-30 fps steps, or you use adaptive Vsync, which would show the problem too.

But because AFR increases input lag, you could play with triple buffering on one card instead of Vsync with AFR.
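To show what I mean by the 60-30-60-30 steps, here is a quick Python sketch (my own toy example, not from any review): with double-buffered Vsync on a 60 Hz panel, any frame that misses the ~16.7 ms deadline has to wait for the next refresh.

[CODE=python]
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def vsynced_fps(render_ms):
    """Effective frame rate when every frame must wait for a refresh boundary."""
    refreshes_waited = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / refreshes_waited

for ms in (10.0, 16.0, 18.0, 25.0, 34.0):
    print(f"{ms:5.1f} ms render time -> {vsynced_fps(ms):.0f} fps on screen")
# 16 ms still gives 60 fps; 18 ms drops straight to 30 fps. That's why frame
# times matter even with Vsync on: the steps hide the variance, they don't fix it.
[/CODE]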
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Since we're being told that frametimes are the true, flawless comparison of a card's performance, anyone care to explain this?

[Image: bf3-25x14-per_0.png]


This also matches results at Tom's Hardware showing SLI with better frametimes. That just isn't possible.

Oh but wait. Frametimes are the ultimate method to compare video cards now, I forgot. I'm sorry for questioning it. I guess single cards apparently now have more microstutter than SLI in specific games, note to self: never buy a single card again. Duly noted.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The slithers are the result of tearing because vsync is off. You see them on the nVidia screens, too.

This is the same graph with data gathered from our method that omits RUNT frames that only represent pixels under a certain threshold (to be discussed later). Removing the tiny slivers gives us a "perceived frame rate" that differs quite a bit - CrossFire doesn't look faster than a single card.

I'm defining slither as a runt frame!
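Here is a toy example of what omitting the runts does to the reported frame rate. The scanline numbers and the 21-line cutoff are made up by me purely for illustration; PCPer says its real threshold is "to be discussed later".

[CODE=python]
RUNT_THRESHOLD = 21  # scanlines; placeholder cutoff for illustration only

def fps_counts(frame_scanlines):
    """Frames counted the FRAPS way vs. after dropping runt frames, for one captured second."""
    visible = [h for h in frame_scanlines if h >= RUNT_THRESHOLD]
    return len(frame_scanlines), len(visible)

# Hypothetical second of CrossFire capture: every other frame is only a sliver tall.
capture = [520, 8, 515, 6, 530, 9, 510, 7] * 10  # 80 frames, half of them runts
counted, perceived = fps_counts(capture)
print(f"counted: {counted} fps   perceived: {perceived} fps")  # 80 vs 40
[/CODE]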
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
Hopefully the CrossFire thing is addressed soon by a driver; it obviously has to do with the game being tested.
That aside, 7970 CF remains much more powerful than a single Titan in every metric but power consumption, and it just demolishes the turd in performance/price.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Since we're being told that frametimes are the true, flawless comparison of a card's performance, anyone care to explain this?

[Image: bf3-25x14-per_0.png]


This also matches results at Tom's Hardware showing SLI with better frametimes. That just isn't possible.

Oh but wait. Frametimes are the ultimate method to compare video cards now, I forgot. I'm sorry for questioning it.

Yeah, didn't you get the memo? Whatever method furthers agendas is the one the fanboys latch onto.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I'm defining slither as a runt frame!

How do you explain frametime inconsistencies where SLI actually posts better 99th percentile frame times than single cards? Is this the true video card comparison method that we need?

[Image: border-1920-latency.png]


So the 690 has better 99% frametime than a GTX 680. Yeah ok.

Lower is better. Yeah ok. Did anyone bother to question this stuff? I'm just seeing tons of frametime benchmarks where stuff that shouldn't be happening, is. I'll give you time to get back to me on this.
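For anyone following along, this is all the 99th percentile number is supposed to measure (a quick sketch of my own, not Tech Report's or PCPer's actual tooling): sort every frame time in the run and take the value that 99% of frames come in under. Lower is better, and a card with a fine average but occasional long stalls gets punished.

[CODE=python]
def percentile_frame_time(frame_times_ms, pct=99.0):
    """Frame time (ms) that pct% of frames come in under -- lower is better."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

steady = [17.0] * 99 + [18.0]        # ~59 fps average, no spikes
spiky  = [12.0] * 95 + [55.0] * 5    # higher average fps, but ugly stalls
print(percentile_frame_time(steady))  # 18.0
print(percentile_frame_time(spiky))   # 55.0
[/CODE]

Whether the metering on the 690 really keeps its frame-to-frame deltas that tight is exactly the thing I'd like someone to verify.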
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Evidently the concept is too much for some to grasp.
They don't realize when they are cherry-picking, and cherry-picking low results (good) across the board. They just see the graphs, and the directions that point to what's better, higher or lower.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
They don't realize when they are cherry-picking, and cherry-picking low results (good) across the board. They just see the graphs, and the directions that point to what's better, higher or lower.

Oh, okay. If you say so. I've already decided that SLI has less microstutter than single cards do; the benchmarks at PCPer and Tom's Hardware prove it. I made the right choice with my SLI Lightning 680s; hope you're in a similar boat with SLI. Lucky me, I guess.

I checked about five pages of Tom's Hardware results and, without exception, the GTX 690 has less microstutter than a single GTX 680. Good stuff, I tell ya, good stuff.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
How do you explain frametime inconsistencies where SLI actually posts better 99th percentile frame times than single cards? Is this the true video card comparison method that we need?

[Image: border-1920-latency.png]


So the 690 has better 99% frametime than a GTX 680. Yeah ok.

Lower is better. Yeah ok. Did anyone bother to question this stuff? I'm just seeing tons of frametime benchmarks where stuff that shouldn't be happening, is. I'll give you time to get back to me on this.

The GTX 690 has hardware frame metering, from my understanding, but when one raised the complexity with multi-monitor resolutions in that specific game, the single GPUs did shine a bit more.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
The GTX 690 has hardware frame metering, from my understanding, but when one raised the complexity with multi-monitor resolutions in that specific game, the single GPUs did shine a bit more.

PCPer is using GTX 680 SLI with the same figures, with SLI throwing better frame latencies than single cards.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
PCPer is using GTX 680 SLI with the same figures, with SLI throwing better frame latencies than single cards.

Which is entirely possible; as seen with AMD, a single card doesn't automatically mean better.

Nvidia has used some scaling to increase smoothing. It doesn't always work, though; sometimes you'll see the 690 throwing frame times as ugly as AMD's single cards.


Anyways, on to more important things than AMD vs Nvidia...


[Screenshots: 20990-single.jpg, Single-3dmarkX.jpg, single-rig-2-635x476.jpg, 610x.jpg]


http://wccftech.com/kingpin-achieves-world-record-quadway-geforce-gtx-titan-air-hits-1750-mhz-ln2/
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
20nm cards will likely have pretty bad price/perf increases just like the first batch of 28nm cards, thanks to Apple and co. eating up so much TSMC 20nm capacity.

On the other hand, do we HAVE to upgrade? Have you seen the specs on the PS4? I don't think anyone with a 79xx or 680/70 needs to upgrade anytime soon. Consolification has stunted PC game development for years, and I don't think Crysis 3 can reverse that trend all by itself.

:thumbsup:

That's my point. Why spend $1-2 grand on the Titans when most 2013 games are not next gen yet? If you are rocking multi-monitors and are hardcore enough, you are already running GTX680 SLI Lightnings or faster OCed. Where is Metro LL? BF4 only in Q4 '13? GTA V - not even a release date for PC. By the time we get next gen games like Witcher 3, we'll be on Maxwell/Volcanic Islands. This card seems like it launched at the wrong price and at the wrong time. If it came out last year at $1K, it would have allowed people to have the fastest GPU for 2+ years.

The specs for the PS4 are not too bad. 8GB of unified GDDR5 and 1.84 TFLOPS (~HD7850) is a lot better than the HD6670/7670 1GB we heard about 1.5 years ago. Developers also get access to the metal of the hardware on consoles. The Titan will be obsolete way before the PS4 is. The type of games the PS4 will belt out by 2018-2019 will exceed Crysis 3 graphics on the Titan, no doubt. By then the $1000 Titan will be a $100 videocard.
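Quick back-of-the-envelope on that 1.84 TFLOPS ~ HD7850 comparison (plain shaders x 2 FLOPs per clock x clock; the clocks are the commonly reported figures, so treat it as approximate):

[CODE=python]
def tflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak single-precision throughput in TFLOPS."""
    return shaders * flops_per_clock * clock_mhz * 1e6 / 1e12

print(f"PS4 GPU (1152 SPs @ 800 MHz): {tflops(1152, 800):.2f} TFLOPS")   # ~1.84
print(f"HD 7850 (1024 SPs @ 860 MHz): {tflops(1024, 860):.2f} TFLOPS")   # ~1.76
[/CODE]

So on raw compute the PS4 really does land in HD7850 territory.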

Anyways, on to more important things than AMD vs Nvidia...

Nice scores, but it still can't max out Crysis 3. :awe:
 
Feb 19, 2009
10,457
10
76
Which is entirely possible; as seen with AMD, a single card doesn't automatically mean better.

Nvidia has used some scaling to increase smoothing. It doesn't always work, though; sometimes you'll see the 690 throwing frame times as ugly as AMD's single cards.

So it's all a myth that there's more MS with SLI than single GPU, all these years??
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
PCPer is using GTX 680 SLI with the same figures, with SLI throwing better frame latencies than single cards.

Indeed, I just read their review. Interesting -- we need more investigation and more sites reporting similar findings. It could possibly bode well for SLI.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
So it's all a myth that there's more MS with SLI than single GPU, all these years??

No, Nvidia is using software and hardware to smooth frame delivery in SLI with Kepler.

AMD is still brute forcing it; they have great scaling, but their frame times are... well, awful.
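For anyone wondering what "smoothing frame delivery" even means, here's the general idea in a toy Python sketch (my illustration of the concept, not Nvidia's actual algorithm, and the numbers are made up): instead of presenting each AFR frame the instant a GPU finishes it, hold it briefly so the on-screen intervals come out even.

[CODE=python]
def meter(finish_times_ms, target_interval_ms):
    """Delay early frames so presentation never runs ahead of an even cadence."""
    presented = []
    next_slot = finish_times_ms[0]
    for t in finish_times_ms:
        present = max(t, next_slot)          # hold the frame if it finished early
        presented.append(present)
        next_slot = present + target_interval_ms
    return presented

# Classic AFR micro-stutter: the two GPUs finish frames almost back-to-back.
finishes = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0, 99.0]
paced = meter(finishes, target_interval_ms=16.5)
print([round(b - a, 1) for a, b in zip(finishes, finishes[1:])])  # [5.0, 28.0, 5.0, 28.0, 5.0, 28.0]
print([round(b - a, 1) for a, b in zip(paced, paced[1:])])        # all 16.5 -- even delivery
[/CODE]

The trade-off, as mentioned earlier in the thread, is that any holding of frames adds a bit of input lag on top of what AFR already costs you.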