***Official Reviews Thread*** Nvidia Geforce GTX Titan - Launched Feb. 21, 2013

Page 21

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
It took me three 7970s to get a score like that.

I'm posting way too much, lol!

Anyways, did you see the voltage? I wonder if it's dynamic and went way up, or if he's actually running shipped voltage levels and riding high on the LN2? He's coming in about 250MHz short of his 680 score (at least the last one I saw).
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I agree, though I also think it'll be a while before people can optimize for the PS4 (and the next Xbox) as well as they've learned to optimize for the PS3 today. So we'll likely see ~7850 performance at first, and thus games optimized for ~7850 performance, which is significantly below that of the 79xx. So I don't know who needs more than their existing 79xx or GTX 680/670 for the next year at least, and perhaps longer, for playable high settings. Of course this forum is full of people who insist on 120 Hz monitors, 3x1080p resolution, or maxed-out settings including MSAA, but this forum is an outlier: most PC gamers aren't that hardcore and can happily game away on their 7850s for quite some time before next-gen console games' PC ports grow demanding enough that they'd want to upgrade.

Not to mention we're still in a recession (at least in my book, even if not technically so). So Titan is going to have a hard time finding a market among gamers interested purely in its gaming performance.

:thumbsup:

That's my point. Why spend $1-2 grand on Titans when most 2013 games are not next gen yet? If you are rocking multi-monitors and are hardcore enough, you are already running GTX 680 SLI Lightnings or faster, OCed. Where is Metro LL? BF4 only in Q4 '13? GTA V doesn't even have a PC release date. By the time we get next-gen games like The Witcher 3, we'll be on Maxwell/Volcanic Islands. This card seems like it launched at the wrong price and at the wrong time. If it had come out last year at $1K, it would have let people have the fastest GPU for 2+ years.

The specs for PS4 are not too bad. 8GB of unified GDDR5 and 1.84 TFLOPS (~HD 7850) is a lot better than the HD 6670/7670 1GB we heard about 1.5 years ago. Developers can get access to the metal of the hardware in consoles. The Titan will be obsolete way before the PS4 is. The type of games the PS4 will belt out by 2018-2019 will exceed Crysis 3 graphics on the Titan, no doubt. By then the $1000 Titan will be a $100 video card.



Nice scores, but it still can't max out Crysis 3. :awe:
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Show me the IC responsible for that HARDWARE FRAME METERING.

Not dispositive, but...

http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11

"In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)"
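For anyone curious how that metering would behave, here's a toy sketch in Python (my own illustration of the idea as described, not Nvidia's actual hardware logic; the function name and numbers are made up):

```python
def meter_frames(arrival_times, smoothing=0.9):
    """Delay frames that arrive 'early' relative to a running average
    interval (exponential moving average), so presentation stays even."""
    presented = [arrival_times[0]]
    avg_interval = arrival_times[1] - arrival_times[0]
    for prev, cur in zip(arrival_times, arrival_times[1:]):
        avg_interval = smoothing * avg_interval + (1 - smoothing) * (cur - prev)
        # Don't flip to the new buffer sooner than one average interval
        # after the previous flip; late frames are shown immediately.
        presented.append(max(cur, presented[-1] + avg_interval))
    return presented

# Uneven multi-GPU output (ms): a 4 ms gap followed by a 28 ms gap
paced = meter_frames([0.0, 16.0, 20.0, 48.0])
```

The 20 ms frame arrives only 4 ms after the previous one and gets held back, so the presented intervals end up much closer to the ~16 ms average.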
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
We have a $1000 graphics card that can be almost matched by cards that can be had for half that and you talk about efficiency?

Wow, how slow are you? Did you read what I said? I'm not buying one, I'm not saying to buy one, I'm not saying it's priced effectively. I'm talking about the technology behind it. If that doesn't interest you, then just ignore it.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Thank goodness some people are actually trying to measure end-user experiences better, like those at TechReport and PCPer with high-speed cameras and digital capture cards. Ryan Smith are you listening?????
Not to come off as condescending, blastingcap, but have you tried emailing the AT staff? It's not a huge secret that they don't closely follow these forums. You'd be especially unlikely to be noticed 17 pages into a thread like this.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Perf/W is somewhat interesting, but that's offset a bit by a chunk of those 7.1 billion transistors being power-gated off when gaming. It's mainly the MSRP that has me concerned for 20nm GPU launch dates and pricing. A year+ out until a $600-$700 GTX 780/Radeon 8970?

Will the Titan drivers be signed off for professional software? If so, then even at $1000 it has merit as a home workstation (business and games) card.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Regardless of the price, it's an amazing technical achievement. The power draw of Titan is fantastic for its performance and die size.

I'd be willing to bet that the target market for Titan is not very concerned about the size of their power bills. So that's nice and all, but not exactly a selling point for that market. In the form of Tesla though, yes perf/watt is a big selling point for THAT target market.
 

raildogg

Lifer
Aug 24, 2004
12,892
572
126
Wow, how slow are you? Did you read what I said? I'm not buying one, I'm not saying to buy one, I'm not saying it's priced effectively. I'm talking about the technology behind it. If that doesn't interest you, then just ignore it.

I'm pretty slow, but I would not associate efficiency with this item in any way. I understand what you meant. That is my take on it and it does not carry any weight whatsoever. Sorry if I offended you in any way.

As to the card, I'm a little surprised that this GPU got so much coverage but now most people seem disappointed. Maybe too many rumors and expectations. Next time we should wait for the card to be released instead of arguing for countless days and hours over what might be. But this is what these tech companies want - attention. I'm sure Nvidia doesn't mind hundreds of threads and countless thousands of posts all across the internet over its products - even if they don't meet expectations. We consumers go crazy over such things and then wonder why companies release such products.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
wut?

[perfwatt_2560.gif: performance-per-watt chart]


@ blasting it's still interesting from a tech standpoint.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Not really, it's right in line with nvidia's other cards and is in the middle of the pack: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/28.html

Your own link, genius.

[perfwatt_2560.gif: performance-per-watt chart]


Titan is 550mm^2 and is more efficient in perf/watt than GK104 and Tahiti when all are under full load. The "all resolutions" graph, which you probably looked at before deciding to respond, isn't representative, since neither of those chips hits full load in the games tested.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Your own link, genius.



Titan is 550mm^2 and is more efficient in perf/watt than GK104 and Tahiti when all are under full load. The "all resolutions" graph, which you probably looked at before deciding to respond, isn't representative, since neither of those chips hits full load in the games tested.

That's an amazing feat if ya think about it!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I'd be willing to bet that the target market for Titan is not very concerned about the size of their power bills. So that's nice and all, but not exactly a selling point for that market. In the form of Tesla though, yes perf/watt is a big selling point for THAT target market.

The underlying point I'm trying to make is that Nvidia was able to improve on graphics rendering efficiency since releasing GK104/106/107, and to a noticeable degree, I might add. Whereas with Fermi, GF114 was the most efficient chip in terms of perf/watt ( http://www.techpowerup.com/reviews/MSI/GeForce_GTX_550_Ti_Cyclone_II/24.html ), this time around the extra months spent working on GK110 didn't go to waste. When has a chip so large ever been so efficient vs. its siblings and the competition?

Nvidia should absolutely be refreshing the other Kepler chips this year (if they aren't, they're stupid). I think Titan's perf/watt is indicative of improvements Nvidia can make to the rest of their chips.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Your own link, genius.



Titan is 550mm^2 and is more efficient in perf/watt than GK104 and Tahiti when all are under full load. The "all resolutions" graph, which you probably looked at before deciding to respond, isn't representative, since neither of those chips hits full load in the games tested.
So its efficiency only counts at resolutions that support your argument, rather than across everything? It must be amazing not to have to think when you post. Keep it up, "genius."
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
As to the card, I'm a little surprised that this GPU got so much coverage but now most people seem disappointed.

I think gamers like the card and its engineering overall but are disappointed in the pricing, which is very understandable.
 

raildogg

Lifer
Aug 24, 2004
12,892
572
126
When it comes to frames per second and power usage, how else would you classify efficiency?

From Yahoo's dictionary:

Efficiency:

1. The ratio of the effective or useful output to the total input in any system.
2. The ratio of the energy delivered by a machine to the energy supplied for its operation.

I don't want to discuss the technical aspects of this card but rather its general nature. Efficiency regarding purchases is relative so this is only my opinion.
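Applied to a video card, that first definition is just average frame rate over average board power. A quick sketch (all numbers are made up for illustration, not benchmark results):

```python
def perf_per_watt(avg_fps, avg_power_w):
    # "Useful output" (frames per second) divided by "total input" (watts)
    return avg_fps / avg_power_w

# Hypothetical cards: a big die drawing more power vs. a smaller one
big_die = perf_per_watt(60.0, 250.0)   # 0.24 fps/W
mid_die = perf_per_watt(45.0, 170.0)   # ~0.26 fps/W
```

By this measure, a faster card can still be the less efficient one if its power draw grows faster than its frame rate.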
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
They've power gated the DP units, so while it's interesting, I don't think it's worth spamming the thread with a large power efficiency graphic.

Will Titan be getting access to any of the Quadro driver features? I'd have to tentatively conclude no based on initial review coverage and nothing really popping up with internet searching.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Well, I predicted great perf/W, but I expected better than the 690.

But that seems far-fetched, because Titan is boosting close to a GHz (right?), and the 690s are binned to hell, so... no "told ya" from me :(
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Man, Nvidia is taking it on the chin over Titan's out-of-whack price and lower-than-expected performance. On xtremesystems they're even wondering what is up with the price. When many of Nvidia's usual most passionate supporters are crying foul over them releasing the most overpriced VGA of all time, you know they have crapped the bed.

The big question is just how fast and how far is the price of Titanic going to fall, should be interesting to see. Another gtx 280/260 rapid price drop is coming imo.

Yeah, so really, we don't want anyone buying them right now... do you really think they would shift the price, even with no competition?
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76


Well forget that then. Considering the cost, Titan doesn't offer enough over my 7970 for me to upgrade. Hopefully 2013 brings something better, as my 7970 is already the longest held video card I've ever had.

Wonder how long it will live with such a high OC?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Well, I predicted great perf/W, but I expected better than the 690.

But that seems far-fetched, because Titan is boosting close to a GHz (right?), and the 690s are binned to hell, so... no "told ya" from me :(

Drivers may mature and over time you may be proven correct!