***Official Reviews Thread*** Nvidia GeForce GTX Titan - Launched Feb. 21, 2013


OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Nice retrospective when he sticks with the facts, but I disagree with some of the speculation:

  • low yields (GK110)
    No one has any idea whatsoever about the actual yields.
  • The smaller GF104 appears much later and is no match for AMD's products and can't be used to counter high-end products.
    GF104 practically single-handedly stops rampant Evergreen and remains the best-selling card of that generation.
  • At this point NV saw how important the GF104, GF114, GK104 die was due to the huge volume of sales they produced. A decision is made to give the mainstream chip more attention so it will be on the market sooner.
    Really? NV had no idea that mainstream is mighty important?
  • GK104/GK100/GK110 speculation
    It's entirely unknown what really happened. True, much of it supports the scenario in which GK104 was supposed to be the mainstream chip, but the 670 Ti is simply too close to the 680 to warrant putting GK100/110 exclusively in the 680.
  • GTX Titan: GK110 is expensive to produce
    Like hell it is. Probably ~$120, same as GF110. R&D is what's expensive.

This whole post shows you have no idea what you are talking about. R&D is expensive, but everyone except AMD pays the same per wafer whether that wafer has 100% good chips or 20% good chips (AMD's problems led to a renegotiation with GF to charge differently). There are thousands and thousands of GK110s that won't make the cut and will be labeled something else and sold.

AMD does the same thing. Every smart company does. Unless you really think all those Phenom X3s were not supposed to be X4s... hell, some even unlocked.

Unless you think Nvidia intentionally slow-played Oak Ridge just to fool a couple of tech-forum nerds.
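To put rough numbers on the yield point: with per-wafer pricing, the cost per good die scales inversely with yield. A back-of-envelope sketch (the wafer price and yields are made-up assumptions; only the ~561 mm² GK110 die size is a published figure):

```python
import math

WAFER_COST = 5000.0    # assumed 28nm 300mm wafer price, USD (illustrative)
DIE_AREA = 561.0       # GK110 die area in mm^2 (published figure)

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic estimate: wafer area / die area, minus an edge-loss
    # term for the partial dies lost around the rim.
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

dies = gross_dies_per_wafer(DIE_AREA)          # ~97 candidate dies per wafer
for yield_rate in (1.00, 0.50, 0.20):
    good = dies * yield_rate
    print(f"yield {yield_rate:4.0%}: {good:5.1f} good dies -> "
          f"${WAFER_COST / good:6.2f} per good die")
```

Under these assumptions the $120-per-die figure is plausible at middling yields, but the per-die cost swings hard with yield, which is why it matters. Selling partial dies under another label claws some of that back, which is exactly the Phenom X3 point.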
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
This whole post shows you have no idea what you are talking about. R&D is expensive, but everyone except AMD pays the same per wafer whether that wafer has 100% good chips or 20% good chips (AMD's problems led to a renegotiation with GF to charge differently). There are thousands and thousands of GK110s that won't make the cut and will be labeled something else and sold.

AMD does the same thing. Every smart company does. Unless you really think all those Phenom X3s were not supposed to be X4s... hell, some even unlocked.

Unless you think Nvidia intentionally slow-played Oak Ridge just to fool a couple of tech-forum nerds.

So show some facts if you supposedly know what you are talking about. A lot of his speculation may be just that, but you didn't disprove anything.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
nVidia is using dedicated DP units, so it costs no extra power over the GTX 680 as long as you don't activate the DP units and use them for processing.
That's the exact point I was making. Fermi's power draw was constantly punished for having those FP64 units since they couldn't be throttled like they can be on Titan.

Also, even after throttling them, you're still paying for them in die size and transistor budget. To me the ideal gaming Titan would be one with all of the FP64 units removed. Games don't need them at all.
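To put numbers on that trade-off: GK110 has dedicated FP64 units alongside the FP32 cores, and Titan ships with them clocked down by default (the full-rate double-precision mode has to be switched on in the driver, and it lowers clocks). A rough sketch from the published specs, simplified to base clock:

```python
# Theoretical throughput for GTX Titan (GK110), published specs:
# 2688 FP32 cores, 896 dedicated FP64 units, ~837 MHz base clock.
FP32_CORES = 2688
FP64_UNITS = 896
CLOCK_GHZ = 0.837

fp32 = FP32_CORES * 2 * CLOCK_GHZ        # 2 FLOPs/cycle per core (FMA)
fp64_full = FP64_UNITS * 2 * CLOCK_GHZ   # "double precision" mode enabled
fp64_default = fp64_full / 8             # default mode: FP64 units at 1/8 clock

print(f"FP32:           {fp32:7.0f} GFLOPS")
print(f"FP64 (enabled): {fp64_full:7.0f} GFLOPS  (~1/3 of FP32)")
print(f"FP64 (default): {fp64_default:7.0f} GFLOPS  (~1/24 of FP32)")
```

So throttled or not, those 896 FP64 units are sitting on the die either way, which is the die-size/transistor-budget complaint above.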
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
[Images: gtxtitan65.jpg, 14883039.png, 73533766.png — 3DMark benchmark charts for GTX Titan, including SLI scaling and 7970 CF / Ares II comparisons]
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Wow, check out the scaling of Titan SLI in 3DMark, but don't bother with the third card.

I don't understand the scaling of 7970 CF against the Ares II, though.
 
Last edited:

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Hmmm, I'm not yet confident this will be more powerful than my GTX 670 SLI setup, even when overclocked.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
What are you guys doing up this early anyways? I'm in the middle of a midnight shift.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
What are you guys doing up this early anyways? I'm in the middle of a midnight shift.

:) I work sporadically changing night shifts myself, but fortunately I'm in the middle of my weekend right now.

Looks good, almost twice as fast as a GTX 580. As an actual flagship it's their best since the 8800 GTX. Unfortunate about the absurd price, given that even with overclocking it's losing to the 690 and 680 SLI.

3DMark is generally not the best benchmark, but I think this card will be at its best in SLI configurations. Look at the 2- and 3-way numbers and how it's laying waste to 2x Ares and 3-way 670 & 680 SLI setups. Looks like you'll need to get two if you want an upgrade.
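For reference, "scaling" in these charts is just the multi-GPU score relative to ideal linear scaling. A quick sketch of the arithmetic (the scores are hypothetical placeholders, not the numbers from the charts above):

```python
# Scaling efficiency: fraction of ideal linear multi-GPU scaling achieved.
def scaling_efficiency(single_score, multi_score, num_gpus):
    return multi_score / (single_score * num_gpus)

single = 10000      # hypothetical single-Titan 3DMark score
two_way = 19000     # hypothetical 2-way SLI score
three_way = 25500   # hypothetical 3-way SLI score

print(f"2-way: {scaling_efficiency(single, two_way, 2):.0%}")    # 95%
print(f"3-way: {scaling_efficiency(single, three_way, 3):.0%}")  # 85%
```

That's why the third card looks like a poor buy even when the 2-way numbers impress: each extra GPU adds less than the one before it.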
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
:) I work sporadically changing night shifts myself, but fortunately I'm in the middle of my weekend right now.

Looks good, almost twice as fast as a GTX 580. As an actual flagship it's their best since the 8800 GTX. Unfortunate about the absurd price, given that even with overclocking it's losing to the 690 and 680 SLI.

Yeah, and that puts a dent in my plan to just buy one. I refuse to spend $1k+ for a performance decrease, so I might end up buying two, which will keep me dealing with the multi-card issues I was trying to escape.

Oh, the difficulties we face in this day and age.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
The all-encompassing HWC relative overall performance rating.

[Image: 7C19n39.jpg — HWC relative overall performance chart]


29% better than a 7970 GE for... 225% of the price!
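The arithmetic behind that quip, as a quick sketch (the 7970 GHz street price is back-solved from the quoted 225%, so treat it as an assumption):

```python
# Perf-per-dollar comparison, Titan vs 7970 GHz Edition.
# Prices: ~$1000 Titan; 7970 GHz back-solved from the 225% quote (~$445).
titan_perf, titan_price = 1.29, 1000.0   # 29% faster per the HWC rating
ghz_perf, ghz_price = 1.00, 445.0        # baseline card

print(f"price ratio:     {titan_price / ghz_price:.0%}")   # ~225%
print(f"perf per dollar: "
      f"{(titan_perf / titan_price) / (ghz_perf / ghz_price):.0%}"
      f" of the 7970 GHz Edition")                          # ~57%
```

In other words, under these assumptions the Titan delivers a bit over half the performance per dollar of the card it beats by 29%.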