***Official Reviews Thread*** Nvidia Geforce GTX Titan - Launched Feb. 21, 2013


Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Yep, cooler temps would help to reduce leakage (power draw), therefore allowing for yet higher clocks.. perhaps that's how Kingpin could do it, but I still think he was handed a special custom BIOS allowing for much more than 265W... not sure!
Yup, no doubt about that. It is not a limit set by the power circuitry but probably by the BIOS itself. One thing I don't understand is NV's new obsession with power draw. It should have come with two 8-pin connectors, a default TDP of 250W, and the option to raise it by 25-30%. That way guys who are going to run it under water will have an added incentive. I also want the option to turn off GPU Boost and the ability to control the clocks manually. It is an enthusiast product, so people buying them will prefer more control over their hardware.
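For scale, here's the arithmetic behind those numbers (the 250W default TDP and the 265W cap are figures from this thread; the 25-30% raise is the proposal above):

```python
# Power-target arithmetic for the figures discussed in the thread.
default_tdp = 250.0  # W, Titan's default TDP
stock_cap = 265.0    # W, the stock power-target ceiling

# Headroom the stock slider actually gives over the default TDP:
headroom_pct = (stock_cap / default_tdp - 1.0) * 100.0
print(f"stock headroom: {headroom_pct:.0f}%")  # 6%

# What the proposed 25-30% raise would permit instead:
for pct in (25, 30):
    print(f"+{pct}% -> {default_tdp * (1 + pct / 100):.1f} W")
```

So the stock limit is only a 6% bump, versus the 312-325W that a 25-30% slider would allow for watercooled cards.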
 

BoFox

Senior member
May 10, 2008
689
0
0
I'm not into LN2 cooling myself, unlike many of the XtremeSystems LN2 "purists", who were a bit arrogant in a way... I might just as well wipe myself with dollar bills to be like them! That's why I didn't look much into it. Well, yeah, I guess it certainly does look like EVGA sponsored it along with a custom BIOS, but then some chips can be overclocked like mad without increasing voltage too much, if they are cooled way below zero.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yup, no doubt about that. It is not a limit set by the power circuitry but probably by the BIOS itself. One thing I don't understand is NV's new obsession with power draw. It should have come with two 8-pin connectors, a default TDP of 250W, and the option to raise it by 25-30%. That way guys who are going to run it under water will have an added incentive. I also want the option to turn off GPU Boost and the ability to control the clocks manually. It is an enthusiast product, so people buying them will prefer more control over their hardware.

It's actually really more of a pro card than an enthusiast card imo. :\

$1000, 1/3-rate DP, w/ OCing, vs a $4000+ K20X
 

BoFox

Senior member
May 10, 2008
689
0
0
Yup, no doubt about that. It is not a limit set by the power circuitry but probably by the BIOS itself. One thing I don't understand is NV's new obsession with power draw. It should have come with two 8-pin connectors, a default TDP of 250W, and the option to raise it by 25-30%. That way guys who are going to run it under water will have an added incentive. I also want the option to turn off GPU Boost and the ability to control the clocks manually. It is an enthusiast product, so people buying them will prefer more control over their hardware.
NV just doesn't want exploding GTX 590s all over again, spreading like wildfire through the media!
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I always thought the faulty EVGA 570s that popped VRMs @ 900+ core and circulated through places like overclock.net were more damaging than the improper-driver 590 pops.

Still pretty lame; I enjoyed pulling 60% overclocks off my cards on water :(

I never killed any of them either...
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Most reviewers went with the default (afaik), which prioritizes the 80C target. From what I saw, the auto fan would only take fan speed to 50% (2xxx rpm) at 80C, at which point, once the card heated up (prior to that it was running factory max boost), it would start to downclock and undervolt to bring temps in line with 50% fan speed @ 80C.
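A toy model of that behavior (purely illustrative; the thermal coefficients below are invented and tuned so the sketch lands near the ~941MHz figure reviewers observed, not taken from NVIDIA's real control loop):

```python
# Illustrative sketch of a temperature-priority boost controller.
# NOT NVIDIA's actual algorithm: the coefficients are made up, chosen
# only so this toy model settles near the clocks reported in reviews.
TEMP_TARGET = 80.0   # C, the default priority target
FAN_CAP = 0.50       # auto fan tops out near 50% on the default profile
AMBIENT = 25.0       # C, assumed room temperature
BIN_MHZ = 13.0       # boost clocks move in 13 MHz bins

def steady_temp(clock_mhz: float, fan: float) -> float:
    # crude model: heat generated scales with clock, heat removed with fan
    return AMBIENT + 0.0795 * clock_mhz - 40.0 * fan

clock = 1019.0  # MHz, factory max boost while the card is still cold
while steady_temp(clock, FAN_CAP) > TEMP_TARGET:
    clock -= BIN_MHZ  # shed one bin and re-check the settled temperature

print(f"settled at {clock:.0f} MHz ({steady_temp(clock, FAN_CAP):.1f} C)")
```

With these made-up coefficients the loop settles at 941MHz, the same ballpark as the warmed-up clock mentioned later in the thread; the point is only that a fixed fan cap plus a temperature target forces the clock, not the fan, to give way.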

Nobody wants to talk about how quiet the card is, so I think we should just assume either the priority is power (the card will ramp the fan to stay cool, with power target > temp) or the user created a custom fan profile to keep boost w/ the 80C temp target.

Otherwise we'd have to start taking into account ambient, open bench vs. case, sample quality, and all sorts of other random variables while disregarding the controls (priority and fan speed).


Kingpin modded the PCB: he added more capacitors and most likely had a custom signed BIOS from EVGA. AFAIK users can get pretty much the same capability through a BIOS flash, but not the modding he did, unless they know what they're doing. He also uses a custom power-delivery add-on PCB with two 8-pins, but I didn't see it in the screens.

What are you trying to say? (this is a non-issue iyo, or it is, or what?)

In [h]ard's case the card hit 81C max, so most of the time it was likely just under that. The card is downclocking 1019->941, which reviewers are missing with their standard few-minute runs.

This is misleading: users are most likely not going to run the fan at 100%, so this would likely be the way the card is usually run, and thus the benchmarks are potentially skewed. If you use water etc. then this might not be an issue, but the 265W cap is still a brick wall (without voiding the warranty via modded BIOSes etc.).

If the cooling is incapable of providing a quiet way to keep the card at its boost clocks, I would say it's pretty weak, or the boost throttling is kicking in too early. Regardless, I would like to see benchmarks where the system is warmed up to see how this affects things; that's likely how end users will run it.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Yup, no doubt about that. It is not a limit set by the power circuitry but probably by the BIOS itself. One thing I don't understand is NV's new obsession with power draw. It should have come with two 8-pin connectors, a default TDP of 250W, and the option to raise it by 25-30%. That way guys who are going to run it under water will have an added incentive. I also want the option to turn off GPU Boost and the ability to control the clocks manually. It is an enthusiast product, so people buying them will prefer more control over their hardware.

I agree 100%.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
NV just doesn't want exploding GTX 590s all over again, spreading like wildfire through the media!
Titan is way better built than the 590, so it can take the punishment :biggrin:
I believe NV shouldn't have released the 590 at all and should have left it to Asus; they did a pretty good job with the Mars II, save the price of course :p
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
NV just doesn't want exploding GTX 590's all over again, spreading like a wildfire through the media!

I don't think that's it. They could easily build it so that doesn't happen. I think it's more a company decision not to allow it. While I agree that it should reduce RMAs from numpties, I think the main reason is to allow more control over granularity between models.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
What are you trying to say? (this is a non-issue iyo, or it is, or what?)

I'm saying it's not really an issue.

In [h]ard's case the card hit 81C max, so most of the time it was likely just under that. The card is downclocking 1019->941, which reviewers are missing with their standard few-minute runs.

80C is the priority target, so it's going to sit at 79-81C for its entire life unless you change that.

This is misleading: users are most likely not going to run the fan at 100%, so this would likely be the way the card is usually run, and thus the benchmarks are potentially skewed. If you use water etc. then this might not be an issue, but the 265W cap is still a brick wall (without voiding the warranty via modded BIOSes etc.).

lol woah there buddy, 100% fan speed? Did you miss the reviews? The card runs dang near silent; it's quieter than a reference 680. More like 55-65% to keep all these samples at max boost with an 80C thermal priority.

265W at stock is a non-issue; it's throttling from heat due to low fan speed, not power. :hmm:

[Image: power_average.gif]


If the cooling is incapable of providing a quiet way to keep the card at its boost clocks, I would say it's pretty weak, or the boost throttling is kicking in too early. Regardless, I would like to see benchmarks where the system is warmed up to see how this affects things; that's likely how end users will run it.

Let me know when you get proof of those claims; I'm pretty sure it could maintain boost with 680 levels of acoustics.

Did you not watch any of the video reviews? They demonstrate clearly how the card works.

[Image: fannoise_load.gif]
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
It's actually really more of a pro card than an enthusiast card imo. :\

$1000, 1/3-rate DP, w/ OCing, vs a $4000+ K20X

Actually Balla, organizations never pay MSRP when buying high-end Quadro or Tesla cards. They buy pre-configured OEM systems from HP, Lenovo, etc., who offer a sizable discount; e.g., we paid $2,500 apiece for a Quadro 6000 SLI system (HP Z800), while at retail it costs around $3,500. Also, if the apps are not optimized, it will be a huge waste.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Actually Balla, organizations never pay MSRP when buying high-end Quadro or Tesla cards. They buy pre-configured OEM systems from HP, Lenovo, etc., who offer a sizable discount; e.g., we paid $2,500 apiece for a Quadro 6000 SLI system (HP Z800), while at retail it costs around $3,500. Also, if the apps are not optimized, it will be a huge waste.

I'm aware, Jay, just like I'm aware "organizations" would never buy a Titan, but other lesser companies or self-employed people might :biggrin:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Actually Balla, organizations never pay MSRP when buying high-end Quadro or Tesla cards. They buy pre-configured OEM systems from HP, Lenovo, etc., who offer a sizable discount; e.g., we paid $2,500 apiece for a Quadro 6000 SLI system (HP Z800), while at retail it costs around $3,500. Also, if the apps are not optimized, it will be a huge waste.

What!?!?!?! Are you saying Oak Ridge National Laboratory didn't pay $3500 each for the 18K+ Tesla K20X's in the Titan supercomputer? Next you're going to tell us that AMD didn't pay retail for the Never Settle gaming bundles. :D

I'm being facetious, of course. I'm just often amused by the armchair financial experts who presume otherwise. I wouldn't be surprised if we aren't paying more for Titans than ORNL paid for its K20X's.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I'm saying it's not really an issue.

80C is the priority target, so it's going to sit at 79-81C for its entire life unless you change that.

lol woah there buddy, 100% fan speed? Did you miss the reviews? The card runs dang near silent; it's quieter than a reference 680. More like 55-65% to keep all these samples at max boost with an 80C thermal priority.

265W at stock is a non-issue; it's throttling from heat due to low fan speed, not power. :hmm:


Let me know when you get proof of those claims; I'm pretty sure it could maintain boost with 680 levels of acoustics.

Did you not watch any of the video reviews? They demonstrate clearly how the card works.

The 100% fan figure was only a comment about what could be needed to avoid those pretty big drops in clock speed (1019->941). I know the reviews don't use 100%; if they did, it probably wouldn't have dropped so much. I only meant you need to crank up the fan to avoid the aggressive boost drops. The card runs hot; it's hitting 80C, and if you want it cooler to maintain higher clock speeds you need to crank up the fan, period.

I don't think it's dropping from hitting 265W yet; that was a side comment about the overvoltage gimping.

Regardless, if the card is dropping ~8% in speed in [H]'s test when it warms up with the default fan profile (which tries to keep it reasonably quiet), you either have to crank up the fan or suffer the 'boost' drops. Cranking up the 690's fan is loud, and this is essentially the same thing. Of course someone could test where you need to set the fan to avoid the drops; maybe it's somewhere in between, and the noise could be measured. Whatever the case, the original point is that the benches may be skewed for the average user who doesn't tinker with fan profiles, and it's unknown how loud the card is if you turn the fan up until boost quits dropping.
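For what it's worth, that ~8% figure follows directly from the clocks already quoted in this thread (1019MHz factory max boost vs. the 941MHz [H] saw once warm):

```python
# Quantifying the warm-up throttle using the clocks cited in the thread.
max_boost = 1019.0  # MHz, factory max boost when cold
warm_clock = 941.0  # MHz, settled clock in [H]'s warmed-up run

drop_pct = (max_boost - warm_clock) / max_boost * 100.0
print(f"sustained-clock penalty: {drop_pct:.1f}%")  # ~7.7%
```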

More investigation is needed imo.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Well, if you'd use words properly without over-dramatizing things to further an agenda, we wouldn't even be having this conversation.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
What he means is that equal zeal should be applied... somewhat reminiscent of your fervor for frame-latency investigation with Tahiti.
Leave no stone unturned, Internet-wide posting on tech forums, vague hints about coverups and shills, etc. etc. ... you know the drill.
Perhaps Lonbjerg could lead a commission of inquiry... :thumbsup:
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I think the above is taking [H]'s comments out of context.
In the various conclusion paragraphs, the Titan is called the ultimate SFF card a couple of times.
Looking at the TITAN as a Small Form Factor Gaming GPU


Simply put, the GeForce GTX TITAN is the ultimate small form factor gaming video card.
The TITAN allows system integrators to provide the fastest single-GPU performance in existence, with less of a power demand, less heat output, in a shorter profile compared to the GeForce GTX 690 video card. We specifically tested gameplay performance at a more popular 1920x1080 (1080p) display size to see how that level of performance would affect SFF PC users.
What we found was simply amazing, every single game we tested was able to run at its absolute highest in-game settings, including AA. Even Far Cry 3, we were able to run at 8X MSAA with the highest in-game settings, maximizing the game's quality settings, at 1080p. It also allowed us to play at 8X MSAA with the highest in-game settings in Hitman, which is very sensitive to high AA levels. In addition we were able to play with "Extreme AA" and the highest settings in Sleeping Dogs. With BF3 Multiplayer we were able to play at 1080p with 4X MSAA and the highest in-game settings.

This amazing ability that with a single-GPU we are able to maximize every game we throw at it at 1080p is great for gamers. It means small form factor gamers don't have to suffer a crappy gameplay experience. SFF PC gamers can now have the absolute best performance, and high-end specifications in a GPU with 6GB of RAM on board.
Boost 2.0
The improvements in GPU Boost 2.0 allow for overvoltage and improved overclocking and GPU Boost ability. The ability to not only set power target, but also now temperature target gives the enthusiast a lot of different options to try to get the most clock speed out of their GPU. It is a tweaker’s dream.
Kyle:
Titan’s elegant thermal solution will not exhaust heat into a chassis, like GTX 690. Titan’s new GPU Boost II system will allow system integrators to put together much more complex performance presets that are directly predicated on GPU temperatures and how fans ramp under load. This also allows system integrators to put this monster of a video card into some very small footprint systems that are just not doable with GTX 680 SLI and GTX 690
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
What he means is that equal zeal should be applied... somewhat reminiscent of your fervor for frame-latency investigation with Tahiti.
Leave no stone unturned, Internet-wide posting on tech forums, vague hints about coverups and shills, etc. etc. ... you know the drill.
Perhaps Lonbjerg could lead a commission of inquiry... :thumbsup:

Into what?

80C priority with 38dbA max fan speed?

I wonder what mysteries we'll unravel; this GPU Boost 2.0 is a real doozy!

https://www.youtube.com/watch?v=EUd0VvlhY7k
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What he means is that equal zeal should be applied... somewhat reminiscent of your fervor for frame-latency investigation with Tahiti.
Leave no stone unturned, Internet-wide posting on tech forums, vague hints about coverups and shills, etc. etc. ... you know the drill.
Perhaps Lonbjerg could lead a commission of inquiry... :thumbsup:

I only care for performance, no need to involve me in your red herring...:thumbsdown:

(And by "performance" I mean the combination of FPS, frametimes, I.Q. and features that gives me the best experience with the least/no drawbacks.

Hence why you don't see me caring for perf/$, perf/watt, dB, Sone, multi-GPU, price or gamebundles.)
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I'm aware, Jay, just like I'm aware "organizations" would never buy a Titan, but other lesser companies or self-employed people might :biggrin:

So far I haven't seen anything about Titan getting the workstation driver features. If it does end up getting access to them, it would be a great personal workstation card, and its MSRP would be reasonable for that purpose.
 

AdamK47

Lifer
Oct 9, 1999
15,846
3,638
136
The power target increase was probably a BIOS mod. I know it can be done on the 690 to get 150%.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Quick someone call the police. We need to get to the bottom of this coverup.

Fastest GPU is perhaps NOT the quietest after all :eek:


113% power target WOOT

[Image: 20990%20single.jpg]

Oh come on, fisherman. You can argue against this, but using a Kingpin screenshot? You do understand that he had extra VRMs soldered onto his card, ran it at sub-zero temperatures under LN2, and had a custom BIOS provided to him by EVGA? That BIOS is not available to any consumer, and it would be risky even for HWBOT users with LN2.

This is not what a normal card does. You know this. My favorite part of your screenshot is the 0C temperature at 1752MHz, a 100% overclock. Hmm.

You can make an opposing argument, and that's great, but don't use Kingpin as your example. I'm sure there's a middle ground somewhere, with one extreme being mentioned here and the other extreme being Kingpin. You took the argument from one extreme to the other. Kingpin is absolutely not representative of what a normal person does, unless the normal person uses LN2 / sub-zero temperatures with a blowtorch every 5 minutes to prevent the cold-boot bug. Come on.
 