He has a GTX 660 Ti, not quite a Titan card...
I doubt we will see a 30% increase with the GTX 780. No process shrink normally means only small performance increases.
Probably a 3 GB GTX 780 slotted between the GTX 680 and the Titan.
Yep, OCing the Titan is pretty pointless; once the card starts to heat up after ~10 minutes it basically settles at stock performance, regardless of what you OC it to.
It's sad when these myths keep getting repeated. Yeah, random posters know more than the reviewers who test these cards, who tell us the reviews take hours/days to run, and that they repeat their findings multiple times if necessary.
Review sites use different techniques: some use live gameplay, some use automated scripts of gameplay, some use scripted in-game benchmarks, and some use a mix of all three.
Nope, the cards never heated up. /sarcasm
In fact, the power testing and sound testing are all flawed as well! And in the case of AMD cards, they always use the wrong driver! :'( So another 10% has to be added for them! Always. /sarcasm
Proof matters more than going on a binge.
The warming phenomenon was discussed around launch. I haven't seen any Titan updates since.
Links? This is the second time I'll ask for proof, so instead of whining about myths, prove your point.
Read the Guru3D article. You can change the temp target now so it won't throttle based on temps. It will max the clock speed until that temperature is reached or you hit the TDP limit (or it crashes/artifacts).
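For anyone who wants the mechanism spelled out, the behavior described above amounts to roughly this (a minimal sketch of GPU Boost 2.0-style clock selection; the clock values, step size, and function name are illustrative assumptions, not NVIDIA's actual firmware logic):

```python
# Rough sketch of how a GPU Boost 2.0-style limiter picks a clock.
# Numbers and step size are illustrative, not NVIDIA's real values.

BASE_CLOCK_MHZ = 836   # advertised Titan base clock
MAX_BOOST_MHZ = 993    # highest boost bin
STEP_MHZ = 13          # one boost bin

def next_clock(current_mhz, temp_c, power_w,
               temp_target_c=80, tdp_limit_w=250):
    """Step the clock up while under both limits, down when over either."""
    if temp_c < temp_target_c and power_w < tdp_limit_w:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)  # room to boost
    return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)     # throttle toward base

# Raising temp_target_c (e.g. to 95) keeps the first branch true for longer,
# so the card holds its boost clock until the TDP limit kicks in instead.
```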
Strange that they would be making the Titan obsolete after only four months, assuming these cards are released in July. Paying $1000 compared to ~$500 for a 780 that should be nipping at its heels on performance? I'd be a little pissed if I had dropped $1k only for NVIDIA to drop a new product four months later.
A prime example of a straw man: asking me to prove to notty that notty's conspiracy theory is wrong.
Why should I? You're the one telling members here that the majority of reviewers got their own review results/conclusions wrong or compromised because of the conditions you're citing.
To check whether the GeForce GTX Titan is held back by the power and temperature targets, we carried out an additional test with the maximum possible power target (106 percent) and the maximum temperature target (95 degrees Celsius). These values appear as "GeForce GTX Titan (Max)" in the charts. In addition, we "warmed up" the graphics card for a few minutes before each test run so that the temperature rises to a realistic level and the GPU Boost clock speeds adjust accordingly. We do this because at lower temperatures the GTX Titan clocks higher and thus delivers better FPS values that can no longer be reproduced after a few minutes.
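Translated back into a procedure, their methodology is essentially the following (a hypothetical sketch; `measure_warmed_up` and the five-minute warm-up window are my placeholders, not ComputerBase's actual tooling):

```python
import time

WARMUP_MINUTES = 5  # "a few minutes", per the quoted methodology

def measure_warmed_up(run_benchmark_pass):
    """run_benchmark_pass: callable that plays one benchmark pass and returns avg FPS."""
    # Heat-soak the GPU first so GPU Boost settles at the clocks a real
    # gaming session would actually see.
    deadline = time.monotonic() + WARMUP_MINUTES * 60
    while time.monotonic() < deadline:
        run_benchmark_pass()   # results discarded; this is just warm-up
    # Only now record the score: cold-card FPS would be higher, but it
    # can't be reproduced after a few minutes of play.
    return run_benchmark_pass()
```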
It's up to you to prove the Titan doesn't throttle down after warming up (anymore, if true). This was widely discussed and known after launch (I know you know about it).
Ok, you obviously have no proof or you would be demonstrating it and flaunting it. All your fake straw man BS is because you simply don't have a source. Way to go, just try to tarnish the guy who brings up a flaw in your beloved NV.
I don't even care, other than that the benchmarks get skewed unless the card is warmed up. They are not representative of stock performance unless you change settings yourself. This is my nitpick, but apparently it's seen as so anti-NV that some can't handle it.
http://translate.google.com/transla...fikkarten/2013/test-nvidia-geforce-gtx-titan/
Note that their charts show two versions; the warmed-up one is lower.
They might have alleviated it by raising the power target to 106% and the temp target to 95 °C? Can anybody find an English source that confirms this without Google Translate?
There is no way these are 7xx-series desktop cards. They would be insane not to launch around the time the new consoles are released (late fall); all the upgrade frenzy around new console hardware is something they will want to capitalize on. This is probably a $750 Titan little brother.