GeForce Titan coming end of February

Status
Not open for further replies.

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Lol Lon, I agree with what you are saying :p, just taking the piss out of you and Keys. Lighten up.

If I were mad I would tell you ;)
And I hate AF for not letting me use æ, ø and å (Danish vowels)... I see "lon" as "ION"... not Løn... enough offtopic ;)
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
According to phk (you all know who he is, right?) earlier drivers might not have shown the full potential of Titan. Remember the 680 where reported core clock was in the 7xx range?

Rough translation:
Regarding TT's performance, everyone is anxious to see the preliminary test data, and we suspect the new drivers will force a retest. We'll have to queue up for it! We editors will have to work overtime. The 7970 would have to exceed 2GHz to beat TT.
http://bbs.expreview.com/thread-55910-1-1.html

TT=Titan btw.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
According to phk (you all know who he is, right?)

He's the best
leaker out there!!!


(*^___^*)
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I petition AT to get emoticons similar to those on the Chinese electronics forums!

*that would get so obnoxious*
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I thought it was pretty simple. That was an anticipation of arguments AMD fans will pose against Titan after launch.
Titan will be a big hungry die, and folks like you might like to exploit that fact.
I can understand your antagonistic response above because the wind has already been taken out of your sails and that can be frustrating.
Pretty simple, no?

Way to go being on the defensive already. Who here is even attacking the Titan's power consumption or size? Interesting how AMD's desktop line-up offers superior price/performance and game bundles at most price levels, and instead of lowering prices or refreshing GTX600 desktop cards, NV's marketing team is trying to hype up Kepler again by making us believe a $900 videocard directly competes with a $380-430 one. :$ What wind is being taken out of AMD exactly? I am sure AMD cares about losing the 0.3% of high-end PC gamers who will buy the Titan...

There is no incentive whatsoever for AMD to make a 520mm2 die to go against the Titan at $900, since the cost of designing such a chip specifically for the gaming market would never be worthwhile at current 28nm wafer prices (nor is it feasible given the 7970GHz's existing power consumption). The Kepler architecture is simply better on a performance/watt basis.

NV, on the other hand, has the luxury of successfully marketing/selling GK110 in very profitable $3-5K+ Tesla/Quadro markets. Releasing some of those chips that perhaps didn't make the grade, had broken DP transistors, or are excess GK110 capacity as gaming cards makes a lot more sense. It's an effective brand equity building/marketing exercise. For NV this move makes perfect sense; for AMD to counter on 28nm makes no sense at all financially. The payoff just isn't there.

Personally I don't think AMD cares about the Titan's 10K launch when their 7950-7970 cards are selling pretty well.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yeah, I have to snicker anytime someone mentions that Titan will radically shift the market. Let's not kid ourselves: 10,000 limited edition cards priced at $1000? Is that a joke? Hardly anyone is going to buy it except the most dedicated PC gamers with $3000 rigs. That is probably .0001% of the consumer base. I'm thinking about it, but I'm also not the average consumer who only spends $200-300 on a GPU.

I'm not saying this to detract from nvidia's achievement; obviously the Titan is an awesome beast of a GPU. I definitely wouldn't say no to purchasing one, but I also don't believe the GPU landscape will change much unless nvidia churns out more products at lower price points. Now if Titan were priced at $500, AMD would have something to be concerned about; that obviously isn't the case.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I definitely wouldn't say no to purchasing one, however I also don't believe the GPU landscape will change much unless nvidia churns out more products at lower price points -- Now if Titan were priced at 500$ AMD would have something to be concerned about, that obviously isn't the case.

People also seem not to understand where AMD is losing the most market share. It's the mobile dGPU space, not so much the desktop. In Q3 2012, AMD dropped 2% in desktop, but a mind-boggling 17% in notebooks. NV's excellent execution on the mobile Kepler side involved hundreds of design wins with OEMs, and a lot of those sales came from low-end to mid-range chips like the GT610-660M, not $500-1000 GPUs. This is why even AT's article on the HD7000 2013 roadmap mentioned that AMD's focus is on the mobile side first and foremost. Their HD7000M launch & execution were terrible.

People keep correlating AMD's overall market share declines with desktop dGPU declines. It's not even remotely the same thing. When the GTX690 ruled the market for most of last year, it changed little about HD7000 vs. GTX600 sales. What difference will the Titan make for people buying $80-500 GPUs, even if 100,000 Titans fully replace the 690 series?
 
Last edited:

dangerman1337

Senior member
Sep 16, 2010
441
77
91
I was wondering: since the Titan doesn't seem to be part of the 600 series or a possible 700 series, does anyone think "Titan" will be a new high-end brand, something similar, or a one-off?
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
An example is here:
http://www.youtube.com/watch?v=zOtre2f4qZs&feature=player_embedded

Though not in the same league as a Titan vs. GTX690/dual GTX680 comparison would be.

AFR has been and will always be one big pile of poop.

"downgrading" is a funny word to use as a description of getting rid of microstutter, profiles and other multi-GPU quirks...

Big pile of poop? Big words there.

I've been using dual GPU systems since the 4870X2 came out and never regretted it (Voodoo 2 SLI does not count, I guess).

I do most gaming on one of my cards, BUT there are times when one GPU is just not enough to keep a solid 60fps.

I have many examples where one card was not enough for my games, and when I enabled the second one, everything was smooth as butter. Bad Company 2, Crysis, Warhead, Crysis 2, Witcher 2, Risen 2, Alan Wake, Max Payne 3, to name just a few that popped into my mind. All of these turned from a 35-50fps lag fest into solid, buttery smooth 60fps with vsync enabled.

I honestly don't understand where this sudden hatred towards dual GPU comes from. Dual GPU was invented to increase performance, and it did. Please don't try to tell us we've been crazy all this time, because we saw an improvement that was night and day in many cases, in favor of dual GPU mode of course.

As for the linked video: yes, I know the framerate gets too jerky with dual GPU if it drops too low, but that's why we get two GPUs in the first place, so the framerate won't drop to such low levels.

I have a very recent example in Crysis 3, which was running at 30fps on my single GTX 570 and jumped to 60fps in SLI mode. It was night and day. No stutter or anything. Even my 5850 CFX system ran decently at around 45fps in dual GPU mode and was a lag fest in single GPU mode. Are you honestly going to tell me that 23fps is better than 45fps AFR? Let's be sensible please.

What all this has to do with the Titan thread is that no matter how wondrous this card turns out to be, it will be in the same performance ballpark as a 7950 CFX or 670 SLI solution, which will cost considerably less (if the rumors of 900+ euros are correct).

What your "single GPU monster no matter the cost" preaching actually does is give Nvidia, or whatever corp, the incentive to demand that we bend so they can stick it to us. Hell no, thanks!

PS: Some people also forget that, quirks aside, dual GPU has clear advantages:

1) You have two cards instead of one, so if one card dies you are not left dead in the water. Not all shops around the world have an instant RMA procedure.

2) You can fine-tune your power consumption better, i.e. there's no need to power up a Titan in order to play Torchlight II. Using one GTX 670 of your SLI setup would essentially cut power draw in half. And please don't give me the "high end users don't give a crap about power draw" line. Why not save what can be saved?

3) You have two HSFs to dissipate the heat instead of one. No matter how efficient the Titan's HSF is, it will still have to provide all the cooling by itself, and that means higher fan speed and more noise! What, high end users don't mind noise? Says who?

4) When your two cards get old, they are easier to sell. You can sell one card to one individual and the other to another. Lower price segments have a higher chance of a sale.

5) Having two cards allows you to assign a non-gaming task to the second card while you game on the first. Do that on a single card, forcing it to flush/refresh its data/caches all the time, and see where it gets you.

6) For PhysX lovers, having two cards and assigning PhysX rendering to the second one provides a far better gaming experience. OK, that is kind of a moot point since you can get a cheaper card for PhysX, but still, with dual GPU it's there if you want it.

PS2: Pay a visit to my (now idle) YouTube channel if you want. I have a ton of camera-recorded gaming benchmarks there.
http://www.youtube.com/user/toutagamon
Search for SLI and Crossfire and let me know in which gaming benchmarks you see any stuttering. Please do.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

Multi-GPU is a gaming godsend that lets you have the performance now, instead of being forced to wait, sometimes years, for a new generation. However, frame-rate numbers alone don't tell the whole story, which is why hardware and software frame metering, adaptive V-sync and smooth V-sync are welcome too. Companies are trying to improve the experience and go beyond raw frame-rate, with the help of gamers, investigations, reviewers and articles.
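The point that frame-rate numbers alone don't tell the whole story can be shown with a toy sketch (not from the thread; the traces are made-up numbers for illustration): two frame-time sequences with the same average FPS, one evenly paced and one alternating short/long frames the way AFR microstutter does.

```python
# Toy illustration: why average FPS alone can hide AFR microstutter.
# Two hypothetical frame-time traces with the SAME average frame rate:
# one evenly paced, one alternating short/long frames.

def avg_fps(frame_times_ms):
    """Average frames per second over a trace of frame times (ms)."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_ms(frame_times_ms):
    """The slowest single frame -- closer to what the eye notices."""
    return max(frame_times_ms)

smooth = [22.2] * 8          # even pacing, ~45 fps
stutter = [10.0, 34.4] * 4   # alternating pacing, same ~45 fps average

print(round(avg_fps(smooth), 1), round(avg_fps(stutter), 1))  # both ~45.0
print(worst_frame_ms(smooth), worst_frame_ms(stutter))        # 22.2 vs 34.4
```

Both traces report ~45fps, but the second one spends half its frames at 34.4ms (under 30fps territory), which is why frame metering and frame-time measurements matter alongside the FPS counter.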
 