GeForce Titan coming end of February


OCGuy

Lifer
Jul 12, 2000
27,227
36
91
given the preliminary specs (85% of 690, 6GB, 384-bit).

in comparison to quad 7970 cf (104% of 690, 3GB, 384-bit).

will take two - if the price is $600 apiece.

I promise you that two GPUs will scale better than four.


See, even that, which should be common knowledge, is impossible to know, but I said it anyway! Everyone states a position on theoretical pricing and performance, and then they have to continue to defend that position even if it ends up being different when the actual reviews come out.

It was getting boring around here, and now it is fun again.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
given the preliminary specs (85% of 690, 6GB, 384-bit).

in comparison to quad 7970 cf (104% of 690, 3GB, 384-bit).

will take two - if the price is $600 apiece.

Don't you mean 7970CF? Not quad CF.

Besides, 85% of a 690 ≈ 50% faster than a 680. It just sounds better saying 85% of a 690. An O/C'd 7970 is ≈35% faster than a 680. That's not a lot of extra performance for people who already have a 7970, especially if it's voltage locked again like first-gen Kepler is. Certainly not for twice the money.
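
As a back-of-the-envelope check of the math above (a minimal sketch; the 690-vs-680 factor is an assumed ~1.8x dual-GPU scaling, not a figure from the thread):

```python
# Rough reconstruction of 3DVagabond's comparison. The 1.8x factor for a
# GTX 690 over a GTX 680 is an assumption about typical SLI scaling.
gtx690_vs_680 = 1.8      # assumed: GTX 690 ~ 1.8x a single GTX 680
titan_vs_690 = 0.85      # rumored spec: Titan at 85% of a 690

titan_vs_680 = titan_vs_690 * gtx690_vs_680
print(f"Titan vs 680: {titan_vs_680:.2f}x")        # ~1.53x, i.e. ~50% faster

oc_7970_vs_680 = 1.35    # the post's estimate for an overclocked 7970
print(f"Titan vs O/C'd 7970: {titan_vs_680 / oc_7970_vs_680:.2f}x")  # ~1.13x
```

So under those assumptions, an owner of an overclocked 7970 would see only ~13% more performance from Titan.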
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
thanks for the correction.

$600 for the preliminary specs. ready to pre-order a pair.
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
If it is $600 and takes a bigger leap than 580-680 or 6970-7970 did... they won't be able to keep up with demand.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If it is $600 and takes a bigger leap than 580-680 or 6970-7970 did... they won't be able to keep up with demand.

The 7970 is about equal to a 6990. Faster once O/C'd. Not 85% of a 6990, although at release it wasn't as fast. I would expect mature drivers for Titan though, as it's not a new arch.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
If it is $600 and takes a bigger leap than 580-680 or 6970-7970 did... they won't be able to keep up with demand.

$600 is a fair offer.

couldn't care less which team - whether that be the red or the green.

a 7970 is going for $380 on Newegg. sell the game bundle quick and it's $350 all day, any day.

so $700 for a 7970 CF pair. the math says the 780 is 75% of 7970 CF's performance, and 75% of $700 = $525.

-----

if $375 for limited-time bragging rights is worth it to you - definitely more power to you.
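
Spelling that arithmetic out as a quick sketch (all figures are the post's own claims, not verified prices):

```python
# UaVaj's price/performance math, step by step.
hd7970_price = 380                    # one HD 7970 on Newegg, per the post
bundle_resale = 30                    # selling the game bundle nets ~$30
per_card = hd7970_price - bundle_resale          # $350 "all day, any day"
cf_pair = 2 * per_card                           # $700 for a 7970 CF pair

titan_rel_perf = 0.75                 # claim: Titan ("780") = 75% of 7970 CF
fair_price = titan_rel_perf * cf_pair            # 0.75 * 700 = $525

# The $375 "bragging rights" premium cited implies an expected asking
# price of fair_price + 375 = $900, in line with the early Titan rumors.
print(per_card, cf_pair, fair_price)             # 350 700 525.0
```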
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
The 7970 is about equal to a 6990. Faster once O/C'd. Not 85% of a 6990, although at release it wasn't as fast. I would expect mature drivers for Titan though, as it's not a new arch.

I haven't been around because there hasn't been much news... but is Titan what Oak Ridge now runs the top supercomputer with?

If so, then that is definitely a "new arch". If not, educate my ignorant self. Thanks!
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I haven't been around because there hasn't been much news... but is Titan what Oak Ridge now runs the top supercomputer with?

If so, then that is definitely a "new arch". If not, educate my ignorant self. Thanks!

It's still Kepler.

Edit: ...and add to that, it's been in use for months now at Oak Ridge, so I'd expect the drivers not to be raw, first-gen work.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,267
136
Depends how you define "new arch." A better phrase might be "sister arch" to little Kepler. In any case, there's likely more of a difference between the two than, say, VLIW5 vs. VLIW4.
 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,910
1,192
136
That would be true if multi-GPU didn't suck.

It doesn't. I can still play Crysis 3 at 60fps, 1080p, adaptive vsynced, max settings bar SMAA and motion blur, on my GTX 570 SLI system, with no microstutter crap in sight.

Point being, with a single card the game runs at 30fps. So how exactly does multi-GPU suck, when it doubled my framerate and elevated my gaming experience to where it should be?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
It depends on your fps. With my 580 SLI I don't want to drop below about 50fps, because then I get microstutter. With a single GPU that would be no problem. There is also the issue of scaling. While SLI scales excellently in most cases, there are games where problems exist: in Skyrim I still get graphical glitches in the water, in FC3 scaling with the original bits is abysmal in certain locations, and sometimes SLI doesn't work at all (Settlers 7, for example).

While I know how to fix many issues with my SLI setup through special compatibility bits, that is not something everyone can or wants to do. If you can get near the performance of an SLI/CF setup with one GPU, that is always the better choice.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
If this whole Titan rumor is true, shouldn't we be hearing it from other sources besides SWEclockers by now? Are there any other sources reporting this?
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81

That particular link isn't working for me right now, but I did read it earlier this week. With all due respect, people should be very careful about relying on WCCFTech rumours. When they post the sources of their rumours, it's always something we've already heard about (e.g., the original SWEClockers article). When they post 'new' information like this, they just refer vaguely to 'sources' without getting specific. It's never they who have the tech in their hands.

More to the point, they've published some real drivel in the past - so much so that I consider them the tech-journalism equivalent of that hilarious space between Fox News and The Onion. In short, I don't think they merit being described as a 'source.'
 

psolord

Golden Member
Sep 16, 2009
1,910
1,192
136
It depends on your fps. With my 580 SLI I don't want to drop below about 50fps, because then I get microstutter. With a single GPU that would be no problem. There is also the issue of scaling. While SLI scales excellently in most cases, there are games where problems exist: in Skyrim I still get graphical glitches in the water, in FC3 scaling with the original bits is abysmal in certain locations, and sometimes SLI doesn't work at all (Settlers 7, for example).

While I know how to fix many issues with my SLI setup through special compatibility bits, that is not something everyone can or wants to do. If you can get near the performance of an SLI/CF setup with one GPU, that is always the better choice.

I agree that low fps on dual GPU is worse than on a single GPU, but that's why you get dual GPU in the first place: so you don't have low fps.

Also, 50fps or thereabouts is not so bad for dual GPU. Again, in Crysis 3 I was getting around 45fps on my older 5850 CFX system and the game was quite playable.

Again, using one card was a disaster.

There's a reason dual GPU came to be, and it's not solely for e-peen reasons. There are very real usability reasons, but somehow some people are now all against dual GPU. I've been using dual GPU since the 4870X2 days and never had a problem with it.

Games that are truly GPU limited are coming with proper multi-GPU support. For the rest, I as a gamer don't really care.

I honestly would never buy a product that is more expensive than a dual-GPU solution and offers lower performance on top of that, just to avoid dual GPU.

Is anyone going to tell me that 50fps on a Titan is better than 60fps on a GTX 660 Ti SLI solution? For real?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
I agree that low fps on dual GPU is worse than on a single GPU, but that's why you get dual GPU in the first place: so you don't have low fps.

Also, 50fps or thereabouts is not so bad for dual GPU. Again, in Crysis 3 I was getting around 45fps on my older 5850 CFX system and the game was quite playable.

Again, using one card was a disaster.

There's a reason dual GPU came to be, and it's not solely for e-peen reasons. There are very real usability reasons, but somehow some people are now all against dual GPU. I've been using dual GPU since the 4870X2 days and never had a problem with it.

Games that are truly GPU limited are coming with proper multi-GPU support. For the rest, I as a gamer don't really care.

I'm quite happy with my SLI setup, but I'm not sugar-coating it.

You cannot prevent fps drops to problematic levels here and there, because fps fluctuates all the time. For example, FC3: in some areas I get 40-45fps, in others 70-80. And those 40-45fps feel quite bad, to be honest.
I could turn down the settings so I'd get 60+ all the time, but then I might as well go single GPU with a bit of OC and be done with it.
Of course it's different with each game, but you get the message.

I honestly would never buy a product that is more expensive than a dual-GPU solution and offers lower performance on top of that, just to avoid dual GPU.

Is anyone going to tell me that 50fps on a Titan is better than 60fps on a GTX 660 Ti SLI solution? For real?

And yes, I consider that a possibility, at least in some titles. fps != fps; that much is clear with AFR. Perceived performance cannot be measured in fps alone.
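
To make the "fps != fps" point concrete, here is a minimal sketch with hypothetical frame times - two runs with identical average fps, where the AFR run alternates short and long frames (microstutter):

```python
# Frame times in milliseconds; both sequences average 50 fps.
single_gpu = [20, 20, 20, 20, 20, 20]   # steady 20 ms per frame
afr_pair   = [10, 30, 10, 30, 10, 30]   # alternating 10/30 ms (microstutter)

def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_case_fps(frame_times_ms):
    # Perceived smoothness tracks the longest frames, not the average.
    return 1000 / max(frame_times_ms)

print(avg_fps(single_gpu), worst_case_fps(single_gpu))  # 50.0 50.0
print(avg_fps(afr_pair), worst_case_fps(afr_pair))      # 50.0 ~33.3
```

Same average, very different feel - which is why frame-time measurements matter for AFR setups.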
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
I haven't been around because there hasn't been much news...but is Titan what Oakridge now runs the top supercomputer with?

If so, then that is definately a "new arch". If not, educate my ignorant self. Thanks!

Think of it as Celeron/Pentium SKUs compared to Core iX. The latter have more features (AVX, VT-x, etc.) and more cache, but share the same cores. The same is true for GK110: while it has more cache and bandwidth, beefier DP execution, and virtualisation features, the underlying architecture is still the same as in GK104 and the like.
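
For reference, the commonly cited figures behind that comparison look roughly like this (treat the exact numbers as reported specs of the era, not verified measurements):

```python
# GK104 vs GK110 at a glance: same SMX building block underneath, more of
# everything around it on GK110. Figures are the commonly reported specs.
gk104 = {"smx": 8,  "l2_cache_kb": 512,  "bus_width_bits": 256, "fp64_rate": "1/24 of FP32"}
gk110 = {"smx": 15, "l2_cache_kb": 1536, "bus_width_bits": 384, "fp64_rate": "1/3 of FP32"}

for key in gk104:
    print(f"{key:16} GK104: {gk104[key]!s:14} GK110: {gk110[key]}")
```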
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Agreed with the last few posts regarding over-volting, consumer choices, warranties, etc. In all honesty though, current Keplers would not have benefited from overvolting. The entire lineup is bandwidth capped or limited in most real-world situations, and only with 1:1 overclocking between core and memory do the gains add up linearly. For the vast majority of Kepler cards, overvolting would have just increased power usage, temps, and possibly RMAs - not frame rates. Perhaps Nvidia knew this?

Probably! Personally, I didn't garner much success with over-volting my MSI Kepler SKU!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Regardless, if Titan allows over-voltage, I'm ecstatic. While it isn't a prerequisite for a good GPU, it certainly does add value! Consider the Intel K SKUs - as we know, those are highly sought after by enthusiasts. I personally don't know any DIYer that bought a non-K SKU - people value overvolting and overclocking. Does anyone here know of a DIYer with a non-K Intel SKU?

That said, if it performs well out of the box then Nvidia could certainly do just fine with a voltage lock - just as the GTX 680 has.

I'm pretty excited at the hint of overvoltage that Unwinder mentioned. I hope it's true. I'm not saying I won't buy it otherwise, but this is definitely a huge value-added feature. I know this will bring needless debate about warranties and whatnot, but keep in mind that software voltage control can be, and always has been, strictly limited: the software determines how high you can go, and that can be kept perfectly within reason. For instance, on the GTX 580 I was able to get higher clocks (up to 950MHz on Lightning 580s) with slight voltage adjustments, yet Afterburner had strict limits on how high you could go. Nvidia can offer slight voltage regulation through software while keeping a strict upper limit so as not to damage cards - this was done with Fermi.
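
The kind of software cap described here can be pictured as a simple clamp - a hypothetical illustration, not Afterburner's actual API or limits:

```python
# Hypothetical sketch of a software-enforced voltage ceiling: the tool
# accepts any requested offset but clamps it to a vendor-set maximum.
STOCK_MV = 1175        # assumed stock core voltage in millivolts
MAX_OFFSET_MV = 100    # assumed software-enforced ceiling

def apply_voltage_offset(requested_mv: int) -> int:
    """Return the voltage actually applied after clamping to the cap."""
    offset = max(0, min(requested_mv, MAX_OFFSET_MV))
    return STOCK_MV + offset

print(apply_voltage_offset(50))   # 1225 mV - request honored
print(apply_voltage_offset(300))  # 1275 mV - clamped to the +100 mV cap
```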
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Probably! Personally, I didn't garner much success with over-volting my MSI Kepler SKU!

I was able to hit 1320MHz with +50 on the voltage offset, but I was already at 1240MHz on the core and 7150MHz on the VRAM, and the extra 80MHz on the core gave me literally a 1-2% increase in frame rates while pushing temps and fan speed up noticeably. Wasn't worth it.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
I was able to hit 1320MHz with +50 on the voltage offset, but I was already at 1240MHz on the core and 7150MHz on the VRAM, and the extra 80MHz on the core gave me literally a 1-2% increase in frame rates while pushing temps and fan speed up noticeably. Wasn't worth it.

I have found that once my GTX 680 gets over 1250 on the core, VRAM speed is the biggest factor in performance. Going from ~1250 core to 1320 core gave a 2% increase in FPS for an almost 6% core clock increase. The 7950 scaled far more linearly: going from 1100 to 1175 core gave around 5-6% more performance for a 7% core clock increase.
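
Putting numbers on that scaling (figures taken from the two posts above):

```python
# Fps gained per percent of core clock gained - a quick way to compare
# how well each card's performance scales with core overclocking.
def scaling_efficiency(core_from_mhz, core_to_mhz, fps_gain_pct):
    clock_gain_pct = 100 * (core_to_mhz - core_from_mhz) / core_from_mhz
    return clock_gain_pct, fps_gain_pct / clock_gain_pct

print(scaling_efficiency(1250, 1320, 2.0))   # GTX 680: ~5.6% clock -> ~0.36 fps%/clock%
print(scaling_efficiency(1100, 1175, 5.5))   # HD 7950: ~6.8% clock -> ~0.81 fps%/clock%
```

The 680's poor ratio past ~1250MHz is consistent with it being bandwidth limited, as noted earlier in the thread.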
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I was able to hit 1320MHz with +50 on the voltage offset, but I was already at 1240MHz on the core and 7150MHz on the VRAM, and the extra 80MHz on the core gave me literally a 1-2% increase in frame rates while pushing temps and fan speed up noticeably. Wasn't worth it.

I hit around 1290 - around a 40MHz increase - but it wasn't really worth it to me either.
 
Last edited: