GeForce Titan coming end of February

Status
Not open for further replies.

Elfear

Diamond Member
May 30, 2004
7,097
644
126
To me it's just impressive what they are about to do: increasing single-card performance to around 170% of a GTX 680. Definitely interested in getting this card; I've never been a fan of SLI/CrossFire.

Current rumors are 680 + 50%. Where are you hearing 70% from?
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
This product is a bit too late for me. It would have gotten my money had it been out 6 months ago.

- No game makes me want better graphics anymore. In the past, there were games that made me want to play them the way they were meant to be played. :biggrin: Doom 3, Oblivion, Crysis, etc. There aren't games interesting enough to drive me to get a more powerful video card.
- Nerdy interest has faded having played with Kepler and GCN already. I know what they can/cannot do.
- I do not think I will buy a $500+ video card any more. When I can buy a quality tablet, a complete system, for $200~350, a mere part that costs that kind of money (or double/triple) no longer interests me. I know these are targeted at completely different audiences and serve completely different purposes, but however irrational, the value just doesn't seem to be there for me personally anymore.

Unless there comes a killer game with photo-realistic visuals, I think I am satisfied with "good enough" graphics. Perhaps I am getting old.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
This product is a bit too late for me. It would have gotten my money had it been out 6 months ago.

- No game makes me want better graphics anymore. In the past, there were games that made me want to play them the way they were meant to be played. :biggrin: Doom 3, Oblivion, Crysis, etc. There aren't games interesting enough to drive me to get a more powerful video card.
- Nerdy interest has faded having played with Kepler and GCN already. I know what they can/cannot do.
- I do not think I will buy a $500+ video card any more. When I can buy a quality tablet, a complete system, for $200~350, a mere part that costs that kind of money (or double/triple) no longer interests me. I know these are targeted at completely different audiences and serve completely different purposes, but however irrational, the value just doesn't seem to be there for me personally anymore.

Unless there comes a killer game with photo-realistic visuals, I think I am satisfied with "good enough" graphics. Perhaps I am getting old.

The problem is that every time a game comes along that pushes hardware... people's e-peen gets hurt and they whine about bad coding.
Crysis, ARMA 2, Metro 2033, etc.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
Maybe I'm alone here, but if it performs at 85% of a GTX 690 then I will probably be getting it, even if it costs $900. In my currency that's about the same as (maybe a bit more than) some high-end cards from the past, so it's not that bad.
 

Ozegamer

Junior Member
Jan 24, 2013
1
0
0
Is this card actually coming out?
I have been holding off for quite some time, coming close to upgrading each month, but I wasn't that impressed with the 680 cards. I always figured they were midrange rebadged, as the die size just doesn't add up. I know everyone says otherwise, but the top cards have usually been the huge dies.
I have been after something to replace my nuclear reactors (480 SLI), and this card sounds like the one. The 690 is good, but I would rather keep a single-card solution and add another in a year or two as games need it, and they probably will: once next-gen consoles come out there will be no excuse for crap ports to PC. These rumours of 50% higher than a GTX 680 leave me extremely impressed. I was expecting the usual 15-20%, but if it is double that, I will get this card if it's under $1k. I am sure I am not the only one with 480s who just couldn't justify the move up to a 680; 480s score the same as, and sometimes slightly higher than, a single 680. And the 690 is just a bit too extreme. Something in between would be perfect.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Someone correct me if I'm calculating this wrong. So, hypothetically the Titan will be 85% of the 690.

From the TPU review of the 690:
[chart: relative performance, all resolutions]

The 680 is 69% of the 690 overall, so 85% of the 690 would be 23% faster than the 680, averaged across all resolutions.

At 1080 the 680 is 66% of the 690:
[chart: relative performance, 1920x1080]

That makes the Titan 29% faster than the 680.

At 1600 the 680 is 59% of the 690:
[chart: relative performance, 2560x1600]

That makes the Titan 44% faster than the 680.

This assumes the same amount of CPU overhead, which might not be the case. Even in the best case, at the least CPU-bound resolution, the Titan is 44% faster than the 680. At 1080p, where the majority play, it's only 29% faster.
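For what it's worth, the arithmetic above is just a ratio of ratios, and can be sketched in a few lines of Python. The 69/66/59 percentages are the TPU figures quoted above, and the 85% target is only the rumor, so treat these as illustrative inputs:

```python
# Project Titan's speedup over a GTX 680 from relative-performance
# percentages, assuming the rumored "85% of a GTX 690" target holds
# and everything scales linearly (i.e. no CPU bottleneck shift).

TITAN_VS_690 = 0.85  # rumored: Titan = 85% of a GTX 690

# GTX 680 performance as a fraction of the GTX 690, per the TPU charts
gtx680_vs_690 = {
    "overall": 0.69,
    "1920x1080": 0.66,
    "2560x1600": 0.59,
}

for resolution, ratio in gtx680_vs_690.items():
    # Titan/680 = (Titan/690) / (680/690)
    speedup = TITAN_VS_690 / ratio - 1.0
    print(f"{resolution}: Titan ~{speedup:.0%} faster than a GTX 680")
    # -> ~23% overall, ~29% at 1080p, ~44% at 1600p
```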
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
The majority will not buy Titan. Why do you downplay Titan's potential on the basis of a CPU bottleneck? Bias?
The 690 is 80-85% faster where it counts, when there is no bottleneck. The question is how Nvidia (or the source) calculated the 690's performance. That remains to be seen.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm presenting figures. How can that possibly show bias? I'm saying that Titan could do better at resolutions where the 690 is CPU-bottlenecked, because a single GPU needs less CPU than SLI. What do possible sales numbers have to do with anything?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Problem is, these figures are not showing the full potential of the 690 due to bottlenecks, as I have shown in the links to SirPauly.

And most of the people buying Titan will use it properly, i.e. at 1440p and above, with 3DVision, SGSSAA, newer games, etc. 1080p is not relevant for Titan.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Problem is, these figures are not showing the full potential of the 690 due to bottlenecks, as I have shown in the links to SirPauly.

And most of the people buying Titan will use it properly, i.e. at 1440p and above, with 3DVision, SGSSAA, newer games, etc. 1080p is not relevant for Titan.

680 SLI already creams 2560x1600, and 7970GE CF even more so. One of these single 'Titan' cards will come up short at that resolution; you'll still need two, at which point you'll be over the mark on what you need. I think a single Titan will have a lot of relevance for 1080P by finally delivering a single GPU that handles everything at 1080P. Of course, it's not a card the typical 1080P gamer will buy.

Two of them will wind up being overkill for 2560x1600/1440, as crazy as that sounds. You'll need SGSSAA or triple monitors to get them to flex their muscle.

Nvidia could be on to this as well. AMD/Nvidia have to be waking up to the fact that there is too much GPU power at this point with no games to push it. So we get $500 300mm2 dies with meagre performance increases, and $900 low-volume flagships that used to be $500 in the past. It probably makes a lot more sense to go this way: take the have-to-have-the-best-regardless-of-cost segment to the cleaners with 80% mark-ups in the form of $900 flagships, and keep selling GK104 for $500.

This card could release with just the 'Titan' moniker and no other iterations of it, with GK104 continuing to be sold as-is. ;) I think that is the most likely scenario if this price point is accurate.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
680 SLI already creams 2560x1600, and 7970GE CF even more so. One of these single 'Titan' cards will come up short at that resolution; you'll still need two, at which point you'll be over the mark on what you need. I think a single Titan will have a lot of relevance for 1080P by finally delivering a single GPU that handles everything at 1080P. Of course, it's not a card the typical 1080P gamer will buy.

Two of them will wind up being overkill for 2560x1600/1440, as crazy as that sounds. You'll need SGSSAA or triple monitors to get them to flex their muscle.

Nvidia could be on to this as well. AMD/Nvidia have to be waking up to the fact that there is too much GPU power at this point with no games to push it. So we get $500 300mm2 dies with meagre performance increases, and $900 low-volume flagships that used to be $500 in the past. It probably makes a lot more sense to go this way: take the have-to-have-the-best-regardless-of-cost segment to the cleaners with 80% mark-ups in the form of $900 flagships, and keep selling GK104 for $500.

This card could release with just the 'Titan' moniker and no other iterations of it, with GK104 continuing to be sold as-is. ;) I think that is the most likely scenario if this price point is accurate.

Some good points indeed. GK104 *should* get an update, even if only through node process / yield improvements, as Nvidia has already started branding some mobile products with 700-series names. Nvidia will likely pair it up with 6.4-6.6GHz VRAM and increase boost clocks to 1100-1150MHz to get a 10% performance increase while staying within the same TDP.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
groove, my main point was that you cannot and should not judge performance with the brakes engaged, so to speak.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
groove, my main point was that you cannot and should not judge performance with the brakes engaged, so to speak.
What makes you think NVIDIA's not? I didn't see any figures about Titan being xx% faster than a GTX 680, but rather 85% of a GTX 690. Everyone else is just assuming they're letting the GTX 690 stretch its legs. If this is incorrect, then nvm.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Thanks again for the links -- I was using ComputerBase as an over-all gauge:

http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/4/
The problem might be that you're using an article that's 10 months old. More recent data will give a more accurate picture: http://www.techpowerup.com/reviews/VTX3D/Radeon_HD_7870_XT_Black/28.html

The GTX 690 is 60% faster than a GTX 680 at 1080p and 74% faster at 1600p. That equates to a Titan being 36% faster than a GTX 680 at 1080p and 48% faster at 1600p, assuming a linear correlation. This is more in line with what one would expect from the raw numbers already calculated in this thread (~35% higher shader power, ~35% higher memory bandwidth).
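Spelling out that linear-correlation step (the 60%/74% deltas are the TPU figures linked above, and the 85% target is still just the rumor):

```python
# Titan/680 = 0.85 * (690/680): apply the rumored 85%-of-a-690 target
# to the newer TPU 690-vs-680 deltas, assuming linear scaling.
for label, faster in [("1080p", 0.60), ("1600p", 0.74)]:
    titan_vs_680 = 0.85 * (1.0 + faster) - 1.0
    print(f"{label}: Titan ~{titan_vs_680:.0%} faster than a GTX 680")
    # -> ~36% at 1080p, ~48% at 1600p
```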
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
The problem might be that you're using an article that's 10 months old. More recent data will give a more accurate picture: http://www.techpowerup.com/reviews/VTX3D/Radeon_HD_7870_XT_Black/28.html

The GTX 690 is 60% faster than a GTX 680 at 1080p and 74% faster at 1600p. That equates to a Titan being 36% faster than a GTX 680 at 1080p and 48% faster at 1600p, assuming a linear correlation. This is more in line with what one would expect from the raw numbers already calculated in this thread (~35% higher shader power, ~35% higher memory bandwidth).
Sorry, where were those numbers calculated? And let me guess, they were based on the faulty assumption that the card will have the same clocks as the K20X?

The GTX 680 was clocked 35% higher than the K10, as has pretty much every high-end consumer card versus its workstation counterpart. It's absolutely silly to assume the 780 won't be clocked higher.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Sorry, where were those numbers calculated? And let me guess, they were based on the faulty assumption that the card will have the same clocks as the K20X?

The GTX 680 was clocked 35% higher than the K10, as has pretty much every high-end consumer card versus its workstation counterpart. It's absolutely silly to assume the 780 won't be clocked higher.

Have to agree on the clocks. I expect 850-900 mhz
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Sorry, where were those numbers calculated? And let me guess, they were based on the faulty assumption that the card will have the same clocks as the K20X?

The GTX 680 was clocked 35% higher than the K10, as has pretty much every high-end consumer card versus its workstation counterpart. It's absolutely silly to assume the 780 won't be clocked higher.
It's irrelevant, since the performance target is what's already been mentioned. It could be clocked at 2GHz; it doesn't really matter, since the rumor states 85% the performance of a GTX 690.

I would actually hope it is at lower clocks to get that kind of performance, as that leaves more headroom for overclocking (which may be moot if they lock down the voltage again). If the K20X is already at a 235W TDP, higher clock speeds might require NVIDIA to really push the power envelope, in which case this card won't do much to differentiate itself from what we already have. A 7970 @ 1.3GHz+ is already 75-80% the speed of a GTX 690 in many games. The K20X giving slightly more performance at lower power consumption is nice, but it's an evolutionary step instead of a leap, especially considering it's coming out more than a year later. Also, for $900 it will be a poor buy; even for $600 it's nothing special, especially since you can't mine bitcoins effectively to offset the cost.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The problem might be that you're using an article that's 10 months old. More recent data will give a more accurate picture: http://www.techpowerup.com/reviews/VTX3D/Radeon_HD_7870_XT_Black/28.html

The GTX 690 is 60% faster than a GTX 680 at 1080p and 74% at 1600p. This equates to a Titan being 36% faster than a GTX 680 at 1080p and 48% faster at 1600p, assuming a linear correlation. This is more in line with what would expect from the raw numbers already calculated in this thread (~35% higher shader power, ~35% higher memory bandwidth).

My original point was, over-all, I don't think the GTX 690 is 80 percent faster than the GTX 680.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
even for $600 it's nothing special, especially since you can't mine bitcoins effectively to offset the cost.

I think 600 dollars would provide some nice gaming value, if offered -- if performance is close to a GTX 690 at default clocks. At times, high-end SKUs provide some value to me -- 9700 Pro -- X1900XTX -- 8800 GTX!
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
680SLI already creams 2560x1600, 7970GE CF does even more so. One of these single 'Titan' cards will come up short for that resolution, you'll still need two...

Yup, no doubt about it. For Far Cry 3, a 690 or two 670s is good for about 60fps average... at 1080p. Any less performance than that and you start to suffer at 1080p if you want the game maxed or almost maxed. This Titan card will be a 1080p card for graphics-intensive games, I hate to say. You will in fact need two of them for 1440p or 1600p, and 2 grand is reaching pretty far for a GPU setup.
I'm not saying it just because I have them, but at this point (if you like Nvidia) I still say two 670s will be the better option, at about the same price as this Titan card. Paying 2 grand for cards only to see them equalled in a year or so by cards costing half as much really hurts if you don't have money coming out of your ears, or no brains in between them. If you've got cash to burn, then hell, grab three of them.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
With the state of 20nm being even more dire than 28nm was at the beginning (or so I hear), don't hold your breath. I bet we won't see anything significant until Q3 2014.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
My original point was, over-all, I don't think the GTX 690 is 80 percent faster than the GTX 680.
Ah, then I misunderstood. I agree, though; I don't believe it is, unless you run multi-monitor resolutions to really crush the GPUs.
I think for 600 dollars would provide some nice gaming value if offered -- if performance is close to a GTX 690 with default clocks. At times, high-end sku's provide some value to me -- 9700Pro -- X1900XTX -- 8800 GTX!
To me it's the same argument I had with myself last year when getting a 7970 - I figured I could overclock it to match or outperform 6970 CF, which it did. If Titan can overclock as well as its siblings, it should be able to make up the last 15% and catch the GTX 690/GTX 680 SLI. However, I also knew I could make back my $550 by mining bitcoins, something I don't see being the case here. If a lot of this changes (especially the bitcoins), I'll definitely pick one up.
 