GeForce Titan coming end of February


2is

Diamond Member
Apr 8, 2012
4,281
131
106
Lol at the posted graph... Using the standard 7970 instead of the GHz Edition, that's unbiased benchmarking right there :awe:

There is nothing to discuss, though... This is just yet another release where Nvidia takes advantage of its lack of competition to annihilate people's wallets, as it has done since the GeForce 2 Ultra days.
If you compare how many times AMD went above $600 vs. Nvidia (not counting dual cards), it gets pretty easy to see why so many people favor AMD on this site.

Yeah, they are both companies out to get your money, but one of them is completely shameless about it, taking full advantage of the number of fanboys it has - just like Apple.

I don't see "so many" people favoring AMD on this site. I see a handful of people who are overly vocal about it.

BoFox

Senior member
May 10, 2008
689
0
0
I can't get an answer anywhere as to when the NDA definitively lifts.

I was hoping by midnight tonight, but it doesn't make sense because Monday is a holiday (President's Day), so it's probably tomorrow night?!?

But I'm feeling that it's really going to be later on - as if tomorrow will just be a "press release" of what the Titan will be featuring, etc. - not even a paper launch until next week.

That site, arabpcworld.com, is only showing pictures - there aren't even any leaked "reviews" with benchmarks, which would usually abound the day before.

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think what is influencing this price/perf discussion in AMD's favor is that Nvidia cards have been more price-stable in the recent past. A year after its launch, the GTX580 1.5GB was still a tad over 400 Euros in Germany, a 20-25% drop from the original MSRP. In the same period, the 7970 dropped about 40%, from 549 to 330 Euros.

Ya, that makes NV cards even worse value, since they depreciate like crazy on the used market if you don't sell them at the right time. Cards like the GTX280 or 480 lost an outrageous amount of resale value in 2 years. There are guys in Japan who bought a GTX580 3GB for less than $60 USD this month.

http://img831.imageshack.us/img831/7454/gtx5803gb.jpg
http://www.xe.com/ucc/convert/?Amount=5300&From=JPY&To=USD

That means in his country your GTX580 3GB x 2 would have lost $880 USD over 2 years of ownership. o_O
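For what it's worth, the arithmetic behind that figure works out roughly like this (a quick sketch: the ~93 JPY/USD rate is the approximate February 2013 exchange rate, and the ~$500 original price per card is my assumption, not a number from the post):

```python
# Back-of-the-envelope depreciation math for two GTX 580 3GB cards.
# Assumptions (not from the thread): ~93 JPY/USD (early-2013 rate)
# and an original price of roughly $500 per card.
JPY_PER_USD = 93.0
used_price_usd = 5300 / JPY_PER_USD       # the Japanese used listing
original_price_usd = 500.0                # assumed purchase price per card

loss_two_cards = 2 * (original_price_usd - used_price_usd)
print(f"Used price: ${used_price_usd:.0f}")         # ~$57, i.e. "less than $60"
print(f"Loss on two cards: ${loss_two_cards:.0f}")  # ~$886, roughly the $880 quoted
```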

As for the 6GB:
I wouldn't dismiss them so fast. Of course you'll need two or more cards to run settings that begin to profit from more than 3GB. But I'd rather not have another VRAM cripple ...Add newer games (next-gen console ports?) and texture mods, and during Titan's lifetime, 6GB might become useful in enthusiast setups.

So spending $1,800 on a pair of 6GB GPUs for a potential benefit in games in 2014-2015? Ya, I'd rather pick up 20nm Volcanic Islands or Maxwell, or their refreshes, when next-gen games actually launch and make use of >3GB of VRAM, and save $800 in the process while playing console ports for most of 2013. Two $499 GPUs in 2014-2015 will mop the floor with two Titans bought for $1,800 today. You'll be buying $1,800 worth of GPUs to max out 2013 console ports like DmC, Dead Space 3, Bioshock Infinite and Tomb Raider? Crysis 3 and Metro LL may be the only 2 games in all of 2013 that could push the boundaries, but that's just 2 games. After what happened with the 7800GTX 512MB, 8800GTX / Ultra, GTX280 and 480, I would never recommend anyone spend $1,800 to "future-proof". I can understand going from a $250 GPU to a $400 one, but $1,800 to future-proof with the 20nm generation out next year? To each his own, I suppose.
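Made explicit, the comparison being argued is simple arithmetic (a sketch; the $499 next-gen price is the post's own hypothetical, not a known figure):

```python
# Two Titans at launch vs. two hypothetical next-gen cards later.
titan_pair = 2 * 899      # ~$1,800 for a pair of Titans now
next_gen_pair = 2 * 499   # the post's assumed 2014-2015 pair
print(titan_pair - next_gen_pair)   # ~$800 saved by waiting
```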

4K already requires a tad more than 3GB in certain games.

Which games?

HD7970 3GB beats GTX680 4GB in every single game at 7680x1600 in this review. If 3GB isn't a bottleneck at 7680x1600, where will it be? I don't even think 2 Titans will be fast enough for 3x 2560x1600 monitors with MSAA, especially not with downsampling. When do you expect 3GB to become a huge bottleneck if it isn't one at 7680x1600?
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/BF3_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/AW_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Crysis_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Deus_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Dirt_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/MOHW_02.png

"That said, we have to wonder why anyone would bother with the GeForce GTX 680 4GB for extreme resolutions when the Radeon HD 7970 GHz Edition was constantly faster at both 5040x1050 and 7680x1600. In fact at 7680x1600 the 7970 GHz Edition was on average 20% faster than the GeForce GTX 680 4GB in the half dozen games that we tested with."
Last edited:

Elfear

Diamond Member
May 30, 2004
7,169
829
126
As for the 6GB:
I wouldn't dismiss them so fast. Of course you'll need two or more cards to run settings that begin to profit from more than 3GB. But I'd rather not have another VRAM cripple - I bought my two 580s with 3GB and haven't regretted it since. 4K already requires a tad more than 3GB in certain games. Add newer games (next-gen console ports?) and texture mods, and during Titan's lifetime, 6GB might become useful in enthusiast setups. 4GB would be a sweet spot, I guess, but since we're stuck with 384-bit, it's "all or nothing" with 3 or 6GB.

I see what you're saying, but as far as single cards go (which is what most people buy), the comparison of a 3GB 580 to a 3GB 7970 is much more valid than a 6GB Titan to a 6GB 7970. The resolution/settings you'd have to use to need more than 3GB of VRAM would cripple a single card. A year ago 1.5GB was already showing its age, but the same is not true today of 3GB.

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
"That said, we have to wonder why anyone would bother with the GeForce GTX 680 4GB for extreme resolutions when the Radeon HD 7970 GHz Edition was constantly faster at both 5040x1050 and 7680x1600. In fact at 7680x1600 the 7970 GHz Edition was on average 20% faster than the GeForce GTX 680 4GB in the half dozen games that we tested with."


How much do you think the 680 is being held back by its 100GB/s less bandwidth at those resolutions?

20% faster with almost 50% more bandwidth... What do you think GK110 will do with considerably more TMUs and bandwidth, if the lowly 680 is only 20% slower?


I guess we'll find out soon! :p
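For reference, the bandwidth numbers being compared follow directly from the memory spec (a quick sketch; the per-pin data rates and bus widths below are the published reference figures for each card):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

cards = [("GTX 680", 6.008, 256),        # 256-bit, 6.0 Gbps GDDR5
         ("HD 7970 GHz", 6.0, 384),      # 384-bit, 6.0 Gbps
         ("Titan (GK110)", 6.008, 384)]  # 384-bit, 6.0 Gbps
for name, gbps, bus in cards:
    print(f"{name}: {bandwidth_gbs(gbps, bus):.0f} GB/s")
# GTX 680 ~192 GB/s vs. 7970 GHz 288 GB/s -> the "almost 50% more" above;
# Titan matches the 7970 GHz at ~288 GB/s.
```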

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Well, it looks like Titan is the most talked-about card in ATF history (as far as I'm aware), even though it comes with a price premium :D Fun times. So far I think we will all acknowledge:

1. Titan is an impressive engineering product; no video card has achieved a >=60% performance increase without a node shrink.

2. It is not about value but about bragging rights, just like the Mars II was. It can be a great product for people currently plagued by multi-GPU issues.

3. The card should not be tested without a top-of-the-line CPU; it is very likely it will be CPU-bottlenecked across a large spectrum of games.
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
A year? The K20 and K20X have been shipping since September. That is 6 months. We didn't see it sooner in GeForce because Nvidia was backed up with preorders it was obligated to fulfill.

Please, people, can you at least attempt to stop exaggerating the truth and perpetuating crap hand over fist?

Compared to the first cards released from this generation.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
My post meant that it appears the Titan has the same voltage controller as reference 680s, a controller that basically doesn't let you adjust anything. My post didn't say that it doesn't have a voltage controller, so I don't know what your reply means.

I was agreeing with you that the 680 also had voltage control. :)

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
How much do you think the 680 is being held back by its 100GB/s less bandwidth at those resolutions?

If 4GB was beneficial over 3GB, the GTX680 would have won at least 1 benchmark at Legion Hardware. It didn't win even 1. I agree that Titan's memory bandwidth increase is going to be very important.

Here is Xbit Labs' review of the GTX670 4GB. In all 6 games where 4GB showed a measurable benefit over the GTX670 2GB, the 670 4GB could not outperform a 3GB 7970 until it was overclocked, which implies a GPU bottleneck, not a VRAM one. That means 4GB is useless and 3GB is the sweet spot; otherwise the 670 4GB would have smoked the 7970 if a 3GB VRAM bottleneck had shown up.

http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/04_metro.png
http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/08_civ5.png
http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/09_shog.png
http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/11_hr.png
http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/12_bat.png
http://www.xbitlabs.com/images/graphics/evga-geforce-gtx-670-4gb/15_sniper.png

A 3GB Titan for $799 would have been a better deal for games than a 6GB Titan for $899.
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
If 4GB was beneficial over 3GB, the GTX680 would have won at least 1 benchmark. It didn't. I agree that Titan's memory bandwidth increase is going to be very important.

Yes, you can see what happens when you run out of VRAM with the 2GB card in some cases.

How much do you think bandwidth is currently holding back the performance of the 680 at 1440p+ resolutions, and do you think the performance estimates are at these higher resolutions or at the typical 1080/1200 that Nvidia concerns itself with?

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
If 4GB was beneficial over 3GB, the GTX680 would have won at least 1 benchmark. It didn't. I agree that Titan's memory bandwidth increase is going to be very important.

He was asking about bandwidth, not capacity; both are necessary for ultra-high resolutions/textures.

A 4GB GTX680 has the same memory bandwidth as a 2GB GTX680...

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
He was asking about bandwidth, not capacity; both are necessary for ultra-high resolutions/textures.

A 4GB GTX680 has the same memory bandwidth as a 2GB GTX680...

I get that. This was already discussed when the Sapphire Toxic 6GB was compared to the 3GB version. Even in the context of the HD7970's much higher memory bandwidth, the real-world benefit of 6GB over 3GB isn't there. If it is, it comes at a point where you are talking about 15 fps vs. 19 fps (like at 7680x1600). Ya, it might show up in Crysis 3 on 3x 2560x1600 monitors with 8x MSAA - but you'd need 4 Titans to play that.

How much do you think bandwidth is currently holding back the performance of the 680 at 1440p+ resolutions, and do you think the performance estimates are at these higher resolutions or at the typical 1080/1200 that Nvidia concerns itself with?

A lot; I am guessing an almost linear relationship. Titan's 288GB/sec is 50% more memory bandwidth than the 680's 192GB/sec. Going beyond 3GB on the Titan is a waste of VRAM, though. Even 2 such cards aren't powerful enough to drive 3 monitors at 7680x1600 with 8xMSAA, which is where you'd start running out of 3GB of VRAM. You can't play Skyrim with ENB mods on that many monitors with only 2 Titans either. The cases where the 7970 runs out of its 3GB of VRAM in Skyrim ENB are when the card is a total slide-show; 50% more GPU performance won't even make a dent under such circumstances.

The 6GB of VRAM here is for marketing reasons. It justifies the price, since people love big specs even if they don't matter for real-world performance. I am also sure NV didn't want to face questions about why a $900 card shipped with less VRAM than a $600 one. That's another reason.
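To put rough numbers on where resolution and MSAA start to pressure 3GB, here is a crude render-target estimate (a sketch only: it counts just the color and depth buffers, ignoring textures, geometry and driver overhead, which is what actually fills the rest of the VRAM):

```python
# Approximate color + depth render-target footprint with MSAA.
def rt_megabytes(width, height, msaa_samples, bytes_per_pixel=4):
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples   # 32-bit depth/stencil
    return (color + depth) / 1024**2

print(f"7680x1600, 8x MSAA: {rt_megabytes(7680, 1600, 8):.0f} MB")  # ~750 MB
print(f"2560x1600, 4x MSAA: {rt_megabytes(2560, 1600, 4):.0f} MB")  # ~125 MB
```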
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
How much do you think the 680 is being held back by its 100GB/s less bandwidth at those resolutions?

20% faster with almost 50% more bandwidth... What do you think GK110 will do with considerably more TMUs and bandwidth, if the lowly 680 is only 20% slower?


I guess we'll find out soon! :p

Since it's twice as expensive, I hope it's a lot faster.

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Ya, that makes NV cards even worse value, since they depreciate like crazy on the used market if you don't sell them at the right time. Cards like the GTX280 or 480 lost an outrageous amount of resale value in 2 years. There are guys in Japan who bought a GTX580 3GB for less than $60 USD this month.

http://img831.imageshack.us/img831/7454/gtx5803gb.jpg
http://www.xe.com/ucc/convert/?Amount=5300&From=JPY&To=USD

That means in his country your GTX580 3GB x 2 would have lost $880 USD over 2 years of ownership. o_O



So spending $1,800 on a pair of 6GB GPUs for a potential benefit in games in 2014-2015? Ya, I'd rather pick up 20nm Volcanic Islands or Maxwell, or their refreshes, when next-gen games actually launch and make use of >3GB of VRAM, and save $800 in the process while playing console ports for most of 2013. Two $499 GPUs in 2014-2015 will mop the floor with two Titans bought for $1,800 today. You'll be buying $1,800 worth of GPUs to max out 2013 console ports like DmC, Dead Space 3, Bioshock Infinite and Tomb Raider? Crysis 3 and Metro LL may be the only 2 games in all of 2013 that could push the boundaries, but that's just 2 games. After what happened with the 7800GTX 512MB, 8800GTX / Ultra, GTX280 and 480, I would never recommend anyone spend $1,800 to "future-proof". I can understand going from a $250 GPU to a $400 one, but $1,800 to future-proof with the 20nm generation out next year? To each his own, I suppose.



Which games?

HD7970 3GB beats GTX680 4GB in every single game at 7680x1600 in this review. If 3GB isn't a bottleneck at 7680x1600, where will it be? I don't even think 2 Titans will be fast enough for 3x 2560x1600 monitors with 8x MSAA.
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/BF3_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/AW_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Crysis_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Deus_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/Dirt_02.png
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_680_4GB/MOHW_02.png

"That said, we have to wonder why anyone would bother with the GeForce GTX 680 4GB for extreme resolutions when the Radeon HD 7970 GHz Edition was constantly faster at both 5040x1050 and 7680x1600. In fact at 7680x1600 the 7970 GHz Edition was on average 20% faster than the GeForce GTX 680 4GB in the half dozen games that we tested with."

Too lazy to quote properly - forgive me, it is early here :)

Value loss:
I was talking about a 1-year period, not nearly 2+ years (280 -> 480, 580 until now), because by then a new process node usually becomes available and that changes things dramatically. So you really should compare apples only to apples.

If you want to include the second hand market:

Newegg lists the cheapest 7970 at $360 (including a $20 rebate) with 2 free games. The next card up in price clocks 19% higher, is quieter and costs only 8% more. At best you would be able to sell a 7970 bought for $549 at launch for $200-250, because the two games are missing. Even optimistically, the value loss amounts to $600 for two cards... in just 1 year... on the same manufacturing process (unlike your 280/480 or 580 examples, where new processes and significantly faster cards were available). I don't even want to know what these cards will be worth when the new parts launch, supposedly in Q4.
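The value-loss figure above, made explicit (a sketch using only the paragraph's own numbers; the resale range is the post's estimate):

```python
# Two HD 7970s: launch price vs. estimated resale after one year.
launch_price = 549
for resale in (250, 200):   # the post's estimated used prices
    print(f"Resale ${resale}: loss on two cards = ${2 * (launch_price - resale)}")
# $598 optimistic / $698 pessimistic -> the ~$600 for two cards above
```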

Let's get back to the actual argument: listed price in shops is what I meant. The used market is always another matter.

6GB:
Okay, maybe not "need", but take a look here:

http://www.pcgameshardware.de/Metro...o-2033-auf-4K-Monitor-in-4096-x-2160-1043768/

3.1GB already in a small level. Think open world, think texture mods.

http://videos.pcgameshardware.de/hdvideo/13041/NEU-Battlefield-3-auf-4KMonitor-in-4096-x-2160

2.5GB, but also on a relatively small map - no large multiplayer environment. Still fine, but for my taste too close to 3GB for comfort, given that this game is from 2011 and I want to keep the Titan until maybe 2014.

And don't forget, Titan launches now. I wouldn't expect Maxwell or Volcanic Islands before H2 2014; I think we'll get a refresh of current tech first. That is over a year of gaming fun AND a bit more future-proofing.
More memory isn't about avg fps only. Stuttering stemming from texture streaming can be annoying, too. Unfortunately, most reviews don't investigate this.
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
He was asking about bandwidth, not capacity; both are necessary for ultra-high resolutions/textures.

A 4GB GTX680 has the same memory bandwidth as a 2GB GTX680...

Interesting that it's OK to call the 680 bandwidth-limited now; when it was pointed out before as a reason to buy Tahiti, it didn't matter then. :confused:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
http://www.2compute.net/ASUS393069.htm

850€ before VAT, 1000€ after.

Bad time to live in Europe I guess.


BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Interesting that it's OK to call the 680 bandwidth-limited now; when it was pointed out before as a reason to buy Tahiti, it didn't matter then. :confused:

Context would be something you're missing.

We can't directly compare AMD's raw figures against Nvidia's, since AMD does less with more. We're just spitballing, trying to get an idea of where GK110 might stand in these more bandwidth-limited situations (for fun).

Relax, AMD is still #1 in our hearts and minds.

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Most of the stuff is normal European pricing - http://www.2compute.net/EVGA357152.htm (GTX 680).

Although that Titan price has to be early gouging or an early posting. The pre-VAT price is $1,150 CDN; they pay more for PC hardware in Europe though, particularly Nvidia cards.

This. The 690 dropped by about 100 Euros in a couple of days over here. I would expect 900 Euros a week after launch (unless availability is reeeeally bad and prices spike again).

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
3. The card should not be tested without a top-of-the-line CPU; it is very likely it will be CPU-bottlenecked across a large spectrum of games.

You don't have to make any special CPU provisions. At 1080/1200p, this would be no different from testing a GTX690. At much higher resolutions and in triple-monitor gaming, even a GTX690 Quad-SLI system is still GPU-bottlenecked in demanding titles. Whoever is getting more than 2 Titans is already running LGA 2011 with at least an overclocked 3930K. It would be highly doubtful for any review to test Titan in Tri-SLI on anything other than LGA 2011 with an overclocked 6-core i7. And if you are CPU-limited, there is nothing you can do anyway, since Haswell is still months away.
Last edited: