GeForce Titan coming end of February


Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
Since when did the 7970 get called a midrange card? That's the first I ever heard of this. The garbage that gets spread around these forums amazes me. I swear we have instigators here that just keep pressing to get their point across.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Just curious, what do you think enthusiasts do on an enthusiast forum if they aren't discussing new hardware?

I myself actually enjoy discussions about new hardware, and as an enthusiast a very important topic is price/performance. If I'm to understand correctly people with differing views are supposed to stop contributing to the forums? That makes no sense.

Exactly. According to all the usual characters, unless this card makes you jizz your pants, you don't belong in this thread, and you're just mad because you're too poor to buy the card. Lol NVidiots never fail to deliver.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Since when did the 7970 get called a midrange card? That's the first I ever heard of this. The garbage that gets spread around these forums amazes me.

Since Balla decided that calling it midrange would further his opinions in this thread.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Since it was named GK104.
Look into it.

NVIDIA used the time to refine GK110 yields, due to AMD's GCN not scaring them.

Do people posting in the V&G forum have some kind of special short-term memory?

GF104 -> GF114 -> GK104
GF100 -> GF110 -> GK110

Simple as it gets...
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Since Balla decided that calling it midrange would further his opinions in this thread.

I actually never said the 7970 was a mid-range design (I said the 680 was), only that it performed similar to a mid-range product.

At nearly the same power consumption it's slower than the 680; at closer to flagship power draw it's only slightly faster.

[Chart: power_peak.gif - peak power consumption]


Peak is above the 580, last gen's high-end product. We should see Titan around this level of draw, only considerably faster than the GHz is (not enough hardware, clocks too high, poor perf/watt ratio due to high clocks instead of additional hardware).

If you have any questions as to what actual high end 28nm looks like, benches should be out soon.
 
Last edited:

Keromyaou

Member
Sep 14, 2012
49
0
66
Whether the $900 price tag is valid depends on the performance, and we don't have accurate performance data for Titan yet. This card can be useful for those who run games at 1440p, 1600p, or multi-display setups if the performance of the card is strong enough. If you run games at 1080p (60Hz), this card is pretty much irrelevant, since one GTX 680/HD 7970 is good enough for most games. If Titan's performance is 150-160% of the GTX 680, one Titan should be pretty much enough for 1440p or 1600p gaming and two Titans will be useful for multi-GPU setups. In that sense the $900 price tag doesn't sound outrageous at all. But if the performance is merely 120% of the GTX 680, then the price is too much, I think. We will see when the real performance data appear this week.
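
A rough sketch of that arithmetic, for what it's worth (the GTX 680 baseline price and the Titan performance multipliers below are just the figures floated in this thread, not confirmed numbers):

```python
# Back-of-the-envelope perf/$ check, assuming a ~$500 GTX 680 as the 1.0x baseline.
# The Titan price and the performance multipliers are the rumoured figures above.
gtx680_price = 500
titan_price = 900

for perf in (1.2, 1.5, 1.6):  # 120%, 150%, 160% of a GTX 680
    relative_value = perf / (titan_price / gtx680_price)
    print(f"{perf:.0%} of GTX 680 performance at ${titan_price}: "
          f"{relative_value:.2f}x the perf/$ of a single GTX 680")
```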
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Nvidia team is putting in some serious overtime here :D

I can't believe you're serious....AMD trolls are rampant, trying hard to discredit a GPU that's not even released....LOL, it's quite funny, if not farken annoying having to wade through the BS...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If the Geforce Titan 6GB version is released with an MSRP anywhere close to $849 USD, then that will be a realistic price considering that the HD 7970 GHz Ed. 6GB version sells for $599 USD, and the GTX 680 4GB version sells for $529 USD. The expectation here is that the 6GB Geforce Titan will have close to 40-50% faster performance on average than the 6GB 7970 GHz Ed. and 4GB GTX 680, respectively, and that this increase in performance from Geforce Titan will also be accompanied by an increase in performance per watt too.

That comparison is not really relevant since 6GB on HD7970GE is not beneficial in games (i.e., 6GB will not make a game playable when 3GB is unplayable on a $700 7970 6GB). If someone is buying those 7970 6GB cards, more power to them for wasting their $. The key selling point of the TOXIC 7970 6GB card was its overclocking/improved VRM/PCB circuitry, not 6GB of VRAM. Trying to hold on to the 6GB vs. 6GB card comparison assumes that the 6GB of VRAM is actually beneficial. It's not in games, which is why it's not a selling feature, just a marketing gimmick. The forum was consistent when it labelled the Sapphire Toxic 7970 6GB card a waste of $ and wished for a 3GB version of such a card. When the same was brought up regarding the Titan, people went into defensive mode. Since NV has the capability to do tricks with the memory controller/memory combination as we saw with GTX660Ti, in theory they could have released a cheaper Titan with less than the worthless 6GB of VRAM. Talking about this is not being biased, but simply a technical discussion and various benchmarks of cards with 2GB vs. 3GB vs. 4GB of VRAM were linked to show that 6GB of VRAM is overkill and an unnecessary waste.

Using the GTX 680 4GB's price of $530 as justification for the Titan's price assumes that the GK104 card's price is actually reasonable. It obviously isn't, since that card loses to a stock 1050MHz HD7970GE 3GB card in all cases where more than 2GB of VRAM is required.

Just because there are people buying $530 GTX 680 4GB and $600-700 HD7970GE 6GB cards doesn't mean those cards are reasonably priced against a $400 1GHz HD7970 3GB.

Again, this isn't about complaining about Titan's price at $900. It's already been stated many times that the price is perfectly fine for a limited edition halo card. It's about pointing out the hypocritical nature of certain members here who pick and choose when price/performance matters and when it doesn't. Since most of us don't have a money tree, it's usually a factor in our purchasing decisions. Discussion of price vs. performance should be brand agnostic, and certain people on our forum cannot grasp this simple concept. If high prices apply to the fastest single GPUs, and such high prices are acceptable for early adopter/latest-and-greatest reasons, that should also be brand agnostic.

How can we have an objective forum environment when people refuse to be objective? The Titan is rumoured to deliver about 40-45% more performance than the HD7970GE for a $500 price increase. The HD7970 delivered about 20% more performance than the GTX 580 for a $100-150 price increase. If high prices for the 8800GTX/280/7970 are a normal course of business for next-gen flagship GPUs, what was the point of all that bickering about the HD7970's price from these members who don't care about price/performance? If they did care about price/performance, why did their stance change regarding Titan's price? Next time we should all just ignore price/performance for next-generation flagships and save everyone the time and the pointless bickering that ensued!!
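
Laying that comparison out with the rough figures quoted above (the launch prices and uplift percentages are approximations, not measured data):

```python
# Generational perf/$ comparison using the approximate prices and uplift
# percentages cited in this post; treat every number as illustrative only.
launches = {
    "HD 7970 over GTX 580":       (450, 549, 0.20),   # ~20% faster for ~$100-150 more
    "Titan over HD 7970 GHz Ed.": (450, 900, 0.425),  # rumoured ~40-45% faster
}

for name, (old_price, new_price, uplift) in launches.items():
    perf_per_dollar_delta = (1 + uplift) / (new_price / old_price) - 1
    print(f"{name}: {uplift:+.0%} performance for {new_price / old_price - 1:+.0%} "
          f"more money, so perf/$ moves by {perf_per_dollar_delta:+.0%}")
```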

/ end rant
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Good thread - the 7970 launch thread in a mirror. Official nvidia slides leaked showing Titan to be half as fast as expected for double the price. Personal metrics of value are polarized, panic emerges, the defence force congeals, and hilarity ensues.

Personal guess based on nvidia's official slides is that reviews will give the card kudos for being the fastest single GPU - but extremely inefficient in perf/mm2 and perf/W, massively overpriced, and actually regressing the expected improvement in the perf/$ curve that new hardware usually delivers. Should be a good launch.
 

ams23

Senior member
Feb 18, 2013
907
0
0
That comparison is not relevant since 6GB on HD7970GE is not beneficial in games (i.e., 6GB will not make a game playable when 3GB is unplayable on a $700 7970 6GB).

The benefit, or even the potential benefit, of 6GB of RAM on a graphics card is irrelevant. What is relevant here is that NVIDIA is releasing a very high-end single-GPU graphics card with 6GB of RAM, and they need to price this card realistically against the closest 6GB card from their competition.

If someone is buying those 7970 6GB cards, more power to them for wasting their $.

Fine, but the fact remains that the 6GB Geforce Titan should be significantly faster than the 6GB 7970 GHz Ed., with better performance per watt and with better overclocking and overvolting headroom, so it would be completely unrealistic to expect NVIDIA to not charge significantly more in comparison.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I can't believe you're serious....AMD trolls are rampant, trying hard to discredit a GPU that's not even released....LOL, it's quite funny, if not farken annoying having to wade through the BS...

I agree with you. I like the potential this card has. I wish it had been released as a 680, would've been a lot more fun to play with over the 3 that I had. That price is bonkers though.

It's just kind of hypocritical that when the 7970 released with the price and performance increase it had over the 6970, the relentless NVidiots hammered it to death. The Titan is being released with what looks like a decent performance increase over the 680, but at an absurd price increase, and these guys now want to have Titan babies. It's strange.
 
Last edited:

Shmee

Memory & Storage, Graphics Cards Mod
Elite Member
Super Moderator
Sep 13, 2008
8,313
3,177
146
ams23, are you sure that the titan will have better OC and OV results? I am curious, as last I heard, nvidia was voltage locking their cards. If it is not voltage locked, and oc's well, that makes things a lot more interesting :D
 

Masahiro

Member
Oct 25, 2011
87
0
66
@RussianSensation you're expecting the internet to be objective, that's your first problem. Fact is consumers don't all follow trends, supply and demand is a rough science at best. In the end it's a normalized curve representing the average distribution of both sides, as such you'll always have outliers. For that matter, outliers on the high end tend to be the ones that frequent these forums. So you're already not dealing with either a decent representative sample or objective buyers (high end buyers are very opinionated).

Pricing is pricing; get over it, all of you. If you disagree, don't buy, and NVIDIA will get the message. If you're cool with the price, buy it and congratulations to you. Also, while they do conduct market research based on forums (this is definitely true), they also take it with a grain of salt, as with any market research. One of the inherent flaws with such research is that there's absolutely zero commitment from the subjects. They SAY they'll buy it at that price, but it doesn't mean they actually WILL. So even though a lot of people have expressed interest in buying it, that doesn't directly translate into sales. Much like supply and demand, it's a rough "science" at best.

Yeah, it's kind of pricey, but the price is somewhat in line with the performance gap between the 680 and 690 (at MSRP). You can obviously take this from the standpoint of, well, what about the 780 then? The problem with that assumption is we don't know the 780's performance or official pricing for certain (though it's reasonable to assume it would be the same as the 680 at launch; I believe it'll be slightly higher). You can't give the Titan such excessively negative remarks based on a product that hasn't launched yet (and may not launch this year).

As for people making comparisons to AMD and the 7970s and 7990s, I have one simple word for you all:

PREFERENCE

If you think the AMD cards are a better deal, then stick with them and enjoy. If you like NVIDIA, then go for it and enjoy; who cares. It's like buying a Porsche or a Maserati: they both do the same things, the prices are somewhat different, and it's all up to the buyer what's best. It's not only about numbers and price; there's an inherent bias everyone has towards one or the other, regardless of which may be better on paper. I, for example, would gladly pay for the Titan if the numbers are comparable to a 690 (which I currently use) because I prefer NVIDIA, I'm big on aesthetics (and the 690/Titan design is one of my favorites), I need the extra VRAM (1440p 120Hz), and it won't be too big a hit on my wallet after I sell my 690.

Everyone has their reasons, and no one specific person's opinion is absolute.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I agree with you. I like the potential this card has. I wish it had been released as a 680, would've been a lot more fun to play with over the 3 that I had. That price is bonkers though.

It's just kind of hypocritical that when the 7970 released with the price and performance increase it had over the 6970, the relentless NVidiots hammered it to death. The Titan is being released with what looks like a decent performance increase over the 680, but at an absurd price increase, and these guys now want to have Titan babies. It's strange.

Why would any "nvidia fanboy" care about the 7970's increase over a second-tier last-gen product?

Most of us love the hardware, we just don't like the price. We can discuss these things without throwing hissy fits and tantrums about what we feel may or may not have happened a year ago.

There are a lot of problems with it: it's late, it's expensive, there is only one version, it has 6GB of VRAM, the list goes on and on... We've covered them in great detail, and nobody is really arguing against them being negatives. Now we want to discuss the actual technology involved in bringing a product of this type to market, and the potential performance it may contain at stock and overclocked.

The difference between this launch and the 7970 launch is that the "supposed" Nvidia people don't deny basic facts about the product, unlike the 7970 people who said "highest performance cards are worth the premium" and gave every excuse under the sun and then made some up as to why they could "justify" a lowly 20-30% performance increase over an already overpriced GTX 580 for a next gen full node shrink product.


In short, if you feel personally wounded by what you believe happened a year ago, why are you doing the same thing you didn't like now? Why don't you try ending the cycle instead of enabling it?
 

ams23

Senior member
Feb 18, 2013
907
0
0
ams23, are you sure that the titan will have better OC and OV results? I am curious, as last I heard, nvidia was voltage locking their cards. If it is not voltage locked, and oc's well, that makes things a lot more interesting :D

Based on the leaked slides (http://videocardz.com/images/2013/02/GeForce-GTX-Titan-GPUB.jpg , http://videocardz.com/images/2013/02/GeForce-GTX-Titan-Presentation-3.png), yes, I do expect Geforce Titan to have improved OC + OV results vs. 7970 GHz Ed. and GTX 680. If you look at Anandtech's review on the 7970 GHz Ed., notice that Metro 2033 performance per watt for the 7970 GHz Ed. actually goes down as the card is overclocked (even with no adjustment to voltage). And if you look at Anandtech's review on the GTX 680 Classified, notice that Metro 2033 performance per watt for the GTX 680 Classified goes up as the card is overclocked, but then goes down as the card is overclocked + overvolted.
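
A simplified way to see why overclocking and overvolting move perf/W in different directions (this is a toy first-order model with made-up numbers; the review figures referenced above are measured, these are not):

```python
# Toy perf/W model: performance scales ~linearly with clock, dynamic power
# scales roughly with V^2 * clock. All figures are hypothetical.
def perf_per_watt(base_fps, base_watts, clock_scale=1.0, voltage_scale=1.0):
    fps = base_fps * clock_scale
    watts = base_watts * clock_scale * voltage_scale ** 2
    return fps / watts

stock = perf_per_watt(60, 250)                                        # hypothetical card
oc    = perf_per_watt(60, 250, clock_scale=1.10)                      # +10% clock, stock V
oc_ov = perf_per_watt(60, 250, clock_scale=1.15, voltage_scale=1.05)  # +15% clock, +5% V

# In this naive model an OC at stock voltage leaves perf/W unchanged; real cards
# drift up or down because performance never scales perfectly with clock.
# Overvolting always costs perf/W, since power grows with V^2 while performance does not.
print(f"stock {stock:.3f} fps/W, OC {oc:.3f} fps/W, OC+OV {oc_ov:.3f} fps/W")
```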
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
If high prices of 8800GTX/280/7970 are a normal course of business for next gen flagship GPUs, what was the point of all that bickering from these member who don't care about price/performance regarding HD7970's price?

It has been like that for flagship video cards since a very long time ago, and if memory serves me right, the X1900XTX for instance was around $649 when it sold.

But the main bickering came from what I believe is the after-effect created by AMD themselves during the period when they offered products priced relatively lower (or sometimes a lot lower, e.g. HD 4870 vs GTX 280) than the NVIDIA counterparts while maintaining 80~90% of the performance. This happened for about ~3 generations, which I tend to think created this expectation of perf/price, followed by "If AMD can do it, why not nVIDIA?" and so forth.

And then with the release of Tahiti, AMD put their original strategy aside and tried to re-assume the old flagship pricing model, where their highest part performed only 20% faster (at the time, according to AT) than the GTX 580 while weighing in at $549. This is where the bickering came from, as most consumers (on the forums anyway) were looking forward to the next-generation bang-per-buck card like the HD 4870/5870. If they had released the GHz version instead of trying to meet that 250W power envelope, things might have been quieter, since the high end would have been more of a "high end" so to speak.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It has been like that for flagship video cards since a very long time ago, and if memory serves me right, the X1900XTX for instance was around $649 when it sold.

But the main bickering came from what I believe is the after-effect created by AMD themselves during the period when they offered products priced relatively lower (or sometimes a lot lower, e.g. HD 4870 vs GTX 280) than the NVIDIA counterparts while maintaining 80~90% of the performance. This happened for about ~3 generations, which I tend to think created this expectation of perf/price, followed by "If AMD can do it, why not nVIDIA?" and so forth.

And then with the release of Tahiti, AMD put their original strategy aside and tried to re-assume the old flagship pricing model, where their highest part performed only 20% faster (at the time, according to AT) than the GTX 580 while weighing in at $549. This is where the bickering came from, as most consumers (on the forums anyway) were looking forward to the next-generation bang-per-buck card like the HD 4870/5870. If they had released the GHz version instead of trying to meet that 250W power envelope, things might have been quieter, since the high end would have been more of a "high end" so to speak.

Funny how things died.
Sweetspot = DEAD.
Monthly WHQL = DEAD.

I guess NVIDIA's way makes more money ;)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
It has been like that for flagship video cards since a very long time ago, and if memory serves me right, the X1900XTX for instance was around $649 when it sold.

But the main bickering came from what I believe is the after-effect created by AMD themselves during the period when they offered products priced relatively lower (or sometimes a lot lower, e.g. HD 4870 vs GTX 280) than the NVIDIA counterparts while maintaining 80~90% of the performance. This happened for about ~3 generations, which I tend to think created this expectation of perf/price, followed by "If AMD can do it, why not nVIDIA?" and so forth.

And then with the release of Tahiti, AMD put their original strategy aside and tried to re-assume the old flagship pricing model, where their highest part performed only 20% faster (at the time, according to AT) than the GTX 580 while weighing in at $549. This is where the bickering came from, as most consumers (on the forums anyway) were looking forward to the next-generation bang-per-buck card like the HD 4870/5870. If they had released the GHz version instead of trying to meet that 250W power envelope, things might have been quieter, since the high end would have been more of a "high end" so to speak.

There is some truth to this. There would be some blowback as AMD/ATI went back to the traditional pricing on a new single-GPU flagship. But if we're talking about what we saw on these forums, it was not even relevant, as it came from posters who don't even buy AMD and were just being disruptive because it was an AMD launch.

We're seeing something a lot worse here though: 25% faster than a 680 for $900. Almost doubling the price is unprecedented.

[Image: er_photo_184834_52.png - leaked nvidia slide showing Titan's Crysis 3 performance]



So now it looks like Titan is essentially what the GTX 680 was expected to be - the standard 60-70% increase over the GTX 580, of which the 680 only delivered half - except it's going to go for $900 now. And the above is a slide from nvidia, so we can expect this is best-case performance in Crysis 3. I think reviews will pay a lot of attention to the price and the performance delivered, and how relevant it is in the context of what looks to be the worst price/performance markup ever.

Really, it's looking like this is the card that should have been released 9 months ago for $500.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
OK so I was out all day and came back to visit this thread to find it pretty screwed up lol.

First, I don't think anyone in this thread believes this card won't be impressive. At the rumored performance increase of ~40% over the GTX 680, I find this card to be a real beast and worthy of the name Titan. However, I am not going to sit here and act like the price is not concerning for me, because what is supposedly the successor to the GTX 580 is going to cost a lot more than I am comfortable with.

In the grand scheme of things I am happy that this card is coming out into the consumer marketplace, but I am sad to learn it is going to be a $900 halo card that only a few people will get. Honestly, I think the price is so high because it's going to be a limited-supply card and nVidia feels that the GTX 680 is sitting well at its price point, so there's no reason to disrupt that. AMD doesn't seem to have anything lined up to counter Titan, which makes nVidia's position more understandable, but still, I am a sad panda.

As for the fanboys going back and forth, please stop throwing jabs at each other. I don't think anyone here is really a viral marketer; we are all just passionate GPU enthusiasts with different viewpoints. There is no reason to go at each other in an antagonistic way. Let's keep the trash talk out of this and actually speculate about the product itself.

:D
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
GF100 -> GF110 -> GK100 (failed to launch --> 1 year delay) --> GK110

Simple as it gets...

Fixed.

10,000 units at launch, 1 year late, a 6-month backlog of GK110 Tesla cards for corporate clients that only started shipping to Oak Ridge in October. If GK110 is a spiritual successor to GF110, why are there only 10,000 such cards? Are you telling us NV had only prepared 10,000 GTX480/580 chips for sale too? How can a limited edition card be the successor to mass-produced cards like the GTX480 and 580?

If NV can stamp out GK110s without issues, why aren't they launching 100,000 of them and bringing some down to $500-800 levels too?

This is what's really happening: NV is being outgunned in price/performance and single-GPU performance (HD7970 GE > GTX680) at almost all price levels on the desktop; they are losing the TWIMTBP vs. AMD GE war because developer optimizations have shifted from tessellation to GCN DirectCompute; they are losing key game bundles with big AAA titles to AMD's GE. NV knows they can only rely on their brand for so long before consumers smarten up and consider looking at more reasonably priced alternatives. Instead of waiting for the inevitable price drops on the GTX600 series, which would lower their margins and upset NV's shareholders as sales start to swing to the HD7000 series, they are going with a very clever marketing trick that has worked in the GPU space many times for them -- release the baddest single-GPU card as a halo card, and the average Joe still thinks NV has the fastest series on the market because it has the fastest single GPU.

For this reason as well, NV's viral marketing tried as hard as possible to spin the media by not accepting that HD7970GE > GTX680 as far back as June 2012, because they understand how important the perception of having the fastest single GPU is in the eyes of the average GPU buyer. NV didn't want the average Joe to think the GTX680 was no longer the class leader, which would have undermined NV's brand equity. Launching GK110 is a very effective and cheap marketing method to swing the momentum back to the GTX600 series without needing to drop their prices.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Fixed.

10,000 units at launch, 1 year late, a 6-month backlog of GK110 Tesla cards for corporate clients that only started shipping to Oak Ridge in October. If GK110 is a spiritual successor to GF110, why are there only 10,000 such cards? Are you telling us NV had only prepared 10,000 GTX480/580 chips for sale too? How can a limited edition card be the successor to mass-produced cards like the GTX480 and 580?

If NV can stamp out GK110s without issues, why aren't they launching 100,000 of them and bringing some down to $500-800 levels too?

This is what's really happening: NV is being outgunned in price/performance and single-GPU performance (HD7970 GE > GTX680) at almost all price levels on the desktop, and they are losing developer support and big AAA titles to AMD's GE. Instead of fearing that they will eventually have to start dropping prices on the GTX600 series, which would lower their margins and upset shareholders, they are doing a very clever marketing trick that has worked in the GPU space many times -- release the baddest single-GPU card as a halo card, and the average Joe still thinks NV has the fastest series on the market because it has the fastest single GPU. For this exact reason NV's viral marketing tried as hard as possible to not accept that HD7970GE > GTX680 as far back as June 2012, because they didn't want the average Joe to think the GTX680 was no longer the class leader, which would have undermined NV's brand equity.

Don't include me in your red herring ramblings, thx.

You didn't fix anything...you polluted my post.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Almost doubling the price and this is unprecedented.

Imho,

With competitive advantages or opportunities, companies showcase their predator fangs as they devour consumers and value.

This pricing is just another example of the evolutionary and incremental price/performance on a substantial and significant node and arch!

Defenders of premiums -- there it is: a higher MSRP than the GeForce 8800 Ultra!

However, the market sets pricing, as vocal views, especially mine, are irrelevant!
 