Post in thread '10 years of Nvidia Video cards. Ultra high end, high end, mid range, lower end, and


linkgoron

Platinum Member
Mar 9, 2005
I thought it was quite obvious. Instead of looking at die size, look at transistor count. Guess which one affects the performance of a GPU? Or do we not care about performance anymore?
I don't really understand how it works, so I did the following calculation for the last three node jumps: I took the largest single-GPU card, divided its process size (nm) by that of the next shrink's biggest card, squared the result, and multiplied by the transistor count. Transistor numbers are in millions.

Card      nm   Transistors (M)   Next       nm   Size diff   Expected   Actual
8800      90   681               280        65   1.917       1,305      1,400
285       55   1,400             480        40   1.891       2,647      3,000
580       40   3,000             Titan      28   2.041       6,123      7,080
Titan X   28   8,000             Titan XP   16   3.063       24,500     12,000

Info is from here: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
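
For anyone who wants to check the arithmetic, here's a quick Python sketch of the calculation (numbers straight from the table above; transistor counts in millions):

```python
# Expected transistor count after a node shrink, assuming die area
# stays the same: expected = old_transistors * (old_nm / new_nm)^2
cards = [
    # (old card, old nm, old transistors in M, new card, new nm, actual in M)
    ("8800",    90, 681,  "280",      65, 1400),
    ("285",     55, 1400, "480",      40, 3000),
    ("580",     40, 3000, "Titan",    28, 7080),
    ("Titan X", 28, 8000, "Titan XP", 16, 12000),
]
for old, old_nm, old_t, new, new_nm, actual in cards:
    scale = (old_nm / new_nm) ** 2
    print(f"{old} -> {new}: size diff {scale:.3f}, "
          f"expected {old_t * scale:,.0f}M, actual {actual:,}M")
```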

EDIT: taking the 65nm 280 instead of the 55nm 285 makes the 480 look a bit worse, but not by much; the expected number is a bit higher, at about 3,700. Added the 8800 just because I also checked it.

EDIT: well, when editing, the above looks like a table, but the formatting is ruining it for me :\
EDIT2: tried to improve it

However, I'm not sure I did it correctly, so please correct me if I'm wrong. If you want to compare it in other ways: the 260/285 -> 480 node shrink more than doubled the number of transistors, and the 580 -> Big Kepler shrink did the same, but the shrink from Titan X to Titan XP resulted in only 50% more transistors. So you're right that the number of transistors has increased, which is not very surprising given that it happens every generation. However, it actually increased less than in previous generations, so it does not refute my point at all. Note that the 580 to Titan transition is about in line with the 285 to 580 transition.

But as 96Firebird stated, transistor count has obviously increased. What this tells me is that die size isn't as important for determining performance as you think. Performance per watt has increased TREMENDOUSLY since Fermi, whilst die size obviously hasn't.

This goes to show where NVidia is focusing the brunt of their R&D effort: increasing performance per watt and performance per mm^2. Compared to AMD, they have a massive lead in those two areas.
Transistor count has always increased; see above. It is a straw man. Your statement about die sizes is also not quite true: obviously they are very important, as otherwise we wouldn't have 500-600mm^2 high-end cards generation after generation.

Otherwise, everything you've said is true. Nvidia's engineering has been exceptional, and they're essentially wiping the floor with AMD. (What does this say about AMD? That they're essentially only providing low-end cards, IMO. However, as we've been stuck at 1080p for quite some time, it doesn't really show.) It is not relevant to the discussion, however.

The only reason anyone considers the 1080 and Titan XP high-end and ultra-high-end, rather than mid-range and high-end, is mainly a lack of competition from AMD, and Nvidia using their great marketing machine (let's face it, people would react very negatively to a $1000 x80; it's much easier to introduce a "new" lucrative super-high-end "Titan" lineup, which is actually just the old x80 rebranded), brand recognition and loyal fanboys to jump on the opportunity and push their lower-end cards up the stack.

Anyway, I think this is enough... I think I've already made my point clear several times, including posting actual hard facts.
 

AtenRa

Lifer
Feb 2, 2009
I thought it was quite obvious. Instead of looking at die size, look at transistor count. Guess which one affects the performance of a GPU? Or do we not care about performance anymore?

You want transistor count?? Here we go...

2010/11

Tier 1 Ultra High-End : GTX 590 = 2x 3B or 6B transistors = 2x more than Tier 2
Tier 2 High-End : GTX 480 = 3B transistors = ~54% more than Tier 3
Tier 3 Middle-End : GTX 460 = 1.95B transistors = ~67% more than Tier 4
Tier 4 Low-End : GTX 450 = 1.17B transistors

2016

Tier 1 Ultra High-End : GTX 1090?? = 2x 12B or 24B transistors = 2x more than Tier 2
Tier 2 High-End : GTX Titan XP = 12B transistors = 50% more than Tier 3
Tier 3 Middle-End : GTX 1080 = 8B transistors = 82% more than Tier 4
Tier 4 Low-End : GTX 1060 = 4.4B transistors
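
A quick sketch to check those tier-to-tier ratios (transistor counts in billions, from the lists above; the "GTX 1090??" is the hypothetical dual card):

```python
# Tier-to-tier transistor ratios per year (counts in billions, from above)
lineups = {
    "2010/11": [("GTX 590", 6.0), ("GTX 480", 3.0),
                ("GTX 460", 1.95), ("GTX 450", 1.17)],
    "2016":    [("GTX 1090??", 24.0), ("Titan XP", 12.0),
                ("GTX 1080", 8.0), ("GTX 1060", 4.4)],
}
for year, lineup in lineups.items():
    for (hi_name, hi_t), (lo_name, lo_t) in zip(lineup, lineup[1:]):
        print(f"{year}: {hi_name} has {hi_t / lo_t - 1:.0%} "
              f"more transistors than {lo_name}")
```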

Too bad we don't have a dual GP102 as the Ultra High-End card, which some here believe the TITAN XP is.
 

DooKey

Golden Member
Nov 9, 2005
I just don't get this place. All this speculation and "facts". Here's a fact for all of you: NV and AMD don't give a damn what you "think" the pricing of their products should be. The crap going on here from both sides' fanatics is nothing but mental masturbation. The prices are the prices, and the market has proven that the prices are correct. Deal with it and vote with your wallet. Don't come here and think you are somehow going to change a dang thing.
 

Carfax83

Diamond Member
Nov 1, 2010
The only reason anyone considers the 1080 and Titan XP high-end and ultra-high-end, rather than mid-range and high-end, is mainly a lack of competition from AMD, and Nvidia using their great marketing machine (let's face it, people would react very negatively to a $1000 x80; it's much easier to introduce a "new" lucrative super-high-end "Titan" lineup, which is actually just the old x80 rebranded), brand recognition and loyal fanboys to jump on the opportunity and push their lower-end cards up the stack.

I think the most telling aspect of what constitutes ultra high end, high end, midrange etcetera is just performance. We can go on all day about die sizes and transistors, but it's ultimately performance that's going to drive consumer interest and purchasing. Titan XP is definitely an ultra high end product, because it is the ONLY single GPU card that can consistently break through the 4K 60 FPS barrier with most or all graphics-enhancing options enabled. Not for all games of course, but it can for most of them; and this is before overclocking. That's an impressive feat, because high FPS gaming at 4K is a very tough nut to crack. In fact, it won't be fully cracked until Volta, but at least Titan XP took a huge bite out of it. That is why the card has been selling so well, despite its exceptionally high price.

In fact, I would have bought one myself if it had fully mastered 4K. But since it didn't, I will wait for Volta. Same with the GTX 1080: it's an extremely capable 2K card that can give you 60 FPS and greater in the vast majority of titles at that resolution with all the eye candy options. No other card on the market will give you that, minus the Titan XP.

Lack of competition from AMD is also a factor, but if NVidia didn't have the giblets, so to speak, then it wouldn't matter. I think people underestimate how large the gap is between AMD and NVidia when it comes to performance and capability in their GPUs.
 

AtenRa

Lifer
Feb 2, 2009
I think the most telling aspect of what constitutes ultra high end, high end, midrange etcetera is just performance. We can go on all day about die sizes and transistors, but it's ultimately performance that's going to drive consumer interest and purchasing. Titan XP is definitely an ultra high end product, because it is the ONLY single GPU card that can consistently break through the 4K 60 FPS barrier with most or all graphics-enhancing options enabled.

What I'm personally talking about is not what the Ultra High-End tier is today, but what we were getting as Ultra High-End back in 2010.

Back in 2010/11, Ultra High-End was a dual-chip graphics card that was faster than any single-chip card available at the time. Most of the time it used two of the biggest (fastest) single chips, like 2x GF110.

Today we pay more for less: we pay Ultra High-End prices for less performance relative to 2010/11.

That's the problem with today's prices, not whether the TITAN XP is the fastest card and that makes it Ultra High-End today.

I wouldn't mind if today's Ultra High-End was a dual GP102 graphics card at $1200. That would be fine by me, although it's 2x more expensive than what an Ultra High-End card cost 6 years ago.

People should try and understand the difference here.
 

linkgoron

Platinum Member
Mar 9, 2005
I just don't get this place. All this speculation and "facts". Here's a fact for all of you. NV and AMD don't give a damn what you "think" the pricing of their products should be. The crap going on here from both sides fanatics is nothing but mental masturbation. The prices are the prices and the market has proven that the prices are correct. Deal with it and vote with your wallet. Don't come here and think you are somehow going to change a dang thing.

This is a thread about prices of cards "per tier", claiming that Nvidia's prices have not increased by much. However, not only have the prices in each of the tiers in the OP increased, the "tiers" themselves have moved, and I think the facts support this. Even the "transistor numbers" claim people were pushing nicely places the Titan XP as an x80 card (it even has fewer transistors than the Titan Z), and not as part of a new "larger super single-GPU high end" lineup, which never existed before.

The high-end x80 has moved from $500-$600 to $1200. The x60 cards have moved from $250-$300 to $600-$700. Also, you don't have to participate in this thread if you don't want to; it's about pricing, and that's what we're talking about.
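
In rough percentage terms (a quick sketch; the midpoints used below are my own simplification of the ranges above):

```python
# Approximate per-tier price movement, using midpoints of the ranges above
tiers = {"x80 tier": (550, 1200), "x60 tier": (275, 650)}
for tier, (old, new) in tiers.items():
    print(f"{tier}: ${old} -> ${new}, about {new / old - 1:.0%} more")
```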

I think the most telling aspect of what constitutes ultra high end, high end, midrange etcetera is just performance. We can go on all day about die sizes and transistors,
Well, I think we can at least agree that everything lines up with the Titan XP being a tier-jumped 1080?

but it's ultimately performance that's going to drive consumer interest and purchasing.
Sure, many users here feel the same. But when we're in a thread discussing how pricing has moved, IMO we have to discuss the fact that the mid-range cards of yesterday have become the high-end cards of today, and thus their pricing has increased by a very large amount (not to mention the previous high end). As already stated, considering everything, the Titan XP fits exactly into the old high-end tier, and not the super-high-end tier the OP puts it in.

Titan XP is definitely an ultra high end product, because it is the ONLY single GPU card that can consistently break through the 4K 60 FPS barrier with most or all graphics enhancing options enabled.
Indeed, it is an ultra high-end product because of the lack of competition. From a historical perspective, it is just a rebranded 1080 at double the price. Thus, prices have increased by a lot. Once, $500-$600 would get you the best single-GPU card, and the x90 lineup would get you two. Now, $650 gets you the second-best GPU, and $1200 gets you a cut-down best single-GPU card, which for all intents and purposes is just a modern version of the old $500-$650 card.

However, marketing-wise, even with all this talk about performance, I believe that without the Titan rebranding a $1200 1080 card would be hard to swallow for many people. As a 1080, instead of a Titan XP, I believe you wouldn't see it positioned as a super-high-end card, but as a high-end card. Moreover, it probably wouldn't sell as many units as it does under the Titan brand. Not to mention the current 1080/70/60 lineups.

Lack of competition from AMD is also a factor, but if NVidia didn't have the giblets so to speak, then it wouldn't matter.. I think people underestimate how large the gap is between AMD and NVidia when it comes to performance and capability in their GPUs..

As I've already said myself, Nvidia is wiping the floor with AMD. AMD is not nearly competitive even with Nvidia's mid-sized GPUs, and their best card (Fury X) is significantly slower than the 18-month-old AIB 980 Ti. Thus, it is not surprising that Nvidia has shifted their tiers upwards. AMD also somewhat benefits from Nvidia's pushed-up tiers, as they save face this way. How many 460 Polaris 10 cards would AMD sell vs. a 1060 GP104? Now they can sell them as 480 cards vs. a 1060 GP106, and they look like they're competing in the mid-range rather than the low/entry range. With the current structure and pricing, AMD looks a little better. Additionally, if Vega is ever released, they can price it according to the new market designed by Nvidia (similar to what they tried to do with the 390/Fury lineup).
 

Pariah

Elite Member
Apr 16, 2000
You want to know one major reason NVidia is beating AMD to a pulp right now? And why the hope that AMD is going to magically turn this around is wishful thinking at best? Here you go:

[Chart: Nvidia vs. AMD R&D spending over time]



It costs money to develop high-end tech products. A LOT of money. You people whining about Nvidia's pricing act like the only cost associated with producing a graphics card is the actual cost of the GPU die itself, and that producing a GPU of a given size is a fixed cost that never changes over time. Even worse, the prevailing argument is that with each node shrink it gets cheaper to produce dies of a given size. No, it doesn't. It never has. Node shrinks used to decrease the cost per transistor (with FinFET, not even that is true anymore), but that cost benefit is cancelled out by squeezing many more transistors into a given die space.

In fiscal 2016, Nvidia spent 1.413 billion dollars on R&D. Given that Nvidia's R&D budget goes almost exclusively to graphics products, it's a miracle that AMD, with a lower budget that also has to be split with CPU development, stayed as close to Nvidia as they did until recently. Having such a significant advantage in funding is only going to increase the gap between Nvidia and AMD going forward. Intel buried AMD 10 years ago with the release of the Core architecture, and AMD has never come close to competing at the high end since. We'll likely look back at Pascal as the moment Nvidia relegated AMD to an also-ran in the high end.

Just 5 years ago, Nvidia spent $951 million on R&D. That's a nearly $500 million increase in 5 years. Where do you think that money has to come from? There is simply no way Nvidia can afford that increase without raising the prices of their products.
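
For what it's worth, using the figures above (and assuming the $951 million figure is fiscal 2011, i.e. exactly five years earlier), that works out to:

```python
# Nvidia R&D spend: $951M five years before fiscal 2016's $1,413M
old_rd, new_rd, years = 951, 1413, 5  # millions of dollars
growth = new_rd - old_rd
cagr = (new_rd / old_rd) ** (1 / years) - 1
print(f"Increase: ${growth}M, about {cagr:.1%} per year compounded")
```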

You want Nvidia to never raise prices on GPUs? Be prepared to see GPU improvements grind down to the pace CPUs are at now, if we're lucky.
 

ultimatebob

Lifer
Jul 1, 2001
Yeah... I think that AMD needs a major cash infusion to be able to compete. Who might be interested in buying them at this point... Samsung, perhaps? Maybe Sony?
 

moonbogg

Lifer
Jan 8, 2011
What I am about to say is crazy, but hell, I'm Bogg, so it's cool, right? Take a look at the audacity Nvidia has shown with these blatant price increases. There are good reasons for it, like lack of competition, but regardless of that fact, if most companies were presented with such a plan to increase prices in this way, don't you think they would be afraid of a backlash or heavy criticism? If someone presented these ideas in a meeting, I think they would seem almost too outlandish to be credible, unless there was a damn good reason to trust the wisdom of the person presenting the price changes.
My suspicion, and it's only a suspicion because I see certain "fingerprints" here that I recognize from other industries, is that the wisdom behind the marketing and pricing strategies didn't come from a person, but from a powerful, analytical, strategizing artificial intelligence system. Call me crazy, but I have seen the difference between human design work and the advancements that come from collaborating with an AI system. It seems like they have an analytical computer that predicts human reactions and takes everything into account, a set of variables too great in number and complexity for any person to fully assess from every angle, and then produces a judgement. I'm just saying that's what it looks like to me.
 