What are today's "mid-range" cards?


Shivansps

Diamond Member
Sep 11, 2013
When you say something, it is wise to say why you think that way.

What you have said is this: "Nvidia and AMD are killing the market with those prices, because that's what I believe."

Read the whole post. Because you appear to have read the first paragraph and immediately responded.

P.S. If manufacturing costs have nothing to do with product price, are you saying AMD and Nvidia should sell their products at a loss? ;)

Are you attempting to redefine how pricing works? Come on, I learned this in basic product development at university a few years back. You can use Google to find out how a price is determined; I've already told you the basics.

Another example of this is the mining craze: demand went to hell and beyond, and prices with it, and to make it worse the miners were still willing to pay. Now you are telling me that a shrinking dGPU market, the complete opposite, creates the same effect just because costs are a little higher? I'm not sure why it is so hard for you to accept that this is happening due to a duopoly and price fixing; all these effects are known and documented. Again, if you don't believe me, Google is your friend.
 

DrMrLordX

Lifer
Apr 27, 2000
We've basically covered the same arguments in the Navi thread. I've not much to add. It's going to be hard for people to justify gaming on PCs in the present or future if card prices keep tracking upwards but console prices don't.
 

railven

Diamond Member
Mar 25, 2010
Guys, it hit me. We are too old for this. The younger generation is being trained to accept the new prices as the norm.

Goodbye, my fellow gaming travelers; our time has passed. But if you can change your expectations, welcome to the new definitions of price/performance. You have to be reborn as a babe with an open, suggestible mind. Cast aside your old beliefs and accept the new reality.
IMO, as long as there are consoles, that is the real competition and the price floor. The % of people who will pay more than the cost of a console to play a game on PC instead of on the latest flavor of PlayStation or Xbox just isn't that big.

I told you already, we're a dying breed. Kids nowadays are yeetin' at upscaled, highly compressed streaming video/audio, and clearly gaming is next.
 

maddie

Diamond Member
Jul 18, 2010
The reason why AMD and Nvidia are charging more for GPUs is very simple. Manufacturing costs, design costs, and shrinking dGPU market.

We complain about the RTX 2070. But let's think about it. The GPU costs just 10% less to manufacture, in the best case, than the GTX 1080 Ti, because of its die size, and because GDDR6 costs around 25% more than GDDR5X. Design costs are not as high on the 12 nm process, but Nvidia did TU102, TU104, TU106, TU116 and TU117, each design costing between $100 and $150 million.

AMD? Design costs are around $250 million for the GPU alone. The die is relatively small, but each N7 wafer costs $12,500, compared to $6,500 for 12 nm, and we do not know the yields. Effectively, each N7 GPU can cost AMD the same as a 12 nm FFN GPU costs Nvidia. Plus there is the GDDR6 memory cost: at around $10 per memory chip, that is $80 in memory chips alone. Explain to me: if each GPU, with its PCB, shroud, packaging, etc., costs $200 to make, how do you expect AMD and Nvidia to sell those GPUs for $300 in a shrinking or flat market? Our only hope is that manufacturing costs will come down, and GPU prices alongside them.

Consoles can cost $500 because, even with 12 memory chips, that is $120 for GDDR6 memory, and you get a SINGLE die designed and yielded on the process, which effectively can cost $60-80. So around $200 for the WHOLE system, not counting the SSD. And the console market is actually not shrinking. Heck, current projections say it could even grow over the next 2-3 years, and both Sony and Microsoft may see growth in this space.

And one last thing: it cost TSMC $3,000,000,000,000 to develop the N7 process, twice what developing the 16 nm FF process cost them. The 5 nm process will cost $5,000,000,000,000. Where do you guys think TSMC will get that money back from, eh?
These quoted cost numbers are imaginary, and postulating a set of $ values to justify a position is deceptive.

If you do totally different designs with NO shared elements, you have to repeat the full design cost for each, but we all know that many of the functional elements are reused, just packaged in different combinations. Two designs sharing sub-units do not cost double; in fact, the more distinct product classes you can market, the lower the shared cost borne by each class. Do you really think it costs double to design a shader, a CU layout, a ROP, or a memory controller unit depending on whether it ends up in a 2080 or a 2060 part? Quadruple costs for the 2080 Ti, 2080, 2070, 2060? Really?

I firmly reject this attempt to justify present prices as unavoidable, and by extension the suggestion that we should be content.
 

Glo.

Diamond Member
Apr 25, 2015
I told you already, we're a dying breed. Kids nowadays are yeetin' at upscaled, highly compressed streaming video/audio, and clearly gaming is next.
God save us from game streaming...
 

Glo.

Diamond Member
Apr 25, 2015
These quoted cost numbers are imaginary, and postulating a set of $ values to justify a position is deceptive.

If you do totally different designs with NO shared elements, you have to repeat the full design cost for each, but we all know that many of the functional elements are reused, just packaged in different combinations. Two designs sharing sub-units do not cost double; in fact, the more distinct product classes you can market, the lower the shared cost borne by each class. Do you really think it costs double to design a shader, a CU layout, a ROP, or a memory controller unit depending on whether it ends up in a 2080 or a 2060 part? Quadruple costs for the 2080 Ti, 2080, 2070, 2060? Really?

I firmly reject this attempt to justify present prices as unavoidable, and by extension the suggestion that we should be content.
In my calculations I excluded the design costs from the manufacturing costs. The GPU dies for the RX 5700 XT and RTX 2070 cost AMD and Nvidia around $50-55 in the best case. Add $80-92 for GDDR6, depending on price, plus PCB, shroud, packaging, and shipping, and you end up at around $200 in manufacturing cost per card, before design costs.

How do people expect companies that are supposed to make money to sell them at $300?

And I am not happy with that either. But that is the reality we live in.
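
For what it's worth, here is that arithmetic as a minimal sketch. Every dollar figure is an estimate quoted in this thread, and the board/cooler line item is a placeholder picked to land near the quoted ~$200 total, not a known number:

```python
# Rough per-card bill of materials using the estimates quoted in this thread.
def bom(die, mem_chips, mem_chip_cost, board_misc):
    """Die + GDDR6 + PCB/shroud/packaging/shipping (board_misc is a placeholder)."""
    return die + mem_chips * mem_chip_cost + board_misc

# RX 5700 XT / RTX 2070 class: ~$50-55 die, 8 GDDR6 chips at ~$10-11.50 each.
low  = bom(die=50, mem_chips=8, mem_chip_cost=10.00, board_misc=55)
high = bom(die=55, mem_chips=8, mem_chip_cost=11.50, board_misc=55)
print(f"Estimated manufacturing cost: ${low:.0f}-${high:.0f}")  # ~$185-202
```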
 

Glo.

Diamond Member
Apr 25, 2015
Are you attempting to redefine how pricing works? Come on, I learned this in basic product development at university a few years back. You can use Google to find out how a price is determined; I've already told you the basics.

Another example of this is the mining craze: demand went to hell and beyond, and prices with it, and to make it worse the miners were still willing to pay. Now you are telling me that a shrinking dGPU market, the complete opposite, creates the same effect just because costs are a little higher? I'm not sure why it is so hard for you to accept that this is happening due to a duopoly and price fixing; all these effects are known and documented. Again, if you don't believe me, Google is your friend.
Your point is moot. You can find on Google the same proof I provided showing the REALITY that it is manufacturing costs that drive prices up.

I will ask you once again: where will TSMC get back the $3,000,000,000,000 they have put into developing the N7 node, if not by hiking silicon wafer prices?

How will TSMC get back the $5,000,000,000,000 they will put into developing the N5 process node, if not by hiking wafer prices EVEN MORE?

Why do you expect AMD and Nvidia to sell a GPU with $200 in manufacturing costs for $300, knowing that the DIY dGPU market is shrinking?

Companies want to break even on their investments as fast as possible, which is why they maintain the highest possible margins for as long as they can. Apply this rule to the N7 process and think about what we are facing from that point of view.
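
As a toy illustration of that break-even rule: both numbers below are placeholders, since TSMC publishes neither its node R&D cost nor its wafer volumes, and the development figure itself is questioned a few posts down.

```python
# How much node R&D would add to each wafer's price if amortized evenly.
def rnd_per_wafer(node_rnd_cost, wafer_volume):
    """R&D cost spread across an assumed number of wafers (both hypothetical)."""
    return node_rnd_cost / wafer_volume

n7_rnd = 3e9        # ~$3 billion development cost (the corrected figure)
wafers = 1_000_000  # hypothetical N7 wafer output over the payback period
print(f"R&D surcharge per wafer: ${rnd_per_wafer(n7_rnd, wafers):,.0f}")  # $3,000
```

On those made-up inputs, R&D alone would explain only about half of the quoted $6,500-to-$12,500 wafer price jump.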
 

amrnuke

Golden Member
Apr 24, 2019
I see that both times you ignored the RX 560 but included the similarly performing GTX 1050. If you included it, the average for AMD would be lower.
Well, you're 100% correct. But if we extend the range down to the RX 550 and GT 1030, then the mid-range cards land even lower.

AMD: RX 550 ($85, 17.9) -> Radeon VII ($700, 92.4) - price midpoint is $392, performance midpoint is 55.2 -- the midpoint price lets you buy a Vega 64 or a new RX 5700; the midpoint performance is right around an RX 580 ($170)

Nvidia: GT 1030 ($80, 13.0) -> 2080 Ti ($1100, 98.4) - price midpoint is $590, performance midpoint is 55.7 -- the midpoint price lets you buy an RTX 2070; the midpoint performance, though, is right between a 1060 and a 1660 ($170-$220)

So a mid-level (from a performance standpoint) card turns out to be even cheaper. Grab the RX 580!
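
A minimal sketch of the midpoint arithmetic above, using amrnuke's price and performance numbers:

```python
# Midpoint between the cheapest and most expensive card in each stack,
# in both price and the performance score used in the post above.
def midpoint(low, high):
    """Halfway point between two (price, perf) pairs."""
    return tuple(round((a + b) / 2, 2) for a, b in zip(low, high))

amd    = midpoint((85, 17.9), (700, 92.4))   # RX 550 -> Radeon VII
nvidia = midpoint((80, 13.0), (1100, 98.4))  # GT 1030 -> RTX 2080 Ti
print("AMD (price, perf):   ", amd)     # (392.5, 55.15)
print("Nvidia (price, perf):", nvidia)  # (590.0, 55.7)
```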
 

Paratus

Lifer
Jun 4, 2004
Your point is moot. You can find on Google the same proof I provided showing the REALITY that it is manufacturing costs that drive prices up.

I will ask you once again: where will TSMC get back the $3,000,000,000,000 they have put into developing the N7 node, if not by hiking silicon wafer prices?

How will TSMC get back the $5,000,000,000,000 they will put into developing the N5 process node, if not by hiking wafer prices EVEN MORE?

Why do you expect AMD and Nvidia to sell a GPU with $200 in manufacturing costs for $300, knowing that the DIY dGPU market is shrinking?

Companies want to break even on their investments as fast as possible, which is why they maintain the highest possible margins for as long as they can. Apply this rule to the N7 process and think about what we are facing from that point of view.

Um, Glo, are you suggesting TSMC paid $3 trillion for 7 nm? That seems excessive. Maybe $3 billion?
 

Glo.

Diamond Member
Apr 25, 2015
Um, Glo, are you suggesting TSMC paid $3 trillion for 7 nm? That seems excessive. Maybe $3 billion?
Sorry guys, I was writing that at 3 A.M., and I had only milk or green tea to keep myself from falling asleep. My bad. Yes, $3 billion.
 

crisium

Platinum Member
Aug 19, 2001
It's the second biggest chip from each company.

So 2070 Super and 2080 (and Super). RX 5700 and 5700 XT (not even counting Radeon VII necessarily, but just anticipating there will be a larger chip maybe next year).

Unfortunately the price has risen from $250 ($285 with inflation) for the GTX 560 Ti to $700 for the RTX 2080 Super in eight years, but it's still mid-range.
 

ubern00b

Member
Jun 11, 2019
In my calculations I excluded the design costs from the manufacturing costs. The GPU dies for the RX 5700 XT and RTX 2070 cost AMD and Nvidia around $50-55 in the best case. Add $80-92 for GDDR6, depending on price, plus PCB, shroud, packaging, and shipping, and you end up at around $200 in manufacturing cost per card, before design costs.

How do people expect companies that are supposed to make money to sell them at $300?

And I am not happy with that either. But that is the reality we live in.
Oh my god, you're correct. How the hell did they manage to release all those earlier mid-range cards at just $200 to $350, stagnating only once AMD could not match their performance? The poor buggers must have been making losses for years. But we know this is total BS, as Nvidia's shares have been increasing year on year for many years now, even disregarding when they went a step further and took advantage of the mining craze. Mid-range and entry level is where the profits are made: those cards will likely outsell the high end and flagship many times over, recouping all the R&D, manufacturing, and marketing costs, because they are sold in bulk to OEMs and the budget-friendly end users who make up most of the market.
 

crisium

Platinum Member
Aug 19, 2001
Way too many tiers. Ultra enthusiast even higher?

I guess when prices are so massive you have to add 10 tiers.

Easier in 2011, when it was $350-$500 (570-580) for High End, $200-$250 (560 - 560 Ti) for Midrange, and $150 for Mainstream (550 Ti).

That's now $1200-$2500 (2080 Ti - Titan) for High End and $500-$700 (2070S - 2080S) for Midrange. It's harder to define Mainstream with both TU116 and TU106 in play, but I guess we can include both and say Mainstream is $220-$400, which is a big-ass range, but it is what it is.

I'm not even gonna cry and say it should be 2011 prices, or be a corporate apologist and claim they need to jack up prices to avoid running at a loss; it's obviously a combination of chasing far larger profits while battling more expensive R&D and manufacturing. But the truth is that prices have increased 3.43-5x on the largest chip and 2.5-2.8x on the second-largest chip in eight years. And that's sad for consumers.
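
Those multiples follow directly from dividing the new price brackets by the 2011 ones; a quick sketch using crisium's numbers:

```python
# Price multiples over eight years, using the brackets quoted above.
def growth(old, new):
    """Low-end-to-low-end and high-end-to-high-end price multiples."""
    return new[0] / old[0], new[1] / old[1]

largest = growth((350, 500), (1200, 2500))  # GTX 570/580 -> 2080 Ti/Titan
second  = growth((200, 250), (500, 700))    # GTX 560/560 Ti -> 2070S/2080S
print(f"Largest chip:        {largest[0]:.2f}x - {largest[1]:.2f}x")  # 3.43x - 5.00x
print(f"Second-largest chip: {second[0]:.2f}x - {second[1]:.2f}x")    # 2.50x - 2.80x
```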
 

DrMrLordX

Lifer
Apr 27, 2000
And that's sad for consumers.

Consumers will have to stick up for themselves. Sadly, that means buying consoles rather than PCs for gaming. I'd rather see people just hold out for cheaper cards, but they aren't going to hold out. They're just going to walk away.
 

amrnuke

Golden Member
Apr 24, 2019
Way too many tiers. Ultra enthusiast even higher?

I guess when prices are so massive you have to add 10 tiers.

Easier in 2011, when it was $350-$500 (570-580) for High End, $200-$250 (560 - 560 Ti) for Midrange, and $150 for Mainstream (550 Ti).

That's now $1200-$2500 (2080 Ti - Titan) for High End and $500-$700 (2070S - 2080S) for Midrange. It's harder to define Mainstream with both TU116 and TU106 in play, but I guess we can include both and say Mainstream is $220-$400, which is a big-ass range, but it is what it is.

I'm not even gonna cry and say it should be 2011 prices, or be a corporate apologist and claim they need to jack up prices to avoid running at a loss; it's obviously a combination of chasing far larger profits while battling more expensive R&D and manufacturing. But the truth is that prices have increased 3.43-5x on the largest chip and 2.5-2.8x on the second-largest chip in eight years. And that's sad for consumers.
I'm confused. Why is the second-best card from each company the mid-range?

Nvidia has the 2080 Ti, 2080 SUPER, 2080, 2070 SUPER, 2070, 2060 SUPER, 2060, 1660 Ti, 1660, 1060, 1650, 1050, 1030, etc. Are you implying that once the 2080 SUPER is released, that's the new mid-level card? It's neither mid-level price nor mid-level performance.

What is mid-level performance to you? If you think the 2070 SUPER is mid-range, then you must think 4K 30+ fps at high settings in all modern games is "mid-range". What, then, is high-end? How are you defining these categories? Simply based on product stack? If it's just the second-best card, couldn't Nvidia "please the crowd" by only offering a 2080 Ti and a 1030 2GB - thus allowing all gamers to have a "mid-range" card for $80?
 

crisium

Platinum Member
Aug 19, 2001
Not the 2nd-best card, the second-largest chip. TU102 is the largest, TU104 the second largest (midrange), and TU106/TU116 the 3rd largest (I'm making an exception here because it's a unique situation, with TU106 being closer to TU104 than these chips usually are).

Technically it's not the middle chip, because there are more than three chips once you count the much smaller value and entry-level ones, currently the TU117 (GTX 1650). But that was true in 2011 as well: there were chips smaller than the GTX 550 Ti, which was the 3rd-largest chip.

What we're looking at now is product name inflation and price inflation. Also die size inflation in this scenario, but if you go back to Pascal, just 1-3 years ago, prices were still inflated and die sizes were actually smaller than in Fermi. The GTX 1080, which was $500-$700 across its MSRP/FE life span, was midrange.
 

amrnuke

Golden Member
Apr 24, 2019
Not the 2nd-best card, the second-largest chip. TU102 is the largest, TU104 the second largest (midrange), and TU106/TU116 the 3rd largest (I'm making an exception here because it's a unique situation, with TU106 being closer to TU104 than these chips usually are).

Technically it's not the middle chip, because there are more than three chips once you count the much smaller value and entry-level ones, currently the TU117 (GTX 1650). But that was true in 2011 as well: there were chips smaller than the GTX 550 Ti, which was the 3rd-largest chip.

What we're looking at now is product name inflation and price inflation. Also die size inflation in this scenario, but if you go back to Pascal, just 1-3 years ago, prices were still inflated and die sizes were actually smaller than in Fermi. The GTX 1080, which was $500-$700 across its MSRP/FE life span, was midrange.

Nvidia should eliminate all but their top and bottom size chips then. This would solve the issue, correct?

Nvidia could also, if they wanted, make a very poorly binned and very slow chip on a large process and large die and sell it for very cheap. That would solve the problem as well, correct?

Or, and I know this sounds crazy, perhaps we could evaluate what "mid-range" performance is, rather than worry about die sizes and product stacks?
 

crisium

Platinum Member
Aug 19, 2001
How does evaluating it off performance change anything?

The GTX 560 Ti was actually closer to the GTX 580 than the RTX 2080 Super is to the RTX Titan, in terms of shaders. If the 560 Ti was midrange, then so are the 2080S / GTX 1080 / GTX 980.

The 2080S has an even stronger claim to midrange than the 560 Ti did, since the 560 Ti was closer to the largest chip in the stack. Name and price inflation doesn't change the chip designation.

I'm largely consistent here based on chip sizes, and in terms of performance it mostly lines up too (at least if using TU116 as 3rd largest, leaving TU106 as an anomaly that I could see slotting into midrange). I'm not sure what your metric is, though.
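
A quick check of that shader comparison, using the published shader counts for each card (a sketch, not part of the original post):

```python
# Shader count relative to the flagship of the same generation.
# GTX 560 Ti: 384 vs GTX 580: 512; RTX 2080 Super: 3072 vs Titan RTX: 4608.
print(f"GTX 560 Ti / GTX 580:       {384 / 512:.1%}")    # 75.0%
print(f"RTX 2080 Super / Titan RTX: {3072 / 4608:.1%}")  # 66.7%
```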
 

amrnuke

Golden Member
Apr 24, 2019
How does evaluating it off performance change anything?

The GTX 560 Ti was actually closer to the GTX 580 than the RTX 2080 Super is to the RTX Titan, in terms of shaders, ROPs, and VRAM. If the 560 Ti was midrange, then so are the 2080S / GTX 1080 / GTX 980.

I'm largely consistent here based on chip sizes. I'm not sure what your metric is.
My metric is the end result: gaming performance. I'd consider the middle third of performance FPS-wise on esports and AAA games to be mid-range. Is there any fallacy to this logic, from your point of view? If so, why?

As for the results of looking at things from a middle-third FPS standpoint, the results are definitely at least a little different.

The 1030 struggles to run most modern games at more than a frame every now and then at Ultra 1080p; clearly that is low-end. The 2080 Ti makes mincemeat of nearly all AAA games at Ultra 1080p (CPU-limited in some games) and is clearly the top end. Using those markers, the middle third of FPS performance ranges from roughly the RX 590 and 1660 up to the 2060 and Vega 64. A price range of $200-400 can easily get you mid-range performance.

If you're looking for 1440p60 or 1080p144, what's wrong with a 1660 Ti or Vega 56?
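
amrnuke's scheme is easy to make concrete. A minimal sketch, with made-up FPS numbers standing in for a real benchmark average:

```python
# Classify cards into thirds of the performance range, as amrnuke proposes.
# The FPS figures are illustrative placeholders, not benchmark results.
cards = {"GT 1030": 14, "RX 590": 48, "GTX 1660": 52, "RTX 2060": 63,
         "Vega 64": 66, "RTX 2080 Ti": 110}

lo, hi = min(cards.values()), max(cards.values())
third = (hi - lo) / 3

def tier(fps):
    """Bottom/middle/top third of the range between slowest and fastest card."""
    if fps < lo + third:
        return "low-end"
    if fps < lo + 2 * third:
        return "mid-range"
    return "high-end"

for name, fps in cards.items():
    print(f"{name}: {tier(fps)}")
```

On these placeholder numbers, everything from the RX 590 up to the Vega 64 lands in the middle third, matching the post above.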
 

crisium

Platinum Member
Aug 19, 2001
1660 Ti and Vega 56 are good mainstream options.

I can't call a 1660 Ti midrange, though, when it has only 33% of the shaders of the fully enabled largest chip. It is the successor to the mainstream GTX 550 Ti, which had 37.5% of the shaders of the fully enabled largest chip.

The problem with your logic is that it's an opinion based on what counts as a good resolution, frame rate, settings, and game selection. We live in a world with 4K 144 Hz monitors where even the RTX Titan can't max games out at that res/fps. Is $2500 not high-end enough?

How does your metric compare to history? How do you classify 2011? Was the $149 GTX 550 Ti midrange to you then? It was pretty capable at 1680x1050, a common resolution back then. If so, you're at least consistent. Then what do you think of the 560 Ti?

But I have to disagree and say that tertiary chips like these, at approximately 1/3 of the largest chip, cannot be considered midrange.
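
The same ratio test for the tertiary chips, again using published shader counts (a quick sketch, not anything crisium posted):

```python
# Shaders as a fraction of the fully enabled largest chip of the generation:
# GTX 550 Ti (GF116, 192 shaders) vs full GF110 (512), and
# GTX 1660 Ti (TU116, 1536 shaders) vs full TU102 (4608).
print(f"GTX 550 Ti:  {192 / 512:.1%}")    # 37.5%
print(f"GTX 1660 Ti: {1536 / 4608:.1%}")  # 33.3%
```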
 

DeathReborn

Platinum Member
Oct 11, 2005
Going by Pascal, the 1080 was definitely NOT mid-range. The stack was 1030, 1050, 1050 Ti, 1060 3GB, 1060 6GB, 1070, 1070 Ti, 1080, 1080 Ti, Titan. Going by that, the 1060 3GB and 1060 6GB were mid-range, with the 1070 being a borderline mid-range/performance card. If you split it further you end up with:

Entry - 1030
Mainstream - 1050, 1050 Ti
Mid Range - 1060 3GB, 1060 6GB
Performance - 1070, 1070 Ti
High End - 1080
Enthusiast - 1080 Ti
ePeen Compensator - Titan

Wiki says the 1050-1060 are Mid Range, with no Mainstream or Performance tiers. Perhaps a site like AnandTech should do an in-depth article on the state of the graphics industry, how it's changing, and what to expect in the near future. We enthusiasts are good at coming up with arguments (perf/W and die size are good examples) but not good at seeing the whole picture.
 

crisium

Platinum Member
Aug 19, 2001
For Pascal I'd say:

Entry - 1030
Value - 1050, 1050 Ti
Mainstream - 1060 3gb, 1060 6gb
Mid range - 1070, 1070 Ti, 1080 (the 1070 Ti is closer to the 1080 than to the 1070... lump them all together or pair the 1070 Ti and 1080).
High end - 1080 Ti, Titan X (Pascal), Titan Xp

But we are just disagreeing over words. I'll use your tiers, that's fine. Then in 2011 we had:

Entry - GeForce 510
Mainstream - GT 530
Mid range - 550 Ti, GT 545
Performance - 560, 460 768mb
High End - 560 Ti
Enthusiast - 570, 560 Ti 448 Cores
ePeen Compensator - GTX 580

Still doesn't change the sad state of affairs: $250 -> $700 from the 560 Ti to the 1080 (FE launch price).
 

amrnuke

Golden Member
Apr 24, 2019
1660 Ti and Vega 56 are good mainstream options.

I can't call a 1660 Ti midrange, though, when it has only 33% of the shaders of the fully enabled largest chip. It is the successor to the mainstream GTX 550 Ti, which had 37.5% of the shaders of the fully enabled largest chip.

The problem with your logic is that it's an opinion based on what counts as a good resolution, frame rate, settings, and game selection. We live in a world with 4K 144 Hz monitors where even the RTX Titan can't max games out at that res/fps. Is $2500 not high-end enough?

How does your metric compare to history? How do you classify 2011? Was the $149 GTX 550 Ti midrange to you then? It was pretty capable at 1680x1050, a common resolution back then. If so, you're at least consistent. Then what do you think of the 560 Ti?

But I have to disagree and say that tertiary chips like these, at approximately 1/3 of the largest chip, cannot be considered midrange.

Right. Again, I don't care how many shaders it has. All I care about is which currently available cards fall in the middle third of performance, which is what I consider mid-range.

I'm not sure how I classify 2011. Perhaps if you pointed me to a mega-benchmark from 2011 comparing the then-available cards, we could deduce which ones performed at mid-range levels.