Nvidia RTX 2080 Ti / 2080 / 2070 information thread. Reviews and prices September 14.

Page 48 - AnandTech Forums
Status
Not open for further replies.

moonbogg

Lifer
Jan 8, 2011
10,736
3,454
136
Haven't been following the countless posts and updates. I was just wondering though, are these things worth $1300 yet?
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Haven't been following the countless posts and updates. I was just wondering though, are these things worth $1300 yet?
We won't know until the 14th or the 15th. They probably will be more powerful than we speculate, but as far as worth it, that's debatable considering the price has RTX baked in there.
 
  • Like
Reactions: moonbogg

TheF34RChannel

Senior member
May 18, 2017
786
310
136
Haven't been following the countless posts and updates. I was just wondering though, are these things worth $1300 yet?

Watched the video of the Bugatti Divo reveal today. All 40 units were sold before any customer even saw it - that's 5 million a pop - and suddenly the Ti doesn't look so expensive anymore :p

Edit: typo
 
Last edited:
  • Like
Reactions: moonbogg

gdansk

Diamond Member
Feb 8, 2011
4,795
8,095
136
Haven't been following the countless posts and updates. I was just wondering though, are these things worth $1300 yet?
We won't know until there are independent benchmarks and/or reviews. But I find it unlikely that any gaming graphics card could be worth $1300.
 
  • Like
Reactions: moonbogg

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
Watched the video of the Bugatti Divo reveal today. All 40 units were sold before any customer even saw it - that's 5 million a pop - and suddenly the Ti doesn't look so expensive anymore :p
Well, there are 7.2 billion idiots on this planet. Chances are some of them have a lot of money.
 
  • Like
Reactions: moonbogg

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Well, at least with a GPU you hopefully won't need to service it, haha. But yeah, $1200 plus tax for one component is steep considering we will all be back here next year chasing the carrot on a stick.

Looks like the RTX feature is just an on/off toggle here in Cyberpunk 2077.

https://i.imgur.com/6XOUrKT.png
 
Last edited:

Muhammed

Senior member
Jul 8, 2009
453
199
116
Those waiting on cheap 7nm GPUs will be seriously disappointed come next year.

With TSMC being the only vendor for 7nm in the near future, prices will be high and capacity will be limited and divided among too many customers: AMD, NVIDIA, Apple and others. So AMD will prioritize its 7nm share for Zen 2, plus some Vega Instincts as pipe cleaners, and that's it. Navi will get the shaft till capacity becomes plentiful and production costs reasonable.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,736
3,454
136
The thing that concerns me is 7nm pricing. It won't be any better if AMD doesn't come up with something. Part of me wants to ask Nvidia, "Hey, what about us Ti buyers who want an upgrade at the same price range?". I'd expect Nvidia to tell Ti buyers to either pay the $1300 or settle for the X80 class for $800. If we still don't like it we can pound sand and buy a Switch or something.

I honestly expect Nvidia to actually tell their "old" customers to go straight to hell with their $700 Ti expectations. Nvidia has enough people in the $1000+ category to more than make up for it. So, I am learning they don't actually need traditional high end gamers and aren't interested in selling us product anymore. They have to realize we won't settle for an X80 class at the same price point. I simply won't buy it and I am confident a lot of others won't either. They will lose the traditional high end gamer customers who aren't willing to either pay way more money or settle for a lesser product for the same money.

Seriously. If Starbucks suddenly raised the price of a frap to $12.00 and a regular iced coffee to $8.00, I wouldn't sit there and rationalize why it might be worth it still. I wouldn't consider anything about it. I'd simply not buy the product, lol. This will be just as easy for me. I'll chuck the whole hobby to the curb with prices like this. No problem. I tap out, easy.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Haven't been following the countless posts and updates. I was just wondering though, are these things worth $1300 yet?

Last debate I think was over the value of the size of the die. It's a given that a larger die will cost more to produce, but the value of the die has yet to be determined. I think those who are buying value the die size more than those who aren't buying. Maybe they're looking at die-size value as some form of psychological lube for when they bend over and take one for the team?

$1200-$1300 will buy you a complete gaming rig with a 1080Ti if you're a deal shopper.
 
  • Like
Reactions: moonbogg

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I think it's important to look at the history of Nvidia's GPUs to see just how out of control the pricing has gotten here.

Fermi: GTX 560 Ti (full GF114) - $249 MSRP
Kepler: GTX 680 (full GK104) - $499 MSRP
Maxwell: GTX 980 (full GM204) - $549 MSRP
Pascal: GTX 1080 (full GP104) - $599 MSRP
Turing: RTX 2080 (cut TU104) - $699 MSRP

Nvidia has inflated the numbering and prices on its second-tier (4-series) GPUs, to the point that they are selling what should be a $249 video card for $699. And we now don't even get the full chip!
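As a rough sanity check on the list above, here is a sketch (not from the thread) that divides each MSRP by the approximate die area widely reported for each chip; the die sizes are my assumptions, not figures from this post, and the metric ignores yield, memory, and board costs:

```python
# Rough $/mm^2 for Nvidia's second-tier (x04-class) GPUs.
# MSRPs are from the post above; die areas (mm^2) are approximate
# publicly reported figures, included here as assumptions.
cards = [
    ("GTX 560 Ti (GF114)", 249, 360),
    ("GTX 680 (GK104)",    499, 294),
    ("GTX 980 (GM204)",    549, 398),
    ("GTX 1080 (GP104)",   599, 314),
    ("RTX 2080 (TU104)",   699, 545),
]

for name, msrp, area in cards:
    print(f"{name:22s} ${msrp}  ~{area} mm^2  ~${msrp / area:.2f}/mm^2")
```

By this crude metric the 2080's price per die area actually comes out below the 1080's, which is roughly the counterargument raised in the replies that follow.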
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I think it's important to look at the history of Nvidia's GPUs to see just how out of control the pricing has gotten here.

Fermi: GTX 560 Ti (full GF114) - $249 MSRP
Kepler: GTX 680 (full GK104) - $499 MSRP
Maxwell: GTX 980 (full GM204) - $549 MSRP
Pascal: GTX 1080 (full GP104) - $599 MSRP
Turing: RTX 2080 (cut TU104) - $699 MSRP

Nvidia has inflated the numbering and prices on its second-tier (4-series) GPUs, to the point that they are selling what should be a $249 video card for $699. And we now don't even get the full chip!

Die area costs have gotten more expensive over time, at a MUCH higher rate than inflation; bigger dies cost much more than smaller ones, and cost rises faster than linearly with die size.

GP104 is a ~300mm2 die, while TU104 is ~500mm2.

Also, in case you haven't noticed, we have had a run on RAM pricing, and GDDR6 is the hot new specialty RAM that will carry premium pricing.

Pricing hasn't gotten out of control; things that cost more to produce are going to cost more to buy.
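The "faster than linear" claim can be illustrated with the classic negative-binomial yield model. The wafer cost, defect density, and clustering parameter below are made-up illustrative numbers, not figures from this thread:

```python
import math

def cost_per_good_die(die_mm2, wafer_cost, defects_per_mm2=0.001, alpha=3.0,
                      wafer_diameter_mm=300):
    """Cost of one good die under the negative-binomial yield model.

    yield = (1 + A*D0/alpha)^-alpha, and dies/wafer is approximated by
    wafer area / die area (edge losses ignored for simplicity).
    """
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies = wafer_area / die_mm2
    yield_frac = (1 + die_mm2 * defects_per_mm2 / alpha) ** -alpha
    return wafer_cost / (dies * yield_frac)

# Same (assumed) wafer cost for both; only die area changes,
# roughly GP104 ~300mm^2 vs TU104 ~500mm^2.
small = cost_per_good_die(300, wafer_cost=6000)
large = cost_per_good_die(500, wafer_cost=6000)
print(f"~300mm^2 die: ${small:.0f}, ~500mm^2 die: ${large:.0f}, "
      f"ratio {large / small:.2f} (vs {500 / 300:.2f} for area alone)")
```

With these toy parameters the larger die costs about 2x as much per good die, not the 1.67x that area alone would suggest, because the bigger die both wastes more wafer and yields worse.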
 
  • Like
Reactions: amenx

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
I think it's important to look at the history of Nvidia's GPUs to see just how out of control the pricing has gotten here.

Fermi: GTX 560 Ti (full GF114) - $249 MSRP
Kepler: GTX 680 (full GK104) - $499 MSRP
Maxwell: GTX 980 (full GM204) - $549 MSRP
Pascal: GTX 1080 (full GP104) - $599 MSRP
Turing: RTX 2080 (cut TU104) - $699 MSRP

Nvidia has inflated the numbering and prices on its second-tier (4-series) GPUs, to the point that they are selling what should be a $249 video card for $699. And we now don't even get the full chip!
You people keep bringing up these inconvenient facts in your posts. It's really frustrating for me to handle.

Do I have to start all over again to explain why these prices are reasonable and must happen?

edit: Guess I was psychic while typing.
 

maddogmcgee

Senior member
Apr 20, 2015
414
426
136
Had been following these threads for months while I considered which Nvidia card to buy. Ended up getting a cheap rx 580 8gb that I will underclock a bit. Putting the savings towards a week in Melbourne in a couple of weeks and could not be happier. Most of my games are CPU bound anyway.
 
  • Like
Reactions: moonbogg

TheF34RChannel

Senior member
May 18, 2017
786
310
136
The thing that concerns me is 7nm pricing. It won't be any better if AMD doesn't come up with something. Part of me wants to ask Nvidia, "Hey, what about us Ti buyers who want an upgrade at the same price range?". I'd expect Nvidia to tell Ti buyers to either pay the $1300 or settle for the X80 class for $800. If we still don't like it we can pound sand and buy a Switch or something.

I honestly expect Nvidia to actually tell their "old" customers to go straight to hell with their $700 Ti expectations. Nvidia has enough people in the $1000+ category to more than make up for it. So, I am learning they don't actually need traditional high end gamers and aren't interested in selling us product anymore. They have to realize we won't settle for an X80 class at the same price point. I simply won't buy it and I am confident a lot of others won't either. They will lose the traditional high end gamer customers who aren't willing to either pay way more money or settle for a lesser product for the same money.

Seriously. If Starbucks suddenly raised the price of a frap to $12.00 and a regular iced coffee to $8.00, I wouldn't sit there and rationalize why it might be worth it still. I wouldn't consider anything about it. I'd simply not buy the product, lol. This will be just as easy for me. I'll chuck the whole hobby to the curb with prices like this. No problem. I tap out, easy.

I selected your post because it sums up the cost debate in its entirety, so it's not a direct reply to you alone.

It appears that by keeping the names the same, Nvidia created this illusion of price hikes. If they'd named the Ti ultra/extreme/whatever, no one would have this impression; they'd know they were buying Titan-class performance.

The fact that the Ti arrived at launch, and that it uses TU102, are dead giveaways that in the past this card would have been named Titan.

One could say what's in a name, but this is a perfect example of how much a name can do to customers.

The internet seems divided between those who are aware of the above and those who only look at the name. No offence to anyone, just an observation.

The least Nvidia could have done is explain the current segments and naming. Better still would have been a new naming scheme.
 
  • Like
Reactions: PeterScott

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I selected your post because it sums up the cost debate in its entirety, so it's not a direct reply to you alone.

It appears that by keeping the names the same, Nvidia created this illusion of price hikes. If they'd named the Ti ultra/extreme/whatever, no one would have this impression; they'd know they were buying Titan-class performance.

The fact that the Ti arrived at launch, and that it uses TU102, are dead giveaways that in the past this card would have been named Titan.

One could say what's in a name, but this is a perfect example of how much a name can do to customers.

The internet seems divided between those who are aware of the above and those who only look at the name. No offence to anyone, just an observation.

The least Nvidia could have done is explain the current segments and naming. Better still would have been a new naming scheme.
All semantics. The end result can be seen in the margins. We will know next quarter who was wrong and who was right.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Die area costs have gotten more expensive over time, at a MUCH higher rate than inflation; bigger dies cost much more than smaller ones, and cost rises faster than linearly with die size.

GP104 is a ~300mm2 die, while TU104 is ~500mm2.

Also, in case you haven't noticed, we have had a run on RAM pricing, and GDDR6 is the hot new specialty RAM that will carry premium pricing.

Pricing hasn't gotten out of control; things that cost more to produce are going to cost more to buy.

What you never include in your examples is the time frame: the 16nm wafer price in 2016 is not the same as in 2018. So TU104 may be 500mm2, but the 12nm wafer price in 2018 is lower than the 16nm wafer price in 2016.
And don't say that 12nm is a new node, because it's not; it's just an enhanced 16nm with the same design rules and slightly higher density. So today, 500mm2 dies at 12nm are not more expensive than 300mm2 16nm dies were back in 2016. ;)
 

coercitiv

Diamond Member
Jan 24, 2014
7,491
17,907
136
It's not the naming that's the problem, it's the specs that come with that naming and how they relate to previous-gen hardware that's still on the shelves. To illustrate: if we take the jump in performance the 1070 offered relative to the 980Ti (stock vs. stock it was around 15% faster) and we apply it to the new Turing lineup... then it's not the 2070 that needs to compare favorably vs. the 1080Ti, it is... hold on... the 2060. That's what this naming offset means.

This is the real disconnect that creates all the friction: the weird angle in which we look at the 2080Ti as a Titan-class card (to justify pricing) but then turn around and compare the 2070 to the 1080Ti (to justify performance).
 

JasonLD

Senior member
Aug 22, 2017
488
447
136
What you never include in your examples is the time frame: the 16nm wafer price in 2016 is not the same as in 2018. So TU104 may be 500mm2, but the 12nm wafer price in 2018 is lower than the 16nm wafer price in 2016.
And don't say that 12nm is a new node, because it's not; it's just an enhanced 16nm with the same design rules and slightly higher density. So today, 500mm2 dies at 12nm are not more expensive than 300mm2 16nm dies were back in 2016. ;)

Source? Hopefully you know the actual prices of those wafers to back up your claims.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Die area costs have gotten more expensive over time, at a MUCH higher rate than inflation; bigger dies cost much more than smaller ones, and cost rises faster than linearly with die size.

Your argument falls flat on its face because:

YoY, just gaming revenue.

https://www.anandtech.com/show/11711/nvidia-announces-earnings-of-22-billion-for-q2-2018
Gaming revenue +51.8%

https://www.anandtech.com/show/12022/nvidia-announces-earnings-of-26-billion-for-q3-2018
+25%

https://www.anandtech.com/show/12418/nvidia-announces-record-q4-2018-results
+29%

https://www.anandtech.com/show/12741/nvidia-announces-record-q1-fy-2019-results
+66%

https://www.anandtech.com/show/13235/nvidia-announces-q2-fy-2019-results-record-revenue
+75.7%

Record revenue. Gross margin increases. The increases have little to do with die size. Perhaps Nvidia uses costs as an argument to justify price increases, but it's all about revenue, which impacts share prices the most.

The GTX 1080, which really accelerated the small-die, expensive-to-buy trend, released in Q2 2016. Look at this revenue graph: https://www.overclock3d.net/gfx/articles/2018/05/11100324577l.jpg

They went from $781 million in Q2 FY2017 to $1,723 million in Q1 FY2019. Nvidia's Q2 FY2017 results were announced in August 2016, right after the GTX 1080 release.

Earlier graph: https://s22.q4cdn.com/364334381/fil..._reports/2018/Rev_by_Mkt_Qtrly_Trend_Q118.pdf

Funny that the second graph shows huge year-over-year jumps starting in Q3 FY2017, which is the consumer Pascal year. Gaming is also the majority of revenue.

Clearly, they are having so many problems because of the ultra-high production costs of GDDR6, die size, and maybe Jensen's leather jacket!
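As a quick check of the revenue claim above, the two quarterly figures quoted imply gaming revenue more than doubled over those seven quarters; a minimal sketch:

```python
# Gaming-segment revenue figures from the post above, in $ millions.
q2_fy2017 = 781    # announced August 2016, just after the GTX 1080 launch
q1_fy2019 = 1723

growth = q1_fy2019 / q2_fy2017
print(f"Gaming revenue grew {growth:.2f}x (+{(growth - 1) * 100:.0f}%)")
# -> Gaming revenue grew 2.21x (+121%)
```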
 
Last edited:

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Sorry, but no. What you're seeing here is precisely the same lineup as mk1 Pascal had.

Big-chip Titan only for now.

Next year is either 7nm all round, or a cheap(er!) TU102-based card.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
  • Like
Reactions: Det0x

wanderica

Senior member
Oct 2, 2005
224
52
101
It's not the naming that's the problem, it's the specs that come with that naming and how they relate to previous-gen hardware that's still on the shelves. To illustrate: if we take the jump in performance the 1070 offered relative to the 980Ti (stock vs. stock it was around 15% faster) and we apply it to the new Turing lineup... then it's not the 2070 that needs to compare favorably vs. the 1080Ti, it is... hold on... the 2060. That's what this naming offset means.

This is the real disconnect that creates all the friction: the weird angle in which we look at the 2080Ti as a Titan-class card (to justify pricing) but then turn around and compare the 2070 to the 1080Ti (to justify performance).

You managed to make the point better than I could. I wasn't born last night, and I'm fully capable of understanding a shift in naming. Benchmarks should sort this out. My prediction is that the 2070 and 2080 will occupy the slots vacated by the 1070 and 1080 respectively, while the 2080Ti gets bumped up to the old Titan slot, effectively skipping what would have been the xx80Ti spot. But again, this won't truly be sorted until we have more information on performance.

It's needlessly confusing, and I submit that it's also purposefully confusing. Using myself as an example, I always ignore the Titans simply because they're halo products that are always overpriced compared to their xx80Ti little brothers, yet I considered preordering a 2080Ti. In the end, I think this is going to backfire on Nvidia. They should have changed the naming entirely, the whole stack. Jay can insult his fans and Tom's can gush on about it all they want; it isn't going to change perception and precedent from four generations of branding. It's not just naming after all this time, it's branding. It doesn't matter what slot it holds in the lineup, especially when Nvidia could have said as much but chose not to. The 2080Ti, not the Titan T, not the 2090, not the 2080 Ultra, not the Nvidia Beastmode Ultra Uber Black, but the 2080Ti costs $1200, and they did it deliberately, precisely because folks are going to let them get away with it by justifying a "naming shift."
 
Last edited:
  • Like
Reactions: crisium

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
If this were "precisely" the same lineup as mk1 Pascal had, including the Titan price point, then 2080 MSRP would be $599 and 2070 would be $379. The "Turing Titan" in "2080Ti" clothing is pulling all prices up.

Or inflation, technology cost, plain greed or whatever :) That's the cogent comparison anyway.

And, yes, it obviously is a bit confusing.
 