Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.

Page 49 - AnandTech Forums
Status
Not open for further replies.

maddogmcgee

Senior member
Apr 20, 2015
414
426
136
Nope, it's clearly a monopoly taking its toll. Nvidia increases prices because they can. Why wouldn't they, if the AMD options are way overpriced at the tail end of the mining boom and Intel makes crappy integrated graphics that can barely run many-year-old games.
 
  • Like
Reactions: Ken g6

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Nope, it's clearly a monopoly taking its toll. Nvidia increases prices because they can. Why wouldn't they, if the AMD options are way overpriced at the tail end of the mining boom and Intel makes crappy integrated graphics that can barely run many-year-old games.
I actually wonder if NV even takes notice of AMD any more.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
You managed to make the point better than I could. I wasn't born last night, and I'm fully capable of understanding a shift in naming. Benchmarks should sort this out. My prediction is that the 2070 and 2080 will occupy the slots vacated by the 1070 and 1080 respectively, while the 2080Ti gets bumped up to the old Titan slot, effectively skipping what would have been the xx80Ti spot, but again, this won't truly be sorted until we have more information on performance.

It's needlessly confusing, and I submit that it's also purposefully confusing. Using myself as an example, I always ignore the Titans simply because they're halo products that are always overpriced compared to their xx80Ti little brothers, yet I considered preordering a 2080Ti. In the end, I think this is going to backfire on Nvidia. They should have changed the naming entirely; the whole stack. Jay can insult his fans and Tom's can gush on about it all they want. It isn't going to change perception and precedent from 4 generations of branding. It's not just naming after all of this time. It's branding. It doesn't matter what slot it holds in the lineup, especially when Nvidia could have said as much, but chose not to. The 2080Ti, not the Titan T, not the 2090, not the 2080 Ultra, not the Nvidia Beastmode Ultra Uber Black, but the 2080Ti, costs $1200, and they did it deliberately, precisely because folks are going to allow them to get away with it by justifying a "naming shift."

People keep saying it's just a name shift, and you could make that case for the 2080Ti being a Titan T, but not for a 2080 being a Ti. The Ti has always been the big chip, not the 104 chip, so that argument doesn't hold water in this case.
 
  • Like
Reactions: crisium

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Your argument falls flat on its face because:

YoY, just gaming revenue.

During the Mining Craze there was NO such thing as "just gaming" revenue. Products normally have a life cycle where price cuts start happening soon after launch, but when you have a run on product like that caused by mining, not only are there no price cuts, there are price increases as well, and your production is running full tilt, which tends to be more efficient.

So "Golf Clap" revenues and margins are up during the mining craze.

I don't like price increases any more than the next guy, but there are more egregious price increases.

The GTX 1060's price went up while its die size decreased; that really has little excuse except a pure cash grab.

But when die sizes shoot up to the size of the next tier, you shouldn't be surprised to pay next-tier pricing. If you expect a company to just eat into margins to give you the same prices, you are living in a dreamland.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
First 3DMark results: the RTX 2080 is 6% faster than a reference 1080 Ti, while an AIB 1080 Ti is slightly faster than the RTX 2080.
https://videocardz.com/77763/nvidia-geforce-rtx-2080-3dmark-timespy-result-leaks-out

In line with my expectations. The 2080 FE not only has a higher boost than the 1080 Ti FE, but with a better cooler it can presumably maintain higher clocks longer as well. But AIB 1080 Ti's eat away at this clock advantage. And if the 2080 has the same 2-2.1 GHz barrier, then it's a real possibility that with both cards at max OC we will see a 1080 Ti winner in traditional rasterization games. And unlike the ground gained from OCs in previous generations, this time the x80 card has less VRAM than the older x80 Ti, so this will sting even more if it holds up on September 14th. I'm looking forward to reviews.
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,501
136
During the Mining Craze there was NO such thing as "just gaming" revenue. Products normally have a life cycle where price cuts start happening soon after launch, but when you have a run on product like that caused by mining, not only are there no price cuts, there are price increases as well, and your production is running full tilt, which tends to be more efficient.

So "Golf Clap" revenues and margins are up during the mining craze.

I don't like price increases any more than the next guy, but there are more egregious price increases.

The GTX 1060's price went up while its die size decreased; that really has little excuse except a pure cash grab.

But when die sizes shoot up to the size of the next tier, you shouldn't be surprised to pay next-tier pricing. If you expect a company to just eat into margins to give you the same prices, you are living in a dreamland.

This happens all the time in the business world. Supplier costs go up but MSRP stays the same. This is also exactly what would have happened if AMD had a competitive product on the market.
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,501
136
In line with my expectations. The 2080 FE not only has a higher boost than the 1080 Ti FE, but with a better cooler it can presumably maintain higher clocks longer as well. But AIB 1080 Ti's eat away at this clock advantage. And if the 2080 has the same 2-2.1 GHz barrier, then it's a real possibility that with both cards at max OC we will see a 1080 Ti winner in traditional rasterization games. And unlike the ground gained from OCs in previous generations, this time the x80 card has less VRAM than the older x80 Ti, so this will sting even more if it holds up on September 14th. I'm looking forward to reviews.

I don't remember where, but I read that TSMC's own comment on 12 nm was that it didn't give any performance benefits versus its 16 nm process. So I am assuming the max OC on the new generation will be roughly the same as the old.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
I don't remember where, but I read that TSMC's own comment on 12 nm was that it didn't give any performance benefits versus its 16 nm process. So I am assuming the max OC on the new generation will be roughly the same as the old.

12nm is 16nm with a bigger reticle limit. That's it. There are some other changes, but all of them are unremarkable from a technical improvement standpoint. Nvidia spun it like the next coming of Jesus, as it does with everything.
 
  • Like
Reactions: psolord

SMOGZINN

Lifer
Jun 17, 2005
14,359
4,640
136
People keep saying it's just a name shift, and you could make that case for the 2080Ti being a Titan T, but not for a 2080 being a Ti. The Ti has always been the big chip, not the 104 chip, so that argument doesn't hold water in this case.

It also fails because Nvidia is comparing the cards to the same-named cards of the previous generation. If they want to do a name change, along with a corresponding segmentation change, they need to let consumers know the new name of the segment they fall into and be consistent with it in their marketing. Nvidia is doing the exact opposite of that: they are still encouraging 1080 buyers to buy the 2080 and Ti buyers to buy the Ti, which means that Titan users will wait for the expected Titan chip next year. If this is just a name change to replace the Titan segment with the Ti segment, they are causing a lot of ill will among their consumers by failing to communicate it.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Well, Titan performance never was much different from the Ti's; you just got it earlier.

The question that'll potentially make the difference between indifferent and very good is DLSS.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
You people keep bringing up these inconvenient facts in your posts. It's really frustrating for me to handle.

Do I have to start all over again to explain why these prices are reasonable and must happen?

edit: Guess I was psychic while typing.

Go back further, and explain the price of the 8800 and the 8800 Ultra.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
This happens all the time in the business world. Supplier costs go up but MSRP stays the same. This is also exactly what would have happened if AMD had a competitive product on the market.

You mean like how the RX 480 kept NVidia from increasing prices on the GTX 1060 (despite the 1060 having a smaller die than the GTX 960)?

Nope... NVidia raised prices on the GTX 1060 despite a competitive AMD product already on the market. The GTX 1060 moved from $200 (GTX 960) to a $250 price point (a 25% increase).

Competitive AMD doesn't affect NVidia pricing.

AMD would need to be both significantly faster and significantly cheaper for NVidia pricing to be affected, and that won't happen. If AMD had a significantly faster product, they aren't going to price it cheaper, because, surprise, they also would like a healthy profit margin on their GPUs.

And no way on earth is anyone going to get into a price war on monster 754 mm2 dies.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Both companies have a long and storied history of releasing faster products for less money when they competed more equally.

I'm really getting a vibe from that post that you are saying there is no way, even with competition, that we could get better performance-per-dollar from NV than right now.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Source? Hopefully you know the actual prices of those wafers to back up your claims.

I don't have to know the exact wafer price today to know that the wafer price two and a half years later is not the same at the same node. TSMC 16nm and 12nm are the same node (same rules, same equipment, etc.); 16nm wafer prices today (H2 2018) are cheaper than in H1 2016. Yields today are also higher than in H1 2016. Everybody knows that, I don't need to back up anything here. ;)
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,501
136
You mean like how the RX 480 kept NVidia from increasing prices on the GTX 1060 (despite the 1060 having a smaller die than the GTX 960)?

Nope... NVidia raised prices on the GTX 1060 despite a competitive AMD product already on the market. The GTX 1060 moved from $200 (GTX 960) to a $250 price point (a 25% increase).

Why are you so obsessed with die sizes from a consumer perspective? The only thing that matters is perf/$ (I'm including less tangible metrics in perf here). By this logic, if company A makes a GPU that is 700 mm^2 and is half the performance of company B's 500 mm^2 GPU, well, we should still pay more for company A's GPU because the die size is bigger. I understand you're considering that the cost to manufacture the GPU has some play in what a company wants to price it at, but that's for the company execs to figure out; it should have no bearing on how much cost I, as a consumer, am willing to accept.

The problem with your GTX 1060 example is that AMD wasn't (and isn't) competitive. They had 2 products that performed well against their competition in pure performance (but still lacked in perf/W), but that is a small piece of the full product stack available to consumers. AMD can't be disruptive if they can only reach up to the lower side of the midrange of Nvidia's offerings. There was also that whole mining thing that prevented the normal price evolution from occurring.

Even in a normal market, if AMD priced the 580 at $150, what would that do? It might drop the 1060 and 1070 by $50, and the rest on up would stay the same. Until AMD can reach up and provide above-1080 (and now 2080) performance, they can't have nearly as significant an effect on pricing.

Competitive AMD doesn't affect ~~NVidia~~ intel pricing... If AMD had a significantly faster product, they aren't going to price it cheaper, because, surprise, they also would like a healthy profit margin on their ~~G~~CPUs.

And no way on earth is anyone going to get into a price war on monster ~~754 mm2~~ 8 core dies.

This is exactly what we heard about intel pricing for many years until, you know, AMD put out a competitive product stack from top to bottom and forced intel to drastically reduce their pricing structure. I wouldn't expect the difference to be nearly so stark in the GPU realm, but to say that AMD putting out an actually competitive product stack wouldn't force Nvidia price cuts is being willfully ignorant.
 
  • Like
Reactions: crisium

jpiniero

Lifer
Oct 1, 2010
17,205
7,580
136
I don't have to know the exact wafer price today to know that the wafer price two and a half years later is not the same at the same node.

IIRC, wafer prices tend to plateau though. And remember, 16 nm was nearly a year old by the time the 1080 launched, so it was a mature node even then.

I'm assuming the fake MSRP for the 2080/Ti was in part to discourage AMD from releasing a Vega 20 Frontier part.
 

jpiniero

Lifer
Oct 1, 2010
17,205
7,580
136
Why are you so obsessed with die sizes from a consumer perspective? The only thing that matters is perf/$ (I'm including less tangible metrics in perf here). By this logic, if company A makes a GPU that is 700 mm^2 and is half the performance of company B's 500 mm^2 GPU, well, we should still pay more for company A's GPU because the die size is bigger.

If you had a situation like that, Company A wouldn't even release the product.
 
  • Like
Reactions: PeterScott

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I will still be buying a few 2080's come the beginning of October, even with the price (and cost to produce) increase. Don't get me wrong, I dislike the new prices, but since I delayed upgrading many systems by ~16 months (and counting) while still putting money into the fund, I can absorb the price difference with no extra outlay. Might even splurge on a 2080 Ti for my main rig just for the hell of it.

You mean like how the RX 480 kept NVidia from increasing prices on the GTX 1060 (despite the 1060 having a smaller die than the GTX 960)?

Nope... NVidia raised prices on the GTX 1060 despite a competitive AMD product already on the market. The GTX 1060 moved from $200 (GTX 960) to a $250 price point (a 25% increase).

Competitive AMD doesn't affect NVidia pricing.

AMD would need to be both significantly faster and significantly cheaper for NVidia pricing to be affected, and that won't happen. If AMD had a significantly faster product, they aren't going to price it cheaper, because, surprise, they also would like a healthy profit margin on their GPUs.

And no way on earth is anyone going to get into a price war on monster 754 mm2 dies.

Didn't AMD increase the price of a card by $50 after launch a few years ago? AMD themselves partially screwed up their pricing by putting everything from the 460 to the 480 8GB too close together; then mining hit and the price people were prepared to pay went up. Nvidia (and no doubt AMD) saw that and will try to use it to boost margins and pad revenues... gotta loathe capitalism sometimes.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Why are you so obsessed with die sizes from a consumer perspective?

Because die size is the primary determinant of production costs and, in the end, retail pricing.

Look at right now, with falling GTX 1080 prices. Have Vega 64 prices followed?

GTX 1080 cards have better performance, are from the leading company (mindshare) and are selling for less.

Why hasn't competitive pressure dropped Vega 64 prices?

Everything from a consumer perspective says they should be selling for less than GTX 1080. But they aren't.

You have to look beyond the consumer perspective to understand why, and it's the production cost, driven primarily by die-size economics.

It isn't economically viable to sell that big chip card for the price of the small chip competitor. Ultimately they may be forced to sell remaining stock cheaper, but that will be at a loss, which isn't economically viable. They will probably soon end production of Vega consumer cards (if it already hasn't ended).

When dies get larger, production costs increase, and production cost increases are normally passed on to the consumer.
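To make the die-size economics concrete, here's a back-of-envelope sketch. Only the 754 mm² figure comes from this thread; the $6,000 wafer cost, the ~314 mm² comparison die, the defect density, and the yield model are all illustrative assumptions, not NVidia's actual numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Approximate gross dies per wafer (simple area formula with an edge-loss term)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defect_density_per_cm2=0.1):
    """Cost per working die under a simple Poisson yield model."""
    yield_frac = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# A GP104-class die (~314 mm^2) vs the 754 mm^2 die from the thread,
# on a hypothetical $6,000 wafer:
small = cost_per_good_die(314, 6000)
big = cost_per_good_die(754, 6000)
print(f"~{big / small:.1f}x cost per good die")  # ~4.2x with these assumed numbers
```

The big die hurts twice: fewer candidate dies fit on the wafer, and a larger fraction of them are killed by defects, which is why cost grows much faster than linearly with area.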
 

Timmah!

Golden Member
Jul 24, 2010
1,572
935
136
If it turns out that the ray trace cores only do Megarays just like AMD's GPUs, and their gigaray quote is from the tensor core upsample, they will have officially lost all decency in my book. So, I'm waiting it out. If this gigaray nonsense is a farce, there's no reason to go w/ them vs AMD, and with AMD opening up their software stack and having the same compatibility with Vulkan, it's them who I will invest resources with. Lastly, you can already do ray tracing on current Nvidia GPUs, it's just slower. For dev purposes, I'm going to focus on doing just that with Pascal. For gaming, I use Maxwell and have no performance issues. I'll upgrade my gaming rig in 2020, probably, when this idiocy comes back down to earth.

Obviously, I have no way to know, and I do not really understand these things anywhere near as well as you, but I think they might be serious with their Gigarays quote. However, it seems to me it is just some theoretical maximum peak value, not something achievable in any particular situation.

I am basing this feeling on quotes from Jules Urbach, the CEO of Otoy, in regard to Octane Render. He said the following:

"On Pascal OctaneBench was getting about 400 million rays a second and on Turing it is about 3.2 billion."

"We will see probably OctaneBench scores in the 700's or 800's, which will be pretty crazy" - A Quadro GP100 currently scores around 230 OB.

"So compared to a 1080, the RTX 6000 is already 10x faster in the OctaneBench interior path tracing test (which is a reliable world benchmark we added in Octane Bench 3). But in some cases like the Cornell Box (same scene as shown in the keynote), we are seeing more like 7 billions rays/second and that is approaching NVIDIA's maximum numbers."


https://www.youtube.com/watch?v=IJ77a0erU4w
https://www.youtube.com/watch?v=6l2vQ8eRbiY

Bottom line: don't know about gaming, but for pro apps like Octane this is gonna be absolutely awesome, at least it looks that way. That potential speed-up (6x, 8x, 10x...) is something unthinkable. I bought a 1080 two years ago for Octane (actually 2 of them); they would push 300 points tops in OctaneBench combined, so 150 per GPU. Compared to my previous GPU, a GTX 580 from 2010, which scored circa 60 points in OB, that was like a 2.5x speed-up in 6 years. From this POV, even the price increase does not look that bad :) Although obviously, I am not gonna defend it; I am as annoyed by it as anyone.
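Those quotes can be sanity-checked with quick arithmetic; every number below is taken from this post (Urbach's figures and my own OB scores), not from official specs:

```python
# Speedups implied by the numbers quoted above.
pascal_rays = 400e6        # OctaneBench rays/s on Pascal, per Urbach
turing_rays = 3.2e9        # OctaneBench rays/s on Turing, per Urbach
ob_gp100 = 230             # Quadro GP100 OctaneBench score
ob_turing = (700, 800)     # predicted Turing OctaneBench range
ob_gtx1080, ob_gtx580 = 150, 60  # the poster's own cards

print(f"ray throughput: {turing_rays / pascal_rays:.0f}x")  # 8x
print(f"OB vs GP100: {ob_turing[0] / ob_gp100:.1f}-{ob_turing[1] / ob_gp100:.1f}x")  # 3.0-3.5x
print(f"GTX 580 -> GTX 1080: {ob_gtx1080 / ob_gtx580:.1f}x over ~6 years")  # 2.5x
```

So the quoted figures are internally consistent: an 8x ray-throughput jump, but only a ~3-3.5x OctaneBench gain, since the benchmark isn't purely ray-bound.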
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,501
136
If you had a situation like that, Company A wouldn't even release the product.

Why did AMD release Bulldozer then? (Obviously my hypothetical was an exaggeration, but the situation of a significantly bigger core being significantly outperformed was true).
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
GTX 770 2GB: release day May 2013, die size 294 mm2 at 28nm, launch price $399

GTX 970 3.5GB: release day September 2014, die size 398 mm2 at 28nm, launch price $329

The GTX 970 at $329 was as fast as the GTX 780 Ti that launched at $699 one year earlier.

This is the normal thing to happen in computer hardware. We always get higher performance at a lower price 1-2 years later.
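A quick perf-per-dollar check of that example. The "as fast as a 780 Ti" equivalence is the post's claim, normalized here to 1.0, not a measured figure; prices are the launch MSRPs quoted above:

```python
# Perf-per-dollar for the two 28 nm cards from the post.
cards = {
    "GTX 780 Ti (2013)": (1.00, 699),  # (relative perf, launch price in USD)
    "GTX 970 (2014)":    (1.00, 329),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf units per $1000")

# Same performance at 329/699 ~= 47% of the price, i.e. ~2.1x the perf/$
# one year later on the same node.
```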
 
  • Like
Reactions: psolord and crisium