RTX Sales Reported as Disappointing

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Why wouldn't AMD bin and name the chips themselves? Vega VII / Vega VII XT or whatever for the better performing chips.

Cost. Binning every chip gets expensive, and it ultimately results in some chips not passing spec. Those that don't pass either have to be tossed, or relegated to some other product. By setting the voltage higher, they know every chip they have will work, so they lose less money from throwing some away.
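To make that tradeoff concrete, here's a rough back-of-the-envelope sketch. Every number in it (test cost, fail rate, prices) is invented purely for illustration; it just shows why a premium bin has to out-earn both the testing overhead and the scrapped parts:

```cpp
// Back-of-the-envelope binning economics. All numbers are invented
// for illustration; this is not AMD's actual cost structure.
#include <cstdio>

int main() {
    const double chips      = 10000.0; // chips in a production run
    const double test_cost  = 4.0;     // extra per-chip cost of fine binning ($)
    const double fail_rate  = 0.15;    // share failing the tighter low-voltage spec
    const double price_xt   = 115.0;   // hypothetical premium "XT" bin price ($)
    const double price_base = 100.0;   // single-SKU price at a safe, higher voltage ($)

    // Option A: bin every chip, sell the passers as a premium SKU,
    // toss (or relegate to another product) the failures.
    const double binned = chips * (1.0 - fail_rate) * price_xt - chips * test_cost;

    // Option B: raise the stock voltage so every chip qualifies,
    // and sell the whole run as one SKU.
    const double single = chips * price_base;

    printf("Option A (bin + premium SKU): $%.0f\n", binned); // $937,500
    printf("Option B (one SKU, high V):   $%.0f\n", single); // $1,000,000
    return 0;
}
```

With these made-up numbers the conservative single SKU wins; the premium bin only pays off if buyers will pay a big enough premium to cover the test cost and the chips that miss the spec.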
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Cost. Binning every chip gets expensive, and it ultimately results in some chips not passing spec. Those that don't pass either have to be tossed, or relegated to some other product. By setting the voltage higher, they know every chip they have will work, so they lose less money from throwing some away.
Why not just call it an SE and clock it lower? How much can that cost? They test every chip anyway.
 

sandorski

No Lifer
Oct 10, 1999
70,099
5,639
126
Why not just call it an SE and clock it lower? How much can that cost? They test every chip anyway.

They are just biding their time until Navi, so they released this token card just to remind users they still exist. Binning and extra models just add costs and effort that aren't worth it.

Undervolting is really just a feature/characteristic like overclocking. Perhaps one day the technology will exist wherein a GPU will dynamically change voltage based upon its unique undervolting capability, much like how clock speeds are dynamic today.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
They are biding their time right when NV has given them an opening.

It's allowing NV to get away with a mistake.
 

KevinH

Diamond Member
Nov 19, 2000
3,110
7
81
In a shock to basically nobody, Nvidia is facing some downturn here, blaming it on the delayed 2060 and such.

I think the 2070 and 2080 are fairly unappealing products, and of course the 2080 Ti is even more ludicrously expensive. Beyond metrics like delays and such, the worst issue facing Nvidia is competition from their own 10xx owners, who honestly have nothing worth buying in the same price segments. Buyers typically hang about in their comfort zone, e.g. a $250 buyer isn't suddenly going to feel like tossing $500+, a $500 buyer isn't suddenly feeling great about dumping $1200, etc.

The failure to increase performance per $ is going to make this a dark time for Nvidia, gamers, and the state of PC gaming as a whole until this cloud passes.

Shocking ... well, not that shocking


Yep, this is EXACTLY the case for me. I've never gone over $300ish since I got into gaming with the Voodoo 2.
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,855
136
Undervolting is really just a feature/characteristic like overclocking. Perhaps one day the technology will exist wherein a GPU will dynamically change voltage based upon its unique undervolting capability, much like how clock speeds are dynamic today.
There are a number of hardware techniques that allow lowering the voltage margins. AMD used AVFS in their last Bulldozer iteration, Zen, Polaris and, as far as I know, Vega.

What AVFS (Adaptive Voltage and Frequency Scaling) does is introduce a way to quickly alter clocks to keep the chip stable under various workloads. On average, clocks drop by just 1-3%, but that allows for a 10% drop in voltage.

[Image: AMD slide on AVFS in Polaris]
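To see why that trade is worth making: dynamic power scales roughly with V²·f, so savings are quadratic in voltage but only linear in frequency. A quick sketch using the percentages quoted above (the V²·f rule of thumb ignores static leakage):

```cpp
// Rule-of-thumb dynamic power scaling: P is proportional to V^2 * f.
// Percentages are the ones quoted above; static leakage is ignored.
#include <cstdio>

int main() {
    const double v = 1.0 - 0.10; // 10% lower voltage
    const double f = 1.0 - 0.02; // ~2% lower average clock

    const double relative_power = v * v * f; // stock power normalized to 1.0
    printf("relative dynamic power: %.2f (~%.0f%% savings)\n",
           relative_power, (1.0 - relative_power) * 100.0);
    // Prints ~0.79: roughly 21% less dynamic power for ~2% less clock.
    return 0;
}
```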

The irony here is that the very tech that in theory helps with lowering stock voltage margins may be the one that allows excellent undervolting results and further fuels the perception that AMD cards have inefficient default voltage tables. It's not just the user finding a good undervolt, it's also the card adapting to that voltage, doing more to stay stable within razor-thin margins.

As far as I can tell, AVFS does a great job of trying to keep my Vega card stable while I undervolt the chip: it's the first card I've seen that varies clocks not only based on frequency settings, but based on voltage as well - with aggressive undervolting for a given target frequency, it will lower clocks relative to that frequency while trying to keep the card stable under heavy loads. This makes things easier once you know what to expect from the card, since you gradually push down the voltage under load until the frequency starts to drop, then adjust a bit back up if you want to minimize clock loss. The results are quite staggering, especially if you're willing to give up 10% in clocks. My Sapphire Pulse running at 1350 MHz @ 0.9V keeps fans around 800 rpm to maintain a 75C core temp in mining, let alone gaming.
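That tuning procedure (step the voltage down under a sustained load until the card starts shaving clocks, then back off a step) is essentially a small search loop. Here's a sketch of it: set_target_voltage() and measure_avg_clock() are hypothetical stand-ins for whatever your tool of choice (WattMan, OverdriveNTool, etc.) exposes, and the 880 mV "stability knee" is a toy model so the sketch runs standalone, not real Vega behavior:

```cpp
// Sketch of the manual undervolt search described above. The two helpers
// are a toy model of the hardware so the sketch runs standalone: below a
// made-up 880 mV knee, the card trades clocks for stability (AVFS-style).
#include <cstdio>

static double g_mv = 1000.0;

void set_target_voltage(double mv) { g_mv = mv; }

double measure_avg_clock() {
    const double knee = 880.0;                          // invented stability knee
    return g_mv >= knee ? 1350.0
                        : 1350.0 - (knee - g_mv) * 2.0; // clocks sag below it
}

int main() {
    const double target_mhz = 1350.0; // clock you want the card to hold
    const double step_mv    = 10.0;   // voltage step per iteration
    const double slack_mhz  = 5.0;    // acceptable clock shortfall

    double mv = 1000.0;               // start near stock voltage
    while (mv > 800.0) {
        set_target_voltage(mv);
        if (measure_avg_clock() < target_mhz - slack_mhz) {
            mv += step_mv;            // card started sagging: back off one step
            set_target_voltage(mv);
            break;
        }
        mv -= step_mv;                // still holding target: push lower
    }
    printf("settled at %.0f mV\n", mv);
    return 0;
}
```

With the toy model it settles at 880 mV; on real hardware you'd run a heavy, sustained load the whole time and let each step soak long enough to trust the average clock reading.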

I haven't looked for details on how Nvidia manages voltage margins in their cards, especially considering they effectively lock undervolting through drivers (workarounds exist, but they're messy). However, given their efficiency, I would expect them to use similar, if not more polished, techniques. AMD can probably do a lot better too.
 

Guru

Senior member
May 5, 2017
830
361
106
We've known that AMD has been on a very limited budget in the lead up to Ryzen. It's quite likely that they set voltage levels at a place where almost all of their silicon would be able to hit a particular clock speed. They didn't have the resources to bin more aggressively or create specific parts that would remove the worst 15% of performers and allow for more sane voltages in the remaining chips.

The other problem is that GCN has pretty much hit a wall. They need a new architecture that's been designed from the ground up. Supposedly we'll get that after Navi based on old roadmaps. Hopefully now that AMD has competitive CPU designs, they'll be able to afford more R&D for their graphics division.

Where is this falsehood coming from? AMD are NOT using GCN in Vega, it's a new architecture. Heck, even GCN 1-4 are very different from one another. There is no comparison between HD 2000 and RX 400/500 series, no way!

Jesus Christ, who the hell goes by name in terms of architecture differences? You have to be EXTREMELY ignorant to do so! Just because Nvidia has different names doesn't mean they have 100% new architectures every 2-3 years. Their latest Turing architecture is essentially straight out of their GTX 200 series. There was a big shift with the GTX 400; from then on it's been steady improvements. Turing is basically the end result of 10 years of steady improvement over the same architecture introduced in the GTX 200 series, which was their first "unified shaders" architecture.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
There are a number of hardware techniques that allow lowering the voltage margins. AMD used AVFS in their last Bulldozer iteration, Zen, Polaris and, as far as I know, Vega.

The irony here is that the very tech that in theory helps with lowering stock voltage margins may be the one that allows excellent undervolting results and further fuels the perception that AMD cards have inefficient default voltage tables. It's not just the user finding a good undervolt, it's also the card adapting to that voltage, doing more to stay stable within razor-thin margins.

AVFS is kind of the devil. Pre-AVFS cards, like Hawaii-based 390s, were much easier to deal with since they didn't vary clockspeed. You could dial in clocks versus voltage on a per-card basis and get some good results. My Vega FE liked to boost all over the place and do goofy stuff, like boost the card to higher clocks as I lowered vGPU. Then at lower voltages (which I could not achieve, because my sample was terrible at undervolting) it would start to underclock to remain stable.

Vega 20 doesn't really do that.

It still has boost-ish behavior which varies according to new rules that I still don't quite understand, but it isn't necessarily tied to the user-configured voltage OR temperature. It seems more based on workload. In GPGPU compute workloads, it keeps steady clocks. In games/graphics benchmarks, it likes to stutter down to ~1660 MHz or lower when it feels like it. In really light gaming benchmarks (like 3DMark Ice Storm), clocks drop like a rock and never come up, killing performance. Do the drops correspond to high temps or something else obvious? No. Drop volts and the behavior remains completely unchanged. Change power settings and... still no change in behavior. If I wanted my Vega FE to hold 1585 MHz solid in everything, all I had to do was run the power slider up to 50%, undervolt, and cool the thing, and it was ready to go. Radeon VII? Nope. I can't make it hold clocks at 1801/1802 MHz at all. It just has to stutter. Unless I run GPUPI or similar, and then it's rock-solid. You'd think it was a compute card or something.
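If you want to catch that kind of stutter in a log instead of eyeballing an overlay, you can poll the driver's reported engine clock during a run. A minimal sketch for Linux with the amdgpu driver, reading the pp_dpm_sclk sysfs file (the active state is marked with an asterisk; the card0 path may differ per system, and on Windows you'd lean on GPU-Z logging or similar instead):

```cpp
// Minimal sclk logger for Linux/amdgpu: polls pp_dpm_sclk once a second
// and prints the currently active clock state (the line marked '*').
#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>

int main() {
    const std::string path = "/sys/class/drm/card0/device/pp_dpm_sclk";
    for (int s = 0; s < 60; ++s) {                       // sample for one minute
        std::ifstream f(path);                           // reopen: values change
        std::string line;
        while (std::getline(f, line))
            if (line.find('*') != std::string::npos)     // '*' marks active state
                std::cout << s << "s: " << line << '\n';
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    return 0;
}
```

Run your benchmark in another window and the log shows exactly when and how far the clock state drops, which makes the "compute holds, graphics stutters" pattern easy to demonstrate.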
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Where is this falsehood coming from? AMD are NOT using GCN in Vega, it's a new architecture. Heck, even GCN 1-4 are very different from one another. There is no comparison between HD 2000 and RX 400/500 series, no way!

Vega is GCN-based. AMD says so on their own website. Why you bring up HD 2000 is beyond me since that's their even older TeraScale architecture.

Jesus Christ, who the hell goes by name in terms of architecture differences? You have to be EXTREMELY ignorant to do so!

So why does Vega 20 still only have 4 shader (compute) engines? Oh, that's right: because it's still GCN, and that's a limitation of the architecture that's either not possible to change or would require such a huge amount of work that it wasn't feasible with their limited resources.

Yeah, Vega has a lot of stuff bolted on top, but the foundation is the same, and when there are cracks in the foundation, slapping a shiny new coat of paint on the walls doesn't help much.

We know that AMD is working on a new architecture. It's obvious that they're aware of the GCN limitations. There's only so much that they can do while working within the confines of those limitations.

[Image: AMD CES GPU roadmap slide]
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The tensor cores are needed for RTX, right?

I actually think the ray tracing support is pretty cool. What I think sucks is DLSS. Then again, anti-aliasing methods in general have been thought of as blur filters. Newer methods are just fancier blur filters.

If Nvidia doesn't give up on the whole thing, we might see a much wider array of games and cards using ray tracing in the third generation. The first generation is just an introduction, the second is for proliferation, and the third is where it really matters.

Besides, 7nm is more like a single full-node jump from 14nm, not two generations' worth. You'll need second-generation 7nm or even 5nm to see decent gains.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The tensor cores are needed for RTX, right?

I actually think the ray tracing support is pretty cool. What I think sucks is DLSS. Then again, anti-aliasing methods in general have been thought of as blur filters. Newer methods are just fancier blur filters.

If Nvidia doesn't give up on the whole thing, we might see a much wider array of games and cards using ray tracing in the third generation. The first generation is just an introduction, the second is for proliferation, and the third is where it really matters.

Besides, 7nm is more like a single full-node jump from 14nm, not two generations' worth. You'll need second-generation 7nm or even 5nm to see decent gains.

RTX is Nvidia's branding; you're probably referring to DXR, which is part of the DX12 API. From everything I've read, tensor cores aren't needed for it, but they accelerate some of the process. Which is why when DLSS+DXR are done in tandem from the ground up, they seem to produce beautiful results with acceptable performance (see the Port Royal benchmark), but when bolted on they suffer in performance (early builds of DXR for BF5) or show horrendous image quality (just about any example I've seen of DLSS).

Pretty sure Pascal and Vega can do DXR; the performance would just be awful due to the lack of RT cores (NV's branding).
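That split is visible in how a game actually probes for support: it asks D3D12 for a raytracing tier, not for any particular vendor's cores. A minimal sketch of the query (Windows with a 1809+ SDK, link d3d12.lib; error handling trimmed):

```cpp
// Query the D3D12 raytracing tier: the app asks the API, not the vendor,
// whether DXR is available. Windows 10 1809+ SDK required; link d3d12.lib.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        printf("no D3D12 device available\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        printf("DXR supported (tier enum value %d)\n",
               static_cast<int>(opts5.RaytracingTier));
    else
        printf("DXR not exposed by this device/driver\n");

    device->Release();
    return 0;
}
```

Whether a card without RT cores reports a tier here is up to the driver, which is why DXR on Pascal/Vega is a driver decision rather than an API one.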

EDIT: If the Port Royal demo is the feature NV is trying to sell me on, sign me up. However, their early attempts (specifically with DLSS) are not acceptable.

 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
@railven

Yes, I meant Ray Tracing, not including DLSS.

Looking back on the Turing architecture, it uses the tensor cores for cleaning up (denoising) images for its hybrid ray tracing. That's different from DLSS. Though the AT article notes developers aren't really going for tensor-core cleanup at the moment.

What I mean is Ray Tracing by itself is cool. DLSS is not, whether it's with Ray Tracing on or off.

If the Turing RTX cards were decently priced, I wouldn't mind an RTX 2060 on a 1080p monitor with ray tracing enabled.

There's a bigger issue than Nvidia pricing too high. Didn't Battlefield V get a lukewarm reception? So one of the three games taking advantage of the features is mediocre. The GN article also talked about memory leak issues, and other mentions suggest it may really be a BFV problem.

If we need crappy games to use our graphics cards then really what's the point?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
@railven

Yes, I meant Ray Tracing, not including DLSS.

Looking back on the Turing architecture, it uses the tensor cores for cleaning up (denoising) images for its hybrid ray tracing. That's different from DLSS. Though the AT article notes developers aren't really going for tensor-core cleanup at the moment.

What I mean is Ray Tracing by itself is cool. DLSS is not, whether it's with Ray Tracing on or off.

Like I mentioned, if DLSS+DXR is going to look like it does in the Port Royal benchmark, sign me up. So far DLSS isn't looking good, but it's new, and even Metro already shows improvements with a patch. So there is SOME hope.

If the Turing RTX cards were decently priced, I wouldn't mind an RTX 2060 on a 1080p monitor with ray tracing enabled.

There's a bigger issue than Nvidia pricing too high. Didn't Battlefield V get a lukewarm reception? So one of the three games taking advantage of the features is mediocre. The GN article also talked about memory leak issues, and other mentions suggest it may really be a BFV problem.

If we need crappy games to use our graphics cards then really what's the point?

I've been following PC gaming for years, and this time I've noticed the bad-game taste is in NV's mouth. What I mean is, look back through the games NV usually signed a partnership with versus AMD. NV-partnered games often had great NV performance on day one, plus some NV perks. On the AMD side, partnered games often had great vendor-neutral performance but didn't do much to leverage the AMD product, and in some instances the game got caught up in a scandal that hurt it (the most recent to come to mind is Wolfenstein 2). Elements outside of NV's/AMD's control churned crap up. How much involvement NV had with BF5 I don't know, but I'm pretty sure NV had zero involvement with the Epic Games deal and Metro, which is now causing Metro to get thrown under the bus. The game is already cracked, with tons of torrent sites to find it.

How this will affect DXR adoption (and thus the fate of RTX) I don't know, but looking at it from the start, it isn't good. New techs should come out with their best foot forward. DXR and DLSS are both botched from a PR angle.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
Nah, the card got marked up to $1099 for a couple of weeks at the beginning of January. I was following that card for a bit and considered buying one during the Christmas season, but backed off once the price went up.
And the 2080 Ti FTW3 is $1500!! I bought 12 of the 1080 Ti FTW3 cards at an average of about $800 each. When the 2080 Ti cards come down to a reasonable number (and are available), I will get one. But at the rate they are going, it will be when hell freezes over.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
And the 2080 Ti FTW3 is $1500!! I bought 12 of the 1080 Ti FTW3 cards at an average of about $800 each. When the 2080 Ti cards come down to a reasonable number (and are available), I will get one. But at the rate they are going, it will be when hell freezes over.
Hey, there's a $100 rebate on those...
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
And the 2080 Ti FTW3 is $1500!! I bought 12 of the 1080 Ti FTW3 cards at an average of about $800 each. When the 2080 Ti cards come down to a reasonable number (and are available), I will get one. But at the rate they are going, it will be when hell freezes over.

Donate a 1080 Ti to me, will ya? :p At this point I am saving up and waiting for the next generation of cards rather than hoping the 2080 Ti will be more sensibly priced any time soon. I sorta fixed the upgrade bug I had with my 1070 Ti: I slapped on an Arctic Accelero 3 cooler and modded a few 120mm fans onto it. Imagine load temps not even breaking 50C! Sure beats the 77C load temps I could hit before.

The next generation for me may also involve moving from 1440p to 4K. I think the longer we wait, the better off we will be.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
Donate a 1080 Ti to me, will ya? :p At this point I am saving up and waiting for the next generation of cards rather than hoping the 2080 Ti will be more sensibly priced any time soon. I sorta fixed the upgrade bug I had with my 1070 Ti: I slapped on an Arctic Accelero 3 cooler and modded a few 120mm fans onto it. Imagine load temps not even breaking 50C! Sure beats the 77C load temps I could hit before.

The next generation for me may also involve moving from 1440p to 4K. I think the longer we wait, the better off we will be.
They are all folding to cure cancer. I lost my battle with cancer in two ways: I lost my bladder and prostate, and, due to the cancer drugs trying to kill the cancer, I also lost my hearing and my balance.
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
They are all folding to cure cancer. I lost my battle with cancer in two ways: I lost my bladder and prostate, and, due to the cancer drugs trying to kill the cancer, I also lost my hearing and my balance.

Sorry to hear that. I used to fold some time ago, but using Folding@home I ran into some issues with a 770 not wanting to work with it. I guess because f@h was picky about drivers at that time, I had no choice but to quit, and after that, idk. I may try again and join a team; I miss contributing to a cause like that.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
And the 2080 Ti FTW3 is $1500!! I bought 12 of the 1080 Ti FTW3 cards at an average of about $800 each. When the 2080 Ti cards come down to a reasonable number (and are available), I will get one. But at the rate they are going, it will be when hell freezes over.

I think you're touching on another aspect that's a bit of an issue with the 2080 Ti... the wide pricing spectrum for higher-end variants. I was originally eyeing the FTW3-HC because I had no plans to air-cool it anyway, but I ended up going with the XC Black plus a separate HydroCopper block. If the cheaper Black model had been available, I would've likely gone with that, as the difference ($150) would've paid for almost the entire water block ($180). Yes, the XC Black has the better A-chip, but in reality the differences are going to be 5% or less. In general, I've often been okay with paying for better coolers and slightly better out-of-the-box performance, but that was when it cost ~$100 more (from $700 to $800). I recall my 1080 Ti Strix cards costing me about $100 more than MSRP, and HydroCopper cards were about $150 more.

Although, if you want water cooling, both Gigabyte and MSI have pre-water-cooled models available... but then you'd also have to go with Gigabyte or MSI. :p Plus, some reviews complained about their temperatures being way too high out of the box, only going down after reseating the water block.