Question Why does the overall gaming GPU market treat AMD like they have AIDS?


VirtualLarry

No Lifer
Aug 25, 2001
56,552
10,171
126
I guess I get the (subliminal) "The way it's meant to be played" ads from NVidia, along with the recurring FUD tropes about "AMD drivers", but I honestly don't get the sales disparity, especially given the prices.

I've owned both NVidia-powered as well as AMD-powered GPUs, and IMHO, AMD is (generally) just as good. Maybe 99% as good.

Edit: And I think that there's something to be said about the viability of AMD technologies, when they're in both major console brands.
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
Hardware Unboxed did a great video on current prices:
https://www.youtube.com/watch?v=a5gxrcHUM0k

Part of the video compares AMD retail to Nvidia retail, and part of it compares AMD used to Nvidia retail. rdna1 parts have cratered on the used market.


It all seems rather relevant to this discussion of AMD price point vs Nvidia.

I feel it illustrates the fallacy in the idea that AMD dropping their prices would win market share. That has effectively happened: the rx5700 is roughly equivalent to the rtx3060, which goes for 2x the price.

I think most people, including me, would rather pay 2x for the 3060, or, staying in the same price bracket, take an rx6500xt at 50% the performance of the rx5700xt.
 
Last edited:
  • Like
Reactions: scineram

SteveGrabowski

Diamond Member
Oct 20, 2014
7,428
6,157
136
Hardware Unboxed did a great video on current prices:
https://www.youtube.com/watch?v=a5gxrcHUM0k

Part of the video compares AMD retail to Nvidia retail, and part of it compares AMD used to Nvidia retail.

An interesting part was that rdna1 parts have cratered on the used market; people just do not want them.


It all seems rather relevant to this discussion of AMD price point vs Nvidia.


It also illustrates the fallacy in the idea that AMD dropping their prices would win market share. That has effectively happened: the rx5700 is roughly equivalent to the rtx3060, which goes for 2x the price.

I think most people, including me, would rather pay 2x for the 3060.

I absolutely refuse to buy a card that's going to have 20,000 hours of running balls to the wall mining eth on it unless the price reflects that. It would have to be like $80 max, and in perfect working order with good temps, considering I don't think I'd even put 4,500 hours of gaming on any card over five years. I know I'd be buying the tail end of that card's life, so I'd only pay bottom-barrel prices, as great a card as I think the RX 5700 is.
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
I absolutely refuse to buy a card that's going to have 20,000 hours of running balls to the wall mining eth on it unless the price reflects that. It would have to be like $80 max, and in perfect working order with good temps, considering I don't think I'd even put 4,500 hours of gaming on any card over five years. I know I'd be buying the tail end of that card's life, so I'd only pay bottom-barrel prices, as great a card as I think the RX 5700 is.
That is exactly my point. We know a used rx5000 series card was burned out in the mines and is just going to be problems.


The same is true with AMD vs Nvidia. As long as people perceive AMD cards as having problems, it does not matter what the price is. They are going to buy Nvidia, even if it is 2x the price. There is no point in AMD discounting the price to gain market share.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,428
6,157
136
That is exactly my point. We know a used rx5000 series card was burned out in the mines and is just going to be problems.


The same is true with AMD vs Nvidia. As long as people perceive AMD cards as having problems, it does not matter what the price is. They are going to buy Nvidia, even if it is 2x the price.

I mean I wouldn't touch a 1080 Ti over $60 either, for the same reason. My brother bought a used 1080 Ti a year and a half ago, and he is a very light gamer: only a few hours per week, playing easy-to-run things like Apex, WOW, and Diablo. And the card died on him a couple of months ago. It probably did only six months in the mines before he bought it.
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
I mean I wouldn't touch a 1080 Ti over $60 either, for the same reason. My brother bought a used 1080 Ti a year and a half ago, and he is a very light gamer: only a few hours per week, playing easy-to-run things like Apex, WOW, and Diablo. And the card died on him a couple of months ago. It probably did only six months in the mines before he bought it.
I had not thought about that.

Right now any used card is just asking for problems and a horrible experience.

In hindsight, it has been that way for years.


I wonder if @AnitaPeterson's cursed rx570 was a used card when purchased?
 
Last edited:

AnitaPeterson

Diamond Member
Apr 24, 2001
5,962
456
126
[...] If you look past the snazzy presentation, a lot of nvidia benefits are either not practical (RTX) or not useful (NVENC).

Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,150
7,645
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
Hyperbole much?

- Yep, if anything NVENC is one of the more useful features of NV's current suite for the large portion of the gaming audience (that does not include me) that streams.

I think AMD would do really, really well to slap a 3.0 behind a lot of their features and "relaunch" them alongside the 7xxx series, to simply reintroduce them to the masses. You cannot discount NV marketing; they do a great job keeping their "gaming adjacent" features in the spotlight. Do an incremental upgrade on some side feature? Throw a new version number on it and talk it up to the moon!

Been part of a number of reddit discussions where people straight up don't know Freesync = Gsync, FSR 2.0 = DLSS 2.0, RTX is just NV's DX12 Ultimate branding, etc etc etc. People either don't even realize these features are "AMD" features or they assume that they're not an alternative feature but rather the "Medium setting to NV's proprietary Ultra Setting".
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
The devil is in the details. First of all, please remember that prices and availability are different outside the US; your quoted MSRP prices are useless here. Also, FWIW, I bought the RX570 in December 2019, a few months before the pandemic, so at least in theory it *should* have had all the bugs ironed out by then.

Second, a card that's just dropped in the machine and instantly works with any game or application you throw at it should be the normal experience. Yet the RX570 had trouble even displaying 4K content (multiple HDCP issues every day, no matter how many HDMI cables I changed), aside from gaming.
Some thoughts:
I had an rx580 (purchased new), and if I remember right it had an HDMI 2.0 port, not 2.0a or 2.0b.

So it could do 4k @ 60 Hz, and that was it. 4k @ 60 Hz with HDR? It could be forced, but it would break all the time.
I never had HDCP problems, but I never played protected content either.

Thing is, on the card spec sheet it was clear that it was either 4k @ 60 Hz, or 4k @ 30 Hz with HDR. I wonder if you just pushed that 2017-spec HDMI port beyond its limits.
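For what it's worth, here's a rough back-of-the-envelope check (a Python sketch with approximate numbers, ignoring blanking overhead) of why a plain HDMI 2.0 port gets boxed in exactly the way that spec sheet says:

# Raw pixel payload vs. HDMI 2.0's ~14.4 Gbps of usable data
# (18 Gbps link rate minus 8b/10b encoding overhead, approximately).

def payload_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

HDMI_2_0_BUDGET = 14.4  # Gbps, approximate usable payload

for label, bpc in [("4k60 8-bit SDR", 8), ("4k60 10-bit HDR", 10)]:
    rate = payload_gbps(3840, 2160, 60, bpc)
    verdict = "fits" if rate < HDMI_2_0_BUDGET else "does not fit"
    print(f"{label}: {rate:.1f} Gbps -> {verdict}")

# Prints ~11.9 Gbps (fits) for 8-bit and ~14.9 Gbps (does not fit at
# full 4:4:4) for 10-bit, matching the "4k60 SDR or 4k30 HDR" choice.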

I did use hardware encoding on my rx580, but it was with discord to share my screen with my friends. Seemed to work just fine. Very different app though.

It was also unable to use hardware encoding in software packages like ClownBD. And that was in 2020, a good 3 years after it became available.
Using GPU encoders for movie backup is very rare. Most people prefer the much superior CPU encoding. From my minimal research, it seems nvenc specifically is disliked for its poor handling of dark areas.

Yes, AMD got better hardware encoding in the next generation, but even then it trailed Nvidia spectacularly. You don't have to take my word for it. See this detailed comparison:

https://obsproject.com/forum/resour...s-2020-nvenc-vs-amf-vs-quicksync-vs-x264.998/
The problem with that is the OBS people themselves indicate the AMD encoder is not a priority: the plugin was done years ago by a guy who stated he did not know what he was doing, and they are not going to do anything about it. They do not care.

It seems most serious OBS users are targeting CPU encoding, which kind of explains why they don't care.


If you're going to do youtube / twitch streaming on the cheap with an AMD card right now, it looks like ReLive is the best / only option. On the Nvidia side there is Geforce Experience, although nvidia does also seem to offer official OBS plugins.


But from the research I did, it seems the reason most people use OBS is its CPU streaming capability, which makes this entire topic irrelevant.


-----------------------------------
While your experience with your rx570 was horrible, let's be blunt: you're using niche software tools in a manner that is niche even for those tools. Nearly everyone using those tools for those use cases is using CPU encoding.
 
Last edited:
  • Like
Reactions: scineram

fleshconsumed

Diamond Member
Feb 21, 2002
6,485
2,363
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
I didn't call it useless outright. I called it useless for the majority of nvidia buyers, because the majority of nvidia buyers are not using it. How many nvidia buyers actually stream on twitch or dabble in video editing? 1%? 2%? So yes, I stand by my statement: it's useless for the majority of nvidia buyers, just like RTX or GSync.
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
Sorry, but calling NVENC "not useful" is laughable and completely wrong.
You may not be using that feature - fair enough! - but such a blanket statement is factually wrong and ridiculous to anyone who dabbles in video encoding.
The problem is most people dabbling* in video encoding are not going to use nvenc even if they own an nvidia card.

It is inferior to software encoding, and everyone knows it.


The only real use case for nvenc is people doing real-time encoding who are also poverty-stricken**. That is a pretty small group.


People serious about encoding just buy a multi-core** CPU and enjoy the superior experience that offers.


-------------------------------------------

*the casual user will never notice. They use Geforce Experience or Radeon ReLive. Maybe Discord to share with their friends. With those commonly used software choices there is no apparent difference to the casual user.

**TwitchTV recommends any 6-core CPU for 1080p.
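To make "streaming on the cheap with the CPU" concrete, here is a minimal sketch (Python shelling out to ffmpeg; it assumes an ffmpeg build with libx264, an X11 desktop to capture, and a placeholder stream key):

import subprocess

# Bare-bones CPU (x264) streaming at 1080p60, roughly in line with
# Twitch's published settings. The capture source and STREAM_KEY are
# placeholders; adjust for your platform.
subprocess.run([
    "ffmpeg",
    "-f", "x11grab", "-framerate", "60", "-video_size", "1920x1080",
    "-i", ":0.0",                              # hypothetical desktop capture
    "-c:v", "libx264", "-preset", "veryfast",  # CPU encoder, streaming preset
    "-b:v", "6000k", "-maxrate", "6000k", "-bufsize", "12000k",
    "-g", "120",                               # 2-second keyframes at 60 fps
    "-f", "flv", "rtmp://live.twitch.tv/app/STREAM_KEY",
], check=True)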
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,059
413
126
Sometimes it is just a little annoying being an AMD user. Like when I'm trying to stream on discord, and they support hardware video encoding for Nvidia but not on my AMD card. Or when I'm playing my favorite, slightly obscure MMO, and the ambient occlusion option works fine on Nvidia but has been glitched for years on AMD, and they don't bother touching it.
There are advantages to having what 70%+ of other gamers are using, sadly :/

Other than that, I always preferred the higher perf per $ that ATI/AMD has historically offered.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
5,962
456
126
Sorry, but without going into details about my video usage needs and scenarios, I can only point out that a beefier CPU is not always the easiest, cheapest or fastest path to take (and it's definitely NOT the quietest, either!)

In the end, every user has their own (generic or particular) requirements.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,559
24,422
146
Sorry, but without going into details about my video usage needs and scenarios, I can only point out that a beefier CPU is not always the easiest, cheapest or fastest path to take.

In the end, every user has their own (generic or particular) requirements.
I agree with this. I used NVENC H.265 at highest quality to transcode some blurays, and they look better than the streaming versions. Hence, good enough for me. Even Return of the King only took around 30 minutes, I think? With my aging eyes, it's all good.
 
Feb 4, 2009
35,245
16,716
136
Well said, and EXACTLY why we need intel to be a 3rd player in the market. The duopoly has done what duopolies do: given us one high-end player that is very expensive and another low-end/midrange player that is "good enough". Being "good enough" is simply not that exciting.
WTF, someone down voted that comment. Down voter, do you happen to work with AMD or nVidia?
 
  • Haha
Reactions: GodisanAtheist

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
So if AMD is going to continue having an inferior feature set compared to Nvidia, then, and this is basically anyone's best guess, how much lower should their cards be selling for?

I’ll be honest, I have not been following the DLSS scene, but I thought ray tracing is a big deal, and possibly still a very big reason to own Nvidia, especially if the titles you play have support.

I’ve been locked so long into the Nvidia world, because of how bad AMD’s drivers used to be on Linux, that I’ve never looked back to even consider buying AMD.

So, let’s say the majority here only ever bought EVGA, as I have done, and now that’s gone. What are we really going to miss if we also jump off the Nvidia ship and head to AMD?

I don’t know how the features compare between the latest gen AMD vs Nvidia, because I’ve never had a reason to check till now.

Oh I’m gonna cry, bye bye EVGA! :(


hmm 🤔
 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
It will be interesting to see where AMD is headed next with EVGA out of the scene.

hmm 🤔
 
Last edited:

Thunder 57

Platinum Member
Aug 19, 2007
2,975
4,545
136

Yesterday I was talking to someone who had screen tearing in their game, and I said I just set the game at the desired framerate through Nvidia's control panel. He uses an AMD GPU, and his reaction was one of disbelief at the fact that I could just lock the framerate per application, even my browser's FPS.

People said Intel selling dedicated GPUs would be good for the consumer. No it's not; it has been nothing but issues, with Intel releasing half-assed drivers just like AMD.

With posts like that, their username should be Miss Information! Also not sure what they're talking about with DirectX 9 and AMD. Never had trouble with any DX9 games with any AMD card.
 

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
IPS Panel, 165Hz, 1ms
Wrong forum, but that 1 ms is marketing lies.

Reality is going to be closer to 20 ms, yet they all advertise 1 ms.

The step up from the MSI Optix G273QPF, the $400 MSI Optix MAG272CQR, also advertises 1 ms. It actually has a 15 ms response time.

I could not find a decent review on the G273QPF, most likely for sad reasons.

--------------------------------

If you think about it, 1 ms was never possible. At 165 Hz, the monitor is only going to start an update every ~6 ms, which is going to be the theoretical best.
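The arithmetic is trivial, but here's the sanity check in Python if anyone wants it:

# Time between display refreshes at a given refresh rate.
def refresh_interval_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 144, 165, 240):
    print(f"{hz:3d} Hz -> new frame every {refresh_interval_ms(hz):.2f} ms")

# At 165 Hz a new refresh only starts every ~6.06 ms, so a "1 ms"
# panel still waits ~6 ms between chances to show a new frame.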

--------------------------------

The reason GSYNC died is that gsync monitors tend to be more expensive than their freesync counterparts. A gsync module is typically a $50 part, and usually companies release the same monitor as both. For example, you could have purchased an MSI G273QF, which is the exact same panel with the exact same performance as your gsync monitor, for $30* less.

Which is why gsync is dead.

*yours is an old model; they are flushing the inventory and discounted it

On 4k monitors the gsync module has a fan, and people who buy expensive monitors dislike fans in their monitors.
 
Last edited:
  • Like
Reactions: Ranulf

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
I agree with this. I used NVENC H.265 highest Quality to transcode some blurays, and they look better than the streaming version. Hence, good enough for me. Even Return of the King was around 30 minutes I think? With my aging eyes, it's all good.
Most people doing that sort of thing use Handbrake, which does support AMD encoding, and it works just fine.


However, as previously mentioned, most people do not use GPU encoders because they are inferior. Using a CPU encoder yields both better quality and a smaller output file. Typically, for non-time-sensitive encoding, people prioritize file size and quality over speed.
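If anyone wants to A/B that claim themselves, here is a minimal sketch (assuming an ffmpeg build with both libx265 and nvenc support; the -crf and -cq quality scales are not exactly comparable, so treat it as a starting point):

import subprocess

SOURCE = "sample.mkv"  # hypothetical short test clip

# CPU encode: slow, but better quality per bit (smaller file at like quality).
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx265", "-preset", "slow", "-crf", "20",
    "-c:a", "copy", "cpu_x265.mkv",
], check=True)

# NVENC encode: far faster, but typically a larger file for similar quality.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "hevc_nvenc", "-preset", "slow", "-cq", "20",
    "-c:a", "copy", "gpu_nvenc.mkv",
], check=True)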
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
I’ll be honest, I have not been following the DLSS scene, but, I thought Ray Tracing is a big deal, and possibly still a very big reason to own Nvidia, especially if the titles you play have support.
The sad thing about raytracing is that it usually takes an expert to tell if it is even turned on.

AMD gpus have hardware support for raytracing, although they call it Microsoft DirectX 12 DXR rather than RTX. RTX is just Nvidia's branding of Microsoft DirectX 12 DXR.

AMD gpus also have motion-vector upscaling, but instead of calling it DLSS 2, they call it FSR 2.


If you have an Nvidia GT or GTX GPU, Nvidia prevents those cards from using DLSS.

They didn't have to either: AMD FSR works with old AMD gpus like the rx500 series, and also works with nvidia GT and GTX cards. Nvidia was just implementing market segmentation at nvidia gpu owners' expense.
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
I’ve been locked so long in the Nvidia world, because of how bad AMD’s drivers use to be in Linux, that I’ve never looked back, to ever consider buying AMD.
Are you aware that most people consider AMD drivers on Linux to be superior to Nvidia's?

I believe Linus Torvalds said it best:
"the single worst company we've ever dealt with":

Link removed.

Links showing profanity are not allowed in the tech forums regardless of being hidden with spoiler tags.

AT Mod Usandthem
 
Last edited by a moderator:

yottabit

Golden Member
Jun 5, 2008
1,491
522
146
Most people doing that sort of thing use Handbrake, which does support AMD encoding and it works just fine.


However, as previously mentioned, most people do not use GPU encoders because they are inferior. Using a CPU encoder yields both better quality and a smaller output file. Typically for non-time sensitive encoding, people prioritize file size and quality over speed.

I dunno, I’m another heavy Plex / Handbrake user firmly in the NVENC camp. I’ve spent many hours (days, realistically) coming up with my handbrake profiles and A/B testing sample clips on my projector.

In my experience, for picture quality that is equivalent to my eyes, NVENC gives maybe a 15 to 20% larger file size than a CPU encode. It’s also many times faster than a top-end CPU with even a midrange card, and uses less energy. I’m not so strapped for storage space that I’d consider a CPU encode for this.

This is mostly for Bluray source though, using h265. For lower-resolution media like 480p / DVD material I do use CPU encode, and the differences are more pronounced.
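For anyone curious how to put a number on that kind of A/B test instead of eyeballing a projector, a small sketch (assumes an ffmpeg build with libvmaf compiled in; the filenames are placeholders):

import subprocess

# Score each encode against the untouched source with VMAF. Per
# ffmpeg's libvmaf convention the distorted file is the first input
# and the reference is the second; the score lands in ffmpeg's log.
for encode in ("cpu_x265.mkv", "nvenc_h265.mkv"):
    subprocess.run([
        "ffmpeg", "-i", encode, "-i", "bluray_source.mkv",
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)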

Anyways I’ve got two Nvidia Shield TVs in the house as game streaming clients… I use DLSS 2 in VR… I run CUDA accelerated apps… Nvidia has me locked in pretty well lol

at least some of my monitors are FreeSync

I do agree the vast majority of users would be just fine or better off perf/dollar with Radeons
 

CP5670

Diamond Member
Jun 24, 2004
5,535
613
126
The problem is that Nvidia had a complete monopoly at the high end for many years. AMD only became competitive at the high end with the 6800/6900 generation. I've used AMD/ATI cards in the past and loved them, but am too tied to the Nvidia ecosystem of third party tools and feature set at this point to seriously consider an AMD card now. I would use an AMD card if I ever needed a second gaming PC and do recommend them to other people.
 
  • Like
Reactions: Tlh97 and Leeea