
News [PCGamer] AMD is ahead of Nvidia in overall GPU shipments for the first time since 2013

Am I the only one who thinks that counting processors with iGPUs toward video card sales is rather silly? When I mention video cards I mean exactly that, not something built into the CPU, and I always have. Although I do have to admit that iGPUs have gotten way better than when they first came out.

Yes I am aware that many users either don't need a dGPU or the games they play run fine on iGPUs. But still...
 
Am I the only one who thinks that counting processors with iGPUs toward video card sales is rather silly? When I mention video cards I mean exactly that, not something built into the CPU, and I always have. Although I do have to admit that iGPUs have gotten way better than when they first came out.

Yes I am aware that many users either don't need a dGPU or the games they play run fine on iGPUs. But still...

I agree and it's especially bad with headlines like the following:
AMD’s graphics cards are crushing it, overtaking Nvidia for the first time in five years

TechRadar, not good.
 
Nope, not in Wolfenstein 2 or Youngblood, not in Rage 2, not even in good old Doom. You have a habit of speaking without references:

https://techreport.com/review/34646...eon-rx-5700-series-graphics-cards-reviewed/3/


https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/21.html

The 2070S is so fast it even beats the Radeon VII in Wolfenstein:


https://www.techpowerup.com/review/wolfenstein-youngblood-benchmark-test-performance/4.html


Not any more: Turing is faster in Forza 4 now, using the latest driver, which brought a 30% uplift!
https://www.patreon.com/posts/29480671


Faster here too.
https://www.patreon.com/posts/29480671
No it's not, you are comparing the $500 2070 Super to a $400 RX 5700 XT, WTF?

The RX 5700 and RX 5700 XT easily beat the 2060 and 2060S in Wolfenstein 2, Metro Exodus, SOTTR, Forza 4, Strange Brigade, and others.
 
Hardware Unboxed tested 37 games with the latest drivers and the 2070S was only 6% faster (at 1440p, I think).

I don't think 25% higher cost for 6% more performance is worth it, but someone might.
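To spell out the arithmetic behind those two numbers: $500 / $400 = 1.25, i.e. 25% more money, while 1.06 / 1.25 ≈ 0.85, so at those prices the 2070S delivers roughly 15% less performance per dollar.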
 
From the article
Bear in mind that these numbers include all computer graphics shipments (though not console GPUs), including AMD's accelerated processing units (APUs). That's why Intel is way out ahead—a large portion of its processors have integrated graphics, and it ships a lot more CPUs than AMD, especially once you factor in laptops.
Point still stands. Misleading headline. If you include integrated GPUs, Intel is far ahead of either. AMD still trails badly in discrete sales, despite fire-sale prices. That is what not being competitive does to you.
 
Hardware Unboxed tested 37 games with the latest drivers and the 2070S was only 6% faster (at 1440p, I think).

I don't think 25% higher cost for 6% more performance is worth it, but someone might.
WCCF Tech did a test of the 5700 XT vs the 2060S with both overclocked:
The result was basically a tie, as the 2060S overclocks so much better, so in reality it's not actually much cheaper than the equivalently performing Nvidia card, but it lacks the features (which is not just RT/DLSS but things like variable shading, VR support, Gsync, etc.).

Point still stands. Misleading headline. If you include integrated GPUs, Intel is far ahead of either. AMD still trails badly in discrete sales, despite fire-sale prices. That is what not being competitive does to you.
This is the discrete figure: https://wccftech.com/amd-radeon-nvidia-geforce-graphics-card-gpu-market-share-q2-2019/
AMD is making a lot of sales of cheap 570/580's which is not surprising as they are a bargain.
 
but it lacks the features (which is not just RT/DLSS but things like variable shading, VR support, Gsync, etc.).

Reducing the level of detail on parts of the frame is hardly a "feature" to crow about having; it's compromising your IQ, for crying out loud. You also list Gsync as a feature deficit for the AMD card!? Really? Please expand on the "etc."; I would really love to know what makes the price premium bearable to you.
 
WCCF Tech did a test of the 5700 XT vs the 2060S with both overclocked:
The result was basically a tie, as the 2060S overclocks so much better, so in reality it's not actually much cheaper than the equivalently performing Nvidia card, but it lacks the features (which is not just RT/DLSS but things like variable shading, VR support, Gsync, etc.).


This is the discrete figure: https://wccftech.com/amd-radeon-nvidia-geforce-graphics-card-gpu-market-share-q2-2019/
AMD is making a lot of sales of cheap 570/580's which is not surprising as they are a bargain.
So AMD has gone from 1:4 to 1:2 versus Nvidia in AIB cards. I was wondering why the manic postings lately.
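(In share terms: 1:4 is a 20% unit share of AIB cards, 1:2 is about 33%, so that's roughly a 13-point swing.)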
 
The result was basically a tie, as the 2060S overclocks so much better, so in reality it's not actually much cheaper than the equivalently performing Nvidia card, but it lacks the features (which is not just RT/DLSS but things like variable shading, VR support, Gsync, etc.).

Yes, AMD lacks hardware RT.
Variable shading is a horrible "feature". If I want lower IQ, I will adjust it down myself.
VR support may be a bit better on Nvidia from a performance standpoint, but it's not missing from AMD. So you can't say it lacks support.
GSYNC is a dead technology. FreeSync won the war the moment Nvidia capitulated and added support for it to their cards.
 
Yes, AMD lacks hardware RT.
Variable shading is a horrible "feature". If I want lower IQ, I will adjust it down myself.
VR support may be a bit better on Nvidia from a performance standpoint, but it's not missing from AMD. So you can't say it lacks support.
GSYNC is a dead technology. FreeSync won the war the moment Nvidia capitulated and added support for it to their cards.
Variable rate shading is essential to enable foveated rendering for VR. I don't understand why anyone would consider it a feature on a normal desktop monitor.
 
Gsync is an advantage for Nvidia in that if you have or want to buy a display that has Gsync you are locked to Nvidia; if you have FreeSync then you can still use Nvidia.

Variable shading looks like it has a lot of potential: if it gives you a 10-15% performance boost with no noticeable loss of image quality (which is the aim, I think), then I don't see why anyone would hate it. It needs some proper reviews, but writing it off before you have seen them just sounds like fanboy bias.
 
Gsync is an advantage for Nvidia in that if you have or want to buy a display that has Gsync you are locked to Nvidia; if you have FreeSync then you can still use Nvidia.
GSYNC monitors are also the only ones with 4K, HDR1000, and VRR from 0 Hz to max Hz with a full-array backlight of 384 zones and above. FreeSync 2.0 monitors have yet to feature the HDR1000 standard, let alone the full-array backlight.
 
Gsync is an advantage for Nvidia in that if you have or want to buy a display that has Gsync you are locked to Nvidia; if you have FreeSync then you can still use Nvidia.

Variable shading looks like it has a lot of potential: if it gives you a 10-15% performance boost with no noticeable loss of image quality (which is the aim, I think), then I don't see why anyone would hate it. It needs some proper reviews, but writing it off before you have seen them just sounds like fanboy bias.
There's already enough IQ cheating for the sake of performance. The last thing the end user or consumer should be championing as a "feature" is more IQ-compromising algorithms. Nvidia and AMD can champion it; the consumer shouldn't. It's like self-harm.
 
GSYNC monitors are also the only ones with 4K, HDR1000, and VRR from 0 Hz to max Hz with a full-array backlight of 384 zones and above. FreeSync 2.0 monitors have yet to feature the HDR1000 standard, let alone the full-array backlight.
With LFC, FreeSync is essentially 0 to max VRR too, and for everything else you typed, just remember those are attributes of the monitor, NOT of Gsync.
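For anyone unfamiliar with LFC (low framerate compensation), here is a minimal sketch of the idea in C++. The function name and panel numbers are made up for illustration, and real drivers are obviously more sophisticated: when the frame rate drops below the panel's minimum VRR rate, each frame is simply repeated enough times that the effective refresh lands back inside the VRR window.

```cpp
#include <iostream>

// Hypothetical helper: how many times to repeat each frame so the
// effective refresh rate stays inside the panel's VRR window.
int lfcMultiplier(double fps, double panelMinHz, double panelMaxHz) {
    int m = 1;
    while (fps * m < panelMinHz && fps * (m + 1) <= panelMaxHz)
        ++m;
    return m;
}

int main() {
    // Illustrative 48-144 Hz FreeSync panel running a game at 30 fps:
    // each frame is shown twice, so the panel refreshes at 60 Hz and
    // VRR keeps working well below its nominal 48 Hz minimum.
    double fps = 30.0;
    int m = lfcMultiplier(fps, 48.0, 144.0);
    std::cout << m << "x repeat -> effective " << fps * m << " Hz\n";
}
```

That's why a panel with LFC behaves like a 0-to-max VRR range in practice, even though the hardware window starts well above 0 Hz.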
 
There's already enough IQ cheating for the sake of performance. The last thing the end user or consumer should be championing as a "feature" is more IQ-compromising algorithms. Nvidia and AMD can champion it; the consumer shouldn't. It's like self-harm.
We don't know what it does to IQ because no one has tested it. Once we know what the IQ/performance trade-off is, then we can make a call on whether it's worth it.
 
GSYNC monitors are also the only ones with 4K, HDR1000, and VRR from 0 Hz to max Hz with a full-array backlight of 384 zones and above. FreeSync 2.0 monitors have yet to feature the HDR1000 standard, let alone the full-array backlight.

But you said that 4K was dumb, and that nobody wants it...

Variable rate shading is essential to enable foveated rendering for VR. I don't understand why anyone would consider it a feature on a normal desktop monitor.

And it makes sense for VR, where frame skip can literally make you sick. But it also sounds like a cheat for benchmarks on the desktop.

Variable shading looks like it has a lot of potential: if it gives you a 10-15% performance boost with no noticeable loss of image quality (which is the aim, I think), then I don't see why anyone would hate it. It needs some proper reviews, but writing it off before you have seen them just sounds like fanboy bias.

It literally drops IQ to increase frame rates. It can't do this and keep IQ the same.
 
WCCF Tech did a test of the 5700 XT vs the 2060S with both overclocked:
The result was basically a tie, as the 2060S overclocks so much better, so in reality it's not actually much cheaper than the equivalently performing Nvidia card, but it lacks the features (which is not just RT/DLSS but things like variable shading, VR support, Gsync, etc.).


This is the discrete figure: https://wccftech.com/amd-radeon-nvidia-geforce-graphics-card-gpu-market-share-q2-2019/
AMD is making a lot of sales of cheap 570/580's which is not surprising as they are a bargain.
No one should EVER use WCCF Tech as a source of info or reference for anything; it's a useless gossip site, worse than the celeb gossip ones.

Nothing they write or do there should be taken seriously; it's not a serious website, it's a gossip blog. So I wouldn't trust a thing coming off of that cesspit.

Do you really still consider RT, which is pretty much there for DLSS, a feature? A simple sharpening filter is literally ten times better than Nvidia's limited DLSS. It's limited to certain games, at certain resolutions and certain settings, and even then it works badly. I'm sorry, but Nvidia's own new sharpening filter does a much better job than their DLSS.

Variable shading is already supported by AMD, and has been since Vega; they just don't call it that. Plus, if you want to drop names, Nvidia still doesn't support cache memory, rapid packed math, Radeon Chill, etc...
VR support? What are you smoking? All VR games work on all AMD cards. It's a niche market with fewer than 10 million VR units sold, and that's a count of how many are on shelves, not really representative of how many people have bought them, and it's now been over three years since we've had VR tech. Again, VR games work on AMD as they do on Nvidia; performance depends on the game engine, and unfortunately most use Unreal Engine, which has always favored Nvidia by a lot. It's an old engine, it uses old rendering techniques, it doesn't support Vulkan or DX12, it doesn't utilize a lot of processor cores, and it's limited in scope, but it's easy and accessible, which is why it's mostly used by small three-to-four-person teams creating small indie games.

Gsync is a sync technology; it has nothing to do with the monitor's feature specification. Though yeah, Nvidia did pay monitor manufacturers to create amazing-spec monitors with Gsync, but they also cost an arm and a leg: the starting point for these monitors is $1000, an unrealistic, overpriced hardware demo.
 
I have a feeling that some think variable rate shading is an Nvidia thing; it's not. It's a DX12 feature.
https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/

3DMark even incorporated it into their benchmark suite recently, and there you can see its effect on IQ. If properly done, it should only apply to textures of objects in the background that you would not normally focus on. And it's not forced down anyone's throat; it's an on/off option per user preference.
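Since it's a DX12 feature, using it really is just an API call per draw. A minimal C++ sketch of per-draw (Tier 1) VRS, assuming you already have a D3D12 device and an ID3D12GraphicsCommandList5; the helper names are made up and error handling is omitted:

```cpp
#include <d3d12.h>

// Check whether the GPU/driver exposes variable rate shading at all.
bool SupportsVRS(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))))
        return false;
    return opts.VariableShadingRateTier !=
           D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

// Hypothetical draw routine: shade distant background geometry once
// per 2x2 pixel block, then restore full-rate shading for everything
// the player actually focuses on.
void DrawScene(ID3D12GraphicsCommandList5* cmdList) {
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... record background draw calls here ...

    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    // ... record foreground/focus draw calls here ...
}
```

Tier 2 hardware can additionally supply a per-tile shading-rate image, which is the mechanism foveated rendering for VR would use.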

 
I have a feeling that some think variable rate shading is an Nvidia thing; it's not. It's a DX12 feature.
https://devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/

3DMark even incorporated it into their benchmark suite recently, and there you can see its effect on IQ. If properly done, it should only apply to textures of objects in the background that you would not normally focus on. And it's not forced down anyone's throat; it's an on/off option per user preference.

I am not sure what AMD supports right now; googling AMD and variable rate shading gives me a load of old articles about how they might support it in Navi. Googling the same with Nvidia gives me links to how it's used in games: https://www.nvidia.com/en-us/geforce/news/nvidia-adaptive-shading-a-deep-dive/

The theory seems sound; it's in 3DMark and apparently Wolfenstein. Hopefully someone decent will do a proper review of it soon.
 