Isn't it just on DX12? AMD now has 20-30% more performance in all CPU-bottlenecked scenarios under DX12/Vulkan.
I watched that author's video on Turing back when it was released and he spared no superlatives when talking about Turing's DLSS and RT, so it's funny to contrast that with his concern (trolling?) about the importance of the "here and now" versus how RDNA2 might work in the future.

Really? I've found their card reviews spot on. They have an obvious bias toward rendering-technology advances, so things like RT and DLSS techniques are worth a premium to them, and it shows in their reviews. For example:
AMD Radeon RX 6900 XT review: the Digital Foundry verdict (www.eurogamer.net)
The RX 6000 line of graphics cards in general and the RX 6900 XT in particular are difficult to assess during a time of…
Hard to disagree with their take. Heck, with hindsight, AMD's Super Resolution is not out yet, and even if it were released today, how many games would support it? Meanwhile, Nvidia is pushing UE4 plugins that enable DLSS support without much work.
| Game (settings), FPS | 3090 + 1400 | 3090 + 1600X | 3090 + 2700X | 3090 + 5900X | 3090 + 4770K | 3090 + 7700K | 3090 + 8700K | 3090 + 10700K |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Watch Dogs Legion (DX12 Ultra) | 56 | 68 | 79 | 115 | 72 | 96 | 109 | 115 |
| The Medium (DX12 Max) | 66 | 76 | 84 | 110 | 79 | 109 | 110 | 110 |
| Hitman 3 (DX12 Max) | 71 | 84 | 99 | 139 | 84 | 117 | 122 | 131 |
| AC: Valhalla (DX12 Very High) | 61 | 73 | 85 | 96 | 75 | 95 | 96 | 96 |
| Cyberpunk 2077 (DX12 Ultra) | 56 | 66 | 78 | 115 | 56 | 69 | 86 | 104 |
| Forza Horizon 4 (DX12 Max) | 95 | 110 | 124 | 178 | 115 | 151 | 155 | 170 |
| Total War Troy (DX11 Max) | 56 | 75 | 94 | 116 | 67 | 82 | 96 | 113 |
| Mount & Blade II (DX11 Very High) | 121 | 137 | 155 | 200* | 148 | 196 | 200* | 200* |
| Nioh 2 (DX11 High) | 104 | 115 | 120* | 120* | 120* | 120* | 120* | 120* |
| Valheim (DX11 Max) | 58 | 62 | 66 | 162 | 92 | 128 | 127 | 138 |
| Outriders Demo (DX11 Max) | 69 | 73 | 84 | 131 | 73 | 106 | 113 | 118 |
| It Takes Two (DX11 Max) | 100 | 111 | 129 | 243 | 125 | 165 | 173 | 196 |
| Mafia Definitive Edition (DX11 High) | 83 | 103 | 119 | 191 | 115 | 154 | 167 | 179 |
| Game (settings), FPS | 6900XT + 1400 | 6900XT + 1600X | 6900XT + 2700X | 6900XT + 5900X | 6900XT + 4770K | 6900XT + 7700K | 6900XT + 8700K | 6900XT + 10700K |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Watch Dogs Legion (DX12 Ultra) | 69 | 86 | 96 | 117 | 85 | 116 | 116 | 117 |
| The Medium (DX12 Max) | 73 | 81 | 87 | 87 | 81 | 87 | 87 | 87 |
| Hitman 3 (DX12 Max) | 79 | 93 | 108 | 152 | 99 | 137 | 143 | 158 |
| AC: Valhalla (DX12 Very High) | 81 | 91 | 101 | 114 | 95 | 109 | 109 | 109 |
| Cyberpunk 2077 (DX12 Ultra) | 60 | 73 | 84 | 115 | 62 | 77 | 93 | 111 |
| Forza Horizon 4 (DX12 Max) | 117 | 131 | 144 | 179 | 140 | 179 | 179 | 179 |
| Total War Troy (DX11 Max) | 57 | 75 | 89 | 117 | 65 | 81 | 96 | 114 |
| Mount & Blade II (DX11 Very High) | 122 | 138 | 162 | 200* | 140 | 187 | 195 | 200* |
| Nioh 2 (DX11 High) | 87 | 113 | 116 | 120* | 95 | 120* | 120* | 120* |
| Valheim (DX11 Max) | 57 | 63 | 66 | 131 | 99 | 126 | 126 | 131 |
| Outriders Demo (DX11 Max) | 66 | 75 | 86 | 133 | 70 | 101 | 109 | 115 |
| It Takes Two (DX11 Max) | 116 | 125 | 139 | 253 | 131 | 169 | 180 | 197 |
| Mafia Definitive Edition (DX11 High) | 67 | 90 | 110 | 162 | 95 | 125 | 141 | 145 |
To give other readers full context, that video was showing a huge performance difference between the RTX 3080 and RTX 2080. Hardware Unboxed was one of the outlets that reacted to that promo video; they explained that at least some titles shown running at 4K Ultra were VRAM-limited on the 2080, hence the huge delta in performance.

IIRC, before the review embargo on Ampere was lifted there was a DF Ampere early-access video of sorts being passed around here, where they were allowed by Nvidia to share some of the hype, just not exact performance numbers.
This shows that if you wanted a smooth 144 Hz experience in FH4 with a 10100, 7700K, 3300X or similar CPU, then you cannot get it with an NV GPU. If you have Zen+ or a pre-Skylake Intel CPU, the situation is even worse.

Frankly, the R5 1400 was and is a horrible CPU. People forget that it has two CCXes active in a 2+2 setup, so each core has access to at most 4MB of L3.
That means inter-thread communication has to cross the CCX boundary, and on Zen 1 that was an especially slow affair. Any workload that hammers inter-core comms is going to suffer.
AMD has an advantage here: they are the parents of this abomination and probably force critical driver threads onto the same CCX as the game's render threads (see the sketch below), so there is less inter-core overhead and they continue to scale somewhat.
Compare that to a proper quad core with 6MB of L3:
[Attachment 42092]
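To make the thread-pinning idea concrete, here is a minimal sketch assuming the public Win32 affinity APIs, and assuming the first CCX of a 2+2 Zen 1 part shows up as logical processors 0-3 (that mapping is an assumption; verify it with a topology tool before trusting it). A real driver obviously wouldn't go through these calls; this just shows the idea of keeping chatty threads on one L3 domain.

```cpp
// Hypothetical sketch: pin the calling thread to one CCX so that a "driver"
// worker and the render thread share the same L3 and avoid cross-CCX hops.
// The mask assumes the first CCX of a 2+2 Zen 1 part appears as logical
// processors 0-3 (2 cores x 2 SMT threads); check this on real hardware.
#include <windows.h>
#include <cstdio>

int main() {
    const DWORD_PTR firstCcxMask = 0x0F;  // logical CPUs 0..3 (assumed first CCX)

    // Optionally restrict the whole process, then the current thread.
    if (!SetProcessAffinityMask(GetCurrentProcess(), firstCcxMask)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), firstCcxMask);
    if (previous == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Thread pinned; previous mask was 0x%llx\n",
                static_cast<unsigned long long>(previous));
    return 0;
}
```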
True, actually I think even Intel 4C CPUs that were hurt by microcode updates, like pre-Coffee Lake parts, would also have problems.

This shows that if you wanted a smooth 144 Hz experience in FH4 with a 10100, 7700K, 3300X or similar CPU, then you cannot get it with an NV GPU. If you have Zen+ or a pre-Skylake Intel CPU, the situation is even worse.
Wait, what? If anything, Nvidia (granted, they are doing too much on the CPU, as you describe) is lucky CPU performance has climbed so much in the past couple of years compared to the five years prior to that. And arguably that's thanks to AMD turning up the pressure, which, let's be honest, could not have happened if they had made a couple (more) missteps.

True, actually I think even Intel 4C CPUs that were hurt by microcode updates, like pre-Coffee Lake parts, would also have problems.
NV's DX12 driver is far from optimal, even in my limited experience. For example, in Anno 1800 benchmarks of DX11 vs DX12, DX11 consistently wins in averages, even though the later stages of the benchmark are horrendously CPU-limited.
So you either give up peak performance with DX12, or suffer lower minimums with DX11 in the mid-to-late game. And this should be a poster child for DX12 gaming. The only redeeming factor is the game's genre; I ended up running DX12 with an 80 FPS limit, and the 3090 seems to hold it really well.
The real question is: can Nvidia improve their DX12 driver's CPU optimizations? Maybe they are doing too much heavy lifting in the driver (sorting and batching draw calls and so on) that was beneficial in the DX11 era but is hurting them now that GPUs push way more FPS while CPU performance has stagnated.
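To illustrate the kind of heavy lifting meant by "sorting, batching draw calls", here is a toy, purely illustrative sketch in standard C++ of the general technique: sort submissions by a pipeline/material key, then submit contiguous runs of identical state as one batch. This is not Nvidia's driver code; the DrawCall fields and the SubmitBatch stand-in are made up for the example.

```cpp
// Toy illustration of driver-style draw-call sorting/batching.
// The point: reordering costs CPU time every frame, but lets expensive state
// changes (pipeline, textures) happen once per run instead of once per draw.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct DrawCall {
    uint32_t pipelineId;   // shader/pipeline state
    uint32_t materialId;   // textures, constants
    uint32_t meshId;
};

// Sort key: group by pipeline first, then material, so runs share state.
static uint64_t SortKey(const DrawCall& d) {
    return (static_cast<uint64_t>(d.pipelineId) << 32) | d.materialId;
}

static void SubmitBatch(const DrawCall* first, size_t count) {
    // Stand-in for binding state once and issuing 'count' draws.
    std::printf("bind pipeline %u / material %u, draw %zu meshes\n",
                first->pipelineId, first->materialId, count);
}

int main() {
    std::vector<DrawCall> calls = {
        {2, 7, 1}, {1, 3, 2}, {2, 7, 3}, {1, 3, 4}, {1, 9, 5}, {2, 7, 6},
    };

    // 1) Sort by state key: this is the per-frame CPU work being discussed.
    std::sort(calls.begin(), calls.end(),
              [](const DrawCall& a, const DrawCall& b) { return SortKey(a) < SortKey(b); });

    // 2) Walk the sorted list and submit contiguous runs with identical state.
    size_t runStart = 0;
    for (size_t i = 1; i <= calls.size(); ++i) {
        if (i == calls.size() || SortKey(calls[i]) != SortKey(calls[runStart])) {
            SubmitBatch(&calls[runStart], i - runStart);
            runStart = i;
        }
    }
    return 0;
}
```

Under DX11 this kind of reordering mostly lives in the driver; under DX12 it is supposed to live in the engine, which is the point above about work that helped in the DX11 era possibly showing up as driver CPU overhead now.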
Maybe they'll switch back to using hardware scheduling in future architectures, but I can see why they would have skipped it with Ampere. It's doubtful that a hardware scheduler provides any real benefits to compute workloads, so there wasn't as much priority on it.

Obviously, to me, this is a choice they are making for reasons only they seem to know. Maybe it would make the whole upscaling argument more relevant? It's baffling Ampere would not have addressed this given all the silicon searching for a use they did bother to include.
Isn't Zen 3 in the CPU performance lead lately? It might be annoying for Nvidia to use AMD CPUs in reference testing, but more FPS is more FPS? Heck, AMD used Intel CPUs when showcasing their GPUs during the Bulldozer era; no big deal.

Say what you want about what is coming and market share, but Nvidia knows that if they are tying the performance ceiling of their GPUs to CPU performance, they are hitching their wagon to Intel.
Too much drama. I like both DLSS and RT, and I support NV for pushing technologies forward instead of the safe choice of more and more classic rasterization performance.

Obviously, to me, this is a choice they are making for reasons only they seem to know. Maybe it would make the whole upscaling argument more relevant? It's baffling Ampere would not have addressed this given all the silicon searching for a use they did bother to include.
Nvidia doesn't sell x86 CPUs, so they'll use whatever makes their cards look the best. I believe they'd done a bit of marketing in the past with AMD CPUs, but really, until Zen 3 dropped, Intel was still wearing the gaming crown. Maybe once the market settles down a little more we'll see them change over to using AMD CPUs for some of their comparisons.

Isn't Zen 3 in the CPU performance lead lately? It might be annoying for Nvidia to use AMD CPUs in reference testing, but more FPS is more FPS?
DLSS is a bit of a double-edged sword. I think it has the most use in low-power situations and is probably a big deal for Nvidia's ARM ambitions. The flip side is that it might be used as a crutch or, worse yet, push developers towards relying on it to get acceptable performance out of games.

Too much drama. I like both DLSS and RT, and I support NV for pushing technologies forward instead of the safe choice of more and more classic rasterization performance.
Given how long the new consoles were in development, it was coming no matter what. Nvidia was smart to create their own, and to do a good job of marketing it, so as to make their name synonymous with RT in most gamers' minds.

Too much drama. I like both DLSS and RT, and I support NV for pushing technologies forward instead of the safe choice of more and more classic rasterization performance.
I don't know if RTRT was always the plan for the new consoles. The patent for AMD's RT hardware solution was filed in December 2017, which is just months before DXR was launched and well after Turing was designed. Honestly, the relatively late addition of RTRT hardware as a feature requirement could explain some of AMD's RT hardware design choices.

Given how long the new consoles were in development, it was coming no matter what. Nvidia was smart to create their own, and to do a good job of marketing it, so as to make their name synonymous with RT in most gamers' minds.
I remember RT running on a Vega 56. And since it did not require Nvidia hardware to run, and was a DX12 feature, I figured it was being developed and tested for a good while before its release a couple of years ago. You may be right, though; I honestly can't say differently.

I don't know if RTRT was always the plan for the new consoles. The patent for AMD's RT hardware solution was filed in December 2017, which is just months before DXR was launched and well after Turing was designed. Honestly, the relatively late addition of RTRT hardware as a feature requirement could explain some of AMD's RT hardware design choices.
That Crytek demo is always way overstated. It was never going to work in a real game and the RT quality was very low; it was very much a proof of concept for cards 5+ years out. RT appeared because Nvidia pushed it; I think it's only in the consoles at all because the console makers demanded something after seeing what the Nvidia GPUs could do. Hence we get AMD's sticking-plaster solution. If it had really been planned well in advance, AMD would have had a much more Nvidia-like solution, with separate RT hardware for better performance and AI cores for a DLSS equivalent.

I remember RT running on a Vega 56. And since it did not require Nvidia hardware to run, and was a DX12 feature, I figured it was being developed and tested for a good while before its release a couple of years ago. You may be right, though; I honestly can't say differently.
RT went to consoles not because NVIDIA released Turing, but because Microsoft + NVIDIA + AMD were designing DXR together.

That Crytek demo is always way overstated. It was never going to work in a real game and the RT quality was very low; it was very much a proof of concept for cards 5+ years out. RT appeared because Nvidia pushed it; I think it's only in the consoles at all because the console makers demanded something after seeing what the Nvidia GPUs could do. Hence we get AMD's sticking-plaster solution. If it had really been planned well in advance, AMD would have had a much more Nvidia-like solution, with separate RT hardware for better performance and AI cores for a DLSS equivalent.
Not that I think Nvidia did it just for the gamers. Their pro cards have been doing RT (non-real-time) for years, so they'd probably already considered adding specialist hardware to speed that up. Then they had their AI cores for servers, and they worked out that those were really good at touching up images, which is key for RT because you have to de-noise the image every frame. Put them together and you suddenly end up with something capable of real-time RT. Finally, AMD GPUs were rubbish, so Nvidia could afford to use a cheaper process, blow a ton of silicon on next-gen features that weren't used yet, and still end up with something better than AMD.
That Crytek demo is always way overstated. It was never going to work in a real game and the RT quality was very low; it was very much a proof of concept for cards 5+ years out.
RT hardware didn't go into consoles because of the Turing launch, but it might have gone into them because of the Turing RTRT plans. Given how long it likely takes for an API like DXR to be cooked up, the other involved parties would have been in on Nvidia's RTRT plans for Turing a good deal of time before Turing or DXR actually launched. AMD's RT hardware patent filing landing just months before DXR launched makes me doubt that RT hardware was planned all along.

RT went to consoles not because NVIDIA released Turing, but because Microsoft + NVIDIA + AMD were designing DXR together.
Microsoft announced DXR for Windows 10 on March 19, 2018, the same day NVIDIA announced RTX.
Microsoft's DXR Tier 1.1 API was co-developed with AMD for Xbox and Windows.
AMD RDNA2 has dedicated RT cores for ray tracing, and the design was not changed at the last minute to incorporate RT because NVIDIA released Turing in 2018. It was designed with RT from the start; 2018 was too late in RDNA2's development to change anything at that point.
I'd think that if AMD had been sitting on the patent for a significant amount of time before the development of DXR, inline ray tracing would have been a part of DXR 1.0 instead of joining over a year later as part of DXR 1.1.

Patents will be filed as late as possible in order to avoid tipping one's hand to a competitor and to maximize the life of the patent. It's not like either company could just decide to add in ray tracing on a whim, and it's likely something that both had been working on for a considerable time.
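For reference, the DXR 1.0 vs 1.1 split discussed above is something an engine can query at runtime through D3D12's feature-support check. A minimal sketch, assuming an already-created ID3D12Device (device creation omitted):

```cpp
// Minimal sketch: query which DXR tier the device/driver exposes.
// Assumes 'device' is an already-created ID3D12Device.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

void ReportRaytracingTier(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5)))) {
        std::printf("OPTIONS5 query not supported (old runtime)\n");
        return;
    }
    switch (opts5.RaytracingTier) {
        case D3D12_RAYTRACING_TIER_NOT_SUPPORTED:
            std::printf("No DXR support\n");
            break;
        case D3D12_RAYTRACING_TIER_1_0:
            std::printf("DXR 1.0: shader-table-based ray tracing only\n");
            break;
        case D3D12_RAYTRACING_TIER_1_1:
        default:
            std::printf("DXR 1.1: adds inline ray tracing (RayQuery), among other things\n");
            break;
    }
}
```

Inline ray tracing (RayQuery) is a Tier 1.1 feature that arrived well after the original DXR 1.0 release, which is the "over a year later" gap referenced above.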
Even now it's a bit ahead of its time. Few games support it, many that do aren't making the best use of it, and the performance definitely isn't there outside of cards at the extreme high end. If Nvidia didn't have DLSS to go along with it, I question whether they would have released it at all, because it would necessitate running games at lower resolutions just to hit acceptable frame rates.
My point was that no matter what they could bench with, Intel sells the most x86 CPUs, so Ampere is going to be paired with Intel a lot. Maybe even in the vast majority of cases when we take prebuilt PCs into account (and those are basically the only way to buy RTX cards right now)... And you have to make sure you are performant on the most widely used brand, right? It would "come out" if you could use value CPUs with the competition and still get solid benchmark results.

Isn't Zen 3 in the CPU performance lead lately? It might be annoying for Nvidia to use AMD CPUs in reference testing, but more FPS is more FPS? Heck, AMD used Intel CPUs when showcasing their GPUs during the Bulldozer era; no big deal.
With CPU core counts increasing there is actually more horsepower to extract from CPUs. Nvidia was already doing this kind of multithreaded processing back in the DX11 era.
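As a rough, API-free sketch of where that multithreading is supposed to live under DX12 (the engine itself spreads command recording across worker threads, rather than the driver doing it behind the scenes), here is a small example using std::thread with stand-in command buffers. In a real renderer each worker would record into its own ID3D12GraphicsCommandList and the main thread would submit them with ExecuteCommandLists; everything below is a made-up stand-in for illustration.

```cpp
// API-free sketch of the DX12-style model: each worker thread records its own
// slice of the frame's draw calls into its own buffer, then the main thread
// "submits" them. Stand-in types; no real graphics API calls.
#include <algorithm>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct Command { std::string description; };
using CommandBuffer = std::vector<Command>;

// Pretend to record draws [begin, end) into a per-thread buffer.
static void RecordSlice(int begin, int end, CommandBuffer& out) {
    for (int i = begin; i < end; ++i) {
        out.push_back({"draw object " + std::to_string(i)});
    }
}

int main() {
    const int totalDraws = 1000;
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<CommandBuffer> buffers(workers);  // one buffer per thread: no sharing
    std::vector<std::thread> threads;

    const int perWorker =
        (totalDraws + static_cast<int>(workers) - 1) / static_cast<int>(workers);
    for (unsigned w = 0; w < workers; ++w) {
        const int begin = static_cast<int>(w) * perWorker;
        const int end = std::min(totalDraws, begin + perWorker);
        threads.emplace_back(RecordSlice, begin, end, std::ref(buffers[w]));
    }
    for (auto& t : threads) t.join();

    // "Submit": in D3D12 this would be ExecuteCommandLists on the queue.
    size_t recorded = 0;
    for (const auto& b : buffers) recorded += b.size();
    std::printf("%u workers recorded %zu commands\n", workers, recorded);
    return 0;
}
```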
To be fair, the Ryzen 1400 is only extracting half of the performance out of AMD's high-end GPUs as well.

Same story: the new patch didn't help with the CPU overhead; even the RTX 3060 is affected at 1080p with the Ryzen 1400.
If you max out every setting without understanding the visual vs. performance impact of each individual one, you will likely get a sub-par experience.

Cyberpunk is an unoptimized mess; I had parts where I was bottlenecked by the 3090 even at 1080p. At 4K with max graphics you can't play it without DLSS (~10-25 FPS). So yeah, that was my experience with the game. So I switched back to Doom Eternal, a.k.a. a technical marvel; the game gets repetitive, but it's a pleasure to deal with, especially on high-refresh, HDR-enabled monitors (e.g. the LG 27GN950-B). Kudos to id. I wish more games got to use their technology.
Edit: the screenshot below was one such spot, pre-1.20 patch. The 3090 was totally demolished by the game; lowest FPS I've seen.