Radeon 7900 Reviews


Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Will update this list as more come along.

ArsTechnica:
(Ryzen 5800X3D, Asus ROG Crosshair VIII Dark Hero, 64GB DDR4-3200, Windows ???)
https://arstechnica.com/gadgets/202...0-gpus-are-great-4k-gaming-gpus-with-caveats/

Gamers Nexus:
https://www.youtube.com/watch?v=We71eXwKODw

Guru3D:
(Ryzen 5950X, ASUS X570 Crosshair VIII HERO, 32 GB (4x 8GB) DDR4 3600 MHz, Windows 10)
https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,1.html

Hardware Canucks:
(Ryzen 7700X, Asus X670E ROG Crosshair Hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=t3XPNr506Dc

Hardware Unboxed:
(Ryzen 5800X3D, MSI MPG X570S Carbon Max WiFi, 32GB DDR4-3200, Windows 11)
https://www.youtube.com/watch?v=4UFiG7CwpHk

Igor's Lab:
(Ryzen 7950X, MSI MEG X670E Ace, 32GB DDR5-6000)
https://www.igorslab.de/en/amd-rade...giant-step-ahead-and-a-smaller-step-sideways/

Jay's Two Cents:
https://www.youtube.com/watch?v=Yq6Yp2Zxnkk

KitGuruTech:
(Intel 12900K, MSI MAG Z690 Unified, 32GB DDR5)
https://www.youtube.com/watch?v=qThrADqleD0

Linus Tech Tips:
https://www.youtube.com/watch?v=TBJ-vo6Ri9c

Paul's Hardware:
(Ryzen 7950X, Asus X670E ROG Crosshair Hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=q10pefkW2qg

PC Mag:
(Intel 12900K, Asus ROG Maximus Z690 Hero, 32GB 5600MHz, Windows 11)
https://www.pcmag.com/reviews/amd-radeon-rx-7900-xtx

Tech Power Up:
(Intel 13900K, ASUS Z790 Maximus Hero, 2x 16GB DDR5-6000, Windows 10)
AMD: https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/
ASUS: https://www.techpowerup.com/review/asus-radeon-rx-7900-xtx-tuf-oc/
XFX: https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/

Tech Spot:
(Ryzen 5800X3D, MSI MPG X570S, 32GB of dual-rank, dual-channel DDR4-3200 CL14, Windows ???)
https://www.techspot.com/review/2588-amd-radeon-7900-xtx/

TechTesters:
(Intel 13900K, ASUS ROG Maximus Z790 HERO, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=3uQh4GkPopQ
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,891
7,289
136
Well, this card certainly has the potential to run at 3 GHz+ if they manage to smooth out the voltage/power curve somehow:


[Image: Cyberpunk 2077 overclocking results (oc-cyberpunk.png)]


It scales quite well, once you put 450-500W into it :D
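
Rough back-of-the-envelope on what that kind of scaling costs in efficiency. A minimal sketch, assuming a ~355 W stock board power and ~10% more performance at ~500 W (all numbers are illustrative, not measured):

```python
# Sketch of the perf-per-watt trade-off described above.
# All figures are assumptions for illustration, not measurements.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance (1.0 = stock) divided by board power."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 355)   # assumed 7900 XTX stock board power
pushed = perf_per_watt(1.10, 500)  # hypothetical +10% perf at ~500 W

print(f"stock:  {stock:.5f} perf/W")
print(f"pushed: {pushed:.5f} perf/W")
print(f"efficiency change: {pushed / stock - 1:+.1%}")
# ~41% more power for ~10% more performance: roughly a 22% efficiency loss.
```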

- NV was willing to dump hundreds of extra watts into the 4090 for 5-10% more performance. If only AMD had the cojones to do the same (and just throw a "quiet" mode on the card for everyone else).

This whole launch would have had a completely different tenor.

Love the passion in here too. Nothing gets a bunch of PC geeks worked up quite like a new GPU launch.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136

So, you are saying this result with the 4090 being 5% faster is accurate:

[Image: benchmark chart showing the 4090 5% ahead]

But this result with a minor swing in favor of the 7900XTX under a multiplayer test run shows clear bias/ineptitude? You do realize these relative results are not far from each other at all and the relative difference is well within what you would expect when testing different play modes or even just different scenes, right?

[Image: TechSpot RX 7900 XTX review benchmark chart]


You also realize that HWUB's average at 4K only had the 7900XTX with a 3.7% lead over the 4080, whereas Computerbase, which you said is the best out there, had a 0.3% difference in favor of the 7900XTX. All this consternation when their results are virtually the same in the end.
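
To put numbers on how close those two averages are, a quick sketch (the FPS values are invented solely to reproduce the quoted percentage leads):

```python
# How far apart are a 3.7% lead and a 0.3% lead, really?
# FPS values below are invented to match the quoted percentages.

def lead_pct(card_a_fps: float, card_b_fps: float) -> float:
    """Percent lead of card A over card B."""
    return (card_a_fps / card_b_fps - 1) * 100

fps_4080 = 100.0          # normalize the 4080 to 100 fps
hwub_xtx = 103.7          # HWUB: ~3.7% lead for the 7900 XTX at 4K
computerbase_xtx = 100.3  # Computerbase: ~0.3% lead

print(f"HWUB lead:         {lead_pct(hwub_xtx, fps_4080):.1f}%")
print(f"Computerbase lead: {lead_pct(computerbase_xtx, fps_4080):.1f}%")

# The outlets disagree by ~3.4 fps per 100 -- well within what different
# game selections, scenes, and test rigs produce.
print(f"Disagreement: {hwub_xtx - computerbase_xtx:.1f} fps per 100")
```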
 

amenx

Diamond Member
Dec 17, 2004
3,968
2,196
136
Looking at the TPU review of the ASUS model, I see they have even the reference 7900XTX beating the 4090 in Far Cry 6. Even when RT is turned on, the reference 7900XTX virtually ties the 4090. Time to pull out the pitchforks and burn down TPU for clearly having flawed benchmark methodology and an AMD bias! Who's with me?

[Image: TPU Far Cry 6 benchmark chart (attachment 72796)]
Surely you must know there are different implementations of RT, some more intensive than others: some only shadows (SOTR), some reflections (FC6), some with global illumination (Metro), some with hybrid GI (Cyberpunk), etc. You can't lump them together as all equal in performance hit.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136
Surely you must know there are different implementations of RT, some more intensive than others: some only shadows (SOTR), some reflections (FC6), some with global illumination (Metro), some with hybrid GI (Cyberpunk), etc. You can't lump them together as all equal in performance hit.

Where did I lump them together?
 

blckgrffn

Diamond Member
May 1, 2003
9,139
3,074
136
www.teamjuchems.com
- NV was willing to dump hundreds of extra watts into the 4090 for 5-10% more performance. If only AMD had the cojones to do the same (and just throw a "quiet" mode on the card for everyone else).

This whole launch would have had a completely different tenor.

Love the passion in here too. Nothing gets a bunch of PC geeks worked up quite like a new GPU launch.

Man, I really hope the 7950XTX is a respin that fixes the teething issues and is faster in the same “envelope” - AMD saying “hold my beer” and using a 3x 8-pin design with a cooler just a smidgen smaller than the 4080/4090 and amping it to 450W+ is just not my idea of progress.

But yeah, many bar graphs would swing the right way, I get that.

I feel like that's what the interim RDNA xx50 refresh was, and I'm meh on it.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136
By inference. That if the XTX did so well in FC6 with RT on, equaling the RTX 4090, then it must be a great RT performer.

I think you completely missed the point of my post. I was not inferring any such thing. Maybe I needed to add the traditional /s at the end of my post, but I thought I was being obvious with it.
 

amenx

Diamond Member
Dec 17, 2004
3,968
2,196
136
I think you completely missed the point of my post. I was not inferring any such thing. Maybe I needed to add the traditional /s at the end of my post, but I thought I was being obvious with it.
Yeah, I knew you were being /s; I just thought the RT bit was odd. I really want to see the 7900 cards in the best possible light because, for the first time in a long time, I want to jump off the NV train, and so I was hyped up for RDNA 3. As mentioned earlier, I'm still on the fence and want to see these cards widely circulated with plenty of user feedback before I decide.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You should realise that each game is different, and in some cases maxing settings means pushing things beyond reasonable, creating artificial bottlenecks. In many cases this is the prime tool AMD, Nvidia, or Intel will use through their developer partnership programs to create an artificial advantage over their competition, playing on their own architectural advantages to win over the rest of the field, including their own prior generations.
I'm not saying this is good or bad, as that sometimes is the only way to push new interesting technologies to the market, but limiting yourself to only MAXED SETTINGS is not how the majority of people will be using their purchased hardware. You need datapoints from all types of workloads to understand how hardware works.

I think gauging raw performance is important when testing a new GPU. Don't you want to find out how it behaves when the settings are dialed up and it's stressed to the max?

Not only from a performance standpoint either, but from a build standpoint: what kind of temperatures and power draw can you expect under full load during gaming?

Instead, we have certain reviewers presenting benchmarks that aren't using ultra settings even when the option exists. And for CPU reviews, HWUB doesn't even test games under CPU-limited scenarios for the most part.

I can't believe we're even debating this. Computerbase.de tested all of these GPUs at maxed settings with RT on and off, and also with DLSS and FSR on and off.

That should be the standard, and none of this half-assed crap.
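
For what it's worth, that kind of full settings matrix is trivial to enumerate; a minimal sketch (the GPU names and options here are placeholders, not Computerbase's actual harness):

```python
from itertools import product

# Placeholder settings matrix in the spirit of the approach described
# above; names and options are assumptions, not Computerbase's harness.
gpus = ["RX 7900 XTX", "RTX 4080", "RTX 4090"]
ray_tracing = [False, True]
upscaling = ["native", "FSR", "DLSS"]

for gpu, rt, scaler in product(gpus, ray_tracing, upscaling):
    # DLSS only runs on NVIDIA hardware; skip invalid combinations.
    if scaler == "DLSS" and not gpu.startswith("RTX"):
        continue
    print(f"{gpu}: maxed settings, RT {'on' if rt else 'off'}, {scaler}")
```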

Example: The Witcher 2 and SSAA, which was implemented but not useful for a decade. It was an interesting option no sane gamer would use in their settings to play that game within 5 years of its launch.

Yeah, I remember that setting, and the developer stated it was for "future GPUs." I don't think it took a decade to become playable either. That doesn't mean testing it with contemporary GPUs wasn't warranted though, and as I recall reviewers did test with that setting, but just for laughs.

But in any case, these reviews are supposed to show us what the GPUs are capable of. How can you expose their capabilities when they're not being pushed to the max?
 

Gideon

Golden Member
Nov 27, 2007
1,665
3,776
136
Man, I really hope the 7950XTX is a respin that fixes the teething issues and is faster in the same “envelope” - AMD saying “hold my beer” and using a 3x 8-pin design with a cooler just a smidgen smaller than the 4080/4090 and amping it to 450W+ is just not my idea of progress.

But yeah, many bar graphs would swing the right way, I get that.

I feel like that's what the interim RDNA xx50 refresh was, and I'm meh on it.
Don't forget, according to Skyjuice the MCDs can also have a 3D cache layer, bumping the L3 to over 190 MB. Add that on top of the rest and it might be quite the performer.

(yea yea, hype train, i know!)
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Looking at the TPU review of the ASUS model, I see they have even the reference 7900XTX beating the 4090 in Far Cry 6. Even when RT is turned on, the reference 7900XTX virtually ties the 4090. Time to pull out the pitchforks and burn down TPU for clearly having flawed benchmark methodology and an AMD bias! Who's with me?

[Image: TPU Far Cry 6 benchmark chart (attachment 72796)]

Hate to break it to you, but that's not the benchmark that includes ray tracing. This one is :cool:

Also, Far Cry 6 uses hybrid RT reflections combined with SSR, which probably reduces the RT workload.

[Image: Far Cry 6 RT benchmark at 3840x2160 (far-cry-6-rt-3840-2160.png)]
 

In2Photos

Golden Member
Mar 21, 2007
1,636
1,656
136
Hate to break it to you, but that's not the benchmark that includes ray tracing. This one is :cool:

Also, Far Cry 6 uses hybrid RT reflections combined with SSR, which probably reduces the RT workload.

[Image: Far Cry 6 RT benchmark at 3840x2160 (far-cry-6-rt-3840-2160.png)]
Hate to break it to you, but his post said that it beats the 4090 in raster and nearly ties the 4090 with RT on. The graph he posted was for raster.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So, you are saying this result with the 4090 being 5% faster is accurate:

But this result with a minor swing in favor of the 7900XTX under a multiplayer test run shows clear bias/ineptitude? You do realize these relative results are not far from each other at all and the relative difference is well within what you would expect when testing different play modes or even just different scenes, right?

If you followed the thread, @alexruiz claimed that "every reviewer got comparable results in this game" when referring to Call of Duty MW2.

I merely cited 3 examples that contradicted his statement. When the settings are dialed up, the RTX 4090 will pull ahead slightly compared to when using basic settings.

This is in line with observations that AMD's DX12 driver is more efficient at lower resolutions and settings than Nvidia's.

You also realize that HWUB's average at 4K only had the 7900XTX with a 3.7% lead over the 4080, whereas Computerbase, which you said is the best out there, had a 0.3% difference in favor of the 7900XTX. All this consternation when their results are virtually the same in the end.

Which graph are you looking at? Computerbase.de has the RTX 4080 leading the 7900XTX by 4% in rasterization at 4K, and it's 32% with RT.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136
If you followed the thread, @alexruiz claimed that "every reviewer got comparable results in this game" when referring to Call of Duty MW2.

I merely cited 3 examples that contradicted his statement.

But they don't contradict his statement; that's the thing. A small swing one way or the other is basically the definition of comparable.

Which graph are you looking at? Computerbase.de has the RTX 4080 leading the 7900XTX by 4% in rasterization at 4K, and it's 32% with RT.

[Images: Computerbase benchmark charts]

 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Reread my post; I never claimed the graph I showed was with RT enabled.

You implied it, though, when you said:

Even when RT is turned on, the reference 7900XTX virtually ties the 4090.

Either way though, the outcome isn't much different to be honest. Other reviewers showed close performance between the 7900XTX and the RTX 4090 in Far Cry 6 both with and without RT enabled.

As I said before, the RT workload doesn't appear to be very high due to the hybridized reflections, and the RT shadows have lots of limitations. The Dunia engine has also historically scaled poorly across CPU threads, so that probably has some impact as well.

I can almost guarantee that the RTX 4090 isn't even running at full bore at 4K maxed settings with RT enabled.

 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
But they don't contradict his statement; that's the thing. A small swing one way or the other is basically the definition of comparable.

This guy meant that the 7900XTX was faster than the RTX 4090, not that they were comparable. You have to look at the context of the conversation.

The graph that @amenx posted originally from HWUB had the 7900XTX 28% faster than the RTX 4090, which is a big gap.

And I think it's likely accurate, because AMD's DX12 driver is more efficient than Nvidia's under CPU-limited circumstances. When things are GPU-limited though, it's the opposite.
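
That argument is easy to model: delivered frame rate is roughly the minimum of what the CPU plus driver can feed and what the GPU can render. A toy sketch with invented throughput numbers (not benchmark data):

```python
# Toy model of the CPU-limited vs. GPU-limited argument above.
# All throughput numbers are invented for illustration.

def delivered_fps(cpu_feed_fps: float, gpu_render_fps: float) -> float:
    """The slower of the two pipelines caps the frame rate."""
    return min(cpu_feed_fps, gpu_render_fps)

# 1440p multiplayer (CPU/driver-bound): assume AMD's driver has lower
# per-frame CPU overhead, so it feeds frames faster.
print(delivered_fps(cpu_feed_fps=240, gpu_render_fps=300))  # 7900 XTX: 240
print(delivered_fps(cpu_feed_fps=200, gpu_render_fps=380))  # RTX 4090: 200

# 4K maxed (GPU-bound): the faster GPU wins regardless of driver overhead.
print(delivered_fps(cpu_feed_fps=240, gpu_render_fps=110))  # 7900 XTX: 110
print(delivered_fps(cpu_feed_fps=200, gpu_render_fps=140))  # RTX 4090: 140
```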
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136
You implied it, though, when you said:

No, there was no such implication. The graph I showed was clearly for rasterization, as the 7900XTX was leading in the graph just as I said it did. I just added the further commentary that, when RT was enabled, the 7900XTX was still in a virtual tie with the 4090, which is also true.


Either way though, the outcome isn't much different to be honest. Other reviewers showed close performance between the 7900XTX and the RTX 4090 in Far Cry 6 both with and without RT enabled.

As I said before, the RT workload doesn't appear to be very high due to the hybridized reflections, and the RT shadows have lots of limitations. The Dunia engine has also historically scaled poorly across CPU threads, so that probably has some impact as well.

I can almost guarantee that the RTX 4090 isn't even running at full bore at 4K maxed settings with RT enabled.


You're still completely missing the point of the post and actually reinforcing it at the same time.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,209
136
This guy meant that the 7900XTX was faster than the RTX 4090, not that they were comparable. You have to look at the context of the conversation.

The graph that @amenx posted originally from HWUB had the 7900XTX 28% faster than the RTX 4090, which is a big gap.

And I think it's likely accurate, because AMD's DX12 driver is more efficient than Nvidia's under CPU-limited circumstances. When things are GPU-limited though, it's the opposite.

No one has argued the 7900XTX is faster than the 4090. No one. There are some corner cases where it might be, but I don't think anyone has argued it is faster outside of a couple of those corner cases.

@amenx posted the 4K results. @biostud posted the 1440p results. He wasn't claiming the 7900XTX was faster because of this one result; he was literally asking why it was beating the 4090 in this corner case, which is not only at 1440p but was also tested in a multiplayer setting, which increases the CPU load/dependence. So if someone is looking to game at all competitively in MW2 multiplayer, this is a great test case to see. If it's not relevant, ignore it and move on. There are plenty of corner cases on both sides that are not relevant at all to me, and I just plainly ignore them. At most I'll say I don't think it's relevant for x, y, and z, and that's it. If others find it useful, then great. No need to go further than that.
 

blckgrffn

Diamond Member
May 1, 2003
9,139
3,074
136
www.teamjuchems.com
How kids can play games today, I have no idea.

Their parents buy them a gaming laptop with a 1650 in it, if they are lucky. Otherwise they get an Xe or Vega iGPU.

If they are lucky, an XPS with a 3060!

We are way over on the margins here. Speaking for myself, I build PCs for fun and so peddle more in the 6700 XT/3060 Ti range. Out of all the others, I've done two with 3080s and one awesome 3090 Ti build.
 

biostud

Lifer
Feb 27, 2003
18,266
4,779
136
Asus TUF 7900XTX overclocked performance; it does not say anything about power consumption.
[Image: Cyberpunk 2077 overclocking results (oc-cyberpunk.png)]
 

biostud

Lifer
Feb 27, 2003
18,266
4,779
136
No one has argued the 7900XTX is faster than the 4090. No one. There are some corner cases where it might be, but I don't think anyone has argued it is faster outside of a couple of those corner cases.

@amenx posted the 4K results. @biostud posted the 1440p results. He wasn't claiming the 7900XTX was faster because of this one result; he was literally asking why it was beating the 4090 in this corner case, which is not only at 1440p but was also tested in a multiplayer setting, which increases the CPU load/dependence. So if someone is looking to game at all competitively in MW2 multiplayer, this is a great test case to see. If it's not relevant, ignore it and move on. There are plenty of corner cases on both sides that are not relevant at all to me, and I just plainly ignore them. At most I'll say I don't think it's relevant for x, y, and z, and that's it. If others find it useful, then great. No need to go further than that.
Even if it is more CPU-dependent in MP, wouldn't you just expect the cards to tie in performance, rather than see the "lesser" card severely beating the much faster card?