Radeon 7900 Reviews

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Will update this list as more come along.

ArsTechnica:
(Ryzen 5800X3D, Asus ROG Crosshair VIII Dark Hero, 64GB DDR4-3200, Windows ???)
https://arstechnica.com/gadgets/202...0-gpus-are-great-4k-gaming-gpus-with-caveats/

Gamers Nexus:
https://www.youtube.com/watch?v=We71eXwKODw

Guru3D:
(Ryzen 5950X, ASUS X570 Crosshair VIII HERO, 32 GB (4x 8GB) DDR4 3600 MHz, Windows 10)
https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,1.html

Hardware Canucks:
(Ryzen 7700X, Asus X670E ROG Crosshair hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=t3XPNr506Dc

Hardware Unboxed:
(Ryzen 5800X3D, MSI MPG X570S Carbon Max WiFi, 32GB DDR4-3200, Windows 11)
https://www.youtube.com/watch?v=4UFiG7CwpHk

Igor's Lab:
(Ryzen 7950X, MSI MEG X670E Ace, 32GB DDR5-6000)
https://www.igorslab.de/en/amd-rade...giant-step-ahead-and-a-smaller-step-sideways/

Jay's Two Cents:
https://www.youtube.com/watch?v=Yq6Yp2Zxnkk

KitGuruTech:
(Intel 12900K, MSI MAG Z690 Unified, 32GB DDR5)
https://www.youtube.com/watch?v=qThrADqleD0

Linus Tech Tips:
https://www.youtube.com/watch?v=TBJ-vo6Ri9c

Paul's Hardware:
(Ryzen 7950X, Asus X670E ROG Crosshair Hero, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=q10pefkW2qg

PC Mag:
(Intel 12900K, Asus ROG Maximus Z690 Hero, 32GB 5600MHz, Windows 11)
https://www.pcmag.com/reviews/amd-radeon-rx-7900-xtx

Tech Power Up:
(Intel 13900K, ASUS Z790 Maximus Hero, 2x 16 GB DDR5-6000 MHz, Windows 10)
AMD: https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/
ASUS: https://www.techpowerup.com/review/asus-radeon-rx-7900-xtx-tuf-oc/
XFX: https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/

Tech Spot:
(Ryzen 5800X3D, MSI MPG X570S, 32GB of dual-rank, dual-channel DDR4-3200 CL14, Windows ???)
https://www.techspot.com/review/2588-amd-radeon-7900-xtx/

TechTesters:
(Intel 13900K, ASUS ROG Maximus Z790 HERO, 32GB DDR5-6000, Windows 11)
https://www.youtube.com/watch?v=3uQh4GkPopQ
 

lixlax

Member
Nov 6, 2014
183
150
116
I see lots of people saying that the 7900XT should've been called the 7800XT and priced at 650 USD.
I just realised that where I live, 900 USD is now worth exactly as much as 650 USD was worth in December 2020.

If the XTX is a 6900XT replacement, then it should've been priced at 1300-1400😆.

Inflation really is sad.
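The back-of-the-envelope claim above (900 USD now buying what 650 USD bought in December 2020) can be sanity-checked with a quick sketch. This is just illustrative arithmetic on the two prices from the post, not official CPI data:

```python
# Sanity check: what cumulative inflation makes $650 (Dec 2020) equal $900 today?
old_price = 650.0
new_price = 900.0

cumulative = new_price / old_price - 1           # total inflation over the period
annualized = (new_price / old_price) ** 0.5 - 1  # rough per-year rate over ~2 years

print(f"cumulative: {cumulative:.1%}")   # 38.5%
print(f"annualized: {annualized:.1%}")   # 17.7% per year
```

So the post is implicitly claiming roughly 38% cumulative local inflation in two years, i.e. high-teens per year.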
 

coercitiv

Diamond Member
Jan 24, 2014
6,214
11,959
136
there was a time when they were going for volume, but what mostly happened is that people bought Nvidia cards for less.
Gamers trained AMD NOT to play the value game; now everybody is surprised that AMD plays the margin game while figuring out ways to improve their mind share instead.

Even someone like RedGamingTech figured this out:
How "we want a competitor" works:
  1. I'd buy AMD if they had better hw/sw
  2. AMD improves
  3. Nvidia launches product
  4. AMD is slightly worse in X, I'll buy if they're cheap
  5. AMD sells cheaper
  6. I wonder if Nvidia will lower their price to match
  7. *person buys Nvidia*
 

Saylick

Diamond Member
Sep 10, 2012
3,172
6,410
136
Yikes. I had tempered my expectations, but I didn't think AMD would still sail underneath them. It definitely puts a damper on my original plan to get a 7900XTX at launch. Clearly there's an issue in the GCD, and perhaps with the drivers as well (it should not be drawing 100W at idle with a multi-monitor setup). It really does seem like AMD bit off more than they could chew here. I'll sit this out for a few months until some of this stuff is sorted, and then reevaluate. I really do hope they get a respin out in 6 months rather than 12.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
Who was able to get an XTX?

I got my hands on a reference XT, and an XFX Merc XT as well, but no XTX yet.
Supposedly I should be getting an XTX Merc too, that is, if B&H ships it soon.

For all the comments about noise, the reference version's noise level doesn't seem any different from the RX 6000 series reference cards.
The card is heavier than the RX 6000s, but very comparable in dimensions. The reference card will fit in almost any case.
I don't know if the Radeon logo is supposed to light up or not, because it stays off on the XT reference version.

As for the Merc, well, let's just say it is huge. Most normal-sized towers won't be able to take it.
I will put it in a different build and see how it goes. It comes with a big support bracket to hold it.

With Adrenalin 22.12.1, and paired with a Ryzen 7 7700X on a Gigabyte B650 Aorus Elite AX, the reference card is stable and fast.

I know that the XT is looked down on, and many of us see it as existing mainly to upsell the XTX.
In the end, however, as much as we dislike the price, there is no faster $900 card available.
The supply of 6800XT, 6900XT and 6950XT cards has almost dried up, and the ones available are not far in price from a 7900XT.
RTX 3080 Tis, 3090s and 3090 Tis are more expensive than an XTX.
 

KompuKare

Golden Member
Jul 28, 2009
1,016
934
136
The prices AMD is charging for a clearly inferior product make me see them as totally delusional and capable of only making bad choices. They won't last long in the market doing this crap. These 7900 cards aren't worth anything near $1000. They should be $600 and $700, not $900 and $1000, lol. Totally ridiculous. After the launch, people suddenly started buying 4080s because these 7900 cards suck so badly they make even the 4080 look like a good deal. I expect Nvidia could completely bury AMD in a single generation by releasing high-value, big-performing products, but they don't, if only to avoid monopoly-law issues, etc.
What this launch says to me is that AMD are not interested in volume or in clawing back market share. Even when they have lower costs (the whole point of chiplets), they want to continue to dream about high margins.

The Radeon group has been losing market share for years now. There was a time when they were going for volume, but what mostly happened is that people bought Nvidia cards for less.

However, 10% of the market is not viable, and undercutting Nvidia by only 5-10% is not going to gain them much market share.

While we have no idea if they are production limited by packaging chiplet GPUs, it seems more likely that the obsession with margins is behind it.

The irony is that what has happened for a few gens is basically this:
  1. They launch a card. Drivers are not yet polished, or there is some other issue*.
  2. The card is priced about 10% less than the similar Nvidia card.
  3. The card sells poorly.
  4. Drivers improve and the card might move one or more tiers up compared to the competition.
  5. The card still sells poorly.
  6. AMD swallow their pride and reduce the price.
Rinse and repeat.

The price cuts often take a long time to materialize and are not official.

So what has often happened is that they have high margins at launch but little volume.
high margins * low volume = not much profit.
Later they have lower margins and a bit more volume.
Lower margins * medium volume = not much profit.
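That arithmetic can be sketched out quickly. All prices, unit costs, and unit counts below are invented for illustration, not AMD's actual figures:

```python
# Toy model of the margin/volume trade-off described above.
# Every number here is a made-up illustrative value.
def profit(price, unit_cost, units):
    return (price - unit_cost) * units

launch = profit(price=1000, unit_cost=700, units=100_000)  # high margin, low volume
later  = profit(price=850,  unit_cost=700, units=250_000)  # lower margin, a bit more volume
value  = profit(price=800,  unit_cost=700, units=900_000)  # low margin, high volume

print(launch, later, value)  # 30000000 37500000 90000000
```

The middle row is the cycle described above: cutting the price late gains some volume but little profit. Only a deliberate high-volume strategy (the last row) changes the picture, and that requires committing to it from launch.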

It seems to be a predictable cycle.

But to generate profit? Well, you either need high margins or high volumes. Now that consoles are not (yet) competing for the same wafers, and assuming that AMD is actually interested in getting a return on the huge fixed costs of designing RDNA3, I can't help but think that lower margins and far greater volumes would have made sense.

That Nvidia is okay with killing the golden "gamers will buy anything at any price" goose, destroying the PC gaming market by over-pricing everything, doesn't sound like a successful long-term strategy.

* I freely admit that some of these issues do seem to get blown up a lot and suspiciously follow very similar talking points - almost exactly what we would expect if some PR company was using influencers to drive the narrative.
 

Saylick

Diamond Member
Sep 10, 2012
3,172
6,410
136
Gamers trained AMD NOT to play the value game; now everybody is surprised that AMD plays the margin game while figuring out ways to improve their mind share instead.
Same thing happened when AMD took the gaming crown with Zen 3. AMD priced their Zen 3 lineup based on how it performed relative to the competition, i.e. higher than Intel because they simply had the superior CPU, and then a sizeable portion of gamers cried foul play.

People expect AMD to be the budget brand, and with that implication comes the expectation that AMD's sole purpose in the market is to be a price check on Intel and Nvidia. They expect AMD to always price their products a step down from their competitors, never in line with the competition's pricing structure, even if they have the better product. If AMD gave in to gamers' silly expectations instead of pricing things from the simple business standpoint of "what makes us the most money", they'd be broke because they'd always be selling themselves short.

I've heard people constantly say "AMD needs to gain market share, so the 7900XTX should have been $600", but they are just looking out for themselves. They don't make that statement because they care about AMD's market share, or the lack thereof; they really just want cheaper GPUs, and they know that Nvidia will never give that to them, so they yet again put the expectation on AMD to be the budget brand. Unsurprisingly, the same people are actually just waiting for Nvidia to follow suit and lower the price of the 4080/4090 to buy that instead.
 
Jul 27, 2020
16,340
10,352
106

1670869434867.png

7900XTX is beating the 4080 FE in 10 out of 17 games at 5,120 × 2,880. I can see why AMD was promoting the fake 8K Samsung monitor in its presentation so much. At these high resolutions, the 4080 FE's 16GB VRAM starts to become problematic.
 

Timorous

Golden Member
Oct 27, 2008
1,626
2,793
136
After looking at more reviews, one thing is clear: the numbers are all over the place. Some games work as expected and others just fall off a cliff, with barely any gain at all over the 6950XT.

I expect going forward there will be driver updates to make improvements and there will be a bit of a fine wine effect.

I was considering buying one of these GPUs early next year, but now I am not so sure. I will have to wait and see how the drivers mature and what kind of tweaking can be done. I would consider NV, but my setup will have 2 OLED TVs for screens and they don't have (or I have not seen, at least) cards with 2x HDMI 2.1 outs, whereas quite a few of the AMD AIBs have replaced the USB-C port with a second HDMI 2.1 output, which is ideal for what my setup will have.

Yes, they stated many times that they are not changing the CPU to a 13900K, to keep the accumulated data consistent and to wait for Zen4 3D Cache; if you watched their videos you would know this. Whether you like it or not, it's not your decision, but you can stop watching them if you don't like it.
How about GN using 12700K, is that wrong in your eyes too?
How about TPU using 5800X on AMD side when they could easily use 7950X?

Why create drama where there is none? They clearly state the game/resolution/settings and the platform used for testing; what else do you need? Are you looking for tests using only Ultra settings at 4K or 8K with an overclocked 13900K and over-7000MHz memory? If that's the case, then limit your reading/watching to content providing that and don't 'demand' everyone follow suit, as that would become boring after reading 2 reviews. Sorry, I do not see your bashing of HWUB as valid, as I appreciate a wide spectrum of data points.
It would be a different matter if they manipulated results in a malicious way, but all their numbers are fully reproducible.

TPU upgraded to a 13900K for the 7900XT and XTX reviews, which is why the selection of cards is smaller than in their prior reviews.
 

coercitiv

Diamond Member
Jan 24, 2014
6,214
11,959
136
This was posted before, but anyone looking at it will see a busted architecture, or at least one where some parts are being used intermittently. Look at that voltage spread for the same frequency.
It's worth noting that RDNA2 also exhibits high voltage spread for the same frequency, albeit with a much cleaner distribution.
1670930320304.png
1670930529180.png
 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136
Looking at the TPU review of the ASUS model, I see they have even the reference 7900XTX beating the 4090 in Far Cry 6. Even when RT is turned on, the reference 7900XTX virtually ties the 4090. Time to pull out the pitchforks and burn down TPU for clearly having flawed benchmark methodology and an AMD bias! Who's with me?

1670950554830.png
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,827
7,191
136
Well, this card certainly has the potential to run at 3GHz+ if they manage to smooth out the voltage/power curve somehow:


oc-cyberpunk.png


It scales quite well, once you put 450-500W into it :D

- NV was willing to dump hundreds of extra watts into the 4090 for 5-10% more performance. If only AMD had the cojones to do the same (and just throw a "quiet" mode on the card for everyone else).

This whole launch would have had a completely different tenor.

Love the passion in here too. Nothing gets a bunch of PC geeks worked up quite like a new GPU launch.
 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136

So, you are saying this result with the 4090 being 5% faster is accurate:

1670950900862.png

But this result with a minor swing in favor of the 7900XTX under a multiplayer test run shows clear bias/ineptitude? You do realize these relative results are not far from each other at all and the relative difference is well within what you would expect when testing different play modes or even just different scenes, right?

rx-7900-xtx-review-_-techspot-png.72760


You also realize that HWUB's average at 4K only had the 7900XTX with a 3.7% lead over the 4080 whereas computerbase, which you said is the best out there, had a 0.3% difference in favor of the 7900XTX. All this consternation when their results are virtually the same in the end.
 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136
By inference. That if the XTX did so well in FC6 with RT on, equaling the RTX 4090, then it must be a great RT performer.

I think you completely missed the point of my post. I was not inferring any such thing. Maybe I needed to add the traditional /s at the end of my post, but I thought I was being obvious with it.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Reread my post, I never claimed the graph I showed was with RT enabled.

You implied though, when you said:

Even when RT is turned on, the reference 7900XTX virtually ties the 4090.

Either way though, the outcome isn't much different to be honest. Other reviewers showed close performance between the 7900XTX and the RTX 4090 in Far Cry 6 both with and without RT enabled.

As I said before, the RT workload doesn't appear to be very high due to the hybridized reflections and the RT shadows have lots of limitations. Dunia engine has also historically scaled poorly with multithreaded CPUs, so that probably has some impact as well.

I can almost guarantee that the RTX 4090 isn't even running at full bore at 4K maxed settings with RT enabled.

 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136
You implied though, when you said:

No, there was no such implication. The graph I showed was clearly for rasterization as the 7900xtx was leading in the graph just as I said it did. I just added further commentary that when RT was enabled, the 7900xtx still was in a virtual tie with the 4090, which is also true.


Either way though, the outcome isn't much different to be honest. Other reviewers showed close performance between the 7900XTX and the RTX 4090 in Far Cry 6 both with and without RT enabled.

As I said before, the RT workload doesn't appear to be very high due to the hybridized reflections and the RT shadows have lots of limitations. Dunia engine has also historically scaled poorly with multithreaded CPUs, so that probably has some impact as well.

I can almost guarantee that the RTX 4090 isn't even running at full bore at 4K maxed settings with RT enabled.


You're still completely missing the point of the post and actually reinforcing it at the same time.
 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136
This guy meant that the 7900XTX was faster than the RTX 4090, not that they were comparable. You have to look at the context of the conversation.

The graph that @amenx posted originally from HWUB had the 7900XTX 28% faster than the RTX 4090, which is a big gap.

And I think it's likely accurate, because AMD's DX12 driver is more efficient than Nvidia's under CPU limited circumstances. When things are GPU limited though, it's the opposite.

No one has argued the 7900xtx is faster than the 4090. No one. There are some corner cases where it might be, but I don't think anyone has argued it is faster outside of a couple of those corner cases.

@amenx posted the 4K results. @biostud posted the 1440p results. He wasn't claiming the 7900xtx was faster because of this one result; he was literally asking why it was beating the 4090 in this corner case, which is not only at 1440p but was also tested in a multiplayer scenario, which increases the CPU load/dependence. So if someone is looking to game at all competitively in MW2 multiplayer, this is a great test case to see. If it's not relevant, ignore it and move on. There are plenty of corner cases on both sides that are not relevant to me at all, and I just plainly ignore them. At most I'll say I don't think it's relevant for x, y, and z, and that's it. If others find it useful, then great. No need to go further than that.
 

Hitman928

Diamond Member
Apr 15, 2012
5,324
8,015
136
Even if it is more CPU dependent in MP, wouldn't you just expect the cards to tie in performance, not see the "lesser" card severely beating the much faster card?

AMD's drivers have less CPU overhead than Nvidia's under modern APIs. If you look at the chart, the Nvidia GPUs have trouble scaling at 1440p, as you would expect when more CPU-bottlenecked, whereas the Radeon cards scale roughly as you would expect relative to their GPU power.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
... no.

Vapor chambers work based off of capillary action, not gravity:

FYI, the Nvidia cards also use vapor chambers, along with everyone else.

You're just wrong.

Vapor chambers are very much affected by orientation.

The GN video with Nvidia's engineer on the 4090 talked about how orientation impacts their performance. Heat pipes are not affected by it, but vapor chambers certainly are.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,827
7,191
136
Oh yeah the "AMD needs to be competitive and cheap so I can afford an NV card" meme is pretty deeply entrenched at this point.

AMD essentially has an impossible task, which is to consistently beat, not just match but beat, NV across the board in everything for, say, three generations or more, while also somehow costing less and making a profit.

All things being equal, people will still buy the NV card thanks to brand recognition alone, so merely matching NV isn't enough.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
People tend to get hung up on temps too much. The chips are rated for 95C before they will start to do any sort of throttling. If the chips are designed for that temp, running them there won't hurt them.

However, better cooling will allow them to boost to higher clocks, so with a better cooler you can also get better performance. This is why undervolting is the new overclocking: the automatic boost logic for both AMD and NV means lowering the voltage will lower temps, which in turn allows higher boost clocks.
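The undervolting logic above can be illustrated with a toy dynamic-power model, P ≈ k·V²·f. The constant k and the voltage/clock figures below are made-up illustrative values, not measurements from any real card; the point is only that lowering voltage at a fixed power budget leaves headroom for a higher clock:

```python
# Toy dynamic-power model: power scales with V^2 * f.
# k, voltages, and clocks are invented illustrative numbers.
def power(v, f_ghz, k=80.0):
    return k * v**2 * f_ghz  # watts

budget = power(1.05, 2.5)                   # "stock": 1.05 V at 2.5 GHz
f_undervolted = budget / (80.0 * 0.95**2)   # clock allowed at 0.95 V, same power

print(f"{budget:.1f} W power budget")       # 220.5 W power budget
print(f"{f_undervolted:.2f} GHz at 0.95 V") # 3.05 GHz at 0.95 V
```

In this toy model, dropping from 1.05 V to 0.95 V buys about a 22% higher clock at the same power, which is the mechanism behind "undervolting is the new overclocking". Real boost behavior is far more complicated, but the V² term is why small voltage drops matter so much.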

The problem comes from throttling as the room heats up, or in some cases as soon as you put a load on the card. I have seen a 7900XTX hit 110°C when horizontal but only 86°C when vertical; it's a random issue, and it affects different cards differently. Der8auer only tests cards vertically mounted outside of a case in his video, so he could be missing that data point, or he may have no issue no matter the orientation. High temps bother me, so just the risk of it is enough to put me off the MBA cards.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
What's the deal with the high-temps drama now on several YouTube channels?
One guy posts a response from AMD where they say 100°C is normal.
100°C may be considered "normal", but it shouldn't be. Also, can AMD please kick their board partners in the arse and get them to improve their quality control? I have two Red Dragon 6800XT cards, and they both had the same problem: the GPU not contacting the cooler properly. It was an easy enough fix; I removed the spring-limited screws and used regular screws.

Prior to the fix they would go past 115°C and shut down. The GPU is supposed to throttle. It doesn't. These are the kinds of things that drive people to Nvidia.
 