Review: AMD RX 5600 XT Review Thread


Hitman928

Diamond Member
Apr 15, 2012
5,285
7,918
136
Print Media

https://translate.google.com/translate?sl=auto&tl=en&u=https://www.computerbase.de/2020-01/radeon-rx-5600-xt-test/ [translated]


Video Reviews



From early reviews it seems like the 5600 XT and the Nvidia 2060 are basically tied in both performance and power consumption, although GN's 12V rail power numbers and AnandTech's system power numbers don't agree; not sure why that is.
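One possible explanation (my speculation, with made-up numbers): GN measures only the GPU's 12V rails, while AnandTech measures total system draw at the wall, so differences in CPU load and PSU efficiency get folded into the latter. A toy illustration:

```python
# Toy wall-power model (illustrative numbers only; not from either review).
def wall_power(gpu_w, cpu_w, rest_w, psu_efficiency):
    # Everything behind the PSU gets divided by its efficiency at the wall.
    return (gpu_w + cpu_w + rest_w) / psu_efficiency

# Two GPUs with identical 12V rail draw can still differ at the wall
# if one drives the CPU harder:
print(wall_power(gpu_w=160, cpu_w=70, rest_w=40, psu_efficiency=0.9))  # ~300 W
print(wall_power(gpu_w=160, cpu_w=90, rest_w=40, psu_efficiency=0.9))  # ~322 W
```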
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
In the video, they are very careful to say that this is an ongoing mystery, that it's not helpful to gaming, and that it's not known how widespread this particular new TU104 die is vs. the previous TU106 2060 die.
But his mistake is making the recommendations (around 14:00); who would you blame afterwards if it goes wrong?
- EVGA
- Nvidia
- Yourself
- GN (or Steve)

That said, agreed. I think the 2060 and 5600 are bogus deals.
If only it were limited to those two...
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,641
136
But your point was that in Dishonored 1 the RX 5700 has the same performance as the RX 580, and that the card even underclocks to 'keep up' with the game's low demands.

Which is also why RX 580 = RX 5700 = HD 7970 = GTX 680 = .... all have the same performance, because of the frame cap.

Why use hardware when with software+hardware you can do it 3 times faster than the most expensive GPU, the Nvidia RTX 2080 Ti, currently at $1,199.00?
If the RX 5700 was functioning properly without downclocking, it would have been pegged at 130 FPS, provided that the CPU could keep up.
 

Hitman928

Diamond Member
Apr 15, 2012
5,285
7,918
136
If the RX 5700 was functioning properly without downclocking, it would have been pegged at 130 FPS, provided that the CPU could keep up.

Isn't that the point though? The 580 used as a comparison shouldn't have a problem maintaining the 130 fps the game is capped at, but it dips down to the same levels as the 5700 when both are using the Ryzen 1700. That doesn't imply a problem with the 5700, but rather a CPU bottleneck, with the 5700 just downclocking to save power because the CPU can't keep up. It's such an old game that it's hard to find proper benchmarks, but it seems to be a CPU bottleneck and not an issue with the 5700 downclocking and leaving performance on the table.
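A toy way to see that logic (my own sketch, with made-up numbers): the delivered frame rate is roughly the minimum of the engine cap, what the CPU can feed, and what the GPU can render, so two very different GPUs land on the same number as soon as the CPU becomes the limiter.

```python
# Toy model of a frame-capped, CPU-bottlenecked game (illustrative numbers only).
def delivered_fps(engine_cap, cpu_fps, gpu_fps):
    # Delivered FPS is bounded by the slowest stage in the pipeline.
    return min(engine_cap, cpu_fps, gpu_fps)

# Hypothetical throughputs with a Ryzen 1700 feeding two different GPUs:
print(delivered_fps(130, 95, 250))  # RX 5700-class GPU -> 95 (CPU-bound)
print(delivered_fps(130, 95, 160))  # RX 580-class GPU  -> 95 (same number)
```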
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
If that were the case, we would see price cuts on ALL RTX 2XXX GPUs, and not just a batch of faulty dies given to EVGA for the KO lineup, like previously.
No, you almost never see price cuts on Nvidia cards. What you do see is cards being sold at every possible performance tier, and oddly cut-down cards, for maximum profit.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,773
3,151
136
I have a 5700 and a 3600, and I own (but have not installed) Dishonored; installing now, will report back.

edit: I also have an FX-6300 with a 570, which would be an interesting comparison.

So the game holds a constant 130 fps.

Everything set to high/MLAA at 3440x1440:
GPU clock sits around 1100 MHz.
2 threads are actively used and sit at around 60% utilisation @ 4.2 GHz.
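For anyone who wants to reproduce that kind of spot check on Linux, here's a minimal sketch (my own, assuming an amdgpu card exposed as card0 and that psutil is installed); it samples the active GPU core clock from sysfs alongside per-core CPU utilisation while the game runs:

```python
# Minimal sampler: amdgpu core clock + per-core CPU load.
# Assumes Linux, the amdgpu driver, and that the GPU is card0.
import psutil

SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"

def current_sclk():
    # pp_dpm_sclk lists the DPM states; the active one is marked with '*'.
    with open(SCLK_PATH) as f:
        for line in f:
            if "*" in line:
                return line.strip()
    return "unknown"

for _ in range(10):
    # cpu_percent with an interval blocks for 1 s, so this samples at ~1 Hz.
    load = psutil.cpu_percent(interval=1.0, percpu=True)
    print(current_sclk(), ["%.0f%%" % c for c in load])
```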
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,556
136
No, you almost never see price cuts on Nvidia cards. What you do see is cards being sold at every possible performance tier, and oddly cut-down cards, for maximum profit.
It's the first time in years that we have seen Nvidia sell a bigger GPU in a lower SKU.

When was the last time? A previous KO-style model from EVGA?

Neither Pascal nor Maxwell SKUs ever offered larger GPUs in lower SKUs than the "standard" segmentation.

But what we have always seen are price cuts that were not forced by competition from AMD. So no, it appears that Nvidia is not selling off unused Turing stock in order to release a next-gen gaming architecture this year.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
It's the first time in years that we have seen Nvidia sell a bigger GPU in a lower SKU.

When was the last time? A previous KO-style model from EVGA?

Neither Pascal nor Maxwell SKUs ever offered larger GPUs in lower SKUs than the "standard" segmentation.

But what we have always seen are price cuts that were not forced by competition from AMD. So no, it appears that Nvidia is not selling off unused Turing stock in order to release a next-gen gaming architecture this year.

GP104-150-KA-A1 appeared in some 1060s alongside the normal GP106, and GM204 appeared in a 960 OEM.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
But what we have always seen are price cuts that were not forced by competition from AMD.
Actually, they were.
AMD forced them into doing the Super lineup, and with it the release of the RTX 2070 Super using the TU104-410-A1 (545 mm²) that was used in the RTX 2080, instead of the TU106-400-A1 (445 mm²) used in the RTX 2070.

With all that, yields took a hit and they needed to reuse the parts somehow.
We see the same with Intel: many CPUs with cut-down graphics, or huge dies with many disabled cores.

At least Nvidia doesn't charge a premium for defective parts like Intel does:
The Intel Core i9-9990XE Review: All 14 Cores at 5.0 GHz
 

Triloby

Senior member
Mar 18, 2016
587
275
136
Sounds like the RTG division made an even bigger mess of the 5600 XT's launch than most people previously thought.

Why did AMD even choose to launch the 5600 XT around the same time as the Lunar New Year? AMD's last-minute spec changes for the 5600 XT, plus the time factories take off for the Lunar New Year, have created more problems for AIBs.

 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Sounds like the RTG division made an even bigger mess of the 5600 XT's launch than most people previously thought.

Why did AMD even choose to launch the 5600 XT around the same time as the Lunar New Year? AMD's last-minute spec changes for the 5600 XT, plus the time factories take off for the Lunar New Year, have created more problems for AIBs.

What this means is that if I were to get a 5600 XT, I'd only buy one that came out of the factory at the new specs, one I didn't need to flash myself. I don't want a card where, every time I see a bug, I'm not sure whether it's because I flashed it and it's not completely stable, or it's just a driver bug.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
Not to derail, but if they could figure out CrossFire or SLI, companies could focus on perf/watt and low-power cards. I'd rather purchase two 5500s or two 1060s vs. a 2080 Ti.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Not to derail, but if they could figure out CrossFire or SLI, companies could focus on perf/watt and low-power cards. I'd rather purchase two 5500s or two 1060s vs. a 2080 Ti.

CF and SLI are dead in their current forms. Very few people even use them anymore because they don't work well, for either company. AMD jumped ahead when they moved away from the bridge, so bandwidth was no longer an issue, which got rid of the stuttering, but it seems like it has just fallen out of favor. For SLI, the only driving factor might be better RTX performance, but RTX and SLI don't work together. And even SLI 2080 Tis only show around a 24% performance boost over a single 2080 Ti.

DX12 multi-GPU was supposed to solve the CF/SLI issues, but hardly any devs actually coded for it.
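As a back-of-the-envelope check on that 24% figure (my own arithmetic, not from any review): if two identical GPUs deliver only a 1.24x speedup, the second card is contributing about a quarter of its potential.

```python
# Rough multi-GPU scaling arithmetic (illustrative, using the ~24% figure above).
num_gpus = 2
speedup = 1.24                       # SLI 2080 Tis vs. a single 2080 Ti
ideal = float(num_gpus)              # perfect scaling would be 2.0x
efficiency = (speedup - 1) / (ideal - 1)
print(f"Second GPU contributes {efficiency:.0%} of a full extra card")  # 24%
```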
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
Also, did he just say it's a 5700 under the hood? Why is there outrage over the memory OC?
CF and SLI are dead in their current forms. Very few people even use them anymore because they don't work well, for either company. AMD jumped ahead when they moved away from the bridge, so bandwidth was no longer an issue, which got rid of the stuttering, but it seems like it has just fallen out of favor. For SLI, the only driving factor might be better RTX performance, but RTX and SLI don't work together. And even SLI 2080 Tis only show around a 24% performance boost over a single 2080 Ti.

DX12 multi-GPU was supposed to solve the CF/SLI issues, but hardly any devs actually coded for it.
Yeah, I agree and understand. It's just that this type of crap wouldn't happen if they could figure out a simplified manner of implementation. I'm tired of having a furnace of a PC once I get to the high end. The video stated the 5600 XT is a cut-down version of the 5700, so why wouldn't the memory specs be in line with the 5700's? Isn't this more a case of a board partner going the cheap route to begin with?
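For reference, the memory-spec gap being argued about comes down to simple arithmetic (using the specs as I understand them: a 192-bit bus at 12 vs. 14 Gbps GDDR6 on the 5600 XT, against the 5700's 256-bit bus at 14 Gbps):

```python
# GDDR6 bandwidth: bus width (bits) / 8 * data rate (Gbps) -> GB/s
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(192, 12))  # 5600 XT at the original 12 Gbps spec: 288 GB/s
print(bandwidth_gbs(192, 14))  # 5600 XT after the 14 Gbps BIOS update: 336 GB/s
print(bandwidth_gbs(256, 14))  # RX 5700: 448 GB/s
```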
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Sounds like the RTG division made an even bigger mess of the 5600 XT's launch than most people previously thought.

Why did AMD even choose to launch the 5600 XT around the same time as the Lunar New Year? AMD's last-minute spec changes for the 5600 XT, plus the time factories take off for the Lunar New Year, have created more problems for AIBs.


Ugh. That's a great deep dive by GN; it really brings home the practical issues this launch created. AMD was the one that gave out the 12 Gbps memory spec, the clock speeds, and the pricing, and then threw a wrench into their AIBs' spokes at the last minute, putting an undue burden on groups with slim margins and limited staff (Shenzhen has seen a decline in qualified workers in recent times, as factory work isn't as desirable as it once was there, leading to shortages in many areas).

Hopefully it's a big fat lesson to all involved: if you launch a product, don't throw your partners under the bus. If you need to adjust the value proposition at the last minute, drop the price and compensate your partners. Don't change the specs after things are already ready to ship or have shipped, ESPECIALLY when you should know that not every one of these cards will live up to the newly chosen specs.

Hopefully we hear that AMD goes into overdrive helping clean this mess up for their partners.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
AMD should have known Nvidia would do something to try to spoil this launch, whether that was launching a 1660 Super Ti or a 2060 LE, or just reducing the price of the 2060. Nvidia just can't let AMD have a clear win; their ego requires them to do something. What that something would be may not have been clear, but that something would be done was guaranteed. AMD should have had a viable response ready.

Now AMD has a mess with both board partners and customers. Steve is right: they keep shooting themselves in the foot.
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,556
136
Is there any reliable source for that 14 Gbps validation? I have 2 of them but haven't flashed them yet.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
For technically inclined individuals, I recommend this for basically all GPUs anyway. As noted by GN and others over the past few years, GPU assembly quality has actually been declining, probably due to a reduction in qualified workers at these AIBs: bad thermal paste application, inadequate or mismatched thermal pads, loads of borderline QA issues, along with some stylized designs hampering cooling. It often leads to cards that 'work' to some extent, but run hotter than they need to, and occasionally so much so that it can lead to early GPU death.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Not to derail, but if they could figure out CrossFire or SLI, companies could focus on perf/watt and low-power cards. I'd rather purchase two 5500s or two 1060s vs. a 2080 Ti.

I don't think either company is really interested in selling you two low-end GPUs at smaller margins when they can sell you one expensive high-end GPU at a much higher margin.

In the past, high-end cards were more reasonably priced, but they also weren't as powerful, which in some ways necessitated these technologies. Now you can get a single powerful card that provides generally acceptable performance even at the highest resolutions and settings. For comparison, a 2060 has a similar die size to an 8800 GTX, which was a monster in terms of size and graphical prowess at the time. Even then, a lot of people I knew used two of them, because a single card could struggle with some games if you had a 1600x1200 monitor, and SLI almost always kept the minimum FPS above 60.

There will likely be an increase in RMAs for the 14 Gbps BIOS-flashed cards (since there was no memory validation before shipping).

I wonder how many cards had already been shipped off, or whether AMD knew that the physical product was going to be a bit delayed (you'd think having a Taiwanese-American CEO would mean your company understands the importance of the Chinese New Year) and that, while the changes would cause some headaches for their partners, they wouldn't cause a substantial number of recalls.