***OFFICIAL*** Ryzen 5000 / Zen 3 Launch Thread REVIEWS BEGIN PAGE 39


Graphenewhen

Junior Member
Oct 13, 2020
15
15
41
HOW?!
AMD wasn't lying at all when they made their claims about power reduction on RDNA2, even with a much bigger and improved GPU?
What sorcery is this? Seriously, what did they discover to make such a big improvement on the same node?

Isn't it safe to assume that there's always room for improvement? I imagine it hasn't been worth it previously due to funds or another, superior node being available (back in the Bulldozer and GCN days).

After all, Bulldozer on 7nm would still be rubbish compared to Zen2.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,643
136
If they are using LinX to measure power draw, then it may give inaccurate results because LinX is not optimized for Zen and hence it is not pushing Zen CPUs hard enough.
 

HumblePie

Lifer
Oct 30, 2000
14,667
440
126
Got my order in for a 5950X. I'm replacing the 3950X I have with it and giving the 3950X to the wife; she is currently on a 1700X. The next step is the GPU upgrades. Hoping the 6900 XT has some availability, unlike the 30x0 cards. I can pass the 2080 Ti over to the wife and be set for a while on upgrades.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
If they are using LinX to measure power draw, then it may give inaccurate results because LinX is not optimized for Zen and hence it is not pushing Zen CPUs hard enough.

Yes, good old LinX 0.6.4 doesn't know what to do with any Zen CPU; it'll run slowly and generate little heat while at it. Something like Linpack Xtreme would've been better, as it has the intended effect.


That 5600X is punching so far above its weight class it's not even funny. It's also running cooler and consuming less power. All on the same node. Insane.

Zen3 Threadripper will be something else.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
AFAIK LinX uses unmodified LINPACK libraries which aren't optimized for Zen. One needs to see whether Linpack Extreme works with Zen 3.

Optimized or not, 97 GFLOPS. LinX overall is very sensitive to memory latency/bandwidth; one can gain something like 100 GFLOPS just by tuning secondary/tertiary timings.
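
For anyone wondering where GFLOPS numbers like that come from: Linpack just times the LU solve of a dense N x N system, and GFLOPS is the conventional 2/3*N^3 + 2*N^2 flop count divided by the wall time. A minimal Python sketch - the problem size and run time below are made up purely for illustration, not taken from an actual run:

Code:
# Minimal sketch: LINPACK-style GFLOPS from problem size and wall time.
# Uses the conventional 2/3*N^3 + 2*N^2 flop count for the LU solve of a dense N x N system.
def linpack_gflops(n: int, seconds: float) -> float:
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / seconds / 1e9

# Illustrative numbers only: a 40,000-equation run taking ~440 s lands near 97 GFLOPS,
# and anything that shortens the run (e.g. tighter memory timings) raises the score.
print(round(linpack_gflops(40_000, 440.0), 1))  # -> 97.0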


If they are using LinX to measure power draw, then it may give inaccurate results because LinX is not optimized for Zen and hence it is not pushing Zen CPUs hard enough.

Yeah... Completely different loads. Even comparing during CB20 runs would be better than this.
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
Isn't it safe to assume that there's always room for improvement? I imagine it hasn't been worth it previously due to funds or another, superior node being available (back in the Bulldozer and GCN days).

After all, Bulldozer on 7nm would still be rubbish compared to Zen2.

It would be one thing if they had just improved the power consumption, but they also made big changes to achieve more performance; there's more CPU there.

AMD's future is bright, these teams are really good.
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
HOW?!
AMD wasn't lying at all when they made their claims about power reduction on RDNA2, even with a much bigger and improved GPU?
What sorcery is this? Seriously, what did they discover to make such a big improvement on the same node?

7nm node improvements + CPU redesign. But it can be even better, as far as even lower power consumption in multithreaded situations goes.

In the multithreaded Cinebench R20 test, look at the monolithic desktop 8C/16T Renoir APU: it scores 4900, while the R5 5600X scores 4500. For an 8C/16T CPU this is very low power consumption - 128 W for total system power.

https://www.computerbase.de/2020-10/amd-ryzen-3-4350g-test/#diagramm-test-cinebench-r20-multi
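
Rough performance-per-watt math for the comparison above, using just the figures quoted here (that 128 W is total system power, not package power, so treat it as ballpark only):

Code:
# Back-of-the-envelope efficiency from the numbers quoted above.
renoir_cb20_mt = 4900     # Cinebench R20 multithreaded score
renoir_system_w = 128     # total system power during the run, in watts
print(f"{renoir_cb20_mt / renoir_system_w:.1f} CB R20 points per system watt")  # ~38.3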
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
Tried to order one but failed the F5 race... Seems to be another Ampere-style release (I hope not, though). In any case, I ordered a 5900X - no idea when I'll actually receive it.
 

inf64

Diamond Member
Mar 11, 2011
3,703
4,032
136
Oh my, I nailed the ST performance 100% versus what Computerbase measured :D

My prediction:
[attached image: ST performance prediction]



Computerbase results: https://www.computerbase.de/2020-11/amd-ryzen-5000-test/5/#abschnitt_singlecoreszenarien

[attached image: Computerbase single-core results]



Edit: I missed the MT by a hair, and gaming as well, within 2-3%.
 

yeshua

Member
Aug 7, 2019
166
134
86
Everything is as expected; however, I am not a fan of AMD having OC'ed their CPUs to the absolute limit this time around in order to beat Intel at 1080p by a few percent, worsening their thermals quite a lot in the process. It would be nice to see all these CPUs with TDP lowered by 5-20% - that could make them a lot more power efficient and cooler.

https://tpucdn.com/review/amd-ryzen-7-5800x/images/cpu-temperature.png (75°C under load, FFS).

Also, and I know I've repeated it a dozen times already, but I don't understand why AMD has the right (and why people somehow find a justification for it) to increase their prices so much. Intel used to release new, substantially faster CPU architectures without doing this: Sandy Bridge, Haswell and Skylake were all a lot faster than previous-generation CPUs without price hikes, and in certain cases even cost substantially less than their predecessors, e.g. the Intel Core i5-2500K was released for $216 while the Intel Core i7-920 cost $305.

People keep saying that $50 is practically nothing, but AMD has decided to start the lineup with the 5600X, which costs $300, versus the 3600, which costs $200. That's not a $50 price hike, it's a $100/50%(!) price hike. Intel would have been decimated by the internet mob if they had ever attempted to be sneaky like this. I don't give a damn about the X suffix because it doesn't change anything; it's just marketing differentiation. There's no 5600 CPU for $250.

Lastly, AMD is playing the monopoly card and it's just ugly. They force people to buy the 5900X/5950X because the 3600/3700X were the most popular models of the Ryzen 3000 series, while in this generation the 5800X is the worst (!) investment in terms of bang for the buck. Margins decide everything not only for Intel and NVIDIA; AMD has happily joined the "we'll rip you off because we are the fastest" club. I'm quite appalled by all of this.
 

Hitman928

Diamond Member
Apr 15, 2012
5,321
8,000
136
AMD included several games in their IPC test to get to the 19% increase. Not saying that's invalid (it's perfectly valid), but Computerbase measured a ~15% IPC increase in multi-threaded non-gaming loads. AMD had a pretty big increase in "gaming IPC" with this gen, which has now given AMD the lead in pretty much every category. Interesting how this CPU turned out.
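
For what it's worth, headline IPC figures like that are usually a geometric mean over a suite of per-workload uplifts, so folding in games (which gained a lot this generation) pulls the aggregate up. A quick sketch with invented numbers, just to show the effect - these are NOT real measurements:

Code:
from math import prod

def geomean(ratios):
    # Geometric mean of per-workload performance ratios (new/old at fixed clocks).
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-workload uplifts, invented purely for illustration.
non_gaming = [1.13, 1.15, 1.16, 1.14]
gaming = [1.25, 1.28, 1.22]

print(f"non-gaming only:  {geomean(non_gaming):.3f}")           # ~1.145
print(f"with games added: {geomean(non_gaming + gaming):.3f}")  # ~1.189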
 

Hitman928

Diamond Member
Apr 15, 2012
5,321
8,000
136
Everything is as expected; however, I am not a fan of AMD having OC'ed their CPUs to the absolute limit this time around in order to beat Intel at 1080p by a few percent, worsening their thermals quite a lot in the process. It would be nice to see all these CPUs with TDP lowered by 5-20% - that could make them a lot more power efficient and cooler.

https://tpucdn.com/review/amd-ryzen-7-5800x/images/cpu-temperature.png (75°C under load, FFS).

Also, and I know I've repeated it a dozen times already, but I don't understand why AMD has the right (and why people somehow find a justification for it) to increase their prices so much. Intel used to release new, substantially faster CPU architectures without doing this: Sandy Bridge, Haswell and Skylake were all a lot faster than previous-generation CPUs without price hikes, and in certain cases even cost substantially less than their predecessors, e.g. the Intel Core i5-2500K was released for $216 while the Intel Core i7-920 cost $305.

People keep saying that $50 is practically nothing, but AMD has decided to start the lineup with the 5600X, which costs $300, versus the 3600, which costs $200. That's not a $50 price hike, it's a $100/50%(!) price hike. Intel would have been decimated by the internet mob if they had ever attempted to be sneaky like this. I don't give a damn about the X suffix because it doesn't change anything; it's just marketing differentiation. There's no 5600 CPU for $250.

Lastly, AMD is playing the monopoly card and it's just ugly. They force people to buy the 5900X/5950X because the 3600/3700X were the most popular models of the Ryzen 3000 series, while in this generation the 5800X is the worst (!) investment in terms of bang for the buck. Margins decide everything not only for Intel and NVIDIA; AMD has happily joined the "we'll rip you off because we are the fastest" club. I'm quite appalled by all of this.

Are you really complaining about a 2 degree load temp increase on 1 model from 1 review site? That's, umm, a pretty absurd thing to complain about. BTW,

[attached image]

Also, every review I've seen so far has shown that there is actually overclocking headroom in the 5000 series chips, so your "overclocked to the absolute limit" claim is just false. Every review has also shown the 5000 series to be significantly more power efficient than the 3000 series, so your power efficiency comment is false as well.

As for your price complaint, they have launched 4 SKUs so far. There will be more. So maybe save your price complaints until you see the full lineup. If you don't like the $50 price increase, don't buy it. I'm pretty sure they will sell out anyway. Seems like just a bunch of faux outrage to me.
 

richierich1212

Platinum Member
Jul 5, 2002
2,741
360
126
@yeshua oh cry us a river. These are the flagship CPUs. AMD is on top now so of course they’ll charge a premium. Intel was the same for many years.

5600X does not equal the 3600. It’s more of an upgrade to the 3600XT ($250 MSRP).

Just wait for the 5600 sometime next year if you’re that price sensitive.
 

linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
People keep saying that $50 is practically nothing, but AMD has decided to start the lineup with the 5600X, which costs $300, versus the 3600, which costs $200. That's not a $50 price hike, it's a $100/50%(!) price hike. Intel would have been decimated by the internet mob if they had ever attempted to be sneaky like this. I don't give a damn about the X suffix because it doesn't change anything; it's just marketing differentiation. There's no 5600 CPU for $250.
Intel played the quad-core high-end CPU game for around a decade until Ryzen arrived. Those were the days, when 6 cores were on the HEDT platforms - and not $250 or $300 CPUs. Once Ryzen arrived, those 6700K/7700K 4C/8T i7s moved into the i3 segment within about two years.

Lastly, AMD is playing the monopoly card and it's just ugly. They force people to buy the 5900X/5950X because the 3600/3700X were the most popular models of the Ryzen 3000 series, while in this generation the 5800X is the worst (!) investment in terms of bang for the buck. Margins decide everything not only for Intel and NVIDIA; AMD has happily joined the "we'll rip you off because we are the fastest" club. I'm quite appalled by all of this.

Yes, their pricing is a bit high on this release. However, the 1800X was $500 and the 5800X is $450. The 6-cores have probably "suffered" the most: the 1600X was $250 while the 5600X is $300 and, I believe, lost the bundled cooler, so I agree that it's definitely a price increase, but they really are beating Intel on all fronts. At least you can't say that they're providing a marginal upgrade (Intel's 6th to 7th gen, for example). You can also reuse older boards, whereas Intel usually forces you to upgrade your motherboard as well.
 

kallisX

Member
Sep 29, 2016
45
39
91
Deal with the price increase. It seems most were okay when Intel charged high prices for their CPUs, but now that the performance looks to be on AMD's side and AMD raises its prices a little, people start crying foul and lose their minds? lol
 

coercitiv

Diamond Member
Jan 24, 2014
6,211
11,941
136
Neighbour's little girl just came home bruised and crying: mean red people beat her into buying an overclocked 5950X just now and took all her lunch money. Now she's begging her daddy not to take her 2c/2t i3 away. It tears your heart apart; she keeps yelling "Don't take Sandy away!".
 

therealmongo

Member
Jul 5, 2019
109
247
116
Everything is as expected; however, I am not a fan of AMD having OC'ed their CPUs to the absolute limit this time around in order to beat Intel at 1080p by a few percent, worsening their thermals quite a lot in the process. It would be nice to see all these CPUs with TDP lowered by 5-20% - that could make them a lot more power efficient and cooler.

https://tpucdn.com/review/amd-ryzen-7-5800x/images/cpu-temperature.png (75°C under load, FFS).

Also, and I know I've repeated it a dozen times already, but I don't understand why AMD has the right (and why people somehow find a justification for it) to increase their prices so much. Intel used to release new, substantially faster CPU architectures without doing this: Sandy Bridge, Haswell and Skylake were all a lot faster than previous-generation CPUs without price hikes, and in certain cases even cost substantially less than their predecessors, e.g. the Intel Core i5-2500K was released for $216 while the Intel Core i7-920 cost $305.

People keep saying that $50 is practically nothing, but AMD has decided to start the lineup with the 5600X, which costs $300, versus the 3600, which costs $200. That's not a $50 price hike, it's a $100/50%(!) price hike. Intel would have been decimated by the internet mob if they had ever attempted to be sneaky like this. I don't give a damn about the X suffix because it doesn't change anything; it's just marketing differentiation. There's no 5600 CPU for $250.

Lastly, AMD is playing the monopoly card and it's just ugly. They force people to buy the 5900X/5950X because the 3600/3700X were the most popular models of the Ryzen 3000 series, while in this generation the 5800X is the worst (!) investment in terms of bang for the buck. Margins decide everything not only for Intel and NVIDIA; AMD has happily joined the "we'll rip you off because we are the fastest" club. I'm quite appalled by all of this.
J u s t b u y a n I n t e l . . . . .