Question: From a Ryzen 5800X3D to a Ryzen 7800X3D, an Equivalent Pentium (P5/P5-MMX) Generation Upgrade?

Jul 27, 2020
I did not get that. The quote I remember best was something like "if you need the CPU power, it makes a big difference; otherwise it's still a little faster."
If someone already has a 5800X3D, there isn't much benefit to upgrading to the 7800X3D. They will see the biggest gains only if they limit themselves to 1080p (or have a monitor that tops out there). Better to stay with the 5800X3D and maybe wait for the 8800X3D. The cost of the upgrade doesn't justify the meager gains.
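A rough way to see why the gains concentrate at 1080p: delivered frame rate is capped by whichever of the CPU or GPU is slower, and at higher resolutions the GPU cap dominates. A minimal sketch of that reasoning, using made-up frame-rate caps (none of these numbers come from any review):

```python
# Back-of-envelope model: delivered FPS is capped by the slower of
# the CPU and the GPU. All numbers here are hypothetical illustrations.
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

gpu_caps = {"1080p": 240.0, "1440p": 160.0, "4K": 90.0}  # GPU-limited FPS
cpu_caps = {"5800X3D": 180.0, "7800X3D": 250.0}          # CPU-limited FPS

for res, gpu in gpu_caps.items():
    old = delivered_fps(cpu_caps["5800X3D"], gpu)
    new = delivered_fps(cpu_caps["7800X3D"], gpu)
    print(f"{res}: {old:.0f} -> {new:.0f} FPS ({(new - old) / old:+.0%})")

# 1080p: 180 -> 240 FPS (+33%)  <- CPU-bound, the upgrade shows up
# 1440p: 160 -> 160 FPS (+0%)   <- GPU-bound, the upgrade is invisible
# 4K:     90 ->  90 FPS (+0%)
```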
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
If someone already has a 5800X3D, there isn't much benefit to upgrading to the 7800X3D. They will see the biggest gains only if they limit themselves to 1080p (or have a monitor that tops out there). Better to stay with the 5800X3D and maybe wait for the 8800X3D. The cost of the upgrade doesn't justify the meager gains.
You missed the part where, in certain games, the 5800X3D is much weaker. If someone plays those, they get MAJOR gains. Do I have to go back, reread, and give you examples?

One example: Hogwarts Legacy at 1440p is 50% faster on the 7800X3D.

Edit: at 1080p, 13 games are 20% or more faster on the 7800X3D. That's a lot of games that are substantially faster.
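Worth keeping in mind that "X% faster" here is in frame-rate terms; the corresponding frame-time saving is smaller. A quick conversion, with hypothetical frame rates (the actual review figures aren't quoted in this thread):

```python
# "50% faster" in FPS terms cuts frame time by only ~33%.
# Hypothetical numbers, purely to show the conversion.
base_fps = 80.0
speedup = 1.50                 # 50% faster on the 7800X3D
new_fps = base_fps * speedup   # 120 FPS

frame_time_old = 1000.0 / base_fps   # 12.5 ms per frame
frame_time_new = 1000.0 / new_fps    # ~8.3 ms per frame
saving = 1.0 - frame_time_new / frame_time_old
print(f"{frame_time_old:.1f} ms -> {frame_time_new:.1f} ms "
      f"({saving:.0%} lower frame time)")
```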
 
Jul 27, 2020
One example: Hogwarts Legacy at 1440p is 50% faster on the 7800X3D.

Edit: at 1080p, 13 games are 20% or more faster on the 7800X3D. That's a lot of games that are substantially faster.
Maybe for those games, if someone is married to them. Still, from a cost perspective, current 5800X3D owners have nothing to worry about; their CPU will serve them well for a couple more years or even longer.
 

MadRat

Lifer
Oct 14, 1999
Funny story about the great Pentium recall. Intel bragged that the Pentium doubled memory bandwidth, and everyone knew video editing was bandwidth-limited. I was going from an Intel 486 DX2/66 to a P60 for video editing, and performance actually got worse. So much for Intel doubling memory bandwidth with the Pentium; in practice it sucked in our application. Video performance on a 60 MHz front-side bus (60fsb) was slower on the shiny new Pentium technology. The render times, which at times exceeded 12 hours, were going up. Little did I know that with PCI cards, CPU memory bandwidth was mostly inconsequential compared to PCI speed.

But that was not the real issue. The Pentium had this annoying habit of generating random artifacts after running video filters, something that never happened on the 486 it replaced. Those random artifacts on the Pentium were simply unforgivable. I ran the same data on a backup 486 and got zero artifacts. Ended up trading it back to the guy for an Intel 486 DX2/100 on a 50 MHz fsb (50fsb). It turned out the PCI bus was running much faster on the DX2/100 than behind the P60's twice-as-wide memory bus. Nowhere close to a 50% boost going from the DX2/66 to the DX2/100, but shaving an hour or two off big passes was actually pretty helpful. Then word came out about the FPU bug and it all made sense.
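The "nowhere close to 50%" observation lines up with the renders being mostly I/O-bound rather than CPU-bound. A quick sanity check, assuming (hypothetically) the render scaled perfectly with CPU clock:

```python
# If rendering scaled perfectly with CPU clock, a DX2/66 -> DX2/100
# upgrade (100/66, about 1.5x) would cut a 12-hour pass to ~8 hours.
hours_on_dx2_66 = 12.0          # "at times exceeded 12 hours"
clock_ratio = 100.0 / 66.0      # ~1.52x

best_case = hours_on_dx2_66 / clock_ratio   # ~7.9 h if purely CPU-bound
print(f"best case: {best_case:.1f} h, "
      f"saving {hours_on_dx2_66 - best_case:.1f} h")

# Observed saving was only "an hour or two", i.e. the job was dominated
# by something other than CPU clock -- consistent with a PCI bottleneck.
```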

It wasn't all sunshine on the DX2/100, either. Over the next year I went through four mainboards, because 50fsb ran too hot even with his suggestion of a large heatsink on the northbridge memory controller. The 50 MHz bus was just too much for Intel's northbridge manufacturing process. The 50 MHz PCI was our biggest worry, but our video editing card had no issues. PCI on the P60, however, was running at 30 MHz. Intel soon moved from 60fsb to 66fsb on the Pentium, and that basically became the standard for a long time; PCI was standardized at 33 MHz in the same period. We would run odd overclock steppings, depending on whether the Pentium northbridges stayed stable, just to eke out a higher PCI speed.
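The PCI clocks in the story fall out of the bus divider: PCI ran at a fixed fraction of the front-side bus. A small sketch using divider settings typical of boards from that era (the /2 and synchronous /1 settings are my assumption; the post itself only gives the resulting 30, 33, and 50 MHz figures):

```python
# PCI clock = FSB clock / divider.  Typical early-90s settings:
#   P60:         60 MHz FSB / 2 -> 30 MHz PCI (as in the post)
#   P66+:        66 MHz FSB / 2 -> 33 MHz PCI (the eventual standard)
#   486 @ 50fsb: 50 MHz bus / 1 -> 50 MHz PCI (synchronous, out of spec)
def pci_clock(fsb_mhz: float, divider: float) -> float:
    return fsb_mhz / divider

for name, fsb, div in [("P60", 60.0, 2.0),
                       ("P66", 66.0, 2.0),
                       ("486 DX2/100", 50.0, 1.0)]:
    print(f"{name}: {fsb:.0f} MHz FSB / {div:g} "
          f"= {pci_clock(fsb, div):.0f} MHz PCI")
```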

Overclocking was really a necessity because Intel was sandbagging performance. Thankfully AMD and the others pushed back enough to force Intel to be honest. Otherwise we might all still be on Socket 3 today.
 