SiliconFly
Golden Member
- Mar 10, 2023
> How is 5200 MT/s on Ryzen the same as 5600 MT/s on the Intel CPUs?

That seems to imply Ryzen chips are inferior (which I refuse to believe).
> HXL posted this on X today & it's hilarious...
> A measly Intel 13th-gen Core i5 comfortably beats the top-end AMD Ryzen 9 7950X3D in Starfield benchmarks, and the really funny part is that this is an AMD-sponsored game!
> View attachment 85168

I find those results pretty weird. Why, you ask?
> I find those results pretty weird. Why, you ask?
> The 7950X and 7950X3D perform the same; the extra 64 MB of cache only compensates for a slightly lower clock speed.
> 12900K vs. 13700K: no difference in core count, 4% higher boost, faster DDR5, and more cache.
> Yet the 13700K is 26%, 52%, and 64% faster.
> Doesn't make sense.

I dunno. But an i5 beating a 7950X3D in an AMD-sponsored title is still hilarious!
> I find those results pretty weird. Why, you ask? […]

Per their chart, the 12900K is running DDR5-4400 and the 13700K is running DDR5-5600.
> I dunno. But an i5 beating a 7950X3D in an AMD-sponsored title is still hilarious!

That's for GPUs, not CPUs, as far as I know. BTW, this didn't always mean it performed best on AMD.
> Per their chart, the 12900K is running DDR5-4400 and the 13700K is running DDR5-5600.

That's a 27% difference in transfer rate, ignoring the worse timings. Two of the charts show a much bigger difference.
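For scale, the 27% figure above is just the raw transfer-rate ratio. A quick sketch of the theoretical peak bandwidth at each speed in the chart, assuming a standard 64-bit (8-byte) DDR5 DIMM interface and ignoring timings and latency entirely:

```python
def peak_bw_gbs(mt_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for one DIMM:
    transfers/second times bytes moved per transfer."""
    return mt_s * 1e6 * bus_bytes / 1e9

# The three DDR5 speeds mentioned in the thread.
for speed in (4400, 5200, 5600):
    print(f"DDR5-{speed}: {peak_bw_gbs(speed):.1f} GB/s per DIMM")

ratio = 5600 / 4400
print(f"5600 vs 4400 transfer-rate ratio: {ratio:.2f} (~{(ratio - 1) * 100:.0f}% higher)")
```

Real-world gains are smaller than the raw ratio, since timings, latency, and how memory-bound the workload is all matter.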
> That's for GPUs, not CPUs, as far as I know. BTW, this didn't always mean it performed best on AMD.

Not really. It's a CPU-bound game; the CPU matters a lot. And the benchmarks were run with the same graphics card, so that makes things pretty simple.
> Not really. It's a CPU-bound game; the CPU matters a lot. And the benchmarks were run with the same graphics card, so that makes things pretty simple.
> It's just that the biggest game of the year, and an AMD-sponsored one at that, performs better with Intel CPUs than AMD CPUs.
> This is a big blow to AMD. They shouldn't have sponsored it.

It might be fairer if they at least used the same memory speed. What these platforms launched with and what they're now capable of are far different; DDR5-6000 is generally accepted as a well-used speed on both. Also, I would love to see more than one website do this testing, and ideally one I've heard of.
> […] It's just that the biggest game of the year, and an AMD-sponsored one at that, performs better with Intel CPUs than AMD CPUs. This is a big blow to AMD. They shouldn't have sponsored it.

It was tested on an Nvidia GPU, not an AMD one, so the "AMD sponsored" comments are pretty irrelevant.
> […] It's just that the biggest game of the year, and an AMD-sponsored one at that, performs better with Intel CPUs than AMD CPUs.

As others have said, sponsorship pertains more to GPUs than CPUs, especially because it's much harder to intentionally optimize for one vendor's CPU over the other's when it's all x86 with very similar underlying microarchitectures at the end of the day. It's not like the days of old, when Intel had a substantial IPC advantage, especially in floating point, over AMD's "Faildozer". Modern high-performance x86 CPUs all have SMT, micro-op caches, out-of-order execution, etc. The biggest differences between Intel and AMD in gaming performance come down to single-threaded performance and memory latency.
> What I want to know is why Raptor performs so much better even against Alder Lake.

Additional cache, and the Raptors are using faster RAM.
> Wait for the TPU review before drawing conclusions.

TPU only tests different GPUs, not different CPUs.
Arrow Lake is 20A for the CPU only; Ericsson is the lead customer for the 18A process.
> It might be more fair if at least they used the same speed memory. […]

Ditto. I did some reading on Twitter, and it seems this game is very sensitive to memory speed. Who knew, said the tap-dancing frog. I'm getting tired of these sites running JEDEC speeds when no one in their right mind runs slow-as-molasses JEDEC speeds.
> It might be more fair if at least they used the same speed memory. […]

Then how on earth can a slower, old and outdated 12th-gen 12900K with slower DDR5-4400 RAM comfortably match, and almost outclass, the almighty Ryzen 9 7950X3D with faster DDR5-5200?
> This is too much fun!

Only if you ignore the fact that not ALL games benefit from increased cache. AMD can't dictate to its partners how to write x86 code so that their game engines prefer AMD CPUs over the competition.
> Only if you ignore the fact that not ALL games benefit from increased cache. […]

And Starfield is AMD sponsored!
> And Starfield is AMD sponsored!

Game sponsorships have always been about the GPU, not the CPU, going back decades. This is also Bethesda, who don't know their rear end from their brain. I don't believe they've ever had a good release.
They did manage to get FSR prioritized over DLSS. But sadly failed with CPUs!
5200 to 5600 is a small difference; maybe you can add 2-3% and that's it.
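The arithmetic behind that estimate, as a rough sketch: the raw transfer-rate gain from 5200 to 5600 is under 8%, and game FPS scales well below linearly with memory clock. The ~30% scaling factor below is an illustrative assumption, not a measured number.

```python
# DDR5-5200 vs DDR5-5600: raw transfer-rate gain.
freq_gain = 5600 / 5200 - 1
print(f"Transfer-rate gain: {freq_gain:.1%}")  # 7.7%

# Hypothetical: if FPS picks up only ~30% of the memory-clock gain
# (an assumed scaling factor for illustration), the delta is small.
scaling = 0.3
print(f"Ballpark FPS gain: {freq_gain * scaling:.1%}")  # 2.3%
```

That lands right in the 2-3% range the post suggests; the true scaling depends on how memory-bound the game is.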
> Game sponsorships have always been about the GPU, not the CPU, going back decades. […] I don't believe they've ever had a good release.

Well, word on the street is that Starfield is one of their best releases yet.
> Well, word on the street is that Starfield is one of their best releases yet.

Great to hear you work street corners. It explains a lot.
> Everyone talking memory speeds, look at the 7700X, 5.5 GHz?

Labeling error. The results are what matter, and they're so far-fetched that even Intel zealots on other boards are crying foul at the stupidity of this site. The same bench has results showing a Zen+ chip destroying a 9900K, which isn't and never will be the case. Zen really suffers when it isn't fed the fastest RAM it can handle, which in this case is 6000 MT/s. For Intel there isn't a huge noticeable gain until you get well past 7000 MT/s. Either way, pick a speed both processor lineups can handle and be done with it.