oh no... 2 more weeks of this nonsense...

I'd say the real nonsense will start when the reviews drop.
Agree! Can't wait for AnandTech's reviews. I don't think Pat (the CEO) had anything to do with the development of Alder Lake, right? If so, how soon might we see a release that does reflect his work?
Him being CEO now, we will never see a design where he had hands-on engineering influence. His work is something else: getting the people, ideas, time, and money for innovation together, and maybe making some key decisions based on sophisticated and/or management-targeted analysis.
HXL on Twitter: "12900K DDR4 3600 Gear 1 + RX 6600 (Win11) vs 5950X DDR4 3600 1:1 + RX 6600 (Win10) https://t.co/QmAutqoHwG" / Twitter
Gaming benchmarks using DDR4. About 2% faster than the 5950X here in this test, whatever it is.
Needs a more powerful GPU.
It shows the framerates the CPU would be able to handle in this test, the way SotR's benchmark does. Still a similar story: in the "simulation" part (first row) the 5950X wins; in the rendering part (second row) the 12900K wins.
I told you why: because it isn't being binned to cherry-pick the best chips to make DIMMs capable of minimal latency.
I'm not sure what process DRAM makers are using these days, but they might be using a newer process for DDR5 chips that isn't as mature as the process being used for most DDR4 chips.
Perhaps the internal ECC calculations add a bit of latency, so maybe DDR5 never gets quite as fast as DDR4, even when it is being binned within an inch of its life on a process as mature as DDR4's. If so, that would be maybe 5-10% at most, not 40%.
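To put the latency comparison in concrete numbers: first-word (CAS) latency in nanoseconds follows directly from the transfer rate and the CL timing. A quick sketch below; the DDR4-3866 CL14 kit is the one quoted later in this thread, while the DDR5-4800 CL40 figures are an assumed early-JEDEC-spec kit added purely for illustration, not one anyone here tested:

```python
def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    """First-word latency in ns: CL clock cycles at half the transfer rate.

    transfer_rate_mts: data rate in MT/s (e.g. 3866 for DDR4-3866).
    cl: CAS latency in clock cycles.
    """
    clock_mhz = transfer_rate_mts / 2  # DDR transfers twice per clock
    return cl / clock_mhz * 1000       # cycles / (cycles per us) -> ns

# DDR4-3866 CL14 (the 12900K test kit mentioned in this thread):
print(round(cas_latency_ns(3866, 14), 2))  # 7.24 ns
# DDR4-3200 CL14:
print(round(cas_latency_ns(3200, 14), 2))  # 8.75 ns
# DDR5-4800 CL40 (assumed kit, for illustration only):
print(round(cas_latency_ns(4800, 40), 2))  # 16.67 ns
```

This is why early DDR5 can look so much worse in absolute latency despite the higher bandwidth: the extra clock cycles of CL outpace the faster clock.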
ADL-S must be a nightmare for reviewers with all the possibilities: Windows 10, Windows 11, DDR4, DDR5, big cores, small cores. I guess most will go with DDR5 and Windows 11. Hopefully some will do a DDR4 versus DDR5 comparison.

I think it will actually be an opportunity for the really good reviewers. The great CPU reviewers, like Ian, will be able to break things down and organize content so that we have a clear understanding of the benefits and drawbacks of each option you mentioned.
The enterprise world has always been first to move on new memory and storage standards: first to a new DDRx standard, first to larger-capacity hard drives, adopting SSDs years before they appeared in the consumer world, etc.

I'm well aware of all that. The areas where gamers are early adopters are rather, e.g., graphics cards (not of the crypto and AI variants), displays (especially of the high-refresh-rate variant), input devices (high-polling-rate mice), gamer-branded stuff (but why?), etc.
Samsung Starts Mass Production of Most Advanced 14nm EUV DDR5 DRAM - Samsung US Newsroom
14nm EUV! Didn't look at others, just happened to remember seeing this.
Furthermore, we learn that the system was equipped with Zadak DDR4 memory (DDR4-3866 C14-14-14-34 2T, Gear 1), so it was a DDR4-compatible board.
The creator revealed the Core i9-12900K's scores in UL's 3DMark Time Spy and Fire Strike benchmarks (the CPU tests, to be specific):
What OS are you running?

My maxed 5950X:
Time Spy CPU Score: 19062
Time Spy Extreme CPU Score: 11851
Fire Strike CPU Score: 44674
Fire Strike Extreme CPU Score: 44267
Your RAM is faster, Time Spy is RAM-sensitive, and 4950 MHz on a 5950X is a super-high OC for this CPU; this needs to be said.
If you're looking at the reported frequency, then no, it is not. The entire 3DMark suite notes the highest clock recorded at any point during the test, for both the GPU and the CPU, and not under any real load either. E.g. mobile Turing Max-Q GPUs often report 1800 MHz despite actually clocking at around 1500 MHz during the test itself.
How did the article arrive at a <14000 score for the 5950X?
4950 MHz is actually quite low overall. My own 5950X is also a relatively bad clocker, unable to hold 5 GHz in a single-core workload (like CB20 1T or GB5) even after tuning PBO with a +75 MHz offset, and well:
I scored 6 081 in Port Royal
AMD Ryzen 9 5950X, AMD Radeon RX 6700 XT x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)
5 GHz reported.
Most of my tests were done while I was testing GPU settings, so I'll need to look for one where I actually ran the CPU test. My own RAM is far worse anyway, only DDR4-3200 CL14, so not really comparable to the DDR4-3866 CL14 used on the 12900K.
Turning off SMT in Time Spy gives you a 2-3k higher CPU score when running a 5950X.
Gotcha, thanks. In other words, we still just have to wait. Unless you're Joe Rambo and have already decided what the strengths and weaknesses of the new CPU are.
(All the other 3DMark tests don't punish you for having 32 threads, so SMT can stay enabled when running those.)