Question Raptor Lake - Official Thread

Page 176

Hulk

Diamond Member
Oct 9, 1999
4,269
2,089
136
Since we already have the first Raptor Lake leak, I'm thinking it should have its own thread.
What do we know so far?
From Anandtech's Intel Process Roadmap articles from July:

Built on Intel 7 with upgraded FinFET
10-15% improvement in performance per watt (PPW)
Last non-tiled consumer CPU as Meteor Lake will be tiled

I'm guessing this will be a minor update to ADL with just a few microarchitecture changes to the cores. The larger change will be the new process refinement allowing 8+16 at the top of the stack.

Will it work with current Z690 motherboards? If yes, then that could be a major selling point for people to move to ADL rather than wait.
 
  • Like
Reactions: vstar

Saylick

Diamond Member
Sep 10, 2012
3,217
6,585
136
In case my last post isn't clear enough, let me make it crystal. Cap is a self-professed shill. His testing is amateur hour.
Idk about him being a shill because it is hard to prove, but I do find it very funny that for a Twitter handle like his, which I presume is named after the fps software that he wrote and is trying to commercialize, he sure does spend a disproportionate amount of time making Intel look good with his cream-of-the-crop RAM timings. You'd think that he'd be making tweets that highlight the benefits of his software so that people would pay him for it, but I mostly just see him generating content that trolls and fanboys use to spam against one another. It definitely doesn't help that his benchmarking takes a page out of amateur hour, because most people who already have an agenda to push would have run with his initial, flawed results.

Edit: Okay, just checked out his website. Looks like it's free to download, but he accepts donations. Either way, the amount of involvement he has with the Twitter hardware community seems a little biased, I have to admit. If I didn't know he made software, or if you hid his Twitter handle from me, I'd assume purely based on his tweeting that he was another Intel fanboy trying to prove a point using someone else's fps benchmarking tool, not the owner and coder of that tool itself.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,664
21,172
146
@Saylick

Here is a time he went after Dr. Cutress


I agree that with a professional demeanor and proper mission statement, he could have become a valued member of the PC gaming and hardware community. Instead he chose to be an edge lord. Alienating the most respected members of that field is a terrible idea. He is going to end up catering to a smaller and smaller audience as anyone with a modicum of maturity or knowledge tunes him out.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
CapFrameX, the fps guy who looks like he's on Adderall 24/7, and Francois are all massively tedious adults with the mental maturity of a child. Francois finally got the message and trotted off into the sunset because he realized no one liked him, not even his former coworkers at Intel. There's also that one trotter with his paid blog. Four annoying saps.
 

Mopetar

Diamond Member
Jan 31, 2011
7,941
6,240
136
In case my last post isn't clear enough, let me make it crystal. Cap is a self-professed shill. His testing is amateur hour.

Cap's results are at least useful in that you know they're the absolute worst case you could expect for AMD: you know he looked for whatever combination of game, settings, and hardware setup would make Intel look as good as possible, and that if there were a worse case for AMD, he would have found it.

It also gives you a good indicator of who you can take seriously. Anyone presenting his posts without qualification or as a sole source of obvious truth can be taken about as seriously as someone who holds up MLID as a source for whatever crazy rumor they're trying to push.

A dowsing rod or healing crystal may not itself be a useful tool for its stated purpose, but it can help you spot other tools and steer clear.

Edit: Fixing atrocious grammar.
 
Last edited:

John Carmack

Member
Sep 10, 2016
155
247
116
@Saylick

Here is a time he went after Dr. Cutress


I agree that with a professional demeanor and proper mission statement, he could have become a valued member of the PC gaming and hardware community. Instead he chose to be an edge lord. Alienating the most respected members of that field is a terrible idea. He is going to end up catering to a smaller and smaller audience as anyone with a modicum of maturity or knowledge tunes him out.

I was waiting for someone to reference the early Rocket Lake review he was raging over. He might as well have called Ian Cutress fake news. The irony of this partisan hack calling for us to not rush to judgements based on "flawed" reviews while he peddles staged 540p demos. Why ask for donations when they have a position available for him at Principled Technologies?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
In case my last post isn't clear enough, let me make it crystal. Cap is a self-professed shill. His testing is amateur hour.

Whatever your problem with him is, or whatever he said in the past, is irrelevant, because in this instance he's correct. The 13900K has a significant lead over the 7950X in this game when tested in the most CPU-demanding area (Hogsmeade), as well as in other games when RT is enabled. It's practically par for the course at this stage, as this pattern has been confirmed over and over again by other reviewers.

Meanwhile, HWUB does their testing at GPU limited settings or with RT disabled, which paints a false picture of the gaming capabilities of these newer CPUs. But if it makes AMD look good, I suppose it's alright.
 
  • Like
Reactions: controlflow

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The irony of this partisan hack calling for us to not rush to judgements based on "flawed" reviews while he peddles staged 540p demos.

Anyone familiar with CPU testing knows that the resolution should be lowered in order to minimize GPU bottlenecks and maximize CPU bottlenecks. Also, this goes to show how CPU and GPU dependency can vary in games based on location. The GameGPU test was conducted in Hogwarts, which is more GPU dependent and showed less of a performance gap for CPU tests.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Okay, just checked out his website. Looks like it's free to download, but he accepts donations. Either way, the amount of involvement he has with the Twitter hardware community seems a little biased, I have to admit. If I didn't know he made software, or if you hid his Twitter handle from me, I'd assume purely based on his tweeting that he was another Intel fanboy trying to prove a point using someone else's fps benchmarking tool, not the owner and coder of that tool itself.

So not showing your favorite team (not saying you btw) winning is biased? That seems to be the prevailing sentiment in this thread when it comes to gaming performance.

Anything remotely negative towards AMD is seen as bias, regardless of how many times it's confirmed. It's ironic that HWUB was among the first reviewers to "unwittingly" demonstrate a performance discrepancy between AMD and Intel in RT workloads back when it was Alder Lake versus Zen 3, before Raptor Lake and Zen 4 launched, but now they seem to be intentionally avoiding it in their CPU reviews.
 
  • Like
Reactions: Henry swagger

Saylick

Diamond Member
Sep 10, 2012
3,217
6,585
136
So not showing your favorite team (not saying you btw) winning is biased? That seems to be the prevailing sentiment in this thread when it comes to gaming performance.

Anything remotely negative towards AMD is seen as bias, regardless of how many times it's confirmed. It's ironic that HWUB was among the first reviewers to "unwittingly" demonstrate a performance discrepancy between AMD and Intel in RT workloads back when it was Alder Lake versus Zen 3, before Raptor Lake and Zen 4 launched, but now they seem to be intentionally avoiding it in their CPU reviews.
It's one thing to be a hardware enthusiast and to post your own benchmarking results on Twitter, but it feels weird to see the benchmarking tool vendor, who I'd argue should be a neutral party, putting their fingers on the scales and choosing sides. I'd be more cool with it if he was posting all this stuff on his own personal Twitter account from the POV of a hardware enthusiast with the caption "all Tweets are my own", or something along those lines, rather than on the official CapFrameX account, especially since a lot of his content is just that: his own personal testing. His tweets come off more like, "Hey guys, look at which CPU is yet again fastest here. Intel's better at gaming. Told ya." rather than "Hey guys, look at what CapFrameX fps monitoring software can do for you. Here is a sample of it working on *insert latest game here*". You know, typical self-promotion type language. When it's more of the former, I begin to wonder if the whole point of writing the software was just to push an agenda rather than writing the software for the sake of making a useful tool that can be used by others.
 
Last edited:

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
Let's be clear here: when you focus on an outlier and make it representative of a myriad of factors, that is when you run into trouble. I watch Hardware Unboxed, and I've seen them run into outliers; more often than not, they either remove them from the final average and/or remind the viewer of such caveats in the final thoughts.

In this case they did their due diligence and came up with a viable explanation for the performance disparity.

I'd much rather have good analysis than gotcha clickbait.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's one thing to be a hardware enthusiast and to post your own benchmarking results on Twitter, but it feels weird to see the benchmarking tool vendor, who I'd argue should be a neutral party, putting their fingers on the scales and choosing sides. I'd be more cool with it if he was posting all this stuff on his own personal Twitter account from the POV of a hardware enthusiast with the caption "all Tweets are my own", or something along those lines, rather than on the official CapFrameX account, especially since a lot of his content is just that: his own personal testing. His tweets come off more like, "Hey guys, look at which CPU is yet again fastest here. Intel's better at gaming. Told ya." rather than "Hey guys, look at what CapFrameX fps monitoring software can do for you. Here is a sample of it working on *insert latest game here*". You know, typical self-promotion type language. When it's more of the former, I begin to wonder if the whole point of writing the software was just to push an agenda rather than writing the software for the sake of making a useful tool that can be used by others.

Most humans have biases and prejudices one way or another, so nothing new there. I prefer when people are open about it rather than "pretending" to be unbiased. From what I've seen of CapFrameX, he buys most of his own hardware which he conducts the tests on and he usually gives a detailed testing methodology, unlike a lot of actual reviewers.

That said, he's not a professional reviewer and doesn't try to pass himself off as one. And his software is available for free. He just likes to test hardware, and in this particular case, his test was accurate bar the motherboard issue, which was probably caused by slower memory sub timings.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Let's be clear here: when you focus on an outlier and make it representative of a myriad of factors, that is when you run into trouble. I watch Hardware Unboxed, and I've seen them run into outliers; more often than not, they either remove them from the final average and/or remind the viewer of such caveats in the final thoughts.

A lot of the time they don't remove outliers. One of the biggest outliers I've ever seen was discussed several times on these forums and on Reddit, and it garnered them much criticism. It actually caused me to unsubscribe, because while they acknowledged the discrepancy, they didn't bother to investigate it in depth at all:

[Image: BFV.png (Battlefield V benchmark chart)]


The above chart makes no sense whatsoever. This was what Steve said about it in the review:

Let's jump into what's probably the most confusing benchmark of this series, Battlefield V. Oddly with the 200 fps frame cap removed from both configurations, the 1% lows were fairly similar, but the average frame rate wasn't as the 7600X was 36% faster at 1080p and 56% faster at 1440p, and even more shocking we still saw a 50% margin at 4K. It looks as though we're testing two different games here, but after multiple re-tests we ended up with the same results. We're not sure what's holding the 13600K back here, in both instances we set the frame limit to 900 fps, and triple checked the quality settings, needless to say everything was correct. It's possible this is an E-core bug, but we'll have to spend some more time looking into it.

They never followed up on it at all. And in the exact same review, they tested Battlefield 2042, which uses the exact same fricking engine as Battlefield V, and got the opposite results. They never should have included that test at all.

[Image: BF_2042.png (Battlefield 2042 benchmark chart)]


But by far the biggest issue I have with their testing methodology is that, for some reason, they are now not turning on RT in their CPU reviews. I don't see why on Earth they would do this, because having RT on increases CPU dependency and makes CPU testing a lot easier and more viable due to the much larger workload that ensues.

The only reason for them to disable RT is that it gives Intel CPUs the upper hand and makes AMD look one or sometimes two generations behind in performance depending on the game.
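For reference, the "average fps" and "1% low" figures in charts like these are typically derived from captured per-frame times, roughly as in the sketch below. This is a generic illustration of the common approach (averaging the slowest 1% of frames); the exact method varies between tools, some of which report the 99th-percentile frametime instead.

```python
# Generic sketch: fps metrics from a list of per-frame times (ms).
# Not any particular reviewer's or tool's exact pipeline.

def fps_metrics(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s

    # "1% low": take the slowest 1% of frames and report the
    # average fps across just those frames.
    worst_first = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst_first) // 100)
    slowest = worst_first[:n]
    low_1pct_fps = len(slowest) / (sum(slowest) / 1000.0)
    return avg_fps, low_1pct_fps

# Example: mostly ~10 ms frames (100 fps) with a few 50 ms stutters.
times = [10.0] * 197 + [50.0] * 3
avg, low = fps_metrics(times)
print(round(avg, 1), round(low, 1))  # -> 94.3 20.0
```

A handful of stutters barely moves the average but craters the 1% low, which is why the two numbers in a chart can tell very different stories.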
 
  • Like
Reactions: hemedans

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
The only reason for them to disable RT is that it gives Intel CPUs the upper hand and makes AMD look one or sometimes two generations behind in performance depending on the game.

It's all about "consistency and professionalism". If you don't test RT in CPU tests, well, then you can ignore RT in GPU tests as well. It really helps to underplay the importance of RT (or DLSS) before the other vendor catches up.

The main problem I have with trusting online tests is the lack of methodology and disclosure. Games are very sensitive to memory performance, which means the following is true:

1) Motherboard and BIOS differences matter big time. It is not enough to disclose primary timings or to say that XMP was loaded for the RAM. Secondary and tertiary timings matter a lot as well. There are also factors like gear modes on Intel, or FCLK/UCLK/MCLK ratios, that can have an outsized impact. Heck, even with the exact same primary/secondary/tertiary timings, one can probably see a 3-5% difference in memory-sensitive benchmarks due to things like power-down control and RTL training.
So full disclosure of timings, in the form of a ZenTimings/ASRock Timing Configurator screenshot, is a must, so everyone can see the timings and reproduce the results.

2) Due to (1), AMD is currently very vulnerable, as their AM5 platform is not mature and motherboards have plenty of problems with memory compatibility. Because of RMA and support costs, motherboard vendors are inclined to make things like loading an EXPO profile just "work", and they do so by relaxing secondary/tertiary timings. Some even use SOC/IMC clock dividers. It will take time for the situation to sort itself out.

3) Intel is the undisputed king of gaming and desktop performance with the 13th generation. They seem to have solved the problems with their memory subsystem (by enlarging the L2 cache to rely less on the weaker L3, and by tightening the uncore to make that L3 stronger and lower overall memory latency), and now this core is really unleashed.

4) The exact percentages are of little relevance, due to differences in how you arrive at them. If we were to compare a top memory-tuned 13900K vs a 7950X, I think ~15% is the expected result. The trouble for AMD starts in the "journey" to said tuning, as the current experience is at disaster level, while Intel's is smooth sailing.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
1) Motherboard and BIOS differences matter big time. It is not enough to disclose primary timings or to say that XMP was loaded for the RAM. Secondary and tertiary timings matter a lot as well. There are also factors like gear modes on Intel, or FCLK/UCLK/MCLK ratios, that can have an outsized impact. Heck, even with the exact same primary/secondary/tertiary timings, one can probably see a 3-5% difference in memory-sensitive benchmarks due to things like power-down control and RTL training.
So full disclosure of timings, in the form of a ZenTimings/ASRock Timing Configurator screenshot, is a must, so everyone can see the timings and reproduce the results.

2) Due to (1), AMD is currently very vulnerable, as their AM5 platform is not mature and motherboards have plenty of problems with memory compatibility. Because of RMA and support costs, motherboard vendors are inclined to make things like loading an EXPO profile just "work", and they do so by relaxing secondary/tertiary timings. Some even use SOC/IMC clock dividers. It will take time for the situation to sort itself out.

Looks like numbers 1 and 2 have a lot of validity. CapFrameX mentioned BIOS issues which explained the discrepancy he got with the 7950X in Hogwarts Legacy on the MSI B650I motherboard. Of course, he was excoriated by AMD supporters, who aren't exactly understanding of the issue.

3) Intel is the undisputed king of gaming and desktop performance with the 13th generation. They seem to have solved the problems with their memory subsystem (by enlarging the L2 cache to rely less on the weaker L3, and by tightening the uncore to make that L3 stronger and lower overall memory latency), and now this core is really unleashed.

Yeah, Raptor Lake's enhancements really made a big difference in performance over Alder Lake. The L3 cache's upgrade in read bandwidth was particularly interesting to me, as it almost doubled.
 
  • Like
Reactions: Henry swagger

Kocicak

Senior member
Jan 17, 2019
982
973
136
The trouble for AMD starts in the "journey" to said tuning, as the current experience is at disaster level, while Intel's is smooth sailing.

I must say that I recently built a system with 13600K, Z790 board and 32GB of DDR5 6000MHz RAM, just selected the 6000MHz profile and everything just works.

Before that I built a system with 12600K, Z690 board and 32GB of DDR4 3600MHz RAM, just selected the 3600MHz profile and the system was occasionally freezing.

Before that I had a system with a Ryzen 5600 and an RX 580 graphics card on an X570 board, and everything just worked.

Then I swapped the 5600 and graphics card for a 5700G, and the system started showing signs of instability, finally becoming unusable.

The same 5700G and RAM kit now just works in a B550 board.

I am writing this because I want to point out that building a PC can be a non-trivial thing: many settings, BIOS, drivers, OS, hardware incompatibilities, and user errors can affect the result.

I have no experience with AM5 yet.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
I have no experience with AM5 yet.

The key words of my post being "AM5 CPUs", right? I started overclocking back in the jumper era, but I've never had to do so many full CMOS resets as with AM5. They even have a jumper to short the battery; talk about achievements in convenient OC.
Oh, and don't get me started on the actual BIOS, where the same items are present in like 3 places and there are crazy interactions between them. It is only with AGESA 1.0.0.4 that things are really starting to look mature.

I must say that I recently built a system with 13600K, Z790 board and 32GB of DDR5 6000MHz RAM, just selected the 6000MHz profile and everything just works.

The problem with memory tuning is that it is not for the faint of heart. Most users just select an XMP profile and it either works or it doesn't. The problem is that vendors are hell-bent on making profiles "work", and in the process they destroy performance with stupid secondary/tertiary timings, and they also love feeding in ridiculous voltages like VCCSA that are not exactly required for said speeds.
There is also the problem of buying RAM outside the mainstream and its chances of working properly. Even with Intel, I was just building a system with 2x32GB of DDR4-4000 and the tertiary timings were a mess: sg/dg were fine, but _dr was stupidly, insanely high and was hurting performance.

So what is needed from reviewers is disclosure of full memory setup, so people can see and reproduce results.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,664
21,172
146
Overclocking, memory sub-timings, undervolting, etc.: all that stuff is esoteric. Big reviewers don't spend any real time on it for a reason. What the vast majority of DIYers and gaming PC buyers want to know, they provide.

That Cap douche picking fights with them to clout-chase is sad. Especially considering they have walk-off KO'd him repeatedly.
 
  • Like
Reactions: Thunder 57 and Hulk

Henry swagger

Senior member
Feb 9, 2022
389
246
86
A lot of the time they don't remove outliers. One of the biggest outliers I've ever seen was discussed several times on these forums and on Reddit, and it garnered them much criticism. It actually caused me to unsubscribe, because while they acknowledged the discrepancy, they didn't bother to investigate it in depth at all:

[Image: BFV.png (Battlefield V benchmark chart)]


The above chart makes no sense whatsoever. This was what Steve said about it in the review:



They never followed up on it at all. And in the exact same review, they tested Battlefield 2042 which uses the exact same fricking engine as Battlefield V and had the opposite results. They never should have included that test at all.

[Image: BF_2042.png (Battlefield 2042 benchmark chart)]


But by far the biggest issue I have with their testing methodology is that, for some reason, they are now not turning on RT in their CPU reviews. I don't see why on Earth they would do this, because having RT on increases CPU dependency and makes CPU testing a lot easier and more viable due to the much larger workload that ensues.

The only reason for them to disable RT is that it gives Intel CPUs the upper hand and makes AMD look one or sometimes two generations behind in performance depending on the game.
Steve admitted he is an AMD acolyte on his Discord.
 

Hitman928

Diamond Member
Apr 15, 2012
5,392
8,280
136
A lot of the time they don't remove outliers. One of the biggest outliers I've ever seen was discussed several times on these forums and on Reddit, and it garnered them much criticism. It actually caused me to unsubscribe, because while they acknowledged the discrepancy, they didn't bother to investigate it in depth at all:

[snip]

You're missing an important quote from that review in your criticism of HWUB.

From the performance summary section:
If we remove the Battlefield V data, which is potentially bugged due to an issue with maybe the E-cores, the 7600x is just 4% faster.

So @Schmide was correct: they gave total average results with and without the outlier. I believe they also mentioned on Twitter that they retested this game 5 times after triple-checking all of their settings and got the same results every time. That's more than I've seen almost any other reviewer do, as most don't even mention in the review that the results are probably bugged (ahem, TPU) or take the time to retest that many times. It also looks like they've removed this game from their CPU test suite after not being able to figure out why it gave unusual performance on their Intel rig. What more would you have them do? Ask EA for the source code so they can debug the game on their test system?
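The "with and without the outlier" summary works roughly like this sketch: reviewers typically combine per-game results with a geometric mean and then show how much one suspect title skews the overall figure. The game names and ratios below are made up purely for illustration.

```python
# Sketch: how one bugged title can skew a review's overall summary.
# Per-game results are combined with a geometric mean, reported both
# with and without the suspicious outlier. Numbers are hypothetical.
import math

def geomean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical fps ratios of CPU A vs CPU B (1.0 = tie).
ratios = {
    "Game 1": 1.04,
    "Game 2": 0.98,
    "Game 3": 1.02,
    "Bugged title": 1.56,  # the outlier in question
}

with_outlier = geomean(list(ratios.values()))
without = geomean([v for k, v in ratios.items() if k != "Bugged title"])
print(f"with outlier: {with_outlier:.3f}, without: {without:.3f}")
```

With made-up numbers like these, a single 56% outlier turns a roughly 1% overall lead into a 13% one, which is exactly why reporting both figures matters.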
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,709
10,983
136
Lets be clear here, when you focus on an outlier and make it representative of a myriad of factors, that is when you run into trouble.

It's bad enough to pick one game and choose that as an outlier. CapFrameX seems to specialize in finding one spot in one map in one game and benching while staring at a particular spot. On (apparently) buggy hardware, no less.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So @Schmide was correct, they gave total average results with and without the outlier. I believe it was on twitter too that they mentioned they retested this game 5 times after triple checking all of their settings and got the same results every time. This is more than I've seen almost any other reviewer do as most don't even mention in the review that the results are probably bugged (ahem, TPU) or take the time to retest so many times. It also looks like they've removed this game from their CPU test suite after not being able to figure out why it gave unusual performance on their Intel rig. What more would you have them do? Ask EA for the source code so they can debug the game on their test system?

They never should have uploaded that result. It wasn't an outlier, it was completely erroneous and nonsensical.

But while you're defending HWUB, explain why they suddenly decided to stop using RT in their CPU benchmarks. And their other little gem: nonsensical system power consumption. The 13600K system is for some reason drawing a lot more power than the 7600X, a power increase that could only be explained by the GPU, yet the 13600K somehow still performed worse than the 7600X according to them. A Plague Tale: Requiem is the most obvious example.

[Image: Power.png (system power consumption chart)]
 

Hitman928

Diamond Member
Apr 15, 2012
5,392
8,280
136
They never should have uploaded that result. It wasn't an outlier, it was completely erroneous and nonsensical.

We've already ridden around this carousel but to repeat myself, I disagree. There is no issue with sharing your results after doing your due diligence to make sure there isn't an issue in your setup. You should give commentary on the results explaining that it appears to be some kind of bug, as they did, and provide any kind of average or conclusion with the outlier results excluded, as they did. If everyone just ignored any results that seemed out of line, then lots of actual bugs and hardware issues that have been uncovered in the past would never have been brought to light and people would have been gaslit about issues they were experiencing on their own (see frame pacing, 970 VRAM config, high power consumption, etc.).

But while you're defending HWUB, explain why they suddenly decided to stop using RT in their CPU benchmarks.

Because it's a CPU benchmark and RT puts a tremendous amount of load on the GPU causing a GPU bottleneck in most cases and the vast majority of people who follow them told them they don't care about testing with RT on. You can usually drop the resolution low enough with the fastest card on the market to get to the point where the GPU is no longer a bottleneck, but then they would be going outside of their normal test flows just to test in corner cases that almost no one cares about.

And their other little gem, nonsensical system power consumption. 13600K system for some reason is drawing a lot more power than the 7600x, power increases that could only be explained by the GPU , yet the 13600K somehow still performed less than the 7600x according to them. A Plague Tale Requiem being the most obvious example.

[Image: Power.png (system power consumption chart)]

You just continue to recycle old arguments. I've already commented on this, but this is system power consumption, which can be affected by many things and can swing widely between systems, even with the same CPU and GPU. This is probably the most valid criticism of HWUB, in that it is not useful data for comparing CPUs, but trying to point to this as an example of bias or bad data is a huge stretch.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Because it's a CPU benchmark and RT puts a tremendous amount of load on the GPU causing a GPU bottleneck in most cases and the vast majority of people who follow them told them they don't care about testing with RT on. You can usually drop the resolution low enough with the fastest card on the market to get to the point where the GPU is no longer a bottleneck, but then they would be going outside of their normal test flows just to test in corner cases that almost no one cares about.

I'd have to see where the "vast majority" of people who follow them said they don't care about testing with RT on. I guess they must be all AMD fans, which would explain a lot. Whether you like it or not, RT is the biggest thing right now in real time 3D graphics and it's only going to become more prevalent as more and more games implement it. Attempting to pretend it doesn't exist serves no one.

As for RT putting a tremendous amount of load on the GPU and causing a GPU bottleneck: that's only if they are morons and screw up the testing by intentionally GPU-bottlenecking it. RT on at low resolution loads the CPU a lot but spares the GPU, provided it's an RTX 4090. This is exactly what other reviewers do, including PCGH.de and ComputerBase.de.

You just continue to recycle old arguments. I've again already commented on this but this is system power consumption which can be affected by many things and have wide swings between systems, even using the same CPU and GPU. This is probably the most valid criticism against HWUB in that it is not useful data for comparing CPUs, but to try and point to this as an example of bias or bad data is a huge stretch.

Yep, and like I said last time, there's no way on Earth anything other than the GPU could account for a 120W difference in power draw in a gaming workload. I could probably stretch my mind and swallow this nonsensical result if it were some CPU-bound test, but this was a gaming workload that was GPU-bottlenecked.
 
  • Like
Reactions: hemedans and Sulaco

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
I'd have to see where the "vast majority" of people who follow them said they don't care about testing with RT on. I guess they must be all AMD fans, which would explain a lot. Whether you like it or not, RT is the biggest thing right now in real time 3D graphics and it's only going to become more prevalent as more and more games implement it. Attempting to pretend it doesn't exist serves no one.

Since you don't watch them, the gaps in your summary really show. Their latest video is titled "Intel keeps dominating Intel Rolls AMD's ryzen" (note: I haven't watched it, so it could go either way). Also, they don't avoid RT, and often have a segment for it in their videos.

RT has its merits, but they only reach so far. In a sandbox game, where fidelity reigns supreme, it shows its worth; in competitive titles, not so much. I wonder how big the viewer group of FPS + RT extremists is? Regardless, they are two different metrics, and you seem to put all your weight on one side of the issue, causing a conflict with Hardware Unboxed. It is what it is.

As for RT putting a tremendous amount of load on the GPU and causing a GPU bottleneck, that's if they are morons and screw up the testing by intentionally GPU bottlenecking it. RT on with low resolution loads the CPU a lot, but spares the GPU provided it's an RTX 4090. This is exactly what other reviewers do, including PCGH.de, Computerbase.de.

There is a place for RT testing. Spider-Man really showed that, although in a strange way (I believe it was compiling, or at the very least updating, the shaders in real time, but I didn't go down that rabbit hole). For the most part, though, it's mostly a GPU load. Reducing resolution seems counterintuitive as a way of eliminating the bottleneck.


Final thought on Hardware Unboxed: their brand is their brand. If you choose to drag them over the coals due to your agenda, that's on you. I find their metrics clear, concise, and forthright. They do the work, so they get to call their shots.