Well, an issue here is familiarity with the subject at hand. If a reviewer runs this test, it becomes counterproductive if either the reviewer or the audience is unfamiliar with the subject.

I completely agree. Streaming has become a profession for many people and is incredibly popular. Let's see benchmarks that fit this use case and don't distort anything further.
If you don't stream, then why does it matter? This is for people who stream at very high bit rates, at 1080p60 and beyond. Streaming can be an INCREDIBLY profitable job for those at the top. They want the best performance, and they don't care whether it comes from AMD or Intel.
This is not about e-peen, not about skewing gaming performance, not about distorting reality to show one camp is better than the other; it's about showing how streaming affects CPU core performance. I want to know more.
To take some of your points: the "top streamers" use separate dedicated machines if they're after maximum performance, and for that case you can just look at x264 encoding benchmarks.
The advantage of x86 software encoding shows up at the low bit rates required for streaming. Who's streaming at high bit rates beyond 1080p60?
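For context, this is the kind of bitrate-constrained software encode the argument is about. The command below is a sketch only; the input file name, bitrate cap, and ingest URL are illustrative assumptions, not anything specified in this thread:

```shell
# Hypothetical x264 streaming encode at a ~6 Mbps cap, the range where
# software encoding quality matters most. All specifics are assumptions.
#
#   -preset medium : x264 preset; slower presets = more CPU load, better quality
#   -b:v/-maxrate/-bufsize : constrain output to roughly 6 Mbps
#   -g 120 : keyframe every 2 seconds at 60 fps (common ingest requirement)
ffmpeg -i gameplay_1080p60.mkv \
  -c:v libx264 -preset medium \
  -b:v 6000k -maxrate 6000k -bufsize 12000k \
  -g 120 -keyint_min 120 -pix_fmt yuv420p \
  -c:a aac -b:a 160k \
  -f flv rtmp://live.example.com/app/STREAM_KEY
```

It's exactly this kind of CPU-bound encode running alongside a game that a meaningful streaming benchmark would have to measure.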
Unless they're going to do an entire article examining all the angles and educating the audience (and likely themselves in the process), it'll just be misleading numbers. Otherwise we're back to the even earlier days of benchmarking, where sites just threw a bunch of synthetic benchmark numbers out there, which translated to exactly no meaning for anyone.