"Real world" CPU Tests?

gregoryvg

Senior member
Jul 8, 2008
241
10
76
So does anyone know of any "real world" CPU tests? It seems like a lot of CPU tests are sanitized. In reality, most gamers have some sort of browser (Chrome, Firefox, or Explorer) open in the background (probably with several tabs), plus anti-virus software and several other programs running (Vent or the like).

I guess I'm wondering if there are differences in the CPU results as you add background tasks. Would a Ryzen 7 or 5 gain ground in those cases over the Coffee Lake CPUs?

Sorry if this question has been asked before; I'm sure it has.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
It's not 1999 anymore; the background tasks you're talking about hardly make a dent in a modern CPU, even one with only four cores or fewer.
Anti-virus can bog down any system because it makes heavy use of the hard drive; if the game is on the same drive being scanned, it will lose performance even though the CPU has nothing to do with it.
Everything else is a case of load balancing: if the OS gets it wrong by giving a heavy background task high priority, you'll have to adjust it by hand.
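A minimal sketch of doing that by hand, assuming Python with psutil (the target process name is just a placeholder, not something from this thread):

```python
# Drop a heavy background process to a lower priority so the game keeps the CPU.
# Requires psutil; "HandBrake.exe" is only a placeholder for whatever hog you're fighting.
import psutil

TARGET = "HandBrake.exe"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        # Windows priority class; on Linux you'd pass a nice value instead, e.g. proc.nice(19)
        proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)
        print(f"Lowered priority of PID {proc.pid}")
```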
 

gammaray

Senior member
Jul 30, 2006
859
17
81
Real-world CPU tests show that Ryzen is mostly on par with Intel when gaming at 1440p ultra settings and higher resolutions. At 1080p, Intel usually gets about a 10% lead, sometimes higher, sometimes lower. That's the gist I get.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
So does anyone know of any "real world" CPU tests? It seems like a lot of CPU tests are sanitized. In reality, most gamers have some sort of browser (Chrome, Firefox, or Explorer) open in the background (probably with several tabs), plus anti-virus software and several other programs running (Vent or the like).

I guess I'm wondering if there are differences in the CPU results as you add background tasks. Would a Ryzen 7 or 5 gain ground in those cases over the Coffee Lake CPUs?

Sorry if this question has been asked before; I'm sure it has.

Ryzen 7, yes; Ryzen 5, not so much. It depends on who you ask, but for the most part games are optimised for four cores and are starting to cap out what even the fastest 4c CPUs can manage. Some games, like AC Origins or AOTS, run better with more cores. But reasonably, with an R5/R7 or 8000-series i5/i7 you have extra resources for extra work. Not that a game couldn't run better if it were written better for MT usage, but regardless, whether it's heavy use of four cores or a load that scales across most of the cores, it still works out to roughly the full use of four cores (this happens when a game is optimised around the i7's use of HT). So you have extra compute cycles, and most external applications are low impact when not in use. Something else using the CPU wouldn't necessarily affect gameplay; on a maxed-out CPU, though, it could mean a large drop in frames just from receiving an email with your client open. You wouldn't see that in a game on a CPU with more than four cores.

Then there are actual workloads. I am currently re-encoding 1,200 video files to lower the space usage on one of my drives. With all 8 cores of my CPU (a 1700) going, the average right now is about 30 minutes per video. When I am playing a game, let's say roughly 4 cores' worth of work gets taken up by the game; processing time goes up to about 45 minutes per video at that point. With a 7700K, not only would it take, let's say, 40 minutes per video to begin with, but while gaming it would have nearly no processing headroom left and would basically pick up whatever scrap cycles it can get from HT. That could make a video take 10x as long to encode while playing a game.

Those are the drawbacks of staying so close to max CPU usage in games, and they are real-world examples. It's the spottiness and general unreliability of that kind of data that keeps it out of reviews. Take BF1, for example. Single player is almost completely limited by the optimizations for a 4c8t i7; the benchmarks are very repeatable, even when done as an actual playthrough, and the differences are minor, acting as a little bit of noise across multiple runs. But we know that the 4c8t design goes out the window in multiplayer: the more people, the more core count matters. Since it would take 100x the effort to collect that data, because of the massive number of runs needed to account for run-to-run differences, or the amount of choreography needed to get enough closely repeatable runs, it isn't worth the effort to techtubers looking to beat the other tubers to the punch.

So in regard to your question: the R5 provides no more CPU resources to throw at a problem than an 8700, the clock speed of an 8600 is going to make up for its lack of HT, and Coffee Lake has both higher IPC and higher clock speeds. There are a few reasons to go R5 over Coffee Lake, but they are all about economy, not performance.

The R7, on the other hand, has potential, but that potential is limited. It's only a two-core lead, and it's down on clocks and IPC. There are MT tasks and workloads it would shine in, but generally the Coffee Lake alternatives would be better.
 
Feb 25, 2011
16,789
1,469
126
Real-world CPU tests are a toughie, since different people use their computers differently, and for different things.

Benchmarks like 3DMark are supposed to represent "games in general" and you could easily run 3DMark while doing a virus scan (or, if you want it to be repeatable and scientific, a storage benchmark of some kind) to quantify the hit. But most "gamers" are going to schedule their AV software to run scans when they're at work or something, to avoid the surprise performance drop during "gaming hours."
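If you want a repeatable background load instead of a live virus scan, a rough sketch like this would do (Python; the file path, sizes, and pass count are arbitrary example values, and the reads will partly hit the OS cache, so it's a stand-in, not a proper storage benchmark):

```python
# Hammer a scratch file with writes and reads while the game benchmark runs,
# then compare scores with and without it.
import os

SCRATCH = "scratch.bin"        # put this on the same drive as the game
CHUNK = 64 * 1024 * 1024       # 64 MiB per pass
PASSES = 200

buf = os.urandom(CHUNK)
for _ in range(PASSES):
    with open(SCRATCH, "wb") as f:
        f.write(buf)
        f.flush()
        os.fsync(f.fileno())   # make sure it actually hits the disk
    with open(SCRATCH, "rb") as f:
        f.read()

os.remove(SCRATCH)
```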

For me, the only time I really notice my CPU chugging is when I'm surfing the web and the browser chokes on some website that's mostly JavaScript. But then again, is it my CPU's fault? Or the browser? Or what? So I tend to weight browser and 'office' benchmarks a little more heavily when I'm reading a CPU review. I do a fair amount of video encoding, but that's a "walk away and forgetabowdit" activity, so I don't care that there are CPUs available that are 2-3x as good at that.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
Then there are actual workloads. I am currently re-encoding 1,200 video files to lower the space usage on one of my drives. With all 8 cores of my CPU (a 1700) going, the average right now is about 30 minutes per video. When I am playing a game, let's say roughly 4 cores' worth of work gets taken up by the game; processing time goes up to about 45 minutes per video at that point. With a 7700K, not only would it take, let's say, 40 minutes per video to begin with, but while gaming it would have nearly no processing headroom left and would basically pick up whatever scrap cycles it can get from HT. That could make a video take 10x as long to encode while playing a game.
Not how the world works...
Video conversion takes advantage of all available cores; that's why it's so fast on your machine. If the converter runs at a high priority it will screw with your system even if you have many cores. What you describe only happens if the conversion starts at a very low priority, but that means it would behave the same way on the i7: it would leave the game alone, letting it take all the CPU power it wants, and only use free cycles.
And unless the real-world user has 1080 Tis in SLI and plays at 720p, there will be a lot of free CPU cycles.

(4 cores out of your 8 is 50% usage by the game; if so, the conversion would take a lot longer even on your system.)
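For what it's worth, a minimal sketch of that "starts at a very low priority" case, assuming Python on Windows and ffmpeg as the converter (file names and encoder settings are just example values):

```python
# Launch an ffmpeg re-encode below normal priority so the game gets first claim
# on the CPU. The creation flag is Windows-only; on Linux you'd prefix the
# command with `nice -n 19` instead.
import subprocess

subprocess.Popen(
    ["ffmpeg", "-i", "input.mkv", "-c:v", "libx265", "-crf", "28", "output.mkv"],
    creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS,
)
```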
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
Not how the world works...
Video conversion takes advantage of all available cores; that's why it's so fast on your machine. If the converter runs at a high priority it will screw with your system even if you have many cores. What you describe only happens if the conversion starts at a very low priority, but that means it would behave the same way on the i7: it would leave the game alone, letting it take all the CPU power it wants, and only use free cycles.
And unless the real-world user has 1080 Tis in SLI and plays at 720p, there will be a lot of free CPU cycles.

(4 cores out of your 8 is 50% usage by the game; if so, the conversion would take a lot longer even on your system.)

Not always. Not everything the CPU does is about feeding data to the GPU; BF1 MP is an example of that. Games generally use close to 8 threads of decent to heavy load (4 full cores' worth, with a bit of extra resource stealing via HT). It's one of the reasons a lot of games can show heavy core-count usage but still run worse than on the 4c counterparts. In the end, the workload itself is balanced against the capabilities of probably something like a 4770K. Sure, there are games out there that will leave spare cycles on a 4c8t setup, especially DX11 games running at high res. But when you're pegged, as many newer games are at 95% CPU usage at decent resolutions, you don't have a whole lot of spare cycles in there for video encoding.

As for the usage: I assumed some parallelism overhead, but fine, call it 1 hour vs. 30 minutes, and something like 300 minutes when a game has ~90% of a 4c8t system. The point still stands: low-core-count systems used with any sort of MT-heavy workload kind of become single-task systems. It's why streamers used to use secondary machines if they didn't get an X79 or X99 system. Obviously, if you are using it in a professional setting, it would be kind of a single-task machine during those periods anyway. But I was trying to give an idea of the kinds of real-world scenarios where more cores help out, in gaming and elsewhere.
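A back-of-the-envelope sketch of that scaling, assuming encode time is roughly inversely proportional to the fraction of the CPU left free (the baselines are this thread's rough figures, not measurements):

```python
# Simple inverse-scaling model for per-video encode time while gaming.

def encode_minutes(baseline_min: float, free_fraction: float) -> float:
    """Estimated per-video encode time given the fraction of the CPU left free."""
    return baseline_min / free_fraction

# 8-core R7 1700: ~30 min per video with the whole CPU free;
# a game grabbing ~4 of 8 cores leaves ~50% free.
print(encode_minutes(30, 4 / 8))   # ~60 min

# 4c8t CPU with a game pegging ~90% of it leaves ~10% free.
print(encode_minutes(30, 0.10))    # ~300 min
```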
 
Aug 11, 2008
10,451
642
126
Actually, heavy multitasking while gaming is a niche within a niche. "Real world" CPU usage is office, social media, web usage, and Pogo/Facebook games. Playing demanding games is already a niche in the overall spectrum of computer usage, and insisting on doing heavy CPU-intensive tasks at the same time is a niche (IMO a very small one) within that niche. How many hours per day can one game? Two, three, four at the outside? That leaves 20 or more hours to run other CPU-intensive tasks. Streaming is the only CPU-intensive task it makes sense to run while gaming, and even then it can be offloaded to the GPU or, for pros, to another machine entirely.