This afternoon I tried to run some objective tests of dual core CPU utilization in Windows. I don't claim that these tests, and the results, are exhaustive, but I found them interesting. The rig used to perform the test is described in my sig.
To capture the CPU utilization data I used perfmon.exe with two counters, one "% Processor Time" counter for each processor instance, 0 and 1. I then performed a task on the system and screen-capped the resulting graph of CPU utilization. Simple enough. Here are my results.
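As an aside, the "% Processor Time" counter isn't measured directly: over each sampling interval Windows tracks how long the core's idle thread ran, and reports the remainder of the interval as busy time. A minimal sketch of that derivation (the function name is mine, not perfmon's):

```python
def processor_time_pct(idle_seconds: float, interval_seconds: float) -> float:
    """Per-core '% Processor Time' as perfmon derives it: the share of the
    sampling interval NOT spent in the idle thread, as a percentage."""
    busy = interval_seconds - idle_seconds
    return 100.0 * busy / interval_seconds

# Example: a core that sat in the idle thread for 0.25 s out of a
# 1 s sampling interval reads 75% Processor Time.
print(processor_time_pct(0.25, 1.0))  # → 75.0
```

This is also why short spikes can hide inside a sample: the counter is an average over the whole interval, not an instantaneous reading.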
http://www.perambula.net/media/1_starting_google_earth.jpg
A simple and quick task to begin with. Fired up Google Earth, zoomed in and spun the globe around a bit. Not surprisingly this program doesn't stress the machine much. In this test the program utilized both CPUs at roughly the same level, with utilization maxing out around 50%.
http://www.perambula.net/media/2_outlook_email_search.jpg
Thought this would be a good test, but I don't have enough email in my inbox to make the search take long. Both processors are utilized while the search is performed, but core 1 gets about 3x the utilization of core 0.
http://www.perambula.net/media/3_load_and_play_cod.jpg
These tests don't tell us much, so let's play some games. This is Call of Duty 1, single player, 1920 x 1200 widescreen, all gfx settings on high. The graph shows the CPU utilization as I load and play the game for about a minute. Note the very significant utilization of both cores, with core 1 again getting 2x to 3x the work handed to it. Note also that we never saturated either core.
http://www.perambula.net/media/4_bf2_load_and_log_in.jpg
In this test I load Battlefield 2 and log in to the account server. Settings are 1600 x 1200, all gfx settings on high. This program appears to like core 0 better. Still getting significant utilization of both cores.
http://www.perambula.net/media/5_bf2_In-game_single_player.jpg
In this test I logged in to the account server on BF2 and loaded a single player map. Same gfx settings. Played for about 2 or 3 minutes. The program still likes core 0 for the hard work. Both cores heavily utilized. No saturation.
http://www.perambula.net/media/6_bf2_In-game_multiplayer.jpg
Now this gets a little more interesting. Logged in and played a BF2 multiplayer game for a few minutes. Same gfx settings. Here we come closer to saturation on core 0 a couple of times, but never quite hit 100%. Utilization of core 1 is significant, at times exceeding utilization on core 0, but overall running at roughly 25% to 75% of core 0's level.
http://www.perambula.net/media/7_Playing_dvd.jpg
What about heavy-duty audio/video decoding? Here I use PowerDVD to play the Lord of the Rings: The Two Towers extended edition. Dolby Digital 5.1, 1920 x 1200. This task doesn't stress the machine much. PowerDVD likes core 1 for the harder work. Both cores are utilized significantly, with core 0 running at roughly 20% to 120% of core 1's utilization.
What these quick tests tell me is that even in the most intense application, i.e. BF2, having two cores prevents CPU saturation on this system at least, and clearly allows the operating system to spread the work out to a significant degree. There are CPU-intensive tasks that don't parallelize, so one core or the other gets used more heavily, but the second core still picks up a fair bit of the workload. Given the percentages here, I think it is clear that in a number of cases a single core would have been saturated.
Edit: One of the things that strikes me about the graphs is the horizontal symmetry of the results in CoD and BF2. As one core loads up the other offloads, and vice-versa. I don't yet know what this means. It might be an effect of the thread scheduling algorithm in Windows, or of something else that I haven't figured out yet.
Edit: The CoD graphs aren't really symmetrical in the way the BF2 graphs are, although you can still see the inverse relationship between the load on the two cores.
Edit: This evening I installed the recently released nVidia 81.82 beta drivers. These are the latest versions of the nVidia Forceware drivers, with optimizations for dual core processors. I set up the test as before, but this time I opened a perfmon log and spooled out 15 minutes of processor core utilization data while playing BF2. I did not notice a huge performance increase while playing (the game already runs very fluidly on high settings, and I have vsync on, so the graphics can't get faster than 60 fps anyway). The captured processor utilization data was eye-opening, to say the least.
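For anyone who would rather crunch the logged numbers than eyeball the graphs: a perfmon log can be converted to CSV (relog.exe will do this), giving one timestamped row per sample and one column per counter. Here's a hedged sketch of summarizing such a file in Python — the exact header strings and column order are assumptions, so check your own export's first row:

```python
import csv
import io

def per_core_averages(csv_text: str) -> dict:
    """Average each counter column in a perfmon-style CSV export.
    Assumes column 0 is the timestamp and every remaining column is a
    numeric '% Processor Time' counter."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    sums = [0.0] * (len(header) - 1)
    rows = 0
    for row in reader:
        for i, value in enumerate(row[1:]):
            sums[i] += float(value)
        rows += 1
    return {name: total / rows for name, total in zip(header[1:], sums)}

# Toy data in the shape of a two-counter export (machine name and
# values invented for illustration):
sample = (
    '"Time","\\\\RIG\\Processor(0)\\% Processor Time","\\\\RIG\\Processor(1)\\% Processor Time"\n'
    '"12:00:01",80.0,60.0\n'
    '"12:00:02",90.0,70.0\n'
)
print(per_core_averages(sample))  # averages: core 0 → 85.0, core 1 → 65.0
```

The same loop extends naturally to per-core maximums or a core-0/core-1 ratio per sample, which would put numbers on the "2x to 3x" eyeball estimates above.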
http://www.perambula.net/media/8_bf2_nvidia_beta_drivers.jpg
This image shows an overview of the test data. The strip has a duration of 14:39, so this view hides a lot of detail while giving a good look at core utilization over the whole run. Note the very high utilization of both cores.
http://www.perambula.net/media/8_bf2_nvidia_beta_drivers_detail.jpg
This image shows a detail view of about 3:25 of the data strip. Pretty amazing difference from the previous driver version.
