Here are the results from a test I performed a while back. I'll redo the tests with my system overclocked soon, but I am using different drivers and .ini tweaks now, so it's a bit of work just to get my system back to the way it was for accuracy's sake. Anyway, I thought you might find it interesting.
---------------
I decided to perform a series of simple tests on how my machine runs Oblivion. Personally, I did this just to answer a few of my own questions, and I posted it so that anyone else can use the results for reference.
I used the in-game console command tdt to monitor the FPS.
I used NVIDIA temp logger to monitor my graphics card's temperature.
I used the Windows Task Manager to monitor CPU usage. I'm not sure exactly how to get a point-by-point high/low % reading from it, so I will not offer exact CPU usage percentages. Instead I will use something easier to eyeball...
There are 5 squares on my Task Manager CPU monitoring chart, so each square is 20% usage. Reading to half a square lets me make a solid guess down to about 1/10 of the usage, and I will report in that style.
Computer Information
Windows XP Professional
AMD 3800+ X2 dual core at stock speed (2009.2 MHz)
Latest Driver from AMD no hotfix / registry editing.
EVGA 7900GT CO Superclock at 550 MHz core and 1580 MHz (effective) memory
NVIDIA beta 84.25 drivers; Image Setting on "Quality", AF + AA application controlled, "Global" profile
1 gig of craptastically cheap RAM at 2.5-4-4-8 (2x 512 MB, each from a different company), 200.9 MHz x2 = 400.18 MHz effective, set at a 1T command rate
EVGA SLI nForce4 mobo with the latest NVIDIA drivers
Antec Smartpower 2.0 500w PSU
Western Digital Caviar 80 GB hard drive
I use the onboard mobo sound.
Thermaltake Tsunami Dream case
I used the .exe hack from www.gameburnworld.com so that I do not need to have my CD in the drive to play. I do this to keep the original game disc untarnished and also to decrease load times ever so slightly. I only mention it because I want to include all the information I can about how this test was performed; it will not increase or decrease FPS during the test.
Room temperature is slightly warm at 77F
Comp Temps
----------
Since I have the trial version of Everest, I can only offer idle temperatures for the motherboard and CPU.
All Temps Idle
Mobo = 27C (81F)
CPU = 28C (82F)
GPU = 39C
To perform these tests I fast traveled to Shadeleaf Copse and left the character facing the exact direction it naturally faces when arriving from the Imperial City Palace. I chose this location because of the vast amount of trees, grass, and water. The in-game time was 2:24pm, and there was no rain. Just a typical sunny day.
I used 0 mods for this test, and the .ini file was default.
I save the game and reload for all the tests.
Each Test is run until the automatic 3rd person camera takes over.
Because this is such a short amount of time, it makes the peak GPU temperature somewhat useless, although the temperature seems to flatline by this point. After 10 or 30 minutes of gameplay, expect temps to be much higher, especially in complex combat situations. Mostly I included them to see if we notice anything interesting.
During this test no NPC or animals entered the scene.
I'm sorry in advance for only reporting up to 1280x1024 resolution, but that is the max my monitor goes to. If you send me a better monitor, I would be happy to report the test results at its highest resolution.
Test Scenario 1
---------------
All ingame sliders were set to their maximum including all the sound sliders.
Texture size is set to Large
HDR Lighting
Everything was turned on or at its highest setting except for self shadows, as that function is currently bugged and gives beards (yes, beards..) to all the women -.-;
Cool'n'Quiet is set to off.
Results:
640x480: 28-29 FPS; core 1 at 90%, core 2 always under 80%; GPU peaked at 54C
800x600: 26-27 FPS; core 1 at 90%, core 2 always under 80%; GPU peaked at 54C
1024x768: 22-23 FPS; core 1 usually under 80% while core 2 bounced a little above and below 90%; GPU peaked at 58C
1280x1024: 19-20 FPS; core 1 at 80% usage, core 2 at 90%; GPU peaked at 57C
Since this was the first series of tests I performed... I ran them all twice -.-; and both times the results were the same.
Test Scenario 2
--------------
All ingame sliders were set to their minimum including all sound sliders.
Texture size is set to small
No HDR No Bloom No AA
Everything was turned off, or at its lowest possible setting.
Cool'n'Quiet is set to off.
A quick note... with the settings like this, all you see is some ugly water, a rock, and sky covering the horizon where the trees and grass used to be.
640x480: 108-139 FPS; core 1 peaked at 70%, core 2 peaked at 60%; GPU peaked at 45C
800x600: 101-137 FPS; core 1 peaked at 80%, core 2 peaked at 60%; GPU peaked at 48C
1024x768: 108-139 FPS; core 1 peaked at 80%, core 2 peaked at 60%; GPU peaked at 52C
Interesting that the FPS low and high were the same as at 640x480... well, interesting if you're a computer dork.
1280x1024: 111-150 FPS (150 FPS for over 3 seconds); core 1 peaked at 85%, core 2 peaked at 60%; GPU peaked at 59C
Well, I bet running these tests twice would vary the results; however, I have a lot more testing to do, so I may update this later with those variations if anyone gives a rat's ass.
For all I know, no one will care that I took the time to do this whole thing x.x; at least I care to know.

=P
Test Scenario 3
--------------
Same in-game settings as Test Scenario 1, except that I decided to run 2 instances of Prime95. Only the maximum and minimum resolutions are featured for this test.
Because I was running 2 instances of Prime95, both cores were at a constant 100% usage.
640x480: 25-31 FPS (usually 28-30); core 1 and core 2 at a constant 100% usage; GPU peaked at 59C
1280x1024: 19-20 FPS; core 1 and core 2 at a constant 100% usage; GPU peaked at 59C again
Test Scenario 4
--------------
All settings the same as Test 1, except Cool'n'Quiet is turned on.
I only tested the highest resolution.
1280x1024: 17-20 FPS, but a very, very consistent 19. The strange thing is that both cores were consistent with each other, going from 80% to 60% and back to 80% usage, holding each level for a decent period of time. It seems like after the game stabilized, usage dropped to 60% and held until the 3rd person camera kicked in. GPU peaked at 58C
Test 5
-----
Same as Test 2, except Cool'n'Quiet is turned on.
I only tested the lowest resolution.
640x480: 104-139 FPS; both cores at 60% usage; GPU peaked at 46C
Test 6
-----
For my final test, I decided to run the game at all medium settings:
All sliders in the middle, except for the ones that limit view distance.
Textures on medium. Cool'n'Quiet on.
All settings that only had on/off or high/low options were set to on or high.
All settings that featured off/normal/high options were set to normal.
Bloom lighting was used.
1024x768: 35-36 FPS; core 1 at 80%, core 2 at 70%; GPU peaked at 57C
I am sure that if I turned off the shadows-on-grass effect and the tree canopy shadows, and set the shadow sliders down lower, I could dramatically improve performance without a whole lot of visual degradation. Argh, screw it... I'll just test it.
Test 7 (overtime)
-----
All settings the same as Test 6, except the shadow sliders were set down to 20%, tree canopy shadows were off, and shadows on grass were off.
1024x768: 38-40 FPS; core 1 at 90%, core 2 at 60%; GPU peaked at 58C
There you have it.
There is a lot of tweaking you can do to increase performance, such as lowering the height of the grass or screwing with the .ini file for hours and hours. But out of the box, with the recommended Oblivion NVIDIA drivers, these are my results.
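If you want to eyeball the scaling between my max-settings and min-settings runs, here's a quick midpoint comparison of the Test 1 and Test 2 numbers above (just my own rough arithmetic on the FPS ranges I reported, nothing fancy):

```python
# Midpoints of the FPS ranges from Test Scenario 1 (max settings)
# and Test Scenario 2 (min settings), as reported above.
max_settings = {"640x480": (28, 29), "800x600": (26, 27),
                "1024x768": (22, 23), "1280x1024": (19, 20)}
min_settings = {"640x480": (108, 139), "800x600": (101, 137),
                "1024x768": (108, 139), "1280x1024": (111, 150)}

def midpoint(fps_range):
    low, high = fps_range
    return (low + high) / 2

for res in max_settings:
    ratio = midpoint(min_settings[res]) / midpoint(max_settings[res])
    print(f"{res}: min settings ~{ratio:.1f}x faster than max")
# 640x480: ~4.3x, 800x600: ~4.5x, 1024x768: ~5.5x, 1280x1024: ~6.7x
```

The gap widening at higher resolutions lines up with the min-settings runs barely changing between 640x480 and 1024x768 above.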
Some people claim to be getting 60 FPS outside on a Radeon 9800 Pro with all settings maxed.
Some people claim to be getting 10 FPS outside on an X1900 XT under similar circumstances with all settings maxed... (at 1280x1024).
In any case, it is evident that something is wrong with this game's code.
I recommend that anyone with a dual core system apply the multithreading .ini tweaks.
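For reference, here is a minimal sketch of what those multithreading tweaks look like in Oblivion.ini (under [General]). The exact keys and values vary between the versions floating around the forums, and I can't vouch for every one, so treat the names below as the commonly posted set and check them against your own file:

```ini
[General]
; Commonly circulated dual-core tweaks - key names/values as posted on
; the forums; verify each key exists in your own Oblivion.ini first.
bUseThreadedBlood=1
bUseThreadedMorpher=1
bUseThreadedParticleSystem=1
bUseMultiThreadedFaceGen=1
bUseMultiThreadedTrees=1
iNumHavokThreads=2
```

Back up Oblivion.ini (in My Documents\My Games\Oblivion) before changing anything.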
Also, if anyone out there isn't using the moon-->deathstar mod, shame on you...