Why is my Hardware crap ???


AdamK47

Lifer
Oct 9, 1999
15,783
3,606
136
I've seen benchmarks with Oblivion at FiringSquad here. It looks like the NVIDIA 7900 GTX keeps up with the ATI X1900 XTX and XT cards when using the newest drivers. However, performance takes a nosedive on anything less than a 7900 GTX in NVIDIA land. The ATI cards fare much better.

With the rig listed in my signature, I'm able to play at better than the Ultra High default settings. I have the tree and grass distances maxed, with shadow filtering on high. The actor, object, and item view distances are also bumped up 25% higher than default. The resolution is set to 1280x1024 with no AA or AF. It runs great, and I never feel I have to lower any detail settings while playing. The framerate is usually 30 fps or higher.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
If you look at the benchmarks, it's primarily the SM3.0b video cards (7900 series, X1900 series, etc.) that see a huge performance boost, even though they don't have as much extra horsepower over their predecessors as the framerate gap would suggest. Considering Oblivion uses SM2.0 shaders, this is most likely an issue that can be fixed with a patch. You could manually disable SM3.0 in the ini, but it would revert to SM1.0 (no shiny water), so I doubt many would be willing to take that course of action. Once they clear up this shader mess, I wouldn't be surprised if SM2.0-SM3.0a cards get a healthy 20-25% performance boost.

2GB of RAM helps, yes... for loading. Your in-game framerate is still highly dependent on your GPU. Also, realize Oblivion is not the best-programmed game. Bethesda is known for doing really, really dumb stuff like having little to no occlusion culling, which means everything off-screen is still being rendered. A studio like id or Epic seriously tweaks its engines to render mostly what's on-screen, which is why you see framerates in the 100s with current gear. Lowering the resolution is probably not going to give you a major framerate boost; nothing other than lowering view distance will (just like in Morrowind). Of course the game will look like crap then, but this has always been Bethesda's way of avoiding proper occlusion culling.
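To illustrate the distinction, here's a rough sketch of what a cull-then-draw render loop looks like — made-up types and helper names, emphatically not Gamebryo's actual code:

#include <vector>

struct Vec3 { float x, y, z; };
struct Plane { Vec3 n; float d; };   // point p is "inside" when dot(n,p) + d >= 0
struct AABB { Vec3 min, max; };
struct Object { AABB bounds; /* mesh data would live here */ };

static bool outsidePlane(const AABB& b, const Plane& p) {
    // Test the box corner furthest along the plane normal;
    // if even that corner is behind the plane, the whole box is out.
    Vec3 c = { p.n.x > 0 ? b.max.x : b.min.x,
               p.n.y > 0 ? b.max.y : b.min.y,
               p.n.z > 0 ? b.max.z : b.min.z };
    return p.n.x * c.x + p.n.y * c.y + p.n.z * c.z + p.d < 0;
}

static bool insideFrustum(const AABB& b, const Plane (&frustum)[6]) {
    for (const Plane& p : frustum)
        if (outsidePlane(b, p)) return false;
    return true;
}

// Placeholder: a real engine would consult a hierarchical z-buffer,
// a software occlusion rasterizer, or GPU occlusion queries here.
static bool hiddenByOccluders(const AABB&) { return false; }

void renderScene(const std::vector<Object>& scene, const Plane (&frustum)[6]) {
    for (const Object& obj : scene) {
        if (!insideFrustum(obj.bounds, frustum)) continue; // off-screen: skip
        if (hiddenByOccluders(obj.bounds)) continue;       // behind a hill/wall: skip
        // draw(obj); -- only what survives both tests gets rendered
    }
}

Skipping the second test (or implementing it half-heartedly) means everything inside the camera's view cone gets drawn even when a mountain sits in front of it, which matches the behavior described above.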

I'm one of those guys who likes to play in third-person view, since I can see so much more, especially with multiple opponents coming. I'm a fighter/mage (mostly fighter), so seeing more matters a lot for knowing where and when to block a blindside hit. If you are a mage, it's easier to play in first person, since you need to aim your spells and you're backpedaling most of the time anyway, with range being your advantage. In my experience, third-person view can be as much as 10+ fps slower than first person, so if you run in third person primarily, try first person for a while and see if it's any faster.

Lowering grass distance will probably get you back 3-5 fps as well. Turn off shadows completely and you might get back another 2-3. These are things you won't notice much (given how much HDR blurs the screen anyway).

 

Arkane13131

Senior member
Feb 26, 2006
412
0
0
Oblivion = poorly coded. Them's the facts of life.

Allow me to back up my claim.

"I decided to perform a series of simple tests on how my machine runs oblivion. Personally I did this just to answer a few of my own questions, and posted it so that anyone else can use the results for refrence.

I used the in-game console command tdt to monitor the FPS.

I used NVIDIA Temp Logger to monitor my graphics card's temperature.

I used Windows Task Manager to monitor the CPU usage. I'm not sure exactly how to get a point-by-point high/low % reading from it, so I will not offer exact CPU usage percentages. Instead I will use something easier to eyeball...

There are 5 squares on my Task Manager CPU monitoring chart, so each square is 20%, and by reading to the nearest half-square I can make a solid guess down to about 1/10 of the usage. I will report in that style.
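(If anyone would rather log this than eyeball it, here's a minimal Windows sketch built on the stock GetSystemTimes call. Note it reports overall CPU usage only, not per-core, so it's just an alternative to squinting at the Task Manager chart.)

#include <windows.h>
#include <cstdio>

// Convert a FILETIME (100ns ticks) to a plain 64-bit count.
static unsigned long long toU64(const FILETIME& ft) {
    return (static_cast<unsigned long long>(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
}

int main() {
    FILETIME idleA, kernA, userA, idleB, kernB, userB;
    GetSystemTimes(&idleA, &kernA, &userA);
    for (;;) {
        Sleep(1000); // one sample per second
        GetSystemTimes(&idleB, &kernB, &userB);
        // Kernel time includes idle time, so total = kernel + user,
        // and busy = total - idle.
        unsigned long long idle  = toU64(idleB) - toU64(idleA);
        unsigned long long total = (toU64(kernB) + toU64(userB))
                                 - (toU64(kernA) + toU64(userA));
        printf("CPU: %.0f%%\n", 100.0 * (total - idle) / total);
        idleA = idleB; kernA = kernB; userA = userB;
    }
}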

Computer Information

Windows XP Professional

AMD Athlon 64 X2 3800+ dual core at stock speed, 2009.2 MHz
Latest driver from AMD, no hotfix or registry editing.

EVGA 7900 GT CO Superclock at 550 MHz core and 1580 MHz (effective) memory
NVIDIA beta 84.25 drivers, image setting on "Quality", AF + AA application-controlled

1 GB of craptastically cheap RAM at 2.5-4-4-8 (2x 512 MB, each from a different company), 200.9 MHz x2
= 400.18 MHz effective. Set at a 1T command rate.

EVGA nForce4 SLI mobo with the latest NVIDIA drivers

Antec Smartpower 2.0 500w PSU

Western Digital Caviar HD 80 gig

I use the onboard mobo sound.

Thermaltake Tsunami Dream case

I used the .exe hack from www.gameburnworld.com so that I do not need the CD in the drive to play. I do this to keep the original game disc untarnished and also to decrease load times ever so slightly. I only mention it because I want to include all the information I can about how this test was performed; it will not increase or decrease FPS during the test.

Room temperature is slightly warm at 77F

Comp Temps
----------
Since I have the Everest trial version, I can only offer idle temperatures for the motherboard and CPU.

All Temps Idle
Mobo = 27C (81F)
CPU = 28C (82F)

GPU = 39C

To perform these tests I fast traveled to Shadeleaf Copse and left the character facing the exact direction he naturally faces when arriving from the Imperial City Palace. I chose this location because of the vast amount of trees, grass, and water. The in-game time was 2:24 PM, and there was no rain. Just a typical sunny day.

I used 0 mods for this test, and the .ini file was default.

I save the game and reload for all the tests.
Each test is run until the automatic third-person camera takes over.
Because this is such a short span of time, the peak GPU temperature is of limited use, although the temperature seems to flatline by then. After 30 minutes of gameplay, or even 10, expect temps to be much higher, especially in complex combat situations. Mostly I included them to see if we notice anything interesting.
During these tests no NPCs or animals entered the scene.

I'm sorry in advance for only reporting up to 1280x1024 resolution, but it is the max my monitor goes to. If you send me a better monitor, I would be happy to report test results at its highest resolution.

Test Scenario 1
---------------

All in-game sliders were set to their maximum, including all the sound sliders.
Texture size is set to Large.
HDR lighting on.
Everything was turned on or at its highest setting, except for self shadows, as that feature is currently bugged and gives beards (yes, beards..) to all the women -.-;
Cool N Quiet is set to off.


Results:

640x480: 28-29 FPS, core1 at 90%, core2 always under 80%, GPU peaked at 54C

800x600: 26-27 FPS, core1 at 90%, core2 always under 80%, GPU peaked at 54C

1024x768: 22-23 FPS, core1 usually under 80% while core2 bounced a little above and below 90%, GPU peaked at 58C

1280x1024: 19-20 FPS, core1 at 80% and core2 at 90%, GPU peaked at 57C


Since this was the first series of tests I performed... I ran them all twice -.-; and both times the results were the same.



Test Scenario 2
--------------
All in-game sliders were set to their minimum, including all sound sliders.
Texture size is set to Small.
No HDR, no bloom, no AA.
Everything was turned off or at its lowest possible setting.
Cool N Quiet is set to off.

A quick note... with the settings like this, all you see is some ugly water, a rock, and the sky covering the horizon where the trees and grass used to be.

640x480: 108-139 FPS, core1 peaked at 70%, core2 peaked at 60%, GPU peaked at 45C

800x600: 101-137 FPS, core1 peaked at 80%, core2 peaked at 60%, GPU peaked at 48C

1024x768: 108-139 FPS, core1 peaked at 80%, core2 peaked at 60%, GPU peaked at 52C
Interesting that the FPS low and high were the same as 640x480... well, interesting if you're a computer dork.

1280x1024: 111-150 FPS (150 FPS for over 3 seconds), core1 peaked at 85%, core2 peaked at 60%, GPU peaked at 59C o_O


Well, I bet running these tests twice would vary the results; however, I have a lot more testing to do, so I may update this later with those variations if anyone gives a rat's ass.
For all I know, no one will care that I took the time to do this whole thing x.x; at least I care to know :D =P


Test Scenario 3
--------------

Same in-game settings as Test Scenario 1, except that I decided to run two instances of Prime95. Only the maximum and minimum resolutions are featured for this test.
Because I was running two Prime95 instances, both cores were at a constant 100% usage.

640x480: 25-31 FPS (usually 28-30), core1 and core2 at a constant 100% usage, GPU peaked at 59C

1280x1024: 19-20 FPS, core1 and core2 at a constant 100% usage, GPU peaked at 59C again


Test Scenario 4
--------------
All settings the same as Test 1, except Cool N Quiet is turned on.
I only tested the highest resolution.


1280x1024: 17-20 FPS, but a very, very consistent 19. The strange thing is... both cores were consistent with each other, going from 80 to 60 to 80% usage, and it looked exactly like that for a decent period of time -_-. It seems like after the game stabilized, usage dropped to 60% and held until the third-person camera kicked in. GPU peaked at 58C.


Test 5
-----

Same as Test 2, except Cool N Quiet is turned on.
I only tested the lowest resolution.


640x480: 104-139 FPS, both cores at 60% usage, GPU peaked at 46C


Test 6
-----
For my final test, I decided to run the game at all medium settings:
All sliders in the middle, except for the ones that limit view distance.
Textures on Medium. Cool N Quiet on.
All settings that only had on/off or high/low options were set to on or high.
All settings that offered off/normal/high were set to normal.
Bloom lighting was used.

1024x768: 35-36 FPS, core1 80%, core2 70%, GPU peaked at 57C

I am sure that if I turned off the shadows-on-grass effect and the tree shadows, and set the shadow slider bars down lower, I could dramatically improve performance without a whole lot of visual degradation. Argh, screw it... I'll just test it.

Test 7 (overtime)
-----
All settings the same as Test 6, except shadow sliders set down to 20%, tree canopy shadows off, and shadows on grass off.

1024x768: 38-40 FPS, core1 90%, core2 60%, GPU peaked at 58C




There you have it.

There is a lot of tweaking you can do to increase performance, such as lowering the height of the grass or screwing with the .ini file for hours and hours. But out of the box, with the recommended Oblivion NVIDIA drivers, these are my results.

Some people claim to be getting 60 FPS outside on a Radeon 9800 Pro with all settings maxed.
Those people are probably lying or deranged.
Some people claim to be getting 10 FPS on an X1900 XT outside under similar circumstances with all settings maxed (at 1280x1024). Those people are out of their minds... or they need to stop running virus scanners, Prime95, video decoders, Fraps, Morpheus, Azureus, Photoshop, disk defrag, Pro Tools, and Doom 3 in the background."

Double Overtime
-------------------

Everything the same as the very first test, except I overclocked my CPU from 2000 MHz to 2400 MHz.

Let's look at these results.

640x480: 30 FPS
1280x1024: 20 FPS
-------------------------------


Given that a CPU overclock did not yield any gain, and that cutting the resolution to less than a quarter of the pixels only offers a 50% gain...
comments? (quick arithmetic below)
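Here's the back-of-the-envelope math on that claim, using the FPS numbers measured above. If the game were purely GPU fill-rate bound, FPS should scale roughly inversely with pixel count (just arithmetic, nothing from Bethesda):

#include <cstdio>

int main() {
    const double lowPixels  = 640.0 * 480.0;    //   307,200 pixels
    const double highPixels = 1280.0 * 1024.0;  // 1,310,720 pixels
    const double fpsLow  = 30.0;  // measured at 640x480 (Double Overtime run)
    const double fpsHigh = 20.0;  // measured at 1280x1024

    printf("pixel ratio: %.2fx\n", highPixels / lowPixels);  // ~4.27x the pixels
    printf("fps ratio:   %.2fx\n", fpsLow / fpsHigh);        // only 1.5x the fps
    // A purely fill-rate-bound game would drop to roughly:
    printf("expected fill-bound fps: %.1f\n", fpsLow * lowPixels / highPixels); // ~7.0
    return 0;
}

Seeing 20 FPS instead of roughly 7 at the high resolution, while a 20% CPU overclock changes nothing, points to a bottleneck that is neither raw fill rate nor CPU clock speed.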
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
I think your Test 2 cemented the fact that this game is not resolution-bound, but rather bound by some phantom variable, most likely poor occlusion culling.

Second, you have to give Bethesda props for attempting such a huge game; it's their niche, after all. Dungeon Siege is like this too, but there you can't explore 360 degrees for hours; the game world is a totally different size.

Third, sadly, the graphics aren't that impressive. NWN2 is coming out this summer, and although it's obviously going to have smaller levels (more level loading), it shows we don't have to play Oblivion as if it's the end-all be-all. Even nicer that NWN2 is just a holdover game for Dragon Age! IMO Bioware > Bethesda by quite a bit.

PS: Didn't you hate it when Bethesda said "we aren't including dual-wielding because too many other games have it, and it would be 'dumbing down' the game"? Obviously a not-too-discreet jab at Bioware, since those "too many other games" happen to ALL be Bioware games. With +1 attack, and -2 main-hand and -4 off-hand penalties, with no shield benefit, I certainly don't see how a balanced dual-wielding style would have "dumbed down" Oblivion. More like "our animators aren't up to it, even though fans have been asking for 8 years."
 

Arkane13131

Senior member
Feb 26, 2006
412
0
0
I definitely give them props on this game... hell, I have 100 hours into it already. It's a great game, and honestly, that location is the lowest-FPS-yielding spot I could find that one can just fast travel to. Morrowind was not coded very well either. It had that same kind of phantom variable, whatever it is, and a memory leak that was terrible (at least on my old machine): the game would slowly bog down over an hour, then crash the whole system.

I actually enjoy Oblivion's graphics. I have never seen a nicer-looking outdoor world in a game. I'm not sure how people got Far Cry to compare (as many claim it looks better after modifications), but I'm one of the freakishly weird people that did not enjoy Far Cry that much, so I did not play it very much. The reason a lot of people hated the graphics of Oblivion is that, as you exit the sewers for your very first time, they face you at a giant mountain off in the distance... that has Mario 64-quality textures. The ugliest textures I have seen in a video game in years. Why introduce the awesome outdoors to the player by pointing them at its ugliest blemish? Whoever on the Bethesda team decided that was acceptable should be slapped.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Basically, you can increase the texture distance (and reduce the "soupy mountain"), at the cost of a minor performance loss, by tweaking the ini file; see the sketch below.
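The usual suspects are the distant-land and grass entries in Oblivion.ini (under My Documents\My Games\Oblivion). The keys and default values below are from memory of the common tweak guides, so verify them against your own file; the semicolon annotations are mine and should not be pasted into the ini itself:

[General]
uGridsToLoad=5           ; full-detail cells around the player; raising this is very expensive
uGridDistantCount=25     ; distant land radius; higher = sharper far terrain, lower FPS
uGridDistantTreeRange=15 ; how far out distant trees are drawn

[Grass]
fGrassStartFadeDistance=7000 ; raise to push grass farther out, lower to claw back FPS
iMinGrassSize=80             ; raise to thin out grass density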

The funny thing, as we've come to see, is that maximum settings are hardly any more strenuous than medium settings. It's the view distance that's killing the framerates.
 

imported_Phil

Diamond Member
Feb 10, 2001
9,837
0
0
Given that a CPU overclock did not yield any gain, and that cutting the resolution to less than a quarter of the pixels only offers a 50% gain...

This is not an uncommon phenomenon with the higher-spec graphics cards, which won't "stretch their legs" until you get them up to higher resolutions.
 

alimoalem

Diamond Member
Sep 22, 2005
4,025
0
0
OP... maybe you should note in your original post that you fixed the problem, and restate there how you fixed it... half these people are replying without reading through the thread...
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
I think most people are just using this space now to discuss the game.

But thanks for the veiled insult. By the way, you do realize the OP doesn't need to repeat what you've said? Unless you're mute, of course, and you're only telepathically communicating with the OP.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Didn't read all the posts in between, but...
1. your GPU can't handle Oblivion with AA/AF on, nor could the 7900GT

I can run AA with bloom on at 1600x1200 with no problems, even on my 7800GT card.

Yes, he has fixed the problem.