Oblivion performance with Ati x1900xt?

Page 2 - AnandTech Forums community discussion.

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I find it both sad and comical at the same time that so many people honestly think that a Pentium D 920 could cause an X1900XT to drop into the teens in this game.

To the AMD fanboys: While your AMD64s are faster in games, THEY ARE NOT FASTER BY A LARGE ENOUGH MARGIN TO MAKE MUCH OF A DIFFERENCE IN REAL WORLD PLAY. THE DIFFERENCE BETWEEN 120FPS AND 105FPS IS NOT GOING TO AFFECT GAMEPLAY. AN ATHLON 64 X2 3800+ IS AT MOST 10-15% FASTER THAN A PENTIUM D 920. STOP WITH ALL OF YOUR USELESS BIASED INFORMATION. IF HIS GPU IS RUNNING THAT SLOWLY IN THE GAME, THEN IT IS OBVIOUSLY GPU LIMITATION AND NOT CPU. CPU LIMITATION ONLY OCCURS AT HIGHER FRAMERATES.

Perhaps the capslock will allow thickheaded AMD fanboys to learn.

For another example, Oblivion is multi-threaded. A Pentium D 920 would run it better than an Athlon 64 FX-57 if the dual-core performance benefit is anywhere near what it should be.

The game is badly coded. It will have framerate drops on any machine in some scenarios. You can either wait for a patch, or you can use a bunch of tweaks that can help performance.
 

Golgatha

Lifer
Jul 18, 2003
12,396
1,068
126
Check the temps on the video card when it starts to slow down; it could just be overheating. Also, overclock that 920 if you have a decent HSF :).
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
If you're running with AAA, turn it off or on "Performance". It's a huge performance hit when outside.
 

framerateuk

Senior member
Apr 16, 2002
224
0
0
My system does around 20fps outside at the very least (running at 1920x1200 with everything maxed, 2xAAA, 16xAF).

The only time I see slowdown is when I'm in the town of Anvil looking down the main street; there are a lot of houses, people, and objects, and the framerate drops to about 14-15. It doesn't jump or stutter though, and the framerate is pretty constant in that area, so I don't really find it much of a problem.

I don't find I need an uber-high framerate in Oblivion (heck, I didn't have a smooth framerate in Morrowind until about 2 years later :)) as the game doesn't appear fast paced enough to require one, although if you do get attacked by a lot of things at once in the woods, you may notice quite a slowdown.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 920 is a great overclocker. All it takes is 15 min of cranking the FSB, getting to about 3.6GHz, and seeing if there is any improvement. As others suggested, try lowering the resolution. If the game still stutters at 800x600 or 1024x768, we know it's neither the CPU nor the GPU and has to be some other issue. Get the card to run at stock XT speeds. Maybe it's overheating.
 

framerateuk

Senior member
Apr 16, 2002
224
0
0
If you're getting stuttering, push the grass distance to max, as well as the object and tree distance; I found this smoothed out my gameplay no end, but it will lower your fps. It looks like Oblivion loads more into memory at once and doesn't have to load from the HD so often.

Make sure to check out the Tweakguides Oblivion tweak guide too.
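For reference, the grass settings that guide adjusts live in Oblivion.ini (under My Documents\My Games\Oblivion). A rough sketch of the relevant section follows; the key names and values below are quoted from memory of the common tweak guides, not verified against the shipping file, so check them against your own ini and back it up before editing:

```ini
[Grass]
; Higher value = sparser grass = higher fps (default is said to be around 80)
iMinGrassSize=120
; Distance at which grass starts to fade out; raising it draws grass further away
fGrassStartFadeDistance=7000.0000
```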
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: munky
Originally posted by: Dethfrumbelo
Munky was having the same issue. Maybe he's figured it out by now.

That would be me. I was playing at 1280x960 with maxed-out settings (HDR but no AA) and the frames would sometimes drop into the 20s outside, but never into the teens. At other times it would not go below 30, so this game is inconsistent in performance. Right now I'm playing at 1024x768 with HDR, 2xAAA, HQ AF, max settings, and it never goes below 25fps, but I have a 2.4GHz dual core Opteron, so I guess the P-D will have even lower minimum fps.

I would say this game just needs a patch to improve performance and fix some bugs, because it occasionally crashes, either on exit or during gameplay.


Sorry for the OT, but I am curious, munky... why no HDR + AA? I thought it was ONE of the main reasons to buy the X1900s at 1280x960?
 

Arkane13131

Senior member
Feb 26, 2006
412
0
0
Here are the results from a test I performed a while back. I'll redo the tests with my system overclocked soon, but I am using different drivers and .ini tweaks now, so it's a bit of work just to get my system back to the way it was for accuracy's sake. Anyway, I thought you might find it interesting.
---------------


I decided to perform a series of simple tests on how my machine runs Oblivion. Personally, I did this just to answer a few of my own questions, and posted it so that anyone else can use the results for reference.

I used the in-game console command tdt to monitor the FPS.

I used NVIDIA temp logger to monitor my graphics card's temperature.

I used Windows Task Manager to monitor the CPU usage. I'm not sure exactly how to get a point-by-point high/low % reading from it, so I will not offer exact % CPU usage scores. Instead I will use something easier to eyeball...

There are 5 squares on my Task Manager CPU monitoring chart, so I will be able to make a solid guess down to 1/10 of the usage, and thus will report in that style.

Computer Information

Windows XP Professional

AMD Athlon 64 X2 3800+ dual core at stock speed (2009.2 MHz)
Latest driver from AMD, no hotfix / registry editing.

EVGA 7900gt CO superclock at 550MHz Core and 1580Mhz (effective) Memory
Nvidia Beta 84.25 Drivers Image Setting to "Quality" AF + AA = Application Controlled, and "Global" Profile.

1 gig of craptastically cheap RAM at 2.5-4-4-8 (2x 512MB from different companies), 200.9MHz x2 = 400.18MHz effective. Set at 1T command rate.

EVGA SLI Nforce4 Mobo latest nvidia drivers

Antec Smartpower 2.0 500w PSU

Western Digital Caviar HD 80 gig

I use the onboard mobo sound.

Thermaltake Tsunami Dream case

I used the .exe hack from www.gameburnworld.com so that I do not need to have my CD in the drive to play. I do this so that I can keep the original game disc untarnished and also to decrease load times ever so slightly. I only say this as I want to include all the information I can about how this test was performed, though this will not increase or decrease FPS during this test.

Room temperature is slightly warm at 77F

Comp Temps
----------
Since I have the Everest trial version, I can only offer idle temperatures for the motherboard and CPU.

All Temps Idle
Mobo = 27C (81F)
CPU = 28C (82F)

GPU = 39C

To perform these tests I fast traveled to Shadeleaf Copse and left the character facing the exact direction it naturally faces coming from the Imperial City Palace. I chose this location because of the vast amount of trees, grass, and water. The ingame time was 2:24pm, and there was no rain. Just a typical sunny day.

I used 0 mods for this test, and the .ini file was default.

I save the game and reload for all the tests.
Each Test is run until the automatic 3rd person camera takes over.
Because this is such a short amount of time, it sort of makes the peak GPU temperature useless, although the temperature seems to flatline by this time. After 30 Minutes of gameplay or even 10 minutes expect temps to be much higher, especially in complex combat situations. Mostly I included them to see if we notice anything interesting.
During this test no NPC or animals entered the scene.

I'm sorry in advance for only reporting up to 1280x1024 resolution, but that is the max my monitor goes to. If you send me a better monitor, I would be happy to report the test results at its highest resolution.

Test Scenario 1
---------------

All ingame sliders were set to their maximum including all the sound sliders.
Texture size is set to Large
HDR Lighting
Everything was turned on or at its highest setting except for self shadows, as that function is currently bugged and puts beards (yes, beards...) on all the women -.-;
Cool N Quiet is set to off.


Results:

640x480 28-29 FPS core1 at 90% core2 always under 80% GPU peaked at 54C

800x600 26-27 FPS core1 at 90% core2 always under 80% GPU peaked at 54C

1024x768 22-23 FPS core1 was usually under 80% while core2 bounced a little above and below 90% GPU temp peaked at 58C

1280x1024 19-20 FPS core1 was at 80% usage the other was at 90% GPU temp peaked at 57C


Since this was the first series of tests I performed... I did them all 2 times -.-; and both times the results were the same.



Test Scenario 2
---------------
All ingame sliders were set to their minimum including all sound sliders.
Texture size is set to small
No HDR No Bloom No AA
Everything was turned off, or at its lowest possible setting.
Cool N Quiet is set to off.

A quick note... with the settings like this..all you see is some ugly water, a rock, and the sky covers the horizon where trees and grass used to be.

640x480 108-139 FPS Core1 peaked at 70% core2 Peaked at 60% GPU peaked at 45C

800x600 101-137 FPS core1 peaked at 80% core2 Peaked at 60% GPU peaked at 48C

1024x768 108-139 FPS Core1 peaked at 80% core2 peaked at 60% GPU peaked at 52C
Interesting that the FPS low and high were the same as 640x480... well, interesting if you're a computer dork.

1280x1024 111-150 FPS (150fps for over 3 seconds) Core1 Peaked at 85% core2 Peaked at 60% GPU peaked at 59C o_O


Well, I bet running these tests 2 times would vary the results; however, I have a lot more testing to do, so I may update this later with those variations if anyone gives a rat's ass.
For all I know, no one will care that I took the time to do this whole thing x.x; at least I care to know :D =P


Test Scenario 3
---------------

Same ingame settings as test scenario 1, except that I decided to run 2 instances of Prime95. Only the maximum and minimum resolutions will be featured for this test.
Because I am running 2 Prime95's both cores were at a constant 100% usage.

640x480 25-31FPS (usually 28-30) Core1 and Core2 Constant 100% usage GPU peaked at 59C

1280x1024 19-20 FPS Core1 and Core2 constant 100% usage GPU peaked at 59C again


Test Scenario 4
---------------
All settings the same as test 1 except cool n quiet is turned on.
I only tested the highest resolution.


1280x1024 17-20 fps, but a very, very consistent 19. The strange thing is both cores were consistent with each other, going from 80 to 60 to 80% usage, and it looked exactly like this: -_-. This held for a decent period of time. It seems like after the game stabilized, usage dropped to 60% and held until the 3rd-person camera kicked in. GPU peaked at 58C


Test5
-----

Same as test2 except cool n quiet is turned on.
I only tested the lowest resolution.


640x480 104-139 FPS Both cores at 60% usage GPU peaked at 46C


Test6
-----
For my final test, I decided to run the game at all medium settings:
All sliders in the middle except for the ones that limit view distance.
Textures on medium. Cool N Quiet on.
All settings that only had on/off or high/low options were set to on or high.
All settings that featured off/normal/high were set to normal.
Bloom lighting was used.

1024x768 35-36 FPS Core1 80% Core2 70% GPU peaked at 57C

I am sure that if I turned off the shadows-on-grass effect and the tree shadows, and set the shadow slider bars down lower, I could dramatically improve performance without a whole lot of visual degradation. Argh, screw it... I'll just test it.

Test7 (overtime)
-----
All settings same as test6 except for shadow sliders set down to 20%, tree canopy shadows off, and shadows on grass off.

1024x768 38-40fps cpu1 90% cpu2 60% GPU peaked at 58C


There you have it.

There is a lot of tweaking you can do to increase performance, such as lowering the height of the grass or screwing with the .ini file for hours and hours. But out of the box, with the recommended Oblivion NVIDIA drivers, these are my results.

Some people claim to be getting 60fps outside on a Radeon 9800 Pro with all settings maxed.
Some people claim to be getting 10fps outside on an X1900XT under similar circumstances with all settings maxed (at 1280x1024).

In any case, it is evident there is something wrong with this game's code.

I recommend anyone with a dual core system apply the multithreading .ini tweaks.
Also, if anyone out there isn't using the moon-->deathstar mod, shame on you...
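For anyone hunting for those multithreading tweaks, these are the lines commonly passed around for Oblivion.ini. Treat them as unverified community tweaks rather than documented settings (some may well be placebo), and back up the file before editing:

```ini
[General]
bUseThreadedBlood=1
bUseThreadedMorpher=1
bUseThreadedTempEffects=1
bUseThreadedParticleSystem=1

[Havok]
; Sometimes suggested alongside the above; the actual effect is disputed
iNumHavokThreads=2
```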
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
How do you lower the height of the grass? That grass is so annoying: I kill something from a distance, and then I wander around looking for it, until eventually I have to turn off the grass to see the dead mob. Or the attack music starts playing and you have no idea where in that tall grass the mob is coming from.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: dguy6789
AN ATHLON 64 X2 3800+ IS AT MOST 10-15% FASTER THAN A PENTIUM D 920. STOP WITH ALL OF YOUR USELESS BIASED INFORMATION. IF HIS GPU IS RUNNING THAT SLOWLY IN THE GAME, THEN IT IS OBVIOUSLY GPU LIMITATION AND NOT CPU. CPU LIMITATION ONLY OCCURS AT HIGHER FRAMERATES.

Perhaps the capslock will allow thickheaded AMD fanboys to learn.


You are both right and wrong. Most first-person shooters are not CPU intensive, so they are no longer blocked on the CPU anywhere over a 3200+ or so. So a 2.8GHz P4 is 10-15% slower than that CPU, or any other faster CPU (because a clock % difference only translates to about 3/4 of that % in performance).

Oblivion is a whole new animal. It really *DOES* want a fast CPU. If you don't believe me, underclock your CPU to 2.8GHz if it's a P4, or to about 1.6GHz if it's an Athlon 64. You *WILL* see a giant performance hit. A 3200+ appears not to be sufficient; it seems to want a 3400+ or faster.
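That rule of thumb (roughly 3/4 of a clock-speed gap shows up as real performance) works out to a quick back-of-the-envelope calculation. The 0.75 scaling factor is the poster's estimate for same-family CPUs, not a measured constant:

```python
def est_perf_gap(clock_slow_ghz, clock_fast_ghz, scaling=0.75):
    """Estimate the real-world performance gap between two CPUs of the
    same family from their clock speeds, assuming (per the rule of thumb
    above) only ~75% of the clock difference shows up as frame rate."""
    clock_gap = (clock_fast_ghz - clock_slow_ghz) / clock_slow_ghz
    return clock_gap * scaling

# A 2.8 GHz P4 vs. a 3.4 GHz one: ~21% clock gap -> ~16% estimated perf gap
print(f"{est_perf_gap(2.8, 3.4):.1%}")  # prints 16.1%
```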

 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
I'm glad I didn't go with dual core after all.

Given that I game at 1680x1050, dual core won't do anything for me in games.
 

framerateuk

Senior member
Apr 16, 2002
224
0
0
Originally posted by: munky
Or when the attack music starts playing and you have no idea where in that tall grass the mob is coming from.


I love the way that happens :)

I got stalked by a lion yesterday, and I didn't see it approaching until it pounced out of the grass :). I nearly died, but I was impressed ;)
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: moonboy403
i'm glad i didn't go with dual core after all

given that i game at 1680 x 1050, dual core won't do anything for me in games

Exactly. I held off purchasing an Opty 165 simply because I wanted to see how much benefit a dual-core CPU would have with this particular game. I, for the most part, don't do anything else with my computer that would take advantage of dual-core. While I do encode the occasional video, my system is only used for simple web/email and gaming. I'll be keeping my A64 3000+ for a little while longer, it seems. Though, I am sure, my next chip upgrade will likely be a multi-core chip. But that won't be until dual-core actually has some benefits*** at the settings I play at -- 1600x1200, HDR, 4xAA, 16xHQAF. To me, that extra $170+ for a 165 would be better spent on a faster GPU solution.

***I do not want to neglect to mention that many users report a better "feel" when gaming with the dual-core chips. While benchmarks might not show an increase in min/max/avg fps, users say the gaming experience is noticeably better since they upgraded (even at high resolutions). Normally, I would want "proof" such as benchmarks that show tangible results, but I've experienced something similar with adding another GB of ram for BF2. It stuttered and hitched at max settings 16x12 with 1GB. 2GB made all that go away, yet benchmarks showed no increase in fps at all.