Forcing Anti-aliasing without giving up HDR on Oblivion


benlogan87

Junior Member
Apr 10, 2006
1
0
0
I just installed the HDR patch from ATI. It works great except in the forests, where the frame rate is around 25. I have an X1900XTX and an AMD X2 4800+.

Anyway, for the first 2 minutes or so the game looks EXACTLY like it does in the picture, except it is flashing constantly. It will then occasionally start again, then stop, etc.

http://www.mediarogue.com/screen.jpg

When I disable Catalyst AI it stops, but then I lose the HDR+AA.

-------------------------------------------------------------------------------------------------

OK, I fixed the huge pink border. I just removed my tweaked INI and went back to the defaults. HDR+AA works, with occasional black flashing outdoors.

I wonder what was in my INI that it didn't like? (See the ini-reset sketch after this post.)

-------------------------------------------------------------------------------------------------


I have a VX924 19-inch monitor. Does having vertical sync enabled cause this flashing?
I have confirmed that it only happens in really bright areas, or areas with lots of lighting.
Anyone have any ideas?

Thanks.
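
For anyone hitting the same thing, here is a minimal sketch (C++17) of the ini-reset fix benlogan87 describes above: back up the tweaked Oblivion.ini and let the game rebuild a default one on the next launch. The path is an assumption - Oblivion normally keeps its ini under My Documents\My Games\Oblivion - so adjust it for your own system.

#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    // Hypothetical location; verify where your install keeps Oblivion.ini.
    const fs::path ini = "C:/Users/you/Documents/My Games/Oblivion/Oblivion.ini";
    const fs::path backup = ini.string() + ".tweaked.bak";

    std::error_code ec;
    if (!fs::exists(ini, ec)) {
        std::cerr << "Oblivion.ini not found - adjust the path above.\n";
        return 1;
    }
    // Keep the tweaked file so individual settings can be diffed back in later.
    fs::copy_file(ini, backup, fs::copy_options::overwrite_existing, ec);
    if (ec) {
        std::cerr << "Backup failed: " << ec.message() << '\n';
        return 1;
    }
    fs::remove(ini, ec); // the game regenerates a default ini on next launch
    std::cout << "Backed up to " << backup << "; launch the game to rebuild defaults.\n";
    return 0;
}

Keeping the backup means you can diff it against the fresh default afterwards and reintroduce tweaks one at a time until the flashing comes back, which pinpoints the offending setting.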
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: munky
I played Far Cry with the 1.4 patch, and HDR + AA worked. Those who can't get it to work probably don't know how to enable it correctly. I didn't mind the unlimited ammo, but it did fail to load one level, so I went back to 1.3.

Also, SS2 and AOE3 use EXR HDR with AA on the X1K cards.

I freaked out too when I got unlimited ammo after using HDR+AA in Far Cry; I thought I'd borked my install.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: FalllenAngell
Originally posted by: munky
Originally posted by: Sc4freak
Oblivion doesn't allow HDR + AA on any card besides the Xbox 360. It's something to do with the way they did the HDR (I don't think it is FP16 HDR).

Why would it not use FP blending? The only other way I can think of doing HDR is with shaders (like Lost Coast), but that should not interfere with AA. Either way, the lack of an HDR + AA option is ridiculous, especially for a game that was hyped beyond belief. Either the developers got lazy, or somebody paid them a chunk of cash NOT to enable HDR + AA.

Or perhaps they made a business decision and decided the increased sales for an HDR+AA-enabled game would not outweigh the cost of making it one, due to a small user base (most of whom will buy the game anyway).

Not everything in life is a conspiracy.

I would say this is a conspiracy.
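
For readers lost in the FP-blending vs. shader-HDR exchange above: with FP16 HDR, the scene is rendered into a floating-point buffer whose values run well past 1.0, and a tone-map pass then compresses that range for an 8-bit display. The HDR+AA clash on GeForce 6/7 comes from those chips being unable to multisample FP16 render targets, which the X1K series can. A toy sketch of the tone-map idea, purely illustrative (this is not Oblivion's or Valve's actual code):

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Reinhard-style operator: compress unbounded luminance to [0,1), then gamma-encode.
std::uint8_t tonemap(float hdr, float exposure = 1.0f) {
    float l = hdr * exposure;
    float ldr = l / (1.0f + l);              // 60.0 -> ~0.98, 0.5 -> ~0.33
    float srgb = std::pow(ldr, 1.0f / 2.2f); // rough gamma encoding
    return static_cast<std::uint8_t>(std::clamp(srgb, 0.0f, 1.0f) * 255.0f + 0.5f);
}

int main() {
    // FP16 targets exist precisely so values like 4.7 or 60.0 survive blending;
    // an 8-bit target would have clipped them to 1.0 long before this pass.
    std::vector<float> hdrPixels = {0.02f, 0.5f, 1.0f, 4.7f, 60.0f};
    for (float p : hdrPixels)
        std::printf("%6.2f -> %3u\n", p, static_cast<unsigned>(tonemap(p)));
    return 0;
}

Lost Coast-style shader HDR range-compresses inside the shader and writes to an ordinary integer buffer, which is why that method coexists with MSAA on any hardware.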
 

MDesigner

Platinum Member
Apr 3, 2001
2,016
0
0
I'm jumping in a little late here. I just got my 7900GT the other day and hooked it up with the rest of my parts (Athlon 64 X2 4400+). I forced 8xS AA in the nVidia settings and turned on HDR in Oblivion. I can't tell if HDR is actually on or not; the lighting looks better than with nothing on (neither HDR nor Bloom), but I don't know how to tell the difference between Bloom and HDR.

I suppose I'll just try 8xS with no HDR, then HDR with no AA, and keep whichever I think looks better.

Also, a quick question: you know how in Oblivion, when you're running around outside, bushes and trees draw in at a certain distance? How do I adjust that distance without messing with the .ini file? Is there an option somewhere, or does the game auto-set it based on your graphics hardware? Incidentally, when I installed Oblivion on this new machine, it said the hardware was not recognized :) Gee, because a 7900GT is so uncommon. Weird.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
No, you are not getting HDR+AA.

NV users, understand this: you cannot get HDR+AA.
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
I believe Nvidia's hardware is incapable of running HDR+AA. HDR+AA is available in Oblivion with the Chuck patch, though only on ATI cards.
 

Madellga

Senior member
Sep 9, 2004
713
0
0
I spent the last 2 days playing Oblivion with both cards (X1900XTX and 7900GTX).

I bought an X1900XTX last Saturday. My idea was to test-drive it on Oblivion and sell the GTX, like Nitromullet did last week.

I plan to start a thread about both cards. I also produced some material (Fraps videos) on Guild Wars and KOTOR. I still have to encode them (each video is 400MB raw, about 18MB encoded to MPEG-2 with Nero; I must try DivX as well) and upload them to a webpage, so it will take some time. My idea is to show people the shimmering and transparency AA differences between the two cards.

As a starter, I can tell the following:

1) The X1900 (Cat 6.4 + Chuck) can do HDR and 4xAA (nothing new here). It is very playable at 1920x1200 (HDR, 4xAA, HQ AF); frame rates are normally above 30fps outdoors. Near grass, it drops to around 22. I also swapped HDR for bloom and normally could not tell the difference without a screenshot. I might be blind, but the two look quite close to me.

2) The 7900GTX (ForceWare 84.43) also does well with Bloom, 4xAA, HQ, and LOD clamp at 1920x1200. It also stays above 30fps, even near grass. IQ is good, but NEVER turn on transparency MSAA or SSAA: that throws the frame rate down to 12fps near grass or when it is raining, with no noticeable improvement anyhow. I don't recall temporal AA being an issue on the X1900, but there was no improvement there either.

3) I would say the HDR feature is a bit hyped in this game: bloom was good enough for me.

4) I found the GTX's IQ more pleasant to my personal taste. Trees and castles look better from a long distance on the GTX than on the ATI. It is hard to describe, and a screenshot does not show it, but there is a kind of shimmering at distance. I tried to capture it on video, but the compression loses it.

5) The X1900 was not that loud using ATITool fan control (dynamic profile) in most games, but the cooler screams in Oblivion. No change in noise detected on the GTX - meaning it stays quiet even playing Oblivion.

Summary: both cards are able to give you an awesome gaming experience in Oblivion. The X1900 has the HDR advantage, with similar or better performance than the 7900GTX (with bloom). On the other hand, the 7900GTX does not suck in this game as some people want us to believe, and it has other advantages of its own.

If Oblivion is the only game that matters to you, you want to save some bucks, and you don't care about noise, get the ATI.
If you play other games as well, value transparency MSAA/SSAA and a quiet solution, and don't mind a few bucks more, get the GTX.

I used my wife's opinion in the decision process. She doesn't know what Nvidia or ATI are, but she saw the game last night (near Bruma and the Great Forest) on both cards (including HDR on the X1900), and she picked Nvidia. I didn't argue with her; I was torn between the cards, and she made things easier for me. The X1900XTX was packed up and returned to the store (7-day return policy) this afternoon.
 
Jun 14, 2003
10,442
0
0
7800GTX

no HDR and AA for you

Only the X1800 and X1900 are capable of that feat.

I'd imagine what you are seeing is possibly the bloom effect, which is not true HDR.

For the ATI owners, there is a patch, called the 'Chuck' patch, which enables AA and HDR on the ATI cards (X1800/X1900). Even so, I would say the X1900 is the only one with the muscle to make it happen.

There is just no way you're getting HDR and AA, unless the game had an HDR implementation akin to what Valve did with Lost Coast. AFAIK Oblivion uses the OpenEXR (FP16) type.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Madellga
I spent the last 2 days playing Oblivion with both cards (X1900XTX and 7900GTX). [snip - full post quoted above]

Nice review... I'm going to try to get some screenshots with AA on and off at 1920x1080 to assess whether it's even worth the frame-rate cost to enable. Personally, I think it's a moot point at higher res.

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Madellga
I spent the last 2 days playing Oblivion with both cards (X1900XTX and 7900GTX). [snip - full post quoted above]

Excellent post; it's nice to see some unbiased and honest opinions. Ironic that when that happens, NV comes out on top :thumbsup: considering all the negative talk going around about NV+HDR+AA.
Not a surprise at all, though. Glad to hear you got to fully test both cards and come to that conclusion; I think the majority of people would come to the same conclusion as well.

The blind "wife test" was the real kicker to it all. No bias at all there. :thumbsup:
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: benlogan87
I just installed the HDR patch from ATI. It works great except in the forests, where the frame rate is around 25. [snip] HDR+AA works, with occasional black flashing outdoors. [snip] Anyone have any ideas?

Wow, this forum is pretty crappy... No one could stop bickering long enough to answer this guy, and on his first post, too...

Black flashing is bad... It could be caused by overheating (or your monitor's power cable being loose). I had a very similar issue with an X1900XTX as well. Aside from the black flashing, do you occasionally see stretched triangles and diamonds flashing on the screen? A good way to stress your card to see if it is overheating is to run 3DMark06. With my defective card, I would get serious artifacts and flashes like you described in the Canyon Flight test (the one with the airship and the dragon/sea monster thing). If you start to see flashes and/or other distortion, that means your card is overheating or otherwise damaged. BTW, you can hit Esc to cancel the demo at any time if you get visual artifacts. If you don't, the artifacts will get worse and your box will eventually crash.

What kind of X1900XTX do you have, btw?

Oh yeah, welcome to the AT forums... I think you will find that people are generally more helpful than they were in this thread. You might actually want to start a new thread for issues like this in the future. There are a lot of people on here who are far more seasoned overclockers than I am, so they know a lot more about artifacts than me. With a proper description or a screenshot, they can tell you what is overheating - the GPU or the memory. Good luck with this; I hope it turns out to be a configuration issue, because having a busted card sucks.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Madellga
I spent the last 2 days playing Oblivion with both cards (X1900XTX and 7900GTX). [snip - full post quoted above]

Well, I'm glad to have inspired someone...

Interesting, though, that my opinions are almost exactly the opposite of yours. After I had artifacting issues with an HIS X1900XTX, I picked up dual 7900GTs to replace it, but they pretty much sucked for Oblivion. They were lacking in performance for not letting me have HDR + AA, IMO, and the real kicker was that they tended to hard-lock after about 20-30 minutes of gameplay. They also had issues in WoW...

Long story short, I got rid of them and I'm back on an X1900XTX. The noise is definitely a downside, but performance and IQ are ultimately king, IMO. I didn't try the "blind wife" test, though... Also, for all the irritating stability issues I had with the 7900GTs, I didn't have any of them with the 7900GTXs. The GTXs were stable in single and dual card configs (I couldn't have sold them on eBay with a clean conscience if they sucked as badly as the 7900GTs). Honestly, I was very shocked and quite disappointed with what I got. They were eVGA 7900GT CO Superclocks, by the way. I wonder if maybe eVGA is expecting more out of these cards than they are willing and able to give.

Glad that we're both ultimately happy with our decisions, as that's pretty much what counts...

Originally posted by: imported_Crusader
Excellent post; it's nice to see some unbiased and honest opinions. [snip]

Geforce 7- The cool, quiet, low power consumption solution with the best single slot card available, superior multiGPU implementation and driver support in Windows and Linux, all while holding the performance lead.
Unfortunately as of today, both ATI and NV cards still shimmer.

Riva128/TNT/TNT2/GF/GF2/GF3/GF4/GF6/GF7=NV Domination over ATI. Sorry!

7900GT in rig. BFG 7900GTX OC on the way.

LOL! Interesting that you would even use the term "unbiased". You do know what that means, right?
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Yeah, his sig is pathetic. Pretty sad that someone can be that biased and that ignorant of the facts at once. You'd do better to just ignore his posts, as I have done. I've ignored his PMs to me as well.
 

Madellga

Senior member
Sep 9, 2004
713
0
0
Nitromullet, thanks for sharing your experiences also.

Despite any quality issues you might have had with the GTs, I would say one problem was the 256MB of memory. For this game, you really need the additional memory at high resolution with all the goodies on.

I had a 7800GT SLI setup before, and although it worked rather well, there were times the cards would play tricks on me (they would eventually freeze in WoW). I prefer a fast single card; fewer issues to deal with.

The deal for me was not Oblivion (in which the X1900 seems to have the lead); it was Guild Wars. I have some material to show later this week (probably over the weekend); the game looked better to me on the 7900GTX. If I hadn't seen and compared Guild Wars on both the X1900 and the 7900GTX, I would have opted for the X1900.

In the end, what I really want to say is that both cards are fine, and there is no need to clash on the forums about what one can do and the other can't.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Madellga
In the end, what I really want to say is that both cards are fine, and there is no need to clash on the forums about what one can do and the other can't.

Agreed. I've already chalked up my experience with those two GTs as a fluke, or possibly an eVGA issue with the Superclocks. As I said, the 7900GTXs didn't have any of the stability issues the GTs had. I also had issues with an HIS X1900XTX, though, so I won't wholeheartedly recommend the X1900XTX until this new one has proven itself.
 

MDesigner

Platinum Member
Apr 3, 2001
2,016
0
0
Quick question: what are SSAA and MSAA? I know they're two types of anti-aliasing. Just a note for nVidia users: if you force AA in the nVidia control panel, Oblivion overrides it, so you cannot get 8xS in Oblivion. You have to use the game's own AA options. (The sketch after this post illustrates the difference between the two AA types.)

I wound up choosing 4xAA and Bloom over no AA with HDR. HDR looks nice, but those jaggies just bug the hell out of me. AA makes the Oblivion world look more realistic and organic, and Bloom isn't too bad.
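
Since the SSAA/MSAA question above went unanswered: SSAA (supersampling) renders the whole frame at a higher resolution and filters it down, so everything - including texture and shader aliasing - gets smoothed, at a big fill-rate cost. MSAA (multisampling) shades each pixel once but keeps several coverage/depth samples per pixel, so only polygon edges get smoothed, much more cheaply. nVidia's 8xS is a hybrid of the two. A hypothetical 2x2 box-filter resolve illustrating the SSAA downsample step:

#include <cstddef>
#include <vector>

// hi is a (2w x 2h) single-channel image rendered at double resolution;
// each output pixel is the average of its 2x2 block of samples.
std::vector<float> ssaaResolve2x2(const std::vector<float>& hi, std::size_t w, std::size_t h) {
    std::vector<float> lo(w * h);
    const std::size_t stride = 2 * w;
    for (std::size_t y = 0; y < h; ++y)
        for (std::size_t x = 0; x < w; ++x) {
            std::size_t sx = 2 * x, sy = 2 * y;
            lo[y * w + x] = 0.25f * (hi[sy * stride + sx]       + hi[sy * stride + sx + 1] +
                                     hi[(sy + 1) * stride + sx] + hi[(sy + 1) * stride + sx + 1]);
        }
    return lo;
}

int main() {
    std::vector<float> hi(8 * 8, 1.0f);  // 8x8 render destined for a 4x4 target
    auto lo = ssaaResolve2x2(hi, 4, 4);  // the resolved, anti-aliased image
    return lo.size() == 16 ? 0 : 1;
}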
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Please name a game where this is true (aside from Valve games, which use the shader method that also works with NV hardware).
Far Cry, Serious Sam 2, Oblivion.

Or perhaps they made a business decision and decided the increased sales for an HDR+AA-enabled game would not outweigh the cost of making it one,
:roll:

It took Chuck about six hours to make the patch.
 

gobucks

Golden Member
Oct 22, 2004
1,166
0
0
HDR + AA is available on X1K-series ATI GPUs, but it requires the patched 6.3 version of the Catalyst drivers. ATI's GPUs can handle HDR + AA because of the way their shaders are set up, which is different from nVidia, who use a more traditional approach to shaders. nVidia's cards simply don't have the shader flexibility to perform HDR and anti-aliasing simultaneously. I think G80 will be the first nVidia chip able to do this.
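
gobucks' claim is easy to probe directly in Direct3D 9: the API can be asked whether a floating-point render target supports multisampling. On the X1K series the driver reports FP16 + MSAA as supported; on GeForce 6/7 it does not (strictly speaking, the limitation sits in the render back-end rather than the shaders). The Chuck patch works, in effect, by forcing AA onto the FP16 target that Oblivion itself never requests it for. A minimal sketch with error handling trimmed (link against d3d9.lib):

#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,     // 64-bit FP16 render target used for HDR
        FALSE,                    // full-screen
        D3DMULTISAMPLE_4_SAMPLES, // 4xAA
        &quality);

    std::printf("FP16 + 4xMSAA: %s\n", SUCCEEDED(hr) ? "supported" : "NOT supported");
    d3d->Release();
    return 0;
}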
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,550
136
The ultimate irony: Crusader complimenting someone on his fair and impartial review. The fact that the reviewer kept the nVidia card had nothing to do with it, I'm sure.

Good job on the mini-review, Madellga. While I think only the true fanboys were saying performance on the nVidia 7900 cards sucked (it doesn't), what most people are saying is that the ATI X1900s edge them out by a bit. My bro just got the 7900GT, and that thing was not quiet at all. It was more than bearable, though, and I'm going to assume my brand-new X1900XT, when it gets here, will be just as bad and probably worse. I'll just crank up the sound, I guess. I'll give my impressions of the game when the card arrives, but I'm going to assume it'll be much like the HardOCP review: http://enthusiast.hardocp.com/article.html?art=MTAzMywxLCxoZW50aHVzaWFzdA==

NOTE: I bought the X1900 because they were cheap at $379 shipped from Dell. The 7900GTX was $415-ish after tax, I think, and the 7900GT, which I had wanted to buy and overclock after volt-modding, was a two-month wait.

EDIT: Forgot to add that the HardOCP review compares an X1900 with a 7800GTX 512MB. That shouldn't matter too much, since the 7900GTX and the 7800GTX 512MB are almost the same card performance- and feature-wise.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: nitromullet
LOL!!! interesting that you would be even using the term unbiased. You do know what that means, right?

I never said I didn't know the truth behind the video cards on the market.

I'm saying that his opinion and experience come from completely unbiased sources... and it went Nvidia's way.
Is that a problem for you? Or would you prefer to deflect the issue away from his story and results? ;)

I'm glad he's happy, and you can tell this guy doesn't have NV or ATI bias, unlike Ackmed (who I can guarantee wasn't an enthusiast in the Voodoo1 days or before), who will stop at nothing to destroy Nvidia and any of its supporters.

It's just nice to hear a story that goes against all the "facts" that Team Red(neck) ;) think are so undeniable, like how the X1800/X1900 are the supreme choice due to HDR+AA via hacks. Woot?
 

beserker15

Senior member
Jun 24, 2003
820
0
0
Why do threads about HDR so often turn into a fight between the ATI and nVidia people? It started as the mere fact that ATI's new generation are the only cards that can do HDR+AA, plus some guy stating that his wife picked nVidia without knowing anything about computers; then others started putting in their own opinions, and it became another war between the 7900 and the X1900. Trying to stick to the topic, I want to repeat that yes, the nVidia cards currently cannot use Oblivion's default HDR with AA, but for those who still want AA, try THIS MOD (a sketch of the idea follows this post). It works on all DX9 cards to enable a different form of HDR and can be tweaked to your liking. I have an X800GTO2, and Oblivion runs great at 1024 res with fake HDR, 6xAA, and 16xAF.
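
On that "different form of HDR": bloom-style fake HDR is a post-process on ordinary 8-bit output - extract the bright parts of the frame, blur them, and add them back. It never stores values above 1.0, so it needs no FP16 buffer and never conflicts with MSAA, which is why it runs on any DX9 card. A toy single-channel version of the idea (not the mod's actual shader):

#include <algorithm>
#include <cstddef>
#include <vector>

void fakeBloom(std::vector<float>& img, std::size_t w, std::size_t h,
               float threshold = 0.8f, float strength = 0.5f) {
    std::vector<float> bright(img.size(), 0.0f);
    for (std::size_t i = 0; i < img.size(); ++i)  // bright-pass: keep only highlights
        bright[i] = std::max(0.0f, img[i] - threshold);

    std::vector<float> blurred(img.size(), 0.0f); // crude 3-tap horizontal blur
    for (std::size_t y = 0; y < h; ++y)
        for (std::size_t x = 1; x + 1 < w; ++x)
            blurred[y * w + x] = (bright[y * w + x - 1] + bright[y * w + x] +
                                  bright[y * w + x + 1]) / 3.0f;

    for (std::size_t i = 0; i < img.size(); ++i)  // additive recombine, clamped to displayable range
        img[i] = std::min(1.0f, img[i] + strength * blurred[i]);
}

int main() {
    std::vector<float> frame(64 * 64, 0.3f);
    frame[32 * 64 + 32] = 1.0f; // one bright "sun" pixel
    fakeBloom(frame, 64, 64);   // its glow now bleeds into the neighbours
    return 0;
}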
 

framerateuk

Senior member
Apr 16, 2002
224
0
0
Originally posted by: Madellga
5) The X1900 was not that loud using ATITool fan control (dynamic profile) in most games, but the cooler screams in Oblivion. No change in noise detected on the GTX - meaning it stays quiet even playing Oblivion.

My X1900 screams every few minutes or so in Oblivion. I kind of wish they had just set the fan to a constant speed instead of blasting it every few minutes.

I used the overclocking options in the CCC for my X1900XTX. Strangely, every game I've tried (Quake 4, Doom 3, NFS:MW, FEAR) works fine with the max overclock except Oblivion. After a few minutes of playing, Oblivion crashes, and CCC pops up, tells me the card wasn't behaving properly, and then lowers the overclock. It seems to work fine with the core overclocked, but I have to keep the memory lower.

Still, that's better than what happened with my previous motherboard/case (a Shuttle XPC): there it just crashed and reset the PC. At least on my new setup the box remains stable.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Yeah, Oblivion seems to kick the ever-living crap out of systems like nothing else. My X1900XT will run everything else at max Overdrive levels (700/800), but I have to pull back to 670/780 for Oblivion; otherwise the heat goes through the roof and ye olde VPU Recover message comes up. Looks fantastic, though :).