2560x1600 Oblivion

mazeroth

Golden Member
Jan 31, 2006
1,821
2
81
I'm absolutely addicted to Oblivion and I need a new monitor pretty badly. I've been eyeing the Apple and Dell 30" monitors, and with the recent price drops on the Dells the purchase of one is nearing. One thing I'm wondering is when you guys feel we'll be able to play Oblivion at 2560x1600, with no AA and 4-8x AF. Until then I think I'm going to hold off on the monitor upgrade. I'm currently playing Oblivion at 1600x1200 with 8xAF and no AA and it's absolutely stunning: 20-25 fps outside, 30s in cities, etc. That's about 2 million pixels, and 2560x1600 is about 4 million, so I'm hoping the 8800 nVidias or the next gen of ATI will be able to pull it off, with a Conroe. What do you guys think?
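For what it's worth, here's the raw pixel math (a quick Python sketch; strictly it's 1.92M vs. 4.10M pixels, so a bit over 2x the rendering load):

```python
# Rough pixel-count comparison between the two resolutions.
current = 1600 * 1200  # 1,920,000 pixels (~2 million)
target = 2560 * 1600   # 4,096,000 pixels (~4 million)

print(f"1600x1200: {current:,} pixels")
print(f"2560x1600: {target:,} pixels")
print(f"ratio: {target / current:.2f}x")  # ~2.13x the pixels to render
```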

Thanks!
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Get the Dell 3007WFP monitor, because the Apple 30" can't be adjusted.

Also, the ATI R600 isn't coming till early Feb, so it's either the 8800GTS now or waiting till March to see Nvidia's refresh and ATI launch their card.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
There's probably not much DX9 you can throw at it that it won't run at 2560x1600.
You also get the new AA mode, and rumored performance-free HDR/AA.

The extra 256MB (on top of the base 512MB) on the GTX is most likely dedicated to IQ features, hence the support for 16xAA on a single card for the first time ever.
It's been a long time coming for performance-free AA. The GTS has 128MB of GDDR3 dedicated to IQ (or so the theory goes).


So if you can push 25x16 with an 8800GTX, you shouldn't have to worry too much about adding IQ options on top of it.
It should do that res with HDR+AA in Oblivion. Hopefully, now that Nvidia has HDR+AA, they might patch Oblivion to support it.

Being performance-free, this will likely be "HDR+AA done right", courtesy of Nvidia. :thumbsup: I'm in for one 8800GTX at launch. But buy quickly, before the initial shipment runs out and you're waiting with everyone else; you'll likely get the best price then as well, due to higher quantity.

*salute Nvidia*
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
Originally posted by: mazeroth
I'm absolutely addicted to Oblivion and I need a new monitor pretty badly. I've been eyeing the Apple and Dell 30" monitors, and with the recent price drops on the Dells the purchase of one is nearing. One thing I'm wondering is when you guys feel we'll be able to play Oblivion at 2560x1600, with no AA and 4-8x AF. Until then I think I'm going to hold off on the monitor upgrade. I'm currently playing Oblivion at 1600x1200 with 8xAF and no AA and it's absolutely stunning: 20-25 fps outside, 30s in cities, etc. That's about 2 million pixels, and 2560x1600 is about 4 million, so I'm hoping the 8800 nVidias or the next gen of ATI will be able to pull it off, with a Conroe. What do you guys think?

Thanks!

Have no fear, better games are on the way :)
 

Dainas

Senior member
Aug 5, 2005
299
0
0
If ATI had dabbled in quad core, and actually gotten it to work (unlike Nvidia), I'm pretty sure four X1950 cores would be able to handle it nicely.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
There's probably not much DX9 you can throw at it that it won't run at 2560x1600.
What?
You also get the new AA mode, and rumored performance-free HDR/AA.
I hope that's true.
The extra 256MB (on top of the base 512MB) on the GTX is most likely dedicated to IQ features, hence the support for 16xAA on a single card for the first time ever.
Actually, I believe it has more to do with how many MB each RAM module can hold and the number of GDDR3 modules the G80 is supposed to have (see the quick math at the end of this post).
It's been a long time coming for performance-free AA. The GTS has 128MB of GDDR3 dedicated to IQ (or so the theory goes).
Like I said, if that's true it would be interesting to see how they implement and control it.
So if you can push 25x16 with an 8800GTX, you shouldn't have to worry too much about adding IQ options on top of it.
It should do that res with HDR+AA in Oblivion. Hopefully, now that Nvidia has HDR+AA, they might patch Oblivion to support it.
I'm anxious for Nvidia's first attempt at true HDR+AA.
Being performance-free, this will likely be "HDR+AA done right", courtesy of Nvidia. :thumbsup:
And the R600 is not going to support HDR+AA? More like courtesy of Nvidia getting their heads out of their arse and keeping up with the competition. If they didn't support HDR+AA this round they'd be left in the dust. They really had no choice, seeing as how ATI was already doing it.
I'm in for one 8800GTX at launch. But buy quickly, before the initial shipment runs out and you're waiting with everyone else; you'll likely get the best price then as well, due to higher quantity.
What does this advertisement have to do with Oblivion being able to run at 25x16? Are you trying to sell these things before they're on the market?
*salute Nvidia*
........well, it is green.............like a general I guess...............
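To put rough numbers on the memory point: if the rumored bus widths are right, the GTX's 768MB falls straight out of the module count, with no "dedicated IQ" pool required. A back-of-the-envelope sketch, assuming 32-bit-wide, 512Mbit GDDR3 chips and the rumored 384-bit/320-bit buses:

```python
# Capacity follows from bus width and chip size (rumored G80 specs assumed).
CHIP_BUS_BITS = 32  # each GDDR3 chip presents a 32-bit interface
CHIP_SIZE_MB = 64   # 512Mbit chips = 64MB each

for card, bus_bits in (("8800GTX", 384), ("8800GTS", 320)):
    chips = bus_bits // CHIP_BUS_BITS
    print(f"{card}: {bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_SIZE_MB}MB")
# 8800GTX: 384-bit bus -> 12 chips -> 768MB
# 8800GTS: 320-bit bus -> 10 chips -> 640MB
```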
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
I think you should be shot in the groin. :p

(There's this one episode of X-Play with Splinter Cell's Special Agent Bob and Secret Agent Steve, or whatever their names were, and they talked about addiction to pornography and World of Warcraft and how people with those addictions should be shot in the groin.)

But uhh, 2560x1600 is like insane in the membrane for Oblivion with all high settings and AA/AF and whatnot. I don't think the G80 will handle it. Just my opinion.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
You could run at that res now, if you were willing to go X1950XT Crossfire or 7950GX2 SLI. Someone in GH posted screenshots of it a while back.
 

MrWizzard

Platinum Member
Mar 24, 2002
2,493
0
71
I'm running Oblivion at 2048x1536. Not quite the res you're talking about, but on a 21-inch screen I pretty much don't need AA anymore....
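That works out to a pretty fine pixel pitch, which is why the jaggies shrink. A quick sketch, assuming the full 21" diagonal at 4:3:

```python
import math

# Pixel density of 2048x1536 on a 21-inch diagonal (4:3).
w, h, diag_inches = 2048, 1536, 21.0
ppi = math.hypot(w, h) / diag_inches  # pixels along the diagonal / inches
print(f"{ppi:.0f} PPI")  # ~122 PPI; finer pixels make jaggies harder to see
```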
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
There's no such thing as a free lunch, and no free AA. I've seen claims of free 2x AA for years now, and still no video card to date provides free 2x AA, never mind 4x AA. The G80 is not gonna change that.

The point is that although a modern GPU can theoretically perform 2x AA for free, the additional memory bandwidth load always causes a performance drop.
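To put a rough number on that bandwidth point, here's a simplified sketch: it assumes one color+Z write per pixel per frame, no color/Z compression, no overdraw, and ignores texture and geometry traffic entirely.

```python
# Back-of-the-envelope framebuffer traffic at 2560x1600, 60 fps.
width, height, fps = 2560, 1600, 60
bytes_per_sample = 4 + 4  # 32-bit color + 32-bit depth

no_aa = width * height * bytes_per_sample * fps
aa_2x = no_aa * 2  # 2x multisampling stores two samples per pixel

print(f"no AA: {no_aa / 1e9:.1f} GB/s")  # ~2.0 GB/s
print(f"2x AA: {aa_2x / 1e9:.1f} GB/s")  # ~3.9 GB/s, double the framebuffer load
```

Even in this toy model, 2x AA doubles the framebuffer traffic, and that bandwidth has to come from somewhere.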
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
Originally posted by: Modular
EA will ruin Crysis with the same spyware they ruined 2142 with : (

Uhh...

How can you say something like that? :)

Are most of the people that bash EA the ones that just hop on the bandwagon and go along with the trend?
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Kromis
Originally posted by: Modular
EA will ruin Crysis with the same spyware they ruined 2142 with : (

Uhh...

How can you say something like that? :)

Are most of the people that bash EA the ones that just hop on the bandwagon and go along with the trend?
If EA has anything at all to do with it, I can assure you that it will have spyware.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yeah, if EA has already used this tactic on one of their biggest franchises, you can sure as hell bet they'll do it on a game hyped as much as Crysis.