
Video memory - do I need more?

Page 2
Originally posted by: Jeff7181
I'm going to tack this on here and hopefully no one will mind... 😀

I just got Fallout 3 (Steam holiday sale ftw) and I've played for a few hours up to level 5 or 6 now. I REALLY enjoy the game, but it runs poorly. I haven't actually checked the frame rates, but I went into a sewer for a quest and there are some areas that bring the action to a screeching halt. I'd estimate 2-3 FPS when there's a "heat wave effect" in front of me. I probably average somewhere between 20-30 otherwise... but man was that annoying. I almost died a few times because it's impossible to aim accurately at a moving target at 2 or 3 frames per second. I'm rather amazed at the benchmark results I see for Fallout 3... GTX280's getting 80+ fps average at 1920x1200 with 4XAA and 8XAF... is that accurate? Could I expect at least 60 fps average with a GTX260?

Are you trying to use ultra settings?
 
Originally posted by: Azn
Originally posted by: Jeff7181
I'm going to tack this on here and hopefully no one will mind... 😀

I just got Fallout 3 (Steam holiday sale ftw) and I've played for a few hours up to level 5 or 6 now. I REALLY enjoy the game, but it runs poorly. I haven't actually checked the frame rates, but I went into a sewer for a quest and there are some areas that bring the action to a screeching halt. I'd estimate 2-3 FPS when there's a "heat wave effect" in front of me. I probably average somewhere between 20-30 otherwise... but man was that annoying. I almost died a few times because it's impossible to aim accurately at a moving target at 2 or 3 frames per second. I'm rather amazed at the benchmark results I see for Fallout 3... GTX280's getting 80+ fps average at 1920x1200 with 4XAA and 8XAF... is that accurate? Could I expect at least 60 fps average with a GTX260?

Are you trying to use ultra settings?

Good question... I'll check next time I'm at that computer (posting from my laptop). I just installed the game and let it auto detect. Whatever it chose is what I'm using.
 
Originally posted by: Azn

9800gtx isn't too far off from 260 though. Lot of people are mistaken that 260 is so much faster which isn't true.
I disagree with that. The 8800 Ultra is faster than a 9800GTX and based on benchmarks that I've run, the GTX260+ is about 20% - 30% faster with 4xAA than the 8800 Ultra, if not more.

Also I would not bother with a 512 MB card if you're upgrading now.
 
Originally posted by: BFG10K
Originally posted by: Azn

9800gtx isn't too far off from 260 though. Lot of people are mistaken that 260 is so much faster which isn't true.
I disagree with that. The 8800 Ultra is faster than a 9800GTX and based on benchmarks that I've run, the GTX260+ is about 20% - 30% faster with 4xAA than the 8800 Ultra, if not more.

Also I would not bother with a 512 MB card if you're upgrading now.

You got the 216 version though.

Another 5-10%+ added into the mix.
 
Originally posted by: Azn

You got the 216 version though.

Another 5-10%+ added into the mix.
Quite true, but the 8800 Ultra is also 5% - 10% faster than a 9800GTX so it ends up being like a comparison between a 9800GTX and a regular 260.
 
Originally posted by: BFG10K
Originally posted by: Azn

You got the 216 version though.

Another 5-10%+ added into the mix.
Quite true, but the 8800 Ultra is also 5% - 10% faster than a 9800GTX so it ends up being like a comparison between a 9800GTX and a regular 260.

I wouldn't go so far as to say the 9800gtx is slower than the ultra. The 9800gtx is stronger than the ultra when AA is not applied, and depending on the game the 9800gtx is on par, slightly slower, or even superior to the ultra even with AA.
 
Originally posted by: Azn
Originally posted by: BFG10K
Originally posted by: Azn

You got the 216 version though.

Another 5-10%+ added into the mix.
Quite true, but the 8800 Ultra is also 5% - 10% faster than a 9800GTX so it ends up being like a comparison between a 9800GTX and a regular 260.

I wouldn't go so far as to say the 9800gtx is slower than the ultra. The 9800gtx is stronger than the ultra when AA is not applied, and depending on the game the 9800gtx is on par, slightly slower, or even superior to the ultra even with AA.

True... I guess... But since I consider AA extremely important, I would say the Ultra is faster. I think AA should pretty much be a requirement in games. Well, requirement would be a bit much, but an encouragement. That is how I feel about AA.
 
I'm about to upgrade my video card too since I just got a 1080p monitor. This discussion is kind of swaying me to get something with more vram. I was considering a used 9800gtx for like $100. I don't need the best performance considering I was very happy with my 8800gs @ 1440x900. I'm not as much of a nitpicker as some of the guys here wanting to play ultra settings on Fallout 3 when high settings look just as good. I would even be content with 2xAA or no AA at all considering I got a 1080p monitor with very low pixel pitch. More would be merrier of course.
 
Originally posted by: Azn
I'm about to upgrade my video card too since I just got a 1080p monitor. This discussion is kind of swaying me to get something with more vram. I was considering a used 9800gtx for like $100. I don't need the best performance considering I was very happy with my 8800gs @ 1440x900. I'm not as much of a nitpicker as some of the guys here wanting to play ultra settings on Fallout 3 when high settings look just as good. I would even be content with 2xAA or no AA at all considering I got a 1080p monitor with very low pixel pitch. More would be merrier of course.

Actually, the less picky people are, the happier they tend to be... I wish I wasn't picky about AA. As for high resolution negating the need for AA, I personally disagree heavily with that, since you see the shimmering effect around all straight lines. It almost looks like sparkles to me now... Funny, because before I ever used AA I probably would have never noticed the shimmering, probably because I would have thought it was 'normal'.
 
Originally posted by: Azn

I wouldn't go so far as to say the 9800gtx is slower than the ultra.
With AA it is:

http://www.computerbase.de/art...rmancerating_qualitaet

Overall in all tested games, with 4xAA, the 8800 Ultra is 5% faster @ 1280x1024, 7% faster @ 1600x1200 and 32% faster @ 2560x1600.

With 8xAA the 8800 Ultra is 9% faster @ 1280x1024, 48% faster @ 1600x1200 and 379% faster @ 2560x1600.

So yes, the 8800 Ultra is faster with 4xAA and massively faster with 8xAA.
 
Originally posted by: ArchAngel777
Originally posted by: Azn
I'm about to upgrade my video card too since I just got a 1080p monitor. This discussion is kind of swaying me to get something with more vram. I was considering a used 9800gtx for like $100. I don't need the best performance considering I was very happy with my 8800gs @ 1440x900. I'm not as much of a nitpicker as some of the guys here wanting to play ultra settings on Fallout 3 when high settings look just as good. I would even be content with 2xAA or no AA at all considering I got a 1080p monitor with very low pixel pitch. More would be merrier of course.

Actually, the less picky people are, the happier they tend to be... I wish I wasn't picky about AA. As for high resolution negating the need for AA, I personally disagree heavily with that, since you see the shimmering effect around all straight lines. It almost looks like sparkles to me now... Funny, because before I ever used AA I probably would have never noticed the shimmering, probably because I would have thought it was 'normal'.

I've been playing GTA4 and it doesn't have AA. I do see some shimmering but not enough to annoy me. Some games need it badly and others not so much. It's like when you've been drinking champagne all your life and going back to drinking beer: your eyes have to adjust, and it becomes less of a nuisance.

Lower pixel pitch would help anyway. I got a 22" 1080P, so those jaggies would be much smaller, and 2xAA would look about the same as 4xAA on a 24" monitor.
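For what it's worth, the pixel-pitch claim is easy to sanity-check with a little geometry. A quick sketch (the 24" 1920x1200 comparison panel is my assumption, since the post doesn't give the second monitor's resolution):

```python
import math

def pixel_pitch_mm(diagonal_in, res_x, res_y):
    """Pixel pitch in mm from screen diagonal (inches) and resolution, assuming square pixels."""
    pixels_on_diagonal = math.hypot(res_x, res_y)
    return diagonal_in * 25.4 / pixels_on_diagonal

# 22" 1080p panel vs. a typical 24" 1920x1200 panel
p22 = pixel_pitch_mm(22, 1920, 1080)   # ~0.254 mm
p24 = pixel_pitch_mm(24, 1920, 1200)   # ~0.269 mm
```

So the 22" 1080p screen does have noticeably finer pixels, which is why the same jaggies look smaller on it.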
 
Originally posted by: Azn
Originally posted by: ArchAngel777
Originally posted by: Azn
I'm about to upgrade my video card too since I just got a 1080p monitor. This discussion is kind of swaying me to get something with more vram. I was considering a used 9800gtx for like a $100. I don't need the best of performance considering I was very happy with my 8800gs @ 1440x900. I'm not such a nit picker like some of the guys here wanting to play ultra settings on fallout 3 when high settings look just as good. I would even be content with 2xAA or no AA at all considering I got a 1080p monitor with very low pixel pitch. More would be merrier of course.

Actually, the less picky people are, the happier they tend to be... I wish I wasn't picky about AA. As for high resolution negating the need for AA, I personally disagree heavily with that, since you see the shimmering effect around all straight lines. It almost looks like sparkles to me now... Funny, because before I ever used AA I probably would have never noticed the shimmering, probably because I would have thought it was 'normal'.

It's like when you've been drinking champagne all your life and going back to drinking beer: your eyes have to adjust, and it becomes less of a nuisance.

That may be true, but I don't think the jaggies should be there at all, given that real-life objects don't have aliasing all over them.

 
Originally posted by: BFG10K
Originally posted by: Azn

I wouldn't go so far as to say the 9800gtx is slower than the ultra.
With AA it is:

http://www.computerbase.de/art...rmancerating_qualitaet

Overall in all tested games, with 4xAA, the 8800 Ultra is 5% faster @ 1280x1024, 7% faster @ 1600x1200 and 32% faster @ 2560x1600.

With 8xAA the 8800 Ultra is 9% faster @ 1280x1024, 48% faster @ 1600x1200 and 379% faster @ 2560x1600.

So yes, the 8800 Ultra is faster with 4xAA and massively faster with 8xAA.

I never denied the Ultra was better with AA. I said in some games the 9800gtx wins even with AA over the ultra. I guess the ultra wins more with AA on average. 2560 is more of a vram anomaly than anything else.
 
Originally posted by: Jeff7181
I'm going to tack this on here and hopefully no one will mind... 😀

I just got Fallout 3 (Steam holiday sale ftw) and I've played for a few hours up to level 5 or 6 now. I REALLY enjoy the game, but it runs poorly. I haven't actually checked the frame rates, but I went into a sewer for a quest and there are some areas that bring the action to a screeching halt. I'd estimate 2-3 FPS when there's a "heat wave effect" in front of me. I probably average somewhere between 20-30 otherwise... but man was that annoying. I almost died a few times because it's impossible to aim accurately at a moving target at 2 or 3 frames per second. I'm rather amazed at the benchmark results I see for Fallout 3... GTX280's getting 80+ fps average at 1920x1200 with 4XAA and 8XAF... is that accurate? Could I expect at least 60 fps average with a GTX260?
If "New Desktop" is the rig the new card will be going into, then yes you could expect very close to 60+ FPS all the time in Fallout 3 at 1920 with 4xAA. FPS should always be above 60 indoors, with a few drops here and there depending on draw distance/direction outdoors. I'm not quite sure which heat wave effect you're referring to, but I can't remember any drops in FPS indoors, definitely not in the 2-3FPS range.
 
Isn't Fallout3 the *same* as Oblivion's Gamebryo engine?
😕
if so, it is very efficient until you load the textures to the max with the mods
- did they even optimize or build on it?

i have yet to get it 😛

 
Originally posted by: apoppin
Isn't Fallout3 the *same* as Oblivion's Gamebryo engine?
😕
if so, it is very efficient until you load the textures to the max with the mods
- did they even optimize or build on it?

i have yet to get it 😛
Yep, apparently slightly updated; not sure on the major differences as I haven't bothered with Oblivion. I just can't be bothered with games that have reputations for being mod-reliant just to be palatable. There are a few high-res packs but they don't really do much for the overly rounded look of everything, which was by far the biggest eyesore in this game for me. Everything is rounded off; it's just blatant and awful. Still a fun game though, and definitely a better mix of RPG and shooter than some past attempts, like STALKER.
 
Originally posted by: Azn

How so? Just because it has more vram? Processing-wise and texture-fillrate-wise the GTX 260 is only on par with the 9800gtx, if not a little lower. It has a bit more bandwidth and more vram. Otherwise I don't see the GTX260 being that much better unless Nvidia is sabotaging the G92 series for the GTX 2 series. I would like to see a 1gig 9800gtx go against the 260gtx. I bet you the performance difference isn't all that much, if any at all.

The gtx260 has a lot more shader power than the 9800gtx. 😕

 
Originally posted by: schneiderguy
Originally posted by: Azn

How so? Just because it has more vram? Processing-wise and texture-fillrate-wise the GTX 260 is only on par with the 9800gtx, if not a little lower. It has a bit more bandwidth and more vram. Otherwise I don't see the GTX260 being that much better unless Nvidia is sabotaging the G92 series for the GTX 2 series. I would like to see a 1gig 9800gtx go against the 260gtx. I bet you the performance difference isn't all that much, if any at all.

The gtx260 has a lot more shader power than the 9800gtx. 😕

Actually, he is right in some ways. The shader clock of the 128SP 9800GTX+ is 1836, and the shader clock of the 260GTX is 1242.

9800GTX 128SP x 1688 = 216,064
9800GTX+ 128SP x 1836 = 235,008
260GTX 192SP x 1242 = 238,464

Wow, after doing the math, the shader power really isn't much stronger... I was thinking the GTX260 had around 25% more shader power, but it turns out it only has a mere 2% more than the 9800GTX+ and 10% more than the 9800GTX.
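The arithmetic above can be sketched as a quick script (SP counts and shader clocks in MHz, exactly as quoted in the post; this ignores architectural differences, so it's only a ballpark metric):

```python
# Rough shader-throughput comparison: SP count x shader clock (MHz).
cards = {
    "9800GTX":  (128, 1688),
    "9800GTX+": (128, 1836),
    "GTX260":   (192, 1242),
}
throughput = {name: sp * clock for name, (sp, clock) in cards.items()}

# Relative advantage of the GTX260
vs_gtx  = throughput["GTX260"] / throughput["9800GTX"]  - 1   # ~10%
vs_plus = throughput["GTX260"] / throughput["9800GTX+"] - 1   # ~1.5%
```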

 
Originally posted by: ArchAngel777
Originally posted by: schneiderguy
Originally posted by: Azn

How so? Just because it has more vram? Processing-wise and texture-fillrate-wise the GTX 260 is only on par with the 9800gtx, if not a little lower. It has a bit more bandwidth and more vram. Otherwise I don't see the GTX260 being that much better unless Nvidia is sabotaging the G92 series for the GTX 2 series. I would like to see a 1gig 9800gtx go against the 260gtx. I bet you the performance difference isn't all that much, if any at all.

The gtx260 has a lot more shader power than the 9800gtx. 😕

Actually, he is right in some ways. The shader clock of the 128SP 9800GTX+ is 1836, and the shader clock of the 260GTX is 1242.

9800GTX+ 128SP x 1836 = 235,008
260GTX 192SP x 1242 = 238,464

Wow, after doing the math, the shader power really isn't much stronger... I was thinking the GTX260 had around 25% more shader power.

Oops, I forgot about the shader clock 😱 But he was talking about the regular 9800gtx so the GTX260 still has a 10% advantage there.

1gb 9800gtx+

They're not much cheaper than the gtx260.
 
Originally posted by: apoppin
Isn't Fallout3 the *same* as Oblivion's Gamebryo engine?
😕
if so, it is very efficient until you load the textures to the max with the mods
- did they even optimize or build on it?

i have yet to get it 😛

It's a cool game. You should definitely get it. Probably the best game of 2008.
 
Originally posted by: schneiderguy
Originally posted by: ArchAngel777
Originally posted by: schneiderguy
Originally posted by: Azn

How so? Just because it has more vram? Processing-wise and texture-fillrate-wise the GTX 260 is only on par with the 9800gtx, if not a little lower. It has a bit more bandwidth and more vram. Otherwise I don't see the GTX260 being that much better unless Nvidia is sabotaging the G92 series for the GTX 2 series. I would like to see a 1gig 9800gtx go against the 260gtx. I bet you the performance difference isn't all that much, if any at all.

The gtx260 has a lot more shader power than the 9800gtx. 😕

Actually, he is right in some ways. The shader clock of the 128SP 9800GTX+ is 1836, and the shader clock of the 260GTX is 1242.

9800GTX+ 128SP x 1836 = 235,008
260GTX 192SP x 1242 = 238,464

Wow, after doing the math, the shader power really isn't much stronger... I was thinking the GTX260 had around 25% more shader power.

Oops, I forgot about the shader clock 😱 But he was talking about the regular 9800gtx so the GTX260 still has a 10% advantage there.

1gb 9800gtx+

They're not much cheaper than the gtx260.

No denying that. The GTX 260 is the better card, but how much better would it be if it didn't have more vram and bandwidth? It has the big frame buffer and the bandwidth. Other than that I don't see the core being that much better than the 9800gtx. In terms of raw power they are very close.
 
Originally posted by: Azn
Originally posted by: schneiderguy
Originally posted by: ArchAngel777
Originally posted by: schneiderguy
Originally posted by: Azn

How so? Just because it has more vram? Processing-wise and texture-fillrate-wise the GTX 260 is only on par with the 9800gtx, if not a little lower. It has a bit more bandwidth and more vram. Otherwise I don't see the GTX260 being that much better unless Nvidia is sabotaging the G92 series for the GTX 2 series. I would like to see a 1gig 9800gtx go against the 260gtx. I bet you the performance difference isn't all that much, if any at all.

The gtx260 has a lot more shader power than the 9800gtx. 😕

Actually, he is right in some ways. The shader clock of the 128SP 9800GTX+ is 1836, and the shader clock of the 260GTX is 1242.

9800GTX+ 128SP x 1836 = 235,008
260GTX 192SP x 1242 = 238,464

Wow, after doing the math, the shader power really isn't much stronger... I was thinking the GTX260 had around 25% more shader power.

Oops, I forgot about the shader clock 😱 But he was talking about the regular 9800gtx so the GTX260 still has a 10% advantage there.

1gb 9800gtx+

They're not much cheaper than the gtx260.

No denying that. The GTX 260 is the better card, but how much better would it be if it didn't have more vram and bandwidth? It has the big frame buffer and the bandwidth. Other than that I don't see the core being that much better than the 9800gtx. In terms of raw power they are very close.


I believe and have always believed that the 9800GTX+/- was bandwidth starved.
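A back-of-the-envelope bandwidth comparison supports that: peak memory bandwidth is bus width (in bytes) times effective memory clock. The clocks below are the commonly quoted stock specs for these cards, so treat them as assumptions:

```python
def bandwidth_gb_s(bus_bits, eff_clock_mhz):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective clock in MHz) / 1000."""
    return bus_bits / 8 * eff_clock_mhz / 1000

bw_9800gtx = bandwidth_gb_s(256, 2200)  # 256-bit GDDR3 @ 2200 MHz effective -> 70.4 GB/s
bw_gtx260  = bandwidth_gb_s(448, 1998)  # 448-bit GDDR3 @ 1998 MHz effective -> ~111.9 GB/s
```

On those numbers the GTX260 has roughly 60% more bandwidth than the 9800GTX, which is a much bigger gap than the shader-throughput difference, consistent with the bandwidth-starved theory.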
 
Originally posted by: ArchAngel777
I believe and have always believed that the 9800GTX+/- was bandwidth starved.

I think a 9800gtx with gddr5 would have competed with the 4870, considering the 4850 competes with the 9800gtx.
 
Let's throw the 9800gx2 into the mix. I've been looking at it, and it looks like the best contender in this price range.

Any thoughts?
 