9600GT VS 8800GS VS 8800GT

AzN

Banned
Nov 26, 2001
There aren't too many 8800gs reviews. Go try finding them on the web. Not many games are tested on it.

The only games where the 9600GT really beats the 8800GS are CoH and WiC, and the 8800GS beats the 9600GT in Oblivion. In all other games they are really a toss-up.

I don't think I need to repeat what I've already said in this thread. You see the AA scores and assume that's how the cards will perform in every game, or in future games, or whatever. As far as a card's longevity goes, the real numbers are raw performance and shader count.
 

Lithan

Platinum Member
Aug 2, 2004
That only holds true when you deliberately configure the game settings to make it so.
Why would you buy a card that can run AA and then disable it just so you can say "well, it's just as fast as this other card because I don't enable AA"? Meanwhile the guy who bought the other card is running AA.

Yes, at 1440x900 without AA the 8800GS is competitive with the 9600GT. But almost all the games out today run at high enough FPS at that point that it's silly not to turn AA on.

You say the 8800GS is the better choice looking to the future. I disagree, and we're not going to settle this, so let's just dig this thread up in a year and see who was right. You're basing your logic (it seems) on the fact that two generations ago, when cards suddenly jumped from ~16-24 shaders to 100+, older cards with more shaders fared well in new games. That may be sound reasoning, but I feel you're missing the obvious counterargument: more shaders won't help much if your memory can't juggle data fast enough, or store enough of it, to deliver decent FPS anyway.

AA and resolution aren't the only things that drive memory use up. I could play games at 1900xwhatever on my old GF4 card (64MB) with no problems; 64MB isn't enough to run today's games at 800x600. New games require more memory even if resolutions and AA levels don't change. And sub-512MB cards are already proving to have trouble at resolutions as low as 1280x1024 in some games, and only slightly higher in most. Will the 9600GT fare any better? I can't say for sure. But every game I play lets me turn shaders way down, whereas you have to turn a LOT of stuff down to reduce the amount of video memory a game wants.

Even given that, I don't think it's reasonable to gamble away performance now (especially on a ~$100 card) for performance in the future; $100 cards don't have much of a future. And almost every review today demonstrates that at the best settings the 9600GT can run comfortably, it soundly beats the 8800GS in most games. Restricting the resolution only removes one part of its performance advantage. AA is still there, and at a low resolution like that it's probably even more desirable.
 

AzN

Banned
Nov 26, 2001
In raw performance shaders do help, but not so much with AA. This was already tested by BFG when we had a flame fest in the other thread.

http://episteme.arstechnica.co...7909965/m/453004231931

In a year or two the 8800GS will easily outpace the 9600GT, because games will slowly be creeping up in shader requirements, and let's not forget the massive texture fillrate advantage it has over the 9600GT.

I never said the 9600GT sucks for AA. It has the bandwidth to do it with minimal impact, which is why it seems like such a great card, but its raw performance is no better than the 8800GS or 3850 currently, and it will fall even harder once developers require more shader effects.

You just look at the performance numbers now, with AA, where it's 10-15% better, and that's all you have to go on, because you don't understand what each section of the GPU represents.

What does the 9600GT really have over the 8800GS? Pixel fillrate and memory bandwidth for playing at higher resolutions? The 8800GS is the more powerful GPU. It might not have the bandwidth for uber-high resolutions with AA, but it has 33% more texture fillrate and 33% more SPs, and that equals longevity.
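
For anyone who wants to sanity-check the raw numbers, here's a rough back-of-the-envelope sketch. The unit counts and clocks below are approximate reference specs assumed for illustration (factory-overclocked boards will differ), so treat the output as ballpark figures, not benchmark results:

```python
# Back-of-the-envelope throughput comparison -- illustration only.
# Clocks and unit counts are APPROXIMATE reference specs (assumptions);
# actual retail boards often ship with different clocks.

def texel_fillrate(tmus, core_mhz):
    """Texture fillrate in GTexels/s = texture units x core clock."""
    return tmus * core_mhz / 1000.0

def pixel_fillrate(rops, core_mhz):
    """Pixel fillrate in GPixels/s = ROPs x core clock."""
    return rops * core_mhz / 1000.0

def mem_bandwidth(bus_bits, effective_mhz):
    """Memory bandwidth in GB/s = (bus width / 8 bytes) x effective memory clock."""
    return bus_bits / 8.0 * effective_mhz / 1000.0

cards = {
    "8800GS": dict(sps=96, tmus=48, rops=12, core=550, bus=192, mem=1600),
    "9600GT": dict(sps=64, tmus=32, rops=16, core=650, bus=256, mem=1800),
}

for name, c in cards.items():
    print(f"{name}: {c['sps']} SPs, "
          f"{texel_fillrate(c['tmus'], c['core']):.1f} GT/s texel fill, "
          f"{pixel_fillrate(c['rops'], c['core']):.1f} GP/s pixel fill, "
          f"{mem_bandwidth(c['bus'], c['mem']):.1f} GB/s bandwidth")
```

It also shows why both sides of this argument exist: by these numbers the 8800GS leads on shader count and texel fillrate, while the 9600GT leads on pixel fillrate and memory bandwidth.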
 

SickBeast

Lifer
Jul 21, 2000
I would love to see a proper review of the 8800GS, along with SLI performance.

For $110, this card is the best deal on a graphics card I have seen in ages. The core seems to overclock to at least 700MHz, which is quite good. 384MB of memory should be fine for the type of workload this card can handle. The 768MB flavours might come in handy in SLI, running a game like Crysis.

By all accounts, the 8800GS has more 'raw' performance, but the 9600GT performs better with AA enabled.

I'm thinking that for $220, two 8800GS cards in SLI would smoke an 8800GT for about the same price.
 

AzN

Banned
Nov 26, 2001
I think 9600GT SLI would be better because of the 512MB of RAM for higher resolutions.

The 768MB 8800GS is missing in action, but I'd love to see that in SLI too; the bandwidth limitation would be reduced considerably, not to mention the extra VRAM for uber-high resolutions.

There just aren't many reviews of the 8800GS. There are a bunch of 9600GT numbers with AA all over the web, though, which is kind of misleading the public about how powerful the card really is.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Azn
I think 9600GT SLI would be better because of the 512MB of RAM for higher resolutions.

The 768MB 8800GS is missing in action, but I'd love to see that in SLI too; the bandwidth limitation would be reduced considerably, not to mention the extra VRAM for uber-high resolutions.

There just aren't many reviews of the 8800GS. There are a bunch of 9600GT numbers with AA all over the web, though, which is kind of misleading the public about how powerful the card really is.
IMO 384MB is fine for current games. I run pretty much anything on a 320MB GTS at 1920x1200. This type of setup isn't geared toward people with a 30" LCD anyway.

It's strange: 512MB of texture memory seems to be the sweet spot right now. 384MB is not quite enough, but 768MB is too much (in most circumstances).

It's easy enough to work around texture memory limitations. You can disable AA, and turn down the in-game texture quality a little bit.

The one time I had to do that was in Gears of War. It couldn't run at High, but ran super fast on Medium. The game still looks great. I would imagine Crysis is a similar story. I just can't stand running it at 1280x1024.

I would personally be willing to sacrifice the 128MB of texture memory and get the 8800GS SLI setup instead of the 8800GT. The enormous raw power of that setup far outweighs the sacrifice you have to make in terms of texture memory (and dropping from a 256-bit bus to 192-bit is no big deal).
 

Lithan

Platinum Member
Aug 2, 2004
"It has the bandwidth to do it with minimal impact"

And yet the 9600GT beats the 8800GT 256MB by a HUGE margin with AA... so it isn't just bandwidth. 256MB isn't enough video memory any longer. Even 384MB isn't at higher resolutions. How long until it's not enough even at low resolutions? Long enough for the shader advantage you're counting on to make the 8800GS the better card to come into play?

Disabling AA seems to help with memory interface limitations... not so much with capacity. Other than that, pretty much every single bump in any setting increases the amount of graphics RAM the game wants. It adds up, and there isn't a quick fix by dropping one or two settings like there is with shaders. Take Hellgate: London on my 7800GTX, for instance: ~20fps with everything maxed (DX9 maxes), ~45fps with shaders on low and nothing else changed. The card is clearly struggling with the game because modern cards have 3-6 times the shaders it has. Now, if I had a 128MB card, I'd have to disable all kinds of shit to get my FPS up.


"It's easy enough to work around texture memory limitations."

Yes, but with the 9600GT you don't have to. So you're sacrificing that advantage, that extra performance... to get the possibility that maybe, sometime down the line, the 8800GS will be better at something than the 9600GT. It seems crazy to sacrifice guaranteed performance for potential future performance on a card that is clearly not a wise long-term investment.
 

AzN

Banned
Nov 26, 2001
384MB is still fine for Crysis at high settings at 1600x1200.

http://www.3dnews.ru/_imgdata/img/2008/02/26/75453.jpg
crysis 1600x1200 high settings vram usage

http://www.3dnews.ru/_imgdata/img/2008/02/26/75452.jpg
crysis 1280x1024 high settings vram usage

Yeah, it's because of AA and post-processing within the last few years. 4xAA is like blowing up your resolution by two notches.

1280x1024 with 4xAA would be roughly equivalent to 1920x1200 with no AA.

Without AA, a 320MB GTS should be fine all the way up to 1920x1200.
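
Here's a rough sketch of the buffer math behind that rule of thumb. It's a minimal, assumption-heavy estimate: it only counts an RGBA8 color buffer plus a 24/8 depth-stencil buffer and assumes MSAA multiplies both by the sample count, while textures (the bulk of real VRAM use) are ignored, so treat the "equivalent" claim as a ballpark rather than an exact equality:

```python
# Very rough framebuffer-size estimate -- illustration only.
# Assumes RGBA8 color + 24/8 depth-stencil (4 bytes each per pixel), that MSAA
# multiplies both by the sample count, plus one single-sample resolve target.
# Textures and geometry (most of the VRAM budget) are not counted.

def framebuffer_mb(width, height, msaa_samples=1):
    per_pixel = 4 + 4                                   # color + depth/stencil
    multisampled = width * height * per_pixel * msaa_samples
    resolve = width * height * 4 if msaa_samples > 1 else 0
    return (multisampled + resolve) / (1024.0 ** 2)

print(f"1280x1024 @ 4xAA : {framebuffer_mb(1280, 1024, 4):.0f} MB")
print(f"1920x1200, no AA : {framebuffer_mb(1920, 1200):.0f} MB")
```

By this count, 4xAA at 1280x1024 actually needs somewhat more buffer memory than 1920x1200 without AA, which is the broader point: AA is a big memory multiplier even at modest resolutions.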

You had problems with GoW at high settings? You were probably running uber-high resolutions. :p I was running max settings fine at 1440x900 on an 8600GTS though, 50fps average or so.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Lithan
Wrong.

It gets ripped to SHREDS by the 9600GT with AA off at 1280x1024 and up. As does the 8800GT 256MB, which has absolutely no disadvantage next to the 9600GT besides memory size (unless I missed something).
http://www.firingsquad.com/har...performance/page13.asp

The 256MB card can't even RUN it at 1920x1200.
You simply can't run Crysis on 'high' on a 256MB card. I don't even attempt it on my 320MB GTS, despite the fact that they still manage to break 20fps on it at 1280x1024 (meaning I could probably do OK at lower resolutions).

The thing is, two 8800GSs cost only $220. Two 9600GTs cost $300. At $300, it's tempting to just get a 9800GTX (or an 8800GTX). For $220, you're beating out an 8800GT, which is already considered an amazing value.

The 8800GS is not listed in your benchmarks, BTW, so I don't know what you're saying was being "shredded". :confused:
 

AzN

Banned
Nov 26, 2001
Originally posted by: Lithan
"It has the bandwidth to do it with minimal impact"

And yet the 9600GT beats the 8800GT 256MB by a HUGE margin with AA... so it isn't just bandwidth. 256MB isn't enough video memory any longer. Even 384MB isn't at higher resolutions. How long until it's not enough even at low resolutions? Long enough for the shader advantage you're counting on to make the 8800GS the better card to come into play?

Oh, did you look at the original poster's post? He's running a 19" LCD and so am I. :eek:

Why do you keep insisting on running uber-high resolutions with AA to prove that the 9600GT is currently better for higher resolutions because of its 512MB of VRAM and memory bandwidth?


Disabling AA seems to help with memory interface limitations... not so much with capacity. Other than that, pretty much every single bump in any setting increases the amount of graphics RAM the game wants. It adds up, and there isn't a quick fix by dropping one or two settings like there is with shaders. Take Hellgate: London on my 7800GTX, for instance: ~20fps with everything maxed (DX9 maxes), ~45fps with shaders on low and nothing else changed. The card is clearly struggling with the game because modern cards have 3-6 times the shaders it has. Now, if I had a 128MB card, I'd have to disable all kinds of shit to get my FPS up.

What good is VRAM capacity when the card doesn't have the power to use it in upcoming games?

Once again, the 9600GT is good with AA; I never said it sucked. Its raw performance is only equivalent to the 8800GS and 3850 currently. Come shader-intensive games, it will fail miserably compared to the 3850 or 8800GS. Come texture-heavy games, it will fail miserably compared to the 8800GS.


"It's easy enough to work around texture memory limitations."

Yes, but with the 9600GT you don't have to. So you're sacrificing that advantage, that extra performance... to get the possibility that maybe, sometime down the line, the 8800GS will be better at something than the 9600GT. It seems crazy to sacrifice guaranteed performance for potential future performance on a card that is clearly not a wise long-term investment.

You do know the 9600GT has 33% lower texture fillrate than the 8800GS, right? The only thing the 9600GT has over the 8800GS is AA performance, because of its greater memory bandwidth and 512MB of VRAM, while over the 3850 it has more texture fillrate. They are really equal in terms of raw performance. Look at the raw performance numbers I already posted. You seem to ignore the basis of my argument.

http://www.computerbase.de/art...rmancerating_qualitaet
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Azn
You had problems with GoW at high settings? You were probably running uber-high resolutions. :p I was running max settings fine at 1440x900 on an 8600GTS though, 50fps average or so.
Yeah, it was at 'very high' actually. Then I read the HardOCP article on the game and scaled it back to 'medium'. I actually never tried 'high' per se. Oops. :)

I run all my games at either 1920x1200 or 1080P. Not that high a resolution by today's standards, but when I got my monitor a few years ago, my X800 card couldn't really run anything at native resolution (which is why I 'early adopted' with my GTS).

I'm surprised your 8600 performed so well. I always thought it was more of a budget card.
 

Lithan

Platinum Member
Aug 2, 2004
And 9600GTs are selling for exactly $10 over what 8800GSs sell for, both with and without MIR. You can SLI 9600GTs for $240. And a couple of guys WITH SLI 9600GTs have reported that they ARE faster than an 8800GTX... I sincerely doubt 8800GSs in SLI would be.
 

AzN

Banned
Nov 26, 2001
Originally posted by: Lithan
And 9600GTs are selling for exactly $10 over what 8800GSs sell for, both with and without MIR. You can SLI 9600GTs for $240. And a couple of guys WITH SLI 9600GTs have reported that they ARE faster than an 8800GTX... I sincerely doubt 8800GSs in SLI would be.

Faster than 8800gtx? :laugh:
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Lithan
Read the post above mine.
He referred to the 8800GS and the 8800GTS 320mb. Are you saying that the 9600GT shreds them both? :confused:

IMO all three cards are in the same ballpark. Each has strengths and weaknesses.
 

AzN

Banned
Nov 26, 2001
Originally posted by: SickBeast
Originally posted by: Lithan
Read the post above mine.
He referred to the 8800GS and the 8800GTS 320mb. Are you saying that the 9600GT shreds them both? :confused:

IMO all three cards are in the same ballpark. Each has strengths and weaknesses.

Exactly what I'm trying to say. Thank you.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Azn
Originally posted by: Lithan
And 9600GTs are selling for exactly $10 over what 8800GSs sell for, both with and without MIR. You can SLI 9600GTs for $240. And a couple of guys WITH SLI 9600GTs have reported that they ARE faster than an 8800GTX... I sincerely doubt 8800GSs in SLI would be.

Faster than 8800gtx? :laugh:
Actually they are by most accounts.

8800GS's in SLI probably would be as well.

I didn't realize the price difference was only $10. Are you saying you can get a 9600GT for only $120? I would pay $10 more for the GT, not $40 tho, and only because of the additional texture memory.
 

AzN

Banned
Nov 26, 2001
Originally posted by: SickBeast

I'm surprised your 8600 performed so well. I always thought it was more of a budget card.

It's not that bad for a 19" LCD without AA. That seems to be the limit, though. I was running COD4 at a 50fps average too, and Unreal 3 at 60+ fps.
 

Lithan

Platinum Member
Aug 2, 2004
Azn.
1. Because I was disproving something that was said earlier... something unrelated to the 19" LCD. Someone had suggested that memory bandwidth was the only thing preventing the 8800GS from running AA.
2. "Come shader-intensive games, it will fail miserably compared to the 3850 or 8800GS. Come texture-heavy games, it will fail miserably compared to the 8800GS." You just keep waiting for that day when you can brag about running 10fps at 800x600 instead of only 8fps. You keep preaching about how the 8800GS, which barely runs modern games at acceptable resolutions, will have no problem running these imaginary future games.
3. Your raw performance numbers amount to fewer benchmarks than I can count on one hand, cherry-picked out of 20+ different benchmarks between the cards as the 10-15% of situations where the 8800GS can hold its own.
 

Lithan

Platinum Member
Aug 2, 2004
Originally posted by: SickBeast
Originally posted by: Azn
Originally posted by: Lithan
And 9600GTs are selling for exactly $10 over what 8800GSs sell for, both with and without MIR. You can SLI 9600GTs for $240. And a couple of guys WITH SLI 9600GTs have reported that they ARE faster than an 8800GTX... I sincerely doubt 8800GSs in SLI would be.

Faster than 8800gtx? :laugh:
Actually they are by most accounts.

8800GS's in SLI probably would be as well.

I didn't realize the price difference was only $10. Are you saying you can get a 9600GT for only $120? I would pay $10 more for the GT, not $40 tho, and only because of the additional texture memory.

You don't say. I'd pay more for it because it's so shiny, 'cause lord knows I don't base my purchases on which card actually performs best.
 

AzN

Banned
Nov 26, 2001
Originally posted by: SickBeast
Originally posted by: Azn
Originally posted by: Lithan
And 9600GTs are selling for exactly $10 over what 8800GSs sell for, both with and without MIR. You can SLI 9600GTs for $240. And a couple of guys WITH SLI 9600GTs have reported that they ARE faster than an 8800GTX... I sincerely doubt 8800GSs in SLI would be.

Faster than 8800gtx? :laugh:
Actually they are by most accounts.

8800GS's in SLI probably would be as well.

I didn't realize the price difference was only $10. Are you saying you can get a 9600GT for only $120? I would pay $10 more for the GT, not $40 tho, and only because of the additional texture memory.

When I wanted to buy, it was $20 more after rebates and all. I'm only on a 19" LCD, so I couldn't care less about the extra 128MB of VRAM. I just want the card to last me longer, so I chose the 8800GS instead.
 

Lithan

Platinum Member
Aug 2, 2004
I was responding to this:

"Without AA, a 320MB GTS should be fine all the way up to 1920x1200."
I don't consider a <10fps average "fine".
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Lithan
I was responding to this:

"Without AA, a 320MB GTS should be fine all the way up to 1920x1200."
I don't consider a <10fps average "fine".
Well, I own the card and I agree with him. It *is* fine. Crysis is nothing more than a tech demo at this point; no one can run it properly. For all other modern games, I run at 1920x1200 and 90% of the time my FPS is pegged at 60.

If I had enough reason to upgrade my card, I would.

Even if you read the AT article on the 8800GTS 320MB, they said it was fine up to 1920x1200. Are you saying they're wrong too?