
Review of 112sp 8800GTS 640 MB

Oh oops, those are the comparative specs for the GT stock vs SSC, not the GTS stock vs SSC... but it's about the same... it's 15% overclocked compared to a stock GT, rather than comparing stock to stock or 15% OC to 15% OC.
 
Originally posted by: taltamir
Oh oops, those are the comparative specs for the GT stock vs SSC, not the GTS stock vs SSC... but it's about the same... it's 15% overclocked compared to a stock GT, rather than comparing stock to stock or 15% OC to 15% OC.

Oh my god.

Don't you see that the STOCK GT has faster clocks than the GTS SSC? What difference does it make what the name of the card is? It's 600 vs 575 core and 1500 vs 1350 shaders. Both have 112 shaders. HENCE the frame buffer and memory bandwidth are what make the difference.

Yes? No? Are you getting there? Slowly?

Sorry, I'm just picking at you now because of what you said to me in the other thread; I apologize. But read the specs of the card before you make assumptions based on its marketing name.
 
Originally posted by: nitromullet
I don't know why people say SLI doesn't work in Crysis.

I think that has been the assumption, since none of the review sites have produced any Crysis SLI benchmarks.

Instead of 20 fps I get 25 fps WHOOHOOO big f'n deal.

...exactly why a $250ish 8800GT is attractive compared to $1000+ for dual 8800GTXs. The price difference is HUGE while the performance difference is not. It really is a shame that SLI doesn't scale well (at this point) in Crysis... If there is any game that could benefit from dual video cards, it would be Crysis.

This is temporary, nitromullet. COD4 was getting horrible performance for me when I made my own profile to use SLI. As a matter of fact, it ran better on a single GPU!

Then when NVIDIA added the profile, poof, my frames doubled. My assumption is that whatever rendering mode NVIDIA is using for Crysis isn't working very well, and they will have to work on an SLI profile specifically for Crysis, since neither SFR nor AFR seems to be working efficiently.

good things come to those who wait 🙂
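For reference, the 20 → 25 fps figure quoted above works out to fairly poor dual-GPU scaling; a quick back-of-the-envelope check:

```python
# SLI scaling efficiency implied by the numbers quoted above
# (20 fps single-GPU vs 25 fps SLI).
single, sli = 20, 25

speedup = sli / single                # 1.25x
efficiency = (speedup - 1) / (2 - 1)  # fraction of the ideal 2x gain

print(f"{speedup:.2f}x speedup, {efficiency:.0%} of ideal dual-GPU scaling")
```

In other words, the second card is delivering only a quarter of the frame rate it could add under perfect AFR scaling.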
 
Originally posted by: JAG87
Originally posted by: taltamir
Oh oops, those are the comparative specs for the GT stock vs SSC, not the GTS stock vs SSC... but it's about the same... it's 15% overclocked compared to a stock GT, rather than comparing stock to stock or 15% OC to 15% OC.

Oh my god.

Don't you see that the STOCK GT has faster clocks than the GTS SSC? What difference does it make what the name of the card is? It's 600 vs 575 core and 1500 vs 1350 shaders. Both have 112 shaders. HENCE the frame buffer and memory bandwidth are what make the difference.

Yes? No? Are you getting there? Slowly?

Sorry, I'm just picking at you now because of what you said to me in the other thread; I apologize. But read the specs of the card before you make assumptions based on its marketing name.

Because, mister wise guy, if they were both stock, or both 15% overclocked, then the GT would have less frame buffer and memory bandwidth, but it would have another 15% benefit in clock speed... which could potentially balance out or outpace the difference...

If the frame buffer difference gives an X% boost to the GTS's frame rate, then it is balanced by the clock speeds giving a Y% boost to the GT's frame rate. However, by comparing an overclocked GTS against a non-OC GT, you are eliminating the GT's Y% benefit, leaving only the X% benefit of the GTS.

And stop being so rude to people. You don't have to hurl personal insults at everyone on every thread, JAG...
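The X%/Y% argument can be put into toy numbers (all values here are hypothetical, chosen only to show how the comparison skews, not benchmark data):

```python
# Illustrative sketch of the X% / Y% argument above.
# All numbers are made up for illustration.

base_fps = 100.0        # hypothetical baseline with neither advantage
x_buffer_boost = 0.10   # X%: GTS gain from bigger frame buffer / bandwidth
y_clock_boost = 0.15    # Y%: GT gain from its higher stock clocks

gts_stock = base_fps * (1 + x_buffer_boost)  # 110.0
gt_stock = base_fps * (1 + y_clock_boost)    # 115.0

# Comparing an OC'd (SSC) GTS against a *stock* GT hands the GTS
# an extra 15% the GT never gets to answer:
gts_ssc = gts_stock * 1.15

print(f"stock vs stock: GTS {gts_stock:.1f} vs GT {gt_stock:.1f}")
print(f"SSC vs stock:   GTS {gts_ssc:.1f} vs GT {gt_stock:.1f}")
```

With these toy inputs the stock-vs-stock matchup favors the GT, while the SSC-vs-stock matchup flips the result, which is exactly the objection being made.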
 
Taltamir, SPs don't always scale in games, though in some they do. Minimum frame rates are usually limited by memory bandwidth and VRAM, while ROPs give you the ability to run higher resolutions.

Texturing fillrate gives you the high FPS, so the average frame rates get boosted. It's clear where the 8800GT is better than the 8800GTS: usually at lower resolutions with no AA, or where the game doesn't need much memory bandwidth. This is the reason why the 8800GT beats the 8800GTS in average frame rate but loses in minimum frame rate, especially once AA is applied.

 
Originally posted by: swtethan
Originally posted by: taltamir
Ok, I will go take a look... mmm, nope!

Company of heroes 1600x1200 MINIMUM FPS only:
GTS320MB - 17.9
GT (stock) - 36.8
GTS640MB Super Super Clocked (OC'd from 600MHz to 700MHz and SPs from 1500 to ~1800) - 44.4

Performance boost of the GTS on the MIN frames ONLY (not average) at max resolution and max settings... an SSC version compared to a non-SSC version... 17.1%

World in Conflict 1600x1200 MINIMUM FPS only:
GTS320MB - 4
GT (stock) - 17
GTS640MB Super Super Clocked (OC'd from 600MHz to 700MHz and SPs from 1500 to ~1800) - 20

Performance boost of the GTS on the MIN frames ONLY (not average) at max resolution and max settings... an SSC version compared to a non-SSC version... 15%



While the GTS640 does a little better, the GTS320 still costs more than the GT even with the price gouging... and they are also comparing a STOCK GT to a GTS overclocked by 16.6%.
So how will a GT fare in a fair test? Also, the GT is PCIe 2.0, so it has DOUBLE the bandwidth on a PCIe 2.0 mobo... it has a plethora of other benefits, and we are comparing the minimum FPS reached, not the average over time... so yeah, during lag the GT will dip a little lower than the SSC GTS... but how will an SSC GT compare to the SSC GTS? I think the SSC GT will blow the SSC GTS out of the water...
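As a side note, the 17.1% figure above divides the gap by the SSC GTS's result; measured against the stock GT baseline, the same gap is closer to 21%. A quick check on the Company of Heroes minimum-FPS numbers quoted above (36.8 vs 44.4):

```python
# Two conventions for expressing the SSC GTS's lead over the stock GT
# in the CoH minimum-FPS numbers quoted above.
gt_min, gts_ssc_min = 36.8, 44.4

gain_vs_gt = (gts_ssc_min - gt_min) / gt_min        # relative to the GT baseline
gain_vs_gts = (gts_ssc_min - gt_min) / gts_ssc_min  # relative to the GTS result

print(f"{gain_vs_gt:.1%}")   # ~20.7%
print(f"{gain_vs_gts:.1%}")  # ~17.1% -- the figure quoted in the post
```

Either way the direction of the argument holds; the convention just changes the headline number.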

actually...... SSC G80 = 576 core, 1350 shader, 1600MHz memory

Actually, it's 576 core, 1350 shader, 1800MHz memory.
 
I dunno; to me these new GTS's are like a CPU with a terrible stepping on a great motherboard vs. a CPU with an awesome stepping on a budget motherboard (the 8800GT).

I would take the GT, all things considered (mainly price).
 
What I want to see is the same texturing abilities as the 8800GT with GeForce 8800GTS/GTX memory bandwidth, and some high-speed GDDR4 stuck in there. The cards would literally pwn.
 
Originally posted by: munky
Originally posted by: JAG87
Originally posted by: munky
Higher resolutions are what matter more to me, and this is exactly the reason I'm not getting an 8800GT for 1920x1200. See how horribly the 320MB GTS lags behind in all those benches? The same thing will happen to the 8800GT as more new games are released.

oh my god, can you stop being telepathic with me?

LOL. Great minds think alike... 😛

LOL. Fools seldom differ... 😛


EDIT: Don't worry, I'm just "Munkying" around...😀.
I agree with the quoted point. Although my next monitor will only be 1680x1050 (22").
 
Originally posted by: JAG87
Originally posted by: munky
Higher resolutions are what matter more to me, and this is exactly the reason I'm not getting an 8800GT for 1920x1200. See how horribly the 320MB GTS lags behind in all those benches? The same thing will happen to the 8800GT as more new games are released.

oh my god, can you stop being telepathic with me?
gmta

 
Originally posted by: taltamir
Originally posted by: JAG87
Actually, it seems that the GTS tops the GT in almost every benchmark despite having a lower core clock and lower shader clock.

Just goes to show that more VRAM and more bandwidth matter a lot more today, unless you still play on a 17-inch LCD.

The 8800GT is only going to get worse and worse in the near future. What's sad is that many people jumped on it, but at least they will be able to enjoy pretty much any game released up until today in all its glory and at a reasonable price.

They are comparing a super-superclocked eVGA GTS to a STOCK GT.
eVGA has three levels of overclocking: superclocked, knockout, and super-superclocked (in that order). The GTS outperforms the GT by a percent or two on most tests, but is outperformed by almost twenty percent on the few tests the GT wins... so for half an FPS more from the GTS in most scenarios, vs 10 FPS more for the GT in the few scenarios it wins, I would STILL choose the GT over the GTS even if the GT cost the same!...

I am also wondering how an overclocked GT compares to an overclocked GTS 112, or how a stock GTS 112 compares to a stock GT... not an overclocked version of one to a stock version of the other...

I actually expected the GTS with the extra shader units to kick the GT's arse; this just proved my speculation wrong and shows that the G80 cannot compete even with all the shader units enabled...
It's not really SSC; it's just the "new" GTS640. NVIDIA should have changed the name to 8850 GTS or something else to differentiate it from the older 8800GTS 640 models, but instead they are allowing their partners to use "KO", "SSC", etc. to differentiate it.

 
Originally posted by: taltamir
Originally posted by: JAG87
Originally posted by: taltamir
Oh oops, those are the comparative specs for the GT stock vs SSC, not the GTS stock vs SSC... but it's about the same... it's 15% overclocked compared to a stock GT, rather than comparing stock to stock or 15% OC to 15% OC.

Oh my god.

Don't you see that the STOCK GT has faster clocks than the GTS SSC? What difference does it make what the name of the card is? It's 600 vs 575 core and 1500 vs 1350 shaders. Both have 112 shaders. HENCE the frame buffer and memory bandwidth are what make the difference.

Yes? No? Are you getting there? Slowly?

Sorry, I'm just picking at you now because of what you said to me in the other thread; I apologize. But read the specs of the card before you make assumptions based on its marketing name.

Because, mister wise guy, if they were both stock, or both 15% overclocked, then the GT would have less frame buffer and memory bandwidth, but it would have another 15% benefit in clock speed... which could potentially balance out or outpace the difference...

If the frame buffer difference gives an X% boost to the GTS's frame rate, then it is balanced by the clock speeds giving a Y% boost to the GT's frame rate. However, by comparing an overclocked GTS against a non-OC GT, you are eliminating the GT's Y% benefit, leaving only the X% benefit of the GTS.

And stop being so rude to people. You don't have to hurl personal insults at everyone on every thread, JAG...

Naa, I'm being rude just to you. Be cool and I'll be cool.

The point is that despite the GT having higher clocks, the GTS performs better. It's essentially the same GPU, except that the GT has 56/56 texture address/filtering units while the GTS has 24/48, and the GT has 16 ROPs while the GTS has 20. So it balances out somewhat.

You know what really makes the GTS perform better? I think you do 🙂
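Those unit counts can be turned into rough peak-throughput numbers, using the 600 MHz GT core and the 576 MHz SSC GTS core cited earlier in the thread (back-of-the-envelope figures, not benchmark data):

```python
# Rough peak-throughput comparison from the unit counts quoted above.
# Clocks assumed per the thread: GT at 600 MHz, SSC GTS (G80) at 576 MHz.
def mtexels(tf_units, core_mhz):
    return tf_units * core_mhz  # peak bilinear texel rate, MTexels/s

def mpixels(rops, core_mhz):
    return rops * core_mhz      # peak ROP pixel rate, MPixels/s

gt = {"tf": 56, "rops": 16, "mhz": 600}
gts = {"tf": 48, "rops": 20, "mhz": 576}

print("texturing :", mtexels(gt["tf"], gt["mhz"]), "vs", mtexels(gts["tf"], gts["mhz"]))
print("ROPs      :", mpixels(gt["rops"], gt["mhz"]), "vs", mpixels(gts["rops"], gts["mhz"]))
```

The GT comes out well ahead on texturing while the GTS leads on ROP throughput, which matches the "it balances out somewhat" reading above.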
 
Interesting.

The GT really struggles at high resolutions/eye-candy settings in newer titles.

The upcoming 128sp GTS should offer pretty much the same performance as the GTX, from the looks of things.
 
It's too bad you can't step up to that card 🙁 I'm going from an old 8800GTS 640 to the 8800GT, and I hope I don't get lower performance in future titles.
 
Well, keep in mind that this is an SSC version of the 2nd-revision GTS (G80 core but with more stream processors)... which came out on the 29th... if you have an older GTS then you will get better performance on EVERYTHING no matter what... MUCH better performance.

oh and check this out:
http://www.firingsquad.com/har...performance/page18.asp

Notice an overclocked GT gets as much as a 20% boost in frame rate over the stock GT on some tests... (or as little as a few percent).

Which shows there is validity to the claims that testing an OC'd new GTS against a stock GT is not an impartial comparison.
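As a rough sanity check on that FiringSquad result: assuming a typical factory OC takes the GT's core from 600 to about 700 MHz (an assumed figure; the article lists the exact clocks), a purely core-limited game can gain at most the clock ratio:

```python
# Upper-bound estimate of frame-rate gain from a core overclock alone.
# 700 MHz is an assumed typical factory-OC clock, not from the article.
stock_core, oc_core = 600, 700

max_scaling = oc_core / stock_core - 1
print(f"max core-limited gain: {max_scaling:.1%}")  # ~16.7%
```

Observed gains approaching 20% suggest the memory overclock is contributing too, while the low single-digit results point to a bottleneck elsewhere (bandwidth, VRAM, or CPU).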
 
SSC, whatever, does it really matter? The GTS can easily overclock to those levels and beyond... Better performance at higher resolutions with AA than the 8800GT, even if it's an older core.

Again, games don't always scale with more stream processors. It's the combination. 16 more SPs are not going to do anything in most situations. What matters is higher clock speed and memory bandwidth with a card like the GTS.
 
LOL, did I say that? Were you not talking about the SSC GTS version of the card in your prior post?

SSC GTS clocks are 576/1800. An older GTS can easily do those speeds.
 
TechReport also did a review showing what this thread is showing: that the 8800GTS is superior to the 8800GT when higher resolutions and AA are applied, especially in games that need more memory bandwidth and VRAM.

http://techreport.com/articles.x/13479/4

The GTS is still the superior card if you want to run with AA and keep minimum frame rates from dropping in games. The GT is superior in raw frame rates when AA is not used.
 
Why do they even bother... the performance is so close to the GT's, they could just as well end the GTS altogether.
 
Originally posted by: Azn
SSC whatever does it really matter? GTS can easily overclock to those levels and beyond.... Better performance in higher resolutions with AA than 8800gt even if it's older core.

Again games doesn't always scale with how many more stream processors. It's the combination. 16 more PS is not going to do anything in most situations. What matters is higher clock speed and memory bandwidth with a card like GTS.

How do you figure? If you took away those 16 SPs and reduced the G80 to 80 SPs, do you think that wouldn't make a difference either? Of course 16 SPs yield better performance than not having them. Look at the GTX: 32 SPs over a standard GTS is substantial in EVERY situation.

All things being equal (core/mem clocks) between a GTS640 96sp and a GTS640 112sp, it's pretty much a given that the 112sp card would be more powerful. I could see variations from game to game, and the lead may shorten or lengthen, but the lead will always be there.

If the 8800GT and the 112sp GTS640 were equal in price, it's a no-brainer to go with the GTS.
Unfortunately, the 112sp GTS640 is still brutally expensive to manufacture, and its price is prohibitive relative to its performance. The 8800GT is the winner hands down.

I'm waiting to see how the 128sp G92 GTS compares. I know it will be faster than the 8800GT and will most likely rival or best a GTX in most situations.

 
You can only wonder why they didn't release a line of 8900GT, 8900GTS, and 8900GTX cards all based on the G92; that would have made the whole lineup much more understandable. With different versions of similarly named cards, it's not easy for those who don't follow closely.
 
Originally posted by: biostud
You can only wonder why they didn't release a line of 8900GT, 8900GTS, and 8900GTX cards all based on the G92; that would have made the whole lineup much more understandable. With different versions of similarly named cards, it's not easy for those who don't follow closely.

Ya, it's confusing for sure, but I think the lack of a high-end part to usurp the 8800GTX is the reason they decided against changing the model numbers to 8900. It would be just as confusing to have an 8900GT and 8900GTS that were both still slower than the 8800GTX, with no 8900GTX. Typically NV has released the high-end part first, justifying the change in model number.

As for the difference between the parts, I think there's definitely something holding back the G92s that the 16 extra SPs and TMUs can't make up for. The 112SP G80 somewhat confirms this, since it's still sitting somewhere between the G92 GT and G80 GTX at higher resolutions, but really not all that different from the 96SP G80 (again, see the linked TechReport review above). Also, keep in mind many original reviews showing a massive difference between the GTS and GTX had the GTS running at 500/513 vs. the GTX at 576 or so. My guess is that the fewer ROPs and less VRAM at higher resolutions/AA are the main factor. I tend to rule out bandwidth limitations and stream processors, since raising memory and shader clocks on the G80 has very little impact on performance compared to raising the core clock.

Once the new 112 or 128SP G92 GTS is released (512MB/1GB versions), it'd be interesting to see an apples-to-apples comparison with all the different variants run at the same clock speeds to see what's driving the performance differences between the parts. I think the main thing NV has accomplished with all of these new releases is to blur the lines between GTS and GTX performance while managing to lower prices for that level of performance. It's confusing for sure, but I think in the end it's a good outcome for the end user.
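To put rough numbers on the bandwidth question, here is a quick sketch from bus width and effective memory clock, using the clocks mentioned in the thread (retail clocks varied by vendor, so treat these as nominal):

```python
# Memory bandwidth from bus width x effective memory clock.
# Effective clocks are nominal figures per the thread / common retail specs.
def bandwidth_gbs(bus_bits, eff_mhz):
    return bus_bits / 8 * eff_mhz / 1000  # GB/s

cards = {
    "8800GT (256-bit @ 1800)":      (256, 1800),
    "8800GTS 640 (320-bit @ 1600)": (320, 1600),
    "8800GTS SSC (320-bit @ 1800)": (320, 1800),
    "8800GTX (384-bit @ 1800)":     (384, 1800),
}
for name, (bus, clk) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, clk):.1f} GB/s")
```

The wider-bus G80 parts keep a clear bandwidth edge over the 256-bit GT even at equal memory clocks, which is consistent with the GTS's advantage at high resolution with AA discussed throughout the thread.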
 
I am almost tempted to buy the 112-shader eVGA 8800GTS at Newegg. Seeing how the GT is now either out of stock or going for $300+, the GTS for $360 doesn't seem like a bad deal. If only NVIDIA would hurry up and release the new GTS with 1GB of mem already.
 