your 8800GTX is ancient for modern DX10 PC games ..
I know, and I have my other G92 EVGA 8800 GT SC 512 for sale...and when it goes I'll pick up a new card and then sell this one...thing is it still works great for the games I play.
Originally posted by: Dadofamunky
Yeah, I "upgraded" from an e6750 to an e8600 and don't really see any major changes; but I didn't really expect to. What I really wanted was to break the 4 GHz barrier for bragging rights (that and $1.75 will buy you a grande coffee at Starbucks). But no, the clouds did not part and the Angel Gabriel did not announce Computing Nirvana to me.
No more upgrades for me for quite awhile. I certainly do not need a quad under any circumstances.
Originally posted by: Martimus
It took a little digging for me, but here is the link if anyone wants to download this benchmark: Stalker: Clear Sky Benchmark
Originally posted by: apoppin
if you play STALKER: Clear Sky .. it will become clear to you; download and run the CS benchmark, which is one of the best standalone benches ever
Originally posted by: apoppin
Originally posted by: Tweakin
Starting from my base 6750 @ 3.4GHz, I decided to try an E8400 and shoot for the magical 4GHz. Did I make it...sure, easy on that chip. Did I notice a difference, nope...unless I work and play with synthetic benchmarks...nada...The chip also had both sensors stuck/dead/inop.
Maybe I should try a Quad...everyone loves a quad...right? Back to the egg for a 9650. I tried a 6600 a while back and could not get it stable on my IP35-E above 3GHz...and compared to my 6750, it was much slower in real-world applications.
This jump to a 9650 also required a new board, and with it some new memory. It came, I put it together and nothing...dead chip. After RMA and replacement, up and running...3.6 stable. Again, more stuck/dead/inop sensors and no real difference after three days of multitasking. Benchmarks went through the roof...games played the same, which are HL, COD, and FarCry2 for the most part.
Even when things were running in the background, it just didn't seem like the system responded any better than my dual, especially for the price. One additional thing: I tried three different new coolers, and none of them worked as well as my old Scythe Ninja.
So, as you can see by my sig, I'm back to my original system...waiting for all my refunds, trying to sell my last parts and looking forward to a new laptop...
what did you expect?
your 6750 was already fast at 3.4GHz .. if you are a gamer, you are already "there" with your 8800GTX
This is from my latest testing with a SYNTHETIC that shows what happens with a much faster video card than yours
it's one example from 3DMark06's stock benchmark using Q9550S with GTX280
* GTX280's score at 2.83 GHz is 14907
* GTX280's score at 3.40 GHz is 17906
* GTX280's score at 4.00 GHz is 18167
notice where performance levels off
well, here is a lesser video card, a 4870/1GB, with Q9550S scaling from stock to 3.6 to 4.0 GHz
* HD4870's score at 2.83 GHz is 14411
* HD4870's score at 3.60 GHz is 15825
* HD4870's score at 4.00 GHz is 16067
i also have examples with an e8600 from 3.33GHz to 4.25GHz that show much the same thing .. somewhere around 3.6GHz you hit "diminishing returns" with your system and a normal video card. i am testing with 4870x2 and CrossFireX-3 right now - mostly at upper midrange resolutions of 16x10 and 19x12
.. the same thing seems to apply
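to make the plateau concrete, here is a quick sketch (using only the 3DMark06 numbers quoted above) that computes the score gained per extra GHz between each pair of runs - the variable and function names are mine, not from any benchmark tool:

```python
# 3DMark06 scores quoted above, as (CPU clock in GHz, score) pairs
gtx280 = [(2.83, 14907), (3.40, 17906), (4.00, 18167)]
hd4870 = [(2.83, 14411), (3.60, 15825), (4.00, 16067)]

def marginal_gains(results):
    """Score gained per extra GHz between consecutive data points."""
    gains = []
    for (ghz0, score0), (ghz1, score1) in zip(results, results[1:]):
        gains.append((score1 - score0) / (ghz1 - ghz0))
    return gains

# first step is a big jump; second step is a fraction of it -
# that flattening is the "diminishing returns" point
print(marginal_gains(gtx280))
print(marginal_gains(hd4870))
```

for both cards the points-per-GHz collapses after the middle step, which is exactly the leveling-off called out above.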
however, where a game uses all 4 cores, your dual will come up short - in comparison - no matter what you do
if you play STALKER: Clear Sky .. it will become clear to you; download and run the CS benchmark which is one of the best stand alone benches ever
if you *really* want to see improvement in your games, toss that aging GTX and get yourself a GTX280 or GTX285 .. the 8800 series is not fit for modern games anymore
- not if you use DX10 and like details
Originally posted by: dflynchimp
Originally posted by: apoppin
if you *really* want to see improvement in your games, toss that aging GTX and get yourself a GTX280 or GTX285 .. 8800 series is not fit for modern games anymore
If you note the games that he plays, you can see that he doesn't need any more firepower beyond the 8800GTX, which is still a great card to game with. Seriously, there's nothing that ticks me off more than people trying to argue that a 30% increase in performance is worth tossing out a $300-400 card and splurging on another one.
If I had the money, sure, I might do the same, but I wouldn't try to justify it by saying something as absurd as the 8800GTX being "not fit" for gaming.
This isn't a flame or angry retort, btw. I just doled out for a 4870X2, so I'm in the same boat here. But I did some benchmark hunting, and the 4870X2 will at least double my 8800GTS 320's performance, which will make a difference when I'm running Far Cry and Crysis at high def. The only time I justify an upgrade is when it affects the magical 30fps threshold in my games at the resolutions I play at. If I'm already doing 50fps, I stick with where I am.
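that 30fps rule can be sketched as a tiny check (a hypothetical helper of my own, just to make the threshold logic concrete - the 30 and 50 figures are the ones from the post above):

```python
def worth_upgrading(avg_fps, playable=30.0):
    """The rule above: an upgrade is justified only when the current
    card drops below the playable threshold at your resolution."""
    return avg_fps < playable

# already doing 50 fps -> stick with what you have
print(worth_upgrading(50))  # False
# dipping under 30 fps -> the upgrade actually buys you something
print(worth_upgrading(25))  # True
```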
Originally posted by: skillyho
8800GTX still puts out and remains completely capable. GOSH, haters.
Seriously though, that card can't out bench/FRAPS a current gen card (260/280/etc..) but it's more than capable of playing 99% of the games out there at acceptable rates with appropriate settings, and it has been for 2+ years now. *Still* a great gaming card.
Originally posted by: apoppin
Originally posted by: skillyho
8800GTX still puts out and remains completely capable. GOSH, haters.
Seriously though, that card can't out bench/FRAPS a current gen card (260/280/etc..) but it's more than capable of playing 99% of the games out there at acceptable rates with appropriate settings, and it has been for 2+ years now. *Still* a great gaming card.
not at even 16x10 on the DX10 pathway
- withOUT AA
i don't know about your "appropriate settings", but they must not include DX10
-- new games kill it
i have a very long list of games it cannot run very well with maxed details
--8800GTX falls into SINGLE-DIGITS in Stalker: Clear Sky
it *was* a great gaming card the year before last .. all the way 'till last Summer
- i still have one
. . . and my GTX280 eats it alive
:moon:
Originally posted by: apoppin
let's put it another way, a "great gaming card" can play new games
8800GTX cannot even play Clear Sky with maxed details on the DX10 pathway at even 14x9 or 12x10
Originally posted by: skillyho
Originally posted by: apoppin
let's put it another way, a "great gaming card" can play new games
8800GTX cannot even play Clear Sky with maxed details on the DX10 pathway at even 14x9 or 12x10
Okay...so can it play on Medium-High in DX9? As I already said, I doubt anyone would jump to DX10 for minimal visual improvement if their setup performs better on a DX9/XP setup.
Here's a tip...(just from me to you) most gamers play for the gaming experience, not to simply max every in-game setting available. If you took a broader look at people buying discrete graphics, you'd see they're doing so just to actually GAME. It's the rarer enthusiast perspective (such as yourself and most of this forum) who are the early adopters chasing those few extra frames...and I think that's what this thread is all about.
so can it play on Medium-High in DX9?
.. well then it is NOT a "great gaming card", is it?
Originally posted by: Tweakin
Originally posted by: Insomniator
What a waste of time. Testing quad vs dual in games on an 8800. What kind of multitasking? Firefox and iTunes at the same time do not count.
Good thing it wasn't your time then, cause you already sound cranky...
it is not a problem for me
Originally posted by: Tweakin
Wow...
I was just commenting that I didn't find the newer technology to be any faster/snappier than my current setup...I didn't want to start any flame wars.
My point was that my current system, which cost about $300 minus the video card, runs as fast, or appears faster (to me, for what I do)...than any of the newer systems I was putting together.
To apoppin, I don't even know what Clear Sky is...like I said, I normally play COD, HL and Far Cry...and neither my 8800GTX nor my 8800 GT SC (G92 DX10) card has any issues at 16x10.
I do find myself in need of another system, and after all that I have done, I'm not sure what I will put together now...but I can't afford to wait.
apoppin: We agree to disagree, and we're arguing about words and definitions that have different implications to each of us. I would *enjoy* a newer rig with more high-end hardware, but my PC cost me around $475 shipped AR, I get 14k in 3DMark06, play every game I want to play (with some moderate compromises) and enjoy the heck out of it while doing it, so I can't complain. Arguing over the concept of a "great gaming card" isn't doing anything productive for this thread, so I will revise my statement to be more subjective... A great gaming card *to me* is one that debuts at the top of the charts, holds the title for well over a year, and still fits the bill today with settings adjusted appropriately.
Originally posted by: brencat
Originally posted by: dflynchimp
If you note the games that he plays, you can see that he doesn't need any more firepower beyond 8800GTX, which is still a great card to game with. Seriously there's nothing that ticks me off more than people trying to argue that a 30% increase in performance is worth tossing out a $300-400 card and splurging on another one.
If I had the money, sure I might do the same, but I wouldn't try and justify it by saying something as absurd as the 8800GTX being "not fit" for gaming.
This isn't a flame or angry retort, btw. I just doled out for a 4870X2, so I'm in the same boat here. But I did some benchmark hunting and the 4870X2 will at least double my 8800GTS 320's performance, which will make a difference when I'm running Far Cry and Crysis at high def. The only time I justify an upgrade is when it effects the magical 30fps threshold on my games at the resolutions I play at. If I'm already doing 50fps, I stick with where I am.
Best post of this entire thread.
Just recently got my first LCD -- a 24incher too, and was shit scared I wouldn't be able to run native at playable frames in my 2 favorite games, COD4 and L4D, with my card. I'm happy to say it handles L4D perfectly at 19x12 with 4xAA/4xAF, no stuttering or slideshow at all. COD4 is a different story with it averaging in the mid-50s on most boards and dipping into the 30s on Overgrown using 2xAA/max everything else. Runs much better at 16x10 with max everything using vid-card upscaling.
I don't play Stalker and I certainly realize Crysis, WIC, and a few others will not be playable at 19x12 with my card. But to say an 8800GTX is not a decent card for most games today is simply absurd, since most people are okay with dialing down the eye candy a bit. Most except apoppin apparently...
No, never used FRAPS. The COD4 mid-50s frame claim comes from the in-game stat running in the upper right corner of the game while playing (and mid-30s on Overgrown). It's a dead steady 60 FPS at 16x10 with vsync enabled.
Originally posted by: apoppin
i have the equal of your card - 2900xt - and i doubt you have ever really used FRAPS to verify your frame rates at 19x12