Upgrading from a 7600gt to a 9600gt - Is it a good idea?

Page 3 - AnandTech forums

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: bryanW1995
schmide, you need to realize something: toyota is person who reads stuff written by PROFESSIONALS. dont' question him like this or he'll go ape shit on you, too. Shit, I'd better take down my system specs, he's about to tell me that my q6600 is bottlenecking my fanless 7300gt.

typical moronic reply from you now that we disagreed earlier. you still believe somebody got a 170% increase at 1280 in Risen going from a 9600gt to a 4850 so you are clearly a fool.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Schmide
Originally posted by: toyota
typical moronic ... so you are clearly a fool.

See post above.

so? he deserves exactly what i said? what if i said a small cpu upgrade like going from a 5000 X2 to 5600 X2 gave me 170% more performance? people would be foolish to believe that nonsense. he chose to believe someone that claimed the same thing with a small gpu upgrade so yes he is a fool for thinking that guy wasnt full of crap.
 

Schmide

Diamond Member
Mar 7, 2002
5,712
978
126
Originally posted by: toyota
so? he deserves exactly what i said? what if i said a small cpu upgrade like going from a 5000 X2 to 5600 X2 gave me 170% more performance? people would be foolish to believe that nonsense. he chose to believe someone that claimed the same thing with a small gpu upgrade so yes he is a fool for thinking that guy wasnt full of crap.

I don't know how that is relevant to this thread. I can't find it in this thread. You're obviously adding insults and external information to this thread. I'm kindly asking you to curb it.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Originally posted by: toyota

so? he deserves exactly what i said? what if i said a small cpu upgrade like going from a 5000 X2 to 5600 X2 gave me 170% more performance? people would be foolish to believe that nonsense. he chose to believe someone that claimed the same thing with a small gpu upgrade so yes he is a fool for thinking that guy wasnt full of crap.

In all honesty, you should sit back and take a deep breath. Depending on what kind of 9600 we're talking, a HD 4850 could actually be nearly twice as fast. Also, I've seen weird things happen, like 16fps minima with 1 hd 5870, 30 fps minima with 2x HD 5870 and 60fps minima with 3x HD 5870. Riddle me that? I can't explain it, but it happened.

But we're going off topic; the OP must be going crazy right now. I think we can all agree that the 9600gt would be a nice upgrade for sure. Depending on price, it would be hard to give negative buying advice. If the OP thinks a HD 4850 is worth some extra money *because it WILL give better fps*, he shouldn't be told NOT to buy it either.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: MarcVenice
In all honesty, you should sit back and take a deep breath. ... If the OP thinks a HD 4850 is worth some extra money *because it WILL give better fps*, he shouldn't be told NOT to buy it either.
on average a 4850 is about 30-40% faster than a 9600gt, and at 1280 with a low end cpu the difference is even less. that guy was full of crap saying he almost tripled his minimum framerates and doubled his average in Risen under those conditions. as you can see, the difference between a 9600gt and even a much faster 4870 or 4890 isn't anywhere near that big, and that's using an overclocked i7. http://www.pcgameshardware.com...deon-HD-5850/Practice/
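For what it's worth, the two numbers being argued here are easy to put side by side: "170% more performance" means 2.7x the framerate (near-tripling), while a 30-40% gap is only about 1.3-1.4x. A quick sketch of the conversion (the percentages are the figures argued in this thread, not new measurements):

```python
# Convert an "X% more performance" claim into a framerate multiplier.
def multiplier(percent_gain: float) -> float:
    return 1.0 + percent_gain / 100.0

# The disputed Risen claim (9600gt -> 4850): +170%, i.e. 2.7x, a near-tripling.
print(multiplier(170))
# The typical 9600gt -> 4850 average gap cited above: +35%, i.e. about 1.35x.
print(multiplier(35))
```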
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Toyota, if you told me that you were able to clock your x3350 at a rock solid stable 525 fsb I would tell you that you're full of shit. You know why? Because I own one and I've gone through the hassle/fun of oc'ing it. I know from my own personal experience that going over 450 fsb 24/7 stable is VERY rare on a 45nm penryn quad (I also got to do it on a q9450 a few months later). I also checked with reliable sources who were able to confirm my experiences. If I had ONLY checked with internet sources which might or might not be reliable and had little to no experience then I would be...you.

When somebody made a claim about his personal experience of going from a shitty card to a 4850 I was not completely familiar with it, since I haven't played Risen or used a 4850 on an X2 system, but I have owned an opteron 185 in the past and my last video card was a 4850, so I at least had some idea of where he was coming from. I admit that his claim sounded a bit high, but not outrageously so. Based upon many, many others' (not least marcvenice's) experiences with a 4850, I was disinclined to call him a liar. You, however, had no such qualms. Why? Because you went to a German website that tested that game with all in-game max settings but no AA and concluded that said website was the Risen bible. That was kind of a reach, as is your current vendetta against video card upgrades on a tech-oriented video card forum.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: bryanW1995
Toyota, if you told me that were able to clock your x3350 at a rock solid stable 525 fsb ... as is your current vendetta against video card upgrades on a tech-oriented video card forum.

keep living in a fantasy world if you think a 4850 will give you 170% more performance over a 9600gt at 1280 with a lower end X2 cpu. yeah, that benchmark easily disproves what that guy was claiming, but now you will just ignore that well known German site's benchmarks since they only back up what i said. and if I had no benchmark for Risen at all, your excuse would be that we just don't know since there are no benchmarks. lol
 

DetConan

Member
Mar 7, 2006
39
0
0
Thanks for all the feedback.

I am running at 1680 but I do not want to invest too much in a new card: under $100. And here in Canada, even with our strong dollar, electronics are still expensive. So the 9600GT is the most expensive I would go; in fact, the 4670 is more in my price range... If going with a more powerful card would help me with the performance of photo and video editing (with Premiere Elements and Photoshop), I would not hesitate and go for it, because presently this is the main weakness of my system for my wife. So I do not want to put too much money in this system if it would only benefit me and my games (for instance, I would like to try Bioshock and Empire: Total War, games that cannot currently run on my rig).

In fact, I am thinking of this upgrade only because I have to change the hard drive (maybe I put too much stuff on this disk? I only have 20GB free of the 120GB available... the new disk will be bigger (500GB?): I still wonder if that could help Photoshop performance; maybe this program uses the hard drive for cache memory, swap, or whatever...). As I do not want to do it myself (I am too afraid to play with those expensive components and ruin my system...), I was wondering if I should not take the opportunity of this "operation" to give a little boost to the rig with a new card.

 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: DetConan
Thanks for all the feedback. ... i was wondering if i should not take the opportunity of this "operation" to give a little boost to the rig with a new card.

well at 1680 I would certainly choose the 9600gt over the 4670 if it's not too much more. as far as photoshop goes, maybe someone else can answer that for you.
 

elconejito

Senior member
Dec 19, 2007
607
0
76
www.harvsworld.com
Originally posted by: DetConan
Thanks for all the feedback. ... i was wondering if i should not take the opportunity of this "operation" to give a little boost to the rig with a new card.

I can't comment too much on the 4850 vs 9600gt debate (LOL) as I am not a big gamer, but I went from a 7600GT to a 9600GT (with a Q6600 at the time) and it was a huge improvement. I was playing Gears of War at the time (1600x1200 i think. it was a 4:3 lcd) and it went from a slide-show during intense firefights to a smoothly playable game all the time with all eye-candy turned on. I don't know if it was 30fps or 120fps, it just looked smooth to me.

For photoshop, if you have CS3 it won't make any difference since the GPU isn't used for anything. You'd be getting the card for video games only, really.

If you have CS4, the GPU doesn't need any major horsepower, so either card will do. It will improve performance in zooming, panning, rotating the canvas, etc., BUT it will not help with any of the filters, or with saving and opening files.

To improve overall performance, I guess it depends when you are seeing the slowdowns. If it's only when opening or saving files, then yeah, the hard drive will help [if you only have 20GB left out of a 120GB drive, that is probably slowing your whole system down]. If it's just while doing processing (like a filter or panning around, etc) then CPU and maybe RAM will help the most. To see if you need more RAM, check your scratch usage (the little triangle flyout in the lower left corner of the document window next to the zoom percentage). If the number on the left is bigger than the number on the right then you're using the hard drive as the scratch drive, and that is a HUGE slowdown [especially since your drive is so full]. In that case, get more RAM before anything else.

 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Wow, another derailed thread. Blah blah blah minimum frame rates. Blah blah cpu bottleneck blah. Blah blah overkill for that res blah blah blah.

OP, I hope you were able to fish out some useful information from this.

blah blah Nelson Mandela blah
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: toyota
I didnt even say that at all so how about you not spreading the bs?

For once, you are right. I confused a few cards, but this is what you said:

http://forums.anandtech.com/me...eadid=2337404#31861526

if you still have the 4600 X2 then you arent going to see much improvement especially at just 1280. I guarantee you your minimum framerates will be the same in almost every game no matter what card you go with over your current 9600gt. when I had a 5000 X2 and much wimpier 8600gt I had the same minimum framerates in many games no matter what other faster card I tried in that system. a 9600gt is not too bad for a 4600 X2 but anything any faster will mostly go down the drain.

I dont care what theories some people on here have because I tried various cards including 7600gt, 8600gt, 9600gt, 8800gt, and 4670 in my 5000 X2 system and know what to expect. at 1280 the 8800gt was basically dead even with the 9600gt and never gave more than 1-2fps in any benchmark. I could go into more details but the point is with your 4600 X2 you are not going to see hardly any playable difference at all upgrading your 9600gt.

It has taken me so long to reply to this because I've been doing some testing of my own. And I'm finally ready to call out the bullshit in the following statement:

when I had a 5000 X2 and much wimpier 8600gt I had the same minimum framerates in many games no matter what other faster card I tried in that system.

Test System 1: Athlon X2 4000+ @ 2.68 GHz, 8600GT @ 676/1622/814 MHz (from 540/1180/700), 2GB RAM, Windows XP Home

Test System 2: Athlon 5000+ BE @ 2.6 GHz (stock), HD3850 @ stock, 3.3 GB RAM, Windows Vista Ultimate 32-bit

Crysis Demo GPU Benchmark:
1280x1024, medium settings, no AA

8600GT: 19.30 min, 23.67 avg
HD3850: 26.27 min, 44.41 avg

Far Cry 2 Ranch Small:
1280x1024, Medium Render settings w/HDR, Very High Performance settings, no AA

8600GT: 27.46 min, 34.72 avg
HD3850: 31.73 min, 41.22 avg

Half-Life 2: Lost Coast:
1280x1024, All High settings, 4xAA, 8xAF

8600GT: 75.97 avg
HD3850: 111.05 avg

Street Fighter 4 Benchmark:
1280x1024, Highest settings, no AA, 16xAF

8600GT: 54.98 avg
HD3850: 82.08 avg

Devil May Cry 4 Benchmark:
1280x1024, Highest settings DX9

8600GT: 50.15 avg
HD3850: 86.65 avg

---------------------------------------------------------------

In Crysis there was a 36% improvement in minimum framerates.
In Far Cry 2 a 16% increase in minimum framerates.
In HL2 there was a 46% performance increase.
In SF4 there was a 49% performance increase.
In DMC4 there was a 73% performance increase.

I have no idea how you can say there is barely any improvement over an 8600GT with an Athlon X2 @ 2.6 GHz. That is simply not true at all; there is a substantial improvement.
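For anyone who wants to check the arithmetic, each percentage is just (new / old - 1) x 100 over the framerates posted above. A quick sketch (game labels are shorthand; note the DMC4 ratio works out to about 73%):

```python
# Recompute the HD3850-over-8600GT gains from the framerates posted above.
# (old_fps, new_fps) pairs; min or avg depending on what was reported.
results = {
    "Crysis (min)":           (19.30, 26.27),
    "Far Cry 2 (min)":        (27.46, 31.73),
    "HL2: Lost Coast (avg)":  (75.97, 111.05),
    "Street Fighter 4 (avg)": (54.98, 82.08),
    "Devil May Cry 4 (avg)":  (50.15, 86.65),
}
for game, (old, new) in results.items():
    gain = (new / old - 1) * 100
    print(f"{game}: +{gain:.1f}%")
```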

I've played with both of the test systems extensively. The HD3850 provides a much better experience in Team Fortress 2. It can handle Call of Duty 4 much, much better. Battlefield 2 also plays better, but it's already pretty smooth on the 8600GT so the difference isn't quite as noticeable to the naked eye. Half-Life 2: Episode 2 was also much more enjoyable and had a lot fewer slowdowns on the HD3850 than the 8600GT, even though I used higher settings on the HD3850. Bioshock is also a hell of a lot more playable on the HD3850. Left 4 Dead is much smoother on the HD3850 even with higher settings compared to the 8600GT.

And through all of this, I also disagree with the following statements:

at 1280 the 8800gt was basically dead even with the 9600gt and never gave more than 1-2fps in any benchmark. I could go into more details but the point is with your 4600 X2 your are not going to see hardly any playable difference at all upgrading your 9600gt.

I guarantee you your minimum framerates will be the same in almost every game no matter what card you go with over your current 9600gt.

I think if he were to have experience with a 9600GT and an HD4870 (or 5770), he would notice a drastic difference in all of his framerates in the vast majority of games. Now there might be a few games which are CPU limited (like Team Fortress 2 or GTAIV), but not most. The 4870, while not showing its full potential, would still give someone substantial benefits over a 9600GT when paired with a mid-range Athlon X2 processor. If I had a card like the HD4870, I would most certainly love to test it with such a setup.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: toyota
so you really doubt he will be dipping below 30fps with a 4200 X2? lol. you might want to check some other reviews because if that is your professional opinion then you need a new profession. just kidding but really he would go well below 30fps quite often in some newer games. hell he might not even average much more than 30fps in some games.

he would certainly be in the teens for Far Cry 2 and would likely not even average 30fps. http://www.pcgameshardware.com...hmarks/Reviews/?page=2

even the faster 5000 X2 would be in low 20s for Resident Evil 5. http://www.pcgameshardware.com...ield-results/Practice/


Some more bullcrap. With an X2 4200+, he can most certainly average over 30 FPS and stay out of the teens with enough GPU horsepower and/or tweaking of the settings.

My Far Cry 2 Ranch Small results with 4200+ and 8600GT:

http://i36.photobucket.com/alb...elincoln/fc2bench1.jpg

Min: 23
Avg: 33

http://i36.photobucket.com/alb...elincoln/fc2bench2.jpg

Min: 22
Avg: 31

If I had more GPU power than the 8600GT can provide, I have no doubt I would be able to turn the settings and resolution up and maintain playable framerates.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: cusideabelincoln
<partial snip>
I have no idea how you can say there is barely any improvement over an 8600GT with an Athlon X2 @ 2.6 GHz. That is simply not true at all; there is a substantial improvement.
you are certainly right about those cards being much faster overall than an 8600gt. I was wrong to say the minimums were about the same in many games; I should have said in some games. in UT3 the minimums were the same with an 8600gt, 9600gt, or 4670. I ran through that same exact path on the same map 5 times with each card. the max framerates and therefore the averages were much higher, but the minimums never budged more than 2 fps between any of those cards. with Warhead there was a specific path I took when testing real gameplay with fraps. all three of those cards dipped into the mid teens at the same exact spots. there was never more than 2-3 fps between those 3 cards at those spots no matter how many times I ran through.

you can even see for yourself that a 3850, which is twice as fast as an 8600gt, only got 4 more fps for minimums in your own benchmark. in the actual game there was sometimes never more than 3 or 4 fps difference during action. with Far Cry 2 I had enough of dealing with the huge dips so I tried the 4670 in another Core 2 system and got a 30% increase at 1280. that's when I decided it was time for a new pc. the 5000 X2 can still play almost every game out there but it IS a large performance bottleneck when trying to use a fast card.

use common sense and think about it: a 3850 isn't even close to being fully utilized with a 2.6 5000 X2 or especially a stock 2.2 4200 X2. so now take a 4870, which is well over twice as fast, and depending on the game and settings you end up throwing 30-50% of what that card can do right down the drain. if someone has a 3850 and 4200 X2 there will be plenty of games where the minimums would barely change, if at all, when upgrading to a faster card. averages don't mean a whole lot when the game keeps dipping down at the same spots as before. really, with a 4200 X2, something like a 4670 or 9600gt makes much more sense, especially when that person only has a 7600 now.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: cusideabelincoln
Some more bullcrap. With an X2 4200+, he can most certainly average over 30 FPS stay out of the teens with enough GPU horsepower and/or tweaking the settings. ... If I had more GPU power than the 8600GT can provide, I have no doubt I would be able to turn the settings and resolution up and maintain playable framerates.


perhaps you should start your own site since it's obvious that site is just lying. well, with cpus such as the 4450e X2 (which is a tad faster than his 4200), anandtech could only average in the mid to upper 20s even with medium settings and a gtx280. http://www.anandtech.com/bench/default.aspx?b=49 . I guess they are lying too though.

I guess here is a third lying site, because with a 5000 X2 they could only average 25fps with a 19fps minimum in Far Cry 2 with a gtx280 at just 1280x1024. http://www.hardwarecanucks.com...formance-review-9.html look at some of the faster cpus because they are getting higher minimums than the 5000 X2 can even average. lol. heck, I get TWICE that even with a slower gtx260 because my E8500 is so much faster. my minimums with a gtx260 and E8500 are much higher than their averages with a gtx280 and 5000 X2. maybe the cpu is more important than you thought??



getting back to the OP's cpu, I just lowered my E8500 to 1.6 which would still be as fast or faster than a 2.2 4200 X2 and fired up a few games for actual gameplay. if you doubt that my E8500 at 1.6 is a fair comparison to the 4200 X2 then look at old cpu reviews, as the 1.8 E4300 matches or in most cases easily beats the 4200 X2. the E8xxx series cpus are at least 20% faster clock for clock than the E4xxx series, so if anything my E8500 at 1.6 would be faster than a 2.2 4200 X2. anyway, all these games were run at a gpu limited 1920x1080 with the highest playable settings, which even meant 4x AA in Fallout 3. they were all very much playable beforehand with my cpu at 3.16. as you can see, the FRAPS numbers indicate a couple of games being either completely unplayable or close to it once I lowered the cpu to 1.6, which again is as fast or faster than a 2.2 4200 X2. the screenshots are just to show you exactly where I was testing and they ARE NOT average framerates. they are taken at almost identical spots for comparison purposes though.


Fallout 3= still very playable but framerates drop from around 50 fps at 3.16 down to low 30s when lowering the cpu to 1.6.

1920x1080 ultra settings and 4x AA

E8500 @ 1.6=32fps
http://img39.imageshack.us/img...ut3200910280051323.png

E8500 at stock 3.16=50fps
http://img23.imageshack.us/img...ut3200910280207151.png


Batman AA = averages go from around 40fps at 3.16 to just upper 20s when lowering the cpu to 1.6 and becomes pretty laggy during action. the minimums drop from around 30-31fps with stock E8500 to 18-19 with the cpu at 1.6.

1920x1080 all very high settings high physx no AA

E8500 @ 1.6=27fps
http://img42.imageshack.us/img...ingpcbmgame2009102.png

E8500 at stock 3.16=39fps
http://img203.imageshack.us/im...ingpcbmgame2009102.png


Crysis Warhead
= the averages seemed to be only 4-5 fps lower but it was certainly more laggy as min framerate dropped from 23-24 fps with stock 3.16 down to 16-17 with cpu at 1.6.

1920x1080 custom DX9 config almost equal to all enthusiast no AA

E8500 @ 1.6=22fps
http://img4.imageshack.us/img4...is2009102804581867.png

E8500 at stock 3.16=29fps
http://img27.imageshack.us/img...is2009102805103118.png


Far Cry 2
= there is a massive framerate drop from lowering the cpu to 1.6, but it's still playable, although laggy at times with some dips into the teens.

1920x1080 all ultra settings no AA

E8500 @ 1.6=23fps
http://img248.imageshack.us/im...y22009102801132941.png

E8500 at stock 3.16=41fps
http://img194.imageshack.us/im...y22009102802273389.png


Red Faction Guerrilla
= went from playable to basically not playable at all with close to only half the framerate at times when lowering the cpu to 1.6. certainly a very cpu dependent game here and quite miserable with the E8500 at 1.6.

1920x1080 all highest settings including ambient occlusion no AA

E8500 @ 1.6=13fps
http://img5.imageshack.us/img5...fg2009102517402044.png

E8500 at stock 3.16=24fps
http://img23.imageshack.us/img...fg2009102802331780.png


Ghostbusters
= goes from playable to not very playable as the framerate is nearly cut in half at times outside with dips even into the single digits with the cpu at 1.6. overall the game does average much more than these screenshots indicate but it is very cpu dependent.

1920x1080 all highest in game settings no AA

E8500 @ 1.6=11fps
http://img5.imageshack.us/img5...w32200910252030393.png

E8500 at stock 3.16=21fps
http://img203.imageshack.us/im...w32200910280215226.png



so even at a gpu limited 1920x1080 with the highest playable settings possible for my 192sp gtx260, an E8500 would provide a VERY LARGE advantage over a cpu like a 4200 X2. so just imagine the GIGANTIC difference you would see at a lower res with an even faster card on a poky cpu like a 4200 X2. if that doesn't wake you up to the reality of how important a decent cpu is with a fast graphics card then I guess this debate will go on forever.
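The spot readings above can be summarized with a little arithmetic; a quick sketch (fps values transcribed from the screenshots posted above, and as noted there, these are single-spot readings, not averages):

```python
# Framerate at each test spot: (fps at stock 3.16 GHz, fps underclocked to 1.6 GHz).
spots = {
    "Fallout 3":              (50, 32),
    "Batman: AA":             (39, 27),
    "Crysis Warhead":         (29, 22),
    "Far Cry 2":              (41, 23),
    "Red Faction: Guerrilla": (24, 13),
    "Ghostbusters":           (21, 11),
}
for game, (stock, underclocked) in spots.items():
    drop = (1 - underclocked / stock) * 100
    print(f"{game}: {stock} -> {underclocked} fps ({drop:.0f}% lost to the slower CPU)")
```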

 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Your results mean absolutely nothing, dude.

First, you're forgetting that different CPUs will scale differently at different clockspeeds.

Second...

you can even see for yourself that a 3850 which is twice as fast as an 8600gt only got 4 more fps for minimums in your own benchmark. ... it IS a large performance bottleneck when trying to use a fast card.

You obviously DIDN'T FUCKING READ MY SETUP very carefully. In my testing, the 8600GT was overclocked and the HD3850 was at stock. In this configuration, the HD3850 is not twice as fast as the 8600GT. It would be more like 50% faster, which is exactly what my results showed.

And you also didn't read my conclusion. I am willing to bet you, in Far Cry 2, if I were to stick a faster card than the HD3850 into the 5000+ system, my minimum framerates would go up when using the same settings. Performance would increase, and that is what I'm trying to show you. There's no doubt a faster card wouldn't run at its full potential, but that is not what I'm showing.

use common sense and think about that a 3850 isnt even close to being fully utilized with a 2.6 5000 X2 or especially a stock 2.2 4200 X2. ... really with a 4200 X2 something like a 4670 or 9600gt makes much more sense especially when that person only has a 7600 now.

Why the fuck are you switching contexts? My results and my initial conclusion are obviously for the 5000+, not the 4200+. With a card like the 4870, if you only look at absolute minimum framerates then sure, a person could see a 30-50% performance increase with a faster CPU. But minimums don't tell the whole story, just as maximums and averages don't by themselves either. Read this line carefully, because it is what I've been trying to say, but you somehow keep putting words into my mouth like a dumbfuck: With an Athlon 5000+, if a person were to upgrade from the HD3850 to a card like the HD4870, performance would increase in most of their games on the mins, avgs, and maxes.

perhaps you should start your own site since its obvious that site is just lying. ... I guess they are lying too though.

I guess here is a third lying site because with a 5000 X2 they could only average 25fps with a 19fps minimum in Far Cry 2 with a gtx280 at just 1280x1024. ... maybe the cpu is more important than you thought??

Stop being passive aggressive, you douchebag. Those sites are obviously testing at different settings than I am. And my results are real, you dolt. So if you assume my results are real, and their results are real, then can you come up with an explanation other than that I'm lying or they're lying? I have the 5000+ X2 and I've tested it. It can achieve much higher minimum framerates than 19 fps in Far Cry 2.


Also I will reiterate this: Why the fuck are you talking about the 4200X2 when I was testing with the 5000+ X2? Are you fucking stupid? Did you not read my first post? Everything in my first post applied to the 5000+, not the 4200+. In my second post I addressed the problem with the 4200+. Also your results are pretty meaningless. You're using completely different hardware, and it is well known different games will react differently to not only the make and model of a processor, but the speed as well. I have the actual hardware and I've tested it. Maybe you should do the same.

Oh and where the fuck did you get the notion that I was saying a CPU wasn't important for high end cards? I never fucking said this, you idiot.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
All I'll add to this nauseating discussion is that I once had a 3870, switched it for a 8800GTX for a while, and noticed a big jump in performance in games at 1680x1050. Crysis among others.
All with a 2.4 Ghz AMD X2. Hardly a powerful CPU, same as the OP.
I settled for a 4670 but for different reasons; it's definitely much much slower than the 8800GTX or the 3870 and not in silly benchmarks only, but in proper gaming.

So there's much better to have than a 8600GT for an old dual core, even at the stock 2 Ghz for the basic models.

Buy the 9600GT or something even faster if you can afford it. It's gonna be a HUGE performance upgrade, guaranteed.
And of course, try to overclock that CPU a bit. 2.6 ghz and above is a success for these older chips.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
Test System 1: Athlon X2 4000+ @ 2.68 GHz, 8600GT @ 676/1622/814 MHz (from 540/1180/700), 2GB RAM, Windows XP Home

Test System 2: Athlon 5000+ BE @ 2.6 GHz (stock), HD3850 @ stock, 3.3 GB RAM, Windows Vista Ultimate 32-bit
There's definitely an improvement, but your results aren't surprising. I'm no video card guru, I just brought up this page:

http://www.gpureview.com/show_...p?card1=513&card2=546#

With the 8600GT clocked nearly the same as the stock 3850, you've got similar texture fill rates, but only half the pixel fill rate and memory bandwidth. It'd be interesting to see what happens if you put the 3850 in a PCIe 1.0 slot, though...its PCIe 2.0 interface is probably giving it an "unfair" advantage...
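The fill-rate comparison works out as simple arithmetic. Clocks come from the test systems quoted above; the TMU/ROP counts and bus widths (8600GT: 16 TMUs, 8 ROPs, 128-bit; HD3850: 16 TMUs, 16 ROPs, 256-bit) are from memory and may be slightly off, so treat this as a back-of-envelope sketch.

```python
# Back-of-envelope fill-rate and bandwidth comparison for the two cards.
# Unit counts are assumed, not taken from a spec sheet.

def tex_fill(core_mhz, tmus):
    """Texture fill rate in MTexels/s."""
    return core_mhz * tmus

def pix_fill(core_mhz, rops):
    """Pixel fill rate in MPixels/s."""
    return core_mhz * rops

def mem_bw(mem_mhz_effective, bus_bits):
    """Memory bandwidth in GB/s (effective clock already doubled for DDR)."""
    return mem_mhz_effective * (bus_bits / 8) / 1000

# 8600GT overclocked as in Test System 1 (676 core, 814 memory)
gt_tex = tex_fill(676, 16)    # 10816 MTexels/s
gt_pix = pix_fill(676, 8)     # 5408 MPixels/s
gt_bw = mem_bw(814 * 2, 128)  # ~26 GB/s

# HD3850 at stock (assumed ~670 core, ~830 memory)
hd_tex = tex_fill(670, 16)    # 10720 MTexels/s - about the same
hd_pix = pix_fill(670, 16)    # 10720 MPixels/s - roughly double
hd_bw = mem_bw(830 * 2, 256)  # ~53 GB/s - roughly double
```

With these assumed figures the texture rates land within a percent of each other while the 3850 has about twice the pixel fill and bandwidth, matching the observation above.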

Now there might be a few games which are CPU limited (like Team Fortress 2 or GTAIV), but not most. The 4870, while not showing its full potential, would still give someone substantial benefits over a 9600GT when paired with a mid-range Athlon X2 processor. If I had a card like the HD4870, I would most certainly love to test it with such a setup.
A 4870 for TF2 is crazy (but fun) overkill. I checked once and the minimum frame rates were well over 60. I can do some tests if you like, but I pulled mine the other day to try out the IGP and much to my surprise the game ran just fine on my 790GX! I turned on fraps for a bit and didn't see anything under 30 fps. Definitely playable at 1368x768 or lower.

More importantly, I can always use more friends in TF2, so add me if you like. Same username as here.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: cusideabelincoln
Your results mean absolutely nothing, dude.

First, you're forgetting that different CPUs will scale differently at different clockspeeds.

Second...

you can even see for yourself that a 3850 which is twice as fast as an 8600gt only got 4 more fps for minimums in your own benchmark. in the actual game there were sometimes never more than 3 or 4 fps difference during action. with Far Cry 2 I had enough of dealing with the huge dips so I tried the 4670 in another Core 2 system and got a 30% increase at 1280. thats when I decided it was time for a new pc. the 5000 X2 can still play almost every game out there but it IS a large performance bottleneck when trying to use a fast card.

In my testing you obviously DIDN'T FUCKING READ MY SETUP very carefully. The 8600GT was overclocked and the HD3850 was at stock. In this configuration, the HD3850 is not twice as fast as the 8600GT. It would be more like 50% faster, which is exactly what my results showed.

And you also didn't read my conclusion. I am willing to bet you, in Far Cry 2, if I were to stick a faster card than the HD3850 into the 5000+ that my minimum framerates would go up when using the same settings. Performance would increase, and that is what I'm trying to show you. There's no doubt a faster card wouldn't run at its full potential, but this is not what I'm showing.

use common sense and think about the fact that a 3850 isnt even close to being fully utilized with a 2.6 5000 X2 or especially a stock 2.2 4200 X2. so now take a 4870 which is well over twice as fast and depending on the game and settings you end up throwing 30-50% of what that card can do right down the drain. if someone has a 3850 and 4200 X2 there will be plenty of games where the minimums would barely change if at all when upgrading to a faster card. averages dont mean a whole lot when the game keeps dipping down at the same spots as before. really with a 4200 X2 something like a 4670 or 9600gt makes much more sense especially when that person only has a 7600 now.

Why the fuck are you switching contexts? My results and my initial conclusion are obviously for the 5000+, not the 4200+. With a card like the 4870, if you only look at absolute minimum framerates then sure a person could see a 30-50% performance increase with a faster CPU. But minimums don't tell the whole story, just as maximums and averages don't by themselves either. Read this line carefully, because it is what I've been trying to say, but you somehow keep putting words into my mouth like a dumbfuck: With an Athlon 5000+, if a person were to upgrade from the HD3850 to a card like the HD4870, performance would increase in most of their games on the mins, avgs, and maxs.

perhaps you should start your own site since its obvious that site is just lying. well with cpus such as the 4450e X2 (which is a tad faster than his 4200) anandtech could only average in the mid to upper 20s even with medium settings and a gtx280. http://www.anandtech.com/bench/default.aspx?b=49 . I guess they are lying too though.

I guess here is a third lying site because with a 5000 X2 they could only average 25fps with a 19fps minimum in Far Cry 2 with a gtx280 at just 1280x1024. http://www.hardwarecanucks.com...formance-review-9.html look at some of the faster cpus because they are getting higher minimums than the 5000 X2 can even average. lol. heck I get TWICE that even with a slower gtx260 because my E8500 is so much faster. my minimums with a gtx260 and E8500 are much higher than their averages with a gtx280 and 5000 X2. maybe the cpu is more important than you thought??

Stop being passive aggressive, you douchebag. Those sites are obviously testing at different settings than I am. And my results are real, you dolt. So if you assume my results are real, and their results are real, then can you come up with an explanation other than that I'm lying or they're lying? I have the 5000+ X2 and I've tested it. It can achieve much higher minimum framerates than 19 fps in Far Cry 2.


Also I will reiterate this: Why the fuck are you talking about the 4200X2 when I was testing with the 5000+ X2?
Are you fucking stupid? Did you not read my first post? Everything in my first post applied to the 5000+, not the 4200+. In my second post I addressed the problem with the 4200+. Also your results are pretty meaningless. You're using completely different hardware, and it is well known different games will react differently to not only the make and model of a processor, but the speed as well. I have the actual hardware and I've tested it. Maybe you should do the same.

Oh and where the fuck did you get the notion that I was saying a CPU wasn't important for high end cards? I never fucking said this, you idiot.
nice name calling. those 3 site links had a 5000 X2 and they showed similar min framerates to what I got in Far Cry 2. after that maybe you should learn to read because I clearly said lets get back to the OPs 4200 X2. btw I could put the E8500 at 1.8 which is the equivalent of the 2.6 5000 X2 and test all over again but how much difference would that make? anyway, again just to be clear the tests I ran were because the OP has a 4200 X2. you would be the very person to tell someone to stick a 4870 or 4890 with a 4200 X2 but as you can see it would be a waste of money for many modern games. that person would get the same playable results with a slower and cheaper card because that cpu is too slow to even keep up with a higher end card. you somehow seem to think that only your tests, opinions and theories seem to matter. those are REAL GAMEPLAY results that I spent a lot of time going through just to show people like you how important the cpu is even at 1920x1080 under gpu dependent conditions and how silly recommending a high end card to somebody with a 4200 X2 cpu is.



 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Marty502
All I'll add to this nauseating discussion is that I once had a 3870, switched it for a 8800GTX for a while, and noticed a big jump in performance in games at 1680x1050. Crysis among others.
All with a 2.4 Ghz AMD X2. Hardly a powerful CPU, same as the OP.
I settled for a 4670 but for different reasons; it's definitely much much slower than the 8800GTX or the 3870 and not in silly benchmarks only, but in proper gaming.

So there's much better to have than a 8600GT for an old dual core, even at the stock 2 Ghz for the basic models.

Buy the 9600GT or something even faster if you can afford it. It's gonna be a HUGE performance upgrade, guaranteed.
And of course, try to overclock that CPU a bit. 2.6 ghz and above is a success for these older chips.

the only reason an 8600gt was ever brought up was that I had said I got about the same minimums in some games with it as I did with faster cards while using a 5000 X2. nobody is suggesting the op get an 8600gt. a 9600gt or 4670 are the lowest cards being recommended for him and they are a massive jump over his current 7600.
 

Schmide

Diamond Member
Mar 7, 2002
5,712
978
126
Originally posted by: toyota
nice name calling. those 3 site links had a 5000 X2 and they showed similar min framerates to what I got in Far Cry 2. after that maybe you should learn to read because I clearly said lets get back to the OPs 4200 X2. btw I could put the E8500 at 1.8 which is the equivalent of the 2.6 5000 X2 and test all over again but how much difference would that make? anyway, again just to be clear the tests I ran were because the OP has a 4200 X2. you would be the very person to tell someone to stick a 4870 or 4890 with a 4200 X2 but as you can see it would be a waste of money for many modern games. that person would get the same playable results with a slower and cheaper card because that cpu is too slow to even keep up with a higher end card. you somehow seem to think that only your tests, opinions and theories seem to matter. those are REAL GAMEPLAY results that I spent a lot of time going through just to show people like you how important the cpu is even at 1920x1080 under gpu dependent conditions and how silly recommending a high end card to somebody with a 4200 X2 cpu is.

I sometimes wish Crysis, Far Cry 2, GTA4 were never released, well not really, but they are CPU dependent and I guess justify this line of reasoning. If they were what everyone was playing, I'd give more weight to this argument.

Again, since it seems to go right through you, there are plenty of games which see moderate improvements with a higher end card. If you bias all your data just to show your one point of view, you do no justice to your argument.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Schmide
Originally posted by: toyota
nice name calling. those 3 site links had a 5000 X2 and they showed similar min framerates to what I got in Far Cry 2. after that maybe you should learn to read because I clearly said lets get back to the OPs 4200 X2. btw I could put the E8500 at 1.8 which is the equivalent of the 2.6 5000 X2 and test all over again but how much difference would that make? anyway, again just to be clear the tests I ran were because the OP has a 4200 X2. you would be the very person to tell someone to stick a 4870 or 4890 with a 4200 X2 but as you can see it would be a waste of money for many modern games. that person would get the same playable results with a slower and cheaper card because that cpu is too slow to even keep up with a higher end card. you somehow seem to think that only your tests, opinions and theories seem to matter. those are REAL GAMEPLAY results that I spent a lot of time going through just to show people like you how important the cpu is even at 1920x1080 under gpu dependent conditions and how silly recommending a high end card to somebody with a 4200 X2 cpu is.

I sometimes wish Crysis, Far Cry 2, GTA4 were never released, well not really, but they are CPU dependent and I guess justify this line of reasoning. If they were what everyone was playing, I'd give more weight to this argument.

Again, since it seems to go right through you, there are plenty of games which see moderate improvements with a higher end card. If you bias all your data just to show your one point of view, you do no justice to your argument.

one point of view? are you freaking kidding me? so now Crysis is cpu limited like GTA 4? Crysis is mostly a gpu limited game unless your cpu is REALLY slow. heck I didnt even test GTA 4 but you can be sure that it can be added to the list of not being very playable with a 4200 X2. anyway, I even used Fallout 3 which will run okay even on a single core cpu so my tests arent just one sided. those tests were run just to show how much performance is wasted even at 1920x1080 on highest playable settings when you have a decent card but a poky cpu like the 4200 X2. those games represent modern games that people would want to play. so using your logic we will just tell someone with a piss slow cpu that they should just get a very fast card anyway and just ignore many modern games? if someone is wanting to upgrade their gpu its because they are having trouble playing modern games. well those same modern games require a decent cpu also. its really best just to deal with this issue on a case by case basis.