
How bottlenecked will this be?

decadentia84

Junior Member
Nov 22, 2009
23
0
0
Currently have

AMD X2 5600+ @ 2.8GHz
8800GT
4GB DDR2 G.Skill

I just ordered my Gigabyte ATI 5850 and am patiently awaiting its availability. I'm thinking of upgrading during the Boxing Day sales to an AMD Phenom X4 965 or an i5 750. Regardless, I'm just wondering how big a bottleneck my current CPU will place on this video card.

Games run at 1920x1200
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
VERY bottlenecked. You will be wasting so much of what a 5850 can do that it's ridiculous to even use one with that CPU. Please upgrade to that Phenom II or i5 as soon as possible.
 

ScorcherDarkly

Senior member
Aug 7, 2009
450
0
0
Colossal.

That doesn't mean games won't run fine, but the CPU will be holding you back from the 5850's potential.
 

ShreddedWheat

Senior member
Apr 3, 2006
386
0
0
I've got an E5200 OC'd to 3.4 with 4 gigs of RAM. I went from an 8800GS to a 4850 and noticed a good jump, and a bigger jump going from the 4850 to the 5850. I would overclock that CPU of yours. I also game at 1920x1200. With the 4850 I ran a mix of high and very high in DX9 and played Crysis in the mid-to-high 20s. With the 5850 I play everything on very high and get high 20s to low 40s, but mainly stay in the mid 30s. It made Crysis much better. Far Cry 2 almost doubled. Haven't tried many other games yet. I plan to eventually upgrade the CPU, but for now it is well worth the upgrade from the 4850. My 2 cents.
 

Arkaign

Lifer
Oct 27, 2006
20,646
1,154
126
If your board is compatible, you'd see a much bigger upgrade jumping out of that current processor than upgrading the 8800GT at this point. Remember, a 2.8GHz X2 of that gen is only about as fast as a 3-year-old E6400 Core 2 Duo @ 2.13GHz w/ 2MB L2.

If your board supports it, a Phenom II 720BE or 9xx BE would be a giant freakish improvement. Then you can move to a better video card as prices come tumbling after Xmas (my prediction anyway). 5850/5870 are not in very good supply, and they are too expensive at the moment IMO. I hope Nvidia's response turns out decent, it would be really nice to see 5850-level cards drop to around the $200 mark.

Is your order already locked-in? If so, it's not the end of the world, the 5850 is a great performer, I just don't think you'll see anywhere near the true potential of it until you get out of that old X2.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I've got an E5200 OC'd to 3.4 with 4 gigs of RAM. I went from an 8800GS to a 4850 and noticed a good jump, and a bigger jump going from the 4850 to the 5850. I would overclock that CPU of yours. I also game at 1920x1200. With the 4850 I ran a mix of high and very high in DX9 and played Crysis in the mid-to-high 20s. With the 5850 I play everything on very high and get high 20s to low 40s, but mainly stay in the mid 30s. It made Crysis much better. Far Cry 2 almost doubled. Haven't tried many other games yet. I plan to eventually upgrade the CPU, but for now it is well worth the upgrade from the 4850. My 2 cents.
Even with his 5600 X2 overclocked to 3.2 or so, a 5850 would be massively wasted in that system.
 

Arkaign

Lifer
Oct 27, 2006
20,646
1,154
126
I've got an E5200 OC'd to 3.4 with 4 gigs of RAM. I went from an 8800GS to a 4850 and noticed a good jump, and a bigger jump going from the 4850 to the 5850. I would overclock that CPU of yours. I also game at 1920x1200. With the 4850 I ran a mix of high and very high in DX9 and played Crysis in the mid-to-high 20s. With the 5850 I play everything on very high and get high 20s to low 40s, but mainly stay in the mid 30s. It made Crysis much better. Far Cry 2 almost doubled. Haven't tried many other games yet. I plan to eventually upgrade the CPU, but for now it is well worth the upgrade from the 4850. My 2 cents.
Your E5200 is also a WORLD faster than the X2 5600+ @ 2.8GHz that the OP is using. Remember, the E5xxx series is a C2D with 2MB L2, just like the original E6300/E6400 C2Ds, which, according to benches such as AnandTech's, show a massive IPC advantage over the old X2s. The old X2s ran out of steam right around the 3GHz mark anyway, outside of luck and/or fantastic cooling. For an old X2 to match your E5200 @ 3.4GHz it would have to be clocked well over 4GHz, probably nearly 5GHz. The newer AM2+/AM3 Phenom II is a whole different ballgame, as are the new Athlons based on the Phenom II design.
 

happy medium

Lifer
Jun 8, 2003
14,387
475
126
E5200 @ 3.4 = X2 5600 @ 3.9
E8400 @ 3.4 = X2 5600 @ 4.3

I've done this scaling comparison already.

An 8800GT is by far the bottleneck @ 1920x1080 with an X2 5600 @ 2.8 in modern games.
I've also done this scaling, but at 1600x1200.

A 5850 is a waste with an X2 5600. I agree.

But! It will run most if not all games over 30fps @ high detail.
And so will a GTX 260 or HD 4870 1GB.
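Those equivalent-clock pairs imply a rough per-clock (IPC) advantage for the Core 2 parts, which is easy to back out. A minimal sketch in Python, assuming the clock-for-clock equivalences stated above hold:

```python
# Back out implied per-clock (IPC) ratios from the equivalent-clock pairs:
# E5200 @ 3.4 GHz ~ X2 5600+ @ 3.9 GHz, E8400 @ 3.4 GHz ~ X2 5600+ @ 4.3 GHz.
def implied_ipc_ratio(c2d_clock_ghz, equivalent_x2_clock_ghz):
    """X2 clock needed to match the Core 2, divided by the Core 2's clock."""
    return equivalent_x2_clock_ghz / c2d_clock_ghz

e5200_ratio = implied_ipc_ratio(3.4, 3.9)  # ~1.15: E5200 does ~15% more per clock
e8400_ratio = implied_ipc_ratio(3.4, 4.3)  # ~1.26: E8400 does ~26% more per clock

print(f"E5200 per-clock advantage over the X2: {(e5200_ratio - 1) * 100:.0f}%")
print(f"E8400 per-clock advantage over the X2: {(e8400_ratio - 1) * 100:.0f}%")
```

These ratios are only as good as the forum equivalences they come from, but they show why raw clock speed alone is a poor way to compare the two architectures.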
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
E5200 @ 3.4 = X2 5600 @ 3.9
E8400 @ 3.4 = X2 5600 @ 4.3

I've done this scaling comparison already.

An 8800GT is by far the bottleneck @ 1920x1080 with an X2 5600 @ 2.8 in modern games.
I've also done this scaling, but at 1600x1200.

A 5850 is a waste with an X2 5600. I agree.

But! It will run most if not all games over 30fps @ high detail.
And so will a GTX 260 or HD 4870 1GB.
While we certainly agree a 5850 is silly, I think you are being too generous with the 5600 X2 CPU comparisons. When I had a 2.6 5000 X2, my E8500 at 1.6 basically matched or beat it in every gaming benchmark I ran. Also, at 1.6 it ALWAYS had a better minimum framerate than the 2.6 5000 X2. At 1.8 the E8500 was easily faster than the 2.6 X2 in every case. Basically, from experience and from looking at most benchmarks, I would say a 5600 X2 would have to be at 5.0 or better just to match an E8500 at 3.4.
 

decadentia84

Junior Member
Nov 22, 2009
23
0
0
If your board is compatible, you'd see a much bigger upgrade jumping out of that current processor than upgrading the 8800GT at this point. Remember, a 2.8GHz X2 of that gen is only about as fast as a 3-year-old E6400 Core 2 Duo @ 2.13GHz w/ 2MB L2.

If your board supports it, a Phenom II 720BE or 9xx BE would be a giant freakish improvement. Then you can move to a better video card as prices come tumbling after Xmas (my prediction anyway). 5850/5870 are not in very good supply, and they are too expensive at the moment IMO. I hope Nvidia's response turns out decent, it would be really nice to see 5850-level cards drop to around the $200 mark.

Is your order already locked-in? If so, it's not the end of the world, the 5850 is a great performer, I just don't think you'll see anywhere near the true potential of it until you get out of that old X2.
Alright, I'm quite primed to purchase a new motherboard/Phenom X4 965 (reusing my DDR2 RAM; I have a Gigabyte board picked out), so that should solve that problem. I guess I never realized the difference in performance between the new processors. Are the Phenoms faster per clock, or is it simply the extra cores/cache?

It's ordered, but I'm sure I can cancel it. Everything I've read indicates there won't be any price cuts for a long time (why would a company cut prices when demand is high and stock is low?). I really didn't want to wait until March or later next year to see a $50 price drop.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Alright, I'm quite primed to purchase a new motherboard/Phenom X4 965 (reusing my DDR2 RAM; I have a Gigabyte board picked out), so that should solve that problem. I guess I never realized the difference in performance between the new processors. Are the Phenoms faster per clock, or is it simply the extra cores/cache?

It's ordered, but I'm sure I can cancel it. Everything I've read indicates there won't be any price cuts for a long time (why would a company cut prices when demand is high and stock is low?). I really didn't want to wait until March or later next year to see a $50 price drop.
The Phenom II CPUs are MUCH better clock for clock compared to the old Athlon 64 architecture.
 

decadentia84

Junior Member
Nov 22, 2009
23
0
0
The Phenom II CPUs are MUCH better clock for clock compared to the old Athlon 64 architecture.
Alrighty, thanks for the information. A new motherboard and a Phenom will come cheaply, so no issue there. Appreciate the feedback. Now to decide whether or not to cancel this GPU and wait to see if these price cuts come to fruition...

P.S. From what I can tell, I'll be waiting at least a month before I get my hands on the video card anyhow.
 

Schmide

Diamond Member
Mar 7, 2002
5,367
283
126
Games run at 1920x1200
Regardless of everything else in this thread, this is what makes it viable: you should be able to enjoy plenty of eye candy (anti-aliasing/anisotropic filtering) regardless of any CPU limitations. You will do better with the PII X4, though.

If you can cancel, do. Upgrade your CPU first. Prices will go down.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Regardless of everything else in this thread, this is what makes it viable: you should be able to enjoy plenty of eye candy (anti-aliasing/anisotropic filtering) regardless of any CPU limitations. You will do better with the PII X4, though.
IMO it doesn't make buying a $300 card that you can't come remotely close to fully utilizing viable. A 2.8 5600 X2 is a poor CPU to pair with that level of card, and 1920 doesn't change that. Sure, at 1920 he is throwing less overall performance down the drain than he would at a lower res, but either way it's just way too much card for that CPU.

Yeah, hopefully prices will go down, but it could be a few more months before things get better. It's really not a good bang-for-the-buck time for the higher-end cards at all.
 

ShreddedWheat

Senior member
Apr 3, 2006
386
0
0
Thanks for the lesson on my end... I didn't realize that the Pentium Dual-Core was "that" much faster than an X2. Peace
 

BFG10K

Lifer
Aug 14, 2000
21,882
971
126
I wouldn't be so quick to write off that upgrade. I've just finished some underclocking tests on my E6850 (to be published soon), and even when underclocked to 2 GHz there was little to no impact on performance, because my GTX285 was 100% the bottleneck in most cases.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I wouldn't be so quick to write off that upgrade. I've just finished some underclocking tests on my E6850 (to be published soon), and even when underclocked to 2 GHz there was little to no impact on performance, because my GTX285 was 100% the bottleneck in most cases.
Please, not this again. lol. Even at 1920 and the highest playable settings, I can lower my E8500 down to 2.0 and lose 30% in some games, and that's just with a 192sp GTX260. A 5600 X2 is slightly slower than my E8500 at 2.0, and a 5850 is twice as fast as my 192sp GTX260, so it would be beyond ridiculous to use a 5850 with his setup. There are some games where he would literally only get around 50% of what a 5850 is capable of, so yes, this should be written off.


EDIT: I do see your point if running tons of AA, but really, overall the OP's CPU is not a good match at all for a 5850.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I knew it. I can't wait to read this.
Well, how about we start with Batman. lol


1920x1080 all very high settings, high physx no AA

E8500 at 2.0 GTX260 at 576/1188/1990

Frames, Time (ms), Min, Max, Avg
2092, 63045, 21, 42, 33.183


E8500 at 2.0 GTX260 at 666/1392/2200

Frames, Time (ms), Min, Max, Avg
2086, 61778, 21, 40, 33.766


That's right, basically NO difference outside of margin of error over half a dozen runs. Remember, his 5600 X2 would be even slower than my E8500 at 2.0. Even a GTX260 is clearly bottlenecked in this game by a CPU like his, even at 1920 with very high settings. Overclocking this old 192sp GTX260 did nothing with a CPU like his, so a 5850 would be beyond silly. Sure, a 5850 would allow him to turn on AA, but so what, when a faster CPU would make the entire game much more playable. A 5850 would easily have about 50% of its potential go right down the drain in this game with a 5600 X2.




Now let's put the CPU back at stock 3.16 and see what happens.

E8500 at 3.16 GTX260 at 666/1392/2200

Frames, Time (ms), Min, Max, Avg
2742, 66301, 31, 53, 41.357
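As a sanity check on the benchmark tables above, the average column is just total frames divided by elapsed time. A quick sketch in Python, using the numbers from these three runs:

```python
# FRAPS-style summary: average fps = total frames / elapsed seconds.
def avg_fps(frames, time_ms):
    return frames / (time_ms / 1000.0)

runs = {
    "2.0 GHz CPU, stock GTX260":        (2092, 63045),
    "2.0 GHz CPU, overclocked GTX260":  (2086, 61778),
    "3.16 GHz CPU, overclocked GTX260": (2742, 66301),
}

for label, (frames, ms) in runs.items():
    print(f"{label}: {avg_fps(frames, ms):.3f} fps")
# The first two runs land within ~0.6 fps of each other (the GPU overclock
# did nothing), while the CPU bump alone adds ~8 fps: a classic CPU bottleneck.
```

The computed averages reproduce the posted 33.183, 33.766, and 41.357 fps figures, so the table is internally consistent.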
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
well how about we start with Batman. lol


1920x1080 all very high settings, high physx no AA

E8500 at 2.0 GTX260 at 576/1188/1990

Frames, Time (ms), Min, Max, Avg
2092, 63045, 21, 42, 33.183


E8500 at 2.0 GTX260 at 666/1392/2200

Frames, Time (ms), Min, Max, Avg
2086, 61778, 21, 40, 33.766
I bet if AA were increased we would see some differences there.

Running with no AA is pretty easy on a GPU.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Please, not this again. lol. Even at 1920 and the highest playable settings, I can lower my E8500 down to 2.0 and lose 30% in some games, and that's just with a 192sp GTX260. A 5600 X2 is slightly slower than my E8500 at 2.0, and a 5850 is twice as fast as my 192sp GTX260, so it would be beyond ridiculous to use a 5850 with his setup. There are some games where he would literally only get around 50% of what a 5850 is capable of, so yes, this should be written off.
How much AA/AF are you running, though?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I bet if AA was increased we would be seeing some differences there.

Running no AA is pretty easy on a GPU.
So what? Do you want AA and 30 fps, or the opportunity to get much more than that with a competent CPU? The CPU has already brought the game down significantly. With a better CPU he can actually utilize a 5850 and crank up as many settings as he wants. The POINT is that with a video card twice as fast as mine and an even slower CPU, even more performance would be wasted.



EDIT: Well, here you go. I ran some numbers in Batman with 4x AA.

1920x1080 all very high settings, 4x AA and high physx



E8500 at 2.0 GTX260 at 576/1188/1990

Frames, Time (ms), Min, Max, Avg
1786, 63457, 18, 34, 29.035


E8500 at 2.0 GTX260 at 666/1392/2200
Frames, Time (ms), Min, Max, Avg
2086, 67780, 18, 36, 30.476


E8500 at 3.16 GTX260 at 666/1392/2200
Frames, Time (ms), Min, Max, Avg
2286, 65699, 27, 43, 34.795

This is not even a very CPU-intensive game, and it still makes more sense to have a decent CPU when you look at the minimums. Again, my CPU at 2.0 is faster than his 5600 X2, and a 5850 would be twice as fast as my 192sp GTX260. It is foolish to run a card like a $300 5850 with a CPU like a 5600 X2.
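To put the 4x AA runs above in perspective, the CPU bump from 2.0 to 3.16 GHz moves the minimum from 18 to 27 fps and the average from 30.476 to 34.795 fps. A small Python sketch computing those relative gains:

```python
# Relative gain from raising the CPU clock (GPU held at 666/1392/2200),
# using the 4x AA Batman runs quoted above.
def pct_gain(before, after):
    return (after / before - 1) * 100

min_gain = pct_gain(18, 27)          # minimum fps: 18 -> 27
avg_gain = pct_gain(30.476, 34.795)  # average fps: 30.476 -> 34.795

print(f"minimum fps gain: {min_gain:.0f}%")   # 50%
print(f"average fps gain: {avg_gain:.1f}%")   # 14.2%
# The minimums move far more than the averages, which is why a faster CPU
# "feels" smoother even when the average barely changes.
```

This is just arithmetic on the posted numbers, but it makes the minimum-framerate argument concrete.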
 

Schmide

Diamond Member
Mar 7, 2002
5,367
283
126
IMO it doesn't make buying a $300 card that you can't come remotely close to fully utilizing viable.
The point being, it is the first step in a series of upgrades. If it's part of the final system, the GPU is the best performance per dollar you could give to any one component, especially at higher resolutions with anti-aliasing and anisotropic filtering.
 

AzN

Banned
Nov 26, 2001
4,112
0
0
Even with his 5600 X2 overclocked to 3.2 or so, a 5850 would be massively wasted in that system.
I wouldn't call it a waste. I've probably explained this to you and others before, but we have guys who like to repeat the same question when they could just search and get the answer faster than typing out similar posts day in, day out.

A 5850 is $300. A new mobo/CPU/RAM is $300 and up.

Between those options, the GPU is the way to go as far as improving your frame rates, especially at 1920x1200. Holding it back? Sure. You might hit 120fps with an E8400 @ 4GHz but only 70fps with an X2 5600, but that's when the CPU can tax all the GPU power available, usually at lower resolutions with no AA. As you increase resolution and AA, that gap becomes smaller and smaller, and it's pretty much nonexistent on an 8800GT at that resolution even with a much faster CPU.

Holding it back? Sure. But the improvement far supersedes a CPU and mobo upgrade, which should be the poster's next upgrade to further improve his gaming.
 

happy medium

Lifer
Jun 8, 2003
14,387
475
126
So what? Do you want AA and 30 fps, or the opportunity to get much more than that with a competent CPU? The CPU has already brought the game down significantly. With a better CPU he can actually utilize a 5850 and crank up as many settings as he wants. The POINT is that with a video card twice as fast as mine and an even slower CPU, even more performance would be wasted.



EDIT: Well, here you go. I ran some numbers in Batman with 4x AA.

1920x1080 all very high settings, 4x AA and high physx



E8500 at 2.0 GTX260 at 576/1188/1990

Frames, Time (ms), Min, Max, Avg
1786, 63457, 18, 34, 29.035


E8500 at 2.0 GTX260 at 666/1392/2200
Frames, Time (ms), Min, Max, Avg
2086, 67780, 18, 36, 30.476


E8500 at 3.16 GTX260 at 666/1392/2200
Frames, Time (ms), Min, Max, Avg
2286, 65699, 27, 43, 34.795

This is not even a very CPU-intensive game, and it still makes more sense to have a decent CPU when you look at the minimums. Again, my CPU at 2.0 is faster than his 5600 X2, and a 5850 would be twice as fast as my 192sp GTX260. It is foolish to run a card like a $300 5850 with a CPU like a 5600 X2.
The one thing this shows me is that even with my CPU clocked @ 3.5, I can't run Batman without dipping under 30 fps? You call that game not CPU-intensive?
I can run my CPU @ 2.0 (E8500 @ 1.6) and I never dip below 30 fps when I benchmark Far Cry 2. How do you explain that?
 
