I'm asking this question again, would a dual core E7500 bottleneck a 5870?

Page 3 - AnandTech Forums

thebomb

Member
Feb 16, 2010
101
0
0
Why is this a debate instead of a link to this:
http://www.xbitlabs.com/articles/video/display/radeon-hd5870-cpu-scaling.html

Where they specifically tested CPU scaling on a 5870.

The answer is yes: an E7500 will limit you in at least half the games out there with a 5870, and definitely in RTS/sandbox-style games where AI and pathfinding are heavy loads.

Did you even read your own link?

Now, I’ll try to sum everything up. First of all, the single Radeon HD 5870 does not depend as much on the CPU as it is supposed to. According to my tests (the left part of the diagrams), the Radeon HD 5870 is quite satisfied with an overclocked dual-core CPU manufactured two years ago. And you can even leave the CPU at its default frequency at the high-quality settings and 1920x1200. The only exceptions are Left 4 Dead and Warhammer 40000: Dawn of War II. The former game is not a problem for modern top-end Radeons while the latter, on the contrary, calls for a quad-core CPU, preferably from Intel.

From the benchmarks in your link, one would lose at most 15-20 FPS in a strategy game going from an i7 down to a C2D (@1920x1200). In all other games the FPS was exactly the same (in some cases the C2D yielded MORE FPS than the i7, which I thought was weird).
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Why is this a debate instead of a link to this:
http://www.xbitlabs.com/articles/video/display/radeon-hd5870-cpu-scaling.html

Where they specifically tested CPU scaling on a 5870.

The answer is yes: an E7500 will limit you in at least half the games out there with a 5870, and definitely in RTS/sandbox-style games where AI and pathfinding are heavy loads.

The key is "limit you". If you look at those benches, the difference a lot of the time is 70 fps for the Core 2 and 88 for the i7 @ 1280x1024.

@ 1600x1050 it would be less.

It may limit you some, but nowhere near unplayable.

Also, it makes a big difference what game you're playing, quad core vs. dual core.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Did you even read your own link?



From the benchmarks in your link, one would lose at most 15-20 FPS in a strategy game going from an i7 down to a C2D (@1920x1200). In all other games the FPS was exactly the same (in some cases the C2D yielded MORE FPS than the i7, which I thought was weird).
The OP is at 1600 though, and you can pick games to back either argument. Notice too that there are not many CPU-intensive games being tested. Red Faction Guerrilla, GTA 4, Prototype, Ghostbusters, Bad Company 2 and a couple of others would have shown a very large difference even at 1920.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
The key is "limit you". If you look at those benches, the difference a lot of the time is 40 fps for the Core 2 and 55 for the i7 @ 1280x1024.

It may limit you some, but nowhere near unplayable.

Also, it makes a big difference what game you're playing, quad core vs. dual core.
Well, in the case of the OP it's not about whether his CPU is playable; it's about whether it is strong enough to give a playable difference at just 1680 between a 5850 and a 5870. To me the answer is: only in some cases, but overall not worth the 100 bucks difference at all. At 1920, sure, there would be a much better reason, but at 1600 I just don't see any real value in that with his CPU. Remember, what little advantage the 5870 would provide in some games at 1680 with his CPU could be wiped out with a decent OC on the 5850.
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Well, in the case of the OP it's not about whether his CPU is playable; it's about whether it is strong enough to give a playable difference at just 1680 between a 5850 and a 5870. To me the answer is: only in some cases, but overall not worth the 100 bucks difference at all. At 1920, sure, there would be a much better reason, but at 1600 I just don't see any real value in that with his CPU. Remember, what little advantage the 5870 would provide in some games at 1680 with his CPU could be wiped out with a decent OC on the 5850.

Not worth the extra money, you are right there, unless he's upgrading his monitor or CPU relatively soon.

If not, just get the 5850.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
What will kill your performance is max AA. If you go 16x AA, no video card can handle that and maintain 60 fps without dropping below it, not Fermi, not even a 5970. When a card can run CryENGINE's Crysis at 60 fps and never drop below that, that is the video card you want.

As far as AA goes, you're going to lose performance no matter what CPU you have.

And no, it will not be a bottleneck unless the CPU is at stock clocks. Your res is a low res, so you'll gain performance from that. I think the 5870 would be great for you. Just make sure you have a good PSU, and gl my friends, thx.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
So I guess you finally realized that page had 1680 results? I see you still can't count, though, because that is FOUR, not three. I have seen you and your buddy's site and it's pretty funny. He claimed no bottleneck with a Core 2 at 2.0, which was absolute BS.

Anyway, again: are you really still going to say that an E7500 does not give up 20-25% to an i7 while using a 5870 at 1680 in many games? So why pay 100 bucks more for a card you will see little benefit from?

At 1920 I would see your point much better, but at 1680 I just don't think it's worth it at all. Also, it's a shame he won't OC that CPU, because it has a lot more potential with a card like the 5850 or 5870 in many newer games.

I see, you have *4* (total) cherry-picked benches that sort of support your claim instead of 3. My bad. :p

FIRST of ALL, he games at a higher resolution than 1680:
I plan to game at 1600 x 1200 with all settings maxed full AA and AF and so on.
Secondly, I *always* recommend O/C'ing the CPU on stock voltage, as it is a free way to get safe performance; it never hurts.

I did extensive testing multiple times with 5 CPUs (from dual to X3 to quad; Phenom vs. i7 vs. Penryn) from stock to 4.0 GHz, with everything from a GTX 280 to 4870-X3 TriFire, and the differences due to the CPU are way less than you imagine, over hundreds of benches.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As far as AA goes, you're going to lose performance no matter what CPU you have.

Wow, this thread is like beating a dead horse and getting nowhere. Does no one care about minimum frames?

Look at the benchmarks from Xbitlabs with high AA at 1920x1200, Core 2 Duo @ 3.4 GHz vs. Core i7 @ 4.1 GHz:

World in Conflict = 17 vs. 29
Crysis = 11 vs. 20
Warhammer: Dawn of War 2 = 12 vs. 31

What difference does it make if you get 70 frames average on a C2D when you are at 12 frames minimum? Of course not all games are going to behave this way, but it's important to look at mins.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I haven't read every post. Sorry if someone else has made this point already.

Just because it is possible to indiscriminately crank up settings and filtering on your graphics card until it finally pukes at a particular resolution, doesn't mean that you need a faster card. It means you need to engage your brain before you adjust your settings.

You might be surprised to find that a 5770 will play anything at 1080p with very good visual quality and frame rates, whereas at 16x AA, max tessellation, soft shadows, etc., CrossFired 5970s might puke their guts up.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I see, you have *4* cherry-picked benches that sort of support your claim instead of 3. My bad. :p

FIRST of ALL, he games at a higher resolution than 1080:

Secondly, I *always* recommend O/C'ing the CPU on stock voltage, as it is a free way to get safe performance; it never hurts.

I did testing with 5 CPUs (dual to quad; Phenom vs. i7 vs. Penryn) from stock to 4.0 GHz, with everything from a GTX 280 to 4870-X3 TriFire, and the differences due to the CPU are way less than you imagine, in hundreds of benches.
I assume you mean 1680, because 1080 would mean 1920x1080, which is higher than 1600x1200.

Yes, in some games it won't matter much, but in many of the newer, more CPU-intensive games it most certainly will. Again, in the games where the 5870 actually does have an advantage with his current setup, he could just OC the 5850 and easily eliminate that small gap. In the more CPU-intensive games the 5850 will give him the same playable experience. It's just NOT worth 100 bucks more based on his current setup, IMO.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I assume you mean 1680, because 1080 would mean 1920x1080, which is higher than 1600x1200.

Yes, in some games it won't matter much, but in many of the newer, more CPU-intensive games it most certainly will. Again, in the games where the 5870 actually does have an advantage with his current setup, he could just OC the 5850 and easily eliminate that small gap. In the more CPU-intensive games the 5850 will give him the same playable experience. It's just NOT worth 100 bucks more based on his current setup, IMO.

Again, you are making broad generalizations with nothing to back it up.
- I edited my typo; you quoted an early draft of my post. 16x12 is a higher resolution than 16x10.

I can back up what I am saying about CPU scaling and video cards of the 5850/5870 class, and I would recommend the 5870 as well as the 5850; there would be a small positive performance increase with the 5870 over the 5850 with a near-3 GHz CPU.

The question is: is it "worth it"?
- To me, yes; to you, no. But we are not the ones making the upgrade.

You also recommend overclocking the 5850 to "make up" for its performance deficit compared to the 5870; don't forget that the 5870 also overclocks. One of mine is at 925/1300, the other at 975/1300.
 
Last edited:

mhouck

Senior member
Dec 31, 2007
401
0
0
Wow, this thread is like beating a dead horse and getting nowhere. Does no one care about minimum frames?

Look at the benchmarks from Xbitlabs with high AA at 1920x1200, Core 2 Duo @ 3.4 GHz vs. Core i7 @ 4.1 GHz:

World in Conflict = 17 vs. 29
Crysis = 11 vs. 20
Warhammer: Dawn of War 2 = 12 vs. 31

What difference does it make if you get 70 frames average on a C2D when you are at 12 frames minimum? Of course not all games are going to behave this way, but it's important to look at mins.

I think this really should be the basis of judging a worthwhile upgrade.

When I run Heaven I get an average FPS of 69 with the GPU at stock, with a min of 20.9 and a max of 184. I'm glad my average is above my refresh rate of 60 Hz, but the dip to 20.9 is noticeable. The max of 184 is irrelevant to how I'm going to experience gameplay, but it skews the average FPS to make it seem like I should not notice any glitches. This is obviously not going to be the case.
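To put rough numbers on that point, here's a quick sketch (the frame-time trace is made up for illustration, not mhouck's actual data) of how a short burst of very fast frames props up the average while one slow stretch sets the minimum:

```python
# Hypothetical frame-time trace (ms): mostly ~14 ms (~70 fps), with a
# brief burst of very fast frames and two slow ~50 ms frames.
frame_times_ms = [14.0] * 96 + [5.0, 5.0] + [50.0, 48.0]

frames = len(frame_times_ms)                 # 100 frames total
seconds = sum(frame_times_ms) / 1000.0
avg_fps = frames / seconds                   # what benchmark tools report
min_fps = 1000.0 / max(frame_times_ms)       # worst single frame

print(round(avg_fps, 1), round(min_fps, 1))  # 68.9 20.0
```

The average looks comfortably close to 60 Hz, yet the worst frame is a very visible 20 fps hitch, which is exactly why the max and average can mislead.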
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is obviously not going to be the case.

You got it, man! My very long post on the other page highlighted the exact same example with Dirt 2 and my 4890, which gets ~55 frames avg with 2x AA or 4x AA, but 47 min vs. 38 min.

Sure, you can bring any GPU to its knees by cranking AA. However, no one in their right mind is going to be cranking AA when they are dipping/jerking in the game due to mins. :) Averages won't reflect this.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I have seen you and your buddy's site and it's pretty funny. He claimed no bottleneck with a Core 2 at 2.0, which was absolute BS.
It’s not BS at all. I posted objective measurements from 17 games at the actual settings I play them at.

We’ve been over this at least four times now; if you think the log files are wrong, post up evidence that demonstrates the games’ reporting mechanism is faulty. Otherwise retract your claim.

Just because you can’t understand and/or accept the results, it doesn’t make them BS. I imagine it must be very hard for you when you go around parroting the same five titles, then suddenly seeing 17 games basically not budging. Those 17 games must all be “faulty”, right? :rolleyes:

Notice too that there are not many CPU-intensive games being tested. Red Faction Guerrilla, GTA 4, Prototype, Ghostbusters, Bad Company 2
Prove it; let’s see the benchmarks.

I can accept GTA 4 is CPU limited and so is Arma 2, but Arma 2 runs like shit on any processor, including an i7 overclocked to 4 GHz. So I’ll give you those two games.

Bad Company 2 has been benchmarked by LegionHardware and XBit, and both sides clearly showed a massive reliance on the GPU, not the CPU. In fact, XBit was getting scaling from a 5970 over a 5870 at just 1280x1024. I’m not sure how anyone can run around and claim the game is CPU limited based on that result.

So then, show us the benchmarks for the rest of the games to back your claims.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I upgraded from an E5200 @ 4 GHz to an i3 @ 4.3 GHz. The only game that benefited was GTA 4, and maybe Dragon Age, and it wasn't game-changing in those 2 games. Dragon Age was already smooth, and GTA 4 jumped by 20% in frame rate but still crawls in some scenes... Bad Company didn't benefit for me at all either.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
It’s not BS at all. I posted objective measurements from 17 games at the actual settings I play them at.

We’ve been over this at least four times now; if you think the log files are wrong, post up evidence that demonstrates the games’ reporting mechanism is faulty. Otherwise retract your claim.

Just because you can’t understand and/or accept the results, it doesn’t make them BS. I imagine it must be very hard for you when you go around parroting the same five titles, then suddenly seeing 17 games basically not budging. Those 17 games must all be “faulty”, right? :rolleyes:


Prove it; let’s see the benchmarks.

I can accept GTA 4 is CPU limited and so is Arma 2, but Arma 2 runs like shit on any processor, including an i7 overclocked to 4 GHz. So I’ll give you those two games.

Bad Company 2 has been benchmarked by LegionHardware and XBit, and both sides clearly showed a massive reliance on the GPU, not the CPU. In fact, XBit was getting scaling from a 5970 over a 5870 at just 1280x1024. I’m not sure how anyone can run around and claim the game is CPU limited based on that result.

So then, show us the benchmarks for the rest of the games to back your claims.
I have posted benchmarks plenty of times but all you do is come up with excuses. I showed a 40-50% difference in Far Cry 2 at 1920 with 2x AA and you told me I needed more AA. lol

All I was doing was putting the games on realistic settings for a GTX 260, in that game and the other ones I tested. For example, in Red Faction Guerrilla the min framerate was cut in half by running my CPU at 2.0 while using all high settings at 1920. In Ghostbusters, the game was almost unplayable at times and the min framerate was basically cut in half there too with my CPU at 2.0, highest settings, 1920. All you will do is tell me to put more AA on and make it more GPU limited, to help mask the fact that the CPU is indeed a limitation.

An i7 gets nearly twice the frame rate in Bad Company as a decent dual core at 1680, so I'm pretty sure even at a more realistic 1920 the gap would still be pretty wide. Not saying that the dual core isn't playable, but the gap is still pretty large in that game.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I have posted benchmarks plenty of times but all you do is come up with excuses. I showed a 40-50% difference in Far Cry 2 at 1920 with 2x AA and you told me I needed more AA. lol

All I was doing was putting it on realistic settings for a GTX 260. All you will do is tell me to make the settings more GPU limited, to help mask the fact that the CPU is indeed a limitation.

An i7 gets nearly twice the frame rate in Bad Company as a decent dual core at 1680, so I'm pretty sure even at a more realistic 1920 the gap would still be pretty wide. Not saying that the dual core isn't playable, but the gap is still pretty large in that game.

You mean the same 4 tired benches over and over.

The rest of the time you just broadly generalize, and sadly you are wrong about your Core i7 being much of an upgrade from a C2Q or C2D. :p
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You mean the same 4 tired benches over and over.

The rest of the time you just broadly generalize, and sadly you are wrong about your Core i7 being much of an upgrade from a C2Q or C2D. :p
Nope, I mean the ones I personally test. You and your little buddy just make BS excuses every time.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I upgraded from an E5200 @ 4 GHz to an i3 @ 4.3 GHz. The only game that benefited was GTA 4, and maybe Dragon Age, and it wasn't game-changing in those 2 games. Dragon Age was already smooth, and GTA 4 jumped by 20% in frame rate but still crawls in some scenes... Bad Company didn't benefit for me at all either.
Well, with a stronger GPU you would in many games. Also, in BC2 your CPU is way slower than a true quad like an i5 750. Every game is different, and in some your CPU will be very strong and give up little to nothing compared to an i5/i7 quad.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Well, with a stronger GPU you would in many games. Also, in BC2 your CPU is way slower than a true quad like an i5 750. Every game is different, and in some your CPU will be very strong and give up little to nothing compared to an i5/i7 quad.

http://techreport.com/r.x/cpu-roundup-2010q1/dirt2-oc.gif


Not really. An overclocked i3 can easily match an i7, at least in games.



http://www.legionhardware.com/images/review/Battlefield_Bad_Company_2_Tuning_Guide/CPU.png


BC2 seems plenty fast even with a stock dual core, averaging 60+ fps as long as you have enough GPU. My overclocked CPU would easily match an i7 920 in BC2. So, not really.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Nope, I mean the ones I personally test. You and your little buddy just make BS excuses every time.

I haven't seen the ones you personally test. However, you have come to the wrong conclusions and are posting misinformation about needing a much stronger CPU than is necessary.

The i7 is nothing special for gaming.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
http://techreport.com/r.x/cpu-roundup-2010q1/dirt2-oc.gif

Not really. An overclocked i3 can easily match an i7, at least in games.



http://www.legionhardware.com/images/review/Battlefield_Bad_Company_2_Tuning_Guide/CPU.png

BC2 seems plenty fast even with a stock dual core, averaging 60+ fps as long as you have enough GPU. My overclocked CPU would easily match an i7 920 in BC2. So, not really.
I said nothing about a fast dual core not providing a decent framerate in BC2. Heck, a fast dual core is playable in pretty much every situation in every game. This thread is about whether the OP would see a playable difference between a 5850 and a 5870 at 1600 with an E7500. In BC2 there is no way he would even get one more fps when he is already 100% CPU limited at 1600 in that game. Heck, look at the faster quads getting TWICE the framerate. Sure, you could turn on much more AA and drop that gap way down, but the quads would still be getting close to 50% better in this particular game.
 
Last edited:

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
From my own tests on my E8400 @ 3.9 GHz, the dual core bottlenecks the 5870 somewhat... but I'll confirm this with further testing on my forthcoming i7 setup, and hopefully a PCIe 1.0 vs. PCIe 2.0 comparison soon.

I'd suggest you just go with a quad: an i5, a Phenom X4 955/965, or even an i7 setup. It'll leave you a lot more headroom.
 
Last edited:

AzN

Banned
Nov 26, 2001
4,112
2
0
I said nothing about a fast dual core not providing a decent framerate in BC2. Heck, a fast dual core is playable in pretty much every situation in every game. This thread is about whether the OP would see a playable difference between a 5850 and a 5870 at 1600 with an E7500. In BC2 there is no way he would even get one more fps when he is already 100% CPU limited at 1600 in that game. Heck, look at the faster quads getting TWICE the framerate.

I wouldn't say a faster GPU would make a 0 fps difference; the game is very much GPU limited, and even a 5870 has a hard time maxing out everything with AA.

However, BC2 does love quad cores.

DX_04.png
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I haven't seen the ones you personally test. However, you have come to the wrong conclusions and are posting misinformation about needing a much stronger CPU than is necessary.

The i7 is nothing special for gaming.
I use the i7 example for comparisons because it is the fastest solution. If the i5/i7 offered nothing, then I am sure tech sites would just stick with a 2.0 Core 2 Duo, since you and your buddy think it makes no difference. And here is a bench I post almost every time he brings up that 2.0 Core comment.

Far Cry 2 Settings: Demo(Ranch Long), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(2x), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)

E8500 @ 2.0
Total Frames: 11808, Total Time: 284.02s
Average Framerate: 41.57
Max. Framerate: 84.90 (Frame:1851, 26.47s)
Min. Framerate: 23.28 (Frame:5683, 125.34s)

E8500 @ 3.8
Total Frames: 16568, Total Time: 284.01s
Average Framerate: 58.34
Max. Framerate: 114.58 (Frame:4, 0.04s)
Min. Framerate: 36.90 (Frame:7835, 125.13s)

That's a 40% increase in average and a 58% increase in minimum framerate from running my E8500 at 3.8 instead of 2.0, even at a GPU-limited 1920x1080 with very high settings and 2x AA. Don't forget that an even faster quad would still knock out a couple more fps on top of that. With a much faster GPU the difference would have been even greater too, as my GTX 260 is far from high end now. :eek:
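A quick sanity check on those percentages, just arithmetic on the Far Cry 2 numbers logged above (E8500 @ 2.0 GHz vs. 3.8 GHz):

```python
# Average and minimum framerates from the two Far Cry 2 runs above.
avg_20, avg_38 = 41.57, 58.34   # E8500 @ 2.0 vs. @ 3.8, average fps
min_20, min_38 = 23.28, 36.90   # E8500 @ 2.0 vs. @ 3.8, minimum fps

# Percentage gain from overclocking 2.0 -> 3.8.
avg_gain = (avg_38 - avg_20) / avg_20 * 100
min_gain = (min_38 - min_20) / min_20 * 100

print(f"avg +{avg_gain:.1f}%, min +{min_gain:.1f}%")  # avg +40.3%, min +58.5%
```

So the quoted 40% / 58% figures do follow from the posted logs.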

Gee, do you really want more of my benchmarks? Because then you can't complain about the 4 I linked to earlier that weren't even mine.