Legion Hardware - 5870 Crossfire CPU Scaling @ 2560x1600

Page 3

AzN

Banned
Nov 26, 2001
4,112
2
0
Even without AA, some of these CPUs are producing horrendous mins. Sorry, not everyone wants to play with 15 fps mins and 8x AA. I'd rather take 30 fps mins and 0x AA.

So you're telling me you need an overclocked i7 @ 4GHz to get minimum frame rates up to playable levels?

So what about those who don't overclock their i7, or don't have an i7 at all? Are they stuck at 15 fps minimums with dual 5870s? :sneaky:
 


MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The Intel i5/i7s are just incredible in gaming. Looks like the best combo nowadays is an i5 750 with something like a 5970 for serious gaming. I've always known my X4 620 is weak at gaming, but the benchmarks seem to suggest the cache is really slowing it down. The i5 750 looks quite promising as my next upgrade.
Yep. The i5 750 is such an all-around awesome CPU for gaming right now, it really is "the new Q6600." A fantastic package for the price.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You seem to be infatuated with 1 or 2 benchmarks, like Toyota.

No, just addressing your comments that WiC doesn't need faster CPUs.

I've been hearing the myth that CPUs don't matter for gaming since the Doom 3 days: http://www.xbitlabs.com/articles/cpu/display/doom3-cpu_5.html

A64 3800+ = 38 fps mins (+36%)
A64 3000+ = 28 fps mins

This is because most websites only look at average framerates, which don't exhibit significant CPU differences but also don't represent playability on their own.
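To make the avg-vs-min point concrete, here is a minimal Python sketch with made-up frame times (nothing here is from the linked review): two traces end up with nearly the same average FPS but very different minimums.

# Hypothetical illustration with made-up frame times, not data from the article:
# two CPUs can post nearly the same average FPS while one dips far lower.
fast_cpu = [16.7] * 95 + [26.0] * 5   # per-frame times in milliseconds
slow_cpu = [15.5] * 95 + [66.0] * 5   # similar average, but nasty spikes

def avg_fps(frametimes_ms):
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def min_fps(frametimes_ms):
    return 1000.0 / max(frametimes_ms)

for name, trace in (("fast CPU", fast_cpu), ("slow CPU", slow_cpu)):
    print(name, round(avg_fps(trace)), "avg /", round(min_fps(trace)), "min")

This prints roughly 58 avg / 38 min for the first trace and 55 avg / 15 min for the second: an average-only chart would call them equal.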

So you're telling me you need an overclocked i7 @ 4GHz to get minimum frame rates up to playable levels?

Not exactly. Overclocked Core i3 and Core i5 chips also outperform the C2D and Phenom architectures. But yes, if you want a fast 5870 CF setup, you should pair it with the fastest CPUs to get the most out of that setup. If you are only using a single GTX480/5870, it matters less.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
No, just addressing your comments that WiC doesn't need faster CPUs.

I've been hearing the myth that CPUs don't matter for gaming since the Doom 3 days: http://www.xbitlabs.com/articles/cpu/display/doom3-cpu_5.html

A64 3800+ = 38 fps mins (+36%)
A64 3000+ = 28 fps mins

This is because most websites only look at average framerates, which don't exhibit significant CPU differences but also don't represent playability on their own.

I implied no such thing. However, I think some of those minimum frame rates don't look right.

The Doom 3 engine was multi-core optimized. I don't know why you even brought that up.



Not exactly. Overclocked Core i3 and Core i5 chips also outperform the C2D and Phenom architectures. But yes, if you want a fast 5870 CF setup, you should pair it with the fastest CPUs to get the most out of that setup. If you are only using a single GTX480/5870, it matters less.

According to Legion and what you've been saying, you need an i7 @ 4GHz to get playable minimum frame rates in the games you've pointed out.

It matters less with a single 5870/GTX480 now? :\

If you shift the GPU bottleneck, you are going to be limited by the CPU. I knew that already.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I implied no such thing. However, I think some of those minimum frame rates don't look right.

The Doom 3 engine was multi-core optimized. I don't know why you even brought that up.





According to Legion and what you've been saying, you need an i7 @ 4GHz to get playable minimum frame rates in the games you've pointed out.

It matters less with a single 5870/GTX480 now? :\

If you shift the GPU bottleneck, you are going to be limited by the CPU. I knew that already.

Well, I think what it shows is that when you crank up the settings in many modern games, it has an effect on the CPU too. Just like many slower dual-core CPUs have no issues running GTA 4 on lower settings but struggle at times with the higher settings. Heck, I remember telling you long ago that Crysis Warhead would dip really badly at times with my 5000 X2 compared to my E8500, and we argued about it. Again, a lot of these CPUs could handle games on low or medium settings and not dip down as badly. Nobody running a high-end GPU is going to settle for lower settings, though.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Well, I think what it shows is that when you crank up the settings in many modern games, it has an effect on the CPU too.


This is true. Under the 'performance' tab of many games, there are frequently settings that put a strain on the CPU rather than the GPU. They basically mix CPU and GPU options in the same area now.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Well, I think what it shows is that when you crank up the settings in many modern games, it has an effect on the CPU too. Just like many slower dual-core CPUs have no issues running GTA 4 on lower settings but struggle at times with the higher settings. Heck, I remember telling you long ago that Crysis Warhead would dip really badly at times with my 5000 X2 compared to my E8500, and we argued about it. Again, a lot of these CPUs could handle games on low or medium settings and not dip down as badly. Nobody running a high-end GPU is going to settle for lower settings, though.

You were using an 8600GT with your X2 and arguing that a faster CPU would let you play at high settings. ^_^

GTA 4 struggles with a Core 2 Duo even at lower settings.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
Big bunch of you know what.....

Why did they test World in Conflict 1.0.1.0 (patch 10) when patch 11 (1.0.1.1) is mandatory to play online and features a lot of performance improvements for AMD hardware?

WiC with patch 10 runs no better on a Radeon HD 5770 than on an HD 4830...

I think sometimes we lose perspective on what is really being tested, but oh well. I guess most people just care about the graphs and the line "this game was tested."

Any RTS that runs better on dual-core hardware than on QUAD-core hardware is not a good benchmark for CPU scaling. WiC is used by most reviewers because it is popular, looks good, requires potent hardware AND has built-in benchmarking. However, the game is so poorly coded and optimized that I am surprised it is even taken seriously as a benchmark.

Many of those results are hard to believe, but oh well.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Big bunch of you know what.....

Why did they test World in Conflict 1.0.1.0 (patch 10) when patch 11 (1.0.1.1) is mandatory to play online and features a lot of performance improvements for AMD hardware?

WiC with patch 10 runs no better on a Radeon HD 5770 than on an HD 4830...

I think sometimes we lose perspective on what is really being tested, but oh well. I guess most people just care about the graphs and the line "this game was tested."

Any RTS that runs better on dual-core hardware than on QUAD-core hardware is not a good benchmark for CPU scaling. WiC is used by most reviewers because it is popular, looks good, requires potent hardware AND has built-in benchmarking. However, the game is so poorly coded and optimized that I am surprised it is even taken seriously as a benchmark.

Many of those results are hard to believe, but oh well.

+1 for reminding us that the test parameters are just as important as, if not more important than, the test itself.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You were using an 8600GT with your X2 and arguing that a faster CPU would let you play at high settings. ^_^

GTA 4 struggles with a Core 2 Duo even at lower settings.
I was talking about after I had upgraded my GPU. You are thinking of our very first discussion, and that was about the original Crysis.

And yes, a Core 2 Duo can struggle in GTA 4 with lower settings, but it struggles WAY more with higher settings.
 
Dec 30, 2004
12,553
2
76
Big bunch of you know what.....

Why did they test World in Conflict 1.0.1.0 (patch 10) when patch 11 (1.0.1.1) is mandatory to play online and features a lot of performance improvements for AMD hardware?

WiC with patch 10 runs no better on a Radeon HD 5770 than on an HD 4830...

I think sometimes we lose perspective on what is really being tested, but oh well. I guess most people just care about the graphs and the line "this game was tested."

Any RTS that runs better on dual-core hardware than on QUAD-core hardware is not a good benchmark for CPU scaling. WiC is used by most reviewers because it is popular, looks good, requires potent hardware AND has built-in benchmarking. However, the game is so poorly coded and optimized that I am surprised it is even taken seriously as a benchmark.

Many of those results are hard to believe, but oh well.
+1 for reminding us that the test parameters are just as important as, if not more important than, the test itself.

That's a good point. That looked stupidly low for AMD; I knew there was no way.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The idea is that you would run the benchmark several times to weed out those 'glitches'. I know that if I am benchmarking and I see a very odd spike result in either direction that doesn't show up in additional runs, I'll throw that one out as an anomaly. I suppose there still could be cases where you have odd repeat results, but then it should be duly noted in the review notes.
But that’s the thing – you don’t always know what constitutes a glitch, and what’s legitimate. If out of four runs I get minimums of 8 FPS, 10 FPS, 12 FPS, and 14 FPS, which one is correct? If two games have a higher minimum for the GTX480 and another two have a higher minimum on the 5870, which games are glitching, and which are accurate?

If you’re going to start picking the results based on where you think things should be, that becomes a dangerous slippery slope. And if you start averaging the minimums, you aren’t really posting minimums anymore.

The fact is, an instantaneous minimum is totally unreliable by itself. In fact I’d argue it’s borderline deceptive and misleading to even put it up. It has the potential to totally mislead people by showing a completely irrelevant result.
However, no one would argue with getting an average of low frame-rates to see just how often it happens and for how long. That would be a great thing to have and I would welcome any review that does that.
A framerate plot is ideal because it puts minimums into context. That way everyone can look at the graph and decide if they’re satisfied with the frame distribution.

Failing that, the next best thing would be to put up the distribution of frames based on a significant portion of the benchmark run. For example, 10% of frames are 25 FPS or below (or whatever).
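For anyone logging their own runs, here is a minimal sketch of that kind of summary. It assumes a plain text log with one per-frame time in milliseconds per line; the file name and the 25 FPS cutoff are just placeholders, and this is not FRAPS's exact CSV layout.

# Summarize a frame-time log into "X% of frames are at or below Y FPS"
# instead of reporting a single instantaneous minimum.
def percent_at_or_below(frametimes_ms, fps_threshold=25.0):
    cutoff_ms = 1000.0 / fps_threshold      # frames slower than this fall below the threshold
    slow = sum(1 for t in frametimes_ms if t >= cutoff_ms)
    return 100.0 * slow / len(frametimes_ms)

# "frametimes.txt" is a placeholder name: one per-frame time in ms per line.
with open("frametimes.txt") as f:
    times = [float(line) for line in f if line.strip()]

print(f"{percent_at_or_below(times):.1f}% of frames are 25 FPS or below")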
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
But BFG, in real gameplay, it's these tiny drops in constant frame rate that create jerkiness and ruin the gaming experience completely.
The only thing you can infer from a minimum is that there’s one such drop in the entire benchmark. It doesn’t tell you how long the drop lasts, or whether there are more such drops.

The example of Borderlands is very likely an outlier for a game that hasn't been driver-optimized for the new GTX 4xx series.
I think you have the mistaken impression that aside from outlier games, a minimum is totally accurate and repeatable. That isn’t the case at all.

Borderlands is but one of a plethora of examples that can show this. In fact, some games (e.g. UT2003) can’t ever show an accurate minimum because they start measuring while the level data is still loading.

In each of these cases, where avg fps seem sufficient, you can see that minimum frame rates hamper playability.
You have no idea if that minimum hampers playability or not. You don’t know how long it happens for, or how many times it happens. Only a framerate graph will tell you that information.

If out of 10,000 rendered frames, 9999 are above 60 FPS but one is at 10 FPS, is that minimum significant?

And if another card gets 20 FPS at the same spot but 40 FPS in the other 9999 frames, I guess you’d be picking the second card because it has a higher minimum?

I sure wouldn’t, because for the other 9999 frames the first card is a lot better, and the minimum is insignificant given it only happens for one frame out of 10,000 in the benchmark run.
It's one thing to say mins don't matter in cut scenes. But in actual gameplay?
I’m not saying minimums don’t matter in gameplay. I’m saying that in order for them to matter, you need to demonstrate they happen for a significant time period, enough to impact gameplay. A single minimum value doesn’t show you that because by definition it’s a single data point from an entire benchmark run.
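Taking the hypothetical 10,000-frame example above at face value (these are made-up per-frame numbers, not a real benchmark), a quick sketch shows how the slower card "wins" the minimum:

# Made-up per-frame FPS values for the two hypothetical cards above.
card_a = [60.0] * 9999 + [10.0]   # fast overall, one ugly 10 FPS frame
card_b = [40.0] * 9999 + [20.0]   # slower overall, but a higher minimum

def true_avg_fps(per_frame_fps):
    total_seconds = sum(1.0 / f for f in per_frame_fps)  # time spent rendering each frame
    return len(per_frame_fps) / total_seconds

for name, fps in (("card A", card_a), ("card B", card_b)):
    print(name, "min:", min(fps), "FPS, avg:", round(true_avg_fps(fps), 1), "FPS")

# Card B posts the better minimum (20 vs 10 FPS) even though card A is
# faster for 9,999 of the 10,000 frames (~60 vs ~40 FPS average).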
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Good points. Don't forget to OC the uncore/CPU-NB/L3$. It would be interesting to see the minimum framerate performance on a 965 with changes in the uncore (2GHz vs. 2.6GHz, etc.). Since the L3 is what helps with heavily-branched, unpredictable-branch code, speeding it up ought to significantly improve minimum framerates.

The only benchmark I have seen on the L3/CPU-NB's impact on overall system performance was in the link I posted a while back when we were discussing PhII overclocking, and it did not have any concrete benches of actual games IIRC. While I do keep hearing that the CPU-NB needs to be sufficiently fast to match your core speed, I have yet to see any link showing minimum framerates. We would all certainly hope that is the case, but I wouldn't jump to that conclusion so quickly without having seen actual results. Mine was never stable at 3.0GHz; I had to back down to 2.8GHz :)
 

biostud

Lifer
Feb 27, 2003
19,925
7,036
136
Going from an X2 3800 @ 2.5GHz to an i5-750 @ 3.6GHz, I never see stutter in DoW II. Before, when monsters appeared, the sound and frames would stutter badly for ~5 secs.

1680x1050, AA and AF enabled.
 
Dec 30, 2004
12,553
2
76
The only benchmark I have seen on the L3/CPU-NB's impact on overall system performance was in the link I posted a while back when we were discussing PhII overclocking, and it did not have any concrete benches of actual games IIRC. While I do keep hearing that the CPU-NB needs to be sufficiently fast to match your core speed, I have yet to see any link showing minimum framerates. We would all certainly hope that is the case, but I wouldn't jump to that conclusion so quickly without having seen actual results. Mine was never stable at 3.0GHz; I had to back down to 2.8GHz :)

Minimum framerates when CPU-limited often have to do with frequent misses when a) predicting branches and b) retrieving the data the CPU needs to continue computing. Stick these two together and you've got a mispredicted branch (lost cycles) where the _real_ code path that needs to be executed isn't in the L2 cache. So if the L3$ has it, the CPU can get back to feeding the GPU with data much faster, instead of having to wait to go all the way out to system RAM. I remember seeing a bench showing that this was inordinately useful in Crossfire systems.
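To put rough numbers on that argument, here's a back-of-the-envelope average-access-time sketch. All cycle counts and rates below are assumptions for illustration, not measured Phenom II figures.

# Assumed latencies (cycles) -- illustrative only, not measured values.
L2_HIT, L3_HIT, RAM = 15, 50, 250

def avg_access_cycles(l2_miss_rate, l3_hit_rate):
    # Cost of going past the L2: either the L3 catches it, or it's a trip to RAM.
    beyond_l2 = l3_hit_rate * L3_HIT + (1.0 - l3_hit_rate) * RAM
    return L2_HIT + l2_miss_rate * beyond_l2

print(avg_access_cycles(0.10, l3_hit_rate=0.0))   # no useful L3: 40.0 cycles/access
print(avg_access_cycles(0.10, l3_hit_rate=0.7))   # L3 catches 70% of L2 misses: 26.0 cycles/access

The exact figures don't matter; the point is that a useful L3 (and a faster uncore feeding it) trims the worst-case stalls, which is exactly where the minimums live.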
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
You have no idea if that minimum hampers playability or not. You don’t know how long it happens for, or how many times it happens. Only a framerate graph will tell you that information.


I’m not saying minimums don’t matter in gameplay. I’m saying that in order for them to matter, you need to demonstrate they happen for a significant time period, enough to impact gameplay. A single minimum value doesn’t show you that because by definition it’s a single data point from an entire benchmark run.

I agree; this is one of the reasons I prefer HardOCP's benchmarks. I also find myself tweaking settings in games to find the right mixture of playability and settings, often with FRAPS running so I can verify my impressions. I do not mind or even notice if a single frame dips under 30 or even 15 fps, but if every time I see an explosion I dip below that, then there is an issue and I need to adjust my settings.

I am playing at 5760x1200 and I get playable framerates with a single HD 5870 @ 900/1200 and an i7 D0 @ 3.2GHz (and rising).

Here are my BFBC2 settings: all high except HBAO is off, 0x/8x, 5760x1200.

Now, I can run the game okay with 2x AA, and my friend thinks I should have it on, but I have turned it off because without it I am always around 60 fps barring large events on screen, while with it I dip to 45 fps for extended periods, which is noticeable to me and not very noticeable to him.

A lot of it is personal preference. I have played games at 20 fps before; it IS playable, it is just not enjoyable.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
As the arguments regarding CPU vs. GPU are burning up this forum, ASUS has come up with a dual-5870 GPU. Have a look at the engineering sample here.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Minimum framerates when CPU-limited often have to do with frequent misses when a) predicting branches and b) retrieving the data the CPU needs to continue computing. Stick these two together and you've got a mispredicted branch (lost cycles) where the _real_ code path that needs to be executed isn't in the L2 cache. So if the L3$ has it, the CPU can get back to feeding the GPU with data much faster, instead of having to wait to go all the way out to system RAM. I remember seeing a bench showing that this was inordinately useful in Crossfire systems.

I agree with you on the theory bit; now if you could just remember where you saw that bench... Call me a skeptic, but I would like to see it first-hand to be sure :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think you mean not being able to completely max out with 16x AF and 4x AA at a constant 60FPS.

No, that's not what I meant.

http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_9.html#sect1
STALKER: CoP, 2560x1600, 0x AA:
GTX 480 = 21 min / 32 avg
5870 = 19 min / 37 avg
That's not even a 30 fps minimum... never mind a constant 60 fps.

http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_11.html#sect0
Metro 2033, 2560x1600, 0x AA:
GTX 480 = 12 min / 32 avg
5870 = 9 min / 30 avg
Are you saying this is playable?

"The overclocked Radeon HD 5850 is not far behind the GeForce GTX 480 at 1920x1080 but begins to display a slideshow at 2560x1600 whereas the GeForce GTX 480 is three times as fast as its opponent then. However, the new card is still unable to deliver a playable frame rate at the highest resolution"