Legion Hardware - 5870 Crossfire CPU Scaling @ 2560x1600

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
http://www.legionhardware.com/articles_pages/radeon_hd_5870_crossfire_cpu_scaling_performance_part_2,1.html

2560x1600

Pay attention to the minimum framerates. Clearly CPU speed matters, and so does the CPU architecture. It would be unwise to pair 5870 CF with, for example, a Q6600 @ 2.8GHz or an Intel Pentium E @ 4.0GHz.

For some of these processors, it simply makes no sense to apply more AA at high resolutions, since their minimum framerates are already hampering smooth gameplay.

Conclusions:
- Overall we have certainly come across some interesting results that were very different to those of our Radeon HD 5970 article, primarily because much less stress was being placed on the GPUs.
- For the most part the Core i7 9xx and Core i5 7xx series were in a league of their own
- Core i3 5xx almost always matched the minimum frame rate of the Phenom II X4, while in a number of cases it was significantly faster
- When looking at the budget processors we were surprised by how poorly the Athlon II X4 performed. It was the minimum frame rate of the Athlon II X4 which was most disappointing.
- Interestingly the Core 2 Duo E8xxx series did for the most part provide better minimum frame rates when compared to the Phenom II X4 and Phenom II X2 series
- However if you compare the Core 2 Duo E8xxx to the Phenom II X4 in games that can utilize more than just two cores, such as Far Cry 2, then we start to see a very different picture. The Phenom II X4 crushed the Core 2 Duo
 
Last edited:

Blue Shift

Senior member
Feb 13, 2010
272
0
76
As far as the last 2 points go... Intel's had some feature in their recent chips that throttles back unused cores and overclocks the remaining ones, which helps in applications that don't support that many threads. The Phenom II X4s that are out today do not feature a similar solution, although Thuban is rumored to.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
A minimum by definition is a single data point which tells you nothing about the rest of the benchmark. All you need is a single instance of random benchmarking noise to completely blow it out of whack.

That makes it worthless unless there's a graph putting it into context, and that graph shows the drop persists for a significant period of time.

I can demonstrate many such examples of flawed minimums, like this one:

http://techreport.com/articles.x/18682/7

At 1680x1050 and 1920x1200, the GTX470 has a higher minimum than the GTX480. Also the GTX285 has the highest minimum at 2560x1600. So then, would anyone claim the GTX285 is the best card for Borderlands @ 2560x1600? I think not.

Looking at that, I’m not sure how anyone can claim a minimum is a reliable gauge of performance.
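To put a number on how fragile that single sample is, here's a toy sketch (made-up frame times, not from any review). One hitched frame out of six hundred barely moves the average, yet it becomes the reported minimum:

```python
# Hypothetical run: ~10 seconds locked at 60 fps (16.7 ms frames)
# plus exactly one 100 ms hitch.
frame_times_ms = [16.7] * 599 + [100.0]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"average: {avg_fps:.1f} fps")             # ~59.4 fps
print(f"minimum: {min(fps_per_frame):.1f} fps")  # 10.0 fps, one sample
```

Any review that prints "min: 10 fps" for that run is reporting noise, not performance.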

As far as the last 2 points go... Intel's had some feature in their recent chips that throttles back unused cores and overclocks the remaining ones, which helps in applications that don't support that many threads. The Phenom II X4s that are out today do not feature a similar solution, although Thuban is rumored to.
That’s Turbo Boost, which is only present on i5 and i7 CPUs.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Looking at that, I’m not sure how anyone can claim a minimum is a reliable gauge of performance.

The idea is that you would run the benchmark several times to weed out those 'glitches'. I know that if I am benchmarking and I see a very odd spike in either direction that doesn't show up in additional runs, I'll throw that one out as an anomaly. I suppose there could still be cases where you get odd repeat results, but then it should be duly noted in the review notes.

However, no one would argue against an average of the low frame rates showing just how often they happen and for how long. That would be a great thing to have, and I would welcome any review that does it.
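Something like a "1% low" would do it. A rough sketch of what I mean (my own throwaway function, assuming a log of per-frame times in milliseconds):

```python
def one_percent_low(frame_times_ms):
    """Average fps across the slowest 1% of frames -- a 'minimum'
    that a single glitched frame can't blow out of whack."""
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, len(slowest) // 100)        # worst 1% of frames
    return 1000.0 * n / sum(slowest[:n])   # mean fps over that slice
```

Run each benchmark pass through that and an isolated anomaly barely moves the number, while a genuine stretch of low frame rates drags it right down.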
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
A minimum by definition is a single data point which tells you nothing about the rest of the benchmark. All you need is a single instance of random benchmarking noise to completely blow it out of whack.

But BFG, in real gameplay it's these sudden drops from a constant frame rate that create jerkiness and ruin the gaming experience. In other words, you can have a steady 60 fps average, but if your game drops to 20 fps three times during a gunfight or a race, even for a split second each time, you are left frustrated. You can't just look at average frame rates.

I don't sit there running canned demos all day; I play my games. So I can see that my 4890 runs a track in Dirt 2 with 4AA/16AF at a 55 fps average, but it drops 2-3 times to frustratingly unbearable minimums. Then I am forced to reduce image quality to 2AA to make the game playable. I run the same track again and get a similar 57 fps average, but my minimums have increased from 38 to 49!

The Borderlands example is very likely an outlier for a game that hasn't been driver-optimized on the new GTX 4xx series.

Check out the Xbitlabs review here:

Call of Juarez: http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_8.html
2560x1600 0AA
GTX295 = 92 avg / 39 min
5870 = 91 avg / 79 min

STALKER: CoP: http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_9.html
1920x1080 0AA
5970 = 66 avg / 22 min
GTX480 = 47 avg / 31 min

Dirt2: http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_10.html
2560x1600 4AA/16AF
GTX 295 = 51 avg / 31 min
GTX 285 = 50 avg / 43 min
GTX 480 = 66 avg / 59 min

In each of these cases, where the avg fps seems sufficient, you can see that minimum frame rates hamper playability. If your videocard peaks at 200 fps, that skews the average. Take the 5970, which can plow through easy areas at up to 200 fps but then has 3 periods of 22-25 fps minimums, versus the GTX 480, which maxes out at 130 fps instead of 200 but runs those same 3 periods at 31-33 fps. Which card is providing a better gaming experience? The GTX 480.

It's one thing to say mins don't matter in cut scenes. But in actual gameplay? I'd rather take a game with a constant 60 fps average and 45 fps minimums than one with a 120 fps average and 20 fps minimums across the board. Think about it: 60 fps is already smooth, so anything from 60 up to 120 fps has limited benefit, while going from 20 to 45 fps minimums is a MAJOR difference in playability.
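To make the skew concrete, here's a quick hypothetical (numbers invented to mirror the 5970 vs. GTX 480 case above). Card A wins the average chart by a wide margin, yet it's the one that stutters:

```python
# Frame times in milliseconds. Card A peaks near 200 fps but dips
# to 20 fps; card B tops out near 130 fps but never drops below ~33.
card_a = [5.0] * 1140 + [50.0] * 60
card_b = [7.7] * 1140 + [30.0] * 60

for name, trace in (("A", card_a), ("B", card_b)):
    avg = 1000.0 * len(trace) / sum(trace)
    worst = 1000.0 / max(trace)
    print(f"card {name}: {avg:.0f} fps avg, {worst:.0f} fps min")
# card A: 138 fps avg, 20 fps min
# card B: 113 fps avg, 33 fps min
```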
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I think something is wrong with their testing, especially with the Phenom II X4.

What kind of BS is this? Someone explain this to me:
Please note that the graphs are displaying a summation of average and minimum values. This means they are not arranged by either the minimum or average results but rather a combination of the two so please keep this in mind when going over the data.
This part of their conclusion makes zero sense:
Overall we have certainly come across some interesting results that were very different to those of our Radeon HD 5970 article, primarily because much less stress was being placed on the GPUs. In the future we will look to test even more powerful graphics card configurations with games that are better designed to tackle multi-core processors.
like duh :p
- I don't see real value in this review

Frankly, I am going to do the same thing but simplify the hell out of it
- Phenom II X4 vs Phenom II X2 vs Core i7 920, and I will overclock and underclock them. I already tested C2D, C2Q, and Phenom II X3, and I hate useless repetition. And of course I will test with CrossFired HD 5870s at 925/1300 to make it more interesting; if I am lucky, I can compare scaling results with GTX 480 SLI. Watch for it in a month or so.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
1. The arrangement of the charts is simply done so that they show mins + avg. You can ignore the ordering if you'd like; just look at the #s on the red bar for mins and on the yellow bar for avg. While the way they graphed it is unusual, the conclusion remains the same.

2. 5870 CF is faster than the 5970. This is their next point, implying that the CPU limitation is even greater with a faster GPU setup. This makes sense.

Notice that their results are consistent with what we have seen from Xbitlabs, PCGamesHardware, and the massive 146-CPU article compiled by a French site.

Games like World in Conflict, Crysis/Warhead, and Far Cry 2 are severely minimum-framerate limited on slow CPUs/architectures. Far Cry 2 has already been shown by both toyota and happy medium to scale significantly from a Core 2 Duo at 2.0GHz --> 3.6GHz. We are talking minimum framerates increasing by more than 50%.
 

mhouck

Senior member
Dec 31, 2007
401
0
0
Any idea if the 45nm Core 2 Quads act more like the i7/i5 or like the C2D E8xxx?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
1. The arrangement of the charts is simply done so that they show mins + avg. You can ignore the ordering if you'd like; just look at the #s on the red bar for mins and on the yellow bar for avg. While the way they graphed it is unusual, the conclusion remains the same.

2. 5870 CF is faster than the 5970. This is their next point, implying that the CPU limitation is even greater with a faster GPU setup. This makes sense.

Notice that their results are consistent with what we have seen from Xbitlabs, PCGamesHardware, and the massive 146-CPU article compiled by a French site.

Games like World in Conflict, Crysis/Warhead, and Far Cry 2 are severely minimum-framerate limited on slow CPUs/architectures. Far Cry 2 has already been shown by both toyota and happy medium to scale significantly from a Core 2 Duo at 2.0GHz --> 3.6GHz. We are talking minimum framerates increasing by more than 50%.
More or less what I was thinking. There's some solid information there that shows the trends others have also reported. However, this is one of the few reviews to really go for the gold and use a beefy GPU setup at the highest resolution available (save Eyefinity). It definitely establishes the importance of the CPU in a gaming machine, especially for minimum framerates, although it would have been nice to see some framerate vs. time graphs.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
1. The arrangement of the charts is simply done so that they show mins + avg. You can ignore the ordering if you'd like; just look at the #s on the red bar for mins and on the yellow bar for avg. While the way they graphed it is unusual, the conclusion remains the same.

2. 5870 CF is faster than the 5970. This is their next point, implying that the CPU limitation is even greater with a faster GPU setup. This makes sense.

Notice that their results are consistent with what we have seen from Xbitlabs, PCGamesHardware, and the massive 146-CPU article compiled by a French site.

Games like World in Conflict, Crysis/Warhead, and Far Cry 2 are severely minimum-framerate limited on slow CPUs/architectures. Far Cry 2 has already been shown by both toyota and happy medium to scale significantly from a Core 2 Duo at 2.0GHz --> 3.6GHz. We are talking minimum framerates increasing by more than 50%.
First of all, 5870 CF at stock speeds does not run away from the HD 5970. :p

Secondly, the Phenom II X4 results are pretty suspect. Looking at the other reviews, we do not see that kind of disparity with C2Q.

Thirdly, WiC is my OWN example of CPU scaling along with FC2; these TWO games do illustrate it better than 99.99999999% of other games. I don't test Warhead, but Crysis does not exhibit this at all.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
One thing I would like to point out: although *most* games don't illustrate this kind of scaling, it won't be long until a quad is finally required. We DO see the latest games taking more advantage of multi-core CPUs. Just as the single core was supplanted by the dual, the quad will become necessary.

But not quite yet. And if you notice, both BFG10K and I DO have quads :p
- I just like 4.0GHz so no one even hints at "bottleneck"

And what I want to see (and test) are the brand new games: JC2, Metro 2033, and Crysis 2 when it comes out. So... guess what? That is what I am going to test next
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
It would be nice if reviews provided "% time spent below 30 fps" and "% time spent above 60 fps" stats if they're not going to give a frame rate vs. time graph.

That way the end user could make a judgement re: unacceptable minimum frame rates being an anomaly or a problem. A minimum fps of 5 with less than 1% of the time spent below 30 fps and an average of 40 wouldn't bother me. 10% of the time below 30fps with an average of 100 fps would.
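Both stats would be trivial for reviewers to pull out of a frametime log. A sketch of the idea (hypothetical helper, assuming per-frame times in milliseconds):

```python
def pct_time_below(frame_times_ms, fps_floor=30.0):
    """Share of wall-clock time spent below an fps floor,
    weighted by frame duration rather than frame count."""
    limit_ms = 1000.0 / fps_floor   # a 30 fps frame lasts ~33.3 ms
    slow_ms = sum(t for t in frame_times_ms if t > limit_ms)
    return 100.0 * slow_ms / sum(frame_times_ms)
```

pct_time_below(log, 30.0) gives the first number, and 100 - pct_time_below(log, 60.0) gives the second.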
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And what I want to see (and test) are the brand new games: JC2, Metro 2033, and Crysis 2 when it comes out. So... guess what? That is what I am going to test next

I can't wait for Crysis 2 and the new Medal of Honor game as well. But Crysis 2 won't be out until Q4 2010, and by that time the 5870 will be irrelevant among high-end GPUs. Metro loooooooves GPU power. I think DoF and tessellation reduce frames by 2x or so, and in that case no amount of CPU power can help :(

There is also Splinter Cell: Conviction, which launches in 7 days! It looks really cool with in-game cut scenes: http://forums.vgrequirements.info/showthread.php?t=3167
 
Last edited:

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Considering that from now on games are getting more multithreaded, I can see a brighter future for quad-core users; there's no reason to go dual core at this time.

And the Phenom X2 results are quite weird, especially when other reviews like AlienBabeltech's show that the Phenom II X4 can go toe to toe with the best of Intel in gaming performance.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Considering that from now on games are getting more multithreaded, I can see a brighter future for quad-core users; there's no reason to go dual core at this time.

And the Phenom X2 results are quite weird, especially when other reviews like AlienBabeltech's show that the Phenom II X4 can go toe to toe with the best of Intel in gaming performance.
Well, Xbit has also shown that the Phenom II X4 just doesn't match the minimum framerates of its rivals.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
http://www.anandtech.com/show/2819/7

But it does on average, which also matters, probably even more, since mins can be influenced by disk thrashing or huge VRAM accesses; small spikes may affect the min momentarily but will not affect gameplay enough.
Yeah, but I am curious why the minimums are consistently slower with the X4. Looking at Xbit, it seems the mins do get faster with increased clock speed, so it's certainly something in its architecture that's causing this. It would be nice to see a complete framerate graph to see exactly how much and how long this actually occurs.
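Turning a frametime log into such a graph is only a few lines if any reviewer cared to. A rough sketch (I'm assuming a plain text dump with one cumulative millisecond timestamp per line, roughly what FRAPS produces):

```python
import matplotlib.pyplot as plt

# Hypothetical log file: one cumulative timestamp (ms) per line.
with open("frametimes.csv") as f:
    stamps_ms = [float(line) for line in f if line.strip()]

secs = [s / 1000.0 for s in stamps_ms[1:]]
fps = [1000.0 / (b - a) for a, b in zip(stamps_ms, stamps_ms[1:])]

plt.plot(secs, fps)
plt.axhline(30, linestyle="--")  # mark the 'playable' floor
plt.xlabel("time (s)")
plt.ylabel("fps")
plt.show()
```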
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Yeah, but I am curious why the minimums are consistently slower with the X4. Looking at Xbit, it seems the mins do get faster with increased clock speed, so it's certainly something in its architecture that's causing this. It would be nice to see a complete framerate graph to see exactly how much and how long this actually occurs.

You guys are finally getting me interested in frame rate graphs
- it is just a lot of work ... :p

... maybe ...

Not using 45nm C2Q in the test is pure failure
I dropped C2D/C2Q from my own reviews. I think I have already covered them enough to now just use the Core i7 and Phenom II architectures; my last review wasn't that long ago, using C2Q, i7, and 3 varieties of Phenom II including the X3.

Fact: C2D/C2Q = (more-or-less) Phenom II in gaming

All the other tech sites have pretty much established that parity. I think something is wrong with their Dragon platform for it to be getting consistently lower results with Phenom II.

And does anyone really think that (stock) 5870 CF is *that much* more powerful than the 5970?
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
You guys are finally getting me interested in frame rate graphs
- it is just a lot of work ... :p

... maybe ...


I dropped C2D/C2Q from my own reviews. I think I have already covered them enough to now just use the Core i7 and Phenom II architectures; my last review wasn't that long ago, using C2Q, i7, and 3 varieties of Phenom II including the X3.

Fact: C2D/C2Q = (more-or-less) Phenom II in gaming

All the other tech sites have pretty much established that parity. I think something is wrong with their Dragon platform for it to be getting consistently lower results with Phenom II.

And does anyone really think that (stock) 5870 CF is *that much* more powerful than the 5970?
Well, a 5870 CrossFire setup's two GPUs would each be sporting over 17% faster clocks. I guess if you added that together it could theoretically be "up to" 35% faster; correct me if that's not right though.
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Right on with the posting! Interesting to see that the benches were run with no AA/AF and the fps was still mediocre. The i7s and i5s take the crown in terms of performance. But I think that once you turn up the filtering, things would change dramatically; the dual cores wouldn't even be on the graphs!

It would be interesting to see Crysis Warhead, Dirt 2, and BFBC2 in those benches; I wonder why they weren't included.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Right on with the posting! Interesting to see that the benches were run with no AA/AF and the fps was still mediocre. The i7s and i5s take the crown in terms of performance. But I think that once you turn up the filtering, things would change dramatically; the dual cores wouldn't even be on the graphs!

It would be interesting to see Crysis Warhead, Dirt 2, and BFBC2 in those benches; I wonder why they weren't included.
No, it would just mainly bring down the higher-end CPU scores because the load would shift to the GPUs.
 

Makaveli

Diamond Member
Feb 8, 2002
4,966
1,561
136
Well, a 5870 CrossFire setup's two GPUs would each be sporting over 17% faster clocks. I guess if you added that together it could theoretically be "up to" 35% faster; correct me if that's not right though.

I'm not sure you can just add the 17% x 2, but maybe someone else can weigh in on this.
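For what it's worth, a quick sanity check on the stock clocks suggests the two gains don't stack:

```python
# HD 5870 = 850 MHz core; HD 5970 = 725 MHz per GPU (stock).
# If both GPUs in the pair speed up by the same ratio, the pair's
# combined throughput scales by that same ratio -- not double it.
cf, x2 = 850.0, 725.0
per_gpu = cf / x2 - 1.0                 # ~0.17 -> ~17% per GPU
pair = (2 * cf) / (2 * x2) - 1.0        # still ~0.17 for the pair
print(f"per GPU: +{per_gpu:.1%}, pair: +{pair:.1%}")  # both +17.2%
```

So ~17% is the ceiling from clocks alone, before any CrossFire scaling differences.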

Overall I think this article was well done.