Dual vs Quad - Lab Benchmarks Vs Reality?

Page 3 - AnandTech Forums

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
my take on it... any game that doesn't need a quad core, doesn't need those last 600mhz advantage the dual core has over a quad. and if your game DOES need a quad, you would be happy you have one. I went from an E8400 @ 3.6ghz to an E6600 @3ghz. My experience with games that perform better with a dual core hasn't changed, they are still silky smooth despite having a slightly lower measured FPS, my experience with games that need a quad has vastly improved.

that is generally true for single-GPU video cards

IF you run GTX200 SLi or 4870 CF, then you *need* the fastest - preferably Quad - that you can pair with it

in some cases the difference - like in WiC - is between "playable" [q9550@3.4Ghz] and slideshow [e8600@4.25Ghz]
- and i have the benches [finally] to PROVE it


 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
Originally posted by: taltamir
my take on it... any game that doesn't need a quad core, doesn't need those last 600mhz advantage the dual core has over a quad. and if your game DOES need a quad, you would be happy you have one. I went from an E8400 @ 3.6ghz to an E6600 @3ghz. My experience with games that perform better with a dual core hasn't changed, they are still silky smooth despite having a slightly lower measured FPS, my experience with games that need a quad has vastly improved.

that is generally true for single-GPU video cards

IF you run GTX200 SLi or 4870 CF, then you *need* the fastest - preferably Quad - that you can pair with it

in some cases the differences - like in WiC - is between "playable" [q9550@3.4Ghz] and slideshow [e8600@4.25Ghz]
- and i have the benches [finally] to PROVE it

you say I am generally correct for single GPU games, and then you completely agree with me about multi GPU? I am a bit confused by that post.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: taltamir
Originally posted by: apoppin
Originally posted by: taltamir
my take on it... any game that doesn't need a quad core, doesn't need those last 600mhz advantage the dual core has over a quad. and if your game DOES need a quad, you would be happy you have one. I went from an E8400 @ 3.6ghz to an E6600 @3ghz. My experience with games that perform better with a dual core hasn't changed, they are still silky smooth despite having a slightly lower measured FPS, my experience with games that need a quad has vastly improved.

that is generally true for single-GPU video cards

IF you run GTX200 SLi or 4870 CF, then you *need* the fastest - preferably Quad - that you can pair with it

in some cases the differences - like in WiC - is between "playable" [q9550@3.4Ghz] and slideshow [e8600@4.25Ghz]
- and i have the benches [finally] to PROVE it

you say I am generally correct for single GPU games, and then you completely agree with me about multi GPU? I am a bit confused by that post.

i'm sorry .. i was *expanding* on your comments

i agreed with you, and i *particularly* agreed with the part i quoted ... if you get the slight difference


Having Multi-GPU makes the need for a Quad crucial (over dual) - in games that use more than 2 cores effectively
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: apoppin
Originally posted by: taltamir
Originally posted by: apoppin
that is generally true for single-GPU video cards

IF you run GTX200 SLi or 4870 CF, then you *need* the fastest - preferably Quad - that you can pair with it

in some cases the differences - like in WiC - is between "playable" [q9550@3.4Ghz] and slideshow [e8600@4.25Ghz]
- and i have the benches [finally] to PROVE it

you say I am generally correct for single GPU games, and then you completely agree with me about multi GPU? I am a bit confused by that post.

i'm sorry .. i was *expanding* on your comments

i agreed with you, and i *particularly* agreed with the part i quoted ... if you get the slight difference


Having Multi-GPU makes the need for a Quad crucial (over dual) - in games that use more than 2 cores effectively

Indeed. Those who want not only bleeding-edge IQ in the latest games but acceptable performance as well do need a fast quad (or even a triple). Here are some games where a quad really shines:

World in Conflict, courtesy of apoppin - Minimum framerates are so much better

Unreal Tournament 3 - Overall faster

Stalker CS - You'll get about 20% more performance out of the quad.

GTA4 - Looks like it can make use of Core i7's better multi-threaded support for beyond quad-core. Or maybe the IMC is the bigger influence for the 920's advantage over the Q9550. Either way it looks promising (for the future).

Tom Clancy's HAWX - Triple core outshines even one of the fastest dual cores.

Left 4 Dead - The E8500 @ 3.6 can't quite match the stock QX9650.

CoD World at War - Looks like this can make good use of quad, as the Phenom 9650/8650 disparity shows.

Far Cry 2 - Somewhat odd results. FC2 obviously favors Intel hardware, and certainly favors quad. It also seems it does not care about L2 cache that much (Q8200 doing much better than Q6600). Instead, it seems to favor efficiency and possibly memory bandwidth, which would explain why Core i7 does so well. It also doesn't seem to scale much past three cores if we look at the Phenom 8750/9650 results.

Now with this demonstrated, I do still believe dual cores have a place, especially for more casual gamers. But I don't think by the end of this year or early next year I'll be recommending a dual core over a triple or quad to any type of gamer - part of the reason being more games will definitely ship multi-threaded and the other part being that quad and dual core prices are going to close even more.

Correct me if I may be wrong, but I don't think back in the day the prices of the faster single core and the slower dual core were anywhere near the value we can get today with awesome dual, triple, and quad cores for under $200.
 

richierich1212

Platinum Member
Jul 5, 2002
2,741
360
126
You can probably add TF2 to your list, as they've just added a new patch for multicore support as well
 

cusideabelincoln

Diamond Member
Aug 3, 2008
Yeh, but I don't know the extent of the support. Who's to say an E8400 still wouldn't provide better framerates than a slightly lower-clocked quad, like the Q9400? TF2 was only single-threaded to begin with. Besides, I haven't come across any actual numbers yet. I should also note that there are quite a few games where a (lower clocked) quad can match a dual, or slightly surpass it, but I only tried to post results of significance.
 

taltamir

Lifer
Mar 21, 2004
add Mass Effect to that list, a UE3 game that adds PhysX to the mix (and for many it will run it on the CPU); quite an improvement going to a quad in terms of stuttering. (Stuttering is not reflected in average frame rate, but in ms per frame rendered.)
 

apoppin

Lifer
Mar 9, 2000
Yes, it is quite true - even about TF2 [not that it 'needs' quad, as it runs on mid-range HW]. Last year you could get by on a fast dual core.

Clearly everything has changed in the past few months as devs jump on the quad bandwagon

i got my e8600 for sale :p
- it gets 4.25Ghz easily; i am keeping my slower-clocked q9550s

CHEAP


 

cusideabelincoln

Diamond Member
Aug 3, 2008
Originally posted by: taltamir
add Mass Effect to that list, a UE3 game that adds PhysX to the mix (and for many it will run it on the CPU); quite an improvement going to a quad in terms of stuttering. (Stuttering is not reflected in average frame rate, but in ms per frame rendered.)

Do you have any CPU benchmarks?

I have found these, and they show that a quad is not making a huge difference even at low resolution. The QX6850 and E6850 perform basically the same, and the Q6600 can't overtake the E6750. These types of results are found in many, many games where a quad shows some improvement, but I'm only interested in a significant one.
 

taltamir

Lifer
Mar 21, 2004
These benchmarks are completely wrong. Mass Effect is a port of an Xbox 360 game running the UE3 engine (which scales to 4 cores); the Xbox 360 has a tri-core Xenon at 3.2GHz.

I have my own work to show for it:
http://forums.anandtech.com/me...id=31&threadid=2205375

Someone had been asking about his unreasonably low FPS and GPU utilization... we started arguing about whether he was simply CPU bound with his X2 @ 2.7GHz OC in those games... which were Crysis, Age of Conan, and Call of Duty 4.
http://forums.anandtech.com/me...id=2205023&STARTPAGE=1

Finally, he mentioned that even mass effect is slow...

I thought to myself, how could this be... So I ran a little test on my gaming machine... this rig has:

E8400 @ 3.6ghz
2x2GB DDR2-1000 ram (running it at 800mhz with lower voltage and timings).
WD640GB HDD (I use a file server for storage)
VisionTek HD4850 512MB GDDR3 card. (with fixed MSI bios further modified for higher fan speeds)

I run mass effect at 1920x1200 (native) resolution, particles are on high (3/3), textures on high (3/4, there is also Very High which is unplayable), no blur, no film grain, vsync on.

Firstly, I am definitely not hitting 60 fps, I can feel some lag.

Secondly, I was shocked to see that even on 3.6ghz and high graphic settings I am still getting 100% CPU usage.

I thought that at such an OC I am certain to be GPU limited in everything I play... especially with a "weaker" card like the 4850 (compared to the 4870 or the G200 series).

I probably need to bite the bullet and edit the cfg file for Riva Tuner to get it to work with the 4850 like I did with the 8800GTS 512 when it first came out (rather than wait for the next Riva Tuner version). That way I could see video card usage and exact FPS in Mass Effect. I actually will do so ASAP.

In the meanwhile, please post any other game that you know that would be CPU bound even on 1920x1200, high settings, and an OCed E8400...


EDIT:
Tests done with FRAPS:
1920x1200 window mode:  Min 39 / Max 65 / Avg 52.252
720x480 window mode:    Min 47 / Max 96 / Avg 67.797

The FPS barely increases at all going from 1920x1200 down to 720x480...
that is dropping from 2,304,000 pixels to 345,600 pixels - 6.6666666666 (repeating) times fewer pixels!
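The pixel counts above are easy to verify with a line of arithmetic:

```python
hi = 1920 * 1200  # pixels at the native resolution
lo = 720 * 480    # pixels at the low test resolution
print(hi, lo, hi / lo)  # 2304000 345600 6.666666666666667
```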

Both of those exhibit the "jitter", something I normally do not see in games until they fall into the low 10s in FPS, but it is seen in Mass Effect even in the 50s. I don't know why FRAPS doesn't show it; it seems to be something other than an FPS drop - as in, something else in the engine is causing that effect rather than low FPS (that, or I am really really sensitive to sudden drops).
Notice that both are on a 3.6GHz C2D E8400... CPU is a constant 100%, sometimes drops to 95%, and I have seen it go as low as 80 once... The GPU is 70-100% at 1920x1200, and at 20% at 720x480... clearly it is not being taxed at that resolution.

These are tested with frame smoothing off, which I think improved performance across the board (but I haven't tested it with FRAPS yet; that was earlier, when I was using the inaccurate BioWare tool)...

An interesting thing: the game's built-in frame counter DID show what appeared to be 10s and 20s in FPS... FRAPS never showed it going that low.

however... I think it is a flaw with how fraps calculates it...
digging through the data for the 720x480 resolution I found the following:
frame  time (ms, cumulative)  frametime (ms)
605    8385.076               10.98
606    8426.748               41.672
607    8441.728               14.98

This is not the only example, but it is the worst (there were many at 40 and 39 ms)...

As you can see, frames 605 and 607 took 11 and 15 ms respectively to draw... while frame 606 took a whopping 41.672 ms to draw. That is undoubtedly the source of the "jitters" I am noticing at both the high and low resolutions. Also... 1000/42 = 23 FPS... so for that instantaneous frame I had an FPS of 23, just like I said I noticed in the built-in counter (which updates EVERY FRAME!). I will try it again with frame rate smoothing turned back on, but as far as I can tell the jitters decreased with it off...
And just to make things clear, this is at 720x480 resolution!
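The frame-dump arithmetic above can be sketched in code. This is an illustrative Python sketch, not a FRAPS parser; the frame-604 timestamp is invented so that frame 605 has a previous frame to diff against, and the 2x-median spike threshold is an arbitrary illustrative choice:

```python
# FRAPS-style dump rows: (frame number, cumulative ms since capture start).
# Frame 604's timestamp is made up; frames 605-607 are the values quoted above.
rows = [
    (604, 8374.096),
    (605, 8385.076),   # +10.98  ms -> ~91 FPS instantaneous
    (606, 8426.748),   # +41.672 ms -> ~24 FPS instantaneous
    (607, 8441.728),   # +14.98  ms -> ~67 FPS instantaneous
]

def frametimes(rows):
    """Per-frame render time in ms: difference of consecutive cumulative timestamps."""
    return [(b[0], b[1] - a[1]) for a, b in zip(rows, rows[1:])]

def spikes(fts, factor=2.0):
    """Flag microstutter: frames taking at least `factor` x the median frametime."""
    times = sorted(ft for _, ft in fts)
    median = times[len(times) // 2]
    return [(n, ft) for n, ft in fts if ft >= factor * median]

fts = frametimes(rows)
for n, ft in fts:
    print(f"frame {n}: {ft:6.3f} ms  ({1000.0 / ft:5.1f} FPS instantaneous)")
print("stutter spikes:", spikes(fts))
```

A real run would feed in the whole frametimes CSV, but even on three frames the 11-to-42-ms jump at frame 606 is exactly the spike being described: the average FPS looks fine while one frame momentarily dips to ~24 FPS.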

Either way...
Conclusion: Mass Effect is CPU bound as hell on a "mere" 4850 with an E8400 C2D OC'd to 3.6GHz!
 

cusideabelincoln

Diamond Member
Aug 3, 2008
That doesn't prove (or disprove) anything. Where's the comparison to a quad core or triple core? All your results prove is that it can make just about full use of a dual core processor. From your results alone we still don't know if the game actually scales well beyond dual cores. Keyword: well - which is what my findings are meant to demonstrate.

However I will add Far Cry 2 to my list...
 

taltamir

Lifer
Mar 21, 2004
This is just the first post, read through the thread.

Basically, with a dual core it was not enough and had microstutter; with a quad core the microstutter was solved.
Microstutter IS empirically measurable via FRAPS, as I have mentioned and measured. Microstutter is NOT limited to multi-GPU systems. It can occur at any time; it's just that multi-GPU setups often have it.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
I'm not going to read the entire thread where people (Azn) are arguing constantly. But I skimmed it and I saw no empirical evidence listed. Maybe I missed one of your posts, but why don't you just post your results (if you have them) of quad vs. dual in a nice, condensed fashion.

Besides it's not like I'm making an official and all-inclusive list. I already listed UT3, so I think someone would find it obvious that many games based off UE3 would have a good chance of supporting multi-cores.
 

taltamir

Lifer
Mar 21, 2004
FRAPS dump at 720x480 resolution:

frame  time (ms, cumulative)  frametime (ms)
605    8385.076               10.98
606    8426.748               41.672
607    8441.728               14.98

The columns are: frame number since recording started, cumulative time in milliseconds since recording started, and time taken to render relative to the previous frame in milliseconds.

As you can see, frames 605 and 607 took 11 and 15 ms respectively to draw... while frame 606 took a whopping 41.672 ms to draw.

This is not the only example; there are tons of such drops where it goes from the low 10s to about 40 ms to render a frame.

Single-card microstutter... the example I found was 11 to 41.6 to 15 ms to draw a frame; in FPS that is 90, 23, 66 instantaneous FPS at the 720x480 resolution, where my GPU shows utilization in the low 20% while the CPU is at 100%.
Testing with a quad core reduced CPU consumption and eliminated those types of "drops"; this was confirmed by me and by others in that thread who also have access to both CPU types.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
For the record, just because a program shows 100% cpu utilization doesn't mean it's cpu-limited. I've written a bunch of apps which show 100% utilization of a single core while being purely gpu-limited, and sometimes even "utilizing" a dual-core to 100% in the same scenario, depending on driver-level multi-threading "optimizations."

*edit: and on the opposite extreme, a single-threaded app can be entirely cpu-limited while only showing 25% utilization of a quad-core cpu.
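Munky's point can be illustrated with a toy sketch (the simulated "GPU deadline" below stands in for real driver behavior, so this is only an analogy): a render loop that spin-waits on the GPU reports ~100% CPU while doing no useful work, whereas a blocking wait hits the same frame rate at near-zero CPU.

```python
import time

def fake_gpu_frame(ms=5):
    """Stand-in for a GPU that finishes the current frame `ms` ms from now."""
    return time.monotonic() + ms / 1000.0

def render_spin(frames=20):
    """Spin-wait for the 'GPU': burns a full core while doing no useful work."""
    for _ in range(frames):
        done_at = fake_gpu_frame()
        while time.monotonic() < done_at:
            pass  # busy loop -> reads as 100% of one core in Task Manager

def render_block(frames=20):
    """Block until the 'GPU' is done: same frame rate, near-zero CPU use."""
    for _ in range(frames):
        done_at = fake_gpu_frame()
        time.sleep(max(0.0, done_at - time.monotonic()))

for fn in (render_spin, render_block):
    cpu0, wall0 = time.process_time(), time.monotonic()
    fn()
    cpu, wall = time.process_time() - cpu0, time.monotonic() - wall0
    print(f"{fn.__name__}: {wall * 1000:.0f} ms wall, ~{100 * cpu / wall:.0f}% CPU busy")
```

Both loops deliver the same ~200 FPS here, yet the spin version shows a saturated core - which is why a utilization readout alone can't tell you whether a game is CPU-limited.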
 

taltamir

Lifer
Mar 21, 2004
that would be rare but possible, munky, but the CPU utilization is just one more factor corroborating the findings, not the basis for them. There is an entire list of factors and findings, and all point to the same thing. The major one, however, is that you can use set affinity to limit a process, like Mass Effect, to only 1, 2, or 3 cores out of your quad. And going from tri-core to dual-core you start seeing microstutter AND can measure its existence in fraps by doing a data dump.
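The "set affinity" experiment described above can be scripted rather than done through Task Manager. A minimal Linux-only sketch using Python's stdlib `os.sched_setaffinity`/`os.sched_getaffinity`; pinning to the "first n allowed CPUs" is an arbitrary illustrative policy, not what Task Manager does:

```python
import os

def pin_to_n_cores(n):
    """Restrict this process to n of its allowed CPUs (Linux only)."""
    allowed = sorted(os.sched_getaffinity(0))  # CPUs this process may use
    subset = set(allowed[:max(1, n)])
    os.sched_setaffinity(0, subset)            # restrict scheduling to them
    return subset

original = os.sched_getaffinity(0)
print("allowed before:", sorted(original))
print("pinned to:", sorted(pin_to_n_cores(2)))   # run the game/benchmark here
os.sched_setaffinity(0, original)                # undo the restriction
print("restored:", sorted(os.sched_getaffinity(0)))
```

Pinning a game's process like this (or launching a benchmark from a wrapper that pins itself first) gives the 1/2/3/4-core comparisons on a single physical CPU, subject to apoppin's caveat that the quad's shared cache is still present.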
 

cusideabelincoln

Diamond Member
Aug 3, 2008
Then publicize your extensive results, with a quad vs. dual direct comparison FFS. Otherwise we're just arguing for the sake of arguing. But somehow I don't think your findings are completely telling. For starters, I would rather see results run at respectable settings that any gamer would use, which is certainly not a resolution of 720x480. I can't add a game to my "list", which if you can read and comprehend isn't an all-inclusive list anyway, without a solid link demonstrating a quad core's advantage without any other bullshit breaking up the results.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: taltamir
that would be rare but possible, munky, but the CPU utilization is just one more factor corroborating the findings, not the basis for them. There is an entire list of factors and findings, and all point to the same thing. The major one, however, is that you can use set affinity to limit a process, like Mass Effect, to only 1, 2, or 3 cores out of your quad. And going from tri-core to dual-core you start seeing microstutter AND can measure its existence in fraps by doing a data dump.

well, i tested with e8600 vs q9550s, both at 4.0Ghz and also at a lot of other CPU speeds, but at *gamer's resolutions* of 16x10 and 19x12 with maxed-out details - the way normal people play with a fast GPU or even multi-GPU. With a single GPU, it is a toss-up with an edge going to quad. When you move to fast graphics, like CFX-3 or even a 4870 X2, Quad kicks ass in the new games.

Now i am assuming there are other little differences when you disable a couple of the Quad's cores, like cache size .. how will that affect it?

 

taltamir

Lifer
Mar 21, 2004
Originally posted by: cusideabelincoln
Then publicize your extensive results, with a quad vs. dual direct comparison FFS. Otherwise we're just arguing for the sake of arguing. But somehow I don't think your findings are completely telling. For starters, I would rather see results run at respectable settings that any gamer would use, which is certainly not a resolution of 720x480. I can't add a game to my "list", which if you can read and comprehend isn't an all-inclusive list anyway, without a solid link demonstrating a quad core's advantage without any other bullshit breaking up the results.

i did publicize it; you simply lack the technical understanding to comprehend it. The result is that there is OBSERVABLE MICROSTUTTER that is PROVABLE based on frame dumps in FRAPS; that microstutter is observed at 1920x1200 resolution AND at 720x480 resolution and anything in between; that microstutter is observable using an E8400 OC'd to 3.6GHz, and on quads with two cores disabled. It is observable on a 4850, a GTX260, and other video cards.

And that simply going to a quad core fixes the microstutter.

I don't know what more do you want from me, unless you want me to post the entire framedumps...
The frame dumps are large and unwieldy, and it is impractical to post them.
IF you are saying I am lying, no point, they could easily be faked.

So really, Nothing further to discuss here.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: taltamir
Originally posted by: cusideabelincoln
Then publicize your extensive results, with a quad vs. dual direct comparison FFS. Otherwise we're just arguing for the sake of arguing. But somehow I don't think your findings are completely telling. For starters, I would rather see results run at respectable settings that any gamer would use, which is certainly not a resolution of 720x480. I can't add a game to my "list", which if you can read and comprehend isn't an all-inclusive list anyway, without a solid link demonstrating a quad core's advantage without any other bullshit breaking up the results.

i did publicize it; you simply lack the technical understanding to comprehend it. The result is that there is OBSERVABLE MICROSTUTTER that is PROVABLE based on frame dumps in FRAPS; that microstutter is observed at 1920x1200 resolution AND at 720x480 resolution and anything in between; that microstutter is observable using an E8400 OC'd to 3.6GHz, and on quads with two cores disabled. It is observable on a 4850, a GTX260, and other video cards.

And that simply going to a quad core fixes the microstutter.

I don't know what more do you want from me, unless you want me to post the entire framedumps...
The frame dumps are large and unwieldy, and it is impractical to post them.
IF you are saying I am lying, no point, they could easily be faked.

So really, Nothing further to discuss here.

is there ANY *possibility* that disabling the two *extra* cores in your benchmarking CAUSES the microstutter - in gaming?
- you are messing with rather large shared cache

i do not see microstutter on my e8600 when i compare it to my quad, even when it is stressed - except for the frame rates dropping - and that is down to other issues, not microstutter
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: apoppin
is there ANY *possibility* that disabling the two *extra* cores in your benchmarking CAUSES the microstutter - in gaming?
- you are messing with rather large shared cache

i do not see Micro stutter on my e8600 when i compare it to my quad and when it is stressing

if it had been that, then an E8400, which never had two extra cores to disable, wouldn't have shown microstutter..

I compared identical systems with an E8400 and a Q6600.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: taltamir
Originally posted by: apoppin
is there ANY *possibility* that disabling the two *extra* cores in your benchmarking CAUSES the microstutter - in gaming?
- you are messing with rather large shared cache

i do not see Micro stutter on my e8600 when i compare it to my quad and when it is stressing

if it had been that, then an E8400, which never had two extra cores to disable, wouldn't have shown microstutter..

I compared identical systems with an E8400 and a Q6600.

try that again .. i am not following

OK, now when i compared e8600 against q9550 [look in my sig] i did not see "microstutter" with e8600 .. it just crapped out when it was stressed - the same way q9550 does when it cannot deliver - which is just a rarer event

i am also not so sure that disabling 2 cores of a quad does not cause any strangeness in gaming either
- do we have testing on it?
 

taltamir

Lifer
Mar 21, 2004
interesting... did you play the entire game on an E8600? the amount of CPU work varies by zone.
 

apoppin

Lifer
Mar 9, 2000
not all 13 games :p

i have to depend on repeatable benchmarks to give consistent results. But i noticed the same thing when i play on e8600 and with q9550s .. just play Clear Sky at 19x12 and they exhibit similar stuttering at the low end

what i am wondering about is what happens when you disable 2 cores of a quad .. is it the *same* - now, in gaming - as a dual; is the cache treated the same with the two disabled cores?