CPU Bottlenecking at Digit-Life: yes, we need faster CPUs as much as we need faster GPUs

chizow

Diamond Member
Jun 26, 2001
Digit-Life Comparison Tool Link

I wanted to preface this by saying this is the single best review/benchmark presentation/tool I've ever seen. It links to different results dynamically depending on which radio buttons you toggle and shows the results below both as FPS percentages and as FPS bar graphs.

The resolutions they used are somewhat low, but the CPUs they used are also pretty slow. The video cards range from low-end to high-end single cards, although they've promised results with faster cards and multi-GPU setups in the future.

The results I was referring to were between the HD4870 and 4850, which are quite dramatic.

Step 1: Go to Configuration and toggle PC1 to the 4850 and PC2 to the 4870.
Step 2: Toggle both CPUs to the X2 4800+. Notice there is almost no difference between the cards.
Step 3: Toggle PC1's CPU to the 6000+. Notice the "slower" 4850 is now outperforming the 4870!
Step 4: Toggle PC2 to the 6000+. Once again, notice the 4870 is faster in more games than it was with the 4800+, but it is still CPU bound in the more demanding titles.

You can do other comparisons, like setting both GPUs the same and changing the CPU speed to show how much performance is gained. While these are lower resolutions and CPU bound, these are also only single-card solutions. Once you start going to a faster single GPU like the GTX 280 or to multi-GPU solutions, a faster CPU becomes even more important. There are plenty of reviews that show this, many of which I've linked to, but I thought this review had an excellent tool in place to show it very clearly. :)
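If you want the logic behind those steps in a nutshell, here's a quick Python sketch. The FPS numbers are made up purely for illustration (they are not pulled from the Digit-Life charts), but the pattern is the same one the tool shows: hold the CPU fixed, swap in a faster GPU, and if the framerate barely moves you are CPU bound.

# Hypothetical FPS figures for illustration only -- not from the Digit-Life tool.
results = {
    ("X2 4800+", "HD 4850"): 62.0,
    ("X2 4800+", "HD 4870"): 63.0,   # faster GPU, same FPS -> CPU-bound
    ("X2 6000+", "HD 4850"): 78.0,
    ("X2 6000+", "HD 4870"): 91.0,   # faster CPU lets the 4870 pull ahead
}

def cpu_bound(cpu, slow_gpu, fast_gpu, threshold=0.05):
    """Treat the setup as CPU-bound if the faster GPU gains less than `threshold`."""
    gain = results[(cpu, fast_gpu)] / results[(cpu, slow_gpu)] - 1.0
    return gain < threshold, gain

for cpu in ("X2 4800+", "X2 6000+"):
    bound, gain = cpu_bound(cpu, "HD 4850", "HD 4870")
    print(f"{cpu}: HD 4870 gains {gain:+.1%} over the HD 4850 -> "
          f"{'CPU-bound' if bound else 'GPU headroom'}")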

PS: If you have a problem with my use of CPU bottleneck, move along, I'll call a CPU plumber next week but I really think the problem might be a CPU bottlecap, or maybe CPU bottleneck stopper or even possibly CPU bottleneck cork.
 

alcoholbob

Diamond Member
May 24, 2005
Well, if we are talking about combat-heavy scenes, technically every game with some sort of action is CPU bound.

The GPU will affect your average framerate during exploration, but combat is largely dependent on CPU power. You can compare a wide variety of GPUs from mid-range to high-end, and when the combat gets rough the framerate differs by maybe 2-3 FPS at most.
 

toadeater

Senior member
Jul 16, 2007
"At the same time, Athlon X2 4800+ is apparently insufficient to reveal full potential of the HD 4850, the 6000+ processor will be most likely the best choice. That's probably the best news for those who want a powerful computer for games for a limited budget."

All this test proved is that you shouldn't buy an X2. How does the 6000+ compare to Core 2s? $120 gets you an E7200, which can be OC'd to 3.8-4.0GHz with cheap RAM, for a system that can play any game available on max settings. That's the budget king.
 

DaveSimmons

Elite Member
Aug 12, 2001
The X2 4800+ is a slow CPU in 2008, so of course it bottlenecks. You already needed an X2 4200+ for the nv 7900 back in April 2006 when I built my previous gaming system.

How does the 6000+ compare to Core 2s
In the AnandTech Phenom review the X2 6000+ is only about as fast as a 2.4 GHz Core 2, so a $160 E8400 at 3.0 GHz should be at least 25% faster.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: DaveSimmons

X2 4800+ is a slow CPU in 2008 so of course it bottlenecks.
Yeah, and so is the A64 6000+. That, and the top resolution of 1680x1050 is low, and they didn't even use AA.

So yeah, pair a slow CPU with low detail settings and it'll be CPU limited. I don't need a chart to tell me that, as it's common sense.

Now, I went from an E6600 to an E6850 and basically got no gain except in older titles that are known to be CPU limited, like HL2 and Painkiller.

http://episteme.arstechnica.co...007078831#492007078831

Unless you run very low detail levels, by far the biggest bottleneck in today's games will be the GPU.
 

chizow

Diamond Member
Jun 26, 2001
How would introducing AA, a GPU limitation, prove or disprove CPU bottlenecks? If anything it skews the results, as you may see no difference in FPS with or without AA in a heavily CPU-bottlenecked situation.

I agree higher resolutions and faster CPUs would've been ideal, but it's also important to remember these aren't even the fastest single-GPU cards available. Other reviews have shown that fast single cards and multi-GPU solutions show CPU bottlenecking even up to 1920x1200. With the 4870X2 and 4870X2 CrossFireX that even extends to 2560 with slower CPUs, as you can get "free 8xAA" even at 1920.

The review does show very clearly, however, that faster GPUs require faster CPUs, to the point that you may see bigger gains from upgrading your CPU than from upgrading your GPU.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: jaredpace
Originally posted by: Astrallite
...when the combat gets rough the framerate differs maybe by 2-3 at most.

aka minimum framerate points. (imo - the most useless bench to compare gpus) I'd much rather see the run in a graph like so:
http://www.xtremesystems.org/f...314&stc=1&d=1218302495


I don't agree with this. When I tested my 8800GTS 640MB, it performed just as well with my 3800+ X2 @ 2.66GHz as with my Q6600 @ 3.6GHz. The minimum framerates were not increased one single bit in most titles I tested; a few titles did see a 3-5% or so increase. This was back when everyone would try to say an X2 4400+ was a bottleneck for an 8800GTS. FYI, the settings I ran during this time were 1680x1050 @ 4xAA with TSAA.

I am in total agreement with BFG10K that running low resolution + low details will obviously allow the CPU to become a limitation in many scenarios... However, the minute you turn up the AA, whether MSAA or TSAA, things really start to shift back onto the GPU. It makes more sense, dollar-wise, to upgrade your GPU more often than the CPU for gaming. I think most people understand this.

Also, my testing debunked the oft-floated theory that a faster CPU raises minimum framerates. Perhaps in multiplayer scenarios, but not single player, at least not in my own personal testing. The GPU is still the bottleneck in today's games at any kind of high detail settings with AA/AF/MSAA.

And... What is the point of having a super powerful GPU rendering low resolution, low details and no AA/AF/MSAA? That isn't even logical in my mind.


 

chizow

Diamond Member
Jun 26, 2001
I've already gone over this many times: there's CPU bottlenecking even at higher resolutions (1920) with faster CPUs. I only linked this review because it shows very clearly that current GPUs need faster CPUs.

4870 CF @ 3GHz
Notice there is virtually no difference between 4870 CF and 4850 CF even up to 2560. Also make note how close a single 4870 comes to 4870 CF.

4870 CF @ 4GHz
Notice how much performance increases with 4870CF vs 4850CF, with 4870CF 20% faster at 2560. It just so happens the 4870 is clocked 20% higher than the 4850...

Also notice a *single* 4870 @4GHz is faster than 4870 CF @ 3GHz at resolutions up to 1920. Absolutely mindbottling!!!!

Like I said in the OP, Digit-Life has promised higher resolutions with faster cards, and hopefully faster CPUs, in the future, but it's quite obvious that historically GPU-bottlenecked resolutions are increasingly becoming CPU-bottlenecked with the faster solutions available today.
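That 20% isn't a coincidence either: the 4870's reference core clock (750MHz) is exactly 20% higher than the 4850's (625MHz). Quick sketch below; the CrossFire FPS values are made-up placeholders rather than the actual chart numbers.

# HD 4870 vs HD 4850 reference core clocks.
clock_4870, clock_4850 = 750, 625
print(f"core clock advantage: {clock_4870 / clock_4850 - 1:.0%}")    # 20%

# Hypothetical 2560x1600 CrossFire results at 4GHz, for illustration only.
fps_4850cf, fps_4870cf = 71.0, 85.0
print(f"observed CF gap: {fps_4870cf / fps_4850cf - 1:.0%}")         # ~20%
# Once the CPU stops being the limiter, the gap tracks the GPU clock difference.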
 

SlowSpyder

Lifer
Jan 12, 2005
I wish they would have used two CPUs that both have the same amount of L2 cache. But I'm really not seeing that big of a difference, at least not at 1680x1050. A few FPS at most. My guess is most people here (who game at 1680x1050 at a minimum; many game much higher) can get by very well with a fairly middle-of-the-road CPU. Faster certainly doesn't hurt, but I just don't see a need for a 4GHz CPU. Not that I wouldn't want one, but I don't see a 'need' for gaming.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: chizow

How would introducing AA, a GPU limitation, prove or disprove CPU bottlenecks?
Because by loading the GPU you're shifting the bottleneck away from the CPU, thereby proving the only reason you were getting a difference in CPUs is because your settings were too low.

I mean let's take it a step further and follow your reasoning. Is it okay for me to test Bioshock at 320x240 with no AA or AF and conclude there's no difference between a GTX280 and a 4850 when the scores come out the same?

If anything it skews the results, as you may see no difference in FPS with or without AA in a heavily CPU-bottlenecked situation.
How does it skew the results? I'm not sure you understand what a CPU limitation is or how to accurately test or interpret it.

Look at my results again:

http://episteme.arstechnica.co...007078831#492007078831

Notice how the right column (% gain) most of the time has a gain of less than 3%? In fact a 25% gain in CPU clock speed only caused an average performance gain in the 15 titles of 1.8%.

That's a textbook GPU bottleneck and clearly going from an E6600 to an E6850 shows almost no difference at the settings I play as they're more than high enough to bottleneck an 8800 Ultra.

Heck, 6 out of 15 titles showed literally no speed increase (0%).

Now drop to 1600x1200 with 4xAA and move across to the left "% gain" column. Even after dropping to a middling resolution with a middling AA level the gain is usually still less than 5%, an overall average of 4.7% for the 15 titles after a 25% gain in CPU clock speed, thereby proving the GPU was largely the bottleneck there as well.
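You can turn those percentages into a rough estimate of how much of the frame time was actually CPU limited. This is just a back-of-the-envelope Amdahl-style calculation (my own simplification, not something from the linked thread), fed with the numbers above:

def cpu_bound_fraction(clock_gain, fps_gain):
    """Rough Amdahl-style estimate: what fraction of frame time must have been
    CPU-limited for a `clock_gain` clock bump to yield only `fps_gain` more FPS?"""
    s = 1.0 + clock_gain    # CPU speedup factor
    g = 1.0 + fps_gain      # observed FPS ratio
    # 1/g = (1 - f) + f/s  =>  f = (1 - 1/g) / (1 - 1/s)
    return (1.0 - 1.0 / g) / (1.0 - 1.0 / s)

# E6600 -> E6850 is a 25% clock bump; the average gains quoted above were 1.8% and 4.7%.
print(f"settings I play: {cpu_bound_fraction(0.25, 0.018):.0%} of frame time CPU-limited")   # ~9%
print(f"1600x1200, 4xAA: {cpu_bound_fraction(0.25, 0.047):.0%} of frame time CPU-limited")   # ~22%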

Other reviews have shown that fast single cards and multi-GPU solutions show CPU bottlenecking even up to 1920x1200.
That may be so, but Digit-Life's charts are useless to make that kind of inference. You can't find any old chart with slow CPUs and low detail levels that shows a CPU limitation and then make sweeping generalizations about high-end configs. It doesn't work like that.

Again look at my results and you'll see it crystal clear.

Even a 2.4 GHz E6600 was fast enough to almost completely saturate an 8800 Ultra at middling settings of 1600x1200 with 4xAA, even in four year old games like Doom 3 and Far Cry.
 

Marty502

Senior member
Aug 25, 2007
How crippled do you guys think a 2.7-2.8 GHz AMD X2 would be at 1680x1050 with 4xAA and a Radeon 4850 in most modern games? GRID, Assassin's Creed, Crysis, you know the drill. The intensive ones.

With my old 3870, Crysis definitely taxed my video card at 1680x1050 even without AA; I had to drop settings like object detail. GRID sometimes dipped into the 20s but mostly stayed cool with 4xAA and everything on Ultra, and Assassin's Creed had pretty good framerates overall with some dips here and there. I wonder if a 4850 would be any better?
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: chizow

4870 CF @ 3GHz
Notice there is virtually no difference between 4870 CF and 4850 CF even up to 2560. Also make note how close a single 4870 comes to 4870 CF.

4870 CF @ 4GHz
Notice how much performance increases with 4870CF vs 4850CF, with 4870CF 20% faster at 2560. It just so happens the 4870 is clocked 20% higher than the 4850...
Notice how none of those results use AA or AF?

How about looking at a proper chart of UT3?

http://www.bootdaily.com/index...6&limit=1&limitstart=7

At 1920x1200 with 8xAA, which in itself isn't a very high resolution, we have no trouble distinguishing between a single 4850, a single 4870, and CrossFire 4850 & 4870 configs.

Heck, even at 1680x1050 we can still see large differences, and keep in mind that their CPU was only an AMD Phenom @ 2.2GHz, a far cry from the 3 GHz/4 GHz configs used in your links.

And notice how as we drop down the page and lower the resolution the differences between GPUs start to decrease? That's because we're shifting the bottleneck away from the GPU and the settings become too low to test GPUs properly, which is the exact problem Tweaktown's results suffer from.

It's just a shame they didn't do 2560x1600 with 8xAA as it would have proven my point even more.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: BFG10K
Originally posted by: chizow

4870 CF @ 3GHz
Notice there is virtually no difference between 4870 CF and 4850 CF even up to 2560. Also make note how close a single 4870 comes to 4870 CF.

4870 CF @ 4GHz
Notice how much performance increases with 4870CF vs 4850CF, with 4870CF 20% faster at 2560. It just so happens the 4870 is clocked 20% higher than the 4850...
Notice how none of those results use AA or AF?

How about looking at a proper chart of UT3?

http://www.bootdaily.com/index...6&limit=1&limitstart=7

At 1920x1200 with 8xAA, which in itself isn't a very high resolution, we have no trouble distinguishing between a single 4850, a single 4870, and CrossFire 4850 & 4870 configs.

Heck, even at 1680x1050 we can still see large differences, and keep in mind that their CPU was only an AMD Phenom @ 2.2GHz, a far cry from the 3 GHz/4 GHz configs used in your links.

And notice how as we drop down the page and lower the resolution the differences between GPUs start to decrease? That's because we're shifting the bottleneck away from the GPU and the settings become too low to test GPUs properly, which is the exact problem Tweaktown's results suffer from.

It's just a shame they didn't do 2560x1600 with 8xAA as it would have proven my point even more.

Agreed. Also, if memory serves me right, the 2.2GHz Phenom isn't much faster clock-for-clock, core-for-core, than the older X2 series. IIRC we are talking 10% faster than an X2, which would roughly equate to an X2 clocked around 2.4GHz.

There may in fact be some isolated examples of CPU bottlenecking out there, but they are far from the norm and occur in such a small number of games that the point becomes quite moot, in my opinion.

Generally speaking, though, a platform upgrade costs more money than a GPU upgrade, although this is not always true. It is also true that a faster CPU and platform will help with more than just games, not to mention game load times, etc. But if the entire reason to upgrade is just to gain some frames per second in some games, be sure you really check it out thoroughly before plunking down that hard-earned cash for a CPU upgrade.
 

praesto

Member
Jan 29, 2007
I only have one point to make in this matter: if you can afford to buy a kickass GPU, then you can damn well also afford an E8xxx, which will be more than sufficient. So in my opinion we do not need faster CPUs. Intel has already delivered "god's gift" to us; we need the graphics manufacturers to do the same. With that said, I certainly can't complain about current GPU prices. They are more than lovely :)
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: ArchAngel777

Agreed. Also, if memory serves me right, the 2.2GHz Phenom isn't much faster clock-for-clock, core-for-core, than the older X2 series. IIRC we are talking 10% faster than an X2, which would roughly equate to an X2 clocked around 2.4GHz.
Yep, the Phenom used in the tests is mid-range at best, yet it was still producing large performance gaps when moving to CrossFire in UT3.

Not to mention that UT3 is more CPU limited than most modern titles, akin to HL2 and Painkiller being used alongside Far Cry and Doom 3 in the olden days.

There may in fact be some isolated examples of CPU bottlenecking out there, but they are far from the norm and occur in such a small number of games that the point becomes quite moot, in my opinion.
I agree completely. Most modern games are massively GPU bottlenecked, and most CPU bottlenecking is caused by inept reviewers running settings they shouldn't be running at.

The fact is if you want a higher framerate in modern games the biggest gain by far comes from upgrading the GPU, not the CPU.
 

AzN

Banned
Nov 26, 2001
Originally posted by: ArchAngel777
Originally posted by: jaredpace
Originally posted by: Astrallite
...when the combat gets rough the framerate differs maybe by 2-3 at most.

aka minimum framerate points. (imo - the most useless bench to compare gpus) I'd much rather see the run in a graph like so:
http://www.xtremesystems.org/f...314&stc=1&d=1218302495


I don't agree with this. When I tested my 8800GTS 640MB, it performed just as well with my 3800+ X2 @ 2.66GHz as with my Q6600 @ 3.6GHz. The minimum framerates were not increased one single bit in most titles I tested; a few titles did see a 3-5% or so increase. This was back when everyone would try to say an X2 4400+ was a bottleneck for an 8800GTS. FYI, the settings I ran during this time were 1680x1050 @ 4xAA with TSAA.

I am in total agreement with BFG10K that running low resolution + low details will obviously allow the CPU to become a limitation in many scenarios... However, the minute you turn up the AA, whether MSAA or TSAA, things really start to shift back onto the GPU. It makes more sense, dollar-wise, to upgrade your GPU more often than the CPU for gaming. I think most people understand this.

Also, my testing debunked the oft-floated theory that a faster CPU raises minimum framerates. Perhaps in multiplayer scenarios, but not single player, at least not in my own personal testing. The GPU is still the bottleneck in today's games at any kind of high detail settings with AA/AF/MSAA.

And... What is the point of having a super powerful GPU rendering low resolution, low details and no AA/AF/MSAA? That isn't even logical in my mind.

Can't argue with you there. This has been tested and proven again and again in multiple articles. As you go to higher resolutions and add AA, it becomes a GPU limitation more than a CPU one. This depends on the game of course, but for the most part we are still GPU limited as more effects and filters get added into the mix.

 

AzN

Banned
Nov 26, 2001
Originally posted by: BFG10K
Originally posted by: chizow

How would introducing AA, a GPU limitation, prove or disprove CPU bottlenecks?
Because by loading the GPU you're shifting the bottleneck away from the CPU, thereby proving the only reason you were getting a difference in CPUs is because your settings were too low.

I mean let's take it a step further and follow your reasoning. Is it okay for me to test Bioshock at 320x240 with no AA or AF and conclude there's no difference between a GTX280 and a 4850 when the scores come out the same?

If anything it skews the results, as you may see no difference in FPS with or without AA in a heavily CPU-bottlenecked situation.
How does it skew the results? I'm not sure you understand what a CPU limitation is or how to accurately test or interpret it.

Look at my results again:

http://episteme.arstechnica.co...007078831#492007078831

Notice how the right column (% gain) most of the time has a gain of less than 3%? In fact a 25% gain in CPU clock speed only caused an average performance gain in the 15 titles of 1.8%.

That's a textbook GPU bottleneck and clearly going from an E6600 to an E6850 shows almost no difference at the settings I play as they're more than high enough to bottleneck an 8800 Ultra.

Heck, 6 out of 15 titles showed literally no speed increase (0%).

Now drop to 1600x1200 with 4xAA and move across to the left "% gain" column. Even after dropping to a middling resolution with a middling AA level the gain is usually still less than 5%, an overall average of 4.7% for the 15 titles after a 25% gain in CPU clock speed, thereby proving the GPU was largely the bottleneck there as well.

Other reviews have shown that fast single cards and multi-GPU solutions show CPU bottlenecking even up to 1920x1200.
That may be so, but Digit-Life's charts are useless to make that kind of inference. You can't find any old chart with slow CPUs and low detail levels that shows a CPU limitation and then make sweeping generalizations about high-end configs. It doesn't work like that.

Again look at my results and you'll see it crystal clear.

Even a 2.4 GHz E6600 was fast enough to almost completely saturate an 8800 Ultra at middling settings of 1600x1200 with 4xAA, even in four year old games like Doom 3 and Far Cry.

Well said.
 

biostud

Lifer
Feb 27, 2003
The test definitely needs a 4xAA + 16xAF comparison as well. No-one will run a 4850 or 4870 without those on.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: BFG10K
Because by loading the GPU you're shifting the bottleneck away from the CPU, thereby proving the only reason you were getting a difference in CPUs is because your settings were too low.

I mean let's take it a step further and follow your reasoning. Is it okay for me to test Bioshock at 320x240 with no AA or AF and conclude there's no difference between a GTX280 and a 4850 when the scores come out the same?
If you're loading the GPU to shift the bottleneck away from the CPU then you agree that the bottleneck is the CPU.

It's ironic that you bring up the 320x240 example, because I asked you the very same thing when AT's results showed similar CPU bottlenecking up to 1920, but you of course ducked the question. Running at 320x240 would most likely serve no purpose, as you'd still be CPU bottlenecked at much higher resolutions with modern GPUs. The point of finding where the paradigm shifts between GPU/CPU bottleneck is so you can make informed decisions about hardware. With current solutions you will be CPU bottlenecked at historically GPU bound resolutions.



How does it skew the results? I'm not sure you understand what a CPU limitation is or how to accurately test or interpret it.

Look at my results again:

http://episteme.arstechnica.co...007078831#492007078831

Notice how the right column (% gain) most of the time has a gain of less than 3%? In fact a 25% gain in CPU clock speed only caused an average performance gain in the 15 titles of 1.8%.

That's a textbook GPU bottleneck and clearly going from an E6600 to an E6850 shows almost no difference at the settings I play as they're more than high enough to bottleneck an 8800 Ultra.
Uh, no shit? Does that mean there's no gain from an E6600 to E6850 if you turn off AA and see a 10-15% boost? Of course not; it's because you chose to run a GPU-bottlenecked resolution and settings on an older part. There are much faster solutions out now that call for faster CPUs.

But let's use the flipside of your 320x240 anecdote: if I ran a P4 3.2GHz with my GTX 280, then went to a Q6600 @ 3.6GHz but ran 16xTRSSAA at 2560x1600 and averaged 12 FPS on both, would I be able to conclude there is no benefit to a faster CPU? Of course not. When you load the GPU you skew your results by hiding any benefit of a faster CPU.
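To put rough numbers on that, here's a toy two-stage bottleneck model. The per-frame times are invented purely for illustration, but it shows how crushing the GPU makes two very different CPUs look identical:

# Toy bottleneck model: per-frame CPU and GPU time in milliseconds (numbers invented).
# Delivered FPS is set by whichever stage takes longer per frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"P4 3.2GHz": 22.0, "Q6600 @ 3.6GHz": 9.0}                      # CPU work per frame
gpu_loads = {"1680x1050, no AA": 14.0, "2560x1600, 16xTRSSAA": 83.0}   # GPU work per frame

for setting, gpu_ms in gpu_loads.items():
    line = ", ".join(f"{cpu}: {fps(cpu_ms, gpu_ms):.0f} fps" for cpu, cpu_ms in cpus.items())
    print(f"{setting} -> {line}")

# 2560x1600, 16xTRSSAA: both CPUs land on ~12 fps, so the CPU difference is invisible.
# 1680x1050, no AA: the faster CPU pulls well ahead, because the CPU is the limiter there.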

Heck, 6 out of 15 titles showed literally no speed increase (0%).

Now drop to 1600x1200 with 4xAA and move across to the left "% gain" column. Even after dropping to a middling resolution with a middling AA level the gain is usually still less than 5%, an overall average of 4.7% for the 15 titles after a 25% gain in CPU clock speed, thereby proving the GPU was largely the bottleneck there as well.
Run those again with SLI or a GTX 280, heck even a 4870, and see how it turns out at that once-high resolution.

That may be so, but Digit-Life's charts are useless to make that kind of inference. You can't find any old chart with slow CPUs and low detail levels that shows a CPU limitation and then make sweeping generalizations about high-end configs. It doesn't work like that.

Again look at my results and you'll see it crystal clear.

Even a 2.4 GHz E6600 was fast enough to almost completely saturate an 8800 Ultra at middling settings of 1600x1200 with 4xAA, even in four year old games like Doom 3 and Far Cry.
Likewise, you can't point to a 16-month-old part based on 2-year-old tech that runs half as fast or less than current solutions, crank up GPU-limited settings, and then claim there is no benefit from faster CPUs...

Originally posted by: BFG10K
Notice how none of those results use AA or AF?

How about looking at a proper chart of UT3?

http://www.bootdaily.com/index...6&limit=1&limitstart=7

At 1920x1200 with 8xAA, which in itself isn't a very high resolution, we have no trouble distinguishing between a single 4850, a single 4870, and CrossFire 4850 & 4870 configs.

Heck, even at 1680x1050 we can still see large differences, and keep in mind that their CPU was only an AMD Phenom @ 2.2GHz, a far cry from the 3 GHz/4 GHz configs used in your links.
Yes, that proves that 1920 with 8xAA is GPU limited; how is that a surprise when you couldn't even find a review with 8xAA at 1920 2 years ago with last-gen parts? Of course the paradigm shifts as you introduce new parts; that's the point of constantly using the fastest available, to see where CPU/GPU bottlenecking converges. If you instantly go to GPU bottlenecked settings you will not be able to clearly see where the shift occurs because the GPU intensive settings skew the results.

But even in that example, you can see the faster solutions in the comparison are still CPU limited up to 1680x1050 with 8xAA, as there is very little increase between resolutions with the GTX 280 and 4870CF. If you injected a faster CPU you would most likely see an increase at the lower resolutions but still see GPU bottlenecking at 1680 or 1920 with such high AA levels.

Even those slower parts are probably getting "free AA" up to 1680 because such a slow CPU is bottlenecking maximum framerates, which is fine, as the paradigm simply shifts again to higher resolutions with faster GPUs and more CPU bottlenecking.
And notice how as we drop down the page and lower the resolution the differences between GPUs start to decrease? That's because we're shifting the bottleneck away from the GPU and the settings become too low to test GPUs properly, which is the exact problem Tweaktown's results suffer from.
Yes, of course lowering the resolution/settings will increase performance, but it also shows how the faster solutions are CPU bottlenecked. There is less of a difference, but can you conclude the parts perform the same based on a CPU bottlenecked resolution? Without AA, CPU bottlenecking extends even to previously GPU bottlenecked resolutions like 1920.

It's just a shame they didn't do 2560x1600 with 8xAA as it would have proven my point even more.
It would've proven 2560x1600 with 8xAA only became a reality with this last generation of parts....how is this a surprise?

 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: chizow

If you're loading the GPU to shift the bottleneck away from the CPU then you agree that the bottleneck is the CPU.
Right, but that was never under contention.

The contention was with you running around quoting AA-less benchmarks and Digit's chart with slow CPUs and low settings and trying to draw inferences across the board. The fact is people run AA with high-end boards (that's the whole point of them) so your ramblings are largely meaningless to them.

It's ironic that you bring up the 320x240 example, because I asked you the very same thing when AT's results showed similar CPU bottlenecking up to 1920, but you of course ducked the question.
Uh, no. I repeatedly pointed out the examples with 8xAA, especially FiringSquad's 2560x1600 8xAA tests, but then you started harping on about fictional game caps and ignored the results, so we never got anywhere.

With current solutions you will be CPU bottlenecked at historically GPU bound resolutions.
Not if you run AA you won't, something that is the whole point of high-end configs, and something many do on such configs.

Uh, no shit? Does that mean there's no gain from an E6600 to E6850 if you turn off AA and see a 10-15% boost?
Why would I turn off AA? For that matter why should anyone that runs with AA pay attention to any of the AA-less benchmarks you provide?

If you chose to run without AA on your GTX280 that's your problem, not mine. But don't run around claiming people are CPU bottlenecked just because you chose to run pre-shader settings on your 2008 card.

There are much faster solutions out now that call for faster CPUs. Run those again with SLI or a GTX 280, heck even a 4870, and see how it turns out at that once-high resolution.
Likewise there are also much more demanding games than Far Cry and Doom 3.

A GTX280 is probably 50% faster than an 8800 Ultra and I'd bet Far Cry and Doom 3 can generate 50% more framerate at similar settings than the likes of Call of Juarez, so the net effect would be similar.

After I get a GTX280 I'll probably move to a Core i7 and provide more benchmarks, thereby proving you wrong yet again. But I'm sure you'll be telling me how my games are "capped". :roll:

Yes, that proves that 1920 with 8xAA is GPU limited; how is that a surprise when you couldn't even find a review with 8xAA at 1920 2 years ago with last-gen parts?
I'm not sure what that answer is even supposed to mean.

Are you going to retract Tweaktown's AA-less benchmarks using a 3 GHz/4 GHz CPU, given it's been demonstrated that a 2.2 GHz Phenom is enough to saturate even 4870 Crossfire when running 8xAA?

Remember, one of the main attractions of the 4870 is to run 8xAA; running one without AA is just dumb. The same applies to nVidia high-end configs.

If you instantly go to GPU bottlenecked settings you will not be able to clearly see where the shift occurs because the GPU intensive settings skew the results.
You mean like the useless Digit-Life results you provided which instantly go to CPU limitation because of archaic processor speeds and Voodoo era image settings?

It would've proven 2560x1600 with 8xAA only became a reality with this last generation of parts....how is this a surprise?
Uh, no. Plenty of older games would be playable at those settings even on older generation parts (e.g. HL2).
 

chizow

Diamond Member
Jun 26, 2001
I'm not going to waste a lot of time replying to your nonsense on a point-by-point basis; about 15 reviews just popped up proving me 100% correct. Here are a few highlights:

Tech Report Call of Duty 4 Results

Well, this is a great game, but it's pretty much CPU limited with its quality settings maxed out along with 16X aniso and 4X AA. At 2560x1600, the multi-GPU configs do separate a little bit, and the 4870 X2 comes out ahead of three GTX 260s in SLI, let alone two.

Tech Report Quake Wars: ET

It's clearly obvious even up to 2560 with 4xAA that the 3.0GHz Core 2 CPU is not capable of pushing more FPS, as the faster solutions are capping out at 86-89 FPS in COD4 and 110-115 in QW:ET from low resolutions all the way up to 2560.

It's even more obvious when you add a 2nd, 3rd, or even 4th card to the mix and they yield absolutely no improvement, and in many cases even reduced performance, due to additional CPU overhead.

So again, based on those results, do you think the 9800GTX XXX SLI or 9800GX2 are as fast as Radeon 4870X2 CrossFire (4 GPUs) at 1920x1200 with 4xAA in COD4? The graph shows 83, 84, and 89 FPS respectively; are you going to conclude they're comparably fast solutions? Or are you going to start measuring performance in "+free AA"? Or are you going to admit the obvious, that current CPUs are simply not fast enough to keep up with current GPU solutions?

The results are very different from Hardware Canucks', one of the few sites that figured it would make sense to test these exponentially faster solutions with a faster-than-stock CPU.

COD4 @ Hardware Canucks

From their conclusion:

Notice I said that this card needs the right system? I mean it. The HD4870 X2 does not deserve to be put in a non-overclocked system with an under-24" monitor. This means that if you are playing at a slightly lower resolution, you have the very real possibility of this card being severely CPU limited if you don't overclock the balls off your processor. Just remember this when you are searching through the internet for more information about this card; in many cases, a stock processor can and will hamstring ATI's new wunderkind in many cases. Heck, even our overclocked QX9770 at nearly 4Ghz showed it wasn't up to the task of running with this card in a game or three.

Pretty much every review mentions CPU bottlenecking in general or on a game-by-game basis. There's no point in linking every example; I'd be here all day, and it's plainly obvious when you can literally lay a ruler flat across the max FPS and it's the same across different resolutions and settings.
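If you want the ruler test in code form, here's a minimal sketch with placeholder FPS numbers in the spirit of the COD4 figures above (not the actual chart data):

# Placeholder FPS results for one multi-GPU config across resolutions (not real chart data).
fps_by_res = [("1280x1024", 89), ("1680x1050", 88), ("1920x1200", 87), ("2560x1600", 74)]

def cpu_capped_resolutions(results, tolerance=0.04):
    """Flag resolutions whose FPS sits within `tolerance` of the best result:
    if lowering the resolution buys almost nothing, the GPU isn't the limiter there."""
    best = max(fps for _, fps in results)
    return [res for res, fps in results if fps >= best * (1.0 - tolerance)]

print(cpu_capped_resolutions(fps_by_res))
# ['1280x1024', '1680x1050', '1920x1200'] -> flat up to 1920; only 2560 starts to pull away.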

I'm sure you'll reply by pointing out specific examples with 8xAA or more at 2560 that show the CPU has no impact at GPU-bottlenecked settings and resolutions, but the fact remains that every example above, and nearly every review that came out today with the 4870X2, shows heavy CPU bottlenecking, even with AA, up to 1920 and sometimes even 2560 with current CPUs. The reason we still use Voodoo-era resolutions and settings without AA is that most people still game at 1920 or lower without AA in modern games and use reviews to make informed decisions.