Tom's Hardware: CPU/GPU Bottlenecks


AtenRa

Lifer
@RussianSensation

Well, I suppose we could apply 4x AA in some games at those resolutions and be ok for gameplay, but it's a CPU review and I didn't run the benches with filters. ;)
 

BFG10K

Lifer
But if you play games like Civ5, SC2, Dragon Age: Origins, Far Cry 2, World in Conflict, Arma2, RE5 and get a Q6600 + GTX480 then a Core i7 860/920 + GTX470 system will destroy it.
They had Starcraft 2 and it was showing clear GPU bottlenecking, and they didn’t even use AA. If they’d dropped a GTX480 into that system they would’ve gotten a large performance gain, especially if they’d been using AA.

Yes, there are some cherry picked games that show bigger performance gains from the CPU, but for the vast majority of games running at their highest playable settings, they’ll be influenced by the graphics card the most, often to the point of being completely bottlenecked by it.
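To put the same point as a toy model (illustrative numbers only, and a simplification, since real engines overlap the two stages): a frame can only be delivered as fast as the slower of the CPU and GPU stages, so once the GPU stage dominates, extra CPU speed buys you nothing.

Code:
# Toy bottleneck model: frame time is set by the slower stage.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=10.0, gpu_ms=22.0))  # ~45 FPS: GPU-bound
print(fps(cpu_ms=5.0, gpu_ms=22.0))   # still ~45 FPS with a 2x faster CPU
print(fps(cpu_ms=10.0, gpu_ms=17.0))  # ~59 FPS once the GPU gets faster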

They tested AvP at 1920x1080 8AF and Anno1404 at 1920x1080 8AA! on a GTX460 768mb and then measured CPU limitation on a Core i5 4.0ghz lol! Are they nuts? A 768mb GTX460 can't play AvP with tessellation smoothly regardless of what CPU it's paired with. That would be the same as using an 8800GT at 1920x1200 Enthusiast settings in Crysis and then arguing that changing CPU speed from 4.0ghz to 2.0ghz made no difference - we already know the conclusion before testing!
The point was that a GTX460 is a bottleneck even with a dual-core CPU, and that something like a GTX480 would show a much higher framerate.

I tested AvP3 on a GTX470 using 1920x1200 with 16xAF and 2xTrMS and there was absolutely no performance drop by disabling two cores on my i5 750 (43.40 FPS average in both cases). I also played through the game from start to finish at those settings, and most of the time the framerate was over 40 FPS, which is fine for a non-twitch shooter like that.

Next I dropped a GTX480 into my system and at the same settings it caused a 27.43% performance gain (57.60 FPS).

Next I re-tested my GTX480 with an i7 870 (14% clock advantage over an i5 750) and I got zero performance gain.

I’ve confirmed this many times in dozens of games using the actual settings I play them at. The GPU is vastly more important for dictating overall performance than the CPU is.

We often see people on the forums with much slower cards (e.g. 4850) with decent dual-cores being absolutely petrified to upgrade their GPU because they think they’re CPU limited. Or even better, we see people overclocking hex-core processors while running crappy 1680x1050 displays.

Plus no minimum framerates, which makes any CPU limitation article worthless.
A minimum framerate means squat if it doesn’t have a benchmark graph putting it into perspective. Here’s an example from your own sources:

[Image: CODMW2_01.png - Call of Duty: Modern Warfare 2 benchmark graph]


The average is ~15% lower, but the minimum is the same, so I'd guess you’d consider the single core equal?
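Both numbers fall out of the same frame-time log, and a minimal sketch (made-up frame times, not from any benchmark) shows how an identical minimum can sit alongside clearly different averages:

Code:
# Hypothetical frame-time logs in ms per frame; invented data to show why
# a lone minimum FPS number needs the rest of the graph for context.
quad = [22, 24, 23, 25, 45, 23, 24]    # one slow frame -> min = 1000/45
single = [28, 30, 29, 31, 45, 30, 29]  # same worst frame, slower overall

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def min_fps(times_ms):
    return 1000.0 / max(times_ms)

for name, log in (("quad", quad), ("single", single)):
    print(f"{name}: avg {avg_fps(log):.1f} FPS, min {min_fps(log):.1f} FPS")
# Both logs report a ~22 FPS minimum, yet the averages differ by ~16%.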

Ok now, let's say I add a GTX480 into the 2 systems. I can increase AA, resolution, and still get the same 38 avg if I wanted to on the C2Q @ 3.6ghz, since I'll be transferring the load to the GPU. My CPU can still support 38 fps avg. Therefore, I'll be able to increase image quality and still maintain decent playability, or I will get faster framerates than 38 fps on a faster CPU at the same image quality (if I don't have a CPU limitation). Now if I add a GTX480 into the E6850 rig, it's still choppy and unplayable. I am already at < 30 fps without AA, with minimums at 21!
It doesn't always work like that, as Tom's article showed. If there's a significant load on the GPU, you can still see higher performance from a faster card even if the GPU isn't utilized 95% or more. The same applies to the CPU, except it's the graphics card that does this far more often.


Here’s a quick example from my own Far Cry 2 testing at 2560x1600, 16xAF, 2xTrMS.
  1. i5 750 (2 cores) + GTX470 = 45.04 FPS.
  2. i5 750 (4 cores) + GTX470 = 47.92 FPS (6.39% gain over #1).
  3. i5 750 (4 cores) + GTX480 = 58.09 FPS (21.2% gain over #2).
  4. i7 870 (4 cores) + GTX480 = 59.72 FPS (2.81% gain over #3).
So, dropping a GTX480 into my system netted over three times the performance gain of adding an extra two cores. Also a 14% clock increase from the i7 870 provided less than a 3% performance gain on a GTX480.
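For anyone checking the math, those percentages are just step-over-step gains computed from the FPS figures above:

Code:
# Reproduce the step-by-step gains from the Far Cry 2 results listed above.
results = [
    ("i5 750 (2 cores) + GTX470", 45.04),
    ("i5 750 (4 cores) + GTX470", 47.92),
    ("i5 750 (4 cores) + GTX480", 58.09),
    ("i7 870 (4 cores) + GTX480", 59.72),
]
for (prev_name, prev_fps), (name, fps) in zip(results, results[1:]):
    print(f"{name}: +{(fps / prev_fps - 1) * 100:.2f}% over {prev_name}")
# Prints +6.39%, +21.22%, +2.81% -- matching the figures above.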

These aren’t BS theoretical settings either; I’m actually playing through Far Cry 2 right now at those settings, and the GTX480 has provided a far larger performance gain than all of my CPU upgrades combined.

I’d much rather have an E6850 + GTX480 for Far Cry 2 than an i7 870 + GTX470 because the former will run the game much better than the latter, at very realistic and playable settings too.

Quake 1-4
Doom 3
Far Cry 1
Serious Sam 2
Fear 1
Quake Wars
HL2
Unreal Tournament 99, 2004
Call of Duty 1, 2
Return to Castle Wolfenstein
Medal of Honor Pacific Assault
I’m glad you’re such an astute student of my benchmarking methods Russian, LOL! :p

But if you look closely at my CPU scaling articles (e.g. the last one which tests the i5 750 with two vs four cores), you’ll see I stick to more modern games. Due to the forum rules I’m not allowed to link to my articles, but you can if you want to discuss specifics.

The upcoming i7 870 vs i5 750 will do the same, except I’ll throw in some older games just to prove to people that any game can be GPU bottlenecked by running high enough settings, settings which are both realistic and meaningful for gaming situations.

For example, Quake 3 (an 11 year old game) is bottlenecked by my GTX480 at my settings of 2560x1600 with 16xAF and 16xS. ~200 FPS average is plenty for that twitch shooter, and I actually use those settings in that game.
 

AtenRa

Lifer
Next I dropped a GTX480 into my system and at the same settings it caused a 27.43% performance gain (57.60 FPS).

Next I re-tested my GTX480 with an i7 870 (14% clock advantage over an i5 750) and I got zero performance gain.

In the AvP benchmark with tessellation enabled, CPU speed doesn't matter; the GPU plays the most important role. I believe this to be true in all FPS DX11 games with tessellation enabled, like AvP and Lost Planet 2. Civ5 needs both CPU and GPU.

So in the end, it all has to do with the game and the resolution/settings we play each game at.
 

RussianSensation

Elite Member
and the GTX480 has provided a far larger performance gain than all of my CPU upgrades combined.

I remember I upgraded from an Athlon XP 1600+ to a P4 2.6ghz C @ 3.2ghz and my frames with a Radeon 8500 doubled in Unreal Tournament. Many years ago I was playing BF: 1942 in college using my XP1600+ & Radeon 8500 64mb. At the time, my friend had a Pentium III 733mhz + PNY GeForce 4200, a card probably 2x faster than the 8500. He was never able to have the same smooth gameplay as me and always had to lower his in-game image quality settings compared to my system, despite a much faster videocard. Since we played over LAN, I was always there in the room bugging him to upgrade his CPU. hehe

I have experienced CPU bottlenecking so many times, even with my Q6600 @ 3.4ghz in Resident Evil 5, that I will always buy a system with a CPU that will let me max out 2-3 generations of GPUs. I am perfectly fine being GPU bottlenecked because I can swap in a new GPU every 12 months. However, my trusty i7 I got at Microcenter more than a year ago is easily going to last another 12 months - not so sure about the 470! Even with SB, it's not looking like I will be able to increase my CPU performance by more than 50%, at least not until Socket 2011 ships. Plus, because the CPU helps in other aspects of PC use, not just games, I'd rather spend $200+ on a CPU upfront and not worry about it for 2+ years. Again, I could understand if a quad-core cost 2x more than a dual-core today. However, with Core i3s at $140 and i5s at $200, it's not very smart to skip the quad.

Again, remember you are using extreme examples of 2560x1600 gaming. Don't forget your resolution has 98% more pixels than mine. Therefore, your videocards have 2x the workload from the beginning. This is why you rarely face CPU bottlenecks (plus your undeniable appreciation for the FPS genre, which tends to be a lot less CPU limited in the first place). :p
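The arithmetic behind that figure (assuming 1920x1080 as the comparison resolution, which the 98% implies):

Code:
# Pixel counts behind the "98% more pixels" claim.
hi = 2560 * 1600  # 4,096,000 pixels
lo = 1920 * 1080  # 2,073,600 pixels
print(f"{(hi / lo - 1) * 100:.0f}% more pixels")  # -> 98%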
 

Ben90

Platinum Member
I'm sorry if this has already been mentioned, but I'm not reading this thread. I saw they were testing a 768mb card at 1920x1200@8xAA and stopped reading. This isn't testing CPU/GPU bottlenecks, they are testing CPU/VRAM bottlenecks. Most useless article ever.
 

MangoX

Senior member
Many moons ago I picked up an 8800GTS 640mb when they launched and paid very dearly for it as an upgrade to my X800XL, only to find that the performance improvement was not what reviews showed. Being younger and less wise, I chalked it up to the differences between the test systems. I was still using an old Athlon64 X2 3800+ and reviewers were testing on Core2. I didn't know about CPU limitations.

Later down the road I started to get into video encoding and I needed something faster than what I had. I ended up getting a 680i board and a G0 Q6600... a CPU known for its amazing overclocking abilities. Right out of the box I easily hit 3.3ghz stable, and wow... are my games ever flying.

The guys recommending an i7 + GTX470 have reason to, as they've most likely experienced CPU limitations in some way, as I have. Though an i7 is not necessary at all for gaming. The sweet spot appears to be the i5 750/760, and it's all we'll need for the next few years.
 

RussianSensation

Elite Member
I'm sorry if this has already been mentioned, but I'm not reading this thread. I saw they were testing a 768mb card at 1920x1200@8xAA and stopped reading. This isn't testing CPU/GPU bottlenecks, they are testing CPU/VRAM bottlenecks. Most useless article ever.

Mentioned by Termie in Post #4 on the first page. You missed the part where they used AA+Tessellation at 1920x1200 in Metro 2033, a setting not even a GTX480 SLI setup can run smoothly. They basically applied the maximum AA levels in modern games, which brought the 768mb card to its knees - 8AA that no one with a GTX460 would use at 1920x1200. But eh, it sure supports the CPU limited conclusion well. ;)

There are basically 2 schools of thought here. The first group wants 8-16AA/2-4x TrSS at 60 fps in games. Since those settings are impossible in modern 2009-2010 games, which will chug at 16-35 fps, Group #1 defaults to playing 5-10 year old games. As a result, any modern game is GPU limited and is off limits until a GeForce GTX780+ arrives.

Group #2 doesn't play older games and has long moved on to modern games. However, they are also not interested in playing Metro 2033 in 2015. They instead want 60-85 fps with 40-50 fps minimums, even if it means 0-4AA/no tessellation in modern games. This group believes that Metro 2033, STALKER: CoP, and Crysis Warhead all look better with 0AA than all the COD, Unreal Tournament, Quake/Doom, and Far Cry games combined with 16xAA+. Group #2 has also purchased their latest hardware to play modern games smoothly (think 50-60+ fps), not necessarily to max them out with an endless array of filters.

If you associate yourself with Group #1, then almost every modern game you play will be GPU limited. Since Group #1 will not play any modern game at 0AA, even with maxed in-game quality settings and smooth framerates, to them CPU speed is irrelevant. By the time they do play those modern games, 4-5 years from now, they will have upgraded their CPU many times over, at which point no "modern game today" will be CPU limited. As a result, no CPU bottleneck ever exists for Group #1.
 

bryanW1995

Lifer
They aren't testing with the wrong graphics card at the resolution they are using.
Actually their test is very sensible. Take a graphics card positioned at a certain resolution, and test it at that resolution.
http://www.techpowerup.com/img/10-09-06/27b.jpg

They paired it with a sensibly priced CPU, the sort which you might see in a GTX460 system, which is the easiest way to see which component you should think of spending more on (~$200 GPU vs ~$200 CPU).

:$: my i7 920 is now clamoring for a hotter girlfriend. thanks a lot.
 

bryanW1995

Lifer
The article doesn't really help anyone who plans on upgrading imo. Most people who are looking to upgrade the CPU are already using dual cores (the C2D 1.86 - 3.0ghz variety - E6300 - E8400) or low end quads at stock speeds (such as the Q6600/6700). These people are wondering if upgrading the GPU from their 4850/4870/GTX260 is worth it, or if they are going to be bottlenecked by their slower CPUs. In other words, it would be completely wasteful to get a GTX480 for a stock E6600/Q6600, as such systems would produce almost identical framerates with a slower GTX460/5850 videocard.

From that perspective, the article did little to help these users decide what to upgrade. It would have been far better to see various systems such as a C2D 1.86, C2D 3.0ghz, Core i3/i5 @ 4.0ghz, and Athlon X4s + 4850 compared to the same CPUs with a GTX460/480/5870, and then SLI/CF setups tested too. Then we would have seen which GPUs are wasteful for which CPUs, and what the minimum modern CPU clock speed/core count is for modern games (not just the FPS variety either).

Plus, you can't compare an i5 dual-core processor to a dual-core Phenom or C2D processor due to the differences in performance per clock and the effects of the shared 8mb cache. And like I said, they didn't include minimums in most of their graphs - the CPU plays a large role in minimum framerates.

This article would have been great if Xbitlabs, LegionHardware, PCGamesHardware and Techspot hadn't already produced far superior CPU/GPU articles. However, since the results from these websites constantly contradict the predominant view on our forum that CPU speed is not important, I only see Toyota, myself and a handful of others linking to them (with BFG on many occasions ignoring results from all 4 of those websites because they show both CPU frequency and core count dependence in a large variety of games, and they focus on minimum framerates - a metric BFG largely dismisses as "inaccurate").

So we have 4 independent sources which continue to show that CPU speed is important and 1 source that shows it isn't (on top of that, using a $130 videocard paired with a $200 CPU to prove their point). It's almost the same as Wreckage trying to find 1-2 outlier benchmarks where a stock GTX460 beat an HD5870 and then claiming that the GTX460 is as fast as an HD5870. Bottom line is, every game is different. The games one plays should be tested separately in order to answer whether the CPU or GPU is more important for a particular game.

Again, what is so surprising about a GTX460 768mb being the bottleneck of a Core i5 3.0ghz system in games tested at 1920x1080 4/8AA? They mysteriously omitted testing a GTX480/5970 to show the importance of the CPU.

you have a lot of good points. I have personally seen on many occasions the importance of higher cpu speed/oc'ing in DAO and civ5. In fact, I always close down all other apps and boinc when playing either of those games on my i7 920 @ 4.0 b/c otherwise I can get some hiccuping here and there. that experience would certainly be worse on a dual core at the same speed. When I was at 3.0 on my i7 my gameplay was definitely more "choppy" in those 2 games, esp with anything open in the background. That was both with the gtx 260 and the gtx 460 - 768.

I think you went a bit overboard comparing bfg to wreckage. BFG has his opinion. for once I actually disagree with him, but that was like comparing keys to rollo b/c they're both focus group members.


edit: specifically in civ 5, I had to run for a while at stock before getting a new mobo and going back up to 4.0+. It was literally a night and day proposition. I went from avg/low settings on a normal map to everything high + 4xAA on the same size map, and I still get much better performance with no/nearly no lag even in the later game stages now. I wish I could get 4.5 out of this rig!
 

bryanW1995

Lifer
The game was a poorly coded POS, basically filled with memory leaks.

It still choked every 5 seconds since you have to change camera angles and perspectives all the time in environment and combat.

do you have anything running in the background? I'm only at 1680x1050, but I run 8xAA just fine on my i7 920 @ 4.0 with a gtx 460 - 768 @ 900/4300. Haven't played it in a few months, but I think that my last playthrough was nearly perfect actually. I have a few final saves before the archdemon, I'll go check out that battle with my current rig but iirc even that sequence was smooth once I got over 4ghz on my cpu.
 

bryanW1995

Lifer
I am not so concerned with the data as I am with the dangerous advice that can come out of it. I just don't want to see someone looking to build a system to last 2-3 years go out and buy a Core i3-560 3.33ghz ($150) + GTX460 1GB ($200) over a Core i5 750/1055T ($200) + GTX460 768mb ($150) because of that article. This is exactly the type of advice I fear, because it will result in the person upgrading both the GPU and the CPU 2 years from now. The i3 system will be basically worthless, while a 3.8ghz+ 750/1055T will be usable 2 years from now. You know what I mean? We have seen first hand single-core A64 owners suffer this fate, and soon we will see all dual-core owners suffer the same.

I can't even count how many times I have seen people who initially bought an E6600/E6850 upgrade to a Q9550 to ride this generation out with new GPU cards. On the AMD side, at least you can swap in a new CPU, since you can get an X4 940 for $100, for example. However, with the price gap between a decent dual-core and a quad-core being about $50-70 now, it is too much of a gamble to get a dual-core when a lot of games already benefit from quads (Starcraft 2, DA:O, Supreme Commander 2, World in Conflict, Civ5, GTA IV, Resident Evil 5, Mass Effect 2). Just my 2 cents on the matter.

they recommended that the optimum amount of cores was 2.75, so anybody building a rig today would reasonably get 3+ cores, right? The only thing they should have clarified is that HT makes little to no difference in gaming, so don't think that 2+2 ~ 3 cores.



edit: Here's a link to one of the articles that BFG 10K and RussianSensation are talking about:


http://alienbabeltech.com/main/?p=20501
 

bryanW1995

Lifer
I'm sorry if this has already been mentioned, but I'm not reading this thread. I saw they were testing a 768mb card at 1920x1200@8xAA and stopped reading. This isn't testing CPU/GPU bottlenecks, they are testing CPU/VRAM bottlenecks. Most useless article ever.

not true, I use 4x/8xAA in every single game I play at 1680x1050, which is ~85% as many pixels as they used. The gtx 460 - 768 is far and away the best current value at 1680x1050 or less, but it is still competitive with faster cards at 1920x1080. It's just not likely to have much staying power in future games at higher resolutions.
 

crucibelle

Senior member
they recommended that the optimum amount of cores was 2.75, so anybody building a rig today would reasonably get 3+ cores, right? The only thing they should have clarified is that HT makes little to no difference in gaming, so don't think that 2+2 ~ 3 cores.
Yep, this is what I saw. I didn't see in the conclusion of the article where a dual core processor was being recommended over a 3 or 4 core. So, I'm not sure what all the fuss is about.
 

SickBeast

Lifer
I've been finding myself CPU limited lately to the point that I can no longer run several current games. I've got an Opteron 165 overclocked to 2.6ghz.
 

toyota

Lifer
I've been finding myself CPU limited lately to the point that I can no longer run several current games. I've got an Opteron 165 overclocked to 2.6ghz.
yeah I moved away from that level of cpu 2 years ago this month. it certainly won't get the job done when running decent settings in many modern games. even in games where it is perfectly playable, it will still noticeably limit any modern mid-range or better gpu.
 

ArchAngel777

Diamond Member
I've been finding myself CPU limited lately to the point that I can no longer run several current games. I've got an Opteron 165 overclocked to 2.6ghz.

Was that a dual core? IIRC it has a 25% IPC deficit to the original Core 2 Duo, which puts you between an E6300 and an E6400 Conroe. Yep, I'd say you need to upgrade your CPU. But you were able to get quite a bit of life out of it... That thing is darn near 4 years old.
 

toyota

Lifer
Was that a dual core? IIRC it has a 25% IPC deficit to the original Core 2 Duo, which puts you between an E6300 and an E6400 Conroe. Yep, I'd say you need to upgrade your CPU. But you were able to get quite a bit of life out of it... That thing is darn near 4 years old.
yeah, the Opty at 2.6 would be about like an E6400.
 

RussianSensation

Elite Member
I think you went a bit overboard comparing bfg to wreckage. BFG has his opinion. for once I actually disagree with him, but that was like comparing keys to rollo b/c they're both focus group members.

Yeah, that is totally not cool.

Nah, I guess I didn't make myself clear. I apologize if it came off as me "attacking BFG". I wasn't comparing BFG to Wreckage. I was comparing Tom's reviewer using extreme AA settings in his game testing to predictably "prove" that most games are GPU limited to Wreckage's extreme examples of a GTX460 beating an HD5870 by throwing out unrealistic situations no one would use the HD5870 in (like Extreme tessellation in Heaven). Tom's review automatically assumed that people will slap on high AA+Tessellation whenever possible (like in Metro 2033, AvP, Just Cause 2), even if it means sacrificing all playability in those games (50-60 fps).

Tom's never contemplated that some gamers may want 60+ fps at reduced or no AA. In those gaming situations, CPU speed does matter, because a slower clocked CPU often can't get you 60 fps even if you disable all filters. Also, I am pretty sure 1920x1200 with 8AA is not the target market for a $130-150 GTX460 768mb card.
 

ArchAngel777

Diamond Member
Nah, I guess I didn't make myself clear. I wasn't comparing BFG to Wreckage. I was comparing Tom's reviewer using extreme AA settings in his game testing to predictably "prove" that most games are GPU limited to Wreckage's extreme examples of a GTX460 beating an HD5870. Tom's review automatically assumed that people will slap on high AA+Tessellation whenever possible (like in Metro 2033, AvP, Just Cause 2), even if it means sacrificing all playability in those games (50-60 fps).

Tom's never contemplated that some gamers may want 60+ fps at reduced or no AA. In those gaming situations, CPU speed does matter, because a slower clocked CPU often can't get you 60 fps even if you disable all filters. Also, I am pretty sure 1920x1200 with 8AA is not the target market for a $130-150 GTX460 768mb card.

Fair enough... From my perspective I see two debates going on, and this is very typical of most debates on any subject... You have the far left and the far right, and neither tends to budge towards the middle, as if moving towards the middle is somehow a sign of weakness.

Both are important, but my opinion on this issue is the same as BFG's and always has been thus far. Still, I don't think BFG's or my view is an extreme one. We do agree that some games require serious CPU horsepower, but I think we are clear that in 8 out of 10 games, across the board, the CPU takes second place in the role of gaming.

I also don't agree with the statement you made about this topic having "dire consequences" for the less informed masses. I have found that if someone doesn't research and cross-analyze data before making a purchase, then they likely wouldn't notice the difference anyway. They are in a blissful state, because ignorance is bliss. Sure, a buddy comes over and tells him that he made a bad purchase, but that is what rationalization is for. He will just make up excuses as to why his buddy is wrong, and he will have deceived himself into believing it and all will be well with the world. Somewhat joking there, but I do find it true. I don't think Tom's Hardware is "damaging" to people's purchase decisions. What I do find damaging to the consumer is lying cheats and unfair business practices designed to extort, but that has nothing to do with this thread.
 

SickBeast

Lifer
Was that a dual core? IIRC it has a 25% IPC deficit to the original Core 2 Duo, which puts you between an E6300 and an E6400 Conroe. Yep, I'd say you need to upgrade your CPU. But you were able to get quite a bit of life out of it... That thing is darn near 4 years old.
Yeah, the problem is I would have to buy new memory and everything, so I'm holding out.

My GPU is outdated as well. I'm waiting for a bit and then I'm going to get an entirely new computer.
 

ArchAngel777

Diamond Member
Yeah, the problem is I would have to buy new memory and everything, so I'm holding out.

My GPU is outdated as well. I'm waiting for a bit and then I'm going to get an entirely new computer.

Gaming gets old though... For the first time in my life, I don't really care about upgrading my computer. I have no plans to upgrade my GTX 280. Heck, I don't even have plans to set up my desktop after the move. This laptop I use with an SU7300 handles what I need too, even King's Bounty: Crossworlds for when I decide that gaming is not so boring, only to find out that it is boring again.
 

SickBeast

Lifer
Gaming gets old though... For the first time in my life, I don't really care about upgrading my computer. I have no plans to upgrade my GTX 280. Heck, I don't even have plans to set up my desktop after the move. This laptop I use with an SU7300 handles what I need too, even King's Bounty: Crossworlds for when I decide that gaming is not so boring, only to find out that it is boring again.
This is true.

IMO it's going to take the next-gen consoles to push gaming to the next level anyways. In a sense I'm thankful that most PC games are console ports; it means that my computer can run anything that an Xbox 360 or PS3 can.

Before I get a whole new computer I'm going to get an SSD and a 2TB hard drive. After that I'll probably get a GPU and see how my system copes with it.