How much will a CPU bottleneck a GPU?

TheInternal

Senior member
Jul 7, 2006
447
0
76
Greetings all,

After being trolled on about a video card upgrade, my confidence has been a bit shaken. Would a CPU technology released in 2004 significantly bottleneck a GPU released in 2006?

For example: would an AMD Athlon 64 X2 4400+ significantly bottleneck an nVidia 9800 GT card? What about two nVidia 9800 GT cards in SLI? How much of a performance reduction would occur due to CPU limiting? Would going from two 7950 GT cards in SLI to two 9800 GT cards in SLI still net a performance increase of 80% or more in most games? How much of the GPU's maximum output, percentage-wise, would be gimped by the processor?
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
You weren't trolled on. The guy was being honest with you.

You seem not to understand something: it isn't about your CPU holding back a modern GPU (though it will do a lot of that). It's about your CPU not being strong enough for the CPU part of the equation in the games you want to play at the res you specced.
 

LoneNinja

Senior member
Jan 5, 2009
825
0
0
It doesn't matter what video card you have; the now-aged Athlon X2 processors will struggle with some of today's games regardless, because they simply can't keep up with the game itself. It has nothing to do with a GPU bottleneck.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
It works two ways. First, some pieces of game code require both a fast enough CPU and a fast enough GPU to run well. Second, the CPU may be doing other framerate-related work, like advanced physics calculations, where it directly throttles your FPS because it can't figure out where objects on screen are going to be located fast enough. In GPGPU computing there is also a direct relationship between GPU and CPU performance, because calculations are being done on both units simultaneously; a slow CPU can definitely slow down a fast GPU if the GPU is sitting idle waiting on data to be processed by the CPU.
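One way to picture the point above: if every frame needs both CPU work and GPU work, the slower stage sets the frame rate. A toy sketch of that idea (the millisecond figures are made-up illustrations, not measurements from any real system):

```python
# Toy frame-time model: in a pipelined renderer each frame costs roughly
# max(cpu_ms, gpu_ms), so whichever unit is slower caps the FPS.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: a CPU needing 25 ms/frame caps you at 40 fps,
# and a faster GPU changes nothing until the CPU work shrinks.
print(fps(cpu_ms=25, gpu_ms=20))  # 40.0 - CPU-bound
print(fps(cpu_ms=25, gpu_ms=10))  # still 40.0 - faster GPU wasted
print(fps(cpu_ms=10, gpu_ms=10))  # 100.0 - CPU upgrade unlocks the GPU
```

This is why a GPU upgrade can show "almost no frame rate increase": the max() is dominated by the CPU term until the CPU itself gets faster.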
 

TheInternal

Senior member
Jul 7, 2006
447
0
76
So, how much of a real-world reduction in performance will this present? No one seems to even offer an estimate, just the generalized theory, which I know and understand.

Yes, I know a slower CPU will detract from graphics card performance, but by how much?

As for being trolled on: if someone told you you had a "terrible" component, wouldn't you take that as at least a bit troll-like? Some parts may be old, but even the original K6 isn't a "terrible" chip. It's just old.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
The problem is that you won't get an answer unless someone happens to have a chip that old to test with. The closest you'll probably get is the guy in the other thread with a faster X2 than yours. He saw 50% greater performance by swapping out his cpu to something more modern.

Yes, I know you know that a slower cpu detracts from graphics performance. The thing is, we're not trying to tell you that. We're trying to tell you that your cpu doesn't meet the minimum cpu requirements of the games you want to play at high settings.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
...and 2-3x with a good bang/buck Phenom II, on top of such a CPU having more cores than most games can use.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
So, how much of a real-world reduction in performance will this present? No one seems to even offer an estimate, just the generalized theory, which I know and understand.
With an Athlon 4400? It's bad to the point where upgrading the video card will see almost no frame rate increase in any game released in 2009 or 2010.

I can even give a real world example of this. A GeForce 7950GT and GeForce 8800GTX both get the same frame rate in World of Warcraft. Why? It's because the CPU in my computer was an E6600 from 2006. Your CPU is even slower than mine, so it would be bottlenecked way worse.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
You weren't trolled on. The guy was being honest with you.

You seem not to understand something: it isn't about your CPU holding back a modern GPU (though it will do a lot of that). It's about your CPU not being strong enough for the CPU part of the equation in the games you want to play at the res you specced.

What does resolution have to do with CPU? That is GPU-dependent. Unless there is some caveat I'm not aware of.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
With an Athlon 4400? It's bad to the point where upgrading the video card will see almost no frame rate increase in any game released in 2009 or 2010.

I can even give a real world example of this. A GeForce 7950GT and GeForce 8800GTX both get the same frame rate in World of Warcraft. Why? It's because the CPU in my computer was an E6600 from 2006. Your CPU is even slower than mine, so it would be bottlenecked way worse.
Yeah, if you want a good example of CPU limitation, I have two games for you: WoW and EQ2. WoW can utilize some multithreading; EQ2 will eat a core and a half, more or less, until about 1.9GHz, at which point the second core starts getting a higher % load. But even pushing it higher, like the 4.2 I got my CPU to, I still don't run into a GPU limit in EQ2. Never played WoW, but it's similar, though not quite as bad. There are other games as well if you look around. Most Source games, for example: the engine tends to be pretty CPU-limited, but the games still top out at several times playable FPS. Still, it's easy enough to test for, especially if you have an older CPU.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
cpu limited, but the games still top out at several times playable FPS. still, its easy enough to test for especially if you have an older CPU
We should have a sticky to explain how this test works.


Testing for CPU limitations is very easy. All you need to do is plot some data of frame rate vs resolution. Do not change any other setting. If the frame rate is strongly affected by resolution, then it's a video card bottleneck. If the frame rate is not affected by resolution, then it's a CPU bottleneck. In World of Warcraft, it's very typical to see something like this kind of a trend:
800x600 - 45 fps
1024x768 - 44 fps
1280x1024 - 44 fps
1440x900 - 42 fps
1680x1050 - 40 fps
1920x1080 - 35 fps

Here we can see that the frame rate doesn't really go down a lot even though the resolution is cranked way up. This is exactly what a CPU limited game looks like.

Video card bottlenecked games have a much stronger correlation between the number of pixels and how fast it draws them. That would be something like Gears of War on ultra high quality settings.
800x600 - 200 fps
1024x768 - 180 fps
1280x1024 - 150 fps
1440x900 - 120 fps
1680x1050 - 80 fps
1920x1080 - 60 fps

1920x1080 has more than 4x as many pixels as 800x600. In the above example, drawing over 4x as many pixels makes the game run more than 3x slower (200 fps down to 60). The video card is definitely holding this back.
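The eyeball test above can be sketched numerically: compare how much the pixel count grows against how much the frame rate falls. A rough sketch using the two fps tables from this post (the 0.5 threshold is an arbitrary illustration, not a standard figure):

```python
# Classify a bottleneck from (resolution, fps) samples, per the test above:
# GPU-bound games slow down roughly with pixel count; CPU-bound games stay flat.

def classify_bottleneck(samples):
    """samples: list of ((width, height), fps). Returns 'CPU' or 'GPU'."""
    low = min(samples, key=lambda s: s[0][0] * s[0][1])   # smallest resolution
    high = max(samples, key=lambda s: s[0][0] * s[0][1])  # largest resolution
    pixel_ratio = (high[0][0] * high[0][1]) / (low[0][0] * low[0][1])
    fps_ratio = low[1] / high[1]  # how much slower the game got at the high res
    # If fps fell by at least half as much as pixels grew, call it GPU-bound.
    return "GPU" if fps_ratio >= 0.5 * pixel_ratio else "CPU"

wow = [((800, 600), 45), ((1024, 768), 44), ((1280, 1024), 44),
       ((1440, 900), 42), ((1680, 1050), 40), ((1920, 1080), 35)]
gears = [((800, 600), 200), ((1024, 768), 180), ((1280, 1024), 150),
         ((1440, 900), 120), ((1680, 1050), 80), ((1920, 1080), 60)]

print(classify_bottleneck(wow))    # CPU: 4.3x the pixels, only ~1.3x slower
print(classify_bottleneck(gears))  # GPU: 4.3x the pixels, ~3.3x slower
```

Same conclusion as the tables: WoW barely notices the resolution, so the CPU is the limit; Gears slows down nearly in step with pixel count, so the GPU is.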
 

LoneNinja

Senior member
Jan 5, 2009
825
0
0
Some may not believe this, but my Athlon X2 5000 was a bottleneck for my Radeon 4670 playing L4D and a few other games of mine @ 1680x1050. It wasn't dramatic, but I did see a frame rate increase, especially in the minimum frame rate category, when I got a Phenom 9850.

Also, since I know the OP has mentioned GTA IV: I've played that on PC with a 4670 on mostly low settings @ 1280x720. With an Athlon X2 7750 @ 3.0GHz I pulled low 20s in frame rate and rarely saw 30fps, even with next to nothing happening on screen. My Phenom II 940 downclocked to 2.4GHz rarely dips below 30, and I frequently see around 45fps with those same video settings.

It's only a few games that I own, but the point is that even a 4670 can be limited by an Athlon X2 in games that are CPU-hungry for AI/physics/whatever else the CPU must do.
 

MJinZ

Diamond Member
Nov 4, 2009
8,192
0
0
Unless you are running like 5600x2560 Eyefinity or something, your CPU is what holds you back.

My GTX 480 runs whatever game at whatever FPS at 1920x1200 with all settings on max.

A quad core i7 would scale linearly by my estimate until way past 5 or 6 ghz.

So yes, if you are playing at insane resolutions, the video card becomes the bottleneck. That's largely irrelevant here, though, because your CPU will be too slow for those resolutions anyway: even if the GPU limits your FPS more than the CPU does, the frame rate is going to be too low regardless.

General rule: the CPU bottlenecks every single high-end card (5870, GTX 480, etc.) like crazy. That's my experience with a Core i7 at up to 4.2GHz running at various speeds, paired with various GPUs (GTX 260, 5870, 5870 CrossFire, GTX 480).
 

Scali

Banned
Dec 3, 2004
2,495
1
0
So, how much of a real-world reduction in performance will this present? No one seems to even offer an estimate, just the generalized theory, which I know and understand.

Yes, I know a slower CPU will detract from graphics card performance, but by how much?

It's very situation-specific; it can vary greatly from game to game (or between different settings in the same game).
For example, Doom 3 is extremely CPU-limited, because it does a lot of processing on the CPU.
3DMark03's "Battle of Proxycon" test is a very Doom3-like scene, but it offloads all shadow calculations to the GPU. This allows it to scale to much higher framerates with high-end GPUs coupled with low-end CPUs.

With Crysis, setting physics to Very High is going to make it extremely CPU-limited (but only when physics-related stuff is actually happening at the time, of course).
Set physics to Low, and the CPU load is much lighter, and you can reach pretty decent framerates even on a relatively slow CPU.

So the answer to "how much?" is: "it depends".

PS: CrossFire/SLI tend to increase the CPU load/bottleneck even further, as the driver now has to synchronize two videocards instead of one.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
Fellas, I think by this point, with this thread and the other one over at the Video Card forum, it's safe to say the OP is hell-bent on not believing anything that doesn't involve "getting a budget video card will allow you to play GTAIV at 1920 on high settings." Anything else we tell him is interpreted as trolling him.

OP: To make it plain and simple, whatever dollars you have for an upgrade will give you far more mileage if spent on new cpu/mobo/ram than on video card. I know that's not what you want to hear and I know you don't believe me. So here's what you should do: Find a local store with a good return policy and buy the best video card you can afford. Go home and plug it in. When you see how abysmal the resulting performance is, go return the card for your money back.
 

mhouck

Senior member
Dec 31, 2007
401
0
0
If this is trolling, it's the most objective and informative trolling I have ever seen!
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Some may not believe this, but my Athlon X2 5000 was a bottleneck for my Radeon 4670 playing L4D and a few other games of mine @ 1680x1050. It wasn't dramatic, but I did see a frame rate increase, especially in the minimum frame rate category, when I got a Phenom 9850.

Also, since I know the OP has mentioned GTA IV: I've played that on PC with a 4670 on mostly low settings @ 1280x720. With an Athlon X2 7750 @ 3.0GHz I pulled low 20s in frame rate and rarely saw 30fps, even with next to nothing happening on screen. My Phenom II 940 downclocked to 2.4GHz rarely dips below 30, and I frequently see around 45fps with those same video settings.

It's only a few games that I own, but the point is that even a 4670 can be limited by an Athlon X2 in games that are CPU-hungry for AI/physics/whatever else the CPU must do.

Concerning L4D, he is playing at a higher resolution and he only has a 7950GT SLI. The video card setup is much weaker than a single 9800GT. At 1080p I believe he would see a good boost in performance in L4D2 with one 9800GT. A second 9800GT probably isn't worth it for the games he's playing.

Also, I'm not sure what to make of your GTA4 performance. It's not what I see when I play GTA4 at the lowest settings. For me the game, on average, pulls low 30s. As more things happen it will go into the high 20s, and when there is a crap ton of stuff it will dip down into the low 20s. But when there's nothing happening, my Athlon X2 and HD3850 get above 30 fps.

Fellas, I think by this point, with this thread and the other one over at the Video Card forum, it's safe to say the OP is hell-bent on not believing anything that doesn't involve "getting a budget video card will allow you to play GTAIV at 1920 on high settings." Anything else we tell him is interpreted as trolling him.

I think he was more disturbed with the manner in which toyota approached him. Because let's face it: Toyota is abrasive and his word is the law. And that's exactly how he came off.

But you are right, he will not be using high settings in GTA4 with just a video card upgrade. He would only see a small boost. In L4D I believe he would see a nice boost.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
But you are right, he will not be using high settings in GTA4 with just a video card upgrade. He would only see a small boost. In L4D I believe he would see a nice boost.

Indeed. GTA4 is terribly coded. My old E6600 overclocked by 30% needed to have most things on low just to run it.

Even with my newer Phenom X6 1055, it pushes the needle. It will run 4 cores as hard as they'll go.
 

peonyu

Platinum Member
Mar 12, 2003
2,038
23
81
I would ditch the Athlon system and get at least a Core 2 Duo; you can overclock them to 4GHz pretty easily. Same deal with the video card: get a newer one. The 5850s and the Nvidia mid-range cards are both good value and will last you a while. But trying to stick with the old CPU you have now is pointless, and buying a video card from 2006 is just as pointless. You will be disappointed in the framerate gains from a card that old and will regret spending money on it (I was in a similar situation before and regretted it).

If you upgrade to at least a C2D and a decent video card, you can sell your old AMD hardware and your older video card. Believe it or not, those will sell for a decent amount of money, and that will go a ways toward making up the cost of the new hardware.
 

peonyu

Platinum Member
Mar 12, 2003
2,038
23
81
For your CPU in particular, though: if your CPU bottlenecks you, you're screwed on trying to get more FPS out of it, so just upgrade it. Having your video card be the bottleneck isn't as bad, since you can dial down AA/AF or just lower the resolution.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I would ditch the Athlon system and get at least a Core 2 Duo


pedo-bear-too-old.jpg
 

iufan4lifeul

Member
May 21, 2010
58
0
0
Just throwing it out there: I believe my CPU bottlenecks Crysis at 1080, ultra high, max AA. It runs roughly 30fps or so, so I figured I would overclock the video card and squeeze some more out... no matter how much I overclocked, I never saw even 1 fps better :(