Bottleneck Here, Bottleneck There, Bottleneck Bottleneck Everywhere

Pederv

Golden Member
May 13, 2000
1,903
0
0
So is bottleneck the new buzzword when building or upgrading a system?

I see so many people concerned that upgrading their video card will result in the CPU being the bottleneck. Who cares? If the game you play has gone from 20fps to 50fps because of your video card choice, isn't that what you're after? If the game looks better and is more enjoyable to play with a newer card, wasn't it worth it?

There is always going to be a bottleneck between the CPU and GPU. Even in a balanced system, the application will determine which one is the bottleneck.

I can understand being concerned about reaching the point of diminishing returns; that holds true for all components of a PC.

Maybe hardware sites need to graph video card performance over a range of CPU speeds, at a given resolution, to show the point of diminishing returns for the card.
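
Something like this is all I have in mind, a rough sketch in Python (the FPS numbers are made up just to show the shape of the curve, and I'm assuming matplotlib for the plot):

# One video card, one resolution, FPS measured across a range of CPU clock speeds.
# The numbers below are made up purely for illustration.
import matplotlib.pyplot as plt

cpu_ghz = [1.8, 2.2, 2.6, 3.0, 3.4, 3.8]
avg_fps = [34, 44, 52, 57, 59, 60]   # hypothetical results for one card at 1920x1200

plt.plot(cpu_ghz, avg_fps, marker="o")
plt.xlabel("CPU clock (GHz)")
plt.ylabel("Average FPS at 1920x1200")
plt.title("Where extra CPU speed stops helping this card")
plt.grid(True)
plt.savefig("diminishing_returns.png")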

 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
YAY!!! :thumbsup:

This topic has been irking me for some time now: people screaming "bottleneck!" like they just saw a ghost. People find a word and latch onto it. Next year it will be something else.

Who cares if the CPU is not capable of maxing out the video card? You buy the card to improve the gaming experience, not to see if the CPU can push it to its limits.

Thank you for the common-sense post. :cookie:
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: Flipped Gazelle
YAY!!! :thumbsup:

This topic has been irking me for some time now: people screaming "bottleneck!" like they just saw a ghost. People find a word and latch onto it. Next year it will be something else.

Who cares if the CPU is not capable of maxing out the video card? You buy the card to improve the gaming experience, not to see if the CPU can push it to its limits.

Thank you for the common-sense post. :cookie:

Ditto. Now if only we could get people to understand that most of the time when they're having issues, it's PEBKAC.

 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
The problem is that for many people, with CPUs like X2 3800s, stock 2140/2160s, etc., the difference in many games will be pretty much nil between a 9600GT and, say, a GTX 280, or a 4670 and a 4870 :) Particularly if they have larger screens.

I really agree with the last part of the post. It would be handy if video card reviews also included numbers from configurations slower than the top end, to show what the results would be on slower systems.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Thank you. About time someone pointed that out. I have an X2 6400+ paired with a factory-OC'd GTX 260. Yes, the GPU would run significantly faster on a top-drawer Intel CPU. But it's still faster than the 8800 GTX it replaced (which I suppose was bottlenecked a little too). All I know is I can play my games quite smoothly except in NPC-intensive areas. Oh well.

Like you said, you wouldn't recommend an HD 4870 X2 to someone with an Athlon X2 3800+, but a slightly overpowered GPU (which is hard to avoid these days relative to most CPU speeds) is fine.
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
I always thought it to be more like this:

Card A costs $100.
Card B costs $200.

Both will give the same fps due to the various other system components/bottlenecks.

Wouldn't you like to know that information instead of wasting an extra $100 to get a negligible difference?

 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Originally posted by: clandren
I always thought it to be more like this:

Card A costs $100.
Card B costs $200.

Both will give the same fps due to the various other system components/bottlenecks.

Wouldn't you like to know that information instead of wasting an extra $100 to get a negligible difference?

It's a fine line. I know my setup is bottlenecked a little, but I'm still getting 10 to 20 FPS more in "non-CPU-intensive" parts of games, which are most areas not crowded with NPCs (that's a generalization, but usually NPC A.I., dynamic shadows around them, etc... are what hit the CPU in the RPGs and MMOs I play most). Educate yourself and decide if the cost is worth it.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
The term CPU bottleneck came from pairing an old CPU with a new GPU and getting the same performance across many resolutions. I think people use the word way too often. I think the correct term would be CPU-limited, but then all components are limited by something.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Pederv
So is bottleneck the new buzzword when building or upgrading a system?

I see so many people concerned that upgrading their video card will result in the CPU being the bottleneck. Who cares? If the game you play has gone from 20fps to 50fps because of your video card choice, isn't that what you're after? If the game looks better and is more enjoyable to play with a newer card, wasn't it worth it?

There is always going to be a bottleneck between the CPU and GPU. Even in a balanced system, the application will determine which one is the bottleneck.

I can understand being concerned about reaching the point of diminishing returns; that holds true for all components of a PC.

Maybe hardware sites need to graph video card performance over a range of CPU speeds, at a given resolution, to show the point of diminishing returns for the card.

No, bottleneck is not a new buzzword; it's been used for as long as people have discussed computer hardware, and much longer before that when referring to logistics, manufacturing, war strategy, etc. Only recently have people started taking offense to the term, instead using tamer terms such as "system limited" or "CPU limited".

Regardless of semantics, CPU bottlenecking has become a greater issue as of late given how quickly GPUs have progressed. Not only are they following Moore's Law when it comes to single GPUs, they exceed it once you consider multi-GPU configurations. CPU progression, meanwhile, has been relatively stagnant: there hasn't been any major increase in clock speeds or IPC, and scaling and utilization across multiple processors is much worse than with GPUs.

The fact is, older CPUs are simply not enough to drive newer GPUs, to the point where you may not notice any benefit from a newer GPU other than "free AA", and may very well benefit more in terms of pure FPS by upgrading to a new CPU. You'll see this holds true even at resolutions that are historically GPU-bottlenecked. On the very high end, you'll see the same with multi-GPU configurations, where older CPUs show poor scaling but newer and faster CPUs scale as expected.

Here are some recent reviews that clearly show CPU bottlenecking with a single fast GPU:

GTA4 - 13 CPU round-up

COD4 + GRiD - Intel CPU Clock for Clock Comparison @ 2GHz

COD5 - 12 Intel and AMD CPUs

Far Cry 2 - various speeds

Left 4 Dead - various speeds

There's nothing wrong with upgrading the GPU first, but if you see very little difference in performance, or you're not seeing as much gain as you expected, now you know why. I fully expect the CPU to continue to be a significant bottleneck until games get much more GPU-intensive, make better use of multiple cores and HT, or start offloading physics calculations to the GPU.
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
"Bottleneck" is indeed a new buzzword, because instead on being confined (in relation to PC's) to the builder/enthusiast community, it has entered the modern lexicon when discussing computer performance. It's amazing how often I've seen the term "bottleneck" in posts by noobs over this last year.

As far as the Card A/Card B - $100/$200 argument, often the $200 card will exhibit a notable performance boost, even on a low-endish CPU, such as an X2 3800+.

For example, earlier this year my brother had an X2 4200 @ 2.7 GHz. He noticed a substantial improvement in gaming performance when he went from one 8800 GT to two 8800 GTs. He games at 1920 res.
 

AVP

Senior member
Jan 19, 2005
885
0
76
Maybe I'm missing something, but all I see at a quick glance from those various benchmarks is that a faster processor indeed greatly enhances your fps. What I don't see is the difference between various video cards on 2.0-2.5 GHz CPUs. While some may talk of a card's "potential" being what it will score with a 4 GHz quad, in reality people aren't being shorted because they can't reach those numbers with a lesser processor, to a degree. Where that drop-off happens is not demonstrated by those charts.

Sure, you may get 30fps with a 2.0 GHz Pentium and a GTX 280, but with a 9600 GT how much are you getting? 15? 28? Big difference... and I have not seen the answer yet.
 

Phew

Senior member
May 19, 2004
477
0
0
I wish that Legion Hardware test went up to something like 4.4 GHz for the Core 2 Duos, since that is a more typical air overclock of an E0 E8x00. 3.6 GHz may be about the typical quad air overclock, but you can get that on an E8x00 without even increasing the voltage from stock.

I have an anecdote on the 'bottleneck' issue to relate, though. My friend had an older Gateway desktop with a 3.2 GHz Pentium 4 and an equivalent graphics card from that era (a 6600 or so, I don't remember). We both started playing Warhammer Online, but he was having awful slowdowns in scenarios, despite only playing at 1280x1024. He decided to upgrade his graphics card to a Radeon 4670, but it really had no impact on his minimum fps in scenarios. So he spent $80 on a new video card that didn't make his game run any faster, because Warhammer is totally CPU-bound in large battles.

So 'bottlenecking' isn't some meaningless buzzword; not understanding it can cause people to waste money on upgrades that don't help their fps.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: AVP
Maybe I'm missing something, but all I see at a quick glance from those various benchmarks is that a faster processor indeed greatly enhances your fps. What I don't see is the difference between various video cards on 2.0-2.5 GHz CPUs. While some may talk of a card's "potential" being what it will score with a 4 GHz quad, in reality people aren't being shorted because they can't reach those numbers with a lesser processor, to a degree. Where that drop-off happens is not demonstrated by those charts.

Sure, you may get 30fps with a 2.0 GHz Pentium and a GTX 280, but with a 9600 GT how much are you getting? 15? 28? Big difference... and I have not seen the answer yet.

Digit-Life CPU/GPU Benchmark Tool

That's one of the best comparison tools you'll find, and it does exactly what you ask. It shows how CPU speed impacts performance relative to various GPU configurations. Unfortunately, it's limited to slower CPUs and low- to mid-range GPU solutions, but it very clearly shows the impact of CPU bottlenecking in current games. And you don't need to be a "noob" to understand it! :)

But you can see it clearly in a few of the following comparisons:

PC1: 4800+/HD3870CF = 100%
PC2: 6000+/3870 single = 102%
Slower CPU with faster GPUs trades blows with faster CPU + single GPU.

PC1: 4800+/HD3870 = 100%
PC2: 6000+/HD3870 = 111%

PC1: 4800+/HD3870CF = 100%
PC2: 6000+/HD3870CF = 121%
Both results scale once you increase clockspeeds.

PC1: 4800+/HD4870 = 100%
PC2: 6000+/4850 = 114%
The "slower" GPU is actually outperforming the faster GPU when paired with the faster CPU.

It's really an excellent comparison tool and something I wish other sites employed. It shows very clearly that CPU bottlenecks are significant with current games and hardware. Cuular's link also shows how CPU bottlenecks impact the high end.
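
For anyone wondering where those percentages come from, they're just one system's results expressed relative to the other. A rough sketch of the idea in Python, with made-up FPS figures (whether Digit-Life averages raw FPS or per-game ratios, I'm not certain):

# Normalizing two systems' results so PC1 = 100%, as in the comparisons above.
# The FPS figures are made up for illustration; they are not Digit-Life's data.
pc1_fps = {"Crysis": 31, "UT3": 78, "World in Conflict": 42}   # e.g. 4800+/HD3870
pc2_fps = {"Crysis": 35, "UT3": 95, "World in Conflict": 50}   # e.g. 6000+/HD3870

# Average the per-game ratios so no single high-FPS title dominates the index.
ratios = [pc2_fps[game] / pc1_fps[game] for game in pc1_fps]
index = 100 * sum(ratios) / len(ratios)

print(f"PC1 = 100%, PC2 = {index:.0f}%")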



 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Phew
because Warhammer is totally CPU-bound in large battles.

So 'bottlenecking' isn't some meaningless buzzword; not understanding it can cause people to waste money on upgrades that don't help their fps.
Yep, exactly. WAR is insanely CPU-intensive for sure. I'm hoping they someday implement GPU PhysX to help relieve the load, as it already uses GameBryo.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: Flipped Gazelle
"Bottleneck" is indeed a new buzzword, because instead on being confined (in relation to PC's) to the builder/enthusiast community, it has entered the modern lexicon when discussing computer performance. It's amazing how often I've seen the term "bottleneck" in posts by noobs over this last year.

As far as the Card A/Card B - $100/$200 argument, often the $200 card will exhibit a notable performance boost, even on a low-endish CPU, such as an X2 3800+.

For example, earlier this year my brother had an X2 4200 @ 2.7 Ghz. He noticed a substantial improvement in gaming performance when he went from 1 8800gt to 2 8800gt's. He games at 1920 res.

I don't see what the problem with that is. The concept/meaning of the word is simple enough that even non-"builders/enthusiasts" can understand it, and I'm pretty sure even a "noob" can imagine that a GTX 280 will be bottlenecked to death when coupled with an Athlon 64.

And yeah, sure, you buy a new card to make your games run faster and all that, and even with a slower CPU you do see a substantial performance increase in most cases, but hey, it won't hurt to get 60fps instead of 40, right? It's like having bought a GTX 280 for the price of a GTX 260; wouldn't that be great? We're happy when prices drop by even 5%, aren't we?

Unless you have a top-tier CPU, you will be severely bottlenecked with a fast GPU. People need to stop thinking that only the GPU matters in games, even more so with future titles.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: qbfx


Unless you have a top-tier CPU, you will be severely bottlenecked with a fast GPU. People need to stop thinking that only the GPU matters in games, even more so with future titles.

The GPU matters much more. The majority of games, if not all of them, will benefit much more from a GPU upgrade than from a CPU upgrade.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Azn
Originally posted by: qbfx


Unless you have a top-tier CPU, you will be severely bottlenecked with a fast GPU. People need to stop thinking that only the GPU matters in games, even more so with future titles.

The GPU matters much more. The majority of games, if not all of them, will benefit much more from a GPU upgrade than from a CPU upgrade.

It depends on the engine as well. In a game like Crysis, for example, going from a 3.2 GHz C2D and an 8800 GT to a 4 GHz Q9550 and a GTX 280 at 1920x1200 takes you from "unplayable" to "not fun", even with a massive increase in both CPU and GPU performance. The engine isn't well optimized at all, so nothing will help for the time being. It also depends on your settings. The Very High or Enthusiast settings are playable but not really great.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: Azn
Originally posted by: qbfx


Unless you have a top-tier CPU, you will be severely bottlenecked with a fast GPU. People need to stop thinking that only the GPU matters in games, even more so with future titles.

The GPU matters much more. The majority of games, if not all of them, will benefit much more from a GPU upgrade than from a CPU upgrade.

I totally agree with you and never claimed the CPU matters more than the GPU.

I just mean that, IMHO, the term bottleneck can be annoying but then again has its place in this forum, and, as you can see in the links chizow provided, a slow CPU can cripple your card quite a bit.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Originally posted by: Pederv
So is bottleneck the new buzzword when building or upgrading a system?


How is it a new buzzword? People have been talking about it since I joined, and probably since you joined. You're just paying more attention now.

Anyone can make either CPU or GPU "a bottleneck"... you want to make CPU a bottleneck? Play your games at 800x600 with no AA or AF.

You want to make video card your bottleneck? Play at 2560x with 8xAA and 16xAF.

In reality, people are showing acceptable results with old S939 dual cores and GTX cards. Even if your CPU is a "bottleneck" you can get better graphics quality options with a killer video card, so I've always been one to pretty much disregard these CPU "bottleneck" claims.

You have to get to the point where you have a pretty old CPU plus a pretty new graphics card before you're actually "bottlenecked". It was the same back when we were using 9700 Pros and Athlon XPs, and I bet it will be the same three years from now, when we're measuring processor performance primarily by whether you have 8, 12, or 16 cores (or however many they're up to by then).

Designers always have more scaling capability on the graphics end than the CPU end, because in actuality the CPU end is really not all that different from the top of the line (E8600) to the bottom (AMD 4850e). Maybe like a 100% difference. Compare that with the difference between an HD 4670 and a 4870 X2. Memory bandwidth is close to 10x, and shaders * core speed is like 5x, so you see something like a 6-8x difference from top to bottom on video cards, where the CPU only sees a ~2x difference (until games start taking advantage of quad cores, anyway).
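
To put those rough multipliers side by side (ballpark figures from the paragraph above, not measured specs):

# Back-of-the-envelope spread from top to bottom of each product stack.
# All multipliers are rough estimates from the paragraph above, not measured specs.
gpu_bandwidth_ratio = 10.0          # HD 4870 X2 vs HD 4670, memory bandwidth (rough)
gpu_shader_throughput_ratio = 5.0   # shaders * core clock (rough)
cpu_ratio = 2.0                     # E8600 vs 4850e, rough gaming difference

# A given game lands somewhere between the shader-limited and bandwidth-limited
# extremes, hence the "something like 6-8x" figure above.
print(f"GPU top-to-bottom: ~{gpu_shader_throughput_ratio:.0f}x to ~{gpu_bandwidth_ratio:.0f}x")
print(f"CPU top-to-bottom: ~{cpu_ratio:.0f}x")
print(f"GPU spread is roughly {gpu_shader_throughput_ratio / cpu_ratio:.1f}x to "
      f"{gpu_bandwidth_ratio / cpu_ratio:.1f}x wider than the CPU spread")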
 

Phew

Senior member
May 19, 2004
477
0
0
Originally posted by: chizow
WAR is insanely CPU-intensive for sure. I'm hoping they someday implement GPU PhysX to help relieve the load, as it already uses GameBryo.

Yesterday I saw 7 FPS in a scenario with only 20 people, at a time when a bunch of particle effects were flying around. I would have needed like a 10 GHz quad core to keep the framerate playable in that situation. It's not like the effects look amazing in that game either.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Any reviews of Warhammer comparing Core i7 in the CPU benchmarks? i7 probably works great for Warhammer, like it does in GTA4.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Phew
Originally posted by: chizow
WAR is insanely CPU-intensive for sure. I'm hoping they someday implement GPU PhysX to help relieve the load, as it already uses GameBryo.

Yesterday I saw 7 FPS in a scenario with only 20 people, at a time when a bunch of particle effects were flying around. I would have needed like a 10 GHz quad core to keep the framerate playable in that situation. It's not like the effects look amazing in that game either.

Wow, that is horribly unacceptable for an MMO on a rig with your specs.

E0 E8400@4.34Ghz (1.34v) under FZ120 | ASUS P5Q | 8GB G.Skill DDR2-1066 | VisionTek HD4870@840/1100 | WD Velociraptor 300GB | ThermalTake 700W | Lian-Li PC-68 | Vista x64 | Planar PX2611W

Your cpu is clocked way above anything available in retail, 8GB ensures that you have plenty of RAM, and your video card is no slouch either.

Back to the topic at hand: IMO, it just takes common sense... If you have a high-end or multi-GPU setup, you probably should pair it with a higher-end or overclocked CPU. For example, in this link provided earlier http://www.pcgameshardware.com..._CPUs_reviewed/?page=4

These guys pair a $350 GTX 280 video card with a stock, less-than-$65 Celeron E1400 Allendale CPU. That would be a pretty dumb matchup for someone to make.

I would think that generally DIYers aren't going to have too much difficulty with this because budget will dictate the build, and a smart DIYer will try to build a balanced rig.

I could see bottlenecking being a more serious issue for someone who buys a Dell, HP, etc., doesn't really know how to upgrade, and expects miracles from adding a new high-end video card to a 2-3 year old PC. In that situation, surely some significant bottlenecking would occur that would diminish the overall value of the upgrade considerably.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
CPUs are very important, but all too frequently they are made out to be more than they are. Keep in mind that a CPU can ONLY raise your frame rate, nothing else... Extra GPU power doesn't go to waste, because you can always turn up the eye candy, enable AA, AF, and TSAA, etc... So having a CPU bottleneck still results in a better experience than a GPU bottleneck, IMO.

I'd take a Q6600 at STOCK with a GTX 280 over a Core i7 @ 4.0 GHz with an 8800 GTX. Well, actually I wouldn't, but not because of the performance; it's because popping in a new video card is easier than upgrading a platform. No, seriously, I would take the first system over the second, provided my only goal was to play games NOW, not plan for the future.
 

Sk8rdd00

Member
Jan 19, 2003
28
0
0
I'm glad the OP brought this up, as this is something that's been irking me for a while and I never see anyone discussing it.

The cold hard fact is CPUs have become less and less relevant (for gaming) with every generation, especially since the MHz/GHz race is more or less over. Multiple cores only help as much as a game can take advantage of them. And the term bottlenecking is also thrown around way too liberally, I think. "CPU bottlenecking" refers to maintaining the same framerate regardless of graphical settings or resolution.
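
Going by that definition, a quick sanity check you can run on your own numbers (a rough sketch; the FPS values are made up):

# Per the definition above: if the framerate barely moves when you change
# resolution, the GPU isn't the limit; the CPU (or something else) is.
# The FPS values are made up for illustration.
fps_by_resolution = {
    "1280x1024": 48,
    "1680x1050": 47,
    "1920x1200": 45,
}

lowest = min(fps_by_resolution.values())
highest = max(fps_by_resolution.values())

if (highest - lowest) / highest < 0.10:   # under ~10% swing across resolutions
    print("Framerate is nearly flat across resolutions -> likely CPU-limited")
else:
    print("Framerate drops with resolution -> likely GPU-limited")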

As far as games themselves go, some are very CPU-demanding, but the vast majority simply aren't. The only game I can think of off the top of my head where you really need a fast quad or dual to enjoy it is Supreme Commander. Crysis is very, very GPU-dependent and hardly uses the extra two cores on a quad, while using about half of the second core.

Warhammer Online was mentioned above, and I can tell you firsthand there are many performance-crippling bugs in the game that affect a broad variety of systems, ranging from single cores to quad cores. All you have to do is check the warhammer-alliance forums; there are dozens if not hundreds of threads with thousands of posts on the issue, and I ended up just quitting the game due to it not being fixed...