Once and for all, CPU is NOT A BOTTLENECK!!!!

Page 2 (AnandTech Forums)

avi85

Senior member
Apr 24, 2006
988
0
0
Originally posted by: akshayt
Not so for Carbon: it gives 54FPS on a 1900XTX with a C2D E6400, while a single-core A64 @ 2.4GHz gives just 40FPS. Both are playable, but the C2D will be smoother and faster. This is at 12x10 with 4x AA; without AA the AMD still gives 40FPS while the C2D gives 65.

That's cause your xtx sucks...:laugh:
 

sbuckler

Senior member
Aug 11, 2004
224
0
0
I play UT2004, my favourite game of all time, and it has been CPU-bottlenecked since the Nvidia 6800s came out.

I used to have a P4 @ 3.5GHz; on a large 32-player server in a big battle it could drop into the mid 20s for FPS. Nothing to do with my graphics card (6800GT): I could set the settings to whatever I liked and I'd still get the same FPS. Even a top-end Athlon FX would drop under 40 FPS at the same point. UT is a game where FPS matters, you really want at least 60.

I now have an E6600 @ 3.2GHz, and that is the first processor that can play the game at close to 60 FPS at all times.

So there are popular games that are CPU- and not GPU-bottlenecked.
 

Scarpozzi

Lifer
Jun 13, 2000
26,391
1,780
126
Dell is coming out with an Intel-based dual-processor, quad-core server. If you can find a form-factor with room for a PCI-Express slot, it would make one hella good gaming box. :D
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Alraito! I agree with that one guy.. after looking back up.. anthrax.

There's a curve relating performance to processor "power", power being a relative notion of one processor being faster than another. At a certain point (i.e., with a fast enough processor), the game's performance curve will begin to wane and flatline, simply because the game can no longer saturate the processor.

Now, let's stop with this perfect-world hogwash. I run a metric ass-ton of extra programs in the background... around 60 processes at once on my dual-core machine. That alone means switching from my older Athlon 64 2800+ or 3200+ to my Athlon 64 X2 4400+ or Core 2 Duo E6600 shows a huge performance increase, since the processor can handle more than one task at once (and cut down on context switches, which is the term I was blanking on). I buy a faster processor so I can do more than game at once. PCs are multi-tasking machines, and we've grown used to using them that way.

If someone asks whether their single-core processor (a relatively newer one, say a K8) would be a bottleneck compared to a newer dual-core, I'm gonna tell them yes. The ability to off-load mandatory-yet-mundane background tasks to another core is far more beneficial than not, and can easily show up as improvement in games.
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
People like to have their parts in the same 'end' or 'league.' I know it'd bother me if I had a 7900GTX and only an Athlon 64 3200+.
Higher-end CPUs do give better performance, plain and simple. And some engines, like the Source engine, are CPU-bound.

(rant will continue after I get home from class)
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
I think we need to revise the statement.

"Any MODERN (by modern I mean Celeron 326 and greater) CPU is not the bottleneck as long as the resolution is high enough, and IQ settings and AA/AF are turned up enough, on a modern graphics card."

How's that one?

And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.
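The resolution caveat in that statement is really a frame-time argument: each frame costs roughly max(CPU time, GPU time), GPU time grows with pixel count, and CPU time stays flat. A toy sketch of the idea, with all timing numbers invented for illustration:

```python
# Toy frame-time model: each frame takes max(CPU time, GPU time).
# CPU time per frame is roughly resolution-independent; GPU time
# scales with pixel count. All numbers here are made up.

CPU_MS = 8.0             # hypothetical CPU cost per frame, in ms
GPU_MS_PER_MPIXEL = 9.0  # hypothetical GPU cost per megapixel

def fps(width, height):
    mpixels = width * height / 1e6
    gpu_ms = GPU_MS_PER_MPIXEL * mpixels
    frame_ms = max(CPU_MS, gpu_ms)  # the slower component sets the pace
    limiter = "CPU" if CPU_MS >= gpu_ms else "GPU"
    return round(1000.0 / frame_ms, 1), limiter

for res in [(800, 600), (1280, 1024), (1600, 1200)]:
    print(res, fps(*res))  # limiter flips from CPU to GPU as res climbs
```

Under this (made-up) model, the CPU caps the framerate at low resolution and the GPU takes over as the limiter once the pixel count gets big enough, which is exactly the "high enough resolution" clause above.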
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: wizboy11
And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.

Then your definition of a bottleneck is off. It doesn't matter if you don't think it matters, a bottleneck is defined as what constricts the flow (hence the term bottleneck, as a neck of a bottle is smaller than the bottle and constricts flow) of data. If going from an Athlon 64 2800+ to an Athlon 64 3500+ shows a 15FPS improvement, then the game is somewhat CPU bound (increasing graphics may also improve the framerate at this point as well).
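By this definition you can spot the bottleneck empirically: swap one component, hold everything else fixed, and see whether the framerate moves. A rough sketch of that test, with all FPS figures hypothetical:

```python
# Classify the limiting component from before/after FPS measurements:
# whichever swap moves the framerate is (part of) the bottleneck.
# All numbers here are hypothetical.

def bottleneck(fps_base, fps_faster_cpu, fps_faster_gpu, threshold=0.05):
    """Classify by the relative FPS gain from upgrading each part."""
    cpu_gain = (fps_faster_cpu - fps_base) / fps_base
    gpu_gain = (fps_faster_gpu - fps_base) / fps_base
    if cpu_gain > threshold and cpu_gain >= gpu_gain:
        return "CPU-bound"
    if gpu_gain > threshold:
        return "GPU-bound"
    return "neither (something else is limiting)"

# The 2800+ -> 3500+ example: a big gain from the CPU swap,
# almost nothing from a GPU swap.
print(bottleneck(40, 55, 42))
```

If the CPU swap gains 15FPS and the GPU swap gains almost nothing, the game is CPU-bound at those settings, which is precisely the scenario described above.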
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Aikouka
Originally posted by: wizboy11
And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.

Then your definition of a bottleneck is off. It doesn't matter if you don't think it matters, a bottleneck is defined as what constricts the flow (hence the term bottleneck, as a neck of a bottle is smaller than the bottle and constricts flow) of data. If going from an Athlon 64 2800+ to an Athlon 64 3500+ shows a 15FPS improvement, then the game is somewhat CPU bound (increasing graphics may also improve the framerate at this point as well).

Well then, everything in my computer is bottlenecking everything else?
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: wizboy11
Originally posted by: Aikouka
Originally posted by: wizboy11
And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.

Then your definition of a bottleneck is off. It doesn't matter if you don't think it matters, a bottleneck is defined as what constricts the flow (hence the term bottleneck, as a neck of a bottle is smaller than the bottle and constricts flow) of data. If going from an Athlon 64 2800+ to an Athlon 64 3500+ shows a 15FPS improvement, then the game is somewhat CPU bound (increasing graphics may also improve the framerate at this point as well).

Well then, everything in my computer is bottlenecking everything else?

Yup, your keyboard is bottlenecking it all.. ultimately. Esp. if you're on PS/2!! OMGZORZ!
;)


As far as this thread: Games are largely GPU bound at the high resolutions that people play at. /end
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
You know, I agree with the general wisdom of the OP's premise (GPU>CPU) in most applications. We might note AT's own review of CPU performance in Oblivion, however, specifically with regard to CPU/GPU scaling tests done in town:

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747&p=4

Outdoors, where the GPU demands are higher, only crossfire x1900xt's required the use of anything more than the lowest-end Athlon 64 (remember the time frame in which the article came out, pre-Conroe). But in towns, with the much higher concentration of NPCs, AT found that even an X1800xl paired with a 2.6Ghz Athlon64 would soundly beat crossfire x1900xt's paired with a 1.8Ghz model.

BUT losing ~9fps (from the low 40s to the low 30s) in the 2.6GHz/x1800xl vs. 1.8GHz/x1900xt "town" comparison isn't nearly as important, of course, as losing ~12fps (from the upper 20s to the upper teens) in the "outdoor" benchmark.

Thus the GPU>CPU wisdom generally holds true in even CPU-intensive titles.

Still, I would *not* want to play Oblivion on a Celeron processor. Sure, you can get semi-acceptable framerates if you pair it with ultra-high-end graphics cards, but even then it will be seriously sluggish (19.9 fps in the Oblivion Gate bench, and 21.5 in the town bench, with crossfire x1900xts). Considering just how much you can stretch the performance of even a single x1900 with a better processor, a Celeron owner would have been better served ditching the second x1900 and upgrading the CPU to at least an Athlon 64 3000+. That setup would have beaten x1900 crossfire by 10fps in every benchmark, and would probably have been cheaper, even taking a new motherboard and RAM into consideration.

Situation, situation, situation. There is no single, magic bullet for upgrading your GPU/CPU. Generally, your GPU is likely to be more of a bottleneck in most games. If you can find the information, however, it just may be that your favorite game needs a slightly better balance of CPU & GPU.
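That upgrade math can be sketched as simple FPS-per-dollar arithmetic; the prices and gains below are hypothetical placeholders, not real benchmark data:

```python
# Rough upgrade-value comparison: FPS gained per dollar spent.
# All figures are hypothetical placeholders, not real benchmarks.

upgrades = {
    "CPU: Celeron to Athlon 64 3000+": (150, 10),   # (cost in $, FPS gained)
    "GPU: second x1900 for CrossFire": (400, 3),
}

for name, (cost, fps_gain) in upgrades.items():
    # Normalize to FPS per $100 so the two options are comparable.
    print(f"{name}: {fps_gain / cost * 100:.1f} FPS per $100")
```

With numbers shaped like the Oblivion example above, the CPU upgrade wins by a wide margin per dollar, but the whole point is that the right answer flips depending on the game and settings.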
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: wizboy11
Well then, everything in my computer is bottlenecking everything else?

Yes, I'm sure those USB headers on the bottom of your motherboard are causing that lag that makes you die in CS. Tell that to someone next time you die.

Don't pull out illogical crap. Jeez, what kind of argumentative skills are you youngens learnin' these days? Oh that's right, Bill Gates said public schools suck, you're excused ^_^.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: wizboy11

Well then, everything in my computer is bottlenecking everything else?

Correct! The biggest/slowest bottleneck in a computer system is the user (especially a dumb one).
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: wizboy11
I think we need to revise the statement.

"Any MODERN (by modern I mean Celeron 326 and greater) cpu is not the bottle neck as long as the resolution is high enough, and IQ settings and AA/AF are turned up enough on a modern graphics card."

Hows that one?

And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.
Please look at what you just said, and ponder which games you think a piddly little Celeron would give you 100fps in at whatever 'high enough' resolution and cranked settings you are alluding to.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: TheSnowman
Originally posted by: wizboy11
I think we need to revise the statement.

"Any MODERN (by modern I mean Celeron 326 and greater) cpu is not the bottle neck as long as the resolution is high enough, and IQ settings and AA/AF are turned up enough on a modern graphics card."

Hows that one?

And when I'm talking about bottleneck, I'm not talking about going from 100fps to 150fps with a faster CPU. That doesn't really matter.
Please look at what you just said, and ponder which games you think a piddly little Celeron would give you 100fps in at whatever 'high enough' resolution and cranked settings you are alluding to.

Quake II :p
 

niggles

Senior member
Jan 10, 2002
797
0
0
I upgraded my 3500+ to a 4400+, both with my BFG 7800 GT OC, and saw a pretty pronounced improvement in BF2. I get that the CPU is not a bottleneck to the GPU in most cases, but I have to say that it most certainly improved my performance.
 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: niggles
I upgraded my 3500+ to a 4400+, both with my BFG 7800 GT OC, and saw a pretty pronounced improvement in BF2. I get that the CPU is not a bottleneck to the GPU in most cases, but I have to say that it most certainly improved my performance.
BF2 is a RAM hog. You could go back to your 3500+ with 2 gigs of RAM and surpass your current performance... at least when loading levels. This is the reason I can't stand this game... gotta wait a couple minutes for every lame level to load. :roll:

And to the guy who thought he needed an E6600 to play UT2004: your videocard is still punking you, because an X850xt with an E6400 will give you a couple hundred frames per second. I don't know where your bottleneck is, but I'm guessing it's that videocard. My recent 7900GS has UT2004 up to something like 300 frames per second, or something ridiculous like that. Might as well be talking about Q3 fps! :laugh:

 

sbuckler

Senior member
Aug 11, 2004
224
0
0
Originally posted by: Beachboy

And to the guy who thought he needed an E6600 to play UT2004: your videocard is still punking you, because an X850xt with an E6400 will give you a couple hundred frames per second. I don't know where your bottleneck is, but I'm guessing it's that videocard. My recent 7900GS has UT2004 up to something like 300 frames per second, or something ridiculous like that. Might as well be talking about Q3 fps! :laugh:

You obviously just read the silly little tests review sites use for UT2004 (16 bots, deathmatch, average fps, whatever). Try playing the game for real on a 32-player ONS server (e.g. the Titan 32-player one) and watch your frame rates dive. Min fps is as I said in my previous post (and min fps happens in the middle of a firefight, when you need those fps the most). I am an experienced UT2004 player, I know lots of other experienced UT2004 players, I didn't make this up: it simply is a fact that the game gets CPU-bound on big servers if you have a modern (6600GT or better) graphics card.
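That scaling can be sketched with a toy model: per-frame CPU work (simulation, netcode) grows with player count, while GPU work at fixed settings does not. All numbers here are invented for illustration:

```python
# Toy model of why big servers tank minimum FPS: per-frame CPU work
# grows with player count, while GPU work at fixed settings stays
# constant. Every figure below is hypothetical.

BASE_CPU_MS = 4.0        # hypothetical fixed CPU cost per frame, in ms
CPU_MS_PER_PLAYER = 0.4  # hypothetical extra CPU cost per player
GPU_MS = 6.0             # hypothetical GPU cost at chosen settings

def min_fps(players):
    cpu_ms = BASE_CPU_MS + CPU_MS_PER_PLAYER * players
    # The slower of CPU and GPU sets the frame time.
    return round(1000.0 / max(cpu_ms, GPU_MS), 1)

print(min_fps(16))  # small deathmatch-style load
print(min_fps(32))  # big ONS firefight: CPU sets the floor
```

The GPU term never moves, so cranking graphics settings down does nothing once the CPU term dominates, which matches the "same fps at any settings" observation earlier in the thread.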
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: avi85
Originally posted by: akshayt
Not so for Carbon: it gives 54FPS on a 1900XTX with a C2D E6400, while a single-core A64 @ 2.4GHz gives just 40FPS. Both are playable, but the C2D will be smoother and faster. This is at 12x10 with 4x AA; without AA the AMD still gives 40FPS while the C2D gives 65.

That's cause your xtx sucks...:laugh:

LOL