CPU effects on games, I disagree.

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: xFlankerx
Very nice post SlitheryDee, I agree of course.

Did you actually have a point to this thread?

My point is, basically, that "You don't need a fast CPU to get the most out of your GPU." If anything, it should be the other way around, since the GPU is what can't keep up with the CPU. I think that the conclusion should be more like, "You need a fast GPU to handle the graphics of the game, but you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas."

What do you actually mean? As far as I can see you totally contradict yourself :p

First you say that you don't need a fast CPU to get the most out of your GPU, then you say that you will suffer a drop in FPS if the CPU can't keep up :laugh:

By that very statement, you are admitting that CPU limitations exist, and that you WON'T BE GETTING THE MOST OUT OF YOUR GPU if you are CPU limited ;)

Of course for most people this won't be happening unless they have something slower than an A64 3000+ and some serious gfx hardware, but that is obvious, and not what you say in the OP :p
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: xFlankerx
Thank you Bobthelost, you've basically been saying what I thought.

This paragraph completely contradicts itself.

You say, "you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas". Please explain exactly how you square this with "you don't need a fast CPU to get the most out of your GPU"? If the CPU is limiting the possible FPS, then you're not getting the most out of your GPU, are you?

I'm saying that the increase in FPS that you see in the Town benchmark IS NOT FROM THE X1900XT STRETCHING ITS LEGS. It's from the CPU. Thus, you are getting no more out of your GPU than you would otherwise, but your CPU is handling things better and is providing better performance.

You seem confused. You're describing a CPU limitation, using words which make it very clear that you're talking about a CPU limitation, but you just refuse to call it a CPU limitation. Is this some kind of Zen thing?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: xFlankerx
Originally posted by: munky
Look at it from this point: except for the oblivion gate benchmark, the x1900 CF was getting the same fps as a single x1900xtx in the town and dungeon benches. That's a severe cpu limitation already. And even in oblivion gate the single x1900xtx got a slight increase with a faster cpu. What that means basically is that for any game that's less gpu-intensive than the oblivion gate scene (which goes for about 99% of current games), the 1.8ghz A64 will be a limitation to a single x1900xtx. And a mediocre P4 will be even more of a limitation.

The X1800XL was getting the same FPS as the X1900XT CF setup, lol, even when paired with an FX-60. Are you going to tell me that an FX-60 is not enough to take on these videocards? "A slight increase" isn't worthy of note, and is well within the margin of error.

And I've already told you that even though the Town benchmark wasn't as demanding as the Oblivion Gate benchmark on the GPU, it was the most pressure ANY GAME ON THE MARKET could have put on those processors. Your statement should be restated to say, "What that means basically is that for any game that's less CPU-intensive than the Oblivion Town scene (which goes for about 99% of current games), the 1.8GHz A64 will be just fine with an X1900XTX." And a 3.4GHz P4, while not quite up to par with its AMD counterparts, is hardly a "mediocre" CPU.

The x1800xl was getting the same fps as the x1900 CF with a 1.8ghz cpu in the town benchmark. That means both are cpu limited at this point. With an fx60, the x1800xl is no longer getting the same fps as the x1900's, which means that the cpu is no longer the limiting factor, and the video card can't keep up with its bigger siblings.

Looking at the town benchmark for the x1900's shows a linear increase with cpu speed, with no signs of slowing down. It's not a question of whether the fx60 is "enough", because that's a subjective matter. But there's no doubt that if they had results for a faster cpu than the fx60, the x1900's would get even higher fps in the town benchmark.

In the oblivion gate benchmark, the same situation goes for the x1900CF. It scales linearly with cpu clocks, so it's at least partially still limited by the cpu, even an fx60.
 

SniperWulf

Golden Member
Dec 11, 1999
1,563
6
81
Personally, I feel that both are right. It really just depends on the game in question. So the OP's statement is neither right nor wrong, it just clearly doesn't apply to all situations. To demonstrate what I mean, we need more benchmarks of games using the same methodology.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: SniperWulf
Personally, I feel that both are right. It really just depends on the game in question. So the OP's statement is neither right nor wrong, it just clearly doesn't apply to all situations. To demonstrate what I mean, we need more benchmarks of games using the same methodology.

Except that a) the point he is trying to make is dead wrong, and b) he contradicts his own point when he tries to explain it.

People can be wrong, you know. There's no desperate need to try to see both sides of a discussion when one of the sides is factually incorrect and logically inconsistent ;).
 

xFlankerx

Member
Aug 30, 2005
29
0
0
Indeed, and you, Barkotron, are so pathetically close-minded as to not see what I am talking about.

You would like to see more benchmarks?

http://www.xbitlabs.com/articles/cpu/display/cpu-games2_3.html

They're using a GeForce 7800GT. Old, but it proves the point.

In every single one of the benchmarks, the Socket 939 Athlon 64 3200+ performs as well as anything else. The 5FPS or so difference is ridiculous when you consider the 0.8GHz clock speed difference and the $600-$800 difference in prices.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: xFlankerx
Indeed, and you, Barkotron, are so pathetically close-minded as to not see what I am talking about.

You would like to see more benchmarks?

http://www.xbitlabs.com/articles/cpu/display/cpu-games2_3.html

They're using a GeForce 7800GT. Old, but it proves the point.

In every single one of the benchmarks, the Socket 939 Athlon 64 3200+ performs as well as anything else. The 5FPS or so difference is ridiculous when you consider the 0.8GHz clock speed difference and the $600-$800 difference in prices.

I still don't see your point. You've admitted that CPU limitations do exist, then said they don't (and indeed the latter is what your title says) :laugh:

Rather than insulting people why not make it clear what you are trying to say?

If you are trying to say that most people with a low end A64 and mid-range gfx hardware are not CPU bound in most games, but rather are GPU bound, then just say that. It's clearly true and hardly worthy of posting a thread about it :p

It's just as clear that people with high end gfx hardware (x1900XTs CF or 7900GTX SLI) would be cpu limited at lower resolutions, even in the most modern games like oblivion (and you admitted this in your OP) if they had a 3200+/3000+...
 

Madwand1

Diamond Member
Jan 23, 2006
3,309
0
76
Originally posted by: n19htmare
He plays at 1600x1200. I dunno but his frame rate did indeed double. Can't say what and how but it did... he got a new HDD too, but I don't know how a new HDD would affect the FPS.

This is WoW. WoW is known to be very CPU hungry in addition to GPU & RAM. Before I quit WoW, this is one of the tests I did -- checking CPU & GPU bottlenecks with various hardware on hand. It was clearly CPU bound with an Athlon 2600+, and an Athlon 64 2.0 GHz cured that (note that both these processors have a nominal frequency of around 2 GHz, so architecture matters a lot here). AT has a good article on WoW performance where they show some CPU boundedness. However, WoW is very different from most games, so this sort of conclusion is not valid as a generalization for other games.
 

xFlankerx

Member
Aug 30, 2005
29
0
0
Originally posted by: dug777
If you are trying to say that most people with a low end A64 and mid-range gfx hardware are not CPU bound in most games, but rather are GPU bound, then just say that.

It's just as clear that people with high end gfx hardware (x1900XTs CF or 7900GTX SLI) would be cpu limited at lower resolutions, even in the most modern games like oblivion (and you admitted this in your OP) if they had a 3200+/3000+...

Ah, but an X1900XT CF or 7900GTX SLI setup is literally a one-in-a-thousand system, most likely rarer.

And so I'll say this: most people with a low-end A64 and high-end GFX hardware (X1900XT, 7900GT) are not CPU bound in games, but rather are GPU bound. Thus, one should not fret about having a CPU/GPU balance, as an A64 3200+ and a 7900GT are hardly balanced, but still work. And since a faster CPU will not help you, it would seem that the extra money spent on a dual-core processor in a gaming system would also be pointless.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I recently went from a 2.4C running at 3.0 to a 3700+. With an X800XT PE the difference in gaming is shocking. I know this is a major cpu upgrade, but I still did not expect this much difference, at least with FEAR and HL2. Now if I can get a decent vid card......
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
A CPU can be a limiting factor. Often the graphics driver will offload some calculations to the CPU (if it's dual core), and the CPU also has to handle all the physics, AI, and other processing done by the program. If all those things are complex enough, the game will become very dependent on the CPU.

With a 7900GT SLI setup, the CPU is also important (for me) because of the overhead that running the GPUs in SLI creates. Without a fast CPU (or at least a reasonably fast one), the benefits of SLI go down the drain.
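A rough way to picture all this: each frame, the CPU has to finish its work (AI, physics, draw-call submission) and the GPU has to finish rendering, and to a first approximation the slower of the two sets the frame rate. A minimal sketch of that model, with made-up millisecond costs purely for illustration:

    # Toy model: per-frame CPU and GPU costs in milliseconds (made-up numbers).
    # To a first approximation, the slower side dictates the frame rate.
    def fps(cpu_ms: float, gpu_ms: float) -> float:
        return 1000.0 / max(cpu_ms, gpu_ms)

    # GPU-bound: a faster CPU barely helps.
    print(fps(cpu_ms=10, gpu_ms=25))  # 40 fps
    print(fps(cpu_ms=5, gpu_ms=25))   # still 40 fps

    # CPU-bound (crowded town, lots of AI): a faster CPU helps directly.
    print(fps(cpu_ms=30, gpu_ms=25))  # ~33 fps
    print(fps(cpu_ms=15, gpu_ms=25))  # 40 fps -- GPU-bound again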
 

JM Aggie08

Diamond Member
Jan 3, 2006
8,418
1,009
136
CPUs that can handle a larger workload do wonders for physics, such as in HL2 for example. You'd be surprised.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
1) The oblivion gate benchmark was the one with the lowest fps
2) That an X1900XT is only just CPU limited in the oblivion gate benchmark
3) That you set your game settings to run at playable frames for the worst case scenario
4) Higher frame rates are nice, but it is the minimum frame rates that make games unplayable/playable
5) Following on from 4), a game with a lower average fps but a higher minimum frame rate may well be more enjoyable to play (see the sketch at the end of this post)
6) That the rather notable increase in fps at the oblivion gate with the X1900XT on a 1.8GHz CPU over the X1800XT on a 2.6GHz CPU (25 -> 28.8) is more significant than the increase in town (33.4 -> 46.5)

Numbered so it's easy for you to shoot holes in my logic ;)

So for a computer built for Oblivion alone and nothing else, you should neglect the CPU until you have got to an X1900XT or XTX, and only then increase the CPU budget. While it will help in many/most situations (i.e. running around in town), it is not going to make as much difference where it matters: the worst case scenario where the game turns into a slide show and you start tearing your hair out and diving for the menu.
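To make points 4) and 5) concrete, here's a minimal sketch with invented frame-rate traces (the numbers are mine, purely for illustration):

    # Two invented per-second fps traces for points 4) and 5):
    # config A has the higher average; config B has the higher minimum.
    config_a = [60, 55, 58, 12, 10, 57]  # flies in town, slideshow at the gate
    config_b = [40, 42, 38, 30, 28, 41]  # slower on average, never unplayable

    for name, trace in (("A", config_a), ("B", config_b)):
        print(f"{name}: avg {sum(trace) / len(trace):.1f} fps, min {min(trace)} fps")
    # A: avg 42.0 fps, min 10 fps -> hair-tearing at the oblivion gate
    # B: avg 36.5 fps, min 28 fps -> playable throughout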
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: munky
Look at it from this point: except for the oblivion gate benchmark, the x1900 CF was getting the same fps as a single x1900xtx in the town and dungeon benches. That's a severe cpu limitation already. And even in oblivion gate the single x1900xtx got a slight increase with a faster cpu. What that means basically is that for any game that's less gpu-intensive than the oblivion gate scene (which goes for about 99% of current games), the 1.8ghz A64 will be a limitation to a single x1900xtx. And a mediocre P4 will be even more of a limitation.

Although what you are saying is entirely true, it is never a question of which is a limitation in a game, the cpu or the gpu. Q: why? A: Because every game is always both gpu and cpu limited. How so? If you benchmarked a game in 2001 and then ran the same game with an FX-60 and the same graphics card, you'd get better framerates; the same goes for a graphics card swap, since you'd be able to turn on more shaders and features and higher resolution for "free". The point is which is more of a bottleneck, the cpu or the gpu?

A cpu bottleneck would mean that if you have a P4 3.0ghz and a 7800GTX 256mb, you'd get faster framerates by substituting a P4 4.0ghz / 4000+ (33% faster) than by substituting an X1900XTX (say 33% faster). The point is that at any point in time a game is always either cpu and/or gpu limited. However, Oblivion benefits much more from a faster gpu than a faster cpu, in that if you had a choice between getting a faster gpu and a faster cpu, you'd want the faster gpu, and this is certainly true for 99% of all games. So in reality it is the graphics card that is the bottleneck for most games. Yes, the cpu does decrease minimum/average frames, but not as much as swapping a 6600GT for a 7800GT. Of course you'd optimally want the fastest cpu and the fastest gpu. But for all intents and purposes, with a faster gpu you'll be able to turn on distant lands and other features, whereas going from an A64 3000+ to a 4000+ won't let you do any of that (it'll increase framerates by say 20fps with distant lands off, but if your graphics card is weak, turning those features on will drop framerates into single digits, as the videocard is overloaded immediately when it cannot handle the load).

If I didn't explain myself well, this example will illustrate exactly what I am saying:

CPU performance tests: at 1600x1200, an A64 2.0ghz gets 33fps vs. 40fps for a 2.6ghz A64; a 1.2ghz A64 gets only 20fps (unplayable). Run on 7900GTX SLI.
GPU performance tests: at 1600x1200 (same settings), a 7800GT gets 19fps vs. 33fps for an X1900XT.

In other words, the X1900XT is roughly 70% faster than the 7800GT, while going from an FX-60 at 2.6ghz down to a 2.0ghz A64 only costs you about 20% in performance with 7900GTX SLI. So clearly you'd rather have the SLI system and an A64 3200+ than an A64 4000+ and a 7800GT. Even though Oblivion is one of the few games where we see substantial performance increases at 1600x1200 with max settings due to a faster cpu (i.e. ~20%), they are nothing compared to the differences you'll encounter once you start swapping graphics cards.
And certainly pairing an A64 6000+ with a 7600GT isn't going to give you better performance than an A64 3000+ with a 7900GTX.
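The arithmetic behind those two percentages, worked out from the fps figures quoted above (a minimal sketch; the little helper function is mine):

    # The 70% vs 20% comparison above, from the 1600x1200 Oblivion figures.
    def percent_faster(fast_fps: float, slow_fps: float) -> float:
        """How much faster the better config is, relative to the slower one."""
        return (fast_fps - slow_fps) / slow_fps * 100

    # GPU swap at identical settings: X1900XT (33 fps) vs 7800GT (19 fps).
    print(percent_faster(33, 19))  # ~74% -- the "roughly 70%" gpu gap

    # CPU swap on 7900GTX SLI: A64 2.6ghz (40 fps) vs A64 2.0ghz (33 fps).
    print(percent_faster(40, 33))  # ~21% -- the "about 20%" cpu gap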

Same with COD2, BF2, Far Cry and FEAR. They might be cpu limited, but they are gpu bottlenecked. Xbitlabs has shown that it's only below a 3000+ rating or so that the cpu limitation starts to become as severe as the graphics card one, if not more so. I agree with you that an A64 1.8ghz will limit an X1900XTX, but it WILL NOT ruin the gaming experience. For most users with modern cpus, upgrading the cpu might bring a 20-30% increase, but their graphics card limitation might be 70-100%. So it's a question of: do I want to spend $300 to get a 30% increase, or $300 to get 70%? (And getting the 70% still doesn't remove the cpu limitation, but it's a better investment.) Of course getting a faster cpu and a faster gpu will allow us to get the 100% gaming experience, which is the most optimal condition. Optimal marginal utility per $1 rests on the graphics card, even in Oblivion, assuming the user has a 3000+ or faster rated P4 or A64.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Who said anything about marginal utility? It's indisputable that anyone with a 3000+ or faster is better off going with a faster GPU than CPU, and I don't see ANYONE arguing with that.

However, CPU limitations blatantly do exist. I'm horribly CPU limited in Far Cry at 1024/all settings maxed/water at ultra/4xAA/4xAF with my 6600GT and Tbred-B at 2.1GHz... similarly in UT and HL2 when running bots.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747&p=3

In Oblivion running x1900 CF, are you really telling me that those cards aren't being bottlenecked by a 3000+ cpu? :laugh:

There's a MASSIVE difference between the fx-60 and the 3000+...

AT has a quote that illustrates the situation nicely, and this isn't even concerning a 3000+, but rather a 3500+...

The usefulness of the Radeon X1900 XT CrossFire setup diminishes significantly as you begin to move down the list of contenders; you'll actually lose over 20% of your real world frame rate if you've got an Athlon 64 3500+ vs. if you had an Athlon 64 FX-60. These aren't low resolution tests designed to isolate the impact of your CPU, these are tests at reasonable display settings for the GPU setup and these are real world results.

I'd call that a CPU limitation affecting gaming performance, wouldn't you? ;)
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
A $200 difference between the X1800XT and X1900XT is around the same as the price difference between a 3700+ and a 4200X2. You would be much, much better off with an X1900XT and a 3700+ than you would with a 4200X2 and an X1800XT.

So for Oblivion at least, buying dual core is a very bad idea, until you are talking about a CrossFire system, which is even less popular than the idea that Elvis is in fact alive and well, guiding the course of the US of A from a mothership somewhere above Utah.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
This thread is utterly pointless because it was started without a point, and continued in the same vein. It's no surprise or indeed news to anyone that for anyone with a 3000+ or faster, a GPU upgrade is far more rewarding than a CPU upgrade. It's also indisputable that CPU limitations exist, are real, and that if you own a massively powerful gfx setup (i.e. CF or SLI at the top end) and even a lower-end A64, you would see 'CPU limitations that would affect gaming performance' playing Oblivion at least.

I think the thread can be put to bed on that note.
 

Looney

Lifer
Jun 13, 2000
21,938
5
0
Originally posted by: n19htmare
You do need a faster CPU if the CPU is what's bottlenecking the performance. Take this for example.

My brother is a die-hard WoW player. His rig was my old P4 2.8 @ 3.2 with a 6600GT vid card. He netted about 28-30 FPS... Then I bought the Pentium M and overclocked it to 2.5. THAT was all I changed, and his FPS jumped to 60 INSTANTLY.
A faster CPU sure did him wonders.

I find that a little hard to believe. 3.2GHz with a 6600GT should give you better than 30fps to begin with in WoW. And I don't see why it would jump so significantly with a CPU upgrade, ESPECIALLY in an MMO.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Without a point? That the AnandTech reviewers got the summary wrong is a good enough point for me.

It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.

A ~$200 increase going from a 3700+ to a 4200X2 resulting in a 1.3/0.1/2.2 fps increase is utterly unjustified. This review should have shown that there is no point at all in getting a dual core until you're staring at a CrossFire system.

Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.

Example:
A64 4000+ vs 4800X2
Price: $337 vs $632
39.6 vs 44.1 fps
44.3 vs 48.5 fps
67.4 vs 74.0 fps

Going for a dual core CPU that costs $300 more results in at most an 11% improvement in fps. Dual core is not worth the money.
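The back-of-envelope math on those numbers (prices and fps are the figures above; the script and the cost-per-fps framing are just mine):

    # Value math for the A64 4000+ vs 4800X2 example above.
    price_single, price_dual = 337, 632     # USD, as quoted
    extra_cost = price_dual - price_single  # $295 premium for dual core

    benches = [(39.6, 44.1), (44.3, 48.5), (67.4, 74.0)]  # fps: 4000+ vs 4800X2
    for single, dual in benches:
        gain = (dual - single) / single * 100
        per_fps = extra_cost / (dual - single)
        print(f"+{gain:.1f}% for ${extra_cost}, i.e. ${per_fps:.0f} per extra fps")
    # Best case is +11.4%, at roughly $66 per additional frame per second.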

I think that if this topic is to be put to bed it should be with this line : Anandtech got it wrong.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Bobthelost
Without a point? That the AnandTech reviewers got the summary wrong is a good enough point for me.

It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.

A ~$200 increase going from a 3700+ to a 4200X2 resulting in a 1.3/0.1/2.2 fps increase is utterly unjustified. This review should have shown that there is no point at all in getting a dual core until you're staring at a CrossFire system.

Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.

Example:
A64 4000+ vs 4800X2
Price: $337 vs $632
39.6 vs 44.1 fps
44.3 vs 48.5 fps
67.4 vs 74.0 fps

Going for a dual core CPU that costs $300 more results in at most an 11% improvement in fps. Dual core is not worth the money.

I think that if this topic is to be put to bed it should be with this line : Anandtech got it wrong.

Your example is sh!t.
I don't think anyone here would even think about getting a 4800+.

They would most likely get an X2 3800+ or an Opty 165 and overclock it. So, since the price is then almost equal, IT DOES make sense to go dual core.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Originally posted by: wizboy11
Originally posted by: Bobthelost
Without a point? That the AnandTech reviewers got the summary wrong is a good enough point for me.

It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.

A ~$200 increase going from a 3700+ to a 4200X2 resulting in a 1.3/0.1/2.2 fps increase is utterly unjustified. This review should have shown that there is no point at all in getting a dual core until you're staring at a CrossFire system.

Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.

Example:
A64 4000+ vs 4800X2
Price: $337 vs $632
39.6 vs 44.1 fps
44.3 vs 48.5 fps
67.4 vs 74.0 fps

Going for a dual core CPU that costs $300 more results in at most an 11% improvement in fps. Dual core is not worth the money.

I think that if this topic is to be put to bed it should be with this line : Anandtech got it wrong.

Your example is sh!t.
I don't think anyone here would even think about getting a 4800+.

They would most likely get an X2 3800+ or an Opty 165 and overclock it. So, since the price is then almost equal, IT DOES make sense to go dual core.

Manners seem to have disappeared here :rolleyes:

I don't know why on earth you think you can't overclock a 144. It's $170, and it'll OC to around the same speed as an FX-57, so with the 165 you're paying around $130 more for a small performance increase. The price is not equal there by any means, and yet that difference is enough to get you from an X1800XT to within spitting distance of an X1900XT.

Of course no one here would be stupid enough to buy a 4800X2 when they can OC a 165 for a fraction of the cost!
Oh, no wait a sec. Some people around here have and do, including some people with single-card systems and FX-60s. If they had gotten a 4000+ instead, the difference is more than enough to get a CrossFire motherboard and a CrossFire X1900 card.

They made poor choices in terms of performance and cost, but conclusions like the ones in the review only help perpetuate the rather unjustified notion that dual core is better value for money than SLI or CrossFire, when it isn't.
 

xFlankerx

Member
Aug 30, 2005
29
0
0
Bob's been saying everything perfectly as far as I'm concerned :)

Originally posted by: Bobthelost
That the anandtech reviewers got the summary wrong is a good enough point for me.

I think that if this topic is to be put to bed it should be with this line : Anandtech got it wrong.

Indeed, that's the shocking part. I respect AnandTech and the people that work here a lot, but to miss something like the unjustified 1.3/0.1/2.2 fps increase that you mentioned is appalling.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
You people either can't read, or have a learning disability :p

Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2. Oblivion is quite possibly the first game we've tested where we can actually justify (and this is a stretch) an FX-60 and a pair of X1900 XTs, as it enables you to get much more out of them than most games do. As we stated in the beginning, you can also try hacking your configuration files and downloading some mods, improving performance in other ways. If you just want to set the detail sliders on Maximum and play the game at high resolutions, though, X1900 XT CF and a fast dual-core CPU will get the job done nicely. (Good luck convincing yourself or your significant other of that "need", though!)

I can't see the AT reviewer at ANY stage claiming that 'dual core is better value for money than SLI or crossfire', which is what Bobby and xflanker appear to be claiming. They are MERELY POINTING OUT THAT CPU LIMITATIONS DO EXIST, THAT THEY HAVE AN IMPACT ON A TOP END GFX SETUP IN OBLIVION, AND THAT IF YOU HAVE THE MONEY AND WANT TOP PERFORMANCE YOU WILL WANT AN FX-60 CURRENTLY. THEY MAKE NO COMMENTS AS TO VALUE FOR MONEY.

I hope what I've said here actually sinks into your heads this time. There's nothing appalling, unjustified, or wrong about what they say, IF YOU ARE ACTUALLY CAPABLE OF READING THE WHOLE ARTICLE.

Nuff said :p