Originally posted by: xFlankerx
Very nice post SlitheryDee, I agree, of course.
Originally posted by: xFlankerx
Thank you Bobthelost, you've basically been saying what I thought.
This paragraph completely contradicts itself.
You say, "you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas". Please explain exactly how you square this with "you don't need a fast CPU to get the most out of your GPU"? If the CPU is limiting the possible FPS, then you're not getting the most out of your GPU, are you?
I'm saying that the increase in FPS that you see in the Town benchmark IS NOT FROM THE X1900XT STRETCHING ITS LEGS. It's from the CPU. Thus, you are getting no more out of your GPU than you would otherwise, but your CPU is handling things better and is providing better performance.
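To put the bottleneck argument in concrete terms, here's a minimal sketch of the usual way to think about it (the numbers are made up for illustration, not taken from any review): the frame rate you actually see is capped by whichever of the CPU or GPU takes longer per frame.

```python
# Toy bottleneck model: every frame needs both CPU work (AI, physics, draw calls)
# and GPU work (rendering); the slower side caps the frame rate.
# All numbers below are made up for illustration.

def achievable_fps(cpu_fps_limit, gpu_fps_limit):
    """The FPS you actually see is limited by the slower component."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Crowded town scene: heavy AI load, so the CPU is the cap.
print(achievable_fps(cpu_fps_limit=35, gpu_fps_limit=80))  # 35 -> CPU-bound
# Drop in a faster CPU: FPS rises even though the GPU isn't doing any more work per frame.
print(achievable_fps(cpu_fps_limit=55, gpu_fps_limit=80))  # 55 -> still CPU-bound
# Oblivion-gate scene: heavy shader load, so the GPU is the cap and a faster CPU barely matters.
print(achievable_fps(cpu_fps_limit=55, gpu_fps_limit=30))  # 30 -> GPU-bound
```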
Originally posted by: Bobthelost
Originally posted by: PingSpike
I'd like to see the test done with single cards myself. I still don't consider multicard solutions mainstream in any shape or form.
You mean like this? Only one of the lines is for a crossfire system.
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747&p=4
Originally posted by: xFlankerx
Originally posted by: munky
Look at it from this point: except for the oblivion gate benchmark, the x1900 CF was getting the same fps as a single x1900xtx in the town and dungeon benches. That's a severe cpu limitation already. And even in oblivion gate the single x1900xtx got a slight increase with a faster cpu. What that means basically is that for any game that's less gpu-intensive than the oblivion gate scene (which goes for about 99% of current games), the 1.8ghz A64 will be a limitation to a single x1900xtx. And a mediocre P4 will be even more of a limitation.
The X1800XL was getting the same FPS as the X1900XT CF setup, lol, even when paired with an FX-60. Are you going to tell me that an FX-60 is not enough to take on these video cards? "A slight increase" isn't worthy of note, and is well within the margin of error.
And I've already told you that even though the Town benchmark wasn't as demanding as the Oblivion Gate benchmark on the GPU, it was the most pressure ANY GAME ON THE MARKET could have put on those processors. Your statement should be restated to say, "What that means basically is that for any game that's less CPU-intensive than the Oblivion Town scene (which goes for about 99% of current games), the 1.8GHz A64 will be just fine with an X1900XTX." And a 3.4GHz P4, while not quite up to par with its AMD counterparts, is hardly a "mediocre" CPU.
Originally posted by: SniperWulf
Personally, I feel that both are right. It really just depends on the game in question. So the OP's statement is neither right nor wrong; it just clearly doesn't apply to all situations. To demonstrate what I mean, we need more benchmarks of games using the same methodology.
Originally posted by: xFlankerx
Indeed, and you, Barkotron, are so pathetically close-minded as to not see what I am talking about.
You would like to see more benchmarks?
http://www.xbitlabs.com/articles/cpu/display/cpu-games2_3.html
They're using a GeForce 7800GT. Old, but it proves the point.
In every single one of the benchmarks, the Socket 939 Athlon 64 3200+ performs as well as anything else. The 5FPS or so difference is ridiculous when you consider the 0.8GHz clock speed difference and the $600-$800 difference in prices.
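For what it's worth, a quick back-of-the-envelope check using only the figures quoted above (all approximate; the $700 midpoint is an assumption):

```python
# Rough value-for-money check on the figures quoted above (all approximate).
fps_gain = 5           # ~5 fps between the S939 A64 3200+ and the fastest chip tested
clock_gain_ghz = 0.8   # ~0.8 GHz higher clock at the top end
extra_cost_usd = 700   # assumed midpoint of the quoted $600-$800 price difference

print(f"{extra_cost_usd / fps_gain:.0f} dollars per extra fps")    # ~140 $/fps
print(f"{fps_gain / clock_gain_ghz:.1f} extra fps per extra GHz")  # ~6.3 fps/GHz
```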
Originally posted by: n19htmare
He plays at 1600x1200. I dunno, but his frame rate did indeed double. Can't say what or how, but it did... he got a new HDD too, but I don't know how a new HDD would affect the FPS.
Originally posted by: dug777
If you are trying to say that most people with a low end A64 and mid-range gfx hardware are not CPU bound in most games, but rather are GPU bound, then just say that.
It's just as clear that people with high end gfx hardware (x1900XTs CF or 7900GTX SLI) would be cpu limited at lower resolutions, even in the most modern games like oblivion (and you admitted this in your OP) if they had a 3200+/3000+...
Originally posted by: munky
Look at it from this point: except for the oblivion gate benchmark, the x1900 CF was getting the same fps as a single x1900xtx in the town and dungeon benches. That's a severe cpu limitation already. And even in oblivion gate the single x1900xtx got a slight increase with a faster cpu. What that means basically is that for any game that's less gpu-intensive than the oblivion gate scene (which goes for about 99% of current games), the 1.8ghz A64 will be a limitation to a single x1900xtx. And a mediocre P4 will be even more of a limitation.
Originally posted by: n19htmare
You do need a faster CPU if the CPU is what's bottlenecking the performance. Take this for example.
My brother is a die-hard WoW player. His rig was my old P4 2.8 @ 3.2 with a 6600GT vid card. He netted about 28-30 FPS... Then I bought the Pentium M and overclocked it to 2.5. THAT was all I changed, and his FPS jumped to 60 INSTANTLY.
Faster CPU sure did him wonders.
It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.
Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.
Originally posted by: Bobthelost
Without a point? That the anandtech reviewers got the summary wrong is a good enough point for me.
It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.
A ~$200 increase going from a 3700+ to a 4200+ X2 resulting in a 1.3/0.1/2.2 fps increase is utterly unjustified. This review should have shown that there is no point at all in getting a dual core until you're staring at a CrossFire system.
Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.
Example:
A64 4000+ vs 4800+ X2
Price: $337 vs $632
FPS: 39.6 vs 44.1
FPS: 44.3 vs 48.5
FPS: 67.4 vs 74.0
Going for a dual core CPU that costs $300 more results in at most an 11% improvement in fps. Dual core is not worth the money.
I think that if this topic is to be put to bed it should be with this line: Anandtech got it wrong.
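For what it's worth, the "at most 11%" figure roughly checks out against the numbers above; a quick sketch of the arithmetic (benchmark labels omitted because the post doesn't give them):

```python
# Quick check of the 4000+ vs 4800+ X2 numbers quoted above.
a64_4000 = [39.6, 44.3, 67.4]   # single core, $337
x2_4800  = [44.1, 48.5, 74.0]   # dual core,   $632

for single, dual in zip(a64_4000, x2_4800):
    print(f"{(dual / single - 1) * 100:.1f}% faster")   # 11.4%, 9.5%, 9.8%

print(f"for ${632 - 337} more")                          # $295 more
```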
Originally posted by: wizboy11
Originally posted by: Bobthelost
Without a point? That the anandtech reviewers got the summary wrong is a good enough point for me.
It's also important to look at the multi-core optimizations that Oblivion provides. The benefit of a dual core processor is definitely visible in Oblivion, and we welcome more games where there's a tangible real world performance improvement to multi-core processors. The difference isn't quite as large as what we've seen with Quake 4, but we're heading in the right direction.
A ~$200 increase going from a 3700+ to a 4200+ X2 resulting in a 1.3/0.1/2.2 fps increase is utterly unjustified. This review should have shown that there is no point at all in getting a dual core until you're staring at a CrossFire system.
Those lucky enough to have a high end CrossFire setup, for example with two X1900 XTs, will definitely want to invest in a high end Athlon 64 X2.
Example:
A64 4000+ vs 4800+ X2
Price: $337 vs $632
FPS: 39.6 vs 44.1
FPS: 44.3 vs 48.5
FPS: 67.4 vs 74.0
Going for a dual core CPU that costs $300 more results in at most an 11% improvement in fps. Dual core is not worth the money.
I think that if this topic is to be put to bed it should be with this line: Anandtech got it wrong.
Your example is sh!t.
I don't think anyone here would even think about getting a 4800+.
They would most likely get an X2 3800+ or an Opty 165 and overclock it. Since the price is then almost equal, IT DOES make sense to go dual core.
Originally posted by: Bobthelost
That the anandtech reviewers got the summary wrong is a good enough point for me.
I think that if this topic is to be put to bed it should be with this line: Anandtech got it wrong.
