Originally posted by: HardWarrior
you seem to want to challenge nearly everything, as opposed to learning about it.
I don't feel I need to learn how games are coded or how GPUs and CPUs are built. All I need to know is how to read. Benchmarks prove that at high resolutions, graphically-intensive games are completely unaffected by dual-core optimizations, and even by processor speed in general (a 2.6GHz X2 benching the same as a 1.8GHz A64). All my links are above, and you are free to dispute their findings since their testing methods are easy to replicate.
You've taken this too personally. My comment about wanting you to justify a dual-core CPU for high-end gaming wasn't about you purchasing one. What we purchase individually doesn't matter to this discussion at all. I am talking about whether or not next-gen games (such as Prey, UT2k7, Crysis, etc.) will see any performance benefit from a dual-core CPU over a single-core counterpart at 1600x1200 with AA/AF/HDR/etc.
All I've done is provide links to benches showing that current games don't even come close to doing that (99% of them don't show an increase from DC even at 1280x1024... only Q4 does). Because these games, such as Oblivion, are so GPU-hungry, I don't feel that developers will make these next-gen games-- which I'm sure will be just as GPU-hungry as current games like FEAR, Oblivion, and COD2-- in such a way that they will do any better with dual-core at those high resolutions. I get that you feel differently. Okay. No big deal. I'm fine agreeing to disagree.
Just so you know, I'm immune to feeling bad or stupid over being able to afford some of the things I want.
I hate responding to this because that's just going to keep this subtopic going... but I have to repeat what I said earlier-- you're making this personal and this quote shows that. I have no problem with what people buy, nor am I basing my statements on what I have bought (I often advocate best bang-for-buck, yet I adopted SLI way back in Jan '05).
I am simply making the point that dual-core CPUs don't provide any better performance right now for high-res gaming. And the benefit is so far off (virtually no increases at lower resolutions either) that I don't see it helping at high res in the near future either. It's just an opinion. And a self-admittedly, somewhat ignorant one, since I know nothing about the hardware or software and the way it works.
All I know is that my opinion is backed by benchmarks of real-world tests. Since I have no inside knowledge of the industry, my opinion is likely to change as soon as those real-world benchmark results do. If you have any links you'd like to share that either disprove the ones I posted or discuss what next-gen game coding will do for dual-core at high resolutions, I'd be happy to see them and let them change my mind. Until then, we're both just stuck with our opinions. But there's really no need to make this so personal regarding your system or your purchase. This isn't about that.
If we can actually discuss this technology and what the future holds, let's by all means continue. But if it's only going to degrade into a personal jab-fest, let's just drop it or take it to PMs. I know, I know... nobody wants to back down from an argument on an internet forum because they want to be seen as "right". But the fact of the matter is that neither one of us is right. After all, we're guessing about the performance of future games on future chips with future chipsets on a future OS. No matter what, we're both gonna be wrong.