Dual cores for gamers?


HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Something people need to realize about DC CPUs: Windows recognizes them automatically, which makes the whole system a lot snappier than before. So, while current games don't take advantage of DC (Q4 is the only game I know of that does, and it runs great at 1600x1200 on my box with maxed settings), future games will. Most notably for me, Prey will be DC-aware.
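For what it's worth, here's roughly how a program even sees the second core on Windows-- a minimal sketch using the documented Win32 call (the branching comment is just illustration, not from any actual game):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    SYSTEM_INFO si;
    GetSystemInfo(&si);  // the OS fills in the logical processor count
    printf("Logical processors: %lu\n", si.dwNumberOfProcessors);
    // A DC-aware game could branch here: if the count is > 1,
    // spin up a worker thread for physics, AI, decompression, etc.
    return 0;
}
```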
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Battlefield 2 has never crashed or been disconnected from a server on my machine as a result of the CPU being dual core.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: HardWarrior
So, while current games don't take advantage of DC (Q4 is the only game I know of that does, and it runs great at 1600x1200 on my box with maxed settings), future games will. Most notably for me, Prey will be DC-aware.

Yes, but DC-aware does not necessarily translate to actual benefits. From what I understand, Oblivion was the first game out with native DC support (i.e. no patch needed). Yet, the results show no benefit, even at a low 1280x1024 noAA/AF.

And, I'm gonna wager a guess that at your settings-- 16x12 AA/AF-- you won't see any DC performance increase anytime soon. Those settings are just too GPU-intensive for the CPU to make any difference (whether it's 2GHz or 2.8GHz). And it's not just Oblivion; Quake 4 behaves the same way, and FEAR at 16x12 AA/AF gets 30fps no matter what CPU is used-- a 3000+, an FX57, or an X2 4800+.

Granted, I can't predict the future... and I know next-to-nothing about game design or graphics cards in general. But I don't see popular games-- FPSs, and now the FPS/RPG genre-- being affected by dual-core CPUs at 1600x1200 with AA/AF anytime soon, including Prey. I don't know what needs to be done to have the CPU affect those settings more-- offload some physics to another card or CPU core, program more advanced AI calculations, etc. But, until something revolutionary is done, I can't see DC having an impact at 1600x1200 with AA/AF.
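Just to make concrete the kind of thing I mean-- a toy sketch (modern C++ for brevity; PhysicsStep/RenderStep are made-up stand-ins, not real engine code) of a frame that offloads physics to the second core, and why it wouldn't show up at GPU-bound settings:

```cpp
#include <thread>
#include <chrono>

// Made-up stand-ins: a real engine would do collision/AI and draw calls here.
void PhysicsStep() { std::this_thread::sleep_for(std::chrono::milliseconds(4)); }
void RenderStep()  { std::this_thread::sleep_for(std::chrono::milliseconds(12)); }

void GameFrame() {
    // On dual core, the 4ms of simulation hides behind the 12ms of rendering,
    // so the frame costs ~12ms instead of ~16ms. But at 16x12 AA/AF the GPU
    // side dominates, so the saved CPU time doesn't move the framerate.
    std::thread physics(PhysicsStep);  // runs on the other core
    RenderStep();                      // this thread keeps feeding the GPU
    physics.join();                    // sync up before the next frame
}

int main() { for (int i = 0; i < 10; ++i) GameFrame(); }
```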

There are a lot of other great uses for a dual-core chip right now. High-end gaming just isn't one of them. It seems this thread was more about how DC chips hurt gaming rather than if they help, and, as it has been stated, most of the issues for existing games have been resolved. If your building budget can handle the extra $150+ for an X2 3800+ (or Opty 165) over the 3200+, go for it. It certainly won't hurt your gaming performance. But I'd only get it for a primarily-gaming system if I've already budgeted in a great display, 2GB of ram, and a very fast video solution.
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: deadseasquirrel
Yes, but DC-aware does not necessarily translate to actual benefits.

Semantics, DS. The Prey devs say that the game will take advantage of DC boxes. How that plays out we'll just have to see.

And, I'm gonna wager a guess that at your settings-- 16x12 AA/AF-- you won't see any DC performance increase anytime soon.

There are already benefits, at least that's what nV says. Their driver suites already make use of DCs.

Granted, I can't predict the future... and I know next-to-nothing about game design or graphics cards in general. But I don't see popular games-- FPSs, and now the FPS/RPG genre-- being affected by dual-core CPUs at 1600x1200 with AA/AF anytime soon, including Prey.

Then you aren't using your imagination, ds. There's a lot more for CPUs to do in a modern game than just feeding the rendering pipeline. Game programmers have proven themselves quite adept at utilizing available resources over the years, so much so that we have to regularly buy new HW just to keep up. There's no way that some industrious game coder won't find a way to sense a second core, and then use it in some way to enhance gameplay. Again you bring up Prey. The developers have SAID that Prey will take advantage of DCs. Is it prudent, or logical, to ignore that without having even SEEN the game?
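Here's one off-the-top-of-my-head example (purely a toy sketch in modern C++, not from any shipping game) of the kind of thing a coder could do with that second core-- split the per-entity AI work in half:

```cpp
#include <thread>
#include <vector>

struct Entity { float heading = 0.0f; };       // toy stand-in for a game actor
void ThinkAI(Entity& e) { e.heading += 0.1f; } // pretend pathfinding/decisions

// Update the front half of the list on this core, the back half on the other.
void UpdateAI(std::vector<Entity>& ents) {
    const std::size_t mid = ents.size() / 2;
    std::thread second([&ents, mid] {
        for (std::size_t i = mid; i < ents.size(); ++i) ThinkAI(ents[i]);
    });
    for (std::size_t i = 0; i < mid; ++i) ThinkAI(ents[i]);
    second.join();  // both halves done before the frame continues
}

int main() {
    std::vector<Entity> ents(10000);
    UpdateAI(ents);
}
```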

I don't know what needs to be done to have the CPU affect those settings more-- offload some physics to another card or CPU core, program more advanced AI calculations, etc.

Wait a second-- you say in this very same post that such things aren't likely to happen! ;)

But, until something revolutionary is done, I can't see DC having an impact at 1600x1200 with AA/AF.

Why does the inclusion of DCs into gaming have to be heralded by something revolutionary, regardless of resolution? Aren't incremental changes just as good, though not as sexy?

High-end gaming just isn't one of them. -snip-

Forgive this, but you're just pontificating now ds, and you don't need my participation for that.

 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
I really didn't intend for my post to be combative, HardWarrior. I'm sorry if it gave that impression. My intention was to show, via benchmarks, how the latest games have been performing with dual-core CPUs at higher resolutions. I freely admit that I know nothing about game coding or GPU hardware. Developers have said before that their game is dual-core aware and will provide improvements (Oblivion); in other cases they've released a patch to enable dual core (COD2); and, like you said, nV says there are benefits in their drivers. However, in all cases, no performance increase was to be found, especially at higher resolutions... benchmarks prove this.

So, my guess (and, again, I fully admit I know nothing and it is only a guess) is that the next batch of games won't benefit from a dual-core chip at 1600x1200 either. They're likely to benefit at 1280x1024 like Q4 does. If I am wrong, I will admit it. No big deal. I am basing everything on current releases with current CPUs. AM2, Conroe, DX10, and Vista are all wildcards that could prove me wrong. I hope they do because I am ALL about better performance.

But, because of these current games, current CPUs, and current benchmarks out there, if you're building a current system for only gaming, you're better off doing as I said and putting that extra money into a faster video card and nicer display (that extra $180 for the X2 would be better spent on something that actually affects gaming performance). I don't mind debating and pontificating with ya here... that's what this place has been about for as long as I've been here. So, yeah, I can pontificate on my own, or we can actually discuss the topic.
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: deadseasquirrel
I really didn't intend for my post to be combative, HardWarrior.

I'm reasonably sure that this wasn't your intention. You did, however, come off as if you were the established expert on the issue. Far too often, when it comes to discussions about newer technology, some people try to present their reasons for not being early adopters, whatever they may be, as all-encompassing. Intentional or not, your post was rife with this attitude.

My intention was to show, via benchmarks, how the latest games have been performing with dual-core CPUs at higher resolutions.

Then you wasted your time. Current game performance is near-irrelevant. Development cycles for games being as long as they are, advancing hardware features often lag behind in terms of implementation.

 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: HardWarrior
Originally posted by: deadseasquirrel
I really didn't intend for my post to be combative, HardWarrior.

I'm reasonably sure that this wasn't your intention. You did, however, come off as if you were the established expert on the issue. Far too often, when it comes to discussions about newer technology, some people try to present their reasons for not being early adopters, whatever they may be, as all-encompassing. Intentional or not, your post was rife with this attitude.

I don't want to attack and defend with you. I really just want to discuss this topic, but the claim above is false. If you reread my original post, you will see all over the place:

From what I understand
I'm gonna wager a guess
I can't predict the future
I know next-to-nothing about game design or graphics cards in general
I don't know what needs to be done

That doesn't sound like someone professing to be "the established expert".

Current game performance is near-irrelevant. Development cycles for games being as long as they are, advancing hardware features often lag behind in terms of implementation.
Then I must ask you-- if you feel judging dual-core performance by current games is irrelevant, then how do you justify buying a dual-core CPU for gaming today? Because in the future games will take advantage of it? Why not just buy one in the future when they do then? For that matter, why not buy 4GB of ram right now since in the future, software developers will likely find a use for it?
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: deadseasquirrel
Originally posted by: HardWarrior
Current game performance is near-irrelevant. Development cycles for games being as long as they are, advancing hardware features often lag behind in terms of implementation.
Then I must ask you-- if you feel judging dual-core performance by current games is irrelevant, then how do you justify buying a dual-core CPU for gaming today? Because in the future games will take advantage of it? Why not just buy one in the future when they do then? For that matter, why not buy 4GB of ram right now since in the future, software developers will likely find a use for it?

QFT
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: deadseasquirrel
I don't want to attack and defend with you.

Perhaps not, but I find it interesting that while you seem to take great pride in admitting to knowing next to nothing about the subject matter, you seem to want to challenge nearly everything, as opposed to learning about it.

Current game performance is near-irrelevant. Development cycles for games being as long as they are, advancing hardware features often lag behind in terms of implementation.

Then I must ask you-- if you feel judging dual-core performance by current games is irrelevant, then how do you justify buying a dual-core CPU for gaming today?

I'm under no obligation to justify anything to you. I will, however, restate something I already said: XPro is dual-core aware, which translates to tangible benefits right now. After all, I do a lot more with my box than play games, and I want the quickest performance I can afford. Of the 3 games I'm waiting for, which will probably be released in the next 6 months, all are DC compatible.

Why not just buy one in the future when they do then?

I didn't have to wait. I can both afford and enjoy the benefits of having a DC CPU now, and later.

For that matter, why not buy 4GB of ram right now since in the future, software developers will likely find a use for it?

:) 4gig at this point will force all the RAM in my box to a 2T command rate, and this generation of AMD CPUs doesn't work as well at 2T. Moreover, current high-end gaming has just quietly transitioned to 2gig for optimal performance in some cases. The industry simply isn't going to push its luck and take a chance on alienating PC gamers by all of a sudden jumping to a 4gig footprint. Remember, they want to sell games to as many folks as possible. As it is, a fair amount of 4gig in my current rig would just sit there idle. That isn't cost effective when I'm very pleased with my rig's performance right now.

Now, do yourself a favor. Be honest. Not once have I asked you to "justify" your choice not to buy a DC CPU, yet you've given me the third degree, trying to make points all the way. Nor have I taken it upon myself to challenge the way you spec your rig. If you don't see this sort of behavior the way I explained it, then you're deluding yourself. Somewhere in your head you want to feel good about what you have, and for some people that means berating those who've made other choices.

Just so you know, I'm immune to feeling bad or stupid over being able to afford some of the things I want.



 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: HardWarrior
you seem to want to challenge nearly everything, as opposed to learning about it.

I don't feel I need to learn how games are coded or how GPUs and CPUs are built. All I need to know is how to read. Benchmarks prove that at high resolutions, graphically-intensive games are completely unaffected by dual-core optimizations, and even by processor speed in general (a 2.6GHz X2 benching the same as a 1.8GHz A64). All my links are above, and you are free to dispute their findings, as their testing methods are easy to replicate.

You've taken this too personally. My comment about wanting you to justify a dual-core CPU for high-end gaming wasn't about you purchasing one. What we purchase individually doesn't matter to this discussion at all. I am talking about whether or not next-gen games (such as Prey, UT2k7, Crysis, etc.) will see any performance benefit from a dual-core CPU over a single-core counterpart at 1600x1200 with AA/AF/HDR/etc.

All I've done is provide links to benches that show current games not even coming close to doing that (99% of them don't even show an increase from DC at 1280x1024... only Q4 does). Because these games, such as Oblivion, are so GPU-hungry, I don't feel that developers will make these next-gen games-- which I'm sure are going to be just as GPU-hungry as current games like FEAR, Oblivion, and COD2-- in such a way that they will do any better with dual-core at those high resolutions. I get it that you feel differently. Okay. No big deal. I'm fine agreeing to disagree.

Just so you know, I'm immune to feeling bad or stupid over being able to afford some of the things I want.

I hate responding to this because that's just going to keep this subtopic going... but I have to repeat what I said earlier-- you're making this personal, and this quote shows that. I have no problem with what people buy, nor am I basing my statements on what I have bought (I often advocate best bang-for-buck, yet I adopted SLI way back in Jan '05).

I am simply making the observation that dual-core CPUs don't provide any better performance right now for high-res gaming. And the benefit is so far off (virtually no increases at lower res either) that I don't see it helping at high res in the near future either. It's just an opinion. And a self-admitted, somewhat ignorant one, since I know nothing about the hardware or software and the way it works.

All I know is my opinion is backed by benchmarks of real-world tests. Since I have no inside knowledge of the industry, my opinion is likely to change as soon as those real-world benchmark tests do. If you have any links you'd like to share that either disprove the ones I posted or discuss what next-gen game coding will do for dual-core at high-resolutions, I'd be happy to see them and let them change my mind. Until then, we're both just stuck with our opinions. But there's really no need to make this all so personal regarding your system or your purchase. This isn't about that.

If we can actually discuss this technology and what the future holds, let's by all means continue. But if it's only going to degrade into a personal jab-fest, let's just drop it or take it to PMs. I know, I know... nobody wants to back down from an argument on an internet forum because they want to be seen as "right". But the fact of the matter is that neither one of us is right. After all, we're guessing about the performance of future games on future chips with future chipsets on a future OS. No matter what, we're both gonna be wrong.
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
You really need to examine your motives for all these high-temperature gyrations, ds. I don't think anyone cares that you go out of your way to post benchmarks, which you had no part in producing, as a way to minimize a technology that you don't own or seem to understand. I'll give you one thing: I've never seen a person more able to completely discredit themselves so quickly, yet continue to dig the hole even deeper while ignoring all other concerns. Your attitude (condescending/inane/thick-headed/unimaginative/opinionated/oblivious/whiny) is yet another sad testimonial for internet forums.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
That certainly was not the reply I expected. I really don't see how you could come to such a conclusion from reading these posts. In fact, I'm now wondering if you're reading them at all. You say *I* discredit myself. Hmmm. Okay. Let's see--

I say that dual-core chips provide NO benefit over single-core chips in games at high resolutions, and back that up with facts and benchmarks. Then, I offer an *opinion* (which cannot be proven right or wrong yet) that, because of these facts and benchmarks, I don't think even next-gen games will allow for any benefit at those high resolutions with dual core either.

Your response-- attack me. Nice. You keep thinking I'm "minimizing technology I don't own" when I do in fact own an X2 3800+. Just because I own it doesn't mean I can't say that it gives no benefit at high-res gaming. Just because I owned SLI doesn't mean I can't say it had the occasional driver bug or that there was shimmering at times. Jeez man, this isn't about you and me and what we own/buy/use. This is about the technology. You can't seem to grasp that. So, please respond as to why you feel that dual-core CPUs provide a benefit in today's games at high resolutions, or why you feel that they will in games coming out later this year. Can you answer that question? Or will your response be just another personal jab? Because if so, then you win; I'll quit and admit defeat. I have no interest in engaging in that kind of behavior here. Do a search on my posts over the past 5 years. I don't do that.

Newad, it seems I've taken your original thread and brought it way off-course. I apologize. According to most of the testing out there, most (if not all) games have no conflicts with a dual-core chip. Some needed to apply the patch, others didn't. Some needed to set affinity, others didn't. But, by and large, pretty much everything works without issues now.
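And for anyone who does hit one of those affinity cases: you can set it by hand from Task Manager (right-click the process, Set Affinity), or a little launcher can do it with the documented Win32 call. A minimal sketch:

```cpp
#include <windows.h>

int main() {
    // Pin the current process to CPU 0 only (bit 0 of the mask). This is
    // the usual workaround for games whose timing code misbehaves when
    // threads bounce between cores on a dual-core chip.
    SetProcessAffinityMask(GetCurrentProcess(), 0x1);
    // ...launch or continue into the game from here...
    return 0;
}
```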