How long will the X2s be good for gaming?

Page 2

Mr Fox

Senior member
Sep 24, 2006
876
0
76
Originally posted by: Capt Caveman
Originally posted by: akshayt
If you rule out exceptions like Alan Wake and you OC the X2 to 2.7-3.0GHz, then you should be good for another 1-2 years, hopefully 1.5-2 years. After that there is no way you can continue, though. Best would be to change in at most a year or a year and a half.

If you want to run Alan Wake in its full glory, then C2Ds and X2s are already obsolete; you need 3.5GHz+ C2Qs.

OP - Ignore anything this person posts. I would go with the X2 and OC it. You'll be fine for 2 years at a minimum.



Ditto!! Hit that nail on the head!!
 

MBrown

Diamond Member
Jul 5, 2001
5,724
35
91
Great thread. I have been wondering the same thing lately, as I will be picking up a 4400 89W within the week or so. People seem to be getting great OCs with this 89W chip. The only concern I have is the people having problems getting their computers to run smoothly with dual core systems. I guess I will just have to see how it goes if I get this chip.
 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
Any X2 will be perfectly fine for gaming, even years from now, especially since games are expected to become a little more multi-threaded as time goes on. So don't worry about an X2 not being able to keep up with the video card. It will pretty much push today's high-end cards to their limit, especially if you run at high resolutions or OC to 2.6GHz or so. And in the future, even if it doesn't push the video card to its limit, it will still give you plenty of frames. So, no problems with the X2s and gaming.


Jason
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: formulav8
Any X2 will be perfectly fine for gaming, even years from now, especially since games are expected to become a little more multi-threaded as time goes on. So don't worry about an X2 not being able to keep up with the video card. It will pretty much push today's high-end cards to their limit, especially if you run at high resolutions or OC to 2.6GHz or so. And in the future, even if it doesn't push the video card to its limit, it will still give you plenty of frames. So, no problems with the X2s and gaming.


Jason

I agree completely...the more difficult choice will be a video card. As DX10 is coming up soon, current video cards may have a shorter lifespan than 1 year...
 

Mr Fox

Senior member
Sep 24, 2006
876
0
76
Originally posted by: Viditor
Originally posted by: formulav8
Any X2 will be perfectly fine for gaming, even years from now, especially since games are expected to become a little more multi-threaded as time goes on. So don't worry about an X2 not being able to keep up with the video card. It will pretty much push today's high-end cards to their limit, especially if you run at high resolutions or OC to 2.6GHz or so. And in the future, even if it doesn't push the video card to its limit, it will still give you plenty of frames. So, no problems with the X2s and gaming.


Jason

I agree completely...the more difficult choice will be a video card. As DX10 is coming up soon, current video cards may have a shorter lifespan than 1 year...



You really need to understand this..... The vid cards won't turn into pumpkins.... you will still be able to use them, you just don't get to see some of the eye candy...

Too many people have fallen for the hype on this subject. Marketing 101 = "Make them believe that they need it".

The disinformation buzz that surrounds this is the clueless being led by the moronic...

It will take 2-3 years before DX10 becomes mainstream, and games will support both technologies...

Game developers are going to go where the money is, and that is mainstream.....

Sure, MicroShaft will come out with a supporting game or two... they can offset the loss...

Mainstream developers seem to target, as a minimum... 2-4 year old hardware......

 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: Mr Fox
Originally posted by: Viditor
Originally posted by: formulav8
Any X2 will be perfectly fine for gaming, even years from now, especially since games are expected to become a little more multi-threaded as time goes on. So don't worry about an X2 not being able to keep up with the video card. It will pretty much push today's high-end cards to their limit, especially if you run at high resolutions or OC to 2.6GHz or so. And in the future, even if it doesn't push the video card to its limit, it will still give you plenty of frames. So, no problems with the X2s and gaming.


Jason

I agree completely...the more difficult choice will be a video card. As DX10 is coming up soon, current video cards may have a shorter lifespan than 1 year...



You really need to understand this..... The vid cards won't turn into pumpkins.... you will still be able to use them, you just don't get to see some of the eye candy...

Too many people have fallen for the hype on this subject. Marketing 101 = "Make them believe that they need it".

The disinformation buzz that surrounds this is the clueless being led by the moronic...

It will take 2-3 years before DX10 becomes mainstream, and games will support both technologies...

Game developers are going to go where the money is, and that is mainstream.....

Sure, MicroShaft will come out with a supporting game or two... they can offset the loss...

Mainstream developers seem to target, as a minimum... 2-4 year old hardware......

Most developers will code for multiple DX versions. Look at HL2 - there is a DX9, DX8 and DX7 path. In HL2:Episode One there is also DX9 HDR.

Just because a game can take advantage of DX10 features doesn't mean it can't run in DX9 mode - things just won't look as good.
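
For illustration only, here's a minimal sketch of that kind of fallback logic (the names and capability checks are invented, not actual Source engine code):

    // Rough sketch: pick the best render path the hardware supports, fall back otherwise.
    #include <cstdio>

    enum class RenderPath { DX7, DX8, DX9, DX9_HDR };

    // Hypothetical capability check; a real engine would query the driver's caps bits.
    RenderPath PickRenderPath(int shaderModel, bool floatRenderTargets) {
        if (shaderModel >= 3 && floatRenderTargets) return RenderPath::DX9_HDR;
        if (shaderModel >= 2)                       return RenderPath::DX9;
        if (shaderModel >= 1)                       return RenderPath::DX8;
        return RenderPath::DX7;
    }

    int main() {
        // Example: a card reporting shader model 2 with no float render targets
        switch (PickRenderPath(2, false)) {
            case RenderPath::DX9_HDR: std::puts("DX9 path with HDR"); break;
            case RenderPath::DX9:     std::puts("DX9 path");          break;
            case RenderPath::DX8:     std::puts("DX8 fallback");      break;
            case RenderPath::DX7:     std::puts("DX7 fallback");      break;
        }
        return 0;
    }

The game ships every path; at startup the engine just picks the highest one the card reports it can handle.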

Since AnandTech is mainly targeted at 'enthusiasts', I suppose (most) of us are interested in running at the highest level of eye candy available, unlike 'mainstream' gamers. ;)
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
I don't think next-gen video cards have anything to do with DX10. However, they are introducing a lot of new hardware capabilities that will help take a lot of the load off the CPU.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Mr Fox

You really need to understand this..... The vid cards won't turn into pumpkins.... you will still be able to use them, you just don't get to see some of the eye candy...

Too many people have fallen for the hype on this subject. Marketing 101 = "Make them believe that they need it".

The disinformation buzz that surrounds this is the clueless being led by the moronic...

It will take 2-3 years before DX10 becomes mainstream, and games will support both technologies...

Game developers are going to go where the money is, and that is mainstream.....

Sure, MicroShaft will come out with a supporting game or two... they can offset the loss...

Mainstream developers seem to target, as a minimum... 2-4 year old hardware......

I agree that the DX9 cards won't become irrelevant, but IIRC it only took ~1 year for games to be released on DX9 in the mainstream. I'm assuming that DX10 will follow the same pattern, though of course this could be absolutely wrong.
In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: Viditor
I agree that the DX9 cards won't become irrelevant, but IIRC it only took ~1 year for games to be released on DX9 in the mainstream. I'm assuming that DX10 will follow the same pattern, though of course this could be absolutely wrong.
In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...

I thought it was around 200W? 360W is totally ridiculous if true. :thumbsdown:

In regards to the DX9 -> DX10 thing, totally agree. I remember back in 03 when I bought my (then leading edge) 9800 Pro hoping there would be DX9 games on the horizon, and whattdaya know, they came in swarms some 6 - 12 months later. :)
 

DrMrLordX

Lifer
Apr 27, 2000
21,991
11,542
136
Originally posted by: Viditor

In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...

This can't be repeated often enough. Just when we thought both CPU and GPU manufacturers had become more sensitive to power consumption, both Nvidia and ATI have hit us with this crap? I understand that there are limits to what they can do, but cmon, 300W for a single card is absurd. I don't care how fast it is. I was impressed by both Nvidia and ATI for bringing power consumption down while bumping performance up with their latest product refreshes. Heck, the 79xx-series GeForce cards were an improvement over the 78xx cards across the board. Why the sudden backslide?
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: DrMrLordX
Originally posted by: Viditor

In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...

This can't be repeated often enough. Just when we thought both CPU and GPU manufacturers had become more sensitive to power consumption, both Nvidia and ATI have hit us with this crap? I understand that there are limits to what they can do, but cmon, 300W for a single card is absurd. I don't care how fast it is. I was impressed by both Nvidia and ATI for bringing power consumption down while bumping performance up with their latest product refreshes. Heck, the 79xx-series GeForce cards were an improvement over the 78xx cards across the board. Why the sudden backslide?

This could be why ATI is pushing to 45nm ASAP...
 

sandorski

No Lifer
Oct 10, 1999
70,213
5,794
126
Originally posted by: Viditor
Originally posted by: DrMrLordX
Originally posted by: Viditor

In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...

This can't be repeated often enough. Just when we thought both CPU and GPU manufacturers had become more sensitive to power consumption, both Nvidia and ATI have hit us with this crap? I understand that there are limits to what they can do, but cmon, 300W for a single card is absurd. I don't care how fast it is. I was impressed by both Nvidia and ATI for bringing power consumption down while bumping performance up with their latest product refreshes. Heck, the 79xx-series GeForce cards were an improvement over the 78xx cards across the board. Why the sudden backslide?

This could be why ATI is pushing to 45nm ASAP...

Likely also why GPU/CPU is going to blur into one in the near future. Eventually GPUs will become near impossible to cool on a card.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: sandorski
Originally posted by: Viditor
Originally posted by: DrMrLordX
Originally posted by: Viditor

In addition to the cards themselves, it's looking like power will be a MAJOR issue for high-end cards. DX10 cards are rumoured to require upwards of 360w each!! Imagine the power requirements of a quad-sli scenario with them...

This can't be repeated often enough. Just when we thought both CPU and GPU manufacturers had become more sensitive to power consumption, both Nvidia and ATI have hit us with this crap? I understand that there are limits to what they can do, but cmon, 300W for a single card is absurd. I don't care how fast it is. I was impressed by both Nvidia and ATI for bringing power consumption down while bumping performance up with their latest product refreshes. Heck, the 79xx-series GeForce cards were an improvement over the 78xx cards across the board. Why the sudden backslide?

This could be why ATI is pushing to 45nm ASAP...

Likely also why GPU/CPU is going to blur into one in the near future. Eventually GPUs will become near impossible to cool on a card.

Yet having it next to a 100W CPU will make things better? ;)

 

Sunrise089

Senior member
Aug 30, 2005
882
0
71
OP - you and I are in the same boat, more or less. See my system in my sig - I would like to upgrade to a Core 2 Duo and OC the heck out of it... I could probably afford it as well. But why ditch a perfectly good MB and memory when DDR2 prices are REALLY high and quad-core is the Next Big Thing (TM)? I will be selling my 144 (for pennies, I'm sure) and buying a dual-core Opteron, adding a gig of memory, and selling my 7900GT to buy the next ~$350 video card that arrives with the next generation, which should easily outperform an X1900XTX. My total cost will be like $350, and no Core 2 Duo system using the same video card will outperform me in games to any real degree. For those users who want SLI or Crossfire, and therefore really high levels of performance, the burden does shift towards the CPU, but for the vast majority of users (and their power supplies) who use one GPU, I don't think there is a need to move from s939 to Core at the moment, much as I might like to.
 

Tanclearas

Senior member
May 10, 2002
345
0
71
I pretty much make a major upgrade every year during the holidays (between Christmas and New Years). I'm glad that I've got the time to wait because there is so much that is unknown right now, or that is just not quite right, or just simply not necessary (for me).

There is no question that C2D is faster than X2. However, if you're focusing on games, unless you're running at low resolutions, or disabling a lot of the eye-candy, then you won't notice any difference between a C2D and X2 system. I know that I could upgrade to C2D, but it would absolutely not improve my framerates in any noticeable way based upon my preferred video settings. I like to run with in-game features set pretty high, and with 4xAA, and with 8x (or higher) AF. I use a single 1900XT 512MB card. As it is, I have been eyeing up a bigger monitor (which means higher res), and a faster card or SLI/CF configuration would be what I need more than a faster processor.

Regardless, I think the platforms available for C2D leave a lot to be desired, especially CF/SLI boards. 975X boards are fairly pricey. 965 boards don't even offer 2 x X8 PCIe slots (1 x X16 and 1 x X4). Nvidia is about to release a new chipset, so I'd be hesitant to jump on 5xx-series boards right now.

Memory prices are quite high at the moment, and it would cost a fair chunk to buy CPU, motherboard, and memory.

Finally, there is always the big K8L question mark. Everyone has their own opinion on it, but that's just it. They are opinions. Who would have believed that C2D could have seen the huge increases over CD? K8L might be a slight improvement (read "huge disappointment"), or it might be to K8 what C2D is to CD. Although K8L is still quite a ways away from shipping, remember that Intel demoed C2D several months before it launched. It's possible AMD may pull the same stunt and be demonstrating K8L sooner than people think.

If you really want to go dual core, take the easiest path for now. X2 is a great choice, and won't set you back too much. It will definitely buy you some time, and let you see how things shake out over the next 6 to 8 months.
 

VooDooAddict

Golden Member
Jun 4, 2004
1,057
0
0
It depends primarily on your expectations.

With a top-end video card (you are already talking about the R600) and plenty of RAM, I see no problems running extremely playable, immersive, and competitive for 1-2 years on any X2 (esp. with a mild overclock).

If you are looking to MAX out all game and Image Quality settings with whatever game comes out in 1-2 years... the only way to do that is to keep upgrading every 9 months or so.

The reality is that an X2 with the R600 and lots of RAM will still be at the upper end of the mainstream performance target for devs over the next 1-2 years. In 18 months, games that come out will probably have "7+ series GeForce" or "X1+ series Radeon" as the recommended specs. Minimums will likely be the 6100 or X200 onboard video. Publishers want people to be able to buy their games :) Especially subscription MMOGs.

Remember, some people are right now happily gaming with onboard 6100s. It's all in your expectations.

Personally, I think an X2 is the perfect way to extend the life of your system. If you aren't overclocking, the X2 4600+ is a great value for upgrading. I'm upgrading a friend's SFF tomorrow from a 3000+, 1 gig, and a 6800NU to a 4600+, 2 gigs, and a 7900GS.
 

Keblerelf04

Senior member
Jul 31, 2006
827
3
81
I don't know if anyone touched on this already, as I just skimmed through really quick. Dual core isn't even at its peak yet; most games are not optimized for more than one thread at the moment. Once games are optimized for dual cores, the X2 you are considering will be even better. So yes, it will definitely last you at least a year, but if you can hold out another 6-12 months you can just skip to quad core and be ahead of the curve.
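
As a rough illustration of what "optimized for more than one thread" would look like (a toy sketch, not code from any real game), the second core can simulate the next frame while the first core renders the current one:

    // Minimal two-core game loop sketch: a worker thread simulates the next frame
    // while the main thread renders the current one.
    #include <thread>
    #include <cstdio>

    void UpdateSimulation(int frame) {   // stub: physics + AI for one frame
        std::printf("simulated frame %d\n", frame);
    }

    void RenderFrame(int frame) {        // stub: draw the previously simulated frame
        std::printf("rendered frame %d\n", frame);
    }

    int main() {
        for (int frame = 0; frame < 3; ++frame) {
            std::thread sim(UpdateSimulation, frame + 1);  // second core works ahead
            RenderFrame(frame);                            // first core renders
            sim.join();                                    // sync before the next frame
        }
        return 0;
    }

Real games split the work in more sophisticated ways, but the basic win on an X2 is exactly this kind of overlap.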
 

lopri

Elite Member
Jul 27, 2002
13,212
597
126
Originally posted by: Keblerelf04
I don't know if anyone touched on this already, as I just skimmed through really quick. Dual core isn't even at its peak yet; most games are not optimized for more than one thread at the moment. Once games are optimized for dual cores, the X2 you are considering will be even better. So yes, it will definitely last you at least a year, but if you can hold out another 6-12 months you can just skip to quad core and be ahead of the curve.
This is a good point. Games don't even make use of the X2 to a reasonable extent yet, and until they do there is no need to worry about its extinction. I mean, it has not even begun to be useful yet. (!) This proves again that the true, underlying concern is the single-threaded performance of the X2 in gaming. (or one's desire to upgrade hardware ;))
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: lopri
Originally posted by: Keblerelf04
I don't know if anyone touched on this already, as I just skimmed through really quick. Dual core isn't even at its peak yet; most games are not optimized for more than one thread at the moment. Once games are optimized for dual cores, the X2 you are considering will be even better. So yes, it will definitely last you at least a year, but if you can hold out another 6-12 months you can just skip to quad core and be ahead of the curve.
This is a good point. Games don't even make use of the X2 to a reasonable extent yet, and until they do there is no need to worry about its extinction. I mean, it has not even begun to be useful yet. (!) This proves again that the true, underlying concern is the single-threaded performance of the X2 in gaming. (or one's desire to upgrade hardware ;))

Agreed... buy a better video card before you buy a new CPU. Any X2, even a 3800+ that isn't overclocked, is pretty damn good. And you can still overclock that to at least 2.4GHz.
 

VooDooAddict

Golden Member
Jun 4, 2004
1,057
0
0
Originally posted by: cmdrdredd
Originally posted by: lopri
Originally posted by: Keblerelf04
I don't know if anyone touched on this already, as I just skimmed through really quick. Dual core isn't even at its peak yet; most games are not optimized for more than one thread at the moment. Once games are optimized for dual cores, the X2 you are considering will be even better. So yes, it will definitely last you at least a year, but if you can hold out another 6-12 months you can just skip to quad core and be ahead of the curve.
This is a good point. Games don't even make use of the X2 to a reasonable extent yet, and until they do there is no need to worry about its extinction. I mean, it has not even begun to be useful yet. (!) This proves again that the true, underlying concern is the single-threaded performance of the X2 in gaming. (or one's desire to upgrade hardware ;))

Agreed... buy a better video card before you buy a new CPU. Any X2, even a 3800+ that isn't overclocked, is pretty damn good. And you can still overclock that to at least 2.4GHz.

He's already planning to get an R600 or similar. From what I gather/guess, he's upgrading to an X2 now to extend the life of the rest of his system. He doesn't want to wait until dual core is needed and then find out that 939 X2s have dried up and he needs to pay a premium, buy used, or upgrade the motherboard and RAM too.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: VooDooAddict
Originally posted by: cmdrdredd
Originally posted by: lopri
Originally posted by: Keblerelf04
I don't know if anyone touched on this already, as I just skimmed through really quick. Dual core isn't even at its peak yet; most games are not optimized for more than one thread at the moment. Once games are optimized for dual cores, the X2 you are considering will be even better. So yes, it will definitely last you at least a year, but if you can hold out another 6-12 months you can just skip to quad core and be ahead of the curve.
This is a good point. Games don't even make use of the X2 to a reasonable extent yet, and until they do there is no need to worry about its extinction. I mean, it has not even begun to be useful yet. (!) This proves again that the true, underlying concern is the single-threaded performance of the X2 in gaming. (or one's desire to upgrade hardware ;))

Agreed... buy a better video card before you buy a new CPU. Any X2, even a 3800+ that isn't overclocked, is pretty damn good. And you can still overclock that to at least 2.4GHz.

He's already planning to get an R600 or similar. From what I gather/guess, he's upgrading to an X2 now to extend the life of the rest of his system. He doesn't want to wait until dual core is needed and then find out that 939 X2s have dried up and he needs to pay a premium, buy used, or upgrade the motherboard and RAM too.


Then buy the friggin' CPU now... god... don't read into it so much and turn it into rocket science.