Lifespan of a gaming computer

Feb 26, 2013
I was wondering: if you built a top-of-the-line computer today, how long before it becomes a mid-tier computer? And then how long before that computer is no longer viable for gaming? This, of course, comes with the stipulation that the rate of development stays about the same, not much faster or slower.
 

Sleepingforest

Platinum Member
Nov 18, 2012
Basically, top-of-the-line falls to merely "good" in about 2 years, and then to "nearly unusable" by 4-5, depending on the rate of development.

As a general idea, the 8800 GTS, the absolute best single graphics card from 6 or 7 years ago, is now less powerful than the integrated graphics in Intel i5/i7 CPUs. It's far more economical to get a "competent" (~$200) graphics card every other year or so and sell the old one to help pay for the new one.
 
Feb 26, 2013
Basically, top-of-the-line falls to merely "good" in about 2 years, and then to "nearly unusable" by 4-5, depending on the rate of development.

As a general idea, the 8800 GTS, the absolute best single graphics card from 6 or 7 years ago, is now less powerful than the integrated graphics in Intel i5/i7 CPUs. It's far more economical to get a "competent" (~$200) graphics card every other year or so and sell the old one to help pay for the new one.

What about the other components? I know you have to pay quite a bit for a computer that can play the Crysis series, which seems to have been the benchmark game for the last decade. So on average, how much does a competent GPU depreciate in value per month? I won't pick something like the Titan; more like a GTX 560 or HD 6870, which seem to be recommended for quite a few games at the moment.
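The per-month depreciation being asked about can be sketched with a simple straight-line model. The $200 purchase price and $100 two-year resale value below are illustrative assumptions in the spirit of the thread, not quoted market data:

```python
# Rough straight-line depreciation for a midrange GPU.
# All figures below are illustrative assumptions, not market data.
purchase_price = 200.0  # assumed price of a "competent" card (GTX 560 class)
resale_value = 100.0    # assumed resale price after two years
months_held = 24

loss_per_month = (purchase_price - resale_value) / months_held
print(f"Depreciation: ~${loss_per_month:.2f}/month")  # ~$4.17/month
```

Under those assumptions a midrange card costs on the order of a few dollars a month to own, which is why the sell-and-replace strategy above can work out cheaper than one big flagship purchase.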
 

Sleepingforest

Platinum Member
Nov 18, 2012
Okay, the remaining parts don't matter nearly as much. The CPU needs to be replaced, at most, every four years or so (which implies a motherboard upgrade as well, to support the new CPU). If a new RAM standard comes out, you'll need that for the new board too. The hard drive pretty much only needs replacement if it fails; load times will be longer, sure, but it doesn't make an actual difference in in-game performance.

My advice is to buy from the most recent generation: it has better resale value, plus lower energy needs and heat output. Get a GTX 660 or 7850 (roughly $180-210). Both can overclock (the 7850 more so) to provide more power if needed a year down the line.
 

nsafreak

Diamond Member
Oct 16, 2001
I wouldn't start with a GTX 560 or 6870; both of those are already one generation behind, and later this year they'll be two generations behind. For the most part, the rest of the parts in a standard build can stick around longer nowadays. There are still folks using i7 920 CPUs paired with newer GPUs who are doing just fine. The Sandy Bridge and Ivy Bridge CPUs look like they have a minimum of 3-5 years left on their clocks, if not a bit more, since developers still aren't making heavy use of the multithreading power they offer.
 
Feb 26, 2013
I was just using that as an example because that's the minimum recommended for PlanetSide 2. I'm hearing the new console generation might last about 10 years, and I'm wondering which side drives hardware development more. From the looks of it, at the current cost/performance level consoles can't really go much further. I think PlayStation is shooting for 720p at 60fps on the PS4. So how will that correlate to PC graphics? Also, what would happen if the console starts to become obsolete?
 

Sleepingforest

Platinum Member
Nov 18, 2012
Okay, consoles won't "go obsolete." What'll happen is that developers will get better and better at using the hardware, and then eventually they'll have extracted everything, and graphics and so on will stagnate for a bit (that's what's been happening the last year or two). This is because gaming PCs are a minority compared to consoles, and most developers want to cater to the lowest common denominator.
 

Shephard

Senior member
Nov 3, 2012
I disagree with Sleepingforest. 2 years? Definitely not at this stage.

I used my AMD X2 6000+ for 6 years, paired first with a 9800 GT and then a GTS 250 (the GTS 250 was only slightly better).

The GTS 250 could play Crysis at max graphics, though without AA, usually at a pretty good 30-35 fps.

Metro 2033 was where the computer showed its age: max graphics without AA ran at 30-35 fps, with some crazy firefights dropping to 15.

It depends on your monitor too. I run 1680x1050; at 1920x1080 I'd see roughly a 10% drop.
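The resolution point can be sanity-checked with a quick pixel-count ratio; the ~10% fps drop is the poster's observation, while the calculation below only shows the raw pixel increase:

```python
# Pixel-count ratio between the two resolutions mentioned above.
res_a = (1680, 1050)
res_b = (1920, 1080)

pixels_a = res_a[0] * res_a[1]  # 1,764,000 pixels
pixels_b = res_b[0] * res_b[1]  # 2,073,600 pixels

increase = pixels_b / pixels_a - 1
print(f"1920x1080 pushes {increase:.1%} more pixels than 1680x1050")
# About 17.6% more pixels; an observed fps drop around 10% is plausible
# because performance rarely scales linearly with pixel count.
```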

With a stock Sandy Bridge 2500K or Ivy Bridge 3570K, really no game out there today is a problem.

GPU-wise, still not many companies are going for max graphics. Look at some new games: BioShock has crap graphics, SimCity is average, StarCraft 2 has a good art style but average graphics, and then there's CS:GO, Borderlands 2, etc.

As for games hailed for good graphics...

Crysis 3 has good graphics, but I still think Crysis 1 beats it with the immersion of the jungle. It looks better, in my opinion.

Hitman looks good, but the optimization is very poor, so the GPU is not to blame there.

Far Cry 3 looks great, but the physics are worse than Far Cry 2's. Still not amazing for a 2012 game.

Batman: Arkham Asylum has good graphics and great lighting.

You really don't need to spend more than $280 on a GPU if you want to max everything, unless you go multi-monitor of course.

I think Ivy Bridge and Sandy Bridge CPUs will still be good for at least 4 years.

You can see the GTX 400 series is still very strong today.
 

Sleepingforest

Platinum Member
Nov 18, 2012
Okay, my standards are different from yours (I like to hold 60 fps, though I'll take lower quality settings to get it). But also, the last two years have been pretty stagnant for gaming, so 4 years of "good" is more like 2, plus some gravy because of weak, old consoles.

Since a new console generation is coming, at least one of which is x86 with eight cores, I expect a timid first year as devs learn the ropes and figure out some basic multithreading tricks, and then a boom of innovation and improvement in game engines and graphics. That's going to require extra power.
 

DominionSeraph

Diamond Member
Jul 22, 2009
Basically, top-of-the-line falls to merely "good" in about 2 years, and then to "nearly unusable" by 4-5, depending on the rate of development.

As a general idea, the 8800 GTS, the absolute best single graphics card from 6 or 7 years ago, is now less powerful than the integrated graphics in Intel i5/i7 CPUs. It's far more economical to get a "competent" (~$200) graphics card every other year or so and sell the old one to help pay for the new one.

No, AMD's best APU is at 5570 level, while the 8800GT is at 5670 level. (And the top 8800 GTX was ~20% faster than the 8800GT IIRC). Intel is even further back.

2 years ago was Sandy Bridge. I wouldn't call the i5 2500k or i7 2600k "merely good." For top of the line, GTX 590 and 6990 were two years ago as well. "Merely good" compared to a 7970 GHz edition?
http://www.anandtech.com/bench/Product/618?vs=516

Nehalem dropped 4 and a half years ago. I don't think an i7 920 is going to be "nearly unusable." The 4870X2 was 4.5 years ago as well, and that's at the level of a 5870/6950/7850. 4890 CF would fare even better.
A 920 and single 4890 would be faster than what I have, and my X4 945 and 6770 are perfectly usable.
 

Blain

Lifer
Oct 9, 1999
I was wondering: if you built a top-of-the-line computer today, how long before it becomes a mid-tier computer? And then how long before that computer is no longer viable for gaming? This, of course, comes with the stipulation that the rate of development stays about the same, not much faster or slower.
The PC in question would degenerate to "mid tier" in 16-19 months.
The PC in question would degenerate to a "no longer viable" gaming PC in 26-31 months.
 

DominionSeraph

Diamond Member
Jul 22, 2009
The PC in question would degenerate to "mid tier" in 16-19 months.

I have a feeling that 16 months from now we won't even be on the 800 series yet. I think Titan quad-SLI will still be above mid-tier.

The PC in question would degenerate to a "no longer viable" gaming PC in 26-31 months.

So an i7 2600k or i7 980X with GTX 580 SLI isn't viable?
 

DominionSeraph

Diamond Member
Jul 22, 2009
The original post was addressing "top of the line computer today", not days gone by.

And why would the next 26 months, which will likely see hardware advance at an even slower rate, be so much different from the previous 26 months?
 

jaqie

Platinum Member
Apr 6, 2008
I have a 386 and it's still very viable for gaming. I think you just misworded what you meant, but as worded it reads as though you think it won't be able to play even the low-requirement stuff.

Also, "viable for gaming" on new games means something different to everyone: some people are OK with low settings, others cannot stand low framerates, et cetera.

Me, personally, I like to be able to play games without ever dipping under 20 fps on at least medium settings. When I can't, I want more powerful equipment; till then, I'm actually fine with what I have.
 

showb1z

Senior member
Dec 30, 2010
And why would the next 26 months, which will likely see hardware advance at an even slower rate, be so much different from the previous 26 months?

New consoles, probably. That'll be a big jump in average hardware requirements, and pretty much all games will finally move over to DX11 then too.
That said, my current rig is exactly 26 months old and still runs everything perfectly, and since the next GPU generation is not expected in 2013, I'll probably add at least another 12 months to that. Crazy.
 

Sleepingforest

Platinum Member
Nov 18, 2012
No, AMD's best APU is at 5570 level, while the 8800GT is at 5670 level. (And the top 8800 GTX was ~20% faster than the 8800GT IIRC). Intel is even further back.

2 years ago was Sandy Bridge. I wouldn't call the i5 2500k or i7 2600k "merely good." For top of the line, GTX 590 and 6990 were two years ago as well. "Merely good" compared to a 7970 GHz edition?
http://www.anandtech.com/bench/Product/618?vs=516

Nehalem dropped 4 and a half years ago. I don't think an i7 920 is going to be "nearly unusable." The 4870X2 was 4.5 years ago as well, and that's at the level of a 5870/6950/7850. 4890 CF would fare even better.
A 920 and single 4890 would be faster than what I have, and my X4 945 and 6770 are perfectly usable.

Okay, my post was worded strongly, and some parts were untrue; I was going off the top of my head after looking at Passmark benchmarks, and I apologize for the factual inaccuracies. But the point stands: by the time 8 years pass, your video card will be less powerful than the latest $100 video card, which can drive basically medium settings at 1080p and 60Hz. So while it's not unusable, it's no longer at the quality level it began at: ultra/high with AA and effects. It's still "usable," but its performance leaves considerable room for improvement. If you want top-level eye candy, you pretty much must upgrade every 2 years.

Crossfired cards are not top-of-the-line in the sense that I meant, because they are dual cards; even then, we see the 7970 GHz Edition trading blows with a 6990 (crossfired 6970s). You are looking from a different perspective: you see a purchase beyond the flagship model keeping up with the next generation's best single card, while I see a $450-500 card keeping up with the $1000 card of the past. Furthermore, yes, I believe a 7970 GHz to be merely good. There is a reason $400 graphics cards are put into "mid-range" builds: because they are merely good. A truly powerful gaming rig would run dual/triple/quad flagship cards. The 6770 would have been considered "budget" when it came out, and it sits even lower than that now.

Furthermore, consoles have created stagnation across the last several years: while computers have marched onward, console hardware has remained the same for 7 years. If you look at trends in the past, you see CPUs and GPUs improving very rapidly, and game engines competing and improving quickly as well. These days there is no more low-hanging fruit on the hardware side, and game engines hardly take advantage of the improvements that do exist. The introduction of the new consoles will spark innovation and improvement in programming again; just look at Crysis 3, which takes better advantage of parallel processing than the vast majority of games before it. The flagship of years past, the i7-930, gets 10% fewer frames in Crysis 3 than a stock i7-2600K, which in turn is weaker than a stock i7-3770K. Will it work? Yes. Could you do appreciably better with a newer CPU? Yes.
 

BeauCharles

Member
Dec 31, 2012
Basically, top-of-the-line falls to merely "good" in about 2 years, and then to "nearly unusable" by 4-5, depending on the rate of development.

As a general idea, the 8800 GTS, the absolute best single graphics card from 6 or 7 years ago, is now less powerful than the integrated graphics in Intel i5/i7 CPUs. It's far more economical to get a "competent" (~$200) graphics card every other year or so and sell the old one to help pay for the new one.

I bought an 8800 GTX in late 2006 (top of the line then) and didn't retire it until 4 1/2 years later. It was $550, which works out to two mid-tier cards on your timescale. The nice thing was that it was at or near the top for the first couple of years, instead of being a middle-class card from the start.
 

Sleepingforest

Platinum Member
Nov 18, 2012
I bought an 8800 GTX in late 2006 (top of the line then) and didn't retire it until 4 1/2 years later. It was $550, which works out to two mid-tier cards on your timescale. The nice thing was that it was at or near the top for the first couple of years, instead of being a middle-class card from the start.

It starts at $550, but a $200 card will sell for $100-150 after the new generation comes out. So you can spend roughly $600 across 5 years and get okay performance all 5, versus falling behind even extremely low-budget GPUs by the end (the 8800 GTS fell behind the 2010-era Radeon 5670, and the top-end 8800 GTX is only about 20% better than the GTS; it falls behind a 5770 or so).
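The buy-sell cadence described above can be sketched as a running total. The $125 resale figure is an assumed midpoint of the $100-150 range quoted; the "$600 across 5 years" in the post is the gross outlay before resale offsets:

```python
# Net cost of the "$200 card every other year" strategy over ~5 years.
# Prices and resale values are assumptions, not quotes.
card_price = 200.0
assumed_resale = 125.0     # assumed midpoint of the $100-150 resale range
upgrade_years = (0, 2, 4)  # buy a new card at years 0, 2, and 4

net_cost = 0.0
for year in upgrade_years:
    net_cost += card_price
    if year != upgrade_years[0]:
        # Selling the previous card offsets each later purchase.
        net_cost -= assumed_resale
print(f"Net spend over 5 years: ${net_cost:.0f}")  # gross $600, net $350 here
```

Under these assumptions the rolling-upgrade path nets out well under the $550 flagship price, while keeping the rig on a current-generation midrange card the whole time.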
 
Aug 11, 2008
I was wondering: if you built a top-of-the-line computer today, how long before it becomes a mid-tier computer? And then how long before that computer is no longer viable for gaming? This, of course, comes with the stipulation that the rate of development stays about the same, not much faster or slower.

My previous computer lasted 6 years with only graphics card upgrades, and it was not really top of the line when I got it. That said, I don't play the most demanding games, and I was using a 900p monitor.

As for now, I think any high-end CPU will last almost indefinitely. I don't really see a major jump in CPU power anytime in the near future, unless Intel decides to bring out a mainstream hex-core (yeah, we wish).

Even if AMD makes the advances they are claiming with Steamroller, which I am somewhat skeptical of, that is only a 15-20 percent improvement, and who knows what will come after that. Not really enough to upgrade from an i5 or 8350. And Intel's 5-10 percent improvements per generation don't offer much motivation to upgrade from anything newer than Sandy Bridge.

Graphics cards, at least for high resolutions or image quality, will perhaps offer more motivation to upgrade.