
span of a gaming computer's life

no.

it also depends on what settings you want.

Do you want max graphics?

What resolution do you play at?

Do you care about AA?

I guess, if you're willing to live with crap. IMHO a gaming PC is different from a PC that happens to be capable of running games at minimal settings.
 
DominionSeraph has nailed it, tbh; the time scales a lot of you guys are giving are BS. Some people are still on Q6600s from 2007, which are still very viable for gaming.

A gaming computer's life is as long as you want it to be; it entirely depends on the user.
 
Okay, but factually, your graphics card loses about $50-100 in value each year, which roughly corresponds to how it stacks up against the latest generation. So in 8 years a $400 card becomes basically the bare minimum needed to play games, at 30 frames per second on low/medium settings.
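The claim above is simple linear depreciation, so it's easy to sanity-check. A minimal sketch, using the poster's numbers ($400 card, $50-100 lost per year) rather than any measured data:

```python
# Sketch of the linear-depreciation claim: a card losing a fixed
# $50-100 of value per year. The rates and the $400 starting price
# are the poster's estimates, not benchmark or market data.

def residual_value(price, yearly_loss, years):
    """Value remaining after linear depreciation, floored at zero."""
    return max(price - yearly_loss * years, 0)

for loss in (50, 100):
    values = [residual_value(400, loss, y) for y in range(9)]
    print(f"${loss}/yr:", values)
```

At the $50/yr rate a $400 card hits zero residual value at exactly year 8, which matches the "bare minimum after 8 years" framing; at $100/yr it would get there in 4.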
But games are not changing as fast graphics-wise anymore...

Look at Crysis from 2007. That game with no mods is still one of the best looking games in the world. Mod it and it's probably top 3.

The only other games that have really pushed graphics are The Witcher 2 (nice art style), modded Skyrim, Battlefield 3, Batman: Arkham Asylum, Metro 2033, modded Crysis 2, and Shogun 2 (large-scale RTS with great animations).

There are many other games released that have some good graphics, like Hitman, but they are so poorly optimized.

Everything else released is a console port. Graphics are not impressive, only sharper on PC.

You don't need to spend $400 on a graphics card unless you want a multi-monitor setup.

Game graphics really started to change around 2004 with Doom 3 and HL2. In the past few years they haven't changed much, outside a few specific games. Lighting is one thing that has gotten better.
 
I bought an 8800 GTX in late 2006 (top of the line then) and didn't retire it until 4.5 years later. It was $550, which works out to two mid-tier cards on your time scale. The nice thing was that it was at or near the top for the first couple of years instead of being a middle-class card from the start.

And it came out at the same time the Xbox 360 did, so we have a benchmark for what a new console generation does.
 
Game graphics really started to change around 2004 with Doom 3 and HL2. Now the past few years they haven't changed so much. Just a few specific games. Lighting is something that has gotten better.
Yes, and what happened 1-2 years before 2004? The Pentium 4 and Athlon 64 were fiercely competing. The PS2 and Xbox came out. The hardware race and new consoles allowed a new push of innovation to take advantage of it.

You're thinking a little backwards. Software paces itself with the hardware (with rare exceptions, like Crysis, which was ahead of its time), not the other way around. As the new PlayStation and Xbox come out, we'll see new stuff that takes advantage of them. For example: TressFX just came out on PC. It's an absolute framerate destroyer, but the hair animation looks incredibly good compared to what came before. We've had a plateau for a while, but we're starting to see a climb again; new console hardware should allow more innovation like TressFX.
 
I totally disagree that Crysis was a huge jump. The huge jump was HL2, especially Episode Two. Crysis didn't wow me in the least little bit; HL2: Episode Two sure did.
 
The jump from stuff like the original Doom to HL2 was much bigger than HL2 to Crysis unmodded. So not necessarily an "uh what no".

Back on topic: I think buying a top-end card (like a 7970 GHz or 680) now will let you game for 6 years, if you accept much lower framerates and lower quality over the last 2. The problem is that it's hard to accept lower detail and no AA after having had them for a couple of years. So replace it whenever you think it's no longer adequate and you have the money.
 
The jump from stuff like the original Doom to HL2 was much bigger than HL2 to Crysis unmodded. So not necessarily an "uh what no".
Doom is from 1993. If you want to talk like that, then HL1 was a big jump.
 
As for now, I think any high-end CPU will last almost indefinitely.

People in this thread have very low expectations in terms of graphical fidelity, framerates, and even more importantly, simulation fidelity.

My i5-3570K can't even individually simulate all 187,000 Sims in my new city, so the game has to cut that down by a factor of 100 and run really simple, dumb AI. It can't do all Sims at full fidelity with realistic AI, so 2013 CPUs are not sufficient for 2013 games, much less 2015 games.
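A back-of-envelope sketch makes the point concrete. Aside from the 187,000 population and the roughly 100x cut mentioned above, every number here (tick rate, core count assignment) is an illustrative assumption, not something from the game:

```python
# Back-of-envelope for the simulation-fidelity point. Only the 187,000
# population and the ~100x cut come from the post; the tick rate and
# per-core accounting are illustrative assumptions.

population = 187_000
cut_factor = 100     # assume the game fully simulates ~1/100th of agents
tick_rate = 30       # assumed simulation ticks per second
cores = 4            # e.g. an i5-3570K (4 cores)

fully_simulated = population // cut_factor
# Total core-seconds available per tick, spread across every agent,
# expressed in microseconds per agent:
budget_us = (cores / tick_rate) / population * 1e6

print(f"agents fully simulated: ~{fully_simulated}")
print(f"CPU budget per agent per tick: ~{budget_us:.2f} us")
```

Under these assumptions each agent would get well under a microsecond of CPU per tick if all 187,000 were simulated, which is why cutting the count by ~100x and simplifying the AI is the only way to fit the budget.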
 
People in this thread have very low expectations, in terms of graphical fidelity, framerates, and even more importantly, simulation fidelity.

Exactly. Which is surprising on an enthusiast website. You'd figure we'd be demanding at least high detail at native resolution with some form of AA.
 
My gaming PCs must:

- Run any game at 1920x1200 with at least 50 FPS, with settings maxed or tweaked down to High (I'm looking at you, Crysis 3).

- AA/AF entirely optional; 4x if I can swing it.

Was it worth buying a 680 for $550 (not the US price)? I'd say not really. I always bought mid-range before, but decided to splurge on the top end (at the time, excluding the 690). Very few games can take advantage of that 680, and in those that do you still need to adjust a setting or two for 50+ FPS gameplay.

BUT, when I do upgrade I'll still stick with the high end, as I can't stand 30-40 FPS anymore; that's the only real reason. And at the end of the day, an extra $300 for a top-end GPU is only 2-3 days of extra work.

Lifespan for a gaming PC is 2 years max, and that really only affects the GPU if you have Sandy Bridge or above. The i5s and i7s are good for at least another 3-4 years.
 
A CPU can last a long time, since games are mainly bound to the lowest-common-denominator platform (consoles), but GPUs can become unable to "max" games at a common resolution (1080p) within 2 years on average, I would say.

A CPU from 5 years ago can still keep up, mainly because annual performance gains have only been about 5-8%. (I upgraded last week from a 2008 i7-920 to a 2013 3770K... I don't think gaming performance increased by more than 2-3 fps in GPU-limited situations.)
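For what it's worth, compounding that 5-8% annual figure over the five years between those two CPUs gives a modest cumulative gain, which is consistent with the "CPUs last a long time" point. A quick sketch (the percentages are the poster's estimate, not benchmark data):

```python
# Compounding the poster's estimated 5-8% annual CPU gains over the
# ~5 years between an i7-920 (2008) and a 3770K (2013). The rates are
# the poster's guess, not measured benchmark results.

years = 5
for annual_gain in (0.05, 0.08):
    total = (1 + annual_gain) ** years - 1
    print(f"{annual_gain:.0%}/yr over {years} years -> ~{total:.0%} cumulative")
```

That works out to roughly 28-47% cumulative, i.e. the newer chip is maybe a third to a half faster in CPU-bound work, which is easy to miss entirely in GPU-limited games.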
 