How long do you think a GPU should run well for?

chrismr

Member
Feb 8, 2007
176
0
0
Was just looking up the time between release dates for consoles. There seems to be a 5 to 6 year gap between one generation and the next. That really does seem to make consoles a good choice for gaming, as you are guaranteed to be playing games on that system for some time.

However, I just could not imagine a GPU for my PC offering decent performance in games for such a period of time. I don't think I have owned a GPU for more than a year before I found myself having to drop resolutions, etc.

Such as when I had my X800 XL: for what I am sure was only about 9 months it ran all games wonderfully, then I got Quake 4 and I had to run at 1024 with medium settings again. And to be honest, in the last 6 months I have gone through many GPUs trying to find one I am happy with: a 7900GT 256MB, an X1900XT 256MB, 2 x 7900GS in SLI, and finally my 8800GTX (which, yes, I am happy with). Ok, I was happy with my SLI setup too, but one of the cards died.

But I just wonder, how long is this card going to be good enough to run games well, without having to drop my res below 1680 x 1050, and keeping settings relatively high with only mild AA (4x)?

While I understand that advances in hardware are important, I can't help but feel there needs to be more of an emphasis on the game programming side to better utilise existing hardware.

Is DX10 going to address this? Because I am beginning to understand why people use consoles. I don't want to upgrade this card for some time, but I get the feeling that perhaps by the beginning of next year it's no longer going to be cutting it for my desired settings.

I just prefer gaming on a PC to a console, but one day it might just be a worthwhile switch to make.
 

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
I know what you're saying.

They make console games that look better year after year with the same console (hardware), but computer games can come out with really bad programming.

I might have to give up computer gaming altogether and get an Xbox 360.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: chrismr
I don't want to upgrade this card for some time, but I get the feeling that perhaps by the beginning of next year it's no longer going to be cutting it for my desired settings.

You don't want to upgrade, but in the same sentence you say you want to.

Perhaps you need to determine what your "desire" is worth to you.

Yes, DX10 should help to some degree, but expect the devs to use up whatever is gained pretty quickly.
 

chrismr

Member
Feb 8, 2007
176
0
0
Sorry, but no, I do not contradict myself in that sentence... what I am saying is that one year from now I am probably going to find myself lowering my res to play a game decently. And we all know that using anything other than native res on an LCD monitor looks damn ugly.

Let's put it this way: I have the current top-of-the-range GPU. If a console will run all the games made for it for 5 or 6 years to come, with some pretty decent improvements, shouldn't I expect at least 3 to 4 years from a top-of-the-range GPU for the PC at high settings?
 

Sniper82

Lifer
Feb 6, 2000
16,517
0
76
If everyone in the world that games on a PC owned an 8800GTX (2GB, C2D, etc.), then it would last you 5-6 years. But they don't, and most don't even have anything coming close to the performance of the 8800GTX. So games made for the PC have to be built around older, slower hardware as well, and developers can't focus on one setup like they can with the 360. This is the reason games take longer to make on the PC, are more buggy on release, and are usually poorly optimized/programmed, etc., making you have to upgrade so often.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
(I really hope this doesn't degenerate into another PC vs. console flamewar.)

Anyway, the answer to your question depends on your definition of "run well."

The difference between PCs and consoles is that consoles require more work to utilize the same hardware to its maximum potential. PCs, on the other hand, have a common API for a huge performance gamut, so there's not much a developer can do to optimize a given game engine and game design for the targeted hardware specs. Much of the problem rests on the shoulders of the content creators, the people who actually flesh out the levels, objects, rules, storyline, etc. PC game developers are mostly code monkeys.

What you are not mentioning is that console games come out with very poor quality--sometimes being only on par with the previous generation of the same manufacturer--and only realize their full potential when the game developers get their butts in gear after the initial launch craze where the push is to get something out "before the other person." When you buy a console, you're buying something that you know will only be fully tapped near the end of its life. Even then, consoles have typically trailed PC games in visual quality.

By contrast, PC game developers have to aim at a moving feature set, and PC gamers expect new games to use "the latest and greatest" features in their $500+ SLI/Crossfire video setups. Sure, a PC game development studio can create several versions of a particular level or effect, but that dramatically increases development and testing time on an already strained budget with crazy, marketing-driven deadlines. Those low/medium/high/uber settings you see did not come "free."

To a certain extent, this forced march toward obsolescence is a result of the 3-to-6-month product refresh/generation update cycle that nVidia started years ago.

My advice: Turn the settings down and focus on playing the game.

If you want new eye candy every time a new game is released, you either pay through the nose for high end cards every 3-6 months or buy a game console. Or talk lots of PC game junkies into not whining every time a new game comes out that has graphics on par with existing games. Take your pick.
 

chrismr

Member
Feb 8, 2007
176
0
0
I have to agree that the product refresh cycle and the time between new cards contribute a lot to this problem, as does the fact that the GPU manufacturers release so many versions of each family.

It would be nice to see something like 2 years between families of GPUs, and for the focus to be not only on getting the new hardware out ASAP, but on getting it out with the drivers properly optimised prior to release.

Ok, AMD/ATI is going to be getting close to 2 years since their last card the way they are carrying on, but when the R600 comes out it is more than likely still going to be plagued by driver problems anyway.

Personally, I think it would do the PC gaming industry the world of good to slow down a little and get some good use out of what's out now before introducing something new.

Or perhaps move to a system similar to consoles, with a product cycle of only every few years (maybe 4) and games being developed specifically for each generation of GPU. So the new games would not run on the older hardware, but at least you got some good use out of your previous card, the same as you can't run your PlayStation 3 games on your PlayStation 2.
 

A554SS1N

Senior member
May 17, 2005
804
0
0
Originally posted by: chrismr
And we all know that using anything other than native res on an LCD monitor looks damn ugly.

Yeah it does, and it doesn't - my 17" has a native res of 1280x1024, but Unreal Tournament doesn't have any option for that resolution, so I'm stuck with 1024x768, and I'm surprised it's not as blurry as I was expecting. 640x480 is very blurry and 800x600 doesn't look too nice either, but I reckon one res below the native isn't too bad - you could get away with that. :) Well, I can anyway :p