Why is current hardware just not so "current"?

m21s

Senior member
Dec 6, 2004
775
0
71
First things first.
I game only on a PC and I am not a huge fan of consoles.

Why is it that even today's "current" video cards still get brought to their knees by current-gen games?

I game at 1600x1200. I own a 2001FP, and that's pretty standard (it is 2006); I'm all set with 1024x768.

Who is to blame? The hardware side of it (Nvidia, ATI)? Or are the game developers just releasing games they know won't be played in ALL their glory for a full year or two?

Oblivion is a prime example; I absolutely love this game!
But even on today's best systems, running it at 1600x1200 with FULL shadows and HDR and everything else, this game is a slideshow once you get into battle or there is a lot of movement.

I don't know about you, but when a game is released I want to play it how it's supposed to be played; I do not want to wait a couple of years for the hardware to catch up. :confused:

I don't know... it's just one of those days today.
 

kleinwl

Senior member
May 3, 2005
260
0
0
There is an easy solution to your problem... throw money at it.

Honestly, if games didn't push hardware to its limits, I wouldn't be happy. There would be no reason for hardware vendors to constantly throw R&D money at improving it. The ever-increasing capability of equipment would slow down, and eventually we would all be left with PCs that acted like consoles, never improving.

So... don't cry that your gaming experience with the exact same game you bought two years ago improves over time... be happy that the programmers have given you the ability to pull out an old game and experience it like new another time, taking full advantage of the quantum leap in computing that has occurred in the meantime.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: m21s
Why is it that even today's "current" video cards still get brought to their knees by current-gen games?

First, this is hardly a new phenomenon. It's very easy to write software (gaming or otherwise) that will push hardware way beyond its limits. It's unusual for brand-new games to not push the current-generation hardware very hard at maximum settings/resolution.

Who is to blame? The hardware side of it (Nvidia, ATI)? Or are the game developers just releasing games they know won't be played in ALL their glory for a full year or two?

Would you rather have them release games that look like ones released in 2002-3 at super-ultra-mega-high detail, or ones that look like 2002 games at 'medium' settings and actually push the hardware at maximum settings with all the eye candy on?

I don't know about you, but when a game is released I want to play it how it's supposed to be played; I do not want to wait a couple of years for the hardware to catch up. :confused:

You can't have it both ways.

It's very, very hard to match the 'maximum' settings for a game to what the 'best' hardware will be when the game comes out, especially if you're planning a year or more in advance! Not to mention that the 'best' hardware today is something like a 7900GTX SLI or X1900XTX CrossFire setup, so if they aimed for that level of performance for the max settings, no solution under $500 is going to have a prayer of running them.

If they aim high but let you scale down the detail or effects, people with high-end systems will be happy, and people with lower-end systems can still play at lower resolution or detail settings.

But if they aim too low, people won't be 'wowed' by the graphics, and the high-end system owners will whine: "I spent $600 on my awesome video card setup; why isn't there anything that actually uses it???!!!"
 

m21s

Senior member
Dec 6, 2004
775
0
71
You both make great points, and I do agree.

It's just that in today's age, with CPUs being so "fast", I feel the video side is still in the dark ages.

I don't know, maybe it's just me :)

Maybe I am just "uneducated" on the video side of things :)