Well?
As I'm a (casual) PC tech nerd, I like to read up on what the newfangled GPUs are capable of these days - but the focus of reviews seems to be shifting from the old "can it display the game well? (lighting, AA, shadows)" to more of "how can it display the game? (3D, multi-display)".
Think about it - on the hardware side, as much as we gripe (and it's a legitimate gripe, IMO) about monitor resolutions becoming more and more standardized at a lower (HDTV-level) resolution, it also means each GPU has less work to do per frame - see the quick math below.
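To put rough numbers on that - and note the specific resolutions here are my own picks for illustration (a 30-inch enthusiast panel versus an HDTV-level one), not figures from any review - here's a minimal back-of-the-envelope sketch comparing per-frame pixel counts:

```python
# Rough per-frame pixel counts. The resolutions are my own
# illustrative assumptions, not numbers from any particular review.
high_res = 2560 * 1600   # a common 30-inch "enthusiast" resolution
hdtv_res = 1920 * 1080   # the HDTV-level resolution monitors are settling on

print(f"2560x1600: {high_res / 1e6:.2f} megapixels per frame")
print(f"1920x1080: {hdtv_res / 1e6:.2f} megapixels per frame")
print(f"Ratio: ~{high_res / hdtv_res:.1f}x fewer pixels to shade at 1080p")
```

Roughly half the pixels per frame means (all else equal) roughly half the raw shading work - which is part of why top-end cards end up so under-utilized.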
On the software side, well, I think we've hit parity there too. I mean, look at games built on Unreal Engine 3 and CryEngine 3 - both very pretty to look at, and I'm fairly excited to see what id Tech 5 has to offer in Rage and all that gaming goodness... but then again, people are still playing Modern Warfare 2, Team Fortress 2, Left 4 Dead, etc., and those are games using (or derived from) tech almost a decade old!
Same thing on the CPU side - Intel routinely brings out a refresh every 6 months, and AMD is following suit; in fact, their release schedules appear staggered so that, between the two of them, something new arrives every 3 months!
And bear with me here - I'm typing this on a laptop that's a couple of cycles old (a little over a year) - but it boggles the mind: hardware is advancing at a pace the software end just can't keep up with. I can vividly remember when it was the software that demanded hardware upgrades (I remember reading about discrete GPUs becoming the item to get, all thanks to Quake).
And so, dear PC gamers, here is the question: when there really isn't software that can use the newest and greatest hardware out there, what purpose does that hardware serve?