Reading the Max Payne 3 thread got me thinking.
I'm not too deep into the world of direct game engine programming or graphics; I've programmed a "bit" on the server side of MMOs and understand the need to design systems with throughput and simplicity in mind (don't use vectors if an array will do!).
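To make that concrete, here's a minimal C++ sketch of what I mean; the Snapshot struct and the 256 cap are made up for illustration:

```cpp
#include <array>
#include <cstddef>
#include <vector>

struct Snapshot { float x, y, z; };       // hypothetical per-entity state

constexpr std::size_t kMaxVisible = 256;  // hypothetical hard cap per client

// Vector version: correct, but each call may touch the heap allocator.
std::vector<Snapshot> gatherVector(std::size_t n) {
    std::vector<Snapshot> out;
    out.reserve(n);                       // one allocation per call
    for (std::size_t i = 0; i < n; ++i)
        out.push_back({0.f, 0.f, 0.f});
    return out;
}

// Array version: no allocation at all, because the count is bounded by design.
std::size_t gatherArray(std::array<Snapshot, kMaxVisible>& out, std::size_t n) {
    const std::size_t count = (n < kMaxVisible) ? n : kMaxVisible;
    for (std::size_t i = 0; i < count; ++i)
        out[i] = {0.f, 0.f, 0.f};
    return count;
}
```

When this runs every tick for every connected client, skipping the allocator adds up.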
But how do we, as geeky gamers, find out whether an awesome game like Crysis 1 has an engine coded to use all the throughput of a CPU/GPU without bottlenecking on some subsystem's thread codepath?
Obviously there are different pragmatic tests. There's the good old eyeball test, too.
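One pragmatic test you can actually run from the outside, sketched below: time frames at two resolutions. If the average frame time barely moves when the pixel count quadruples, the game is likely CPU/thread-bound; if it scales with resolution, the GPU is the limiter. The renderFrame hook here is an assumption standing in for however you capture frames (a benchmark mode, a FRAPS-style overlay, etc.), not a real API:

```cpp
#include <chrono>
#include <cstdio>

// Measure the average frame time of some render callback over N frames.
double averageFrameMs(int frames, void (*renderFrame)()) {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    for (int i = 0; i < frames; ++i)
        renderFrame();                    // hypothetical per-frame hook
    const std::chrono::duration<double, std::milli> total = clock::now() - start;
    return total.count() / frames;
}

int main() {
    // Hypothetical usage: run once at 1280x720 and once at 2560x1440.
    // Near-identical averages => CPU/thread-bound; a ~4x jump => GPU-bound.
    auto fakeFrame = [] { /* render work would go here */ };
    std::printf("avg frame: %.2f ms\n", averageFrameMs(100, fakeFrame));
}
```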
...what I'm getting at is: how do we as gamers tell a legitimately demanding beast of a game from a sloppy console port? Take the game with the highest image quality (IQ) as a reference, then compare a new game's performance/IQ against it?
Crysis 1 was amazing for its time. It still sort of is.
But what if it just had the high IQ and was a choppy game on even top-end hardware? Would you still say Crysis 1 was an awesome engine job if it still had problems maxing out on a GTX 480, or a GTX 280 for that matter?
Discuss!