Originally posted by: jiffylube1024
If resolution is all that matters, why bother with AA/AF at all?
That's nothing more than a ridiculous strawman. You're the one who demanded "best possible IQ" but then started arguing that resolution doesn't count because it doesn't fit into your delusional idea of image quality.
1280x1024 is not "best possible IQ".
4xTrAA/4xAAA is not "best possible IQ".
This entire thread was built on the reasoning level of a six-year-old, and your arguments since have done nothing but follow that trend.
Where is this argument going, aside from proving that 5150Joker is wrong from a semantic point of view?
I'd say Joker is on the right path with his "highest IQ possible," even if the way he wrote it doesn't pass your strict standards for nitpicking; I, for one, found his argument pretty clear 😉.
When he says "highest IQ possible," he obviously means "best IQ possible at playable framerates," or more precisely, at framerates he considers playable for himself. Sort of like what HardOCP does, only using his own personal criteria of playability, as we all do when judging what's playable for ourselves.
I'd agree with this methodology too: 2048x1536 is all well and good as an estimate of future game performance, but at the end of the day I'm primarily looking at 1600x1200 tests (with and without AA, and with AF all the time), because that's what I use, and I want to know which IQ settings I can run while maintaining playable framerates.
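To make that "highest playable setting" idea concrete, here's a rough sketch in Python. Everything in it is made up for illustration: the settings list, the framerates, and the 60 fps cutoff are placeholder numbers, not anyone's actual benchmark data or HardOCP's actual procedure.

# Hypothetical numbers only -- picking the most demanding IQ setting that stays playable.
PLAYABLE_FPS = 60  # whatever cutoff you personally consider playable

# (setting, average fps) at a fixed resolution, ordered from least to most demanding
results = [
    ("no AA / 16xAF", 92),
    ("2xAA / 16xAF", 74),
    ("4xAA / 16xAF", 61),
    ("8xAA / 16xAF", 43),
]

def highest_playable(results, cutoff):
    best = None
    for setting, fps in results:
        if fps >= cutoff:
            best = setting  # remember the most demanding setting that still meets the cutoff
    return best

print(highest_playable(results, PLAYABLE_FPS))  # -> 4xAA / 16xAF

Swap in your own framerate cutoff and your own benchmark numbers and you get your own personal "highest IQ possible," which is all Joker was getting at.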
These kinds of normative discussions seem to go nowhere; setting a new standard that labels 1280x960 (or 1280x1024; yes, many people do run LCDs at this resolution) a "low" or "medium" resolution accomplishes nothing. Other than making someone feel like crap because their shiny new LCD is now considered a "low" resolution, what purpose do arguments like this serve?