
High Res Home Monitors (Intel Timeframe)?

While it makes sense to look at it that way, this is not how the term high resolution is used when referring to displays.

Dude, you think I'm an idiot?

Yes, that's not how it's used to refer to displays, but the real meaning of resolution has nothing to do with how marketing jackasses have shaped the word to fool dumbasses into thinking they're getting something they're not.

My point is that the way it's used when referring to displays is wrong. More pixels at the same resolution is not higher resolution... it's a larger viewing area.
 
Dude, you think I'm an idiot?
I don't see how you concluded this from my post, I wasn't being disparaging.

No, this is not a gimmick from marketing jackasses, nor is it wrong to refer to a 27" 2560x1440 display as high resolution. It is standard technical terminology, and it's wrong to redefine it because you feel it's not an appropriate term and to unilaterally declare your own definition the "real meaning". There's a separate term for what you want: pixel density.
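To make the distinction concrete, here's a minimal sketch of how pixel density is computed from resolution and diagonal size (the 27" 2560x1440 figures are taken from the display mentioned above; the function name is my own):

```python
import math

def pixel_density_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density in pixels per inch, measured along the diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

# The 27" 2560x1440 display discussed above:
print(round(pixel_density_ppi(2560, 1440, 27.0), 1))  # ~108.8 PPI
```

Note how the two terms come apart: a 27" 2560x1440 panel and a 24" 1920x1080 panel have different resolutions but nearly the same pixel density.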

There are many examples of such misnomers. E.g. all non-parametric models have parameters, and typically many more than parametric models do. Correcting someone who says "neural networks are non-parametric models" because they have parameters either shows a misunderstanding of what the term means in statistics or is an attempt to quibble over definitions.
 
This is silly reasoning. You don't even need CrossFire/SLI or the latest and greatest to get 95% of the image quality without absolutely murdering your frame rates; just disable the single effect that's only there to sell GPUs. Why would you use MSAA instead of FXAA in BF3 when it doesn't even work on most of the geometry? Is DOF in Metro really worth halving your fps? Do you enjoy invisible sub-pixel tessellation in Crysis 2 (to be fair, I don't think you can limit tessellation in Nvidia's drivers)? I know I'd rather skip AA and go for a higher resolution.



Also, this.

These settings do make a noticeable difference, and when you're dealing with incredibly expensive high-resolution monitors you should be striving for the best-looking visuals you can get. A high resolution is pointless if I have everything set to super low. 4K monitors as they stand now will MURDER frame rates even on crappy settings with current games.
 
Am I the only one who wants to see more (cheaper) high-quality IPS/IPS-like displays first, before high res? Not to mention OS support. Not sure on the OS X front, but Win7 isn't exactly grand at scaling.

I totally agree. The last thing I want is to try to push games at a crazy resolution. I'd rather see high-quality 120 Hz IPS panels at standard resolutions. "Retina" is good for something you hold right up to your face, where the high DPI matters, but I sit 2+ ft from my screen, so it's much less needed.
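The viewing-distance point can be quantified: what the eye resolves is angular pixel density, which grows with distance. A rough sketch (the 109 PPI / 24-inch figures are illustrative assumptions, not from the thread; a common rule of thumb puts 20/20 visual acuity near 60 pixels per degree):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at a given viewing distance.

    Approximation: one degree spans distance_in * tan(1 deg) inches on the screen.
    """
    return ppi * distance_in * math.tan(math.radians(1.0))

# A ~109 PPI 27" 1440p panel viewed from 24 inches (assumed values):
print(round(pixels_per_degree(109, 24), 1))  # ~45.7 pixels per degree
```

Doubling the viewing distance doubles the angular density, which is why a phone held a foot from your face needs far more PPI than a desktop monitor to look equally sharp.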
 
Dude, you think I'm an idiot?

Yes, that's not how it's used to refer to displays, but the real meaning of resolution has nothing to do with how marketing jackasses have shaped the word to fool dumbasses into thinking they're getting something they're not.

My point is that the way it's used when referring to displays is wrong. More pixels at the same resolution is not higher resolution... it's a larger viewing area.


You're just making up what you *think* it should mean, but it has never meant that. It just isn't the case.
 