High Res Home Monitors (Intel Timeframe)?


Concillian

Diamond Member
May 26, 2004
iCyborg said:
While it makes sense to look at it that way, this is not how the term high resolution is used when referring to displays.

Dude, you think I'm an idiot?

Yes, that's not how it's used to refer to displays, but the real meaning of resolution has nothing to do with how marketing jackasses have shaped the word to fool dumbasses into thinking they're getting something they're not.

My point is that the way it's used when referring to displays is wrong. More pixels at the same resolution is not high resolution... it's a larger viewing area.
 

iCyborg

Golden Member
Aug 8, 2008
Concillian said:
Dude, you think I'm an idiot?
I don't see how you concluded this from my post; I wasn't being disparaging.

No, this is not a gimmick from marketing jackasses, nor is it wrong to refer to a 27" 2560x1440 display as high resolution. It is standard technical terminology, and it's wrong to redefine it because you feel it's not an appropriate term, unilaterally declaring your own definition the "real meaning". There's a separate term for what you want: pixel density.

There are many examples of such misnomers. E.g. all non-parametric models have parameters, and typically many more than parametric ones do. Correcting someone who says "neural networks are non-parametric models" because they have parameters either shows a misunderstanding of what the term means in statistics or is an attempt to quibble over definitions.
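
To put rough numbers on the resolution vs. pixel density distinction (a quick back-of-the-envelope sketch; the panel sizes here are just generic examples, not anything specific from this thread):

Code:
    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density: pixels along the diagonal divided by the diagonal in inches
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1920, 1080, 24)))  # ~92 PPI  -- 24" 1080p
    print(round(ppi(2560, 1440, 27)))  # ~109 PPI -- 27" 1440p: higher resolution AND higher density
    print(round(ppi(2560, 1440, 32)))  # ~92 PPI  -- 32" 1440p: higher resolution, same density as the 24"

Resolution is the pixel count; pixel density is what changes with panel size.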
 
Nov 23, 2011
This is silly reasoning. You don't even need CrossFire/SLI, nor the latest and greatest, to get 95% of the IQ without absolutely murdering your frame rates; just disable the single effect that's only there to sell GPUs. Why would you use MSAA instead of FXAA in BF3 when it doesn't even work on most of the geometry? Is DOF in Metro really worth halving your fps? Do you enjoy invisible sub-pixel tessellation in Crysis 2 (to be fair, I don't think you can limit tessellation in Nvidia's drivers)? I know I'd rather skip AA and go for a higher resolution.



Also, this.

These settings do make a noticeable difference, and when you're dealing with incredibly expensive high resolution monitors you should be striving for the best-looking visuals you can get. My high resolution is pointless if I have everything set to super low settings. 4K monitors as they stand now will MURDER frame rates even on crappy settings with current games.
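
To put a rough number on that (a quick sketch, assuming a workload that's mostly shading/fill-rate bound, so cost scales roughly with pixel count):

Code:
    def pixel_ratio(w1, h1, w2, h2):
        # How many times more pixels the first mode renders per frame vs. the second
        return (w1 * h1) / (w2 * h2)

    print(pixel_ratio(3840, 2160, 1920, 1080))  # 4.0  -- 4K pushes 4x the pixels of 1080p
    print(pixel_ratio(3840, 2160, 2560, 1440))  # 2.25 -- and 2.25x the pixels of 1440p

So even a card that handles 1440p comfortably is looking at more than double the per-frame work at 4K, before you touch any quality settings.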
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
Am I the only one who wants to see more (cheaper) high-quality IPS/IPS-like displays first, before high res? Not to mention OS support. Not sure on the OS X front, but Win7 isn't exactly grand at scaling.

I totally agree. The last thing I want is to try to push games at a crazy resolution. I'd rather see high-quality 120Hz IPS panels at standard resolutions. "Retina" is good for something you hold right up to your face because of the high DPI, but I sit 2+ ft from my screen, so it's much less needed.
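
For what it's worth, the rule of thumb behind the viewing-distance argument is roughly one arcminute per pixel for 20/20 vision (a simplification, but it shows the trend):

Code:
    import math

    def retina_ppi(viewing_distance_in):
        # PPI at which one pixel subtends about 1 arcminute at the given distance
        return 1 / (viewing_distance_in * math.tan(math.radians(1 / 60)))

    print(round(retina_ppi(12)))  # ~286 PPI -- phone held a foot from your face
    print(round(retina_ppi(24)))  # ~143 PPI -- about arm's length
    print(round(retina_ppi(30)))  # ~115 PPI -- 2.5 ft; a 27" 1440p panel (~109 PPI) is already close

The farther back you sit, the less extra DPI actually buys you.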
 

Ferzerp

Diamond Member
Oct 12, 1999
Concillian said:
Dude, you think I'm an idiot?

Yes, that's not how it's used to refer to displays, but the real meaning of resolution has nothing to do with how marketing jackasses have shaped the word to fool dumbasses into thinking they're getting something they're not.

My point is that the way it's used when referring to displays is wrong. More pixels at the same resolution is not high resolution... it's a larger viewing area.


You're just making up what you *think* it should mean, but it has never meant that. It just isn't the case.
 

Binky

Diamond Member
Oct 9, 1999
Lol, no offense intended. I'm overly sensitive to Apple Disciples and their use of terms or technology "invented" by Apple.