I found topics on this, but they are too old to bump.
So which resolution should I go for in a new monitor if I am a gamer ?
Depends on what GPU you have, I suppose. I'm using a 1200p monitor now, having come from a 1080p one, and even after working out that 1200p is only ~11% more pixels I still went for the upgrade, and I loathe my decision. First, I'm now getting ~30% fewer frames per second; second, while recording via Dxtory I have to downscale the game to 1080p anyway to get rid of the zooming effect; and third, this monitor is garbage, and whoever says there are no good 1200p gaming monitors available is absolutely right.
Right now I will happily switch to a 1080p monitor, call it a downgrade or whatever.
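For reference, the actual pixel difference between the two resolutions is a quick calculation (a minimal sketch; nothing beyond the two standard resolutions is assumed):

```python
# Pixel-count difference between 1920x1200 and 1920x1080.
px_1200 = 1920 * 1200   # 2,304,000 pixels
px_1080 = 1920 * 1080   # 2,073,600 pixels

extra = (px_1200 - px_1080) / px_1080
print(f"1200p has {extra:.1%} more pixels than 1080p")  # -> 11.1%
```

So the GPU load difference from resolution alone is modest; any larger FPS drop is coming from somewhere else.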
No brainer: 1200p has 120 more p's, and since I'm getting older I p a lot more often, so I like that.
Yeah, I definitely prefer 1200p over 1080p.
Short answer: go for a 1080p screen, simply because it's much better supported by games and software. Even movies and other content are 1080p native. A 1200p monitor means you'd need to upscale or resize most things, and any content, whether it's movies, games, pictures, or videos, looks best at its native resolution. Resizing or upscaling loses clarity and the original perspective in which the developer or filmmaker wanted you to view the content.
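To make the resizing point concrete, here is a quick sketch of what 16:9 content does on a 16:10 panel when shown pixel-for-pixel (plain arithmetic, no assumptions beyond the two resolutions):

```python
# 1080p (16:9) content displayed 1:1 on a 1920x1200 (16:10) panel.
panel_w, panel_h = 1920, 1200
content_w, content_h = 1920, 1080

unused = panel_h - content_h                 # leftover vertical pixels
print(f"{unused} px unused ({unused // 2} px bars top and bottom)")

# Stretching 1080p to fill the panel vertically instead needs a
# non-integer scale factor, which is what blurs the image:
print(f"scale factor: {panel_h / content_h:.3f}")  # ~1.111
```

Either way you get bars or a fractional scale, which is the trade-off being described above.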
No. You should choose 1920x1080 because:
1) The range of choice is massively better than if you restrict yourself to 1920x1200;
2) AFAIK, no 1920x1200 monitor supports gamer-friendly features like 120-144Hz, G-Sync, or ULMB/LightBoost; and
3) 1920x1200 is effectively being abandoned.
It has always been my experience that departing from a monitor's recommended resolution has certain undesirable results. For instance, I used to use the Win 7 "Gadget" clock; at anything other than the recommended resolution, the clock would be elliptical rather than round. Would that sort of thing happen when using 1920x1200? If a monitor is 1920x1200 "capable," is that its "recommended" resolution?
Some minority of us find it necessary or desirable to use a KVM switch -- deploying a single monitor across two or more PCs. I've noticed that some of the contemporary DVI/USB 4-port switches (IOGEAR, StarTech) tout 1920x1200 capability in their specs.
Just a thought on this, without being an irritant.
I'm no expert on flat-panel LCD/LED displays. I'd used a cheap Hanns-G (on and off) since around 2010, and it died on me. So I had to choose whether to move up to 1440p or 4K now, or simply replace the Hanns with something less expensive (but better).
Choosing between a 4K monitor and a 1080p monitor is an important and fundamental choice. It raises not only the factor of price, but also whether you're "ready" for it. But choosing between 1920x1200 and 1920x1080 just doesn't seem to make a hill-of-beans difference.
Those 1920x1200-capable KVM specs don't mean anything to me as long as the switches are 1920x1080 capable. And I will hold off buying a new KVM until they are either 4K capable, or 4K-capable KVMs no longer cost between $400 and $500.
Frankly, in that latter respect, I won't move up to a 4K monitor until they become more prevalent among the offerings and the price declines somewhat.
When I do, I will have planned for the transition, and the planning . . . begins now . . . .
4K? Let's open up another can of worms, shall we?
For the uneducated masses, 4K means that horribad 16:9 3840x2160 that every TV and their mother will have. Then, to make things perfectly clear, there is the 4K DCI 4096x2160 (~17:9) resolution for the movie industry. And of course, FOR COMPUTERS there is the lovely 16:10 resolution of 3840x2400, so you can actually do some work without turning your head around. I guess you don't want to hear my opinion of that ultrawide-resolution abomination (21:9; seriously, people, want a neck injury?).
Dear god/FSM, why oh why does there have to be a gimped resolution for the masses when professionals won't use it? Could it have anything to do with how panels are manufactured, hmm?
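The aspect ratios rattled off above can be reduced with a little arithmetic (a minimal sketch; the mode names are just informal labels):

```python
from math import gcd

# Reduce each resolution mentioned above to its simplest aspect ratio.
modes = {
    "UHD (consumer 4K)": (3840, 2160),
    "DCI 4K (cinema)":   (4096, 2160),
    "16:10 '4K'":        (3840, 2400),
}
for name, (w, h) in modes.items():
    g = gcd(w, h)
    # Note: 8:5 is the same ratio as 16:10; DCI 4K reduces to 256:135,
    # i.e. roughly 17:9 (~1.90:1), not a clean small-integer ratio.
    print(f"{name}: {w}x{h} = {w // g}:{h // g} (~{w / h:.2f}:1)")
```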
The 1920x1200 monitors are pricey enough that they might just bump your budget up to 2560x1440 anyway.
I mean, there was that HP 32" 2560x1440 monitor on sale for $399; come on now...
Personally, I'd rather have multiple monitors than one ultrawide, because you get many more snap-to positions. You can have six half-width windows open in Windows without even touching your mouse; no manual resizing needed. Just hit Windows Key + Left, Right, or Up/Down: Left/Right cycles a window through the half-width and maximized positions across all attached monitors, Up maximizes a window on the current display, and Down first restores an application to windowed mode and then minimizes it with a second press.