
Is 1400p gaming worth it?

Magically, every individual will see 77% more pixels with the higher resolution display!
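For reference, the 77% figure is just the ratio of total pixel counts between the two resolutions; a quick check (illustrative snippet, not from the thread):

```python
# Sanity-check the "77% more pixels" claim: ratio of total pixel counts.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels
increase = px_1440p / px_1080p - 1
print(f"1440p has {increase:.1%} more pixels than 1080p")  # -> 77.8% more
```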

You're trying to make the argument that resolution doesn't matter. It's such an absurd stance that there is little point taking you seriously.

You're free to go game on a 640x480 display since pixel count is not important.

I am NOT trying to say that resolution doesn't matter. I am saying that the difference between 1080p and 1440p is at the extreme end of things, and beyond a certain point the average eye won't recognize any 'Appreciable' difference.

But please feel free to misread my (and others') posts any way that makes you feel that little bit superior. Apparently some people need that.
 
I suggest a visit to the eye doctor if you can't tell the difference.

Also, realize that 'quantifiable' relates to things that can be directly measured (e.g. 77% more pixels); it is not a term you want to use when suggesting there is no subjective difference, as the difference here is extremely quantifiable.
 
I suggest a visit to the eye doctor if you can't tell the difference.

Also, realize that 'quantifiable' relates to things that can be directly measured (e.g. 77% more pixels); it is not a term you want to use when suggesting there is no subjective difference, as the difference here is extremely quantifiable.

Interesting because benchmarks are quantifiable. Customer approval rates are quantifiable. Optical acuity is also quantifiable. Standard Deviations are quantifiable.

I guess you were talking about the OTHER Quantifiable.

And I would love to send you a dictionary so that you can look up the term 'Appreciable'.
 
Hooray, you've listed things that are actually quantifiable. Now, step back and think *really* hard and you might just figure out the difference between that list, and what you originally stated.
 
Yeah, I am thinking one of us isn't really understanding. And since this is WAY off topic, good luck to you.
 
Going from 1050p on a 22" to 1440p on a 27" is definitely giving a better gaming experience. Personally I can run with lower AA levels than before because of the better DPI, and the colors are much better too. But I guess it depends on the games you play. If it's a fast-paced FPS, you probably don't have the time to look at the gorgeous details, while playing Skyrim or similar the extra resolution is going to look better.
 
The framerate hit is too great IMO. I used to use a U2711, and had an i7 920 @ 4GHz, 12GB RAM and 570 SLI to back it up, and the hit was just too great at mostly max settings in BF3. I went for a 120Hz 1080p monitor instead, and things felt much better. I use the U2711 for everything else, but gaming is better at 1080p. I tried running at 1/2 and 3/4 resolution, but it just didn't look as good.

Yeah, I am thinking one of us isn't really understanding. And since this is WAY off topic, good luck to you.

No, just me eh? No wonder you didn't want to play with me anymore! lol j/k :biggrin:
 
You're not including your dual 670s required to run it maxed out, and most people are not comfortable spending $350 on a Korean monitor with no warranty. (My opinion; I owned a Catleap 🙂 ... and a U2711 at different points. Nothing wrong with it, I'm just saying the premiums for what you get are not worth it in my book, to each their own.)

I got the second 670 before I upgraded the monitor anyway. It was not a case of "oh a single card isn't fast enough." I did indeed try a single card on this monitor and it was very playable but I wasn't happy in the end because I never turn any game below Max (not counting AA).
 
For higher-resolution monitors and multi-monitor setups you'll want a graphics card with more than the standard 1GB of VRAM. At least 2GB, and with the way games are increasing their texture sizes, 4GB on a single video card is not unreasonable.

Some people prefer better graphics and others prefer faster and/or larger monitors. If you want to do serious photography, better graphics are important. Otherwise I'd say go to the store and compare the difference for yourself, because no two people have the same taste.

The video card memory doesn't matter as far as resolution is concerned. The framebuffer is only a few megabytes.
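The "few megabytes" claim is easy to ballpark: a single 32-bit framebuffer is width x height x 4 bytes. This rough sketch (illustrative, not from the thread) ignores that games allocate several such buffers (double/triple buffering, depth, and MSAA multiplies the cost), but it supports the point that textures, not the framebuffer, dominate VRAM use:

```python
# Back-of-envelope framebuffer size: width x height x 4 bytes (32-bit color).
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

for w, h in [(1920, 1080), (2560, 1440), (2560, 1600)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB")
```

Even at 2560x1600 a single buffer is under 16 MB, a tiny slice of a 1-2GB card.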
 
A simple way of looking at it is that at 1200, 1440, or 1600 vertical resolution, you get more stuff on the screen. If you are on a Civ 5 map you get to see a BIGGER part of the map at 1600 than you do at 1080. I am not a fan of 1080 big-screen displays; I had a Dell 2405 @ 1920x1200 and then purchased an LG 27" 1920x1080 and hated it. I took the LG back, since I missed the 120 lines of resolution. To me 16:9 is most suitable for video and 16:10 is for computer work. I currently run a Soyo 24" 1920x1200, a Samsung 26" 1920x1200 and a Dell 30" 2560x1600. I think 1200, 1440 or 1600 is entirely worth it!!
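The "more stuff on the screen" point can be put in numbers: at a fixed 1:1 pixel scale, visible area grows with total pixel count (a rough sketch; actual games vary in how they scale UI and zoom):

```python
# Visible area relative to 1920x1080, assuming a fixed 1:1 pixel scale.
base = 1920 * 1080
for name, w, h in [("1920x1200", 1920, 1200),
                   ("2560x1440", 2560, 1440),
                   ("2560x1600", 2560, 1600)]:
    print(f"{name}: {w * h / base:.2f}x the visible area of 1080p")
```

So a 2560x1600 panel shows nearly twice the map area of a 1080p one at the same zoom.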
 
Yes, it is a big difference going from 1080p to 1440p. It is actually more noticeable on things that don't include movement, such as text and images, than on things with movement, such as games.
 
I would say skip it for now. 1400p will never be a standard. Just stick with 1920x1080/1200 (1200 is getting harder to find). The 4K standard is closer to reality than anything else, and that will cause a trickle-down in all video displays. The consumer electronics industry needs its next big "thing" to keep people buying/replacing their TVs and video equipment (and to keep profits up, since competing on existing stuff that hundreds of companies make has very little profit margin, if any, depending on labor and exchange rates).

I don't know how soon this changeover will occur, but with the initial displays being released in the second half of this year for TV upgrades, it will not be more than 2-3 years for all the panel manufacturers to have 4K production lines and for prices to drop. Computer monitors in the 27-32" range will most likely see this within a year. Panasonic was demoing a prototype 20" IPS panel earlier this year. I can see Apple's early 2013 32" displays possibly having this, given their "retina" display updates on most things this year (which actually have a higher pixels-per-inch count than a 32" 4K display would have).
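The pixel-density comparison in that last sentence is easy to check. Assuming "4K" means 3840x2160 (UHD; early "4K" panels sometimes meant 4096x2160), pixels per inch is just the diagonal resolution over the diagonal size:

```python
import math

# Pixels per inch: diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'32-inch 4K (3840x2160): {ppi(3840, 2160, 32):.0f} ppi')
print(f'27-inch 2560x1440:      {ppi(2560, 1440, 27):.0f} ppi')
```

A 32" 4K panel lands around 138 ppi, versus roughly 109 ppi for a 27" 1440p monitor.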
 