Higher resolution 1440p / 1600p & gaming

Page 2

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Downsampling is great in that you're (artificially) increasing the PPI of your monitor. Most monitors are generally between 70-80 PPI, and HDTVs are even lower than that.

I am interested in the idea, but don't be fooled: there is no increase in PPI. This is basically an AA method, most similar to SSAA.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
What's the consensus on 8bit vs. 10bit? (2713H vs. HM)

I suppose I can search for threads; I'd imagine that's been discussed plenty.

The only applications that benefit from wide gamut color (10 bit per channel) are professional apps such as Premiere, 3ds Max, and the Adobe suite. You will not benefit whatsoever from 10 bit color; there is nothing that will use it. Consumer cards can only display 10bpc in D3D, which is worthless (zero applications use it); the only applications which do benefit are, again, professional in nature and very costly, and they also require a workstation card such as a Quadro, FirePro, or Tesla.

Unfortunately, some people automatically think that 10 bit wide gamut is better. You can't even use it. The other thing is, unless a monitor can switch between sRGB and aRGB, the wide gamut screen will be completely oversaturated and will look worse.

So long story short, you don't need a 10bpc screen. If you want one, get one that can switch out of Adobe RGB (because aRGB will look worse otherwise) and understand that you get no benefit from 10 bit color.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Downsampling is not even remotely close to increasing PPI or resolution. The analogy between downsampling and a higher-PPI display is ludicrous. Let's make no mistake: if you're comparing images from a 1080p monitor to a WQHD monitor, the detail of the WQHD screen will always be better by virtue of having nearly double the pixel count, assuming image quality settings are near identical. Heck, even without AA, WQHD will look better. This also ignores the fact that most 1080p screens are TN, which will always display an inferior image to a high quality IPS or even VA panel, and TN panels are nigh unusable on screens larger than 23 inches. Having used a TN 1080p 27 inch, the pixellation was disgusting. I wouldn't ever use it.

This also ignores that PPI is somewhat meaningless on big screens. It is an important metric on small displays, where font sizes are tiny, but on big screens it matters less: volume economics dictates that prices rise exponentially for high PPI (4K resolution and beyond) on 27 inch and larger screens (the cost is unbelievable, really), and viewing distance also ties into the equation. You do not view a big screen 2 inches from your face as you would a tablet or smartphone. Viewing distance is an important consideration, and anyone judging a big screen or desktop display by the same metric as a handheld is out of their mind.

I'm only saying this because of the over-emphasis that some place on PPI with big screen displays; it's not an apples-to-apples comparison between a smartphone/tablet and a big screen desktop display. In fact, using the "retina" metric that Apple uses, a WQHD 27 inch monitor is by definition retina when viewed from 3 feet or further away. Viewing distance adjusts the equation for big screen displays.
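For what it's worth, those numbers are easy to sanity-check. Here's a quick Python sketch (treating "retina" as roughly one arcminute per pixel, the criterion usually attributed to Apple; the function names are just my own):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

def retina_distance_in(ppi_val, arcmin=1.0):
    """Distance (inches) at which one pixel subtends `arcmin` arcminutes."""
    angle = math.radians(arcmin / 60.0)
    return (1.0 / ppi_val) / math.tan(angle)

p = ppi(2560, 1440, 27)      # ~108.8 PPI for a 27 inch WQHD panel
d = retina_distance_in(p)    # ~31.6 inches, i.e. under 3 feet
print(round(p, 1), round(d, 1))
```

So at typical desktop viewing distances the individual pixels of a 27 inch WQHD screen are already at the edge of what the eye resolves, which is the point about viewing distance.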

I had a 27 inch 1080p display, a 24 inch 1200p, and a 23 inch 1080p prior to switching to WQHD and 2560x1600. Comparing gaming stills, there is absolutely no comparison: WQHD looks far better. It just has that much more detail because of the pixel count. I understand, however, that some look for excuses to hang on to their (excuse the lack of eloquence here, but it's true) cheap garbage 2006 resolution. ;) Whatever suits the buyer, I suppose.

Edit: I'm sure someone will jump in with the 120Hz argument. 120Hz has merits, although I can't bring myself to use a TN for it. LightBoost definitely has a niche, I can't say otherwise; it's very nice and smooth. It's just unfortunate that it's relegated to TN.
 
Last edited:

Granseth

Senior member
May 6, 2009
258
0
71
The only applications that benefit from wide gamut color (10 bit per channel) are professional apps such as Premiere, 3ds Max, and the Adobe suite. You will not benefit whatsoever from 10 bit color; there is nothing that will use it. Consumer cards can only display 10bpc in D3D, which is worthless (zero applications use it); the only applications which do benefit are, again, professional in nature and very costly, and they also require a workstation card such as a Quadro, FirePro, or Tesla.

Unfortunately, some people automatically think that 10 bit wide gamut is better. You can't even use it. The other thing is, unless a monitor can switch between sRGB and aRGB, the wide gamut screen will be completely oversaturated and will look worse.

So long story short, you don't need a 10bpc screen. If you want one, get one that can switch out of Adobe RGB (because aRGB will look worse otherwise) and understand that you get no benefit from 10 bit color.

I hadn't thought about a workstation card being a requirement for 10-bit color. But based on that, there is no use for 30-bit color for the average Joe. Let's hope it changes in the future, as the extra color information is actually visible when you can use it.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
1920x1200 is the best balance of monitor size. A single GPU can max out most games on 2xAA.

1440p is going to need 2x7950 or even 2x7970 to max out. It depends on how much you gain by moving up, because GPU upgrades will be pricey, and you will need them more often as games get more demanding.
 

biostud

Lifer
Feb 27, 2003
19,790
6,880
136
1920x1200 is the best balance of monitor size. A single GPU can max out most games on 2xAA.

1440p is going to need 2x7950 or even 2x7970 to max out. It depends on how much you gain by moving up, because GPU upgrades will be pricey, and you will need them more often as games get more demanding.

Personally I don't need AA on my 1440p monitor; I use it if my GTX 670 can handle it, otherwise I just leave it off. With games supporting FXAA you hardly take any performance hit using it. So having a single GTX 670 or 7950 with a 1440p monitor is perfectly fine IMO.
 

This Guy

Junior Member
Apr 12, 2011
6
0
0
I have a 1440p IPS Catleap running at 109Hz off a 5870. Looks great. I push settings as high as I can without dropping below 30fps. I notice next to no latency (the monitor has no OSD and runs at a sub-10ms refresh period). I have found 4x AA very important on some titles and a waste of time on others.

I have heard the Qnix QX2710s do 80-135 Hz with Samsung PLS (similar to IPS). They can be had for ~$300-380 on eBay. I've heard the matte version (actually semi-matte) looks great.


As for PPI, I find resolution very important for a 27" monitor. I sit two to three feet away from my 27", 1440p panel and I can still see pixels in stationary images. A 1080p, 27" panel may serve you if you sit further away, primarily watch videos/play games, or have less than great eyesight.
 

Lavans

Member
Sep 21, 2010
139
0
0
Kind of an odd way to look at it, as there is nothing at all done to increase PPI, but I guess you look at AA as artificial PPI.

Yes, I know nothing is done to increase the physical PPI. Hence, again, "artificial". The blending of pixels simulates a higher PPI, even though it's still approximately 95 PPI on a 23.3" screen. And no, it's not the same as AA.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Yes, I know nothing is done to increase the physical PPI. Hence, again, "artificial". The blending of pixels simulates a higher PPI, even though it's still approximately 95 PPI on a 23.3" screen. And no, it's not the same as AA.

That is odd, because the link you gave on how to do it describes it as an AA method. He said it was basically 2x SSAA. In fact, that is what SSAA does: it renders at higher resolutions and downsamples.
http://forums.guru3d.com/showthread.php?t=346325

http://en.wikipedia.org/wiki/Supersampling
This is achieved by rendering the image at a much higher resolution than the one being displayed, then shrinking it to the desired size, using the extra pixels for calculation.
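That resolve step is simple enough to sketch. A minimal NumPy illustration of averaging 2x2 blocks, i.e. the "shrinking" described above (a toy example, not any particular driver's implementation):

```python
import numpy as np

def downsample_2x2(img):
    """Average each 2x2 block of an (H, W, C) image rendered at 2x,
    producing the (H/2, W/2, C) frame actually displayed."""
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# A hard black/white edge rendered at 2x resolution...
hi = np.zeros((4, 4, 1))
hi[:, 1:] = 1.0

# ...comes out with a blended 0.5 pixel at the edge after the resolve.
lo = downsample_2x2(hi)
```

The extra pixels get used for the average, which is exactly why the result looks like an anti-aliased edge.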
 

Lavans

Member
Sep 21, 2010
139
0
0
It is best described as an AA method, but in reality it is not one. It's basically the same thing as increasing the render scale in Arma, Champions Online, Star Trek Online, Neverwinter, and Planetside 2. It improves image quality, but doesn't necessarily anti-alias it.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
It's a form of supersampling, therefore it can be used as an AA solution.

The image is internally rendered at, say, 3840x2160, and then scaled down to a 1920x1080 display, giving you a similar image to 2x2 OGSSAA.

Like OGSSAA, it doesn't do wonders for horizontal/vertical lines, but coupled with FXAA/SMAA it's usually much better than pure MSAA.
Its main advantage is that it's an AA solution that's always accessible and doesn't need game/driver support the way MSAA does.
 
Last edited:

Lavans

Member
Sep 21, 2010
139
0
0
It's a form of supersampling

Technically, yes, but not literally.

There's a distinct difference between OGSSAA and downsampling, primarily when it comes to post-processing effects. 2x2 OGSSAA + FXAA/MLAA will never be as effective as downsampling + FXAA/MLAA, and other post-processing effects such as bloom and ambient occlusion will never render as accurately with 2x2 OGSSAA compared to a 4K downsample.

Edit: In addition, downsampling can sometimes offer better performance compared to 2x2 OGSSAA if an AA compatibility flag has to be set in certain games, making it a much more viable option to work with.
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Oh...? I never thought of DS being superior to SSAA.
Is that because these effects are added after resolving OGSSAA?
 

Lavans

Member
Sep 21, 2010
139
0
0
Oh...? I never thought of DS being superior to SSAA.
Is that because these effects are added after resolving OGSSAA?

Exactly. Hardware AA is done during the first pass of the rendering path for 3D scenes, which involves rendering the meshes, textures, normal maps, etc. The lighting algorithms in deferred engines, as well as post-process effects in general, are applied after the first pass, making AA useless on them. When downsampling, you're literally giving it more room to calculate the post-process effects, resulting in a more accurate image.
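That ordering point can be shown with a toy example: a 3-tap box blur standing in for a generic post effect, applied either at the high internal resolution before the resolve (what downsampling gives you) or after the resolve (an OGSSAA-style path). Everything here is a made-up minimal model (circular edge handling, for brevity), just to show the two orders don't produce the same frame:

```python
import numpy as np

def downsample(img, f=2):
    # Average f x f blocks: the resolve step.
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def post_effect(img):
    # 3-tap horizontal box blur (circular edges, for brevity).
    return (np.roll(img, 1, axis=1) + img + np.roll(img, -1, axis=1)) / 3

hi = np.zeros((4, 8))
hi[:, 4:] = 1.0  # hard edge rendered at 2x internal resolution

effect_then_resolve = downsample(post_effect(hi))  # post runs at full internal res
resolve_then_effect = post_effect(downsample(hi))  # post runs at display res
```

The first path ends up with a tighter, more finely resolved transition, because the effect was computed with more pixels to work with.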

Though I wouldn't call it superior. They both have their advantages and disadvantages. It really depends on the game you play and the image quality you're going for.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Then we agree to disagree.

As you said yourself, the only difference is when the downsampling happens. It still does the same thing either way. There are forms of AA other than SSAA that happen after post-processing; that doesn't make them something other than AA methods (FXAA and MLAA, for example).
 

Lavans

Member
Sep 21, 2010
139
0
0
As you said yourself, the only difference is when the downsampling happens. It still does the same thing either way. There are forms of AA other than SSAA that happen after post-processing; that doesn't make them something other than AA methods (FXAA and MLAA, for example).

Just because it can (inadvertently) achieve a similar effect doesn't mean it's the same thing. FXAA and MLAA are not AA; they're merely edge blurs that simulate anti-aliasing.

I can simulate anti-aliasing in screenshots by applying deinterlacing filters, but that doesn't make it anti-aliasing.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Just because it can (inadvertently) achieve a similar effect doesn't mean it's the same thing. FXAA and MLAA are not AA; they're merely edge blurs that simulate anti-aliasing.

I can simulate anti-aliasing in screenshots by applying deinterlacing filters, but that doesn't make it anti-aliasing.

So now you are taking things that are documented as anti-aliasing techniques and calling them not AA. Morphological Antialiasing (MLAA) and Fast Approximate Antialiasing (FXAA) are AA methods; it's right in their names. Do you even know what AA means? All AA methods use blurring to achieve their results.

Ok, there is no point in arguing. You have just plain come up with your own definition of AA.

Anti-aliasing is any method that reduces jagged edges, or aliasing, in an image. It does not refer to a specific method.
 
Last edited:

Lavans

Member
Sep 21, 2010
139
0
0
Anti-aliasing is any method that reduces jagged edges, or aliasing, in an image. It does not refer to a specific method.

Wrong. Anti-aliasing is the technical term used to describe the process of eliminating the high signal frequencies that cause graphical artifacts, known as aliasing. Post-AA does no such thing. The artifacts are still there, just blurred.
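Whatever label one prefers, the sampling argument itself is easy to demonstrate in one dimension. A toy sketch (my own construction, nothing renderer-specific): a square wave with 10 periods point-sampled at 16 pixels is below the Nyquist rate (which would need 20 samples), so single samples produce a false pure on/off pattern, while supersampling and averaging produces fractional coverage values:

```python
import numpy as np

def scene(x):
    # A 1D "scene" with high-frequency detail: a square wave,
    # 10 periods across [0, 1).
    return ((x * 10) % 1.0 < 0.5).astype(float)

def point_sample(n):
    # One sample per pixel: no filtering at all.
    xs = (np.arange(n) + 0.5) / n
    return scene(xs)

def supersample(n, factor=8):
    # Sample at n * factor, then average: the downsampling resolve.
    xs = (np.arange(n * factor) + 0.5) / (n * factor)
    return scene(xs).reshape(n, factor).mean(axis=1)
```

Point sampling at 16 pixels yields only 0s and 1s arranged in a pattern the scene doesn't actually contain; supersampling attenuates that unrepresentable frequency into in-between coverage values, which is the "eliminating high signal frequencies" part.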
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
What's the consensus on 8bit vs. 10bit? (2713H vs. HM)

I suppose I can search for threads; I'd imagine that's been discussed plenty.

10 bit has virtually no software support, apart from a very few OpenGL apps, so simply being 10 bit does almost nothing. Windows is 8 bit all the way; any 10 bit rendering must be done using the graphics card driver, bypassing any of the Windows graphics functions (so it needs specifically written software).

However, 10 bit has a minor advantage if you are recalibrating. Most graphics cards can store a look-up table which allows colors/brightness to be modified by the card after rendering, but before the signal is transmitted to the monitor. With a 10 bit monitor, you get the advantage of a 10-bit lookup table. However, if you are calibrating, this is very much a poor man's option, even with 10-bit.
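As a rough illustration of that LUT-precision point (a made-up gamma-shaped correction curve, not any real calibration): pushing all 256 8-bit framebuffer levels through the card's LUT collapses shades when the output path is only 8 bits deep, while a 10-bit output path keeps every source level distinct:

```python
import numpy as np

def distinct_shades(out_bits, gamma=2.2):
    """Run all 256 8-bit framebuffer levels through a gamma-shaped
    calibration LUT quantized to `out_bits`, and count how many
    distinct shades survive on the wire to the monitor."""
    x = np.arange(256) / 255.0
    top = 2 ** out_bits - 1
    lut = np.round(top * x ** (1 / gamma)).astype(int)
    return len(np.unique(lut))

print(distinct_shades(8))   # fewer than 256: levels collapse -> banding
print(distinct_shades(10))  # all 256 source levels stay distinct
```

The collapsed levels in the 8-bit case are what shows up as banding in smooth gradients after a calibration is loaded into the card.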

As it is, the H supports internal 14-bit calibration, and any decent calibration tool can upload a new calibration into the H's internal signal processor.

However, this is not the only difference between the H and the HM.

The H is wide gamut, so it can display most of the Adobe RGB color space, whereas the HM is just about sRGB capable.

The other difference is that the H supports internal re-calibration, whereas the HM cannot be re-calibrated. The HM is supplied from the factory calibrated to sRGB. However, if you need a different calibration (e.g. for mac compatibility, or for medical use) then the HM cannot be recalibrated without seriously degrading image quality.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Wrong. Anti-aliasing is the technical term used to describe the process of eliminating the high signal frequencies that cause graphical artifacts, known as aliasing. Post-AA does no such thing. The artifacts are still there, just blurred.
No, that is your definition.

To call FXAA and MLAA not AA is ludicrous.

Look up the meaning sometime.

http://dictionary.reference.com/browse/antialiasing?s=t
a technique for smoothing out jagged lines in graphical computer output.
http://www.thefreedictionary.com/antialiasing
In computer graphics, the process of removing or reducing the jagged distortions in curves and diagonal lines so that the lines appear smooth or smoother.
http://www.dahlsys.com/misc/antialias/
This one is interesting, as it lists all the forms of AA, including post process AA methods such as: FXAA, MLAA, TXAA, SMAA
AA Anti-aliasing
Anti-aliasing is any technology that is designed to minimize aliasing and temporal aliasing artifacts.
I can go on, but I should have made my point by now. You are referring to a very specific type of AA and calling anything else not AA.
 
Last edited: