Is it easier for a graphics card to do antialiasing or push more pixels?

fuzzybabybunny

Moderator, Digital & Video Cameras
Jan 2, 2006
This is in regards to choosing a new laptop.

I'm debating between a 1920 x 1080 IPS panel, a 3200 x 1800 IPS panel, and a 2560 x 1440 IPS panel.

I could go with 1920 x 1080 and turn on AA in games (though I'm not sure how much AA would be needed to get close to the smoothness of 3200 x 1800 or 2560 x 1440), or go with 3200 x 1800 / 2560 x 1440 and no AA.

3200 x 1800 is pushing through 2.78 times more pixels than 1920 x 1080.
2560 x 1440 is pushing through 1.78 times more pixels than 1920 x 1080.
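
Quick sanity check on those ratios (just a throwaway script I ran, nothing fancy):

```python
# Pixel counts for the three panel options, relative to 1920 x 1080.
resolutions = {
    "1920 x 1080": 1920 * 1080,   # 2,073,600 px
    "2560 x 1440": 2560 * 1440,   # 3,686,400 px
    "3200 x 1800": 3200 * 1800,   # 5,760,000 px
}

base = resolutions["1920 x 1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x the pixels of 1080p)")
```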

So my question is: is it easier for a graphics card to just do 3200 x 1800 / 2560 x 1440 with no AA, or 1920 x 1080 with enough AA to get around the same smoothness as the higher resolutions?

I'm not sure how higher resolution textures or shadows or other visual additions would affect this either. For example, would 1800p with no AA and medium details / shadows be harder for a card to run than 1080p with AA and medium details / shadows?

Another option for me would be to game at 1600 x 900 on the 3200 x 1800 panel, since each game pixel would map to an exact 2 x 2 block of screen pixels and everything would still look sharp. 1280 x 720 on the 2560 x 1440 is getting to be too low despite the crispness...

I just can't stand the blurriness of interpolation of non-native resolutions.
 

thesmokingman

Platinum Member
May 6, 2010
I think in general it works out about the same, but the larger resolution is more pleasurable to the senses. Without changing GPU power, I'd rather have the larger resolution at lower IQ settings. You can test this out yourself using BF4's supersampling option to see the load it puts on your GPU.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
It's almost certainly easier to do anti-aliasing than to supersample everything, and that holds even for deferred renderers using MSAA ...
 

bystander36

Diamond Member
Apr 1, 2013
It all depends on the type of AA used. SSAA x4 would be a hit pretty similar to going from 1080p to 4K, while MSAA x4 costs far less. Even so, MSAA x4 is a severe hit in some games, while not so much in others.

The most accurate way to see what performance you'd get is to use downsampling, DSR, or VSR. Plain downsampling performance will be pretty much identical to the resolution you choose; DSR and VSR may differ a little, but are still reasonably close.
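
To put rough numbers on that SSAA comparison (my own back-of-the-envelope arithmetic, nothing measured): 4x SSAA at 1080p shades about four samples per output pixel, which works out to the same sample count as rendering native 4K.

```python
# 4x SSAA at 1080p shades roughly 4 samples per output pixel,
# which is the same number of shaded samples as a native 4K frame.
px_1080p = 1920 * 1080        # 2,073,600 pixels
px_4k = 3840 * 2160           # 8,294,400 pixels
ssaa4_samples = 4 * px_1080p  # 8,294,400 samples

print(ssaa4_samples == px_4k)      # True
print(f"{px_4k / px_1080p:.2f}x")  # 4.00x
```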
 

Sabrewings

Golden Member
Jun 27, 2015
As mentioned, it depends. Antialiasing can be a huge memory-bandwidth hog. So can high resolutions, but not to the same degree. High resolutions do typically require more VRAM, though, so if you have enough of it, that option can be easier.

My personal preference would be for higher quality at an acceptable resolution. If you're happy with how 1080p looks, adding on the extra bling (AA, higher res textures, various shadow effects) will be a more stunning experience than a higher res screen with lower res textures and/or lower effects.

Bystander's idea of using DSR/VSR to try it out is very sound. I render at 4K and have it downsampled to 1080p with DSR. The resulting image is absolutely stunning; better than any level of AA I've tried. It's such a brute-force method, though, that your card has to be able to push 4K. There are lower DSR factors you can use so you don't have to render all the way up to 4K and still get most of the benefit. But I paid for a 980 Ti, so I'm going to use it.
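
For a rough sense of what "push 4K" means in raw numbers, here's a quick sketch that only counts pixels and a bare 32-bit color buffer (real VRAM and bandwidth use is far more complicated, so treat these as lower bounds):

```python
# Pixels per frame and the size of a single 32-bit color buffer at each
# resolution. Real games keep many more buffers (depth, G-buffers,
# textures), so these are only lower bounds.
def color_buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

panels = {"1080p": (1920, 1080), "1440p": (2560, 1440),
          "1800p": (3200, 1800), "4K": (3840, 2160)}

for name, (w, h) in panels.items():
    print(f"{name}: {w * h:,} px/frame, ~{color_buffer_mb(w, h):.1f} MB color buffer")
```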
 

Zodiark1593

Platinum Member
Oct 21, 2012
I'm running a 768p 32" HDTV as my main monitor here. I run BF3 at close to 1080p via DSR with deferred AA at 2x. It runs quite beautifully even with Shadowplay going. Post-process AA at a low resolution looks absolutely horrific though, so I leave it off. Running a 960 here.
 

KingFatty

Diamond Member
Dec 29, 2010
Get the best monitor you can afford. The monitor will be around a long time.

It's much easier and more frequent to swap out a video card, so you'll probably keep the monitor far longer and can get a more powerful video card in a year to push as many pixels as you need.

Also, if you really don't like the blurriness of interpolation at non-native resolutions, you may also dislike the slight blur some anti-aliasing methods introduce, so just go for the pure pixel-count increase.
 

CakeMonster

Golden Member
Nov 22, 2012
Getting the best monitor is generally a good idea, but these days you could argue it's worth waiting for a newer version of DP, for 4K 120 Hz support, and for the FreeSync/G-Sync debate to be settled. You can spend a ton and still not be sure the monitor will make use of the full potential of the graphics cards and standards arriving in only 2-3 years.
 

fuzzybabybunny

Moderator, Digital & Video Cameras
Jan 2, 2006
Hmmmm... My post was originally intended to be about gaming on a laptop. I'm debating between these resolutions for a new laptop that'll hopefully last me for a good long time.
 

poofyhairguy

Lifer
Nov 20, 2005
Are you only using this laptop for gaming? Higher-res screens will get you nicer fonts elsewhere, but some programs play very badly with Windows 10's DPI scaling, so if you have a lot of legacy apps I would only consider 1080p.
 

fuzzybabybunny

Moderator, Digital & Video Cameras
Jan 2, 2006
Are you only using this laptop for gaming? Higher-res screens will get you nicer fonts elsewhere, but some programs play very badly with Windows 10's DPI scaling, so if you have a lot of legacy apps I would only consider 1080p.
Yeah... That's another barrel of worms at this point. I'm looking at something 14" or smaller for the portability.

1800p on a 14" panel would make everything buttery smooth, with individual pixels hardly distinguishable to the eye since it's around the 250 ppi mark. As far as future-proofing in terms of LCD detail and sharpness goes, it doesn't get much better than sitting at the level of what the human eye can physically resolve.

No legacy programs, but Windows 10 doesn't seem to be ready for high resolution concerning basic things like text and icons. I'll also be coding in Ubuntu Linux in addition to gaming on Windows. No idea how Ubuntu plays with 1800p...
 

tential

Diamond Member
May 13, 2008
I think in general it works out about the same, but the larger resolution is more pleasurable to the senses. Without changing GPU power, I'd rather have the larger resolution at lower IQ settings. You can test this out yourself using BF4's supersampling option to see the load it puts on your GPU.
Agree with this, and if you can downsample, it's better than AA. I never apply AA, actually. I'll apply max settings at 1080p, then bump the resolution up for downsampling from there.
 

fuzzybabybunny

Moderator, Digital & Video Cameras
Jan 2, 2006
Agree with this, and if you can downsample, it's better than AA. I never apply AA, actually. I'll apply max settings at 1080p, then bump the resolution up for downsampling from there.

Soooo... I'm totally confused on how this downsampling actually works or even what it is. What is the native res of the monitor you are using?
 

dogen1

Senior member
Oct 14, 2014
What games do you play, or are planning on playing? And what kind of hardware are you looking at, gpu wise?

Soooo... I'm totally confused on how this downsampling actually works or even what it is. What is the native res of the monitor you are using?

Essentially you're rendering the game at a higher resolution, and the graphics driver (or a program like GeDoSaTo) handles converting it back down to your monitor's native resolution. It produces better quality because it can look at multiple pixels from the higher-res frame for each final pixel, so you get full-screen, SSAA-like antialiasing. Of course, the quality will depend on how high you go.
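
If it helps to see it spelled out, here's a minimal sketch of that averaging step, assuming an even 2x factor and a plain box filter (the driver uses smarter filters, especially for non-integer DSR/VSR factors):

```python
import numpy as np

def box_downsample_2x(image):
    """Average each 2x2 block of a high-res render into one output pixel.

    `image` is an (H, W, 3) float array rendered at twice the display
    resolution; the result is (H//2, W//2, 3). This is the simplest form
    of the SSAA-like filtering described above.
    """
    h, w, c = image.shape
    blocks = image.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# e.g. a 3840 x 2160 render averaged down to a 1920 x 1080 output
hi_res = np.random.rand(2160, 3840, 3)
print(box_downsample_2x(hi_res).shape)  # (1080, 1920, 3)
```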
 

KingFatty

Diamond Member
Dec 29, 2010
Oh, for a laptop, hmm. Well then switch what I said: get the most powerful GPU you can afford, because it will be locked in and you can't upgrade to a beefier GPU later.

Do any of your options include a discrete GPU? If any are relying on the CPU's integrated graphics, maybe pass on those and look for one with a discrete GPU.

This comes in handy if you ever connect the laptop to an external display, or game on it.
 

poofyhairguy

Lifer
Nov 20, 2005
Yeah... That's another barrel of worms at this point. I'm looking at something 14" or smaller for the portability.

1800p on a 14" panel would make everything buttery smooth, with individual pixels hardly distinguishable to the eye since it's around the 250 ppi mark. As far as future-proofing in terms of LCD detail and sharpness goes, it doesn't get much better than sitting at the level of what the human eye can physically resolve.

No legacy programs, but Windows 10 doesn't seem to be ready for high resolution concerning basic things like text and icons. I'll also be coding in Ubuntu Linux in addition to gaming on Windows. No idea how Ubuntu plays with 1800p...

14 inches and Ubuntu? To me that screams 1080p plus an Nvidia (or even Skylake Intel) GPU.

What exact laptop are you looking at?
 

MongGrel

Lifer
Dec 3, 2013
Hmmmm... My post was originally intended to be about gaming on a laptop. I'm debating between these resolutions for a new laptop that'll hopefully last me for a good long time.

Why are you even messing with a laptop? Some guy posted earlier about how phones are going to take over the gaming world.

:colbert:

:biggrin:
 

tential

Diamond Member
May 13, 2008
Soooo... I'm totally confused on how this downsampling actually works or even what it is. What is the native res of the monitor you are using?
Mine is 1080p. It's a projector because I like big screen gaming!
Others have explained it, but normally I render and display the image at 1080p.
1080p on an 80 inch screen is nice, but I wanted more quality. Since I can't get a higher-resolution projector without breaking the bank, I use VSR (you could use DSR too, but something about DSR's image quality looks off to me), and it renders the game at 1800p, then scales that image back down to 1080p, which gives a better image.

It's a huge difference to me visually. I can't pick out the individual spots where the IQ is better, but it's obvious whether I'm downsampling or not. My roommate, who is not a gamer at all, could instantly tell with BioShock 2.

But a lot of gamers here on this forum don't use it as much, or are playing new titles and don't have the spare GPU horsepower.

Think of it like this, though: no matter your monitor's resolution, you always want to render at the highest resolution possible.

For the Dolphin emulator, I game at 9 to 11 times the game's normal internal resolution. It cleans up the image nicely. Super Mario Galaxy 2 is gorgeous.
 

fuzzybabybunny

Moderator, Digital & Video Cameras
Jan 2, 2006
14 inches and Ubuntu? To me that screams 1080p plus an Nvidia (or even Skylake Intel) GPU.

What exact laptop are you looking at?

I'm looking at the newer MSI GS40 Phantom: http://www.msi.com/product/notebook/GS40-6QE-Phantom.html

Right now it's 1080p but a higher 3K or 4K panel will be available eventually, which would probably add ~$600 more to the cost. I won't be gaming on external monitors, so I guess getting the higher res screen from the get-go will be good?

It'll be smoother looking for daily tasks and I can just wait for Windows 10 and Ubuntu to catch up to Apple in terms of utilizing high-DPI displays sensibly.

In the future, when games surpass the ability of the card to push 4K, I can still run games at 1600 x 900 with details high and AA on and have a crisp experience.

Kind of the best of both worlds?

The only bummer right now is that I'm still considered an early adopter of 3K and 4K screens, hence the stupid $600 price premium to upgrade...