What is the benefit of playing games at 1920x1080?

cool.dx.rip

Senior member
Mar 11, 2013
226
0
71
Guys, I have a resolution of about 1368x768. A friend of mine is making fun of me because, he says, I will soon have a powerful GPU but no decent-resolution monitor, so I won't be able to use it well. Can you tell me what the advantage of a higher resolution is? All I've found out is that it means a bigger screen, and that this resolution has become the standard for testing games. Thanks!
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Higher res means a better, sharper picture, and in many games it allows you to see more of the game world.
 

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76
If I'm not mistaken, your computer's ability to perform well on a high-resolution monitor is more dependent on your CPU than your GPU.

Higher resolution smooths out the jagged edges more, lowering the need for anti-aliasing.

Also, in some games it gives you more screen space which is very useful in games such as Eve Online.
 

ericlp

Diamond Member
Dec 24, 2000
6,137
225
106
1920x1080

You mean HD? 1080p? Well, it means you can go down to the store, buy a 46" TV for 450 bucks at Walmart, and start PLAYING games! :) Far Cry 3 looks pretty good on a 100-dollar ATI Radeon HD 7770, so that's my recommendation! ;)
 

RPD

Diamond Member
Jul 22, 2009
5,100
584
126
On your desktop, change your resolution to 800x600, look at the screen and the icons, and note their locations.

Now change it back to whatever much higher resolution you normally have.

The difference? Same in gaming.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Higher screen resolution gives you a clearer picture and allows you to see more detail in the scene, but you need a monitor capable of displaying that resolution.

Higher screen resolutions take more GPU power to run smoothly; if you're splashing out on a fast GPU, you might want to consider a higher-resolution monitor, or that rendering power might just go to waste.
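
To put rough numbers on that (just back-of-the-envelope arithmetic; it assumes rendering cost scales roughly with pixel count, which is a simplification):

```python
# Rough sketch: how much more work per frame a higher resolution asks of the GPU.
# Assumes cost scales roughly linearly with pixel count, which is a simplification
# (shader load, memory bandwidth, etc. all matter too).

def pixels(width, height):
    return width * height

low = pixels(1368, 768)     # the OP's current resolution
high = pixels(1920, 1080)   # 1080p

print(f"1368x768  -> {low:,} pixels per frame")
print(f"1920x1080 -> {high:,} pixels per frame")
print(f"1080p pushes about {high / low:.2f}x as many pixels")  # ~1.97x
```

So a GPU that's comfortable at 1080p has roughly twice the pixel throughput going unused on a 1368x768 panel.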
 

Bonesdad

Platinum Member
Nov 18, 2002
2,213
0
76
It doesn't matter at all if the game you are playing sucks. If the game is good, it can add an element of immersion. Ultimately, the enjoyment of a game is in the writing/acting/production value. Same as movies.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
On your desktop, change your resolution to 800x600, look at the screen and the icons, and note their locations.

Now change it back to whatever much higher resolution you normally have.

The difference? Same in gaming.

Umm, no, not really. Very few games will see that kind of drastic change in quality.

Go play UT2004 with maxed AA at 800x600 and then again at 1600x1200.

The difference really isn't that noticeable. I know because I've done it, a lot.
 

Wardawg1001

Senior member
Sep 4, 2008
653
1
81
In general, higher resolution means better-looking graphics in games, though the improvement may be minimal in some of them. In some games you will also be able to see a larger portion of the playing area.

Also, if you upgrade your monitor and get a larger screen, you may notice that some older games actually look worse than before, especially if they don't support your monitor's maximum resolution. This is mostly because the flaws are easier to see once they are physically larger than they used to be.

Since you just threw down a bunch of money for a powerful GPU, I assume you are looking to play at least some new games, in which case you will almost definitely want a monitor that can do 1920x1080.
 

HeXen

Diamond Member
Dec 13, 2009
7,832
37
91
I never really thought it was all that much better than 1280x1024. I use it because it's there, but I find the difference fairly negligible. I'd rather see artistically detailed games and better AI in a more interactive world than worry about resolution or aliasing.
 

Craig234

Lifer
May 1, 2006
38,548
350
126
I think it's a nice improvement in sharpness, but you should compare for yourself.
 

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
I never really thought it was all that much better than 1280x1024. I use it because it's there, but I find the difference fairly negligible. I'd rather see artistically detailed games and better AI in a more interactive world than worry about resolution or aliasing.

You lose tons of detail at lower resolutions.

The Dark Souls PC port showed it better than anyone could have imagined. Before the mod that made it render at a higher internal resolution, it was rendering at 720p. In screenshots taken before and after the mod, an incredible amount of detail had been missing; nobody knew the texture artists had done that much work, because it was never visible on the console version and wouldn't have been visible on the PC version without the mod.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Umm, no, not really. Very few games will see that kind of drastic change in quality.

Go play UT2004 with maxed AA at 800x600 and then again at 1600x1200.

The difference really isn't that noticeable. I know because I've done it, a lot.

I've done that with a few games as well, and the difference is still night and day. AA looks progressively better at higher resolutions as well.

Eh, it's the other way around.

^ This is correct. Low resolutions are more CPU-bound, higher resolutions are GPU-bound. Low resolutions are usually used for CPU benchmarks by reputable sites.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Higher res means a better, sharper picture, and in many games it allows you to see more of the game world.
That won't happen unless the FOV also changes, which won't happen unless the aspect ratio changes.

Umm, no, not really. Very few games will see that kind of drastic change in quality.

Go play UT2004 with maxed AA at 800x600 and then again at 1600x1200.

The difference really isn't that noticeable. I know because I've done it, a lot.
You must be joking. Someone would have to have a serious vision disorder not to notice the slap-in-the-face difference.
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
I've done that with a few games as well, and the difference is still night and day. AA looks progressively better at higher resolutions as well.

Yeah, it makes a huge difference. You're trying to simulate the continuous color spectrum of a natural scene, so the more discrete pixels you have to apply the anti-aliasing algorithms to, the smoother the result will be.
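
If it helps, here's a toy sketch of that idea (a 1D edge with box-filter supersampling; nothing like a real renderer, just the principle):

```python
# Toy 1D illustration of anti-aliasing by supersampling (a minimal sketch,
# not how a real renderer works). We "render" a hard edge at x = 0.3 across
# a row of pixels: each pixel averages several subsamples, so a pixel that
# straddles the edge gets an intermediate grey instead of pure black/white.

EDGE = 0.3  # edge position in normalized [0, 1) screen coordinates

def render_row(num_pixels, subsamples=4):
    row = []
    for p in range(num_pixels):
        covered = sum(
            1 for s in range(subsamples)
            if (p + (s + 0.5) / subsamples) / num_pixels < EDGE  # subsample left of edge?
        )
        row.append(covered / subsamples)
    return row

# More pixels across the same edge -> the grey transition band is physically
# narrower, so the edge looks smoother at the same viewing size.
print("8 px: ", render_row(8))    # [1.0, 1.0, 0.5, 0.0, ...]
print("16 px:", render_row(16))   # [1.0, 1.0, 1.0, 1.0, 0.75, 0.0, ...]
```

Same anti-aliasing algorithm in both cases, but at the higher resolution the blended region covers half the physical width, which is why AA plus high res beats AA alone.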
 

Zenoth

Diamond Member
Jan 29, 2005
5,198
205
106
Well, I bought an LCD recently, although I still have my beloved 19" LG Flatron F900B CRT (flat screen, but a CRT nonetheless). I use the CRT on my Windows XP rig (not the one in my sig) mostly to play older games, and it's usually turned off anyway to save some electricity and help a bit with the bill; that CRT is a beast and probably consumes 2x the energy of the rest of the rig by itself. Anyway, the point is that I went from my CRT's native resolution of 1280x1024 to my LCD's (Samsung S23A700D) 1920x1080. When I switched the rigs around along with the LCD I also moved most of my games to my "current gaming" rig (the one in my sig; the other one has my older GTX 285 and my older Intel E8400, with 4GB of DDR). I can say that ultimately "just" the higher resolution, by itself, won't do much if the developers didn't put much work into the game's textures to start with.

In other words, if you have washed-out, bland and blurry 512x512 textures at the game's highest settings, then playing at 1024x768, 1280x1024 or 1920x1080 will only stretch out the lack of detail further for you to shake your head at. Also, on my LCD (not sure if it applies to all LCDs; I'm still new to the whole LCD technology, only two months in, in fact) if I game at any resolution that doesn't match my native aspect ratio, the overall picture (the display in general, gaming or not) looks either stretched out horizontally or "compressed" vertically (not sure how to describe it other than that it doesn't look "right"). So sometimes I have to "force" myself to play at my native resolution even in a game that supports the HD resolution(s) but won't exactly do itself justice there (i.e. the game's own "best" texture settings aren't HD-friendly to start with).

I mean... playing UT99 at 1920x1080 does not give me "more detail" in the textures compared to how those textures look at 1024x768 (I know, I've tried). But on my LCD, due to the now-standardized widescreen shape of HD displays, I pretty much "have to" play comparably "old" games at my native resolution anyway, not "because it gives better detail" but simply because otherwise the display looks weird, whereas on my CRT I could be content with a 4:3 or 5:4 ratio (say, 1024x768) rather than 16:9 (like the HD res). With this said, however, don't get me wrong! There are plenty of recent games (not just ones released "right now", but recent ones, i.e. from a couple of years ago as well) which certainly benefit from higher resolutions, exactly because the devs worked on the textures enough that HD resolutions "fully reveal" their details as "meant to be" by the developers; they know what they're doing, they're good artists, and they probably worked with 2048x2048 textures or something like that.

I mean, the better the texture is by itself, the more an HD resolution will show of it. But don't expect an HD res to "add" detail out of thin air to a texture that has none to start with.
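
To put a rough "number" on it (just a sketch; the figures are made up for illustration):

```python
# Rough sketch of why resolution can't add texture detail that isn't there.
# Made-up example: a 512-texel-wide texture stretched across a wall that
# happens to fill the whole screen width.

TEXTURE_WIDTH = 512  # texels across the wall texture

for screen_width in (1024, 1368, 1920):
    px_per_texel = screen_width / TEXTURE_WIDTH
    print(f"{screen_width} px wide: {px_per_texel:.1f} screen pixels per texel")
# Past ~1 pixel per texel, extra resolution just interpolates the same
# texels more finely; it reveals nothing the artists didn't paint.
```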
 
Last edited:

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
Also, on my LCD (not sure if it applies to all LCDs; I'm still new to the whole LCD technology, only two months in, in fact) if I game at any resolution that doesn't match my native aspect ratio, the overall picture (the display in general, gaming or not) looks either stretched out horizontally or "compressed" vertically (not sure how to describe it other than that it doesn't look "right"). [...]

LCDs look like ass if you run them at anything other than their native resolution. Even the ones with decent scalers still look like ass, just not as bad. A CRT is still the best for pure image quality, though not by as much anymore, but for being able to use whatever res your CRT can do, it will knock an LCD out of the park if that LCD isn't running at its native res.
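
If you want to see why, here's a toy sketch (plain linear resampling; real scalers are fancier but run into the same problem):

```python
# Toy sketch of why a non-native resolution looks soft on an LCD: the panel
# has fixed pixels, so the image must be resampled. Scale a crisp 1-px
# black/white pattern by a non-integer factor with linear interpolation and
# it turns into grey mush.

src = [0.0, 1.0] * 4  # 8-pixel checker pattern (0 = black, 1 = white)

def resample_linear(pixels, new_len):
    out = []
    for i in range(new_len):
        x = i * (len(pixels) - 1) / (new_len - 1)  # source position for this output pixel
        lo = int(x)
        hi = min(lo + 1, len(pixels) - 1)
        t = x - lo
        out.append(round(pixels[lo] * (1 - t) + pixels[hi] * t, 2))
    return out

print("native :", src)
print("scaled :", resample_linear(src, 11))  # 8 -> 11 px, roughly the 768 -> 1080 ratio
```

The native row is pure black/white; the scaled row is almost all intermediate greys, which is exactly the blur you see when a fixed-pixel panel runs off-native.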
 

greenhawk

Platinum Member
Feb 23, 2011
2,007
0
71
LCDs look like ass if you run them at anything other than their native resolution. Even the ones with decent scalers still look like ass, just not as bad. A CRT is still the best for pure image quality, though not by as much anymore, but for being able to use whatever res your CRT can do, it will knock an LCD out of the park if that LCD isn't running at its native res.

+1

OP, running at your screen's native res is far more important with LCDs than running at 1920x1080.

A larger screen might be what the program/game you are running was designed for, but any good program/game will work at any res and give a similar enough picture.

Running your screen at 1368x768 is fine if that is all it can handle.
 

quakeworld

Senior member
Aug 5, 2009
222
0
76
[...] I mean, the better the texture is by itself, the more an HD resolution will show of it. But don't expect an HD res to "add" detail out of thin air to a texture that has none to start with.

Why do you use "quotes" on certain words? You do that a lot in your posts.
 

Zenoth

Diamond Member
Jan 29, 2005
5,198
205
106
Why do you use "quotes" on certain words? You do that a lot in your posts.

Ah, yes "Reapers"...

Hmmm, not sure; it's a reflex, and I guess I don't follow actual grammatical "rules" (I quoted "rules" intentionally, indeed). Also, quotes can elevate certain words' "aesthetics" in my book... «-- see?

Also, are you "stalking" me? How do you know I do that often in my posts? :p

But yeah, seriously... it's not grammatically needed (and I am aware of it); I just like doing it, I guess.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Just some notes on misinformation in a few of these posts.

Increased screen resolution can increase the viewing area if you have fixed-pixel artwork, which is almost exclusively a 2D-game thing. FOV tends to change with aspect ratio, not with resolution. The aspect ratio of 1920x1080 and 1368x768 is basically the same (16:9), so the switch wouldn't make an appreciable FOV change in games with dynamic FOV calculation.
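
If you want to sanity-check that, here's a quick sketch of the usual Hor+ FOV math (it assumes a fixed vertical FOV, which is how most modern engines handle widescreen):

```python
import math

# Quick sketch of Hor+ FOV scaling: the vertical FOV stays fixed and the
# horizontal FOV is derived from the aspect ratio, so resolution alone
# doesn't change how much of the world you see.

def horizontal_fov(vertical_fov_deg, width, height):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

for w, h in [(1368, 768), (1920, 1080), (1024, 768)]:
    print(f"{w}x{h}: hFOV = {horizontal_fov(60, w, h):.1f} deg")
# 1368x768 and 1920x1080 are both ~16:9, so they give essentially the same
# hFOV (~91.5 deg at a 60 deg vFOV); only the 4:3 mode (1024x768, ~75.2 deg)
# actually shows less of the world.
```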

AA can only partially mitigate a low resolution; it can never replace a true increase in screen resolution. You can also apply AA at any resolution, meaning 1368x768 + AA is still clearly beaten by 1920x1080 + AA.

Visible aliasing is not directly correlated with screen resolution; it's actually correlated with pixel density, which is the screen resolution divided by the (apparent) screen area. If you run out to get a brand-new 1080p monitor, the amount of visible aliasing will depend on the physical size of the monitor and also the distance you sit from it. For example, if you buy a 42" TV which is 1080p and sit right next to it, you'll have more visible aliasing than on your old 1368x768 monitor.
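
To make that concrete, a back-of-the-envelope pixel density calculation (the screen sizes are just example figures, not anyone's actual setup):

```python
import math

# Back-of-the-envelope pixel density (PPI): diagonal pixel count divided by
# the diagonal size in inches. Screen sizes here are example figures only.

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'18.5" 1368x768:  {ppi(1368, 768, 18.5):.0f} PPI')   # ~85 PPI
print(f'23"   1920x1080: {ppi(1920, 1080, 23):.0f} PPI')    # ~96 PPI
print(f'42"   1920x1080: {ppi(1920, 1080, 42):.0f} PPI')    # ~52 PPI
# The 42" 1080p TV has the *lowest* density of the three, so up close its
# aliasing is more visible despite the higher resolution; sitting farther
# back scales the apparent density the same way.
```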
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
Visible aliasing is not directly correlated with screen resolution; it's actually correlated with pixel density, which is the screen resolution divided by the (apparent) screen area. If you run out to get a brand-new 1080p monitor, the amount of visible aliasing will depend on the physical size of the monitor and also the distance you sit from it. For example, if you buy a 42" TV which is 1080p and sit right next to it, you'll have more visible aliasing than on your old 1368x768 monitor.

Very good point.