Old 12-12-2012, 06:43 PM   #1
Insomniator
Diamond Member
 
Join Date: Oct 2002
Location: jersey
Posts: 5,138
Default 1920x1200 performance hit vs 1920x1080?

At home I have a 25.5" Samsung 2693HM at 1920x1200, used strictly for gaming... occasional VPN use. At work I have a 24" Dell at 1920x1080, used strictly for work.


Does it make sense to swap these two monitors, considering 1920x1200 is better for work and 1080p means fewer pixels to render while gaming? Is the performance difference (if any) worth it? I have a 560 Ti 448, which I feel is on the border of needing an upgrade... 10% more performance would be nice for Far Cry 3, I'm sure.
Insomniator is online now   Reply With Quote
Old 12-12-2012, 06:45 PM   #2
cmdrdredd
Lifer
 
cmdrdredd's Avatar
 
Join Date: Dec 2001
Posts: 20,117
Default

1080p would only serve to be more compatible with in-game resolutions.

Having used both, the performance difference was so small as to be unmentionable.
__________________
Asus Maximus V Gene | 3570k @ 4.5 | 8GB Samsung 30nm @ 2133 | EVGA GTX 670 FTW SLI @ 1230/6765
2x 128GB Crucial M4 SSD | WD 300GB Velociraptor | Lite-On HBS212 BD-R | Corsair HX1000 PSU
Cooler Master HAF 932 | Yamakasi Catleap Q270 | Windows 8.1 Pro
cmdrdredd is online now   Reply With Quote
Old 12-12-2012, 07:04 PM   #3
KingFatty
Platinum Member
 
KingFatty's Avatar
 
Join Date: Dec 2010
Posts: 2,625
Default

Would you say that it was 10%, the small difference that is unmentionable?

Maybe bumping from 55 frames per second to 60? Or 30 frames per second to 33?
KingFatty is offline   Reply With Quote
Old 12-12-2012, 07:08 PM   #4
cmdrdredd
Lifer
 
cmdrdredd's Avatar
 
Join Date: Dec 2001
Posts: 20,117
Default

Quote:
Originally Posted by KingFatty View Post
Would you say that it was 10%, the small difference that is unmentionable?

Maybe bumping from 55 frames per second to 60? Or 30 frames per second to 33?
I don't even think it was that much.

Actually, let me do a quick test in Heaven with SLI off and report back, at 1920x1080 and 1920x1200. Even though I have a 2560x1440 monitor, I can run non-native resolutions and get an idea of the performance.

edit: Heaven Benchmark 8xAA, 16xAF, Extreme Tessellation. 1x GTX670 @ 1230/6765

1920x1080: Min - 30.2fps Avg - 49.1fps Max - 124.6fps
1920x1200: Min - 24.9fps Avg - 46.4fps Max - 116.6fps
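
In percentage terms, those runs work out as follows (a quick Python sketch using only the numbers reported above; nothing new is measured here):

[CODE]
# Percentage difference between the two Heaven runs reported above.
runs = {
    "1920x1080": {"min": 30.2, "avg": 49.1, "max": 124.6},
    "1920x1200": {"min": 24.9, "avg": 46.4, "max": 116.6},
}

for key in ("min", "avg", "max"):
    gain = (runs["1920x1080"][key] / runs["1920x1200"][key] - 1) * 100
    print(f"{key}: 1080p is {gain:+.1f}% vs 1200p")

# min: 1080p is +21.3% vs 1200p
# avg: 1080p is +5.8% vs 1200p
# max: 1080p is +6.9% vs 1200p
[/CODE]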
__________________
Asus Maximus V Gene | 3570k @ 4.5 | 8GB Samsung 30nm @ 2133 | EVGA GTX 670 FTW SLI @ 1230/6765
2x 128GB Crucial M4 SSD | WD 300GB Velociraptor | Lite-On HBS212 BD-R | Corsair HX1000 PSU
Cooler Master HAF 932 | Yamakasi Catleap Q270 | Windows 8.1 Pro

Last edited by cmdrdredd; 12-12-2012 at 07:32 PM.
cmdrdredd is online now   Reply With Quote
Old 12-12-2012, 07:49 PM   #5
Pia
Golden Member
 
Join Date: Feb 2008
Posts: 1,551
Default

You get the same performance increase if you run 1080p resolution on the 1920x1200 display. I'd definitely swap the displays anyway, both to get rid of the black bars while gaming, and to have more screen space for work.
Pia is offline   Reply With Quote
Old 12-12-2012, 08:08 PM   #6
KingFatty
Platinum Member
 
KingFatty's Avatar
 
Join Date: Dec 2010
Posts: 2,625
Default

Wow, surprising jump in the MIN framerates, more than a 10% difference there. That might be where you really notice it, possibly reducing stutter?
KingFatty is offline   Reply With Quote
Old 12-12-2012, 08:17 PM   #7
postmortemIA
Diamond Member
 
postmortemIA's Avatar
 
Join Date: Jul 2006
Location: Midwest USA
Posts: 6,500
Default

Yeah, if your game sucks at 1200p, chances are it doesn't run too well at 1080p either. The difference is 11%, after all.
__________________
D1. Win7 x64 i7-3770 on Z77, HD7850, 2707WFP, 840, X-Fi D2. Win7 x64 E8400 on P35
L1. OSX 10.9 rMBP 13 L2. Vista x86 E1505
M. Galaxy S4

postmortemIA is online now   Reply With Quote
Old 12-12-2012, 08:27 PM   #8
cmdrdredd
Lifer
 
cmdrdredd's Avatar
 
Join Date: Dec 2001
Posts: 20,117
Default

Quote:
Originally Posted by KingFatty View Post
Wow, surprising jump in the MIN framerates, more than a 10% difference there. That might be where you really notice it, possibly reducing stutter?
Could be, but still, 1080p is more widely supported. If I remember correctly, some games required me to mess around in the .ini files to get 1920x1200 to work.
__________________
Asus Maximus V Gene | 3570k @ 4.5 | 8GB Samsung 30nm @ 2133 | EVGA GTX 670 FTW SLI @ 1230/6765
2x 128GB Crucial M4 SSD | WD 300GB Velociraptor | Lite-On HBS212 BD-R | Corsair HX1000 PSU
Cooler Master HAF 932 | Yamakasi Catleap Q270 | Windows 8.1 Pro
cmdrdredd is online now   Reply With Quote
Old 12-12-2012, 09:21 PM   #9
Insomniator
Diamond Member
 
Join Date: Oct 2002
Location: jersey
Posts: 5,138
Default

Quote:
Originally Posted by postmortemIA View Post
Yeah, if your game sucks at 1200p, chances are it doesn't run too well at 1080p either. The difference is 11%, after all.
True, but I'm not expecting miracles. Better compatibility and any kind of FPS increase is good if it'll help hold me over until Crysis 3!
Insomniator is online now   Reply With Quote
Old 12-13-2012, 07:48 AM   #10
Gryz
Senior Member
 
Gryz's Avatar
 
Join Date: Aug 2010
Posts: 427
Default

Three months ago I replaced my 24" 1920x1200 Samsung with a 27" 1920x1080 Asus screen. I still haven't decided whether it was worth the money.

My main reason was to get lower response times. My older Samsung 2443BW 24" has a 5ms response time. I always found the screen blurry when I moved the camera, so I wanted a monitor with only 2ms. I chose the Asus VE278Q 27". It's impossible to see different monitors in shops these days, and reviews of monitors always seem very subjective, so I had no other choice but to just try.

Motion in games is indeed a little less blurry, but not by much. Having 16:9 instead of 16:10 is slightly less enjoyable on the desktop. Maybe also in MMOs, where you have a lot of UI stuff at the bottom. But in other games, 16:9 is just as good as 16:10. There are two other benefits to my new screen (besides being slightly less blurry). One is the size: 27" is nicer than 24". But I also realize I don't need anything bigger than 27". I don't think I want a 30" monitor.

The second benefit is the higher fps. 11% fewer pixels indeed means 11% more performance! At the time I was playing mostly Guild Wars 2 and Skyrim, all settings on Ultra, SSAO enabled in the control panel, and Skyrim's visuals heavily modded. Both games ran at 60 fps in most places, but some spots had only 50 fps or less (GTX 680 + 3570K @ 4GHz). I wrote down numbers in some locations in both games, and after switching monitors (and resolutions), to my surprise I saw a ~10% fps increase in almost all locations! In theory, 10% fewer pixels means 10% higher framerates, but that doesn't mean you will see that result in practice too.
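
As a rough illustration of that scaling argument, here is a small Python sketch that estimates fps at a new resolution assuming frame time is purely proportional to pixel count (a fill-rate-bound assumption; CPU limits and engine overhead usually push real results below this):

[CODE]
# Naive estimate: assume frame time scales linearly with pixel count.
# This only models the "in theory" case; real games are rarely this clean.

def predicted_fps(measured_fps, old_res, new_res):
    """Scale fps by the ratio of pixel counts (pure fill-rate assumption)."""
    old_pixels = old_res[0] * old_res[1]
    new_pixels = new_res[0] * new_res[1]
    return measured_fps * old_pixels / new_pixels

# A spot that runs at 50 fps at 1920x1200 would, in theory, hit about:
print(round(predicted_fps(50, (1920, 1200), (1920, 1080)), 1))  # 55.6 fps
[/CODE]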

For you, the higher-resolution monitor is also the larger screen; that is another factor to consider. Do you want a bigger desktop at work, or a bigger screen while gaming? But I would certainly look at response times. Your Samsung has a 5ms response time. Not sure about the Dell (I see two 24" 1080p Dells, with 5ms and 6ms). But if both have 5ms, and you find 10% higher fps useful, I'd certainly swap the monitors.
Gryz is offline   Reply With Quote
Old 12-13-2012, 09:06 AM   #11
PrincessFrosty
Golden Member
 
Join Date: Feb 2008
Location: UK
Posts: 1,422
Default

Quote:
Originally Posted by cmdrdredd View Post
1080p would only serve to be more compatible with in-game resolutions.

Having used both, the performance difference was so small as to be unmentionable.
Games are, generally speaking, no more compatible with 16:9 than with 16:10. In fact, 16:10 was around for a LOT longer in the PC space than 16:9, which has only been adopted from the television market in more recent years. Almost all modern games read the list of compatible screen resolutions from your video card drivers, so both are equally well supported.

Performance scales pretty linearly with the number of pixels:

1920x1200 is 2,304,000 pixels
1920x1080 is 2,073,600 pixels

The difference is small, maybe unnoticeable in some circumstances, but it does exist.
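
For the record, the arithmetic behind the 10%/11% figures quoted throughout this thread (just the pixel counts above, nothing else assumed):

[CODE]
# Pixel counts for the two resolutions and their relative difference.
pixels_1200p = 1920 * 1200   # 2,304,000
pixels_1080p = 1920 * 1080   # 2,073,600

print(round((pixels_1200p / pixels_1080p - 1) * 100, 1))  # 11.1 -> 1200p has ~11.1% more pixels
print(round((1 - pixels_1080p / pixels_1200p) * 100, 1))  # 10.0 -> 1080p has 10% fewer pixels
[/CODE]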
__________________
Intel 2600k @ 4.9Ghz || ThermalRight TRUE Spirit 140
16Gb PC3-12800 || 2x MSI GTX580 Twin Frozr II in SLI
Dell 3007 WFP-HC 30" || BenQ XL2420T 24" 120hz
PrincessFrosty is offline   Reply With Quote
Old 12-13-2012, 09:17 AM   #12
BrightCandle
Diamond Member
 
BrightCandle's Avatar
 
Join Date: Mar 2007
Posts: 4,763
Default

Quote:
Originally Posted by Gryz View Post
Three months ago I replaced my 24" 1920x1200 Samsung with a 27" 1920x1080 Asus screen. I still haven't decided whether it was worth the money.

My main reason was to get lower response times. My older Samsung 2443BW 24" has a 5ms response time. I always found the screen blurry when I moved the camera, so I wanted a monitor with only 2ms. I chose the Asus VE278Q 27". It's impossible to see different monitors in shops these days, and reviews of monitors always seem very subjective, so I had no other choice but to just try.

...
It's too late now, but for future reference, tftcentral.co.uk will fulfill your desire to know exactly how much blur a monitor has, along with a whole host of other important factors to do with image quality. If the monitor isn't on their list, it's almost certainly complete rubbish or nothing new.
__________________
I no longer frequent these forums.
BrightCandle is offline   Reply With Quote
Old 12-13-2012, 12:28 PM   #13
exar333
Diamond Member
 
exar333's Avatar
 
Join Date: Feb 2004
Location: Charlotte, NC
Posts: 6,641
Default

Quote:
Originally Posted by cmdrdredd View Post
1080p would only serve to be more compatible with in game resolutions.

Having used both the performance difference was so small to be unmentionable.
It is pretty unusual to find a game that would only work at 1080p and not at 1920x1200.

That said, I agree that the difference in performance is negligible. If it runs at 1080p, it will run fine at 1920x1200, and vice versa. The only case I can think of where the difference might be more substantial is when 1080p performance is right at minimum acceptable framerate levels, 30-35fps; in that case losing 10% performance means much more than going from 75fps to 68fps, for example.
__________________
My Cars:
-2011 DGM WRX Limited
My Rig:
4670K @ 4.0 Ghz w/ CM 212
EVGA GTX 970 SC @ 1501/7780
3440x1440 LG 34UM95 QHD Display
exar333 is offline   Reply With Quote
Old 12-13-2012, 01:04 PM   #14
KingFatty
Platinum Member
 
KingFatty's Avatar
 
Join Date: Dec 2010
Posts: 2,625
Default

Also, you can always use a 1200p monitor at either 1080p or 1200p, so you have more options if you want them. The 1080p monitor is always stuck at 1080p, which can be limiting for games that are Vert+ (I think that's the term for games that show you more of a scene the taller your resolution).
KingFatty is offline   Reply With Quote
Old 12-13-2012, 01:10 PM   #15
Pia
Golden Member
 
Join Date: Feb 2008
Posts: 1,551
Default

Quote:
Originally Posted by PrincessFrosty View Post
Games are, generally speaking, no more compatible with 16:9 than with 16:10. In fact, 16:10 was around for a LOT longer in the PC space than 16:9, which has only been adopted from the television market in more recent years. Almost all modern games read the list of compatible screen resolutions from your video card drivers, so both are equally well supported.
16:10 is "supported", usually, but a good number of games are optimized for 16:9 (example: Starcraft II) or as wide ratio as possible (example: most first person shooters). So you will generally see less if your screen is 16:10 as opposed to 16:9.
Pia is offline   Reply With Quote
Old 12-13-2012, 01:16 PM   #16
BrightCandle
Diamond Member
 
BrightCandle's Avatar
 
Join Date: Mar 2007
Posts: 4,763
Default

About six months ago I went through all the resolutions in the Steam survey and worked out their aspect ratios. I found that, at that point, while 16:10 was in decline it was actually still the #1 ratio, followed closely by 16:9 and distantly by 4:3. There are a host of other ratios showing up in the data now; I counted those close to 16:10 as 16:10 and those close to 16:9 as 16:9.

16:9 is where the industry is clearly headed, but 16:10 is IMO better for work and games. I lack vertical visibility as it is; I certainly don't want to lose any more.
__________________
I no longer frequent these forums.
BrightCandle is offline   Reply With Quote
Old 12-13-2012, 02:13 PM   #17
skipsneeky2
Diamond Member
 
Join Date: May 2011
Posts: 3,649
Default

Have done a little 1080p vs 1200p testing myself with my u2412m and a 7850.

Found that my 1050/1450 7850 in BF3, on the map Operation Metro with all high settings minus MSAA, would average nearly 8fps higher, and bare minimums jumped from 57fps to about 63fps just by switching resolutions.

Maybe it's an issue resolution-wise, but my 7850 seems to be a little jittery when I hit the lower 60s, especially outside. Dropping it all to medium smooths it out, and my minimums never dip under 65fps, nor under 60 on the majority of maps.

Can't wait to pick up a 7970....
skipsneeky2 is offline   Reply With Quote
Old 12-13-2012, 04:09 PM   #18
Pia
Golden Member
 
Join Date: Feb 2008
Posts: 1,551
Default

Quote:
Originally Posted by BrightCandle View Post
About six months ago I went through all the resolutions in the Steam survey and worked out their aspect ratios. I found that, at that point, while 16:10 was in decline it was actually still the #1 ratio, followed closely by 16:9 and distantly by 4:3. There are a host of other ratios showing up in the data now; I counted those close to 16:10 as 16:10 and those close to 16:9 as 16:9.

16:9 is where the industry is clearly headed, but 16:10 is IMO better for work and games. I lack vertical visibility as it is; I certainly don't want to lose any more.
Based on your sig it looks like you have a 48:10 screen setup? If you only had a single display, 16:9 would be better for games.
Pia is offline   Reply With Quote
Old 12-13-2012, 05:55 PM   #19
Gryz
Senior Member
 
Gryz's Avatar
 
Join Date: Aug 2010
Posts: 427
Default

Quote:
Originally Posted by BrightCandle View Post
It's too late now, but for future reference, tftcentral.co.uk will fulfill your desire to know exactly how much blur a monitor has, along with a whole host of other important factors to do with image quality.
I actually did look at that website for information. Neither of the two monitors I was dealing with is mentioned there. There are many, many different monitors, and only relatively few are covered there. Also, I feel that tftcentral is mostly concerned with image quality and less with gaming. The Asus VE278Q was actually recommended by members of these AnandTech forums. I think the VE278Q is a good choice for a gaming monitor. But the Samsung was good too, I think.

My next monitor will probably be an IPS screen.
But only if there will ever be a monitor that has:
- 1920x1200 (or 1920x1080)
- 27"
- IPS
- 120Hz
- 2 ms response-time
- not ridiculously expensive.
Even if I drop the price requirement from my wishlist, there are no monitors that fulfill even 3 of the remaining 5 requirements. And I think it will be a long, long time before we see a 120Hz IPS monitor with a 2ms response time.

Quote:
If the monitor isn't on their list, it's almost certainly complete rubbish or nothing new.
Just like with videocards (see the "what would you pick" thread), it all depends on your requirements. I think tftcentral on average has different requirements than I have. I can't believe omission from the website means that a product is crap.

I'm curious what tftcentral would consider the best monitor for gaming. Their recommendation tool does not give *any* suggestions when my input is gaming+IPS.
Gryz is offline   Reply With Quote
Old 12-13-2012, 06:06 PM   #20
BrightCandle
Diamond Member
 
BrightCandle's Avatar
 
Join Date: Mar 2007
Posts: 4,763
Default

Quote:
Originally Posted by Gryz View Post
My next monitor will probably be an IPS screen.
But only if there will ever be a monitor that has:
- 1920x1200 (or 1920x1080)
- 27"
- IPS
- 120Hz
- 2 ms response-time
- not ridiculously expensive.
I am literally waiting with fistfuls of money for IPS 120Hz monitors. I would likely replace all three of my 60Hz ones when they finally become available, as long as they don't ghost badly.

Quote:
Originally Posted by Gryz View Post
Just like with videocards (see the "what would you pick" thread), it all depends on your requirements. I think tftcentral on average has different requirements than I have. I can't believe omission from the website means that a product is crap.

I'm curious what tftcentral would consider the best monitor for gaming. Their recommendation tool does not give *any* suggestions when my input is gaming+IPS.
Tftcentral tests just about everything. For gaming in particular you care a great deal about ghosting and input latency, both of which they test in detail. They also test colour quality, contrast, etc. If you look at the ghosting comparison, they always show it against other monitors in the class as well as the best gaming monitor they know of (a 23" Samsung 120Hz TN screen). So if you were looking for the best monitor in that respect, the Samsung would be what they have, but obviously its colour quality is kind of rubbish. Tftcentral don't tell you to take one monitor over another; what they do is give you all the data and leave it to you to work out whether it fits what you want to do with it.
__________________
I no longer frequent these forums.
BrightCandle is offline   Reply With Quote
Old 12-14-2012, 08:52 AM   #21
pauldun170
Platinum Member
 
pauldun170's Avatar
 
Join Date: Sep 2011
Posts: 2,097
Default

I have a Samsung monitor with a native 1920x1200 resolution.

I've been playing Far Cry 3 with a GTX570, and on ultra settings the FPS hasn't been "optimal". The game is playable, but it's obvious it is dipping into the 20s at times.

After reading this thread I thought I'd give 1080p a try, so I changed the settings in-game.
The game is very smooth now, a VERY noticeable improvement. I can play without even thinking about the FPS.
pauldun170 is offline   Reply With Quote
Old 12-14-2012, 10:30 AM   #22
Aikouka
Lifer
 
Aikouka's Avatar
 
Join Date: Nov 2001
Posts: 24,612
Default

Quote:
Originally Posted by Gryz View Post
Motion in games is indeed a little less blurry, but not by much. Having 16:9 instead of 16:10 is slightly less enjoyable on the desktop. Maybe also in MMOs, where you have a lot of UI stuff at the bottom. But in other games, 16:9 is just as good as 16:10. There are two other benefits to my new screen (besides being slightly less blurry). One is the size: 27" is nicer than 24". But I also realize I don't need anything bigger than 27". I don't think I want a 30" monitor.
One thing that I've always found annoying about MMOs and 1920x1200 is how some video authoring software refuses to support anything above 1080p. I've owned a copy of Sony Vegas for quite a while, and it was always a bit annoying that it refused to render my videos at 1920x1200. I ended up scaling my videos down to 1,728x1080 just so they would retain the same 16:10 frame.

It is definitely nice to have more room for UI elements though.
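
For reference, 1,728 is simply the width that keeps a 16:10 frame at 1080 pixels tall (a trivial sketch of that aspect-preserving downscale):

[CODE]
# Width needed to keep a given aspect ratio at a given height.
def width_for_aspect(height, aspect_w=16, aspect_h=10):
    return height * aspect_w // aspect_h

print(width_for_aspect(1080))  # 1728 -> the 1,728x1080 render size mentioned above
[/CODE]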
Aikouka is offline   Reply With Quote
Old 12-15-2012, 01:50 AM   #23
PrincessFrosty
Golden Member
 
Join Date: Feb 2008
Location: UK
Posts: 1,422
Default

Quote:
Originally Posted by Pia View Post
16:10 is "supported", usually, but a good number of games are optimized for 16:9 (example: Starcraft II) or as wide ratio as possible (example: most first person shooters). So you will generally see less if your screen is 16:10 as opposed to 16:9.
You're talking about vertical and horizontal FOV.

You get more in one direction or the other depending on both your aspect ratio and the behaviour of the game; some games are vert- and some are horz+.

With Horz+ you see a wider horizontal FOV on a 16:9 monitor, but if a game is vert- then a 16:9 monitor has more of the vertical field of view chopped off.

Some games (next to none) are anamorphic and add black bars for anything greater than 16:9, like Assassin's Creed for example; they constitute a tiny fraction of all games and are generally considered to be poorly designed for widescreen.
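
To make the Horz+ case concrete, here is a small sketch of the usual FOV math, assuming the game keeps the vertical FOV fixed and widens the horizontal FOV with the aspect ratio (which is how most Horz+ titles behave):

[CODE]
import math

# Horizontal FOV for a fixed vertical FOV, as a typical Horz+ game computes it.
def horizontal_fov(vertical_fov_deg, aspect):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# The same 60-degree vertical FOV on 16:10 vs 16:9:
print(round(horizontal_fov(60, 16 / 10), 1))  # ~85.5 degrees at 16:10
print(round(horizontal_fov(60, 16 / 9), 1))   # ~91.5 degrees at 16:9
[/CODE]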
__________________
Intel 2600k @ 4.9Ghz || ThermalRight TRUE Spirit 140
16Gb PC3-12800 || 2x MSI GTX580 Twin Frozr II in SLI
Dell 3007 WFP-HC 30" || BenQ XL2420T 24" 120hz

Last edited by PrincessFrosty; 12-15-2012 at 02:41 AM.
PrincessFrosty is offline   Reply With Quote
Old 12-15-2012, 06:09 AM   #24
Pia
Golden Member
 
Join Date: Feb 2008
Posts: 1,551
Default

Quote:
Originally Posted by PrincessFrosty View Post
You're talking about vertical and horizontal FOV.

You get more in one direction or the other depending on both your aspect ratio and the behaviour of the game; some games are vert- and some are horz+.

With Horz+ you see a wider horizontal FOV on a 16:9 monitor, but if a game is vert- then a 16:9 monitor has more of the vertical field of view chopped off.

Some games (next to none) are anamorphic and add black bars for anything greater than 16:9, like Assassin's Creed for example; they constitute a tiny fraction of all games and are generally considered to be poorly designed for widescreen.
Yes, I just didn't feel a need to pull the technical terms into it. Practically no new games are vert-, and for the rest, 16:9 is better than 16:10. Ergo, these days 16:9 is by far the best choice for gaming.
Pia is offline   Reply With Quote
Old 12-15-2012, 06:19 AM   #25
PrincessFrosty
Golden Member
 
Join Date: Feb 2008
Location: UK
Posts: 1,422
Default

Quote:
Originally Posted by Pia View Post
Yes, I just didn't feel a need to pull the technical terms into it. Practically no new games are vert-, and for the rest, 16:9 is better than 16:10. Ergo, these days 16:9 is by far the best choice for gaming.
That's not really true; there are still a lot of games that have issues with being vert-. You only need to go over to somewhere like the WSGF and check compatibility, and there are lots of problems.

1920x1200 can of course fit 1920x1080 inside it by simply setting the scaling in the drivers to 1:1, giving a perfect picture with black bars. So since 1920x1200 can do at least everything that 1920x1080 can do, and more, it's the better resolution.
__________________
Intel 2600k @ 4.9Ghz || ThermalRight TRUE Spirit 140
16Gb PC3-12800 || 2x MSI GTX580 Twin Frozr II in SLI
Dell 3007 WFP-HC 30" || BenQ XL2420T 24" 120hz
PrincessFrosty is offline   Reply With Quote