Ultrawide vs. probably the best-performing gaming monitor?

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Hello!

So I'm in a slight dilemma. I recently built a new rig. Following are the brief specifications:

i7 6700K
GTX 1080
16GB Gskill Rip Jaws V
Asus Maximus VIII Hero

Currently the monitor I'm using is an older one: the Dell S2240L. It's a 21.5-inch 1080p IPS panel. Basically, it was good 5 years ago.

I'm stuck in a dilemma between getting an ultrawide monitor or a monitor with both G-Sync and a higher refresh rate.

On one side I have shortlisted the Acer Predator XB271HU. Fantastic tech and most likely the ideal choice for gaming: G-Sync, 1440p, 165Hz, IPS.

On the other hand I have shortlisted the LG 34UC98 34" ultrawide monitor. (I unfortunately can't afford the Acer X34 or the Asus equivalent in ultrawide - too expensive.) Amazing curved design. A completely unique way of gaming, total immersion, something different. But - no G-Sync and only a 60Hz refresh rate, though good resolution.

I will primarily game on my rig, along with watching the occasional movie, some regular TV shows, and daily web browsing.

Please give me your opinions on which monitor is the smarter buy. Ideally for gaming, the Acer monitor is a no-brainer, BUT exactly how different and better is G-Sync with the higher refresh rate? Will it blow me away? Sometimes tech like this (for RPG gamers like me) doesn't seem to make a HUGE difference. Or will ultrawide, with its extreme immersion, be a better experience (including that amazing aesthetic value), even with just a 60Hz refresh rate?

Please give your detailed opinions on this, guys. I've been fretting over this a lot, and once I buy a monitor it will ideally be with me for the next 5 years.

Looking forward to your responses!
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Ultrawide is worth it for gaming only if you don't have to give up high refresh rate and G-Sync for it. It might be a smart compromise for single-player games if you could still get G-Sync. But with just 60Hz and no sync... no way. I'd suggest chugging along with the Dell until you can afford the X34 - best of both worlds. But if you can't wait, buy the XB271HU.

There are lots of different RPG games; some benefit from high refresh rates and smooth motion clarity, while in others you can have 30 fps and not notice anything wrong. If it's action RPGs you play, high refresh rate makes a bit of difference, but usually these games are easy to run locked to 60 fps with VSync as well. On the other hand, games like Witcher 3 and other immersive, graphics-heavy first/third person games will definitely benefit from G-Sync, especially at a high resolution like 3440x1440. A GTX 1080 is just enough to run demanding games at high settings at that resolution, but only with G-Sync.
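
To put a rough number on the refresh rate side of it: the time each refresh stays on screen is just 1000 divided by the refresh rate, so going from 60Hz to 144/165Hz more than halves it. A quick illustrative snippet, nothing monitor-specific:

Code:
# Frame time per refresh at common refresh rates: 1000 ms / Hz.
for hz in (60, 100, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per refresh")

#  60 Hz -> 16.67 ms per refresh
# 100 Hz -> 10.00 ms per refresh
# 144 Hz ->  6.94 ms per refresh
# 165 Hz ->  6.06 ms per refresh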
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Hmm, I see. So you suggest at least G-Sync with an ultrawide.

I was looking around a little more and found the Samsung CF791 - http://www.samsung.com/us/computing/monitors/curved/34--cf791-wqhd-monitor-lc34f791wqnxza/

This is a native 100Hz, 34" ultrawide. Any opinion on this?

Also on a sidenote - will I be able to hit Ultra settings at 60 fps on, let's say, Witcher 3 on these ultrawides?
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
@KillerBunnyB Oh, excellent find. The high contrast VA panel makes it brilliant for atmosphere and darker scenes. It has FreeSync which GTX 1080 won't be able to take advantage of, but the 100Hz refresh rate does help with avoiding most of the screen tearing - the tearing is still there but it's less visible because of the faster refresh rate.

If you want to enable FreeSync, you can sidegrade to AMD Vega. Depending on Vega's pricing, the switch won't necessarily cost you much after you sell the GTX 1080.

I'd say that monitor is at least worth a try.

Also on a sidenote - will i be able to hit ultra setting with 60fps on lets say, witcher 3 on these ultrawides?

Yes, with HairWorks off. See GamersNexus (scroll down to about halfway). A GTX 980 Ti reference gets 48.7 fps average and 37 fps 1% low on Ultra with AA and HairWorks disabled and SSAO instead of HBAO. The GTX 1080 is about 40% faster, which translates to roughly 68 fps average and 52 fps 1% low.
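
If you want the arithmetic behind that estimate spelled out (the ~40% uplift is a ballpark assumption, not a measurement):

Code:
# Scale the quoted GamersNexus 980 Ti results by an assumed ~40% GTX 1080 uplift.
results_980ti = {"average": 48.7, "1% low": 37.0}  # fps, 3440x1440 Ultra, HairWorks off
uplift = 1.40

for metric, fps in results_980ti.items():
    print(f"{metric}: {fps * uplift:.0f} fps")

# average: 68 fps
# 1% low: 52 fps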

I played Witcher 3 with custom medium/high settings and actually preferred it to the Ultra preset. I don't like the blurry after-effects at all; it just looks much better without motion blur and landscape DOF, and both take up a fair chunk of fps. I don't remember there being much difference between SSAO and HBAO. And I disabled in-game AA and used ReShade for SMAA and for adjusting colors to make it look more realistically (de)saturated. So there's a lot of performance headroom - you should be able to run it at about 80 fps average without noticing much of a decline in visual quality. Besides, it's the ultrawide screen that provides the wow factor.
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
This is great, really. Though it's a bit of a bummer, as I was hoping the 1080 could handle mostly 60 fps at this resolution.
Also, I've been using an IPS panel all my life. Will VA be better for me, or will it take some time to get used to?
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
@KillerBunnyB I've actually never used VA. But I doubt it will take any time getting used to. I've used mostly TN screens, and first time I looked at IPS it felt completely normal, just better. VA will probably feel completely normal, just with deeper blacks.

The main disadvantage with VA is that response time tends to be worse than with IPS and can lead to some ghosting. It really depends on the exact panel used, and I would think there's nothing to be worried about with a 100Hz panel.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Well, if you are cool with the crappy stand, the monitor I have, the Asus MX34VQ, does not have the annoying FreeSync flicker/limitations of the CF791.

I know you won't be running Adaptive Sync atm, but it's something to consider if you're looking in this ballpark. It's still UWQHD at 100Hz and a bit cheaper, but I suspect a smaller color gamut (~100% sRGB vs ~125%) and less curvature (1800R vs 1500R).
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
I see, I see. This monitor still boasts a pretty decent response time of 4ms.
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Hey this is actually quite a good find as well!! Is this a very new release?
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Very new. I'm still super stoked with it, hence posting about it on forums :p

In all seriousness, I was waiting on the CF791 also, but saw it had 2 Adaptive Sync ranges (80-100 & 48-100). The max needs to be at least 2x the min of the range for FreeSync's "LFC" to work (and I consider LFC important), but the CF791 has heaps of reports of "flickering" with the 48-100 range. So I had to find something else...
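
That 2x rule is easy to sanity-check against the two ranges, if it helps (just a rule-of-thumb sketch; the exact threshold the driver uses may differ slightly):

Code:
# LFC needs room to double a too-low frame rate back into the VRR window.
def lfc_capable(vrr_min, vrr_max):
    return vrr_max >= 2 * vrr_min

print(lfc_capable(80, 100))  # False - the 80-100 range is far too narrow
print(lfc_capable(48, 100))  # True  - 48-100 just clears the 2x requirement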

Edit: Continuing; if not using Adaptive Sync then I suspect the CF791 might have slightly better image quality (if the curve isn't too tight), so that might make the difference to you. But the MX34VQ seems to have the edge when using Adaptive Sync in the 50-80Hz range, and that's much more important to me. Plus there's the MX34VQ's better speakers & Qi charger vs the CF791's USB hub & better stand.
 

John Carmack

Member
Sep 10, 2016
160
268
136
The only advantage of VA is higher contrast ratios from the deeper blacks. Ghosting and tint/gamma/color shift when viewed even slightly off angle can be major drawbacks if you're used to IPS.
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Hey, so I have an Nvidia GTX 1080, so in any case I won't really be able to use FreeSync. I'm a little behind on all these "Sync" technologies. Could you educate me a little on this? Like, what is adaptive sync? Does it mean running at half the refresh rate?
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
I have been using a 21.5-inch IPS panel, so I'm assuming maybe, just maybe, the switch to a 34" VA panel won't trouble me too much.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
FreeSync is AMD's implementation of VESA's open Adaptive-Sync which is part of the DisplayPort standard. Adaptive-Sync allows the monitor to operate at a range of refresh rates (e.g. 35-120Hz) as opposed to a fixed refresh rate (e.g. 144Hz). It dynamically adjusts the monitor's refresh rate to match the in-game framerate, and synchronizes it so the frame update and the monitor refresh occur simultaneously. This eliminates screen tearing practically entirely and makes especially lower framerates look much smoother than they would using just the usual vertical sync or no sync.

G-Sync is NVIDIA's proprietary technology that does basically the same thing as FreeSync/Adaptive-Sync but costs more. G-Sync monitors use a physical G-Sync module in the monitor that does all the magic, while FreeSync/Adaptive-Sync is basically firmware-based.
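
If a toy example helps make the difference concrete (this is just an illustration of the timing idea, not how the actual driver/monitor handshake works):

Code:
# Made-up GPU frame completion times in milliseconds.
frame_ready = [0.0, 21.0, 35.0, 60.0, 71.0]

# Fixed 60Hz with VSync: a finished frame waits for the next 16.7 ms refresh tick.
fixed_60hz = [((t // 16.7) + 1) * 16.7 for t in frame_ready]

# Adaptive sync: the monitor refreshes when the frame is ready, as long as the
# resulting rate stays inside its supported range, so display time ~= ready time.
for ready, shown in zip(frame_ready, fixed_60hz):
    print(f"frame ready at {ready:5.1f} ms -> fixed 60Hz shows it at {shown:5.1f} ms "
          f"(waited {shown - ready:4.1f} ms); adaptive sync shows it at {ready:5.1f} ms")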
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
My History lesson:

Back in the day, CRT screens were run at the same frequency as the AC power you get from the wall, 50 or 60Hz depending on where in the world you were (later, more advanced CRT screens improved on this, but let's ignore that). For some reason, when LCD monitors came on the market and took over in the early 2000s, they decided to keep the US-standard 60Hz "refresh rate" (all over the world), and this particular aspect stagnated for a decade or so. Even today, if you read hardware reviews you will see many people use 60 frames per second as the default "playable", "ideal", or "good enough" number.

For a while, LCD monitors got bigger and higher resolution while reducing their "response times" (the time for pixels to change colors, usually given as a useless ideal number for marketing). But it took ages before response times were fast enough and the cables had enough bandwidth that someone said, "Hey, maybe we can do something other than 60Hz" - and high refresh rate LCD monitors were born. Since higher refresh monitors can make games feel smoother, they decided to run their monitors at 120Hz, or 144Hz, or 165Hz or whatever, and sell them for lots of money to gamers.

Now if you've ever played games watching a frame rate counter you'll notice the actual number (frames per second, fps) your PC can do jumps around; it depends on what you are looking at and what's happening and so on. But the fps your computer is running is almost never exactly the same as the refresh rate of your monitor, which leads to problems like screen tearing.

It took about another 5 years (?) for some bright spark, probably at nVidia, to think "Hey, why don't we just run the screen at the same speed that it's getting frames from the computer" (at least with respect to gaming, it was used to save power in laptops also). And BAM what I call "Variable refresh rate technology" was born.

nVidia uses a proprietary system which they call G-Sync; it was out first, needs special hardware to run, had a better feature set initially, and generally costs more.

AMD then wanted a slice of the pie, but instead of designing their own proprietary system they helped/pushed VESA towards adding "Adaptive Sync" to the DisplayPort standard, which everyone can use. Not only is AMD using the "Adaptive Sync" standard, but Intel has also expressed its intention of using it in the future.

So AMD's implementation of Adaptive Sync is called "FreeSync", and it has extra stuff like Low Framerate Compensation ("LFC") which was announced some time later. For technical reasons LCD displays are limited to a fairly strict range of refresh rates, so if the PC produces higher or lower fps than this range, the monitor cannot match refresh rates.

LFC helps where the fps is lower than the display's range and can double or triple the frames until they get back into the supported range (G-Sync does something similar in its hardware). Variable refresh tech is also useless when the PC is putting out more fps than the monitor can display, so in this situation, to match refresh rates you slow the PC down with something like VSync (which also works with non-variable refresh monitors), but this introduces input lag. Thankfully screen tearing is less noticeable at higher refresh rates, so there is less need to run VSync, which is good because VSync annoys me... The end.
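
If a code sketch of that LFC idea helps (purely illustrative, assuming a 48-100Hz range like the CF791's wider one; the real logic lives in the driver and monitor firmware):

Code:
import math

def effective_refresh(fps, vrr_min=48, vrr_max=100):
    if fps > vrr_max:
        return vrr_max                  # above the range VRR can't help; cap or use VSync
    if fps >= vrr_min:
        return fps                      # inside the range: refresh tracks fps 1:1
    repeats = math.ceil(vrr_min / fps)  # LFC: show each frame 2x, 3x, ... to get back in range
    return fps * repeats

for fps in (30, 45, 70, 120):
    print(fps, "fps ->", effective_refresh(fps), "Hz at the panel")

# 30 fps -> 60 Hz at the panel
# 45 fps -> 90 Hz at the panel
# 70 fps -> 70 Hz at the panel
# 120 fps -> 100 Hz at the panel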
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
@lehtv

I see, but in my case, since I have an Nvidia GPU, and I'll most likely take the ultrawide with FreeSync/Adaptive Sync in it, I'm not really going to reap any benefit of this smoothness, right? The only benefit I'll get is that the native refresh rate is 100Hz, so tearing might be less noticeable.
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Thank you, this was very helpful. Ideally, using my GPU with G-Sync would be the best option, but those monitors with G-Sync, especially in ultrawide format, are so damn expensive!

My best bet in ultrawide is to get a native 100Hz monitor AT LEAST. I don't think 3440x1440 at 60Hz will be a very enjoyable experience.

On the other hand, the Acer Predator XB271HU has it all, and would probably provide the best gaming performance on a 27" 1440p panel. But ultrawide has that great immersion. So I'm still a little in a fix. But you guys suggest that ultrawide is a game-changer, so let's see where I eventually end up.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Yep. FreeSync only works on AMD GPUs, G-Sync only works on NVIDIA GPUs.
 

KillerBunnyB

Junior Member
Mar 26, 2017
10
0
1
Guys,

since you're all quite well versed with ultrawide monitors, you have convinced me to purchase one as well. Most likely I'll go with the Samsung CF791 (if available in my region) or the LG 34UC98.

A small query: how do you scale YouTube/Facebook videos, TV shows, movies, and games to 21:9? Is it done automatically?

I use VLC media player for viewing all my movies and TV shows. I play mostly the latest games and I use Firefox as my browser. Any advice on this front regarding the software/tips you guys use?
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
@KillerBunnyB Videos should automatically play in their correct aspect ratio. So unless you have 21:9 content (like some movies), you'll have a black bar on each side when in full screen mode.
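
For the actual numbers on a 3440x1440 panel, it's simple aspect ratio arithmetic:

Code:
# A 16:9 video scaled to full height on a 3440x1440 screen.
screen_w, screen_h = 3440, 1440
video_w = screen_h * 16 // 9              # 2560 px
bar_each_side = (screen_w - video_w) // 2 # 440 px

print(f"video uses {video_w} of {screen_w} px; {bar_each_side} px black bar per side")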
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
As above, regular (16:9, "widescreen") video content has black bars on either side when in fullscreen mode, which is one of the reasons why a VA panel (with generally darker blacks) was important to me. It also means I appreciate movies with the "cinematic" aspect ratio, which fits 21:9 screens great. You could also stretch or crop the videos in VLC, but I wouldn't do that.

As far as gaming is concerned, I have yet to have any major problems. In a couple of situations the menus were weird, like in the wrong place or missing the edges. Also, depending on the game and settings, you could get black bars or the 16:9 image stretched to fill 21:9 (I've never seen this, but I've heard it happens). But mostly it works fine, even with many old games, and you get a bigger field of view, which far outweighs the minor issues (although ymmv). And I'm sure modern and future games will have full ultrawide support.
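
If you're curious about the FOV maths, here's a rough sketch assuming the game uses the usual Hor+ scaling (keeps the vertical FOV and widens horizontally; most modern titles do, some don't):

Code:
import math

def hor_plus_fov(hfov_16x9_deg, target_aspect=3440 / 1440):
    # Convert a 16:9 horizontal FOV to its vertical FOV, then widen to the target aspect.
    vfov = 2 * math.atan(math.tan(math.radians(hfov_16x9_deg) / 2) * 9 / 16)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * target_aspect))

print(round(hor_plus_fov(90)))  # ~107 degrees instead of 90 on a 3440x1440 screen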

And as far as browser-based videos are concerned, I rarely run them fullscreen. In fact, with a 34" ultrawide I never feel the need to have a browser wider than half the screen, and I've watched many hours like this. Even inside a browser which is half the width of the screen, YouTube videos will run 14-15" diagonal (16:9), which is bigger than many (most?) entire laptop screens...