Are you planning to buy a 4K monitor for gaming soon?


I’d like to know if you are planning to purchase a 4K monitor in the near future

  • I plan to buy a 4K monitor within the next month or so.

  • I am waiting for the cost of 4K monitors to decrease.

  • I will need to upgrade my GPU(s) first.

  • I do not have plans to purchase one at this time.

  • I already have a 4K display.



rickon66

Golden Member
Oct 11, 1999
1,824
16
81
Just bought an Acer 32" 4K to replace a Dell U3011. My man cave is now equipped with a Samsung 50" 4K TV and the new 4K monitor. Even with all the whining about black levels, refresh rates, etc., I am a happy camper!!
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I share your sentiments 100% when it comes to LCD, but I have to take exception with the plasma IR point. People have been saying that "IR isn't an issue on recent plasmas" for years. They said it when I got a V20 plasma in 2010, and the V20 had plenty of IR. They were still saying it when I got an ST50 in 2013, and what do you know, that one had IR issues too. What's more, that model now had a pixel orbiter and a scrolling-bar tool, so clearly even Panasonic acknowledged that it was still an issue. I'd be hard-pressed to believe that plasmas suddenly overcame this hurdle in the last couple of years. To me "IR isn't an issue on recent plasmas" was always as hollow as "blacks are fine on recent LCDs", etc.

It's hardly worth discussing plasmas as an alternative to LCD anymore, as they are no longer being manufactured. In smaller markets (like my country), the remaining plasma stock has already dried up. Part of me laments the loss of some fairly capable TVs, but on the other hand plasma remained pretty stagnant since ~2007 (how many years was the Kuro king?) and was probably due for retirement, just like LCD is.

Hopefully we can all agree that OLED needs to hurry the hell up and get here to stay.

Though on the note of plasma tech stagnating, that had more to do with manufacturers having to go back to the drawing board around the time the Kuro retired in order to get power usage down for newer Energy Star standards. Samsung and Panasonic had finally just about caught back up with this latest/last round of models while being more power efficient.

anyway, 4K OLED please.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No they don't. Not a single one.


Plasma failed because the panels cost too much to make, yet it is by far the best overall display tech we've seen yet. Black levels are everything for display quality; without a proper dynamic range it doesn't matter how good the display is in every other respect.

Black levels are not the only thing plasma was better at: better real-world measured contrast ratios, better viewing angles, better motion, better screen uniformity, better colour accuracy. I agree that if we take a step back in terms of overall IQ, then all LCD/LEDs are inferior to Panasonic/Samsung plasma. This point is even more striking when we consider that TV LEDs now use local dimming, which is superior to every PC monitor.

However, for PC gaming plasma has some limitations, such as higher-than-average input lag and image retention. Although, contrary to popular belief, LCDs also get image retention. I personally experienced it with 2 units, one of them from Samsung - a reputable LCD manufacturer.

It's really a shame that plasma is dead, because for a home theater setup it completely blows LCD/LED away, unless your room is very bright, in which case plasma glare ruins everything.

Maybe 32" QNiX/BenQ 2560x1440 VA panels are a good stop gap until OLED 4K but the current models still do not have FreeSync/GSync sorted out. Also, some of those 34" LG/Samsung 3440x1440 21:9 look really nice as a stop-gap solution.

As far as 4K on a 24" monitor, I strongly disagree. The extra immersion of a 32-37" 2560x1440/4K monitor is a way bigger benefit than the slightly crisper PPI of a 24" 4K monitor. The goal of realism is to make video games as close as possible to reality. You simply cannot achieve that on small monitors, where everything in the game you see in front of you is small. There is a reason we go to the movie cinema: a 50" 8K TV cannot match the immersion factor of a high-quality projector. When you are playing a racing game or a fast-paced FPS, it will be very difficult to resolve the difference between a 24" 4K and a 1440p monitor, but place either of those next to a 32-37" high-PPI monitor and it's all over for the small monitors. Believe it or not, one of console gamers' major complaints about PC gaming is playing on a small screen in an office chair.

For those hoping for a 120Hz 4K monitor, I can't imagine what graphics and CPU setup could pull that off in the next 5 years. At least with FreeSync/GSync we will get more acceptable smoothness at low FPS at 4K, given the large demands on the graphics sub-system. However, 120 fps on a 4K monitor with everything maxed out in a next-gen game sounds like a total pipe dream for a long time.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Hopefully we can all agree that OLED needs to hurry the hell up and get here to stay.

Though on the note of plasma tech stagnating, that had more to do with manufacturers having to go back to the drawing board around the time the Kuro retired in order to get power usage down for newer Energy Star standards. Samsung and Panasonic had finally just about caught back up with this latest/last round of models while being more power efficient.

anyway, 4K OLED please.

$27,039.49
$26,999.99 + $39.50 Environmental Handling Fee

http://www.bestbuy.ca/en-CA/product...spx?path=6cb86eccc7fb3b530899c632bb8c68befr02

[Image: Best Buy listing photo of the 77" TV]


IF ONLY I HAD THE MONEY FOR THAT.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Black levels are not the only thing plasma was better at: better real-world measured contrast ratios, better viewing angles, better motion, better screen uniformity, better colour accuracy. I agree that if we take a step back in terms of overall IQ, then all LCD/LEDs are inferior to Panasonic/Samsung plasma. This point is even more striking when we consider that TV LEDs now use local dimming, which is superior to every PC monitor.

That's actually what I like about my plasma: the motion seems just a bit more fluid. Hard to describe, but something about it seems a little more life-like to me. And I don't feel the black levels are really much worse on my LCD TVs; in fact, I never think about it when switching between the plasma and LCD. But I do think I can pick out detail in the dark parts of the picture a bit better on the plasma. I give the plasma a slight edge overall, but an edge nonetheless.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Realism that's shattered by pixelation (if you're really including 32-37" 2560x1440 displays) and aliasing. No thanks, I'll take quality first, and then quantity can catch up later. Aliasing has never been more of a problem, with modern games crawling with masses of shader and transparency aliasing, while at the same time AA support is hopeless on DX10+. I prefer to minimize this with the finest pixels I can get, rather than have 4k of coarse pixels - it's not like I'll have tons of GPU headroom to spare on AA at that res, even if the game does support it properly.

This kind of pixel pitch really needs to be seen to be appreciated. It's a world of difference vs 2560x1440 at 27".

You aren't taking the viewing distance from the monitor into consideration in your comments. Someone using a larger 32-37" monitor is very unlikely to sit just 40-50 cm away from it unless they want to induce a major headache and ruin their eyesight. Your comparison of PPI is only valid if the human eye can actually resolve the extra pixels, yet based on your comments it seems to me that you are comparing PPI in isolation and concluding that a large monitor with a lower PPI will have more pixelation and thus inferior IQ.

For example, I just measured my real-world viewing distance at my desk with my 15.6" 1080P laptop -- the screen sits 60-65 cm from my eyes (the screen is tilted away from me, so the distance varies, with the closest point near the bottom of the screen by the keyboard). This "low-PPI 1080P" screen becomes Retina at 61 cm:
http://isthisretina.com/

Therefore, for my personal usage, I would not be able to resolve any extra detail going from a 15.6" 1080P screen to a MacBook Pro Retina 15", never mind 4K on such a small screen. Essentially, 4K on a 15.6" laptop would be worthless for my usage model. However, some gamers might game 40-50 cm away from a 15.6" laptop, and for them it makes sense to get a higher-PPI laptop.

My desk is 80 cm in width and my monitor sits at the very edge of it, add another 10-15 cm at least from the edge of the desk to my face. Below is an example of an 80 cm desk like mine with the monitor near the edge. You can see there would be at least 70 cm from the monitor to the end of the desk, not even accounting for the additional space between the desk and the person's face.

[Image: example of an 80 cm desk with the monitor near the back edge]


32" 2560x1440 becomes Retina at 94 cm.
27" 2560x1440 becomes Retina at 81 cm.
40" 3840x2160 becomes Retina at 79 cm
37" 3840x2160 becomes Retina at 74 cm.
32" 3840x2160 becomes Retina at 64 cm.


vs. your example:

24" 3840x2160 becomes Retina at just 48 cm.

What about immersion and the size of objects on the screen? Look at just how much harder it would be to resolve the background details of ducks and tree branches on a tiny 23-24" monitor vs. a larger 30"+ monitor.

[Image: the same scene compared across different screen sizes]
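
To put a rough number on the immersion side of the argument, here is the same kind of back-of-envelope math for field of view: how much of your vision a 16:9 screen of a given diagonal fills at a fixed desk distance. The 70 cm viewing distance is my assumption, taken from the desk example above.

```python
import math

def horizontal_fov_degrees(diagonal_inches, distance_cm):
    """Angle subtended by the width of a 16:9 screen at a given distance."""
    width_cm = diagonal_inches * 2.54 * 16 / math.hypot(16, 9)
    return math.degrees(2 * math.atan(width_cm / 2 / distance_cm))

for size in (24, 32, 40):
    print(f'{size}" at 70 cm: ~{horizontal_fov_degrees(size, 70):.0f} degrees')
```

At 70 cm a 24" screen fills roughly 42 degrees of horizontal view, a 32" about 54, and a 40" about 65 -- half again as much as the 24" at the same distance, with no loss the eye can resolve if both are past their Retina distance.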


I will never game at only 48 cm from my desktop monitor, as that's vomit-inducing and too close. Therefore, for my intended purposes I could go as large as 40", or even 42", 4K, and my eyes would not be able to tell the difference between a 42" 4K and a 24" 4K monitor. However, a 32-42" 4K monitor will make productivity in Word/Excel A LOT easier on the eyes vs. a tiny 24" 4K one, which would be an absolute eye strain in Windows for anything other than games. In addition, a 37-40" 4K monitor would be A LOT more immersive without adversely affecting IQ through pixelation as you have implied in your post, because the human eye would be unable to resolve the extra PPI at comfortable viewing distances.

I am not saying everyone should get a larger desk -- maybe they simply can't due to space constraints -- but comparing monitors solely on PPI as you have done is grossly misleading, since it misses the most important component: the ability of the human eye to actually resolve those pixels at the total viewing distance to the screen. It's the reason why 50" 4K TVs positioned 10 feet away in the living room are a worthless marketing gimmick. In other words, the human eye is not good enough to tell the difference between a 50" 4K and a 50" 1080P display at 3 meters/10 feet. Now, it's true that the latest LCD/LED tech (such as quantum dots and local dimming) will make its way into 2015 4K screens, while budget panels with inferior black levels, colour accuracy, etc. will become the 1080P models, essentially making the panels of 4K TVs superior to begin with -- but not due to higher PPI.

It's not a given that a higher-PPI screen will magically improve pixelation and IQ, because the human eye has finite capabilities. Couple this with poor Windows PPI scaling and it's not surprising that so many PC gamers think a 24" 4K monitor is not particularly enticing as an upgrade. I would presume that 64 cm is a very reasonable minimum viewing distance for a gamer with a 32" 4K monitor, in which case it's already Retina to begin with, which is why so many PC gamers are looking at 32" and above for their 4K monitor upgrade.
 

CP5670

Diamond Member
Jun 24, 2004
5,681
782
126
There are some VA panels that aren't terrible, but "very good" is a stretch. Some TVs have local dimming, but no monitor does, and every single IPS monitor out there, regardless of price, has crap black levels. The best ones don't break a 1000:1 contrast ratio, and most have IPS glow in the corners.

I find IPS to be overrated for this reason. Their image quality is no better than good TNs on dark colors, which is the main factor that matters in most games, while they exhibit significantly more motion blur. The only thing in their favor is the better viewing angles.

I'd only do 4K, like many others here, when we get 4K, 120Hz, IPS/OLED, with low input lag at a reasonable price. Yes, I know, asking the impossible. But I'm one of the guys who held on to my FW900 for a long time, and nothing beats that monitor for twitch gaming.

I used a 2070SB CRT for many years and only took it out of service about a year ago when I discovered that my LCD had a strobing backlight mode for removing motion blur. The CRT's screen size is quite small by today's standards, but it still handles blacks better than any LCD and looks much better in a dark room.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I find IPS to be overrated for this reason. Their image quality is no better than good TNs on dark colors, which is the main factor that matters in most games, while they exhibit significantly more motion blur. The only thing in their favor is the better viewing angles.

Even the best TN panel, the ROG Swift, cannot hope to come close to the best IPS panels in colours and black levels. TN has either grey blacks or dark blacks that make details disappear. Here is the $800 ROG Swift (left) against a budget $300 IPS monitor (right).

[Image: ROG Swift (left) vs. $300 IPS (right), dark-scene comparison 1]

[Image: ROG Swift (left) vs. $300 IPS (right), dark-scene comparison 2]


That's the best $800 TN panel ever made right there, beaten by a $300 IPS screen. ^_^

We shouldn't have a situation where an $800 screen is worse than a $300 screen in colours and black levels! That's unacceptable. That's why so many are waiting for 4K + IPS/VA.

You also mention viewing angles but kinda gloss over them as a minor point, yet they pretty much wipe out colour and black levels on a TN.

[Image: TN panel colour and contrast shift at off-center viewing angles]
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
$27,039.49
$26,999.99 + $39.50 Environmental Handling Fee

http://www.bestbuy.ca/en-CA/product...spx?path=6cb86eccc7fb3b530899c632bb8c68befr02

[Image: Best Buy listing photo of the 77" TV]


IF ONLY I HAD THE MONEY FOR THAT.

You get that, you could skip out on the movies, wait for the DVD releases, and just have people pay you to watch those movies on your screen.

77'' is just massive. The wall my computer and desk are against is nearly that wide, but open the front door and there goes visibility. I'd board that up in a second and go out the side door from now on; it would work, as my couch is 12 feet away.
 

amenx

Diamond Member
Dec 17, 2004
4,555
2,893
136
Visual impact and immersion (for me at least) with a decent-quality display comes first and foremost from SIZE. I've been on a 27" 1080p monitor in the past, and pixels were indeed visible at the 20-24" I normally sit from it (mostly with text and icons). Throw a game or movie at it and pixelation is all but forgotten, or even excused. NEVER would I have traded that for a 23" display with 'proper PPI'. Ask AdamK47 and the few others here who have the BenQ BL3200PT (32" 2560x1440) what their experience with it is like and whether they would trade it for a smaller 4K monitor, and see what they say. Although I am sure they may be interested in 4K at a similar size and above. For entirely desktop usage, the argument would certainly be a bit different for me, though.
 

CP5670

Diamond Member
Jun 24, 2004
5,681
782
126
Even the best TN panel, the ROG Swift, cannot hope to come close to the best IPS panels in colours and black levels. TN has either grey blacks or dark blacks that make details disappear. Here is the $800 ROG Swift (left) against a budget $300 IPS monitor (right).

...

That's the best $800 TN panel ever made right there, beaten by a $300 IPS screen.

We shouldn't have a situation where an $800 screen is worse than a $300 screen in colours and black levels! That's unacceptable. That's why so many are waiting for 4K + IPS/VA.

You also mention viewing angles but kinda gloss over them as a minor point, yet they pretty much wipe out colour and black levels on a TN.

There is not much difference in the blacks in those images. The IPS has better midtones and its overall image looks more vibrant, but the blacks are equally poor -- look at the sky in the first image. Both TN and IPS monitors show black as gray or blue shades, which is especially obvious when gaming in a dark room like I usually do. VA displays have significantly better blacks but seem to be rare these days. Also, the Swift is hardly the "best TN panel ever made." It has good features and electronics, but the panel itself is fairly average. This review shows its black level and contrast ratio compared to some other monitors.

The viewing angles are indeed a point in favor of IPS, but I find the off-angle shifts less annoying than motion blur, especially at 120Hz in older games that can maintain a constant 120fps.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
I'm waiting until there is a game worth upgrading for. As it stands, I'm struggling to buy PC games over console.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Nope. I'll stick with my Swift until 4K offers 144Hz and *sync, as well as GPUs that can push the kind of monitor I want.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
We shouldn't have a situation where an $800 screen is worse than a $300 screen in colours and black levels! That's unacceptable. That's why so many are waiting for 4K + IPS/VA.

You also mention viewing angles but kinda gloss over them as a minor point, yet they pretty much wipe out colour and black levels on a TN.

IPS and TN panels both top out in the 1000:1 contrast range, which means they have comparable black levels. In fact, I think TN panels in general have slightly better contrast ratios (still absolute garbage, though). VA panels aren't awful in this area (generally 2x-3x better contrast ratios), and I'm glad they're gaining in popularity, though they have their own set of issues. Some rough numbers on what those ratios mean for black levels are sketched below.
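
As a quick sanity check on what those ratios imply, the arithmetic is simple: black level is just white luminance divided by contrast ratio. The ~120 cd/m2 calibrated white point below is my assumption for illustration, not a measurement from any specific panel.

```python
WHITE_LUMINANCE = 120  # assumed calibrated white point, in cd/m2

# Black level emitted by a panel at a given static contrast ratio.
for panel, contrast in [("TN/IPS", 1000), ("VA (2x)", 2000), ("VA (3x)", 3000)]:
    black = WHITE_LUMINANCE / contrast
    print(f"{panel} at {contrast}:1 -> black level ~{black:.3f} cd/m2")
```

So an IPS/TN "black" emits around 0.12 cd/m2, while a 3000:1 VA panel sits near 0.04 cd/m2 -- a third of the light, which is exactly the difference you notice in a dim room.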
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
These days I'd buy a really big projector first.

My wife, who is the definition of the non-technical end user (read: the buying public), is a big proponent of that. She'd give up some quality in a heartbeat for a big screen that isn't there when not in use.

When wondering why the industry does the things it does, it's important to remember that things like contrast and black levels go totally over the heads of the vast majority of people whipping out credit cards to buy this stuff. If it's not grainy and the colors are OK, it's good enough for the masses. Bring on the cheaper and thinner for them, I guess.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I feel like DPI is becoming the subject of the next big "the human eye..." myth, replacing the old "the human eye can't distinguish frames above __fps" bollocks.

The comparison isn't even remotely close. Maybe some people on the internet claimed that the human eye cannot see beyond 60 fps, but which credible scientist did?

"The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.
"

Source

You are drawing a parallel between one myth (the 60 FPS limit), which was basically made up on the Internet, and something based on actual scientific knowledge (the limited ability of the human eye to resolve PPI beyond a certain viewing distance).

You don't need to be able to distinguish between individual pixels to be able to see their cumulative effect in the form of shimmering/aliasing. Increasing DPI beyond retina levels still has a significant effect on IQ.

How can you spot differences in aliasing/pixelation IQ if you cannot spot the differences in the individual pixels beyond the limits of the human eye? If you had 2 screens showing motion far enough away from you, beyond a high enough PPI, you wouldn't be able to spot the difference. Are you claiming to be superhuman and that you can? Even professional reviewers who review TVs every day for a living (CNET, AVS) can't tell the difference on a small 4K TV at a far enough viewing distance. There is a reason binoculars and magnifying glasses were invented: to go beyond the limits of human eyesight.

I feel like DPI is becoming the subject of the next big "the human eye..." myth, replacing the old "the human eye can't distinguish frames above __fps" bollocks. It's missing the point in a similar fashion; No, the human eye isn't scanning/processing 120 individual frames per second in their entirety, but you can still see the cumulative effect they have on fluidity, and no, the human eye isn't distinguishing individual pixel structures at 190dpi, but you can still see their cumulative effect in the form of aliasing, and there is much room for improvement.

If every PC gamer on our forum was hypothetically offered a free 32" 4K monitor vs. a 24" 4K one, which one do you think they would take, assuming they have the desk space to accommodate the larger screen? You are one of the few people on our forum who think high PPI trumps large screen size even when that PPI is beyond the capabilities of the human eye. Next you are going to tell us that PC monitors aren't good enough until they match the Note 4's PPI of 515? Where do we stop exactly -- 16K, 32K?

PPI is an absolutely worthless metric without considering viewing distance. Make a 100-inch screen with a 1-million-PPI rating, place it a football field away from you, and tell me you can resolve the extra detail or the reduced aliasing.

I am inclined to believe professionals who study and teach human eyesight at top universities rather than some opinion on how the human eye works posted on the internet.

"There's going to be some density beyond which you can't do any better because of the limits of your eye," said Don Hood, a professor of ophthalmology at Columbia University, in a phone interview with NBC News. "And consumers will soon realize that they aren't seeing much, if any, visual resolution and sharpness improvements," Soneira continued. The sets will likely be better than today's in other ways, he noted, "but the higher pixel count will not be the reason."
http://www.nbcnews.com/tech/gadgets...hones-surpass-limits-human-vision-f2D11691618

[Image: NBC News graphic on pixel density vs. viewing distance]


The reason manufacturers are pushing 4K on small TVs is that specs sell, like those worthless contrast-ratio or refresh-rate specs listed for LCDs that have nothing to do with real-world measured performance. If in your mind higher PPI is always better, that's your choice, and manufacturers will be glad to sell you an 8K 24" monitor in 10 years. Beyond a certain point, screen size creates an immersion factor that trumps increased PPI specs. One good way to test this is to use a 32" 4K monitor for 6 months and then force yourself to use a 15" 3K laptop screen for work. Try that and see how it actually works out in the real world. You might have a change of opinion regarding higher PPI vs. screen size.

Additionally, if PPI/pixel crawling is bothersome, one can enable super-sampling, DSR/VSR and the various AA modes we have available, but we can't just magically increase the screen size from 24" to 32-42". A larger monitor provides a lot more flexibility in this regard: if you want the effect of higher PPI, enable SSAA, whereas with a smaller monitor you are always stuck with a small screen in front of you. The picture in post #83 shows how everything in the image on a 23" TV is just smaller.
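
For what it's worth, here is a sketch of the supersampling arithmetic: DSR/VSR render the game at a multiple of the native pixel count and then downscale, which attacks aliasing the same way finer physical pixels would. The helper below follows NVIDIA's DSR convention of the factor multiplying the total pixel count; it's an illustration of the math, not any driver's actual code.

```python
def dsr_render_resolution(native_w, native_h, factor):
    """Render resolution for a DSR-style factor (factor scales pixel count)."""
    axis_scale = factor ** 0.5  # 4x the pixels = 2x along each axis
    return round(native_w * axis_scale), round(native_h * axis_scale)

for factor in (1.5, 2.0, 4.0):
    w, h = dsr_render_resolution(2560, 1440, factor)
    print(f"DSR {factor}x on 2560x1440 -> render at {w}x{h}")
```

DSR 4x on a 32" 1440p panel means the GPU samples a 5120x2880 image for it, so you get the anti-aliasing benefit of finer pixels while keeping the large screen.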
 