Why is everyone obsessed with 4K?


Gryz

Golden Member
Aug 28, 2010
1,551
204
106
It's coming and it's not 3-D.
Sure. I don't think anyone disagrees that it's coming.
The disagreement is that some people think 4K is the best choice today, for everyone.

I say buy what you can afford and not poo-poo new tech just because it's outside of your price range.
Whether you can afford a little, or a lot, you still have to make choices. Price is not really a factor. There is no perfect monitor. Not for any budget.

I don't think people are dissing new tech. I think people are getting a little irritated with some individuals telling everyone they should go for 4K, regardless of whether it's the best choice for that individual or not. Even I get irritated by that sometimes.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
I'll admit it's my fault for bringing up the TN vs IPS differences, but it was done for a reason. I hate when people waste money on superfluous technology like 4K TN panels. It lets manufacturers get away with overpriced, crappy products, slowing down progress toward where we really ought to be aiming (4K IPS @ 120Hz).

What has my "panties in a bunch" is the bad advice doled out by some of you here who just look like you're trying to justify purchasing an inferior product you're now stuck with (if you had only waited a month or two, Shmork, that must hurt).

Back on topic. The OP asked about 4K.

The answer is that 4K is not ready for the majority of games or gamers today (Steam stats aside) *if* they want the best gaming experience available.

The best gaming experience today with a traditional gaming monitor is with the 1440P IPS 144Hz panels with either G-Sync or FreeSync, the monitors I linked above. I'm sure many other manufacturers will soon have monitors out with similar capabilities, so it may be worthwhile waiting a few months; if anything, that should drive the prices of existing panels down further.

These panels have no viewing-angle issues (the biggest problem with TN), generally better colour and contrast, and you can really crank up the settings in games with just *one* high-end video card. But even compared to 4K non-TN panels (are there even any non-TN 4K panels with FreeSync or G-Sync?), those are still *limited to 60Hz*, which is a real bummer.

If you think going from 1080P to 4K is a big jump, just try going from 60 to 120Hz. Gaming becomes much more responsive, fluid and immersive. Also, with the help of the new variable refresh (G-Sync/FreeSync) technology, even games that can't hold 120Hz constantly will drop in framerate but still be handled smoothly and stay well over 60Hz.

Add all of this up and it makes for a better gaming experience over what is available in 4K land today.

I'm partial to FreeSync panels; they are on average $200 cheaper, and they can operate at lower refresh ranges to help smooth out lower framerates (which helps with slower video cards). Plus you get the regular barrage of ports to connect other devices. G-Sync is nice in that it bypasses the panel's scaler, so you can get slightly more responsive gaming, but the differences are pretty small and not really worth the extra money. Also, you get one DisplayPort input, and that's it.

Alcoholbob, I'm sorry you experienced flickering on your G-Sync monitor; that's not a common theme from what I've heard. It sounds to me from your experience that you really just wanted a larger panel, which is understandable, especially for 4K. As for lower quality textures, what are you talking about here?

There are other misconceptions people are spouting here that irk me. The texture quality isn't magically better because of increased resolution. Game assets such as surface textures are authored at a specific size (most likely to fit 720P/1080P, since that's what the consoles are limited to). Just because you scale up the resolution doesn't make the textures better. This is a fallacy I keep reading and it boggles my mind.

Please refer to this thread for more info on that subject.

"http://www.neogaf.com/forum/showthread.php?t=792842"

2is: I relent, yes, you're right, the inferior technology (TN) is faster than IPS when measured at the same Hz. Well, actually there's only really one panel on the market that is faster, which is the Asus Swift, and only by a bit, an advantage that only the most hardcore tourney players would likely care about. So perhaps you shouldn't be comparing these two different technologies, since it's apples to oranges, right? (TN vs IPS) ;D

I'll be the first to ditch my 1440P panel for a 4K panel once 4K panels offer a better gaming experience. If I had to make a guess we're a few years away from that.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
@Madpacket: You post your opinion as fact way too much. There are several subjective differences between all these monitors, and none of them is "best".
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
I hate when people waste money on superfluous technology like 4K TN panels. It lets manufacturers get away with overpriced, crappy products, slowing down progress toward where we really ought to be aiming (4K IPS @ 120Hz).

The answer is that 4K is not ready for the majority of games or gamers today (Steam stats aside) *if* they want the best gaming experience available.

These panels have no viewing-angle issues (the biggest problem with TN), generally better colour and contrast, and you can really crank up the settings in games with just *one* high-end video card. But even compared to 4K non-TN panels (are there even any non-TN 4K panels with FreeSync or G-Sync?), those are still *limited to 60Hz*, which is a real bummer.

Fair enough, I see the point you are trying to make. Seems like you really dislike TN panels, which is fine. I don't think the viewing angles are that bad or even really relevant considering people don't view content positioned far below, above, or to the side of the screen.

For TVs, sure.

It seems like you are done with 60 Hz, and that is also fine. I like 4K and see some value in the "downgrade" from 144Hz to 60Hz G-SYNC gaming. I lived with a 27" Yamakazi IPS @ 60 Hz for 4 years. The 144 Hz really didn't do that much for ME, but I understand some people can't play without it.

Thing is, it may be a while until we start seeing ultra-responsive 144 Hz 4K models, as the bandwidth requirements for 4K content are demanding, to say the least. It is likely that new DisplayPort tech is needed, which means current graphics cards would not be compatible. So current hardware wouldn't cut it anyway. :biggrin:
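
To put rough numbers on that bandwidth point, here is a back-of-the-envelope sketch. It only counts raw pixel data at 24-bit colour and ignores blanking overhead; the link figures are the approximate effective data rates after 8b/10b encoding (roughly 14.4 Gbps for HDMI 2.0, 17.28 Gbps for DP 1.2 and 25.92 Gbps for DP 1.3):

```python
# Rough uncompressed bandwidth needed for 3840x2160 at 24-bit colour,
# compared against approximate effective link data rates (blanking ignored).
def needed_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

links = {
    "HDMI 2.0 (~14.4 Gbps)": 14.4,
    "DP 1.2 (~17.28 Gbps)": 17.28,
    "DP 1.3 (~25.92 Gbps)": 25.92,
}

for hz in (60, 120, 144):
    need = needed_gbps(3840, 2160, hz)
    fits = [name for name, cap in links.items() if need <= cap] or ["none of the above"]
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbps -> {', '.join(fits)}")
```

By this rough math, 4K@60 squeezes into DP 1.2 (and just barely HDMI 2.0), 4K@120 needs DP 1.3, and 4K@144 at full 24-bit colour exceeds even DP 1.3 without tricks like chroma subsampling, which lines up with the "new DisplayPort tech" point above.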
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
The reason I LOL'd at you is because you said "AA should not be needed, if the games were designed with 4K in mind."

The primary function of AA (anti-aliasing) is to eliminate jaggies (aliasing) on geometry, so your initial premise is completely false and indicated to me that you know nothing about the topic. Higher resolution textures don't do anything for jaggies (aliasing), but rendering at a higher resolution does. Even if you meant to say "textures", you would still be wrong. Some games come with optional 4K (true 4096x4096) and even 8K texture packs direct from the developer, and others have active modding communities dedicated to that very purpose. You should try it sometime as it looks incredible. The 4K era of gaming is upon us, you just clearly don't understand "how to even 4K".



That's the problem with only using Wikipedia as a source and then cherry-picking what you want to see. To be clear, you are focusing on 4K DCI, which is not the same as 4K textures and image sensors (4096x4096), or 4K (UHD) televisions.

As shown in that very wiki article, 4K DCI allows many other resolutions and aspect ratios to qualify as 4K DCI, up to a MAXIMUM resolution to ensure that all theaters are capable of achieving it. It also established these MAXIMUM values so that directors/DPs can set target resolutions for their desired aspect ratio in post-production. I'm sure that you have noticed that many films are shot 'scope at 2.35:1, some are 2.20:1, some are even 16:9, like TV shows. "pixels are cropped from the top or sides depending on the aspect ratio of the content being projected."

You will rarely - if EVER - see any movie image in the theater displayed at 4096x2160, it's almost always cropped in one or more ways. This is why consumer TVs and monitors use the 16:9 format - it began as a middle ground between 4:3 and 2.35:1 so that home viewers could still get decent height from 4:3 images and decent width from 2.35:1 images and everything in between. Now that television has moved to 16:9 almost exclusively and more and more people are watching movies at home, we're seeing more 21:9 displays (2.34:1) being released.

What that wikipedia article doesn't tell you is that the Consumer Electronics Association (CEA) itself defines 4K and UHD as being the same thing where 4K qualifies for any television capable of displaying a MINIMUM of 3840x2160.

"The group also defined the core characteristics of Ultra High-Definition TVs, monitors and projectors for the home. Minimum performance attributes include display resolution of at least eight million active pixels, with at least 3,840 horizontally and at least 2,160 vertically. Displays will have an aspect ratio with width to height of at least 16 X 9. To use the Ultra HD label, display products will require at least one digital input capable of carrying and presenting native 4K format video from this input at full 3,840 X 2,160 resolution without relying solely on up-converting."
https://www.ce.org/News/News-Releas...lectronics-Industry-Announces-Ultra-High.aspx



As I posted previously, you are free to buy commercial-grade 4K DCI equipment and enjoy your own version of a proper 4K DCI experience. I know that I can't afford it, but maybe you can.


When consoles can do true 4K (no upsampling), then it will be mainstream, and all games will be designed for / around it.


I have said earlier in the thread I want 4K+ resolution, 16:10 ratio, 120 / 144 Hz refresh rate.

This UHD-1 *is* a janky implementation; we are just getting gussied-up TV leftovers.

Not many realize 16:10 is superior, and the reason why.

https://en.wikipedia.org/wiki/Golden_ratio


4K vs UHD for those interested:

http://www.extremetech.com/extreme/174221-no-tv-makers-4k-and-uhd-are-not-the-same-thing
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
The next console cycle will probably target some kind of 4K@60Hz support; that's likely to be 5-7 years from now, and by that time 4K TVs will be mainstream and 1080p will be getting phased out.

However, consoles won't make use of 4K for most of the major AAA titles in that generation; it won't be until the generation of consoles after that which strongly adopts that resolution. It's almost certainly going to mirror the 1080p support we have now: it was possible last gen but not used, it's possible this gen and used intermittently in major titles, and some are lower like 920p or whatever daft res they use.

But with PC gaming it's realistic to do it now, just not on a tight budget. For many as old as me, and even older, this is just another step in screen resolution that we've made many times over the years, part of an established pattern of displays continually getting better. The only real debate here is whether people can individually justify the move; many can't, and that's fine, but the tech is undeniably better.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
When consoles can do true 4K (no upsampling), then it will be mainstream, and all games will be designed for / around it.

It is not a matter of when they can, but of when the devs decide it is worth doing. They can do 4K now. The devs would just adjust the graphics they give you to accommodate it.

4K is doable now on a PC, you just have to adjust your graphics to accommodate it. The question is, what is the best balance of graphical settings and resolution?

The devs won't likely target 4K on consoles until most people have 4K TVs, and even then, they might decide the trade-offs in graphics are not worth the resolution.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
My crazy theory is that the next gen consoles will be for 4K / VR, and powered by some AMD Zen / Fury graphics beast; unless nVidia gets a console chip and shakes things up.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
It is not a matter of when they can, but of when the devs decide it is worth doing. They can do 4K now. The devs would just adjust the graphics they give you to accommodate it.

4K is doable now on a PC, you just have to adjust your graphics to accommodate it. The question is, what is the best balance of graphical settings and resolution?

The devs won't likely target 4K on consoles until most people have 4K TVs, and even then, they might decide the trade-offs in graphics are not worth the resolution.

Exactly.

There are even a lot of devs going on record saying they've reduced resolution below 1080p to run more effects and have better per-pixel graphics.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
My crazy theory is that the next gen consoles will be for 4K / VR, and powered by some AMD Zen / Fury graphics beast; unless nVidia gets a console chip and shakes things up.

Yeah, both VR and 4K are ramping up the graphics horsepower required to push out the frames necessary for a good gaming experience. It won't happen on 28nm, but that's okay, as we're not even halfway through the current generation of consoles. Unless PC gaming overtakes console gaming in game sales, the primary development platform will still be the lowest common denominator (Xbox One).

We have a long wait before native 4K game assets are the norm and the necessary graphics technology catches up :(
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I'll admit it's my fault for bringing up the TN vs IPS differences, but it was done for a reason. I hate when people waste money on superfluous technology like 4K TN panels. It lets manufacturers get away with overpriced, crappy products, slowing down progress toward where we really ought to be aiming (4K IPS @ 120Hz).

What has my "panties in a bunch" is the bad advice doled out by some of you here who just look like you're trying to justify purchasing an inferior product you're now stuck with (if you had only waited a month or two, Shmork, that must hurt).

In regards to the first paragraph: based on my understanding, we will need DP 1.3 before we have a reliable way of pushing 4K @ 120Hz through a single cable, and neither the 980Ti nor the Fury has that. Heck, AMD skimped with the Fury and didn't even equip it with HDMI 2.0, so as of now there really isn't much incentive for manufacturers to produce such panels, or for consumers to purchase them, without an interface to take advantage of it.

In regards to the second paragraph... I really don't see anyone doing that. I see people with different preferences than your own, and you're interpreting that as "oh noes, they're saying they have something better than I do, I'm going to respond and defend MY purchase and show them!"

Some people may WANT higher pixel density over speed just like some people prefer a better image quality with IPS even though it's a bit slower than an equivalent TN panel. ;)

Neither is a wrong choice, it's up to the individual to decide what THEY want.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
In regards to the first paragraph: based on my understanding, we will need DP 1.3 before we have a reliable way of pushing 4K @ 120Hz through a single cable, and neither the 980Ti nor the Fury has that. Heck, AMD skimped with the Fury and didn't even equip it with HDMI 2.0, so as of now there really isn't much incentive for manufacturers to produce such panels, or for consumers to purchase them, without an interface to take advantage of it.

In regards to the second paragraph... I really don't see anyone doing that. I see people with different preferences than your own, and you're interpreting that as "oh noes, they're saying they have something better than I do, I'm going to respond and defend MY purchase and show them!"

Some people may WANT higher pixel density over speed just like some people prefer a better image quality with IPS even though it's a bit slower than an equivalent TN panel. ;)

Neither is a wrong choice, it's up to the individual to decide what THEY want.

I agree it's up to the individual to choose what they think is best; however, I still think we as technophiles strive to hit the "sweet spot" and come to a consensus when discussing technology. This is why I get emotional over silly things like displays, as I think 1440P @ 144Hz with G-Sync/FreeSync is that sweet spot.

It's a bit of a chicken-and-egg scenario as to who goes first with DisplayPort 1.3. There's no excuse for AMD not equipping the Fury with an HDMI 2.0 port, though, but at least it can manage 4K over DisplayPort.

I also agree that even if the monitors were available today (IPS 4K @ 120Hz+), the graphics cards aren't there yet. But I believe the sooner they become available, the faster Nvidia and AMD will release cards to drive them. It'll obviously take a smaller lithography to handle this.

I think we should really make a poll listing the available options to gain consensus and end this thread.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I agree it's up to the individual to choose what they think is best; however, I still think we as technophiles strive to hit the "sweet spot" and come to a consensus when discussing technology. This is why I get emotional over silly things like displays, as I think 1440P @ 144Hz with G-Sync/FreeSync is that sweet spot.

I don't disagree with you, if I had to choose between the two I'd pick the 1440p option as well, but I'm not going to berate others that pick something else. This isn't like Intel vs AMD where one is clearly better than the other and is easily verifiable. Monitors are highly subjective and some people just like that pixel density.

Heck, I'm still on a 60Hz 1200p panel :( I've been eyeing 1440p G-Sync panels, but I don't like the price premium just for the fact that it has G-Sync. I'm looking to Black Friday for some potential deals and maybe I'll step it up.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The problem IMO with both FreeSync and G-Sync is that they force you to commit to one vendor for as long as you have the monitor, regardless of which vendor is leading.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The problem IMO with both FreeSync and G-Sync is that they force you to commit to one vendor for as long as you have the monitor, regardless of which vendor is leading.

Agreed. While I went with NV this year, that doesn't mean I will again next year. And unlike GPUs, which I can recycle in the household, my financial adjuster would not permit annual monitor upgrades. She wants a bigger house :(
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think a lot of people are excited about 4K simply because we feel that eventually it is probably going to be the "1080p" ubiquitous resolution for gaming, and its reign could last over a period of 10+ years. It also results in a massive workload for GPUs and moves us farther from CPU bottleneck scenarios. For example, in a lot of PC games today, flagship cards like the Fury X or 980Ti are CPU bottlenecked at 1080p without VSR/DSR.

Still, looking at 4K monitor, TV tech and GPU horsepower, it is still too niche. 4K adaptive sync monitors are too few, and 4K gaming on 27-28" isn't even an option for some gamers who would want 32"+ 4K gaming as a minimum. Windows DPI scaling is still not A+ which means outside of gaming, it's not as pleasant to use a 32" and below 4K monitor for productivity as say 1440/1600P or even a 34" 3440x1440.

As far as 4K TVs go, I encourage anyone here to go into your local electronics store such as BestBuy and look at the 55" LG 1080p OLED TV and compare it to any 4K TV. The difference in IQ is so dramatic in favour of the OLED, that you quickly forget the entire 4K vs. 1080p argument and realize just how garbage LED/LCD tech is overall. Pretty much take any non-tech savvy person/friend/significant other with you and I bet not 1 of them will pick the 4K LED/IPS/VA/TN over a 1080P OLED.

Ideally I would want a 120Hz 4K 40" OLED with Adaptive Sync by 2020, but I don't think we'll get there by that time. Between my Panasonic plasma and BenQ 32" 1440p, I am just going to keep waiting until GPUs get more powerful and we get better 4K screens, hopefully OLED becoming more popular and thus affordable in the next 5 years. After yet again seeing 55" OLED this week, all I can say is black levels, whites, colours and viewing angles on even 2015 4K Samsung LEDs are just last gen/budget technology. It's like comparing a Ferrari LaFerrari to a BMW M3.

Right now I would rather play games with maximum image quality at 1440P, rather than on medium at 4K. Maximum PC had a nice comparison chart where a single 980Ti was faster at 1440P than 980Ti SLI at 4K.

Once more Unreal Engine 4 games are out, next gen gaming on 4K on current cards will be an even bigger compromise. The GPU horsepower just isn't good enough yet imo, or you basically have to upgrade every 2 years to the next gen pair of flagships. If you can afford that, 4K is more feasible, but I wouldn't touch 4K without 980Ti SLI / Fury CF.
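
For a rough sense of the GPU-side jump being described here, a back-of-the-envelope sketch of raw pixel counts (which ignores everything else that affects framerate) looks like this:

```python
# Raw pixel counts per frame for common gaming resolutions,
# relative to 1920x1080.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpx per frame ({pixels / base:.2f}x 1080p)")
```

At the same settings a GPU has to shade roughly four times as many pixels at 4K as at 1080p (and about 2.25x as many as at 1440p), which is why the bottleneck moves off the CPU and onto the graphics card.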
 

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
It's kinda off topic, but once DP 1.3 becomes mainstream, 4K 120hz could be a viable product in late 2016, early 2017.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I have said earlier in the thread I want 4K+ resolution, 16:10 ratio, 120 / 144 Hz refresh rate.

Don't we all, man, don't we all... DisplayPort 1.3 is what we need - 4K 120Hz! You'll probably never see 16:10 in the normal consumer channels like we had with CRTs and the early days of LCD. I don't really miss it anymore; the 1440p 21:9 monitors give more height than the old 1200p monitors for code/text/etc., while the added width is really fun for games.
 

alcoholbob

Diamond Member
May 24, 2005
6,390
469
126
It's kinda off topic, but once DP 1.3 becomes mainstream, 4K 120hz could be a viable product in late 2016, early 2017.

It took about 2 1/2 years for the first DP1.2 GPUs to come out. I imagine a refreshed Pascal in mid-2017 will be the first DP1.3 GPU.
 

MongGrel

Lifer
Dec 3, 2013
38,466
3,067
121
I think a lot of people are excited about 4K simply because we feel that eventually it is probably going to be the "1080p" ubiquitous resolution for gaming, and its reign could last over a period of 10+ years. It also results in a massive workload for GPUs and moves us farther from CPU bottleneck scenarios. For example, in a lot of PC games today, flagship cards like the Fury X or 980Ti are CPU bottlenecked at 1080p without VSR/DSR.

Still, looking at 4K monitor, TV tech and GPU horsepower, it is still too niche. 4K adaptive sync monitors are too few, and 4K gaming on 27-28" isn't even an option for some gamers who would want 32"+ 4K gaming as a minimum. Windows DPI scaling is still not A+ which means outside of gaming, it's not as pleasant to use a 32" and below 4K monitor for productivity as say 1440/1600P or even a 34" 3440x1440.

As far as 4K TVs go, I encourage anyone here to go into your local electronics store such as BestBuy and look at the 55" LG 1080p OLED TV and compare it to any 4K TV. The difference in IQ is so dramatic in favour of the OLED, that you quickly forget the entire 4K vs. 1080p argument and realize just how garbage LED/LCD tech is overall. Pretty much take any non-tech savvy person/friend/significant other with you and I bet not 1 of them will pick the 4K LED/IPS/VA/TN over a 1080P OLED.

Ideally I would want a 120Hz 4K 40" OLED with Adaptive Sync by 2020, but I don't think we'll get there by that time. Between my Panasonic plasma and BenQ 32" 1440p, I am just going to keep waiting until GPUs get more powerful and we get better 4K screens, hopefully OLED becoming more popular and thus affordable in the next 5 years. After yet again seeing 55" OLED this week, all I can say is black levels, whites, colours and viewing angles on even 2015 4K Samsung LEDs are just last gen/budget technology. It's like comparing a Ferrari LaFerrari to a BMW M3.

Right now I would rather play games with maximum image quality at 1440P, rather than on medium at 4K. Maximum PC had a nice comparison chart where a single 980Ti was faster at 1440P than 980Ti SLI at 4K.

Once more Unreal Engine 4 games are out, next gen gaming on 4K on current cards will be an even bigger compromise. The GPU horsepower just isn't good enough yet imo, or you basically have to upgrade every 2 years to the next gen pair of flagships. If you can afford that, 4K is more feasible, but I wouldn't touch 4K without 980Ti SLI / Fury CF.

If it could be viable for 10 years, why buy one now when it's a little baby and hasn't really been developed much...
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It took about 2 1/2 years for the first DP1.2 GPUs to come out. I imagine a refreshed Pascal in mid-2017 will be the first DP1.3 GPU.

I don't expect the adoption of DP 1.3 will take as long, though. DP seems to be gaining in popularity recently.
 

BonzaiDuck

Lifer
Jun 30, 2004
16,821
2,143
126
Already a tremendous thread, and I've tried to absorb some of the technical understanding infused into many posts. Graphics and displays have always been a low priority item with me.

I only switched from a pre-HD 150-lb tube TV to an LCD-LED HDTV in 2011. And I only changed from a 16:10 1680x1050 Viewsonic desktop LCD to HD about two years ago.

My best friend of 25 years was worth about $25 million when he retired. I had a fleet of interchangeable Hondas; he had a fleet of interchangeable BMWs. He didn't buy new cars; I didn't buy new cars. Some of that wisdom may translate to computer technology: "For new auto technology, wait at least until the second year after it's been introduced." The other rule doesn't apply here, though: "Always buy 'used' and a couple of years behind the model year."

My last 1080p desktop monitor went on the fritz in January, and I was looking at this same issue -- given the "hype" or excitement about 4K. I "needed" a replacement monitor. Whatever the price, there didn't seem to be sufficient 4K options to choose from. I decided I didn't know enough at that point. I might have picked a 1440p for the replacement, though.

I've got the graphics horsepower. But I think I'm going to take a year to see what develops on this angle. Others may have different priorities, and that's their privilege.