Why is everyone obsessed with 4K?


B-Riz

Golden Member
Feb 15, 2011
1,595
762
136
You mean DCI? DCI is for cinema (Digital Cinema Initiatives) and sets standards for commercial theater digital projection. Not that it matters anyway; no studio is going to sell it at Best Buy or Walmart. If you want to buy yourself a $200,000 Christie projector, another $20,000-60,000 for a first-run DCI media server, and then another $200-500 PER VIEWING, then you can enjoy the full cinema experience in your home.

PPI matters, but like I said before, once you get over 30", the jaggies become noticeable, no matter if the game was "designed with 4K in mind" - whatever that means. LOL

4K textures have to be implemented too, LOL at ur self. bro, do you even computer?

Since reading Wikipedia is hard.

"A 4K resolution, as defined by Digital Cinema Initiatives, is 4096 x 2160 pixels (256:135, approximately a 1.9:1 aspect ratio). This standard is widely respected by the film industry along with all other DCI standards."

These things sold to us as 4K monitors are not really 4K done properly.

I am not interested in some janky 16:9 (1.78:1) implementation.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Cinematic 4K, e.g. 4096x2160, isn't going to happen (at least in a mainstream way) for consumers. Just like how 1080p isn't really 2K. The ship has already sailed.

The extra pixels are pretty meaningless anyway for gaming purposes.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
As for one card being incapable of pushing 4k: Nonsense. I can play GTAV with everything on very high/ultra besides grass (high) and MSAA (off) on a Tri-X 290 OC at 40-50fps.

I don't buy this for one second.

I play GTAV on a single overclocked 290 @1100 core on 3x1080p (which works out to about 75% of the pixels of 4K) and I have to turn everything to "High" with a few settings on Medium or its equivalent, MSAA and FXAA off, and I can only get to 50 fps in a place with a really low object count, low draw distance, and few actors. My game drops down to 35-40 fps pretty regularly. GTAV is particularly stuttery if you don't vsync it at 60. Or 30, I guess, though that seems like it would be just too few fps for a smooth experience.
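
For reference, the raw pixel counts behind that comparison are simple arithmetic; the quick Python sketch below is just to make the numbers concrete and isn't from any post in the thread:

```python
# Back-of-the-envelope pixel counts for the setups discussed above.
resolutions = {
    "1080p (1920x1080)":   1920 * 1080,
    "3x1080p (5760x1080)": 3 * 1920 * 1080,
    "1440p (2560x1440)":   2560 * 1440,
    "4K UHD (3840x2160)":  3840 * 2160,
}

for name, pixels in resolutions.items():
    print(f"{name:22s} {pixels:>10,d} pixels")

# Triple 1080p works out to exactly 75% of the pixel count of 4K UHD.
print(f"3x1080p / 4K UHD = {(3 * 1920 * 1080) / (3840 * 2160):.2f}")
```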
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Wait, I thought TN panels meant faster refresh rates, which is good for gaming. What did I miss?

People here for some inexplicable reason suspend logic entirely as soon as they hear the words "TN" or "IPS"

They just start screaming "IPS GOOD TN BAD" no matter the use-case.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Anyone who has seen 4k with their own eyeballs wouldn't need to ask why people are obsessed with 4k.

I don't quite understand it either. I still use 1080p on my 55" TV for gaming and it's great. Games look absolutely gorgeous with DSR. I could see myself wanting 1440p, but I don't notice enough of a deficiency with 1080p to have any interest in spending money on it. My next "big thing" for fidelity I'm eyeing is VR.

Like was mentioned, after seeing a game running on a 4k TV the difference wasn't enough to make me want to spend the money to change. It just isn't jaw dropping.

Please, go on about how anyone who sees it first hand will be amazed.
 

Scalesdini

Junior Member
Jun 20, 2015
10
0
0
I don't buy this for one second.

I play GTAV on a single overclocked 290 @1100 core on 3x1080p (which works out to about 75% of the pixels of 4K) and I have to turn everything to "High" with a few settings on Medium or its equivalent, MSAA and FXAA off, and I can only get to 50 fps in a place with a really low object count, low draw distance, and few actors. My game drops down to 35-40 fps pretty regularly. GTAV is particularly stuttery if you don't vsync it at 60. Or 30, I guess, though that seems like it would be just too few fps for a smooth experience.

I vsync to 30 currently. I forgot I had AF and Post FX set to low/off, as well as shadows set to sharp. Country (grassy) areas are where my fps drops lowest, but never under 30; I used the benchmark tool plus in-game testing to tweak settings. Around 50-52 fps is the best-case scenario, iirc, from the in-game benches, with the lowest being 36-38, but in-game in grassy areas it hits 30-32. With vsync at 30, however, it stays perfectly smooth. I don't mind 30 fps, but big dips/jumps in fps bug me, so I basically tinkered around with settings until I found a combo that looked good (in my opinion) and maintained at least 30 fps so I could use vsync. If you want, I'll grab screenshots of my settings for you.

I also have a 5820k OC'd to 4.2ghz with 16gb DDR4 3000, which makes a large difference for GTAV vs using similar cards on a 4790k for instance.

Please, go on about how anyone who sees it first hand will be amazed.

Adjust your eyes? I don't know, everyone who has seen 4k content on my 4k tv is blown away, and I have a cheap off-brand TV. I could never go back, personally.
 
Last edited:

B-Riz

Golden Member
Feb 15, 2011
1,595
762
136
Cinematic 4K, e.g. 4096x2160, isn't going to happen (at least in a mainstream way) for consumers. Just like how 1080p isn't really 2K. The ship has already sailed.

The extra pixels are pretty meaningless anyway for gaming purposes.

One can hope, right? :|
 

BonzaiDuck

Lifer
Jun 30, 2004
16,313
1,879
126
I don't know the average age or age distribution of forum members, so I can't be sure who might recognize firsthand what I'm about to describe from my own experience.

I had 20/20 vision until I was about 45. At age 23, I had a singular and bizarre accident in which a very large animal pulled me out of a tree and I landed on my face. I was told it would eventually affect my vision. Even so, into my 40s I could still read the entries in the Compact Edition OED without the magnifier. No more. And I don't think the head trauma from the accident has anything to do with it.

The doc says my far vision is still 20/20. But I have trouble, with or without my progressive lenses, reading the Media Center program descriptions on my 42" LED-LCD from approximately 8' to 10'.

UHD isn't likely to fix that for me. But I do wonder: as we make bigger and bigger advances toward finer points of light and larger data transfers to a display, how much does it matter to the eye? Any eye -- yours, mine, 20/20, Kyle Brothloffsky's Connecticut cousin named Kyle.

I may have seen a 4K set in operation at Costco. Yes. It is "picture-perfect." But it's a long way short of compelling me to go down to the bank and withdraw a couple grand so I can put my LG 42" up for sale. Even at a lower price point, I'm still going to balk at whether it really matters enough to me.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Buying a 60hz 4K TN panel is simply dumb. Yes even for gaming. That article you cite is outdated. There are newer 1440P IPS panels that are as fast or faster than the 4K TN panel you're happy with.

Ignorance is bliss I guess.

A couple things...

First, the total response time (which includes input lag) for a fast TN panel is going to be faster than a fast IPS panel. It just is, getting mad about it won't change that.

Second, you're comparing 1440p to 4k which isn't exactly apples to apples.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
A couple things...

First, the total response time (which includes input lag) for a fast TN panel is going to be faster than a fast IPS panel. It just is, getting mad about it won't change that.

Second, you're comparing 1440p to 4k which isn't exactly apples to apples.

You are confusing pixel response time with latency. Pixel response time is how long it takes a pixel to fully transition to a new color. It has nothing to do with input lag. It affects ghosting and motion blur.
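
To make that distinction concrete, here is a rough sketch of how the two figures combine into the "total" number some reviews quote; the millisecond values are hypothetical placeholders for illustration, not measurements of any real panel:

```python
# Rough model: "total display lag" as the sum of signal-processing delay
# (input lag) and pixel transition time (response time). The numbers used
# below are hypothetical and for illustration only.

def total_lag_ms(input_lag_ms: float, pixel_response_ms: float) -> float:
    """Processing delay plus pixel transition time, in milliseconds."""
    return input_lag_ms + pixel_response_ms

hypothetical_tn = total_lag_ms(input_lag_ms=3.0, pixel_response_ms=4.0)
hypothetical_ips = total_lag_ms(input_lag_ms=4.0, pixel_response_ms=7.0)

print(f"hypothetical fast TN : {hypothetical_tn:.1f} ms")
print(f"hypothetical fast IPS: {hypothetical_ips:.1f} ms")
# Input lag delays the whole frame; slow pixel response instead shows up
# as ghosting and motion blur, which is the point being made above.
```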
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
A couple things...

First, the total response time (which includes input lag) for a fast TN panel is going to be faster than a fast IPS panel. It just is, getting mad about it won't change that.

Second, you're comparing 1440p to 4k which isn't exactly apples to apples.

No, it just isn't. It gets tiring around here when people like you spread misinformation about a subject you know little about.

Total response time for a TN panel, especially a 4K 60Hz panel (the current topic at hand), is much slower than on the latest 1440P IPS displays by Acer and Asus, as they magically can go above 60Hz. Even when compared at 60Hz there's very little difference in overall speed, but at 1440P you will obviously aim for higher frames per second and receive a much lower "total response time," as the demand on the graphics card is much less.
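
For a sense of scale, the frame time alone at different framerates dwarfs most panel-to-panel response-time differences; this is simple arithmetic, assuming one full refresh interval per frame:

```python
# Frame time in milliseconds at various framerates. Moving from a 60 fps cap
# to 120-144 fps removes more milliseconds from the chain than the few-
# millisecond gap between fast TN and fast IPS pixel response.
for fps in (30, 60, 120, 144):
    print(f"{fps:>3d} fps -> {1000 / fps:5.1f} ms per frame")
```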

I won't do any more homework for you guys but please do yourself a favour and read the following articles before you spread more garbage info. Please take a look at the response time comparison charts.

Acer Predator - http://www.tftcentral.co.uk/reviews/acer_xg270hu.htm
Asus MG279Q - http://www.tftcentral.co.uk/reviews/asus_mg279q.htm

I'm guessing the 4K TN panel you and the other poster are referring to is this one? "Acer XB280HK bprz." Wow, the fun ends at 60hz and as soon as you slump a little in your chair the top 1/3 of the screen washes away. But hey, more pixels right! Good thing you have a couple 700 dollar video cards to keep it at 60hz.

In conclusion, these new IPS panels from Acer and Asus will provide a better gaming experience for less money.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
You are confusing pixel response time with latency. Pixel response time is how long it takes a pixel to fully transition to a new color. It has nothing to do with input lag. It affects ghosting and motion blur.

I know the difference; both need to be accounted for if you're looking for a good gaming monitor, and you'll see that both appear in many reviews.

Example: http://www.tomshardware.com/reviews/vg248qe-144hz-gaming-monitor,3609-9.html

No, it just isn't. It gets tiring around here when people like you spread misinformation about a subject you know little about.

Total response time for a TN panel, especially a 4K 60Hz panel (the current topic at hand), is much slower than on the latest 1440P IPS displays by Acer and Asus, as they magically can go above 60Hz. Even when compared at 60Hz there's very little difference in overall speed, but at 1440P you will obviously aim for higher frames per second and receive a much lower "total response time," as the demand on the graphics card is much less.

I won't do any more homework for you guys but please do yourself a favour and read the following articles before you spread more garbage info. Please take a look at the response time comparison charts.

Acer Predator - http://www.tftcentral.co.uk/reviews/acer_xg270hu.htm
Asus MG279Q - http://www.tftcentral.co.uk/reviews/asus_mg279q.htm

I'm guessing the 4K TN panel you and the other poster are referring to is this one? "Acer XB280HK bprz." Wow, the fun ends at 60hz and as soon as you slump a little in your chair the top 1/3 of the screen washes away. But hey, more pixels right! Good thing you have a couple 700 dollar video cards to keep it at 60hz.

In conclusion, these new IPS panels from Acer and Asus will provide a better gaming experience for less money.

Yes, it just is. You're still comparing 4K to 1440p.

I'm not referring to any particular 4K panel; I'm simply comparing TN to IPS. It doesn't matter how mad you get; it won't make IPS faster. You can label that "misinformation" if you wish; it just means you're wrong on multiple points.
 
Last edited:

Indus

Lifer
May 11, 2002
15,217
10,670
136
4K is thought of as a retina display.

Once you use a tablet or a phone with a retina display or similar PPI, your monitor at 1080p looks pixelated. I never thought of it that way when I upgraded to 1080p in 2008, but in 2015, yes, it looked pixelated. I wanted 4K but couldn't afford it, so I went with 1440p (2560 x 1440), and it's really as sharp and pleasant as a retina display without the price.

If I could afford it, I'd have 4K, but maybe for the next upgrade cycle. I hear Apple wants to go to 10K retina in a few years.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
No, it just isn't. It gets tiring around here when people like you spread misinformation about a subject you know little about.

Total response time for a TN panel, especially a 4K 60Hz panel (the current topic at hand), is much slower than on the latest 1440P IPS displays by Acer and Asus, as they magically can go above 60Hz. Even when compared at 60Hz there's very little difference in overall speed, but at 1440P you will obviously aim for higher frames per second and receive a much lower "total response time," as the demand on the graphics card is much less.

I won't do any more homework for you guys but please do yourself a favour and read the following articles before you spread more garbage info. Please take a look at the response time comparison charts.

Acer Predator - http://www.tftcentral.co.uk/reviews/acer_xg270hu.htm
Asus MG279Q - http://www.tftcentral.co.uk/reviews/asus_mg279q.htm

I'm guessing the 4K TN panel you and the other poster are referring to is this one? "Acer XB280HK bprz." Wow, the fun ends at 60hz and as soon as you slump a little in your chair the top 1/3 of the screen washes away. But hey, more pixels right! Good thing you have a couple 700 dollar video cards to keep it at 60hz.

In conclusion, these new IPS panels from Acer and Asus will provide a better gaming experience for less money.

Assuming you don't mind a smaller display (G-Sync 1440P screens cap out at 27" currently), lower-quality textures, and some G-Sync flickering, then yeah, that's better. But if those things bother you, then no.

I personally used the Acer XB270HU 1440P IPS G-Sync monitor for a short time and compared it to gaming in Witcher 3 and Diablo 3 on my old display. I returned to my 4K 32" IPS display pretty quickly.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
4K is thought of as a retina display.

Once you use a tablet or a phone with a retina display or similar PPI, your monitor at 1080p looks pixelated. I never thought of it that way when I upgraded to 1080p in 2008, but in 2015, yes, it looked pixelated. I wanted 4K but couldn't afford it, so I went with 1440p (2560 x 1440), and it's really as sharp and pleasant as a retina display without the price.

If I could afford it, I'd have 4K, but maybe for the next upgrade cycle. I hear Apple wants to go to 10K retina in a few years.

The idea of a retina display is a very high PPI. Resolution alone is not the factor. You can buy a big 50"+ 4K TV and the pixels are still noticeable. It's about packing a very high resolution into as small a screen size as possible.
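
For what it's worth, PPI falls straight out of the resolution and the diagonal. A quick sketch follows; the screen sizes are chosen purely as examples:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Example combinations, for illustration only.
examples = [
    ("1080p @ 24\"",  1920, 1080, 24),
    ("1080p @ 55\"",  1920, 1080, 55),
    ("1440p @ 27\"",  2560, 1440, 27),
    ("4K UHD @ 27\"", 3840, 2160, 27),
    ("4K UHD @ 50\"", 3840, 2160, 50),
]

for name, w, h, d in examples:
    print(f"{name:14s} {ppi(w, h, d):6.1f} PPI")
```

A 50" 4K TV lands at roughly the same PPI as a 24" 1080p monitor, which is why the pixels can still be visible up close.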
 

Piklar

Member
Aug 9, 2013
109
0
41
I see these threads pop up a lot, and I am not singling anyone out, but what is it? Is it just because 4K displays are out now, so obviously 4K must be the best?

I am very happy with 1080, and I know the day will come when our TVs, video cards, and monitors will be able to do everything we want at 4K, but I can't help but wonder if we are putting the cart before the horse here (and I know that much of that is just due to the fact that 4K TVs and monitors are out now).

Any thoughts?

In the past year my gaming has migrated out of the office setup, which has a bunch of tasty displays, and into the lounge, which has a single 70" 4K screen. My fav spot for gaming is 8-9 feet from the screen, and when gaming I've ditched keyboard and mouse for wireless Xbox controllers... It's no accident that the media center sprouted SLI 970 G1 Gamings for HDMI 2.0, conveniently OCed to a rock-stable 1500MHz to ensure decent graphics settings at 4K/60fps (with vsync on). The office PC is no slouch, with SLI 780s running a 28" 4K Samsung, a pretty 1440 panel with an aftermarket logic board good for 120Hz, and a few other decent panels (see sig). Quite a waste of hardware, sadly; only the wife spends time in there doing dull office work and showing nil signs of becoming a hot gamer chick with blue hair and ponytails like in my daydreams...

Now to the point... why this predicament, you ask? If you played Witcher 3 on both of these setups, my guess is you'd come to the same conclusion as I did, which is, in summary, two words: "Superior immersion!" Smooth FPS gaming on a large 4K screen trumps anything else currently available, hands down. Above that would be 4K on a large panel in 3D, or with a 4K OLED display. Anyone gaming on an 8K panel with a bunch of 980 Tis? Probably not...

From where I'm slothing out right now, the only way forward I can imagine would be slapping on a pair of VR goggles with at least 1440p per eye and adequate graphics power to ensure fluid and seamless FPS... hopefully next year or so. When that happens it'll be time for another hardware upgrade!
 
Last edited:

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
4K textures have to be implemented too, LOL at ur self. bro, do you even computer?
The reason I LOL'd at you is because you said "AA should not be needed, if the games were designed with 4K in mind."

The primary function of AA (anti-aliasing) is to eliminate jaggies (aliasing) on geometry, so your initial premise is completely false and indicated to me that you know nothing about the topic. Higher-resolution textures don't do anything for jaggies (aliasing), but rendering at a higher resolution does. Even if you meant to say "textures", you would still be wrong. Some games come with optional 4K (true 4096x4096) and even 8K texture packs direct from the developer, and others have active modding communities dedicated to that very purpose. You should try it sometime, as it looks incredible. The 4K era of gaming is upon us; you just clearly don't understand "how to even 4K".
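
As a rough illustration of the render-resolution point (a sketch, not a description of any specific driver feature): rendering above the panel's native resolution and scaling down, DSR/VSR-style, is effectively supersampling, and that extra sampling is what smooths geometry edges:

```python
# Each output pixel gets multiple rendered samples when the render target is
# larger than the panel; texture resolution plays no part in edge aliasing.
panel = (1920, 1080)
render_targets = [(1920, 1080), (2560, 1440), (3840, 2160)]

for rw, rh in render_targets:
    samples_per_pixel = (rw * rh) / (panel[0] * panel[1])
    print(f"render {rw}x{rh} -> ~{samples_per_pixel:.2f} samples per 1080p pixel")
```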

Since reading Wikipedia is hard. "A 4K resolution, as defined by Digital Cinema Initiatives, is 4096 x 2160 pixels (256:135, approximately a 1.9:1 aspect ratio). This standard is widely respected by the film industry along with all other DCI standards."

That's the problem with only using Wikipedia as a source and then cherry-picking what you want to see. To be clear, you are focusing on 4K DCI, which is not the same as 4K textures and image sensors (4096x4096), or 4K (UHD) televisions.

As shown in that very wiki article, 4K DCI allows many other resolutions and aspect ratios to qualify as 4K DCI, up to a MAXIMUM resolution to ensure that all theaters are capable of achieving it. It also established these MAXIMUM values so that directors/DPs can set target resolutions for their desired aspect ratio in post-production. I'm sure that you have noticed that many films are shot 'scope at 2.35:1, some are 2.20:1, and some are even 16:9, like TV shows. As that article puts it, "pixels are cropped from the top or sides depending on the aspect ratio of the content being projected."

You will rarely - if EVER - see any movie image in the theater displayed at 4096x2160; it's almost always cropped in one or more ways. This is why consumer TVs and monitors use the 16:9 format - it began as a middle ground between 4:3 and 2.35:1 so that home viewers could still get decent height from 4:3 images, decent width from 2.35:1 images, and everything in between. Now that television has moved to 16:9 almost exclusively and more and more people are watching movies at home, we're seeing more 21:9 displays (2.33:1) being released.
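
For concreteness, the standard DCI 4K container and its two common crops, alongside consumer UHD, work out to the following aspect ratios (computed from the commonly published pixel dimensions):

```python
# Aspect ratios of the DCI 4K container, its standard "flat" and "scope"
# crops, and consumer UHD, computed from pixel dimensions.
formats = [
    ("DCI 4K full container", 4096, 2160),
    ("DCI 4K 'flat' crop",    3996, 2160),
    ("DCI 4K 'scope' crop",   4096, 1716),
    ("UHD (consumer '4K')",   3840, 2160),
]

for name, w, h in formats:
    print(f"{name:24s} {w}x{h}  ~{w / h:.2f}:1")
```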

What that Wikipedia article doesn't tell you is that the Consumer Electronics Association (CEA) itself defines 4K and UHD as the same thing, where any television capable of displaying a MINIMUM of 3840x2160 qualifies as 4K.

"The group also defined the core characteristics of Ultra High-Definition TVs, monitors and projectors for the home. Minimum performance attributes include display resolution of at least eight million active pixels, with at least 3,840 horizontally and at least 2,160 vertically. Displays will have an aspect ratio with width to height of at least 16 X 9. To use the Ultra HD label, display products will require at least one digital input capable of carrying and presenting native 4K format video from this input at full 3,840 X 2,160 resolution without relying solely on up-converting."
https://www.ce.org/News/News-Releas...lectronics-Industry-Announces-Ultra-High.aspx

These things sold to us as 4K monitors are not really 4K done properly. I am not interested in some janky 16:9 (1.78:1) implementation.

As I posted previously, you are free to buy commercial-grade 4K DCI equipment and enjoy your own version of a proper 4K DCI experience. I know that I can't afford it, but maybe you can.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
No, it just isn't. It gets tiring around here when people like you spread misinformation about a subject you know little about.

Total response time for a TN panel, especially a 4K 60Hz panel (the current topic at hand), is much slower than on the latest 1440P IPS displays by Acer and Asus, as they magically can go above 60Hz. Even when compared at 60Hz there's very little difference in overall speed, but at 1440P you will obviously aim for higher frames per second and receive a much lower "total response time," as the demand on the graphics card is much less.

I won't do any more homework for you guys but please do yourself a favour and read the following articles before you spread more garbage info. Please take a look at the response time comparison charts.

Acer Predator - http://www.tftcentral.co.uk/reviews/acer_xg270hu.htm
Asus MG279Q - http://www.tftcentral.co.uk/reviews/asus_mg279q.htm

I'm guessing the 4K TN panel you and the other poster are referring to is this one? "Acer XB280HK bprz." Wow, the fun ends at 60hz and as soon as you slump a little in your chair the top 1/3 of the screen washes away. But hey, more pixels right! Good thing you have a couple 700 dollar video cards to keep it at 60hz.

In conclusion, these new IPS panels from Acer and Asus will provide a better gaming experience for less money.

What's got your panties in a bunch? You've really got an attitude problem; you're downright disrespectful.

That being said, I OWN an XB280HK. I don't just preach about products I've never touched and post links to articles citing insignificant statistics about response times that are indistinguishable to the common user. 144 Hz and the like are great (I actually own the XG270HU as well), but modern TN panels are not that bad, and you are exaggerating.

Sure, the smoothness at 144 Hz is no joke and can provide a competitive edge in modern shooters. But you will need to push some serious FPS to get to that point. And here's an idea, if you're willing to hear another person's point of view without lashing out like a small child.

People value resolution over response times. Surprising, I know. TN technology has come a long way and if you think IPS provides a world of difference from a quality TN panel, that is just personal preference.

Sure, a 144 Hz IPS G-SYNC 1440p bla bla blah will look great and perform even better, but you are lying to yourself if you think 4K at 60 Hz will provide a significantly inferior experience.

And why do you care so much what other people prefer? If you like your IPS panels, YOU LIKE YOUR IPS PANELS! Take a step back and realize you are insulting a person's intelligence over formalities pertaining to monitor technology.

The flaw in your logic is that these high-end, ultra-responsive displays are beneficial only for competitive gamers and shooter lovers. What about those who enjoy a good story and RPG elements? They couldn't give a flying flip about the difference between 15 ms and 5 ms.

So many negative nancies in this thread, my goodness:thumbsdown:
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
People value resolution over response times.
.....
The flaw in your logic is that .....
And the flaw in your logic is that you think that what you value is what most other people value. And that is probably not true. Different people value different things.

You like higher resolutions.
MadPacket maybe likes fast response times.
That UFO guy likes no motion-blur.
Someone else likes the smoothness of G-Sync.
Some gamers like their games to run at a constant 144fps@144Hz.
Some people want the biggest monitor they can get.
Some people want 10-bit color and high color accuracy.
Some people want high contrast, so they have proper blacks and shadows.
Some people want a wide-screen monitor.
Lots of people like low prices.

Some people need multiple inputs.
Some people can't use G-Sync.
Some people can't use FreeSync.
Some people have little space on their desk, so they want a small monitor.

There is not one monitor on the market that can give you all this. Not even if you are willing to pay $10k. Not even if you are willing to pay $100k. Just doesn't exist. Just like there are no cars that can do 250 km/hour and have the loading space of a truck.

So you have to make choices. You can't have it all.

The OP has a point. There's lots of noise about 4K, while in reality only 0.6% of gamers have a 4K monitor today. And I think that many people want something else from their monitor more than they want 4K.

If you give me a 4K monitor, for free, that does 144Hz, has G-Sync and ULMB, and is 27" or larger? Sure, I'll take it. I actually wanted a VA monitor, but sure, I'll take your monitor too, even if it isn't VA. But no VA *and* no 144Hz and no G-Sync? Mmm, let me think about it. Probably not.

4K resolution is nice. But it also has drawbacks. You lose ~60-70% of your framerate when you go from 1080p to 4K. That means you need about 3 times the horsepower in your GPU(s) to achieve the same framerates. Maybe not a problem, if you want and can pay for SLI GTX 980 Tis. But even then, you can still enable less eye candy than someone who runs 1080p with those same video cards. I prefer eye candy. I want the best shadows, HBAO+, the furthest view distance, the highest object detail, more polygons and tessellation, the highest texture detail, more foliage, better weather effects, etc., etc. Those give me the best immersion. Higher resolution? [I don't care about] resolution. It's at the bottom of my priority list.
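
The "3 times the horsepower" figure is roughly what the pixel arithmetic implies, under the (only approximately true) assumption that framerate scales inversely with pixel count:

```python
# 4K UHD pushes 4x the pixels of 1080p. Losing ~60-70% of your framerate
# means you keep 30-40% of it, so clawing it back takes roughly 2.5-3.3x
# the GPU throughput.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(f"pixel ratio 4K/1080p: {pixels_4k / pixels_1080p:.1f}x")

for loss in (0.60, 0.70):
    remaining = 1 - loss
    print(f"losing {loss:.0%} of fps -> need {1 / remaining:.1f}x throughput to recover it")
```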

But it's all personal.

In the meantime, there is a lot of hype over 4k.

Profanity isn't allowed in the technical forums.
-- stahlhart
 
Last edited by a moderator:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
The flaw in your logic is that these high-end, ultra-responsive displays are beneficial only for competitive gamers and shooter lovers. What about those who enjoy a good story and RPG elements? They couldn't give a flying flip about the difference between 15 ms and 5 ms.

Then why would those people be looking at high refresh rate IPS screens anyway when the refresh rate isn't a big deal for them?

Of course a screen optimized for their use is going to be better for their use than one that spends a good bit of effort being good at something they don't need.

For example, I'll make big sacrifices for 3440x1440, but that doesn't mean my 34UM95 is better for twitch gaming than something like a Swift; it's different priorities. However, that also doesn't mean that all IPS screens are worse than all TN screens.

Getting the right monitor is figuring out the tradeoffs you want to make, not what someone else wants to make, and judging the performance of that monitor, not vaguely similar monitors.
 
Last edited:

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Then why would those people be looking at high refresh rate IPS screens anyway when the refresh rate isn't a big deal for them?

Of course a screen optimized for their use is going to be better for their use than one that spends a good bit of effort being good at something they don't need.

For example, I'll make big sacrifices for 3440x1440, but that doesn't mean my 34UM95 is better for twitch gaming than something like a Swift; it's different priorities. However, that also doesn't mean that all IPS screens are worse than all TN screens.

Getting the right monitor is figuring out the tradeoffs you want to make, not what someone else wants to make, and judging the performance of that monitor, not vaguely similar monitors.

I'm not sure where the disagreement is; that is exactly the point I'm trying to make. Everyone has their preference. I never said IPS screens were worse; in fact, a 144Hz IPS G-Sync blah blah blah will beat out the TN equivalent almost every time.

My overall point is that people shouldn't tie themselves to any sort of "monitor philosophy". People like what they like.

The thing is, I don't understand the bad rep that 4K is getting. Really, what is the big deal?

Current panels are not the best performing. Sure

Current panels are pricey. Sure

4K requires more graphical horsepower. Sure

The thing I don't understand is why someone would belittle or dismiss someone else's choice to buy new technology early rather than waiting 1-4 years for the "144Hz IPS G-Sync" iterations of 4K. Because you believe their choice is wrong? Those are some Catholic-church levels of banality right there.

4K looks great. If you don't care about resolution, you don't care about 4K. That's fine; no one is saying you have to like it. But the time will come when 4K becomes standard, and the time will come when graphics cards have no problem performing at ultra resolutions.

Listen, I'm not trying to make enemies here. Technology is great, technology is good. Let's be glad we live in a time when it is advancing so rapidly, where we get to experience phenomena that were previously thought to be impossible.

Side note: Has anyone heard any news on Blue Phase Mode LCDs? Those who are in love with response times and refresh rates would love BPLCD tech. Lag could potentially be reduced to 10-20 µs (microseconds!).
 

bigboxes

Lifer
Apr 6, 2002
41,809
12,337
146
And the flaw in your logic is that you think that what you value is what most other people value. And that is probably not true. Different people value different things.

You like higher resolutions.
MadPacket maybe likes fast response times.
That UFO guy likes no motion-blur.
Someone else likes the smoothness of G-Sync.
Some gamers like their games to run at a constant 144fps@144Hz.
Some people want the biggest monitor they can get.
Some people want 10-bit color and high color accuracy.
Some people want high contrast, so they have proper blacks and shadows.
Some people want a wide-screen monitor.
Lots of people like low prices.

Some people need multiple inputs.
Some people can't use G-Sync.
Some people can't use FreeSync.
Some people have little space on their desk, so they want a small monitor.

There is not one monitor on the market that can give you all this. Not even if you are willing to pay $10k. Not even if you are willing to pay $100k. Just doesn't exist. Just like there are no cars that can do 250 km/hour and have the loading space of a truck.

So you have to make choices. You can't have it all.

The OP has a point. There's lots of noise about 4K, while in reality only 0.6% of gamers have a 4K monitor today. And I think that many people want something else from their monitor more than they want 4K.

If you give me a 4K monitor, for free, that does 144Hz, has G-Sync and ULMB, and is 27" or larger? Sure, I'll take it. I actually wanted a VA monitor, but sure, I'll take your monitor too, even if it isn't VA. But no VA *and* no 144Hz and no G-Sync? Mmm, let me think about it. Probably not.

4K resolution is nice. But it also has drawbacks. You lose ~60-70% of your framerate when you go from 1080p to 4K. That means you need about 3 times the horsepower in your GPU(s) to achieve the same framerates. Maybe not a problem, if you want and can pay for SLI GTX 980 Tis. But even then, you can still enable less eye candy than someone who runs 1080p with those same video cards. I prefer eye candy. I want the best shadows, HBAO+, the furthest view distance, the highest object detail, more polygons and tessellation, the highest texture detail, more foliage, better weather effects, etc., etc. Those give me the best immersion. Higher resolution? [I don't care about] resolution. It's at the bottom of my priority list.

But it's all personal.

In the meantime, there is a lot of hype over 4k.

The masses like big and cheap. When did the light bulb go on over your head? That doesn't mean that 4K is over-hyped. It's coming, and it's not 3D. It's maturation and mass adoption that we're waiting for. And the best monitors will still cost well north of what most are willing to pay for them. I say buy what you can afford and don't poo-poo new tech just because it's outside of your price range.
 
Last edited by a moderator: