Why is everyone obsessed with 4K?


B-Riz

Golden Member
Feb 15, 2011
Cue best Kanye West impression:

4K UHD-1 is great, and Imma let you finish, but "..." is the greatest of all time!

My problem with it is that it is still 16:9.

16:10 is a much better ratio, so when 2560 x 1600 144 Hz panels are available for around $250, I will move on from 16:9 1080p 144 Hz (my non-gaming second panel is 1920 x 1200).

3840 x 2400 at 144Hz would be amazing.

https://en.wikipedia.org/wiki/Graphics_display_resolution#WQUXGA

Besides, real 4K DCI is this (https://en.wikipedia.org/wiki/4K_resolution), so until I can get a real 4K 120/144 Hz panel, I have zero effs to give about it.
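For reference, the arithmetic behind those mode comparisons is simple. A minimal sketch, using only the mode numbers named above:

```python
# Aspect ratio and pixel count for the display modes mentioned above.
# Plain arithmetic on the published mode numbers; nothing vendor-specific.
modes = {
    "1080p (16:9)":   (1920, 1080),
    "WQXGA (16:10)":  (2560, 1600),
    "UHD-1 (16:9)":   (3840, 2160),
    "WQUXGA (16:10)": (3840, 2400),
    "DCI 4K (~17:9)": (4096, 2160),
}

for name, (w, h) in modes.items():
    print(f"{name:16} {w}x{h}  aspect {w / h:.3f}  {w * h / 1e6:.2f} MP")
```

Note that 3840 x 2400 (WQUXGA) carries about 11% more pixels than UHD-1, so the GPU bar for a 16:10 "4K" panel would be slightly higher still.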
 

3DVagabond

Lifer
Aug 10, 2009
It's important to video card company fans who perceive an advantage for their company. I think it's readily apparent once you see who is pushing it from time to time.

For the average PC enthusiast it's not a big deal because they aren't going to buy anything in 4K until the price is low and the average single GPU can push it at decent settings.

I don't think we are going to see a better answer than this.
 
Feb 19, 2009
The average single GPU is already enough to push 4K at decent settings.

Tone down Ultra to High; it runs great and looks similar. I facepalm when I see reviewers run 4K with 8x MSAA.
 

B-Riz

Golden Member
Feb 15, 2011
The average single GPU is already enough to push 4K at decent settings.

Tone down Ultra to High; it runs great and looks similar. I facepalm when I see reviewers run 4K with 8x MSAA.

True, the whole point of 4K is you do not need MSAA anymore.
 

Ketchup

Elite Member
Sep 1, 2002
True, the whole point of 4K is you do not need MSAA anymore.

This is definitely true. I started running lower AA settings once I moved to 1080p, though I am sure others reserve that change for higher resolutions.
 

vissarix

Senior member
Jun 12, 2015
I see these threads pop up a lot, and I am not singling anyone out, but what is it? Is it just because 4K displays are out now, so obviously 4K must be the best?

I am very happy with 1080p, and I know the day will come when our TVs, video cards, and monitors will be able to do everything we want at 4K, but I can't help but wonder if we are putting the cart before the horse here (and I know that much of that is just because 4K TVs and monitors are out now).

Any thoughts?

Nobody is obsessed with 4K screens, but obviously it looks way better than a 1080p screen... it's like comparing 320x240 to 1920x1080; there is a huge difference.

I would personally prefer 1080p at 144 Hz with no input lag, since I play a lot of FPS shooters and 60 Hz feels like microstuttering all the time... but playing GTA V and The Witcher 3 at 4K maxed out must look amazing...
 

Madpacket

Platinum Member
Nov 15, 2005
Almost everyone above is commenting on something they obviously have not seen. 4K scaling on 8.1 has never given me an issue, and almost all the games I've played have scaled well with 4K. 4K really does bring games to life... the detail you start to see is absolutely incredible. I *can't* go back to 1080p. It looks that awful comparatively. Now the *only* downside is you need serious GPU horsepower to push this resolution. Like 2x Fury X, 980 Ti, Titan X quality cards. But if you can afford that without giving up an arm or a leg, you are a moron if you choose a 1080p display over a 4K one *if* you have plenty of money.

It's not that many of us can't afford 4K with dual or 980 Ti's (that's insulting, by the way); it's a matter of value. 4K simply isn't worth the cost of entry. Also, I like how you cite "incredible amounts of detail" as if game assets are even created for 4K. And you're likely playing on a TN LCD screen, which generally has crap colour and contrast ratios, which, as one poster pointed out, make a much bigger difference than resolution.

Of course, I could be wrong and you're rocking a 55" OLED 4K TV, in which case you may have some merit.
 

Jacky60

Golden Member
Jan 3, 2010
I started on a 14-inch CRT and recently upgraded to a 40-inch 4K screen from a 27-inch 2560x1440. People are obsessed because it looks and feels much better, just like my first 28-inch LCD made my old 22-inch CRT feel like an abacus replaced by a pocket calculator.

It's a worrying question to ask, because this is a tech website, and I imagine most people here applaud newer and palpably better technology, apart from a few luddites who are 'happy' in their blissful ignorance of how much more enjoyable and immersive a bigger, better screen is. It reminds me of my late father being unable to understand why anyone would possibly want to use the internet.

The idea that it isn't worth the cost of entry is nonsense. It is once you've experienced it; but obviously, if your monitor is for email and occasionally watching grainy YouTube video, then it isn't worth the cost of entry.
 

Gryz

Golden Member
Aug 28, 2010
Steam Hardware Survey gives good statistics about what gamers are using around the world.
http://store.steampowered.com/hwsurvey
2560x1440 - 1.11%
2560x1600 - 0.09%
2880x1800 - 0.01%
3200x1800 - 0.01%
3440x1440 - 0.04%
3840x2160 - 0.06%

Conclusion: a minuscule portion of gamers has a monitor with a resolution higher than 1080p.
 

Headfoot

Diamond Member
Feb 28, 2008
I find my triple-1080p monitor setup more immersive and impressive than 4K, and it's a bit easier to run.
 

Jacky60

Golden Member
Jan 3, 2010
True, the whole point of 4K is you do not need MSAA anymore.

Another fallacy: 4K looks loads better with AA, though most times FXAA is good enough. All of these variables (resolution, MSAA, FXAA) contribute to a more lifelike experience, but saying AA isn't necessary at 4K is not true unless you like crawling jaggies. I haven't got young eyes, but I know what the real world looks like. Deluding yourself that somehow higher resolution obviates the need for AA is missing the point. These are all ways to incrementally get closer to what we see with our eyes every day, and I don't see crawling jaggies as I glance around the room I'm in.
 

therealnickdanger

Senior member
Oct 26, 2005
Another fallacy: 4K looks loads better with AA, though most times FXAA is good enough. All of these variables (resolution, MSAA, FXAA) contribute to a more lifelike experience, but saying AA isn't necessary at 4K is not true unless you like crawling jaggies. I haven't got young eyes, but I know what the real world looks like. Deluding yourself that somehow higher resolution obviates the need for AA is missing the point. These are all ways to incrementally get closer to what we see with our eyes every day, and I don't see crawling jaggies as I glance around the room I'm in.

I agree for the most part. 4K on a 24" screen makes it harder to see jaggies, but they are clearly visible on anything over 30". I'll gladly accept some jaggies to get a higher frame rate, however.
 

tential

Diamond Member
May 13, 2008
It's not that many of us can't afford 4K with dual or 980 Ti's (that's insulting, by the way); it's a matter of value. 4K simply isn't worth the cost of entry. Also, I like how you cite "incredible amounts of detail" as if game assets are even created for 4K. And you're likely playing on a TN LCD screen, which generally has crap colour and contrast ratios, which, as one poster pointed out, make a much bigger difference than resolution.

Of course, I could be wrong and you're rocking a 55" OLED 4K TV, in which case you may have some merit.
Biggest reason why I'm hesitant. Games just aren't created for high res sometimes, and low-res textures kill the immersion.

In the new Smash Bros. game, when I play on my 80-inch projector, the character models look nice, but sometimes I notice some extremely low-res background textures, which can kill the effect of an otherwise great-looking game. This is far worse for PC titles, though, where I expect higher resolution.
 

BonzaiDuck

Lifer
Jun 30, 2004
There are some things I could say about the appellation "Luddite" here.

I have some retired electronics-tech friends who gather much joy from refurbing and using "old-tech-stuff" to their pleasure and satisfaction.

I'm sure there are plenty of such folks here, but sometimes the forums seem like a convention of spendthrifts, and I include myself.

To resolve all this, however, I was only recently pleased to discover that Unigine Heaven and Valley, and 3DMark Fire Strike Extreme and Ultra, allow you to test a graphics card or CrossFire/SLI setup at higher resolutions while viewing the test on a 1080p monitor. [It appears that my 2x GTX 970 SLI is "good to go."]

So if you really lust after 4K, you can take your time as you build and tweak the rig that will eventually feed it.

My first IBM-compatible computer was a Columbia transportable with a 9" or 10" CRT screen. It was replaced with a 386 and an NEC CRT monitor, the NEC lasting about ten years. Then came a 19" ViewSonic CRT. Not being graphics- or gaming-obsessive back then, we watched our CRTs grow dim as the prices on LCD monitors dropped. We continue to use an old 23" 4:3 ViewSonic LCD upstairs, and it's still "quite bright."

I think if I'm going to worry about when to go "4K," I'll be conflicted about whether to upgrade the LG 42" HDTV or my desktop (BenQ Gaming) monitor. And the only difficulty with the desktop: there's only room here for perhaps a 27" unit, because the desk is also home to another older 4:3 LCD to use with the server when I don't want to work through Remote Desktop or the server dashboard.

In the olden days, we didn't junk our color TV until we began to hate it. So getting Ultra HD for either a TV or a monitor raises the question: what to do with the old 1080p? Goodwill Industries? An optimistic ad in the paper's for-sale section?

There's more involved in these decisions than just shelling out the ducats for a new display -- TV, monitor -- whatever.

But... Wow! Ketchup! This thread seemed to grow like a wildfire in SoCal! Three freakin' pages?! In about a day's time?! Let me go back to the original post so I can pinch myself!
 

sm625

Diamond Member
May 6, 2011
I find my triple-1080p monitor setup more immersive and impressive than 4K, and it's a bit easier to run.

This is probably one of the biggest things 4K has going against it. 1080p has been around for so long that many people have migrated to multiple displays. With 4K you either have to scale back down to one display, which is a kind of downgrade, or try to drive multiple 4K displays, which is insanely cost-prohibitive.
 

Scalesdini

Junior Member
Jun 20, 2015
Anyone who has seen 4K with their own eyeballs wouldn't need to ask why people are obsessed with 4K.

As for one card being incapable of pushing 4K: nonsense. I can play GTA V with everything on very high/ultra besides grass (high) and MSAA (off) on a Tri-X 290 OC at 40-50 fps. Limited to 30 Hz because of the HDMI output on the card, it plays buttery smooth with no dips at all. I can achieve similar results in every other game I play, and expect better from my 980 Ti when it arrives.
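That 30 Hz cap lines up with HDMI 1.4's bandwidth ceiling. A minimal sketch of the pixel-clock arithmetic, assuming the standard CTA-861 4K frame timings and HDMI 1.4's roughly 340 MHz TMDS limit:

```python
# Back-of-envelope pixel-clock check for 4K over HDMI 1.4.
# Total frame size (active + blanking) uses the standard CTA-861 4K timing.
HDMI_1_4_MAX_MHZ = 340.0        # approximate TMDS clock ceiling for HDMI 1.4
TOTAL_W, TOTAL_H = 4400, 2250   # 3840x2160 active area plus blanking

for refresh_hz in (24, 30, 60):
    pixel_clock_mhz = TOTAL_W * TOTAL_H * refresh_hz / 1e6
    verdict = "fits" if pixel_clock_mhz <= HDMI_1_4_MAX_MHZ else "exceeds HDMI 1.4"
    print(f"4K @ {refresh_hz} Hz needs ~{pixel_clock_mhz:.0f} MHz ({verdict})")
```

4K at 30 Hz needs roughly 297 MHz, which fits; 60 Hz needs about 594 MHz, which is why 4K60 had to wait for HDMI 2.0 or DisplayPort.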

Really, 4K's biggest problem is misinformation. Browsing tech sites, you find the same things everywhere: TVs are bad and should be shunned for gaming (a silly sentiment), benchmarks run at 4K with tons of MSAA, benchmarks using older CPUs (I've noticed Haswell-E systems get significantly better 4K results than other CPUs, especially in GTA V), and of course people who insist there's no "need" for 4K (need is subjective).

If you can afford the screen and the GPU to drive it, it is glorious. If not, that's fine too; 4K will keep dropping in price and you can get in on the beauty later on. It's funny to see so many people on tech sites shunning new tech because they can't afford it.
 

4K_shmoorK

Senior member
Jul 1, 2015
All these people replying "4K isn't worth it, you need too much horsepower" with either Titan X's or 980 Tis in SLI in your signature: WHY on earth would you spend $2K on graphics to play at 1080p or 1440p???

What a colossal waste of CASH!!

There is no WAY you would not be able to run most games at 60 FPS at 4K resolution with specs like that.

BTW, the cost of a 4K monitor is being grossly exaggerated in this thread; I just picked up the Acer XB280HK G-SYNC for $500 for my *lowly* single 980 Ti.

You guys need to get your heads out of your butts. 4K is not the future? 4K is a waste? This is tech advancing! Jeez, so much negativity about a resolution that will soon become the standard.

"You offend by bringing up things I can't afford"? GTFO of here, man. PC gaming is very much pay to play. If you can't afford 4K, then don't buy 4K, man. NO ONE CARES WHAT YOU HAVE AND DON'T HAVE, AND NO SENSIBLE PERSON WOULD EVER SHAME THOSE WHO DON'T HAVE THE LATEST AND GREATEST!

4K is great and is here to stay; so what if current GPUs have trouble pushing the pixels? It's new tech soon to become current tech, get over it. New tech always has a price premium.

I get the "I need 120/144 Hz, man" argument, but I don't understand the "60 Hz makes my eyes hurt and is poverty spec" one. If you really believe 144 Hz will put you at the top of the leaderboard in every game, then so be it. Everyone needs to calm down and take a breath.

Some people, man...
 

4K_shmoorK

Senior member
Jul 1, 2015
Anyone who has seen 4K with their own eyeballs wouldn't need to ask why people are obsessed with 4K.

As for one card being incapable of pushing 4K: nonsense. I can play GTA V with everything on very high/ultra besides grass (high) and MSAA (off) on a Tri-X 290 OC at 40-50 fps. Limited to 30 Hz because of the HDMI output on the card, it plays buttery smooth with no dips at all. I can achieve similar results in every other game I play, and expect better from my 980 Ti when it arrives.

Really, 4K's biggest problem is misinformation. Browsing tech sites, you find the same things everywhere: TVs are bad and should be shunned for gaming (a silly sentiment), benchmarks run at 4K with tons of MSAA, benchmarks using older CPUs (I've noticed Haswell-E systems get significantly better 4K results than other CPUs, especially in GTA V), and of course people who insist there's no "need" for 4K (need is subjective).

If you can afford the screen and the GPU to drive it, it is glorious. If not, that's fine too; 4K will keep dropping in price and you can get in on the beauty later on. It's funny to see so many people on tech sites shunning new tech because they can't afford it.

AMEN BROTHER :thumbsup:
 

shady28

Platinum Member
Apr 11, 2004
Steam Hardware Survey gives good statistics about what gamers are using around the world.
http://store.steampowered.com/hwsurvey
2560x1440 - 1.11%
2560x1600 - 0.09%
2880x1800 - 0.01%
3200x1800 - 0.01%
3440x1440 - 0.04%
3840x2160 - 0.06%

Conclusion: a minuscule portion of gamers has a monitor with a resolution higher than 1080p.

Those stats explain the situation.

People do not use 4k TVs to play PC games. Most are using upconverted 1080p, if they even have that.

The whole 4K thing on PCs is statistically irrelevant. Literally 99.94% of gamers are not using 4K, and nearly 98.7% are using something less than 1440p.
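Those two figures check out; a minimal sanity check, assuming the shares quoted above are the only above-1080p modes counted:

```python
# Shares (percent) for the above-1080p modes quoted from the survey above.
above_1080p = {
    "2560x1440": 1.11,
    "2560x1600": 0.09,
    "2880x1800": 0.01,
    "3200x1800": 0.01,
    "3440x1440": 0.04,
    "3840x2160": 0.06,
}

not_4k = 100 - above_1080p["3840x2160"]          # everyone not on 4K
below_listed = 100 - sum(above_1080p.values())   # everyone below these modes

print(f"not using 4K:       {not_4k:.2f}%")        # 99.94%
print(f"below 1440p-class:  {below_listed:.2f}%")  # 98.68%
```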
 

B-Riz

Golden Member
Feb 15, 2011
Another fallacy: 4K looks loads better with AA, though most times FXAA is good enough. All of these variables (resolution, MSAA, FXAA) contribute to a more lifelike experience, but saying AA isn't necessary at 4K is not true unless you like crawling jaggies. I haven't got young eyes, but I know what the real world looks like. Deluding yourself that somehow higher resolution obviates the need for AA is missing the point. These are all ways to incrementally get closer to what we see with our eyes every day, and I don't see crawling jaggies as I glance around the room I'm in.

I respectfully disagree with you.

AA should not be needed if the games were designed with 4K in mind.

It's all about PPI.

http://www.maximumpc.com/4k-monitor-2014/#page-1

But all these panels are still half-arsed UHD-1, not 4K CDI.
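The PPI argument is easy to make concrete with the standard diagonal-PPI formula; a minimal sketch (the screen sizes are illustrative picks, not from the linked article):

```python
# PPI = diagonal pixel count / diagonal size in inches.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI
print(f'28" UHD-1: {ppi(3840, 2160, 28):.0f} PPI')   # ~157 PPI
print(f'40" UHD-1: {ppi(3840, 2160, 40):.0f} PPI')   # ~110 PPI
```

A 40" UHD-1 panel ends up only modestly denser than a 24" 1080p one, which is why the reply below reports jaggies returning on larger screens.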
 

therealnickdanger

Senior member
Oct 26, 2005
I respectfully disagree with you. AA should not be needed if the games were designed with 4K in mind. It's all about PPI. http://www.maximumpc.com/4k-monitor-2014/#page-1 But all these panels are still half-arsed UHD-1, not 4K CDI.

You mean DCI? DCI is for cinema (Digital Cinema Initiatives) and sets standards for commercial theater digital projection. Not that it matters anyway; no studio is going to sell it at Best Buy or Walmart. If you want to buy yourself a $200,000 Christie projector, another $20,000-60,000 for a first-run DCI media server, and then another $200-500 PER VIEWING, then you can enjoy the full cinema experience in your home.

PPI matters, but like I said before, once you get over 30", the jaggies become noticeable, no matter if the game was "designed with 4K in mind" - whatever that means. LOL
 

BonzaiDuck

Lifer
Jun 30, 2004
All these people replying "4K isn't worth it, you need too much horsepower" with either Titan X's or 980 Tis in SLI in your signature: WHY on earth would you spend $2K on graphics to play at 1080p or 1440p???

What a colossal waste of CASH!!

There is no WAY you would not be able to run most games at 60 FPS at 4K resolution with specs like that.

BTW, the cost of a 4K monitor is being grossly exaggerated in this thread; I just picked up the Acer XB280HK G-SYNC for $500 for my *lowly* single 980 Ti.

You guys need to get your heads out of your butts. 4K is not the future? 4K is a waste? This is tech advancing! Jeez, so much negativity about a resolution that will soon become the standard.

"You offend by bringing up things I can't afford"? GTFO of here, man. PC gaming is very much pay to play. If you can't afford 4K, then don't buy 4K, man. NO ONE CARES WHAT YOU HAVE AND DON'T HAVE, AND NO SENSIBLE PERSON WOULD EVER SHAME THOSE WHO DON'T HAVE THE LATEST AND GREATEST!

4K is great and is here to stay; so what if current GPUs have trouble pushing the pixels? It's new tech soon to become current tech, get over it. New tech always has a price premium.

I get the "I need 120/144 Hz, man" argument, but I don't understand the "60 Hz makes my eyes hurt and is poverty spec" one. If you really believe 144 Hz will put you at the top of the leaderboard in every game, then so be it. Everyone needs to calm down and take a breath.

Some people, man...

Reasonable points, I think.

I gave my "monitor history" earlier. When HD had already established itself, I picked up a budget-priced Hanns-G, more for use with my HTPC-functional system before I went for a "real" HDTV. It was the most god-awful, short-lived, relatively crappy 1080p monitor I could get, and it seemed "just fine" until it crapped out after a year as a desktop monitor and I revisited the old customer reviews. I replaced it with a BenQ XL2420Z gaming monitor (1080p but 144 Hz) as a stop-gap choice, because there are more considerations I need to accommodate than just UHD: I'll want to upgrade and replace my KVM box, for instance. Which graphics card or cards? I resolved that.

"Need" in its most severe usage might better be applied to something like gasoline or home-heating oil -- it is something you can't do without. You have to put steaks in the freezer. You have to buy clothing. You have to have a roof over your head.

Somebody might "need" 4K because of certain computer applications. After that, I'd agree that it's "subjective."

So does this addition of "Beauty" seem worth it now, in terms of opportunity cost applied to other things one just wants? I suppose that's the crux of it.

You're also right that the prices are already dropping. Again, that's no less a relevant factor: how much do I want to pay, now or later, for extra resolution?

I won't criticize someone who wants (or needs!) dual $1K Titan cards. Obviously, if you get such items, you almost "need" a 4K monitor -- no . . freakin' . . . doubt.

But the stats sort of speak for themselves. Getting to the point of being "left behind" in tech-obsolescence is going to be a slow, lazy journey.