Help me understand 4K?

thehotsung8701A

Senior member
May 18, 2015
584
1
0
I've been waiting to ask this question for a long time now, but I just didn't know how to word it correctly.

Many moons ago we had a resolution that was better than 1080p, and it made the PC the superior choice for gamers. Battlefield 2142 supported it. That resolution is 1200p (1920x1200).

I'm still using 1080p, which is probably as basic as it gets in 2015. I want to upgrade, but there is no way for me to know what 1440p and 4K look like through a screenshot or a YouTube video, because a 1080p screen cannot show those resolutions. This leaves me at a standstill: how can I buy something that I can't even see first hand?

Here are some questions I have for you guys regarding resolutions above 1080p and the differences they make.

1. Could someone name all the resolutions above 1080p and list which ones are considered 4K and which are not? There are so many different resolutions, and I notice that HDTVs also have different 4K resolutions.

2. What exactly is 1440p? Is it not considered 4K? Is it considered 2K? How do these two resolutions and their many variants look compared to 1080p? Is it just smaller text and a sharper image, or does it add more detail overall? Is it a significant difference, meaning will it be night and day compared to 1080p? What does increased resolution do when it comes to gaming?

A. Will it still support 1080p games? Will 1080p games in full screen be stretched out? Does a bigger resolution mean a bigger monitor size? And if not, what is the point of it? Say you have a 22-inch 1080p monitor versus a 22-inch 1440p monitor and you're playing a 1080p game. Wouldn't you be forced to run 1080p in a window on the 1440p monitor, unlike on the 1080p monitor?

B. For new games like Witcher 3, is 1440p on low-to-medium settings better looking than 1080p on high settings? If I buy any resolution above 1080p, is 1080p supported as well?

C. How much of a difference is there between 1440p and 4K?


3. What is the average monitor size for users who game at 1440p and users who game at 4K?

4. Why are gamers spending the same amount of money on a G-Sync 1440p monitor when they could buy a 4K monitor instead?

A. Is 4K the only resolution that can display DX12 and virtual reality? Is DX12 = VR, or is VR something separate from DX12?

5. How big of a difference does 4K make compared to 1080p in terms of price-to-performance ratio?

6. If a 4K monitor is the same size as a 1080p monitor, is there any huge difference, or do you need a 27-inch or larger monitor to see the difference? Meaning, is there a perfect monitor size for each resolution?

7. Also, is it true that 1440p is bad for people with bad eyesight? If so, what about 4K: is that good or bad for people without perfect vision?

8. Will games that support 4K look so much better than 1080p that there is no going back to 1080p after experiencing 4K?

A. Why do most gamers still use 1440p instead of 4K?

9. How does a 4K monitor handle games that only support resolutions lower than 1080p (this question applies to 1440p as well)? Will it support all the low resolutions that are available on a 1080p monitor?

10. Upgrading to a 4K monitor requires a beefy GPU and comes at the cost of reduced fps compared to 1080p, so is it worth it in the end?

11. Is 4K worth it? How is it worth it? Will it make 1080p obsolete?

12. What is the difference between 60 Hz and 144 Hz? Can you actually tell the difference? My current monitor is the BenQ Senseye 3 GW2255 LED. Is that a good or bad monitor? Here is the link to it.

http://www.benq.us/product/monitor/GW2255/specifications/

Also, my monitor has an awful response time; I'm not sure what that actually affects.

What is the difference between IPS and TN monitors, and why do people prefer IPS over TN? Are there any other monitor technologies that I should be aware of?

13. There appear to be two 4K resolutions, 3840x2160 and 4096x2160. Which one is better, and why would someone choose the 3840 one over the 4096 one?

14. I don't know what these monitors look like in person, but I used to have an Asus 1080p monitor and it was not the typical widescreen shape. It was an ultra-wide 1080p and I didn't really like it; I didn't like how much longer the horizontal length was compared to a regular non-ultra-wide screen. In your opinion, do you buy monitors in their normal shape, or do you prefer ultra-wide screens and other stretched variations?


Off topic: When I used to connect my laptop to my HDTV, there was always a delay. Say I shot someone in CS; it would take a while for the TV to register it. I guess the term for that is input lag. Is input lag the fault of the HDTV's refresh rate, or of its response time?


Sorry guys, I asked a lot of questions, but this is something I've been wanting to ask for a very long time now.

This is a great forum and you guys know a lot. Thanks!
 

RadiclDreamer

Diamond Member
Aug 8, 2004
8,622
40
91
It's called 4K because of the horizontal resolution: 4096 x 2160... 4096 ≈ 4K. Same reason 1920x1080 is called... well, 1080 :D

As far as I'm concerned, and this is just my personal opinion, 1440p is going to be fine for some years to come just because of GPU limitations. Everything I've seen with 4K needs multi-GPU, and I personally don't want to pay the price of SLI or CrossFire just to have a crappy experience with stuttering and instability.

Essentially all you are getting is more pixels, and more pixels in a given screen size = smaller pixels; smaller pixels = less aliasing, and that leads to better overall picture quality as well as making larger/clearer textures look nicer.
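If you want rough numbers on the "smaller pixels" part, here's a quick back-of-the-envelope sketch (just the standard diagonal PPI formula, purely illustrative):

```python
import math

def ppi(diag_inches, width_px, height_px):
    """Pixels per inch from the diagonal size and the pixel dimensions."""
    return math.hypot(width_px, height_px) / diag_inches

# Same 27" panel size, three common resolutions
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K UHD": (3840, 2160)}.items():
    print(f'{name:7s} on 27": {ppi(27, w, h):5.1f} PPI')

# Prints roughly 82, 109 and 163 PPI: same screen size,
# progressively smaller pixels and finer detail.
```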

As for why someone would buy G-Sync: because the tearing just sucks. Lots of people knock it, but it really makes a noticeable difference.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
It's called 4K because of the horizontal resolution: 4096 x 2160... 4096 ≈ 4K. Same reason 1920x1080 is called... well, 1080 :D

As far as I'm concerned, and this is just my personal opinion, 1440p is going to be fine for some years to come just because of GPU limitations. Everything I've seen with 4K needs multi-GPU, and I personally don't want to pay the price of SLI or CrossFire just to have a crappy experience with stuttering and instability.

Essentially all you are getting is more pixels, and more pixels in a given screen size = smaller pixels; smaller pixels = less aliasing, and that leads to better overall picture quality as well as making larger/clearer textures look nicer.

As for why someone would buy G-Sync: because the tearing just sucks. Lots of people knock it, but it really makes a noticeable difference.

Why is it not called 2160p, since 1080p is not called 1920p? Just kind of curious.

Will the AMD Fury GPU be able to handle 4K resolution at 60 fps?

I'm going to edit my post; I forgot to ask these questions.

1. What is the difference between 60 Hz and 144 Hz? My current monitor is the BenQ Senseye 3 GW2255 LED. Is that a good or bad monitor? Here is the link to it.

http://www.benq.us/product/monitor/GW2255/specifications/

Also, my monitor has an awful response time; I'm not sure what that actually affects.

What is the difference between IPS and TN monitors, and why do people prefer IPS over TN? Are there any other monitor technologies that I should be aware of?

Off topic: When I used to connect my laptop to my HDTV, there was always a delay. Say I shot someone in CS; it would take a while for the TV to register it. I guess the term for that is input lag. Is input lag the fault of the HDTV's refresh rate, or of its response time?
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
The higher resolutions really do look fantastic. I really want to move on from this 120Hz 1080p 27" Asus, but I'm really sensitive to refresh rates so I have to stay at 120-144Hz. I would like to move to the Ultra-Wide screen format at about 34", they just don't make those yet at those refresh rates. IPS would be nice but I'd be happy to settle with TN as long as it fits the bill in the other regards. It was rumored the upcoming Acer Predator XR341CK was going to do all that but it ended up falling short on refresh rate.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
> Will the AMD Fury GPU be able to handle 4K resolution at 60 fps?

No. No single card can, including a Titan.

2 x Fury might. Or not. No one knows yet.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Witcher 3 looks godlike at 4k. At 1080p it looks smeared and lacking in detail after you've seen it at 4k.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
there are actually two 4Ks: there's double 1080p in each direction (3840x2160) and what he said, 4096x2160

So which resolution do most gamers who game in 4K use? I take it the 4096 one would be more expensive?

The higher resolutions really do look fantastic. I really want to move on from this 120Hz 1080p 27" Asus, but I'm really sensitive to refresh rates so I have to stay at 120-144Hz. I would like to move to the Ultra-Wide screen format at about 34", they just don't make those yet at those refresh rates. IPS would be nice but I'd be happy to settle with TN as long as it fits the bill in the other regards. It was rumored the upcoming Acer Predator XR341CK was going to do all that but it ended up falling short on refresh rate.

How does one game on a 27" or larger monitor while sitting so close to the screen? Isn't that bad for your eyes? And what does a higher refresh rate actually look like?

Witcher 3 looks godlike at 4k. At 1080p it looks smeared and lacking in detail after you've seen it at 4k.

Really? That means 4K is a must for me, because currently I'm gaming on medium at 1050p and it sucks. This is the first game I can't even run at 1080p.
 

Annisman*

Golden Member
Aug 20, 2010
1,918
89
91
So which resolution do most gamers who game in 4K use? I take it the 4096 one would be more expensive?

How does one game on a 27" or larger monitor while sitting so close to the screen? Isn't that bad for your eyes? And what does a higher refresh rate actually look like?

Really? That means 4K is a must for me, because currently I'm gaming on medium at 1050p and it sucks. This is the first game I can't even run at 1080p.

TW3 might look sweet at 4K but it's probably a slideshow during actual gameplay. You've got to find your balance between IQ and fluidity. For me, fluidity trumps IQ all day long.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
TW3 might look sweet at 4K but it's probably a slideshow during actual gameplay. You've got to find your balance between IQ and fluidity. For me, fluidity trumps IQ all day long.

I normally Google stuff, but there's so much to Google nowadays, so may I ask: what is IQ and what is fluidity?
 

Annisman*

Golden Member
Aug 20, 2010
1,918
89
91
IQ = image quality; for example, 'Ultra' settings have better IQ than 'High' settings in a video game.

Fluidity refers to your user experience during gameplay regardless of the IQ: how smooth it feels, etc. Some of the things to consider are frame rate, input lag, and micro-stutter.

For many of us a more fluid gaming experience is more important than the Image Quality (IQ). Of course, we would all love to have the best IQ coupled with the smoothest gameplay but when you start talking about high resolutions such as 4K it is almost impossible to get both at the same time. That is the compromise the 4K gamer has to make for themselves.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
IQ = image quality; for example, 'Ultra' settings have better IQ than 'High' settings in a video game.

Fluidity refers to your user experience during gameplay regardless of the IQ: how smooth it feels, etc. Some of the things to consider are frame rate, input lag, and micro-stutter.

For many of us a more fluid gaming experience is more important than the Image Quality (IQ). Of course, we would all love to have the best IQ coupled with the smoothest gameplay but when you start talking about high resolutions such as 4K it is almost impossible to get both at the same time. That is the compromise the 4K gamer has to make for themselves.

So is 1440p what most users game at nowadays? Does 1440p support DX12 and VR?

Also, does a Pascal GPU mean you can game at 4K at 60 fps or higher with HairWorks and everything turned on?
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
How does one game on a 27" or larger monitor while sitting so close to the screen? Isn't that bad for your eyes? And what does a higher refresh rate actually look like?
27" isn't that bad; I guess I sit between a foot and a half and two feet from the monitor. Of course it looked humongous the first few days after I received it, but like anything else you get used to it. It's just a monitor at this point. A low refresh rate will be perceived as a very jittery experience; again, it's going to depend on what you're used to. Those who aren't used to 120+ Hz probably wouldn't see anything wrong with 60 Hz. If something happens to switch my refresh rate back to 60, I notice absolutely instantly in game; most of the time I won't even make it that far, since I usually notice something isn't quite right on the Windows desktop.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
4K is simply a resolution setting on a monitor, man. It's not some magic that makes you have to relearn everything you used to know.
DirectX 12 will work on it and on every other resolution, as DirectX doesn't have anything to do with resolution. And VR will use 4K when GPUs are strong enough to power it.

Understand the basic concepts, my friend.
Using a higher resolution requires a higher-end GPU, which costs money. 4K is the highest resolution you can realistically get.
Using a higher resolution requires a higher-end monitor to support that resolution, which costs money. 4K monitors are on the very high end, so they cost a decent amount relative to other monitors.

Haha, that clarifies some of your questions as to why gamers use the resolutions they do. There's more to it, obviously, but that's just the basics.

Edit

As for how much better resolution X is than resolution Y? You should just see some panels in person, or take a leap of faith. Only way. I'm hopping straight into 4K with a $3K purchase in GPUs and monitors by the time I'm done over the next year, and I'm happy to do so. I've seen 4K shots downsampled to 1080p and that is more than enough to convince me.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Yeah, some of the naming conventions around resolutions are strange. I have heard someone say Quad HD for 1440p (WTF?), and Ultra HD for 4K, when in fact 4K is 4 x 1080p, or Quad HD.

For me, UHD is too small unless you use Windows scaling on 8.1... For me, 1440p at 28" is my best fit.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
So which resolution do most gamers who game in 4K use? I take it the 4096 one would be more expensive?
All consumer gear is 3840x2160. 4096x2160 is the so-called "Cinema 4K" standard, and as implied by the name, is only in use by cinemas. It's meant to be a bit wider to accommodate 1.85:1 aspect ratio movies, which is one of the two common aspect ratios in American cinemas.

In other words, you don't even need to worry about the other 4K standard. Just 3840x2160. :)
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
4K is simply a resolution setting on a monitor, man. It's not some magic that makes you have to relearn everything you used to know.
DirectX 12 will work on it and on every other resolution, as DirectX doesn't have anything to do with resolution. And VR will use 4K when GPUs are strong enough to power it.

Understand the basic concepts, my friend.
Using a higher resolution requires a higher-end GPU, which costs money. 4K is the highest resolution you can realistically get.
Using a higher resolution requires a higher-end monitor to support that resolution, which costs money. 4K monitors are on the very high end, so they cost a decent amount relative to other monitors.

Haha, that clarifies some of your questions as to why gamers use the resolutions they do. There's more to it, obviously, but that's just the basics.

Edit

As for how much better resolution X is than resolution Y? You should just see some panels in person, or take a leap of faith. Only way. I'm hopping straight into 4K with a $3K purchase in GPUs and monitors by the time I'm done over the next year, and I'm happy to do so. I've seen 4K shots downsampled to 1080p and that is more than enough to convince me.

So 4K would be future-proof, right? If a game doesn't run well at 4K, I can always run it at 1440p or 1080p, correct? If that's the case, I see no reason not to go with 4K over 1440p. Am I missing something?
 

RaulF

Senior member
Jan 18, 2008
844
1
81
Yeah, some of the naming conventions around resolutions are strange. I have heard someone say Quad HD for 1440p (WTF?), and Ultra HD for 4K, when in fact 4K is 4 x 1080p, or Quad HD.

For me, UHD is too small unless you use Windows scaling on 8.1... For me, 1440p at 28" is my best fit.

That's because the base HD standard is not 1080 but 720. WQHD is 4 times HD, aka 720p.
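The pixel counts bear that out. A quick sanity check (purely illustrative):

```python
# Each step up is exactly 4x the pixel count of the step below it
# (2x in each dimension).
hd  = 1280 * 720    #   921,600 px  (HD / 720p)
qhd = 2560 * 1440   # 3,686,400 px  (QHD / 1440p)
fhd = 1920 * 1080   # 2,073,600 px  (Full HD / 1080p)
uhd = 3840 * 2160   # 8,294,400 px  (UHD "4K" / 2160p)

print(qhd / hd)    # 4.0 -> "Quad HD" is four times 720p
print(uhd / fhd)   # 4.0 -> UHD is four times 1080p
```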
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So 4K would be future-proof, right? If a game doesn't run well at 4K, I can always run it at 1440p or 1080p, correct? If that's the case, I see no reason not to go with 4K over 1440p. Am I missing something?

Resolutions look best when set to the monitor's native resolution. If they are not, scaling is used, which doesn't look as good as using a monitor with that native resolution.

Also, to your previous questions: 2K stands for a number of resolutions, including 1080p. 4K is being used simply as a marketing term rather than 2160p; 4K is easier to write, and most 4K monitors are 3840x2160.

There are a couple of reasons you might choose a 1440p (2560x1440) monitor, or even a 1080p monitor, over a 4K monitor. There are refresh rates up to 144 Hz at the former resolutions, whereas 4K is limited to 60 Hz (and beware of the 30 Hz ones as well). And if you do not have the GPU horsepower, you may have to use a lower resolution, which degrades the image with poor scaling.
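To put rough numbers on the scaling point (a simplified sketch; real monitor scalers do more than divide two numbers):

```python
# Scale factors when running a non-native resolution full screen on a
# 3840x2160 (4K) panel. Non-integer factors mean source pixels can't line
# up with physical pixels, so interpolation softens the image.
native_width = 3840

for w, h in [(1920, 1080), (2560, 1440)]:
    scale = native_width / w
    kind = "integer (can map cleanly)" if scale.is_integer() else "non-integer (interpolated, softer)"
    print(f"{w}x{h} on 4K: {scale:.2f}x -> {kind}")

# 1920x1080 -> 2.00x (each pixel could map to a clean 2x2 block, if the scaler does that)
# 2560x1440 -> 1.50x (pixels straddle physical pixels, so the result looks blurrier)
```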
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
4K is a stupid waste of time and money like anything that is bleeding edge and not mainstream. May be a flop like 3D. Who knows. Come back in 5yrs.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
Resolutions look best when set to the monitor's native resolution. If they are not, scaling is used, which doesn't look as good as using a monitor with that native resolution.

Also, to your previous questions: 2K stands for a number of resolutions, including 1080p. 4K is being used simply as a marketing term rather than 2160p; 4K is easier to write, and most 4K monitors are 3840x2160.

There are a couple of reasons you might choose a 1440p (2560x1440) monitor, or even a 1080p monitor, over a 4K monitor. There are refresh rates up to 144 Hz at the former resolutions, whereas 4K is limited to 60 Hz (and beware of the 30 Hz ones as well). And if you do not have the GPU horsepower, you may have to use a lower resolution, which degrades the image with poor scaling.

Let's say I get a 32-inch 4K monitor. If I want to run 1440p at its native resolution, I'll just run it in windowed mode, and it would still be about the same size as a 27-inch 1440p monitor, would it not?
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Let's say I get a 32-inch 4K monitor. If I want to run 1440p at its native resolution, I'll just run it in windowed mode, and it would still be about the same size as a 27-inch 1440p monitor, would it not?

No, it won't. It will scale to fit the screen. If your goal is gaming, unless you are prepared to lay out some pretty significant cash, go with a 27" 1440p monitor. For productivity, a 40" 2160p monitor is absolutely brilliant provided you have the desk space. Anything much smaller than that, 32" or less, and your eyes will hate you. At 40", a 2160p screen has basically the same dpi as a 27" 1440p monitor.
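For what it's worth, the numbers back that last claim up: the same rough diagonal-PPI math as earlier in the thread gives almost identical pixel densities (illustrative only):

```python
import math

def ppi(diag_inches, width_px, height_px):
    # Pixels per inch from the diagonal size and the pixel dimensions
    return math.hypot(width_px, height_px) / diag_inches

print(round(ppi(40, 3840, 2160)))  # ~110 PPI for a 40" 2160p screen
print(round(ppi(27, 2560, 1440)))  # ~109 PPI for a 27" 1440p screen
```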

escrow4 said: 4K is a stupid waste of time and money like anything that is bleeding edge and not mainstream. May be a flop like 3D. Who knows. Come back in 5yrs.

Quite insightful, with some very useful information to back it up.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
4K is a stupid waste of time and money like anything that is bleeding edge and not mainstream. May be a flop like 3D. Who knows. Come back in 5yrs.

4K isn't bleeding edge; it's mainstream and affordable now, especially for things like TVs.
You can pick up a cheap 42" 4K TV for $400, or a 28" monitor for the same price. Most people wouldn't consider those prices "bleeding edge" for something that can last a long while.