Help me understand 4K?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
> [ will ] the AMD Fury GPU be able to handle 4K resolution at 60fps?

No. No single card can, including a Titan.

2 x Fury might. Or not. No one knows yet.

Actually it depends on what settings you need to run for satisfactory IQ and what fps. That's a personal thing though. Maxed out? Any game? Probably need three.
 

dave1029

Member
May 11, 2015
94
1
0
Ok. 4K is just 4x the pixels of 1080p. Upping the resolution makes the image crisper and clearer, and it lets you see more texture detail (even in games as old as 2011) that you can't really notice in the pixel soup that is 1080p.

The main limitation for most gamers is not the price of the monitor itself, but the price of the monitor plus having to buy two 980 Tis. It also basically forces you to go Nvidia, since AMD (for the moment) has absolutely nothing that can compete with 980 Tis in SLI. So many people will settle for 1440p with G-Sync @ 144Hz, which is probably the best all-around setup: G-Sync is amazing, 144 Hz is amazing, and the resolution is good enough.

I myself use an Acer XB280HK, which is a G-Sync 4K monitor. I personally love it and wouldn't downgrade to 1440p. With that being said, what you should get definitely depends on how much money you have to spend. If you decide on 4K, you're going to want to grab my monitor along with two 980 Tis; that will give you the best gaming experience @ 4K that you can get right now. Even when AMD launches their new stuff, my setup will still be the best because of G-Sync. If you don't want to spend the $3,500 my entire setup would cost, a 1440p 120/144Hz monitor is a great option along with a single 980 Ti. Or you can stay at 1080p. But I will warn you: every time you up the resolution a notch, you can't go back down. So make sure your hardware can support the increase.

But honestly man, I would recommend waiting on Pascal in 2016. By then, the top card will be able to push framerates @ 4K above 60 fps. The Titan X is hovering around 45 fps average, Pascal will probably be a 60% gain... so that's around 70 fps.
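That estimate is simple multiplication; here is the napkin math as a sketch (both inputs are the post's guesses, not benchmark numbers):

```python
# Napkin math for the Pascal guess above. Both inputs are assumptions
# from the post, not measured benchmarks.
titan_x_4k_fps = 45          # claimed Titan X average at 4K
pascal_uplift = 0.60         # guessed generational gain

estimate = titan_x_4k_fps * (1 + pascal_uplift)
print(f"{estimate:.0f} fps")  # ~72 fps, i.e. "around 70"
```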
 

kasakka

Senior member
Mar 16, 2013
334
1
81
1. Could someone name all the resolutions after 1080p and list which ones are considered 4K and which are not?

As mentioned, 4K is 4096x2160 or 3840x2160.

2. What exactly is 1440p? Is that not considered 4K? Is it considered 2K? How do these two resolutions and their many variants look compared to 1080p? Is it just smaller text and a sharper image, or does it add more detail overall? Is it a significant difference, meaning will it be night and day compared to 1080p? What does increased resolution do when it comes to gaming?

1440p is 2560x1440 so it's essentially 2.5k. 2k is 2048x1080 aka Digital Cinema 2k. You won't find 2k monitors really. At higher resolution you are able to see finer details in the game. Everything looks sharper and text is less pixelated. On the desktop you gain more space for your apps. Even 1080p -> 1440p is a big change, that's a lot more pixels.
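The pixel counts behind that comparison can be sketched quickly (the labels are the common marketing names, not anything official):

```python
# Total pixels for the resolutions discussed above, relative to 1080p.
resolutions = {
    "1080p":  (1920, 1080),
    "DCI 2K": (2048, 1080),
    "1440p":  (2560, 1440),
    "UHD 4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:>9,} px  ({w * h / base:.2f}x 1080p)")
# 1440p works out to ~1.78x the pixels of 1080p; UHD 4K is exactly 4x.
```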

A. Will it still support 1080p games? Will 1080p games in full screen be stretched out? Does a bigger resolution mean a bigger monitor size?

Yes, you can still play at 1080p. Since it's not the native resolution, it will be blurrier than with a native res 1080p display. That's why it's recommended to use the native res where possible. However games that only support up to 1080p will work fine.

Bigger resolutions usually come at bigger display sizes. The relationship of display size vs resolution vs viewing distance is important. Sitting on the sofa several feet away, you would not be able to see the pixels of a 40" 1080p TV but when sitting in front of a 40" 1080p monitor it would be a pixelated mess due to the viewing distance. Same goes for resolution - the larger the display, the larger the pixels (at the same resolution). At resolutions above 1440p this becomes largely irrelevant though as they just have so many pixels that you can't notice the individual ones.
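The size-vs-resolution tradeoff above boils down to pixel density; a small sketch using the standard diagonal-based PPI formula:

```python
import math

# Pixels per inch for the size/resolution pairs mentioned above: the
# same 1080p grid gives very coarse pixels on a 40" panel.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'40" 1080p: {ppi(1920, 1080, 40):.0f} ppi')  # ~55 ppi, chunky up close
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} ppi')  # ~109 ppi
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} ppi')  # ~163 ppi
```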

So choose a display not only based on resolution but also based on how far you will be sitting from it. 27" is a pretty good size to be able to fully see the monitor when sitting right in front of it.

B. For new games like Witcher 3, is 1440p on low to medium settings better looking than 1080p on high settings?

Depends on the game's settings and how much they affect image quality, and on whether you prefer a sharper image or more detailed textures, shadows and assets on screen.

C. How much of a difference is there between 1440p and 4K?

A lot. 4k has so many pixels that text looks like what you'd see in a book. With 4k you also pretty much have to use DPI scaling. This Windows feature scales text and UI so that elements are not too small on a 27" 4k display. It's pretty much mandatory for 4k unless you have a huge screen. So purely in terms of desktop space, 1440p and 4k are pretty much equal; it's just that 4k uses scaling and 1440p does not. You can try the feature with your current monitor and see how it changes everything.
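The "equal desktop space" point can be put in numbers; a sketch (the 150% figure is my assumption of a typical Windows scale factor for a 27" 4k display):

```python
# Effective desktop real estate under DPI scaling: 4K at 150% gives
# roughly the same UI space as 1440p at 100%.
def effective_space(width_px, height_px, scale_pct):
    factor = scale_pct / 100
    return round(width_px / factor), round(height_px / factor)

print(effective_space(3840, 2160, 150))  # (2560, 1440)
print(effective_space(2560, 1440, 100))  # (2560, 1440)
```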

The problem with DPI scaling is not all programs support it properly. Any custom window UI software has problems unless it has support for it specifically. Google Chrome is a known offender for example. Likewise apps using legacy Windows APIs have issues, either they show tiny icons or look blurry. Many games can also have issues with it where game UI is unreadably small on a 4k display because it reverts to 1:1 scaling and does not size UI elements accordingly. Unfortunately it's a problem that only goes away with newer, better programmed software.

3. What is the average monitor size for users who game at 1440p and users who game at 4K?

While there are 24" 4K displays, most are 27" and above. 1440p displays are firmly stuck at 27".

4. Why are gamers spending the same amount of money on a G-Sync 1440p monitor when they could buy a 4K monitor instead?

Because 4k is still an immature tech and requires very high end GPUs to get good framerates. Add to that the scaling issues mentioned above and being stuck at 60 Hz (because DisplayPort does not yet support higher refresh rates at 4k). It's generally an expensive investment right now. I'd say next year is the earliest I'd consider 4k, and only if we get 4k at higher refresh rates.

A. Is 4K the only resolution that can display DX12 and Virtual Reality? Is DX12 = VR or is VR something else separate from DX12?

4k has nothing to do with DX12 or VR. It's just a resolution. DX12 depends on GPU support and is just an API; it has nothing to do with VR on its own. VR has its own display with a certain resolution.

5. How big of a difference does 4K make compared to 1080p in terms of price to performance ratio?

Massive. You can get great 1080p performance with a single GTX 970 but for 4k you will need a Titan X (or 980 Ti) but preferably at least 980 SLI.

6. If a 4K monitor is the same size as a 1080p monitor, is there any huge difference, or do you need a 27-inch or larger monitor to see the difference? Meaning, is there a perfect monitor size for each resolution?

This was mostly answered above. There is an ideal size for a resolution: one where the screen is large, but not so large that you can see individual pixels. For 1440p, 27" is it. Desktop displays above 30" usually become unwieldy unless you put the display further away from you; there's just too much screen to fit in your view at once.

7. Also is it true that 1440p is bad for people with bad eyesight? If so, what about 4K, is that good or bad for people without perfect vision?

Sharper, more detailed text (more pixels to represent each character) is easier to read. If you have bad eyesight you can use the DPI scaling to make text and UI elements larger. Of course you should have the proper glasses first. I have rather poor eyesight and have no problems with a 27" 1440p without DPI scaling.

8. Will games that support 4K look so much better than 1080p that there is no going back to 1080p after experiencing 4K?

Yes.

A. Why do most gamers still use 1440p instead of 4K?

Because you don't have to deal with scaling issues and don't need to buy top end GPUs to play. Also, 1440p now comes in higher refresh rate models, and those make motion less blurry. I game on a 144Hz 1440p display and would not buy a 60 Hz monitor again.

9. How does a 4K monitor handle games that support resolutions lower than 1080p (this question applies to 1440p as well)? Will they support all the low resolutions that are available on a 1080p monitor?

Yes.

10. Upgrading to a 4K monitor requires a beefy GPU and costs fps compared to 1080p, so is it worth it in the end?

IMO not at the moment. I'd wait for the displays to become better (namely higher refresh rates) and for GPUs with Displayport connections that can support those higher refresh rates.

12. What's the difference between 60 Hz and 144 Hz? Can you actually tell the difference?

Less motion blur, ability to display smoother movement (if your GPU can churn out 60+ frames). It's instantly noticeable even on the desktop.
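One way to see the 60 vs 144 Hz difference is in frame time, the gap between screen updates:

```python
# Time between refreshes at each rate: 144 Hz updates roughly 2.4x
# as often as 60 Hz.
for hz in (60, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms
```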

Off topic: When I used to connect my laptop to my HDTV, there was always a delay. Say I shoot someone in CS, it would take a while for the TV to register it. So I guess the word is delay input. Now is delay input the fault of the HDTV hz refresh rate or is it the response time?

Input lag is the term you're looking for. On some TVs you can make this less severe by enabling a game mode (if the TV has one). It basically disables a lot of the processing the TV does to the image, so it takes less time to show. However, it's better to buy a TV or monitor that is known for low input lag. Check reviews.

Hope this answers most of your questions. I removed some that can be easily found with Google or are answered in other ones.
 

RadiclDreamer

Diamond Member
Aug 8, 2004
8,622
40
91
4k isn't bleeding edge, it's mainstream affordable now, especially for things like TVs.
You can pick up a cheap 42" 4k TV for $400, or a 28" monitor for the same price. Most people wouldn't consider those prices "bleeding edge" for something that can last a long while.

Until there is ample content, it's still bleeding edge. A few sample discs and 0.1% of streaming titles doesn't cut it.
 

Jaskalas

Lifer
Jun 23, 2004
36,540
10,811
136
4K is a stupid waste of time and money like anything that is bleeding edge and not mainstream. May be a flop like 3D. Who knows. Come back in 5yrs.

Anyone who questions the value of 4k needs to walk into Sam's Club or Costco and spend a few minutes watching the 4k curved TVs. Those demonstrations should convince you nicely.

Affording it is another matter entirely, but prices will come down year after year just as they did with 1080p.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Let's say I get a 32-inch 4K monitor. If I want to run 1440p at its native resolution, I'll just run it in windowed mode, and it would still be the same size as a 27-inch 1440p monitor, would it not?

I don't think it'll be the same size as a 27" 1440p monitor, but it will behave like a smaller 1440p monitor in appearance (I believe it'll be 21-22" in size). Alternatively, many monitors and even Nvidia and AMD software allow you to not scale when using smaller resolutions. This will result in fullscreen mode showing a 1 to 1 pixel and a much smaller image than the size of the monitor.
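That 21-22" figure checks out with simple proportions; a quick sketch:

```python
# A 2560x1440 window shown pixel-for-pixel on a 32" 3840x2160 panel
# covers the same fraction of the diagonal as of the width (the
# aspect ratio is unchanged: 2560/3840 == 1440/2160 == 2/3).
panel_diag_in = 32.0
fraction = 2560 / 3840
print(f'{panel_diag_in * fraction:.1f}"')  # 21.3", in the 21-22" range
```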
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I can't wait until games support 4K. I can't imagine how beautiful it will look. And RIP my GPU's.

...what?

Games already support 4k and have for quite a while. Old games even support 4k. It's just another resolution to be selected.
 

Anubis

No Lifer
Aug 31, 2001
78,712
427
126
tbqhwy.com
Yeah, some of the naming conventions around resolutions are strange. I have heard someone say Quad HD for 1440 (WTF?), and ultra HD for 4k, when in fact 4k is 4 x 1080 or Quad HD.

For me, UHD is too small unless you use Windows scaling on 8.1. 1440 on 28" is my best fit.


Quad HD (QHD) is 1440p because it's based on the fact that 720p is technically "HD": 2560x1440 is four times 1280x720.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Until there is ample content, its still bleeding edge. A few sample discs and .1% of streaming titles doesnt cut it.

A tv is different from a monitor.

Yes on TVs 4K video content is lacking.

On a monitor, the better text you get from a retina-style effect is something you can see today at any Apple Store.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Some facts that you need to be aware of.

1) 4K will only look good if the source material is 4K too.
If the source material is just 1080p or less, the image quality will be no better than a 1080p image. Interpolation might sometimes make it look better, and sometimes make it look worse.

1a) TV signal is 1080p at most. I don't think there are any cable or satellite companies that broadcast at 4k resolutions.

1b) The best quality movies you can buy on disc are still 1080p Blu-ray. 4k discs might arrive soon, but they'll be expensive, and only new blockbusters will be available. If you want to download 4k movies, they'll be huge (tens of gigabytes per movie).

1c) Almost all content on the web is made for 1080p or lower resolutions (phones & tablets). Webpages with one or more pictures usually have them at 800x500 max or so. On a 4k monitor, all that happens is that the pictures in your browser look half the size. The resolution will not increase. High ppi is nice to have, but not when the picture is the size of a stamp.

1d) If you take pictures yourself, then 4k might make sense. If you have a 4k camera of course.

1e) Games.

Yes, many games support 4k, or can be made to support 4k. But you should realize this: 4k is 4 times the number of pixels that need to be rendered, which means 4 times the work. Naively, your framerates would drop 75%. In practice it's not that bad, because games do some work per frame rather than per pixel, but even then, expect your framerates to drop by 70%, or 66% at best.
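That 66-70% figure falls out of a simple cost model; here is a sketch where the 30% per-frame share is my illustrative assumption, not a measured number:

```python
# Split frame cost into a fixed per-frame part and a per-pixel part
# that scales with resolution. With a 30% per-frame share, 4x the
# pixels costs 3.1x the time -> ~68% fps drop, not the naive 75%.
def fps_after_scaling(base_fps, pixel_scale, per_frame_share=0.30):
    per_pixel_share = 1.0 - per_frame_share
    time_factor = per_frame_share + per_pixel_share * pixel_scale
    return base_fps / time_factor

new_fps = fps_after_scaling(60, 4.0)
print(f"{new_fps:.1f} fps ({1 - new_fps / 60:.0%} drop)")  # ~19.4 fps, ~68%
```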

That is, if you have a videocard with enough VRAM. You'll need more than 2GB VRAM for sure. 4GB seems to be enough at the moment, but that might change in the near future. But even if you have enough VRAM, do you have enough compute power on your videocard? A gtx970 will be enough in older games, but not in the newest games.

Look at this benchmark:
GTA V on a gtx980ti.
http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/13
27.8 fps for the gtx980ti on 4k with very high settings.
46.2 fps for the gtx980ti on 4k with only high settings.
Even with G-Sync that is not gonna feel very fluid.

My 3-year old gtx680 can do 46.2 fps on better than high settings on 1080p.
Are you willing to shell out top dollar ($750+) for the videocard that your 4k monitor needs?
Or do you prefer lower prices at 1440p or 1080p, while maintaining high fps and being able to enable more eyecandy?

2) 4k Is still bleeding edge.
Look at these current stats from the Steam HW&SW survey:
http://store.steampowered.com/hwsurvey/
Look at primary display resolution.
1024 x 768 2.30%
1280 x 720 1.09%
1280 x 800 1.94%
1280 x 1024 5.69%
1360 x 768 3.02%
1366 x 768 26.33%
1440 x 900 5.08%
1536 x 864 1.69%
1600 x 900 7.82%
1680 x 1050 4.95%
1920 x 1080 34.54%
1920 x 1200 1.61%
2560 x 1440 1.11%
2560 x 1600 0.09%
2880 x 1800 0.01%
3200 x 1800 0.01%
3440 x 1440 0.04%
3840 x 2160 0.06%
5760 x 1080 0.06%
5760 x 1200 0.00%

There are 1.2% of Steam-gamers playing at 1440p. There are 0.18% of gamers that play at higher resolution than 1440p. One third plays at 1080p. Everybody else has a lower resolution.

So yes, 4K is bleeding edge. 1440p can also be considered bleeding edge. Anandtech is a website for enthusiasts. What they do, and what they want, is nothing special in itself (it's all just bought in a store; nobody here builds their own technology), but it's still bleeding edge.

I've had a 1440p monitor on my desk for 3 days. The Acer XB270HU. Fantastic monitor. Unfortunately I had to send it back, because it is not very good when you play a lot of dark games with low ambient lighting. But it was very nice. The 1440p was more of an improvement than I had expected. At 1080p on a 27", I can see pixels, if I want to see pixels. During normal use, I don't see pixels on 1080p, only when I look for them. But on the 1440p, I couldn't see them, unless I put my face right in front of the panel. When I put back my old 1080p screen, it startled me ! So bad did it look. However, after an hour, I had adjusted again, and I don't miss the 1440p resolution.

Also, I didn't like my desktop and browser at 1440p. Yes, you can do scaling in Windows, but then you defeat the whole purpose of the higher resolution. I didn't like the smaller letters. Changing fonts didn't give me a satisfying result. And no matter what you do with your browser, webpages are just not made for higher resolutions.

My next monitor might be 1440p, certainly not 4k. Hopefully the upcoming Acer Predator Z35 will be good. Only 2560x1080 resolution, but it has G-Sync, 144Hz and a VA panel. A VA panel should be perfect for me (good colors, high contrast, very good blacks). I'd rather have that than a 4k monitor.

In any case, before you buy a 4K monitor, you should see one with your own eyes. Maybe a 4K TV in a store nearby. That could give you a decent impression.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
OP, I definitely recommend you read the full 980 Ti review on AnandTech so you learn the basics, then ask questions. That way you get a feel for the types of things we talk about here and how it all relates. The review tests GPUs at multiple resolutions, so you can understand how much performance you lose by upping image quality with things like resolution.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
That Steam hardware survey is a moving composite, right? It's not showing accumulated scores from 3 years ago, is it? Just what has been run in the last 30/60/90 days, right?
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
For some random reason I thought I'd straight up just answer all of these, so it's a wall'o'text I'm afraid, hope it helps.

Not being able to use your current monitor to preview resolutions bigger than what it's capable of is not a problem new to 4k; it's been the case with screens forever. Adverts for colour TVs shown on black-and-white sets are an oldschool example. The best idea is to go into a retail store and see first hand, or read subjective reviews online comparing them.

The primary benefit of all the additional pixels is really to cram more pixels into the same area, making the pixels smaller. You can sort of get the right idea of the quality by simply sitting further back from your screen, so the apparent size in your vision is smaller. You can also get a good idea from a lot of smartphones, which often run very high resolutions across tiny 5-7" screens; the Retina devices are a great example.

1) There are many standard resolutions above 1920x1080, namely 1920x1200, which is the 16:10 aspect ratio variant, and notably 2 more: 2560x1440, which again is 16:9 (like most TVs), and a 16:10 variant of that at 2560x1600. Then there are 2 versions of 4k, 3840x2160 and 4096x2160. The former is what most monitors will be displaying; it's the equivalent of 4x 1920x1080 screens stacked in a 2x2 grid.

2) 1440p is just 2560x1440. Names like 1080p or 1440p refer to the second part of the screen resolution, the number of horizontal rows of pixels, or put another way, the height of the screen in pixels. It's not considered 4k; 4k is really just a branding term, and there's no 2k branding equivalent for 2560x1440 (1440p). However, the resolution is pretty close to half the total number of pixels (height x width), so in some sense you could consider it a kind of 2k, which is especially relevant for performance (you'd expect to need roughly 2x the processing power to drive 4k over 2560x1440).

The increased resolution in gaming makes the picture sharper. It allows you to see more detail in the scene and discern detail on objects that are smaller or in the distance. If the increase in pixels comes mostly as increased pixel density (rather than increased screen size), there's also less need for things like anti-aliasing.

A) There's no real sense in which we have "1080p games", generally speaking most games can run in any screen resolution and if the game engine is built well it will simply render in the higher resolution. For media that is fixed resolution like movies then you can set your video card drivers to behave however you like, that includes running at the real resolution and adding in black borders, stretching to fit the entire monitor, stretching to fit and other subtle variations on this. Typically with games you'd simply run the game in the native resolution of your monitor since stretching to fit the wrong resolution produces very bad picture quality. There is a caveat here, if the resolution is a full multiple of each other you can perfectly scale without image quality loss, it turns out that the consumer version of 4k (3840x2160) is exactly 2x the height and 2x the width of 1920x1080 so this allows for loss-less scaling.
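The loss-less scaling caveat is easy to verify: 1080p divides evenly into consumer 4k, so each source pixel can map to a clean 2x2 block. A sketch of the check:

```python
# Integer-scaling check for the caveat above: 3840x2160 is an exact
# 2x multiple of 1920x1080 in both dimensions, so no interpolation
# (and no blur) is needed; 2560x1440 is not, so it must interpolate.
def integer_scale(src, dst):
    w_ok = dst[0] % src[0] == 0
    h_ok = dst[1] % src[1] == 0
    return (dst[0] // src[0], dst[1] // src[1]) if w_ok and h_ok else None

print(integer_scale((1920, 1080), (3840, 2160)))  # (2, 2) -> lossless
print(integer_scale((2560, 1440), (3840, 2160)))  # None -> must interpolate
```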

Bigger resolution doesn't always mean a bigger monitor, although it frequently does; the quality increase mostly comes from having a higher density of pixels in the same area. Generally speaking, as resolution goes up on new monitors, the size does too, but size increases at a slower rate relative to resolution, which means the higher res panels are indeed slightly bigger but also have better pixel density. It depends on the panel though; there's nothing stopping manufacturers from making all different combos of size and pixels.

B) Settings preference is really just that, a preference. If you prefer higher resolution with medium settings, or lower resolution with high settings, it's completely up to you; it's something subjective that you need to try and gauge for yourself. Anecdotally, I've swapped between a 30" 2560x1600 panel and a 24" 1920x1080 panel for probably 7+ years now, and I enjoy them both for different reasons. I need to sacrifice settings on the 30" panel because the resolution is twice as many pixels, so some games take a performance hit, but older games that my video card can handle well look really gorgeous on that kind of panel.

C) Can't really comment; I hear it's pretty epic though. I imagine it's similar to when I upgraded from 1680x1050 to 2560x1600 some 7 years ago; it's a very big leap.

3) 2560x1440 tends to come in 27-30" and 2560x1600 in 30-32". 4k can come in a huge range of sizes; some are as small as 32" I think, others way bigger, essentially like 50-60" TVs. The jump to 4k is a huge one (4x 1080p), so the range of panel sizes you'll find in production will vary a lot.

4) Simple: preference. G-Sync eliminates tearing while maintaining high and smooth frame rates with minimal input lag, which has traditionally been a problem in gaming for decades. These people want a solution that will eliminate screen tear; it bothers some people way more than it bothers others. I'm one of the few people who likes screen tearing; to each his own.

A) DX12 is a Microsoft API/standard for Windows that lets games access the video and sound hardware in your PC; you don't need DX12 for 4k. Current VR (virtual reality) is not dependent on either 4k or DX12. Virtual reality headsets are much like monitors: they have a screen with a specific resolution. None of them are 4k yet, but in future this is likely.

5) Price to performance of new, high end hardware, such as 4k monitors and the GPUs you need to drive such a high resolution, is never good. The price to performance ratio of damn near all hardware is always best in the middle of the range; you always pay a premium for the latest and greatest until the technology matures years later and something better supersedes it.

6) There is usually an approximate sweet spot for monitor size relative to resolution. With 1080p it was about 24"; some people bought 27", but many found the pixels too big at that size. The problem with increasing screen size is that eventually it becomes too big to use comfortably. It took me a long time to get used to 30", and much bigger would be a waste IMO; too much of the screen sits in your peripheral vision, where detail becomes harder to discern.

7) As resolutions get bigger and pixel density increases, anything with a fixed height/width in pixels, such as classic fonts, will appear smaller on your screen, which makes it harder to read for everyone but is potentially problematic if you have weak vision. Typically, operating systems and applications come with DPI scaling for fonts and zooming for many apps, which helps mitigate this problem. However, many games do not, and the HUD can become harder to read if it's not scaled correctly. More and more these days even games scale HUDs well, which is a step in the right direction.

8) How wowed you are by 4k will differ from person to person, but it's very typical with new technology that you don't fully appreciate the benefit until you're forced to go back to the old tech for some reason; then suddenly you can't live without it. There are diminishing returns with things like screen resolution, however. Our eyes can only discern so much detail. We're nowhere near that limit yet, but as we approach it we'll appreciate each jump less and less.

A) Most gamers don't use 1440p or 4k; both resolutions are extremely uncommon. The number of people using 2560x1440 and 2560x1600 over the 7-8 years they've been available has been very tiny, no more than a few percent, and the adoption of 4k is even smaller right now. Cost is the primary factor: not only are the panels very expensive, but running games at these resolutions requires extremely high end GPUs. 2560x1600 has required SLI or Crossfire for the 7 years I've had it; only now with the 980 can I drive this panel with a single card.

9) This was answered in 2.A, in short the video card can scale resolutions up and down, or fit them to the screen however you like.

10) If you can only afford one or the other (resolution or performance), then it's down to your personal preference. Some people are happy to game at 30fps with choppy performance and nice graphics; others prefer lower settings and a steady 60-120fps instead.

11) It will eventually make 1080p obsolete, but not for a long time. I predict everyone will have 4k TVs within a decade, and adoption in the PC space will probably be quite high by then. All resolutions eventually become obsolete; no one uses 640x480 or 800x600 anymore.

12) Everyone can see the difference between 60Hz and 144Hz, despite the ignorant claims that still float around on the internet. 144Hz monitors can refresh 144 times a second, which gives a smoother experience than 60Hz, provided your GPU can spit out more than 60fps (otherwise it's kind of redundant). Again, how much you care about fast refresh rates depends on your subjective experience; some people love it and will sacrifice graphics, others are the opposite.

Response time is a measure of how long it takes a monitor to change the colour of its pixels; it's measured in ms (milliseconds, 1,000ths of a second). If you have a rapidly changing scene on your monitor and a slow pixel response time, the pixels will lag behind the scene and cause an effect called ghosting, which looks bad.

There are several basic panel types, each with different performance characteristics. Typically, TN panels have fast pixel response times and can run at higher refresh rates, but they have bad viewing angles and often bad colour reproduction. IPS panels tend to have much longer pixel response times and are limited (outside of botched hacks) to lower refresh rates, but they boast much superior colour reproduction and extremely good viewing angles. PVA is one you missed, which often falls somewhere in between.

13) One is the standard 4k resolution for home cinema use (3840x2160); much like 1920x1080, it's a home media and broadcast TV standard. The slightly bigger one (4096x2160) is, I believe, a standard for cinemas.

14) As I mentioned before, there's nothing to stop anyone from making completely custom resolutions. You're talking about the aspect ratio, the ratio of the horizontal to the vertical. The home cinema and broadcast TV standard aspect ratio is 16:9, which 1920x1080, 2560x1440 and 3840x2160 all conform to. Remember that 1080p only refers to the vertical resolution of the panel, so you can have 2 different 1080p panels with different widths. Panels that deviate from ratified standards tend not to be very popular, since content is usually developed and targeted for specific standards.

Lastly, your off topic section. You're talking about the input latency of the TV/monitor. This is measured separately and is independent of both refresh rate and pixel response time. Typically, latency is caused by the TV/monitor having some kind of digital image processing chip inside that processes incoming images to make them sharper, cleaner, brighter or more vibrant in some way; quite often, if you disable these effects in the menu, you can reduce input latency. It's also one of the prime reasons TVs make bad monitors: they typically have a lot of input latency. Monitors marketed to gamers often boast very low input latency for those who are bothered by it.
 

stockwiz

Senior member
Sep 8, 2013
403
15
81
I applaud the introduction of 4K, but I'm not going to spend the money to adopt it; for me it's not worth it. I was all for the introduction of 1080p and frustrated by how everyone lagged behind and kept offering so much SD content, but for me 4K represents diminishing returns for the money spent. It's kind of like the difference between buying Shimano 105/Ultegra parts for a road bike vs. Dura-Ace: yeah, there's a difference, if you want to spend the cash and feel the difference is justified.

I game on my 60 inch 1080p set with 5.1 surround sound on a large comfortable bean bag, not sitting at a desk at a monitor. At the distances I sit, the differences won't be all that readily apparent, not to mention the framerate drop and hardware upgrades needed to support that resolution.
 
Last edited:

thehotsung8701A

Senior member
May 18, 2015
584
1
0
First off, Just want to let you guys know you guys are the best! Now back on topic:

@Pariah – Let me try to explain it again. Take a 27-inch 1440p monitor and a 32-inch 4K monitor. 1440p looks good on a 27-inch because it's its native res. But if I play on the 32-inch 4K at 1440p in windowed mode, with that extra 5 inches of real estate going unused, wouldn't my game look the same on both monitors, since the unused portion of the 32-inch 4K monitor doesn't count? So it would be the same size and same native res? Does that make sense?

@dave1029 – Thanks for the suggestion, but life is too short to wait 18 months for Pascal, especially when my PC desperately needs an upgrade. Not being able to run games at 1080p is turning me off PC gaming. My rig is weaker than a PlayStation 4 now. I got into an accident some months ago, and realized life is meant to be enjoyed. Live in the moment. Now if Pascal were around the corner, say in 2 months (that's my max limit for waiting), then I would totally wait.

@kasakka – You say that 4K has perfect scaling with 1080p. Does that mean if I run 1080p games in full screen on a 4K monitor it would look just as good as on a 1080p screen, or do you mean only if it is run in a window? Also, I have seen 1440p monitors bigger than 27 inches.

@Gryz – What is a VA panel? Is that similar to DisplayPort? My current monitor still has a DVI port. Would going to, say, Best Buy and viewing 4K on an HDTV be the same as on a monitor, given the lower refresh rate? And would watching movies and TV in 4K at Best Buy or Costco give me any idea of how it would look when I game?

@Tential – I read the review as you suggested. I am disappointed in its performance.

@PrincessFrosty – Because you're AWESOME! This is why I come here; you guys know your stuff. Also, just being able to reply to all of you shows me how badly I need a 1440p or higher monitor for the extra screen real estate.

Can very old games, say Battle for Middle-earth II, run at 1440p or 4K resolution? I mean games ranging from 2000 to 2007.

Now regarding 1440p monitor screen size: is it better to have a 27-inch or a 30-inch 1440p monitor? Which shows more detail and which looks better? I'm asking since the sweet spot for a 1080p monitor is 22 to 24 inches; anything bigger and the image isn't as good.

G-Sync vs FreeSync – I know we don't have a choice between the two on a given monitor, but is G-Sync actually better, or does it just cost more because it uses a proprietary module instead of the VESA Adaptive-Sync standard that FreeSync monitors use?

Are TN monitors better for FPS games like Counter-Strike as opposed to IPS monitors? Is there no monitor with the best of both worlds? PVA, like you listed, is in between. There are now 144 Hz IPS 1440p G-Sync monitors.

14. That's what I thought, thank you. My Asus VW246H was one of those weird screen sizes (ultra-wide). Guess I'll never bother with a TV again.

Thanks for all the help mate!

@stockwiz – I know exactly what you mean since I’m a biker.


Afterthought:

Guys, after much further thinking, I've decided to take your advice and forgo a 4K monitor, since I prefer fluidity over image quality. I never buy bleeding edge, especially at prices where companies can milk you, and I'm certainly not going to start now. I may have money, but I'm always a smart shopper.

Screen real estate is so important to me that I cannot multi-task on a 1080p screen, no matter the screen size. So I'm thinking of either a G-Sync or FreeSync 1440p IPS 144 Hz monitor, or triple-monitor 1080p gaming. Please let me know which you think would be best. Both will offer me massive real estate to multi-task without tirelessly alt-tabbing. My desk can fit three 24-inch monitors, or even three 27-inch monitors, but that's going way overboard, especially since it's $500 to $750 for one 1440p monitor.

Also, what is LightBoost?
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
@Gryz – What is a VA panel? Is that similar to DisplayPort? My current monitor still has a DVI port.
VA is a technology used in LCD panels. It stands for Vertical Alignment. There are 3 main technologies in use: TN, IPS and VA. It has nothing to do with DVI, HDMI or DisplayPort.
- TN is the cheapest. Smaller viewing angles. Often a 6-bit panel, because that's cheaper too. Can more easily achieve 144Hz. Most 144Hz monitors are TN.
- IPS is more expensive. Nicer colors. Large viewing angles. Usually not aimed at gamers. There's only one real 144Hz IPS gaming monitor (the Acer XB270HU, with an Asus FreeSync model coming soon). However, blacks and dark colors are usually not so good on IPS.
- VA. Least popular. Nice colors (maybe not as good as IPS, but better than TN). Usually not 144Hz, but there are two 144Hz VA gaming monitors (the Eizo FG2421 and the upcoming Acer Predator Z35). High contrast ratio; blacks on VA panels are supposed to be very good. Highest resolution is usually 1080p (although the Z35 will be 2560x1080).

Most gaming monitors are TN, because the technology does 144Hz more easily.
IPS gaming monitors are starting to come out. Nice colors, but not nice blacks.
VA is supposed to have good blacks. I play a lot of dark games. And I turn off the light in my room when I play. Hopefully VA is exactly what I'm looking for.

I'm not sure if my (long) post was clear. But I do not find resolution the only important feature of a monitor. In fact, I'd rather not have 4K myself, because of gaming. I find other things like size, colors, darks, refresh rate, G-Sync and ULMB much more important.

Would going to, say, Best Buy and viewing 4K on an HDTV be the same as on a monitor, given the lower refresh rate? And would watching movies and TV in 4K at Best Buy or Costco give me any idea of how it would look when I game?
It's been over a decade since I was in a Best Buy (I don't live in the US). But the idea is: if you have no idea how 4K looks, you should try to get at least an impression. Even seeing a 4K TV with your own eyes is better than nothing. A shop will try to impress you to sell you a 4K TV. If you walk in and you are not impressed with a 4K TV, then it's likely you won't be impressed with a 4K monitor either. I suspect you might think that 4K is the best technology evar, and that you need it right away. I don't think so. But you can make a better decision yourself once you've seen at least something.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I applaud the introduction of 4K, but I'm not going to spend the money to adopt it; for me it's not worth it. I was all for the introduction of 1080p and frustrated at how everyone lagged behind, still offering so much SD content, but 4K represents diminishing returns for the money spent. It's kind of like the difference between buying Shimano 105/Ultegra parts for a road bike vs. Dura-Ace: yes, there's a difference, if you want to spend the cash and feel the difference is justified.

I game on my 60-inch 1080p set with 5.1 surround sound on a large, comfortable bean bag, not sitting at a desk in front of a monitor. So at the distances I sit, the differences won't be all that readily apparent, not to mention the framerate drop and the hardware upgrades needed to support that resolution.

For me it's simple: it's all about pushing graphical fidelity forward. 4K is a HUGE jump from 1080p. That means GPUs have to get a LOT stronger to support the new resolution, and the resolution will mean higher-quality textures and so on. All of this pushes technology forward. I like progress, so I'm HAPPY for 4K to come and can't wait for its adoption.
It also pushes infrastructure: it forces companies to upgrade cable lines, and maybe in the process internet lines as well, to support 4K content.

4K not only represents a great jump in what monitors/HDTVs can do; it will push a LOT of infrastructure and hardware development as well. I can't wait.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
VA is a technology used in LCD panels. It stands for Vertical Alignment. There are 3 main technologies in use: TN, IPS and VA. It has nothing to do with DVI, HDMI or DisplayPort.
- TN is the cheapest. Smaller viewing angles. Often a 6-bit panel, because that's cheaper too. Can more easily achieve 144Hz. Most 144Hz monitors are TN.
- IPS is more expensive. Nicer colors. Large viewing angles. Usually not aimed at gamers. There's only one real 144Hz IPS gaming monitor (the Acer XB270HU, with an Asus FreeSync model coming soon). However, blacks and dark colors are usually not so good on IPS.
- VA. Least popular. Nice colors (maybe not as good as IPS, but better than TN). Usually not 144Hz, but there are two 144Hz VA gaming monitors (the Eizo FG2421 and the upcoming Acer Predator Z35). High contrast ratio; blacks on VA panels are supposed to be very good. Highest resolution is usually 1080p (although the Z35 will be 2560x1080).

Most gaming monitors are TN, because the technology does 144Hz more easily.
IPS gaming monitors are starting to come out. Nice colors, but not nice blacks.
VA is supposed to have good blacks. I play a lot of dark games. And I turn off the light in my room when I play. Hopefully VA is exactly what I'm looking for.

I'm not sure if my (long) post was clear. But I do not find resolution the only important feature of a monitor. In fact, I'd rather not have 4K myself, because of gaming. I find other things like size, colors, darks, refresh rate, G-Sync and ULMB much more important.


It's been over a decade since I was in a Best Buy (I don't live in the US). But the idea is: if you have no idea how 4K looks, you should try to get at least an impression. Even seeing a 4K TV with your own eyes is better than nothing. A shop will try to impress you to sell you a 4K TV. If you walk in and you are not impressed with a 4K TV, then it's likely you won't be impressed with a 4K monitor either. I suspect you might think that 4K is the best technology evar, and that you need it right away. I don't think so. But you can make a better decision yourself once you've seen at least something.

Does a TN monitor only matter if you play in the dark with the lights off? Because I never play in the dark without any light source; I always have the lights on when I play at night.

I thought IPS was better overall? I did not know that TN has better blacks; does that mean the black is true black? If so, I need to go with TN, because I plan on playing a lot of horror games.

Now, in your opinion, for triple-1080p-monitor gaming, which do you suggest: IPS or TN? Having a 180-degree viewpoint is just too amazing to pass up, even at the cost of the better graphics of 1440p.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Does a TN monitor only matter if you play in the dark with the lights off? Because I never play in the dark without any light source; I always have the lights on when I play at night.
No, I was talking about VA. VA has the higher contrast ratio and the darker blacks.

IPS has slightly better colors than TN. There are a few reasons. One is that the viewing angles on IPS are better: if you don't sit exactly in front of your screen, or move your head a lot, or have people standing next to you, the colors on a TN screen will seem to change. That's not the case with IPS (nor with VA). The second reason is that more IPS screens have 8-bit color, while most TN screens have 6-bit color. However, when you compare a good 8-bit IPS screen with a good 8-bit TN screen, the differences are not that big anymore.
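The 6-bit vs 8-bit difference is bigger than it sounds. Quick arithmetic (Python) on how many colors each bit depth can actually show:

```python
def colors(bits_per_channel):
    """Total displayable colors for an RGB panel: 2^bits per channel, cubed."""
    return (2 ** bits_per_channel) ** 3

print(f"6-bit panel: {colors(6):,} colors")   # 262,144
print(f"8-bit panel: {colors(8):,} colors")   # 16,777,216
```

This is why 6-bit TN panels typically rely on dithering (FRC) to approximate the missing shades.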

One of the biggest differences between VA and the other two technologies is the contrast ratio. IPS and TN panels have about 1000:1 contrast; VA panels have 3000:1 or sometimes even 5000:1 contrast ratios, so blacks are darker on a VA panel. The Acer XB270HU is an awesome screen, but it does have quality-control issues. One of those is backlight bleeding; in combination with "IPS glow" that gives a very ugly effect if you play dark games in a dark room. That's why I didn't keep the XB270HU. For many others this is not a problem, but for some it is (and I am not the only one).
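To put those contrast numbers in perspective, here's a tiny sketch (Python; the 300 cd/m² peak brightness is an assumed typical desktop setting, not a figure from this thread) of the black level each static contrast ratio implies:

```python
def black_level(peak_cd_m2, contrast_ratio):
    """Black luminance implied by a panel's static contrast ratio."""
    return peak_cd_m2 / contrast_ratio

peak = 300.0  # assumed peak white in cd/m^2; a common desktop brightness
for panel, ratio in [("TN/IPS (1000:1)", 1000),
                     ("VA (3000:1)", 3000),
                     ("VA (5000:1)", 5000)]:
    print(f"{panel}: black = {black_level(peak, ratio):.2f} cd/m^2")
# 1000:1 -> 0.30, 3000:1 -> 0.10, 5000:1 -> 0.06
```

At the same brightness, the VA panel's blacks emit a third (or less) of the light an IPS/TN panel leaks, which is exactly what you notice in a dark room.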

I thought IPS was better overall? I did not know that TN has better blacks; does that mean the black is true black? If so, I need to go with TN, because I plan on playing a lot of horror games.
There is no perfect monitor. Not IPS, not TN, not VA. Not even if you don't mind paying $1k or $2k. Some companies make professional-level monitors (e.g. for use in hospitals), and even those are not perfect; not perfect for gamers, in any case. They have very high resolution and very good colors, but they are not 144Hz and they don't have G-Sync or ULMB.

Now, in your opinion, for triple-1080p-monitor gaming, which do you suggest: IPS or TN? Having a 180-degree viewpoint is just too amazing to pass up, even at the cost of the better graphics of 1440p.
It all depends on your budget and what you wanna do with them.

I would never buy 3 monitors myself. Now that we have 21:9 monitors, those seem a much nicer option if you want a wide viewport while gaming. Also, if you game on three 1080p monitors, that's still three times the number of pixels you need to render.
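The render-load point is easy to sanity-check. A rough count (Python; the 2560x1080 figure is the Z35 resolution mentioned earlier, used here as the 21:9 example) of how many pixels each setup pushes, relative to a single 1080p screen:

```python
setups = {
    "single 1080p":       1920 * 1080,
    "21:9 (2560x1080)":   2560 * 1080,
    "single 1440p":       2560 * 1440,
    "triple 1080p":   3 * 1920 * 1080,
    "4K":                 3840 * 2160,
}
base = setups["single 1080p"]
for name, pixels in setups.items():
    print(f"{name}: {pixels:>9,} px ({pixels / base:.2f}x)")
# triple 1080p is 3.00x a single screen; 4K is 4.00x; the ultrawide only 1.33x
```

So the ultrawide gives the wide viewport at less than half the GPU load of three 1080p panels.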

As I said before, there is never a perfect solution. Not yet; maybe in a few years. The LCD-monitor market had been completely stagnant for over a decade: no innovation, no attention to gamers. But that changed 18 months ago, and now we're starting to see new monitors made especially for gamers. The Asus ROG Swift is an awesome monitor for gamers; the Acer XB270HU is even better. And I have high hopes that the Acer Predator Z35 will be the perfect monitor for me. But what is perfect for me is probably not perfect for others, because everybody needs to make choices.

And then another factor: good monitors are expensive. The G-Sync ones start at 500 euros/dollars. If you want 3 of those, that's gonna cost you.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
@PrincessFrosty – Because you're AWESOME! This is why I come here; you guys know your stuff. Also, just being able to reply to all of you shows me how badly I need a 1440p or higher monitor for the extra screen real estate.

Can very old games, say Battle for Middle-earth II, run at 1440p or 4K resolution? I mean games ranging from 2000 to 2007.

Now regarding 1440p monitor screen size: is it better to have a 27-inch or a 30-inch 1440p monitor? Which shows more detail and which looks better? I'm asking since the sweet spot for a 1080p monitor is 22 to 24 inches; anything bigger and the image isn't as good.

G-Sync vs FreeSync – I know we don't have a choice between the two on a given monitor, but is G-Sync actually better, or does it just cost more because it uses a proprietary module instead of the VESA Adaptive-Sync standard that FreeSync monitors use?

Are TN monitors better for FPS games like Counter-Strike as opposed to IPS monitors? Is there no monitor with the best of both worlds? PVA, like you listed, is in between. There are now 144 Hz IPS 1440p G-Sync monitors.

Afterthought:

Guys, after much further thinking, I've decided to take your advice and forgo a 4K monitor, since I prefer fluidity over image quality. I never buy bleeding edge, especially at prices where companies can milk you, and I'm certainly not going to start now. I may have money, but I'm always a smart shopper.

Screen real estate is so important to me that I cannot multi-task on a 1080p screen, no matter the screen size. So I'm thinking of either a G-Sync or FreeSync 1440p IPS 144 Hz monitor, or triple-monitor 1080p gaming. Please let me know which you think would be best. Both will offer me massive real estate to multi-task without tirelessly alt-tabbing. My desk can fit three 24-inch monitors, or even three 27-inch monitors, but that's going way overboard, especially since it's $500 to $750 for one 1440p monitor.

Also, what is LightBoost?

I can't say for specific games, as I don't have 4K and haven't tested it. Some older games have fixed lists of screen resolutions, which is a very old and outdated way of doing it; more modern games pull the list of supported resolutions from your video card driver and let you select from those, which means any arbitrary screen resolution is supported. You'll just have to research those games individually.

Which size is better is again mostly a preference. As you increase the screen size but leave the resolution fixed, you get fewer pixels per area (lower PPI, pixels per inch), so the apparent detail in any given area goes down while the total size gets bigger. There is one other important aspect: the distance you sit from your screen. Pixel density (PPI) is only one side of the equation; viewing distance alters how big the screen appears in your vision. Up close, the screen fills more of your vision; far away, it fills less. What you find is that some people with very large screens of 30" or above need to push the screen a bit further away to be comfortable. So bigger is not always better.

I personally think 30" is a good size for 2560x1600, which is what I use; it has a good PPI balance vs. the screen size. 27-30" is fine for 2560x1440; obviously 27" is smaller but will have a higher PPI. These things are always a trade-off.
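The distance side of that trade-off can be quantified too. A small sketch (Python; the 24" and 32" viewing distances are just illustrative assumptions) of angular pixel density, i.e. how many pixels cover one degree of your vision, for the 30" 2560x1600 panel mentioned above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    """Pixels spanning one degree of visual angle at a given distance."""
    return ppi_value * 2 * distance_in * math.tan(math.radians(0.5))

density = ppi(2560, 1600, 30)   # ~101 PPI
for d in (24, 32):              # assumed viewing distances, inches
    print(f'at {d}": {pixels_per_degree(density, d):.0f} px/degree')
# Sitting further back raises the apparent (angular) density,
# which is why a big screen pushed away can still look sharp.
```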

I don't know much about G-Sync vs FreeSync; I don't use either, I'm afraid.

It depends what you mean by better; some people prefer better image quality (IQ) and some prefer better performance/smoothness. One thing I'll say is that in highly competitive games like CS, people serious about winning put all the settings on low and crank up the frame rate, and normally use very fast 120-144Hz panels (not MHz!) because it gives them smoother aiming; they can be more precise and react faster where every millisecond counts.

There is currently no technology that is best at everything. TN is the absolute king for competitive gaming where you want 120-144Hz with very fast pixel response times; however, if you want better color and viewing angles then you need to go IPS. This is why I have two panels on my PC: a 30" 2560x1600 IPS for nice-looking single-player games and a large workspace, and a 24" 1920x1080 120Hz fast panel for competitive gaming and stereoscopic 3D; I swap around as I see fit. PVA is somewhere in between, and they make pretty good all-round panels.

There have been lots of IPS screens sold as 120-144Hz over the last few years. One problem is that IPS has a slow pixel response time (the time it takes for a pixel to change color), and in most cases a 144Hz refresh rate changes so fast that the pixels can't change color quickly enough to keep up, which leads to inaccurate colors. This is why for years we never saw 120-144Hz IPS panels. Beware claims of 120-144Hz IPS; specifically, look at the panel's quoted pixel response times.

A 144Hz panel refreshes once every 6.9ms. If your panel takes, say, 8ms to change a pixel's color, then a lot of pixels will never reach their intended color before the next refresh hits. You ideally need something like 3-4ms response times to get good color at 144Hz.
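That 6.9 ms figure is just the refresh period. A tiny sketch (Python) of the arithmetic behind the response-time rule of thumb:

```python
def frame_time_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms

# An 8 ms pixel response can't finish inside a 144 Hz refresh window,
# while a 3-4 ms response leaves comfortable headroom:
print(8.0 <= frame_time_ms(144))   # False: too slow to complete a transition
print(4.0 <= frame_time_ms(144))   # True: fast enough
```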