1440p 144Hz vs 4K 60Hz


Eric1987

Senior member
Mar 22, 2012
Quote:

Do you think all the people here who recommend 120-144Hz monitors, with experience using them, are lying to themselves?

Google "how many FPS can the human eye see". Read through the numerous threads:
https://www.google.com/webhp?source...ie=UTF-8#q=how+many+FPS+can+the+human+eye+see

I'll give you the benefit of the doubt and assume you play with an IPS monitor with 8+ms response times, which blur the crap out of everything, preventing you from seeing clear responsive images.

http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756

Yes, I do think that actually. It's a common elitist trend to want higher-than-necessary FPS; it has been since I had my 9500 PRO, back in 2001 I think it was. Under 25 FPS is when it starts bothering me. Hell, The Witcher 3 at 40-50 FPS looks like a solid 60 when I don't have an FPS counter up.
 

bystander36

Diamond Member
Apr 1, 2013
Originally Posted by Eric1987

http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1DK9756

Yes, I do think that actually. It's a common elitist trend to want higher-than-necessary FPS; it has been since I had my 9500 PRO, back in 2001 I think it was. Under 25 FPS is when it starts bothering me. Hell, The Witcher 3 at 40-50 FPS looks like a solid 60 when I don't have an FPS counter up.

Well, given that there are pages and pages of articles debunking your "common knowledge", and no studies that prove your assertion, it is far more likely that you are incorrect.

Perhaps your brain is slow when it comes to processing visual info, but the vast majority of us see far better than 40 FPS.
 

ocre

Golden Member
Dec 26, 2008
I think the internal game engine clocks are affected when frame rates start to sway. When a game gets into the 40s, the overall sluggishness occurs as this internal clock adjusts to the lower framerate. The lag I am talking about most likely has to do with this.

I googled the Blur Busters G-Sync test to see how I can relate:
http://www.blurbusters.com/gsync/preview2/

They talk about whole-chain and total-chain input lag, but they also talk about the tick rate that games poll on. The internal game engine clock tries to balance or even out the situation when frame rates drop. So, say you are walking forward and the frame rate drops: the internal game clock has to adjust the effect of that same input, in a given cycle, based on the changing frame rate. This is all in an effort to keep the game as smooth as possible. If it didn't happen, then your character would be walking 20 miles per hour when frame rates are high and 2 miles per hour when they are low.

Internal game engine clocks change during gameplay, and at low FPS this all-around makes the game feel sluggish and less responsive. Laggy.

The actual input lag of a single button press may not change, but that doesn't mean the game isn't less responsive.
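What ocre is describing is what engine programmers usually call delta-time scaling. A minimal sketch of the idea in Python (hypothetical numbers, not from any particular engine):

Code:

# Frame-rate-independent movement ("delta time"): the engine scales each
# update by how long the last frame took, so a character covers the same
# distance per second at 40 FPS as at 144 FPS.

WALK_SPEED = 5.0  # units per second, hypothetical

def update_position(position, frame_time_s):
    return position + WALK_SPEED * frame_time_s

pos = 0.0
for _ in range(144):               # one second of 144 FPS frames
    pos = update_position(pos, 1 / 144)
print(round(pos, 3))               # 5.0

pos = 0.0
for _ in range(40):                # one second of 40 FPS frames
    pos = update_position(pos, 1 / 40)
print(round(pos, 3))               # 5.0 -- same distance, in chunkier steps

Without that scaling you would get exactly the 20 mph vs. 2 mph problem mentioned above; with it, low FPS keeps the speed right but delivers the motion in coarser, laggier-feeling steps.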

So, back to the Blur Busters article. Reading the Crysis 3 test, they had this to say:

"There were many situations where G-SYNC's incredible ability to smooth the low 45fps frame rate actually felt better than stuttery 75fps."

I never actually read that before I had G-Sync. But if you fired up games you know well and found that, all of a sudden, 45fps was smoother than what you normally experienced at 75fps... you might see what I am saying. I was not expecting such a dramatic change. Can you imagine games running smoother and more responsive at 40fps than at 70fps?
 

Eric1987

Senior member
Mar 22, 2012
Originally Posted by bystander36

Well, given that there are pages and pages of articles debunking your "common knowledge", and no studies that prove your assertion, it is far more likely that you are incorrect.

Perhaps your brain is slow when it comes to processing visual info, but the vast majority of us see far better than 40 FPS.

Yet no one I've ever met in person agrees with you. No one in person could ever prove me wrong, either. I bet you I could run The Witcher 3 at 50 FPS in 4K and at 150 FPS in 1080p and you wouldn't tell the difference.
 

Z15CAM

Platinum Member
Nov 20, 2010
You can run a cheap $300 27" 1440p Korean Samsung PLS display, nominally 60Hz, at 120+ Hz. The QNIX is a great display, paired with an AMD 290 or 290X.

I suggest at least a 750W PSU, or an 850W in CrossFire.
 

bystander36

Diamond Member
Apr 1, 2013
Originally Posted by Eric1987

Yet no one I've ever met in person agrees with you. No one in person could ever prove me wrong, either. I bet you I could run The Witcher 3 at 50 FPS in 4K and at 150 FPS in 1080p and you wouldn't tell the difference.

Given that I get nausea when playing 1st-person-view games at 40 FPS, and get less and less of it until I reach 80ish FPS, I most certainly can tell from that alone. Not to mention that I'm pretty used to high FPS, so I spot it quite fast. Assuming we are talking about 1st-person-view games where you turn side to side often.

How can someone prove it to you with nothing but an eye test? Studies use scientific data to prove their theories. You just go, "nuh uh."
 

Matthiasa

Diamond Member
May 4, 2009
Well, seeing as I have experience with this one: I have a 144Hz monitor, not that I can often push that at high settings. However, one day my graphics card died, which meant switching back to integrated graphics, but that cable could only push 59-60Hz. Even when I was able to change settings to get the listed FPS back above a 60 FPS minimum, there was still a significant difference in my ability to perform against other opponents.

Note that this was a fighting-type game, where super-high resolution doesn't help as much due to the short distances involved.

For certain things it can really mean the difference between being among the best and barely being able to compete.
 

Eric1987

Senior member
Mar 22, 2012
Originally Posted by bystander36

Given that I get nausea when playing 1st-person-view games at 40 FPS, and get less and less of it until I reach 80ish FPS, I most certainly can tell from that alone. Not to mention that I'm pretty used to high FPS, so I spot it quite fast. Assuming we are talking about 1st-person-view games where you turn side to side often.

How can someone prove it to you with nothing but an eye test? Studies use scientific data to prove their theories. You just go, "nuh uh."

Scientific data? I haven't seen any besides web pages from random sites. And I am going on my own PERSONAL experience, and that of all my PC gaming friends, too. So which should I believe? My experience over 15 years, or some random web pages saying I should be able to see more? I live in Vegas; does anyone around here have any physical proof that it looks different? Not to mention I think it is a complete waste of GPU power. 60 FPS at 4K looks MUCH better than 144 FPS at 1080p. I laugh at 1080p; you couldn't pay me to go down in resolution just to get above 60 FPS. REALLY stupid IMO.

Above poster: if you perform worse at 60 FPS than you do at 144, then I am the best gamer alive. I would own everyone here with my eyes closed. I play ALL my games at 30-60 FPS and I am one of the best at the games I play. So am I godly, or do you just suck?

EDIT: Anyone who lives in Vegas, prove me wrong. If you prove me wrong, I'll buy you lunch. If I prove you wrong, you buy me lunch. Or a beer. Or a bowl. Whatever.
 

bystander36

Diamond Member
Apr 1, 2013
Originally Posted by Eric1987

Scientific data? I haven't seen any besides web pages from random sites. And I am going on my own PERSONAL experience, and that of all my PC gaming friends, too. ...

We've gotten off topic. Go to this thread if you want to continue: http://forums.anandtech.com/showthread.php?t=2434081
 

bystander36

Diamond Member
Apr 1, 2013
The mods don't like even adjacent topics. Anyway, there is a very good reason you can't prove that more than 60 FPS is noticeable: you have to have a monitor with a refresh rate higher than 60Hz for it to be capable of showing more, and for you to see more.

I have a 120Hz monitor; the above poster has a 144Hz monitor. You don't. On top of that, if the monitor has poor pixel response times, that blurs the images, hiding the differences and the clarity of the images.
 

Eric1987

Senior member
Mar 22, 2012
Originally Posted by bystander36

The mods don't like even adjacent topics. Anyway, there is a very good reason you can't prove that more than 60 FPS is noticeable: you have to have a monitor with a refresh rate higher than 60Hz for it to be capable of showing more, and for you to see more.

I have a 120Hz monitor; the above poster has a 144Hz monitor. You don't. On top of that, if the monitor has poor pixel response times, that blurs the images, hiding the differences and the clarity of the images.

I already put the proof on the table. Someone come prove me wrong, or I am going to continue preaching, because I'm right.
 

tential

Diamond Member
May 13, 2008
Quote:

No, it's not. I play all my FPS games at around that frame rate. And yes, I did read the entire article, Mr. Bionic here. I wish I could see that high of an FPS. 40 FPS looks the exact same as 60 to me. Hell, if I am on an SSD, 30 FPS without stutter looks and feels just fine to me, too. So according to you, people can see 144 FPS? LOL. I think people who pick 144Hz over 60Hz 4K haven't seen 4K. Games look breathtaking in 4K. And I get more than enough FPS with my 2x 290Xs.

So you and your real-life friends aren't sensitive to frame rates. Wow, no surprise there. Not many people really are sensitive to it. Most people who are sensitive are the ones who will seek out an online forum to talk about it, aka what's happening right now. I haven't used a 144Hz monitor; my first experience with over 60Hz will be with my first 4K HDTV at 1080p. Hisense is apparently debuting a 50-inch TV on the 23rd for $600. I may pick that up as a stopgap.

I really want to game at 4K, though. I would never give up getting to 4K just to get over 60Hz.
 

dave1029

Member
May 11, 2015
If you can push 60 fps @ 4K (or 45 with G-Sync), you will never want to change anything ever again. I think a lot of you are underestimating the effect that a higher resolution can have on gaming. Most people who try 4K are turned off, compared to a lower-resolution/higher-frame-rate setup, because there are very, very few systems that can push proper frames @ 4K.

Imagine a standard 28" screen. Now, this particular monitor is 3840x2160. 1080p would fill 1/4 of this screen if no scaling were involved; 1440p would fill about half the screen. Things look so much better because you have 4 pixels for every 1 of 1080p. It's not just clarity you gain: textures and ambience all have details you never thought existed. You see the individual scrapes on Geralt's armor in TW3. You see each individual strand of hair. Particle effects look realistic. Yes, they actually look real if done properly. My modded Skyrim smoke looks like actual smoke @ 4K, but at 1080p it's just pixel soup.

And fellas, this is just the effect of upscaling. When games start making textures at 4096x4096, all you people with your 144Hz 1440p monitors are going to be stunned. Most textures these days are 2048x2048, so 1440p is already seeing as much as it possibly can. 1440p was never meant to be the generational jump... it's just the bridge from 1080p to 2160p.
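The raw pixel math behind that "4 pixels for every 1" claim checks out; a quick sketch in Python:

Code:

# Pixel counts per resolution, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K:    8,294,400 pixels (4.00x 1080p)

So by pixel count, 4K is exactly 4x 1080p, while 1440p is 1.78x, i.e. a bit under half of 4K's pixels, which is roughly the "half the screen" figure above.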
 

Headfoot

Diamond Member
Feb 28, 2008
Originally Posted by Eric1987

I already put the proof on the table. Someone come prove me wrong, or I am going to continue preaching, because I'm right.

You haven't put one tiny speck of proof anywhere. All you've provided is proof of your ignorance and a whole lot of false bravado. You're definitively wrong. I'd pay money to listen to the rusty cogs in your head turn when you see CS:GO being played on a 60Hz monitor side by side with a 144Hz one.
 

Bulldawg007

Junior Member
Jul 4, 2015
I just went with 1440p @ 144Hz, and here's why. I just got an Acer XB270HU monitor and an EVGA 980 Ti Hydro Copper GPU.

First off, the number of frames your eye can see is irrelevant. Here's a loose analogy: think of it like a high-speed camera at a race track. The fast shutter speed makes for really clear pics, where you can read "Goodyear" on the tires, just like high-speed video makes for much better slow-motion shots and stills. Even if your eye can only see half the frames, they are beautifully clear frames. 144Hz looks better for fast-moving games. You can even move a text box around on your desktop and still read it, something you can't do @ 60Hz.

Also, that speed and clarity is realized on any size of monitor, whereas if you ask me, any monitor under 30" is a waste for 4K unless you are sticking your face right up to it. Here's a link to a 4K distance chart: http://www.rtings.com/info/4k-ultra-hd-uhd-vs-1080p-full-hd-tvs-and-upscaling-compared. My eyes are about 3-4' from my monitor, so 1440p is enough for me, given the speed drop I would have had to take to game in 4K.

Ever notice that the sample video playing in stores on the 4K TVs is only slow-moving clips, no fast pans or action shots? Because 4K lacks the speed to look as good in those situations vs. the 1080p sets. Throw in the scaling issues of 4K and the lack of content right now, and I just don't think it's ready yet. It's all marketing hype. When there are 4K 144Hz IPS monitors in a year or two, then I'll take the plunge and get about a 34" to really see the difference.
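The viewing-distance point can be sanity-checked with back-of-the-envelope math. A sketch in Python, assuming the commonly cited ~60 pixels per degree for 20/20 acuity (the exact threshold varies by person):

Code:

import math

def pixels_per_degree(width_px, height_px, diagonal_in, distance_in):
    # Pixel density of the panel, then how many pixels one degree
    # of vision spans at the given distance (small-angle approximation).
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return ppi * distance_in * math.pi / 180

dist = 42  # inches, ~3.5 feet from the screen
print(round(pixels_per_degree(2560, 1440, 27, dist)))  # 27" 1440p -> ~80 ppd
print(round(pixels_per_degree(3840, 2160, 28, dist)))  # 28" 4K    -> ~115 ppd

At that distance the 27" 1440p panel is already past the ~60 pixels-per-degree figure, consistent with the "1440p is enough for me at 3-4 feet" conclusion; 4K's extra density buys progressively less unless you sit closer or go bigger.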
 

Madpacket

Platinum Member
Nov 15, 2005
Interesting thread. I started a recent poll on monitor preferences, and the clear majority chose 144Hz IPS displays with G-Sync/Adaptive-Sync at 1440p over the other choices (including 4K):

http://forums.anandtech.com/showthread.php?t=2438047

And here's the reason why:

http://www.tftcentral.co.uk/reviews/content/asus_mg279q.htm#intro
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

There now exist two IPS displays that can handle 144Hz input; formerly only TN panels could handle this. Both screens use the same panel, but the Acer costs quite a bit more due to the superior G-Sync technology. I ended up grabbing the Asus model, as I don't think the Acer is worth the premium. Plus, the FreeSync range of 35-90Hz is right where you need it to be, so I don't find it a detriment. I'll only use FreeSync for single-player games, so input latency is not as critical.

Competitive gamers will disable adaptive/FreeSync, shoot for 120/144Hz, and deal with the minimal tearing. Really competitive players will still use TN panels, although with the overshoot and image stability issues I wonder if these new IPS displays would be a better option; it would be nice to hear from pro gamers (ones who aren't sponsored by a monitor company).

Anyways: 4K is great... but not with TN tech and not at 60Hz. Sure, slower-paced games may feel okay at 4K, and look incredible if you can find game assets built for 4K (we're years away from that being standard), but we're just not there yet.

If you've been PC gaming as long as I have, you will know 60Hz has *never* been fast enough. 80 to 85Hz is the lowest refresh rate for smooth vision (this was easy to distinguish on CRTs, but given how blurry LCD tech was, it kind of hid the problem), and 100+ Hz has always been the goal for good input response. These new IPS monitors handle this with ease, with no crappy viewing angles to deal with.

Best of both worlds is now available!

Gaming at a high refresh rate makes games so much more enjoyable, fun and rewarding. Over the years, input response is one thing that has really been overlooked as display technology advanced. It's really hard to describe unless you go way back and play on a CRT at a high refresh rate, or grab one of these new high-Hz monitors and play literally any game with motion that requires any sort of timing.

Eric1987 - if that was the year you were born, I could see how you missed playing games on classic high-Hz CRTs and ended up starting out on crappy LCDs. I know you state *you* don't see a difference, but I guarantee that if you played games at 100+Hz you would *feel* a difference. You make it sound like high refresh rates are some kind of placebo. Please do yourself a favour: borrow one from a friend and swap for a few days, play some games, and then swap back.

After you've done this, please come back here and apologize for your ignorant comments :)


tldr; 1440p/144Hz > 4K - majority agree:
http://forums.anandtech.com/showthread.php?t=2438047
 

Rebel_L

Senior member
Nov 9, 2009
Originally Posted by Madpacket

Interesting thread. I started a recent poll on monitor preferences, and the clear majority chose 144Hz IPS displays with G-Sync/Adaptive-Sync at 1440p over the other choices (including 4K).

Oddly enough, your poll does not even list 144Hz/1440p as an option.

On CRT monitors I do want an 85Hz refresh; anything less always had a noticeable pulsing to me. But considering the difference in how an LCD refresh works vs. a CRT refresh, it is not really the same kind of issue, as I have never noticed a pulse in any LCD I've used. I would guess that the actual issue on LCDs isn't about refresh at all, but rather about FPS, with the max FPS determined by the refresh rate.

I think if you wanted a better idea of what people actually find important, then rather than having such very specific monitor-geared options, you should try a few different polls: the first probably being the minimum FPS you are comfortable gaming with (assuming steady FPS, with no more than, say, a 5% drop from the listed number), and then one on the maximum screen size you are comfortable gaming with. It might also be useful to poll people who have used G-Sync/FreeSync, to see if they find a noticeable difference between playing with and without it in games where they are not running at sustained refresh-rate FPS.


Current choices of preferred monitor will always factor in the video cards you have available or in your budget. If you are buying a monitor to last a while, your minimum sustainable FPS will push upwards as you upgrade video cards in the future, and so long as you are happy with a steady FPS of 60 or lower and can find a 4K in the panel type and size you like, then for future use buying the higher resolution seems to be the way to go. In the end, the best advice I could give people is to find a store with a 144Hz demo set up, to see if they can even tell a difference between that and 60Hz; no matter what the general split is, if you can't tell the difference, you may as well save some money.
 

Madpacket

Platinum Member
Nov 15, 2005
Rebel: Yes, I botched the 1440p options in the poll, but AFAIK there are no 60Hz-only 1440p monitors with G-Sync/FreeSync, so...

If you took two of the same monitor and placed them beside each other, one running at 60Hz and the other at 144Hz, and played a demo of a competitive game like CS:GO or Quake Live, you would immediately see the difference. If you don't, you likely have severe visual problems and should see an eye doctor.

When you actually play the games, you feel a tremendous difference in feedback. Everything is just much smoother, and it's easier to perform moves.

I don't see much value in asking people about Adaptive-Sync/FreeSync/G-Sync. There are dozens of articles from respected sources that explain the advantages of the new refresh-rate tech. Anyone buying a new monitor today without it is really missing out, especially the 4K crowd here. Monitors with backlight strobing can also make a giant difference in visual clarity in fast-moving games, but it really hurts brightness and contrast, so it's not for everyone. Having used monitors with it, I normally opted for 144Hz over strobing.
 

Rebel_L

Senior member
Nov 9, 2009
Originally Posted by Madpacket

Rebel: Yes, I botched the 1440p options in the poll, but AFAIK there are no 60Hz-only 1440p monitors with G-Sync/FreeSync, so...

If you took two of the same monitor and placed them beside each other, one running at 60Hz and the other at 144Hz, and played a demo of a competitive game like CS:GO or Quake Live, you would immediately see the difference. ...

There are advantages, I don't disagree with that; it's just that they are advantages only some people make use of. In general, I am the kind of person who always buys things with features that sound useful for their technical merits, even if I will likely never use them. This being a tech site, we tend to all be enthusiasts who love features, but it also really tends to be that we recommend over-spec'd stuff for people, using our own standards, rather than taking the time to make a good recommendation for the average person who comes here looking for info.

Let's take me as an example. The last time I played a multiplayer FPS was probably 10-15 years ago, and that was on local networks with friends. I still enjoy them from time to time, but I play single-player these days, usually with unlimited ammo and guns, to just enjoy shooting the crap out of things. The most consistent time-wasters for me have been Civilization, MMOs, and some fun RPGs. Of the dozen or so people I know who game on a PC, none of them play multiplayer FPSs on their PCs; the few that do play FPSs do it on their consoles.

Of the people I know who PC game, I have generally always been the one to dedicate the most $ towards the hobby. I love system building as well, so I usually upgraded long before the games I played got choppy at max settings. My vision is still a little better than 20/20, and you can hand me most pairs of glasses, + or - prescriptions, and I can wear them without much noticeable eye strain for a few minutes (really strong prescriptions I do feel the strain from, though).

Of all the display issues I have had and/or heard complaints about from my friends (and I tend to hear most of them, because I am the person who keeps most up to date on techy things), the only two I can ever recall hearing are "it feels choppy" and "it looks weird", both in reference to TV 120Hz modes. I realize a TV's 120Hz mode is different from a monitor running at 120Hz, but since we tend to be hockey fans up here, LCD TV tech is actually important. For the most part, everyone I know adjusts fine to the higher-Hz TV modes and enjoys them; a couple of people (who don't own them) just can't stand the high-Hz modes for anything other than sports. Everyone I know who has one has adjusted just fine after owning it for a week or two, yet oddly they never complain about anything other than sports on lower-Hz TVs.

Choppy gameplay has so far always been associated with an FPS drop: high-traffic areas, heavy effects saturating the screen... you know, the standard "video card can't keep up because someone is using a 4-year-old card and wants to run max settings". Any problems of the input-lag type have always been traced to ping spikes or wireless peripherals. Of the people, including me, who have ever bothered looking at FPS during slowdowns to see how bad the problem is, the consensus is that gameplay only starts to feel choppy when you have FPS spikes or continuous FPS below 20.

I have had an LCD monitor for a long time; my first one was, I believe, a 20-inch 1600x1200 BenQ with ~30ms grey-to-grey and input lag, and I can tell you it felt like a glorious upgrade from my 85Hz CRT at 1280x1024. I was worried it would feel unresponsive, but the adjustment was so quick and natural that I did not even notice. Any time I have upgraded myself or anyone else to a higher resolution, it has been a positive upgrade; side-grades to better tech of the same resolution have only ever been noticeable, to anyone I know, when going from a TN panel to IPS, and then mostly for the viewing angles.

For me, or the people I would recommend monitors to, >60Hz is simply not a feature worth spending $ on; if it's the standard, or the choice costs no extra, of course you take it, but I can't justify sacrificing other features for it. Resolution and screen size, however, would be; I would recommend smaller, higher-DPI screens for some, larger for others, depending on their viewing distances. If any of them suffered from motion sickness (they don't, however), I would suggest getting as high a refresh rate as they can afford, with sync tech. In the end it's all about personal preference, and while there are certainly enough people who stubbornly refuse to try out new things, there are just as many who seemingly can't be OK with someone finding little value in a feature they value greatly.
 

Madpacket

Platinum Member
Nov 15, 2005
Originally Posted by Rebel_L

There are advantages, I don't disagree with that; it's just that they are advantages only some people make use of. ...

You make a valid point that most gamers are casual gamers. They don't tend to care about these things until frame rates dip below a certain threshold (which depends on how sensitive they are). I can PC game at 20 to 30 FPS and still actually enjoy it, depending on the genre; graphics are only one component, and gameplay/fun factor is still king. However (and I apologize for using a car analogy), if you let that gamer sit in front of a 4K 60Hz setup with G-Sync/FreeSync, or a 144Hz 1440p setup, it would be like them getting out of their Kia and jumping into a Porsche. Even casual gamers would notice, unless they're being completely obtuse or ignorant. The differences are not subtle; they are gigantic.

That being said, this (the AnandTech forums) is an enthusiast site dedicated to gaming enthusiasts; heck, this thread is about 4K vs. 144Hz 1440p, so it makes sense to look at these two options, weigh the pros and cons, and let people decide for themselves what the best option is.

Ideally people could just go to a store, test multiple setups themselves, and decide, but I don't see many computer stores willing to do that, so they have to rely on the experience and feedback of others.

As for comparing television or sports to gaming monitors and games (I'm in southern Ontario BTW, so I get the hockey thing), this isn't a valid comparison. 120Hz/240Hz/480Hz TVs frame-double, -triple, or -quadruple the 60Hz signal (which broadcast TV runs natively) to give the "soap opera" effect. This works okay for sports, but for everything else, meant to run at 23.976 fps, it butchers the experience; it makes movies and TV shows look like broadcast news, which sucks. Most people I know turn this feature off and leave it off. Not many TVs actually run at more than 60 FPS; they just show the same image multiple times, which gives the illusion of smooth playback. It ends up being a jerky mess that adds input latency, etc.

In other words, just a hack.
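A toy sketch of that duplicated-frames point, with hypothetical frame labels (interpolating sets synthesize in-between frames instead, but the source still only carries 60 unique frames per second):

Code:

# One second of 60 FPS source content on a "120Hz" TV vs. a true 120 FPS feed.
source_60fps = [f"frame{i}" for i in range(60)]

doubled_120hz = [f for f in source_60fps for _ in (0, 1)]  # each frame shown twice
true_120fps = [f"frame{i}" for i in range(120)]            # genuinely new frames

print(len(doubled_120hz), len(set(doubled_120hz)))  # 120 refreshes, 60 unique frames
print(len(true_120fps), len(set(true_120fps)))      # 120 refreshes, 120 unique frames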
 

Rebel_L

Senior member
Nov 9, 2009
Originally Posted by Madpacket

You make a valid point that most gamers are casual gamers. ...

I think threads just become combative way too quickly over what is ultimately personal preference for an overall experience. It's much worse in other sections, but refresh rate seems to be the most "passionate" topic these days as far as display tech goes. Often I have noticed that the people who end up asking which is better, X or Y, seem to be the ones I would deem not tech enthusiasts. Some gamers have big budgets, some small; some play a lot, some a little; but the direct questions commonly seem to come from people who have read enough to see some things being talked about and want to get a better feel for them, because they get a bit lost in the detailed discussions or just don't have that much interest in the spec type of information. So they ask X vs. Y, and as passionate enthusiasts we turn it into a brawl of X vs. Y, and end up leaving people with as many questions as they had at the start :)


Originally Posted by Madpacket

As for comparing television or sports to gaming monitors and games (I'm in southern Ontario BTW, so I get the hockey thing), this isn't a valid comparison. ... In other words, just a hack.

You are totally right that it's not anywhere near the same, because it always is a hack. I use it more as a way to judge a person's sensitivity to frequency effects. From what I have noticed in my limited sample, it seems to have some correlation with people who get motion sickness driving, on boats, etc. My 120Hz TV is always running in 120Hz mode, and because I have gotten used to it, and to 60Hz TVs as well, the only time I really notice the difference is when I'm watching hockey and the puck is harder to follow because it's just a blurry streak. I seem to be built for easy adaptation to screen frequency, and I find that very few of the issues surrounding it are ever issues for me (I have no problems with car or boat sickness either). I consider it a positive, because I get a greater selection of monitors to choose from when upgrading.