So, I've been considering upgrading for the last few months, and the Titan X plus G-Sync has me thinking about getting one along with a ROG Swift. That'd be roughly $1,750, which is technically within budget, but I'm weighing all the options that would have me spending less (ROG Swift + 980, XG270HU + 290X, waiting for the R9 390X, etc.). I'm also considering 4K...
Anyway, here are the things I am wondering:
1. At 4K, does FreeSync/G-Sync make 45-50 FPS feel good enough compared to a GPU that can solidly maintain 60 FPS? (My rough mental model of why it might is sketched after this list.)
2. Does this also hold at 1440p? I ask because if I go the 980/290X route, it might not be long before new releases can't be held at 60 FPS at 1440p ultra quality; there are a few even now...
3. I've been reading some of Guru3D's reviews, where they also benchmark framerate experience (like so: http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,28.html). I'm assuming this is an important factor complementing the monitor's input lag/latency, but exactly how important is it? Also, they have been doing the FCAT stuff since before G-Sync/FreeSync existed; does G-Sync/FreeSync have any bearing on those results?
3b. Any chance AnandTech will ever start benchmarking this and include it in the Bench?
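For context on question 1, here's the toy mental model behind it: a quick Python sketch with made-up render times and a deliberately simplified vsync model (it ignores buffering back-pressure), so treat the numbers as illustrative only.

```python
import math
import random
import statistics

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh interval, ~16.7 ms

def vsync_intervals(render_ms):
    # Toy model: with vsync on a fixed 60 Hz panel, a finished frame waits for
    # the next refresh, so on-screen intervals snap to multiples of 16.7 ms.
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]

def vrr_intervals(render_ms):
    # Toy model: with G-Sync/FreeSync the panel refreshes when the frame is
    # ready (within its range), so on-screen intervals track render times.
    return list(render_ms)

# Fake a ~45-50 FPS workload: render times jittering between 20 and 23 ms.
random.seed(0)
render_ms = [random.uniform(20.0, 23.0) for _ in range(1000)]

for label, intervals in (("vsync @ 60 Hz", vsync_intervals(render_ms)),
                         ("variable refresh", vrr_intervals(render_ms))):
    avg = statistics.mean(intervals)
    p99 = sorted(intervals)[int(0.99 * len(intervals)) - 1]
    print(f"{label:16s} avg {avg:5.1f} ms   99th percentile {p99:5.1f} ms")
```

On that fake workload, everything over 16.7 ms gets held to the 33.3 ms boundary under vsync, so 45-50 FPS rendering ends up being presented like 30 FPS, while the variable-refresh path stays at its natural ~21-22 ms cadence. Whether that actually feels as good as a locked 60 is exactly what I'm asking.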
Any thoughts? Comments need not be focused on the exact questions. My actual goals are to:
1. Upgrade to at least 1440p
2. Get better input lag for shooters
3. Upgrade to a GPU powerful enough that I get better FPS at 1440p than I do at 1080p now (in Dragon Age: Inquisition on Ultra I was getting around 30-40 FPS for much of the game)
Current system for reference:
4690k with Hyper212 EVO
Gigabyte GA-Z97X-SLI
8GB 1600 DDR3
850W Thermaltake SMART PSU (70 amps on 12v rail)
Currently playing at 1080p on a 7970 (non-GHz edition)
Using Samsung 32" TV as monitor. I like the way it looks, except I'm betting the input lag is horrendous...