Will locked 4K gaming @ 60fps be a reality with the GTX 1180?


bystander36

Diamond Member
Apr 1, 2013
I want to take a shot at making this smaller.

Creating a game requires trade-offs. You could add more bells and whistles if your target resolution were 720p @ 60fps, but those bells and whistles are not the optimal trade-off: their benefit diminishes, and you gain less than if you had targeted 1080p with fewer of them.

You could target 4K instead of 1080p and gain more, just as you do when you bump from 720p to 1080p, but because so few people have 4K, why spend the time?

Fair representation?

Kind of close, though I think this is missing a key point. You are saying the bells and whistles wouldn't be worth it at 720p. My point isn't even whether it is worth it or not, but that the devs will give it to us anyway, based on what their target resolution is and how much system power is available.
 

realibrad

Lifer
Oct 18, 2013
Kind of close, though I think this is missing a key point. You are saying the bells and whistles wouldn't be worth it at 720p. My point isn't even whether it is worth it or not, but that the devs will give it to us anyway, based on what their target resolution is and how much system power is available.

But that is because there is very little cost to adding them at the specified resolution. Many people have a 1080 (or Ti) and game at 1080p, so you might as well put in the extra stuff and let it reduce performance by a bunch, because they would be well over their monitor's FPS cap anyway. If they were at 4K, you would need less extra stuff to look as good, but if you kept those extra things in, you would lose a massive amount of performance.

So, if you have 100 units of performance:

@720p
40 units for resolution
60 units for extras
- con: low resolution

@1080p
65 units for resolution
35 units for extras
- con: fewer units for extras, but worth it because of the increase in resolution

@4K
90 units for resolution
10 units for extras
- con: designing a game this way is a waste of time when very few have this resolution
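To put some rough numbers behind that split (the unit figures above are just an illustration, and the figures below are my own), here is a quick sketch comparing raw pixel counts per frame. It assumes GPU cost scales roughly with pixel count, which is only approximately true:

```python
# Minimal sketch with my own numbers: raw pixel counts per frame, a rough
# first-order proxy for how much GPU work resolution alone costs. Real games
# don't scale perfectly linearly with pixels, so treat the ratios as ballpark.

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

4K pushes 4x the pixels of 1080p and roughly 9x the pixels of 720p, which is why the "resolution" slice of the budget balloons.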
 

bystander36

Diamond Member
Apr 1, 2013
But that is because there is very little cost to adding them at the specified resolution. Many people have a 1080 (or Ti) and game at 1080p, so you might as well put in the extra stuff and let it reduce performance by a bunch, because they would be well over their monitor's FPS cap anyway. If they were at 4K, you would need less extra stuff to look as good, but if you kept those extra things in, you would lose a massive amount of performance.

So, if you have 100 units of performance:

@720p
40 units for resolution
60 units for extras
- con: low resolution

@1080p
65 units for resolution
35 units for extras
- con: fewer units for extras, but worth it because of the increase in resolution

@4K
90 units for resolution
10 units for extras
- con: designing a game this way is a waste of time when very few have this resolution
My point isn't about what is worth it or not, only about the nature of game development. If 720p were the mainstream resolution and only 2% had 1080p monitors, they'd add those extra effects to push 720p. I also bet you'd have a group of people telling us that 720p is superior to game on, because you can use higher graphical settings.
 

HutchinsonJC

Senior member
Apr 15, 2007
I don't feel that I really agree with you, bystander.

The difference going from 720p (or rather 1024x768) to 1080p, on computer screens of a comfortable physical size, was pretty dramatic in terms of text smoothness at the size text needs to be to stay readable at a 2-to-3-foot distance. Getting to 1080p was a pretty huge necessity.

I don't see the point in drawing up theoretical circumstances like 2% of screens being 1080p and a group of people arguing for 720p... because no one in their right mind would argue for 720p on ~24" screens; the difference between 1024x768 and 1920x1080 was so glaringly obvious.

But now that we are up to 1440p, the difference your eye is able to see on a 24" or 27" screen... well, it isn't quite as night-and-day as it was coming up from 1024x768, let's just say that.
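To put rough numbers on that (assuming a 24" panel in every case, which is my own simplification for an apples-to-apples comparison; perceived sharpness also depends on viewing distance and subpixel rendering), the pixel densities work out something like this:

```python
# Back-of-the-envelope pixel density. The 24" diagonal is an assumption for
# illustration only; real panels of the 1024x768 era were usually smaller.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given native resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for w, h in [(1024, 768), (1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f'{w}x{h} on a 24" panel: {ppi(w, h, 24):.0f} PPI')
```

Going from ~53 to ~92 PPI is a far bigger relative jump than going from ~92 to ~122, which is basically my point.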
 

bystander36

Diamond Member
Apr 1, 2013
I don't feel that I really agree with you, bystander.

The difference going from 720p (or rather 1024x768) to 1080p, on computer screens of a comfortable physical size, was pretty dramatic in terms of text smoothness at the size text needs to be to stay readable at a 2-to-3-foot distance. Getting to 1080p was a pretty huge necessity.

I don't see the point in drawing up theoretical circumstances like 2% of screens being 1080p and a group of people arguing for 720p... because no one in their right mind would argue for 720p on ~24" screens; the difference between 1024x768 and 1920x1080 was so glaringly obvious.

But now that we are up to 1440p, the difference your eye is able to see on a 24" or 27" screen... well, it isn't quite as night-and-day as it was coming up from 1024x768, let's just say that.

You are focusing on the wrong thing here. I do not think 720p is worth developing for. I don't want to use 720p with higher graphical settings. I have not said or hinted at that one bit.

I have simply stated that developers will design their games around the power of the hardware available and the mainstream resolution at release. The precise resolution that is mainstream at the time is irrelevant; they simply create their games to look the best they can with the hardware of the day. I also noted that the player base tends to be predictable in its response to new hardware.
 

Batboy88

Member
Jul 17, 2018
Oh, I can see the difference big time going up to 1440p/2K/ultrawide, whatever... it still looks awesome. 4K may be hit or miss, but yeah, you should see the difference there too. It does look crisper, with more edge and detail.
 

HutchinsonJC

Senior member
Apr 15, 2007
You are focusing on the wrong thing here. I do not think 720p is worth developing for. I don't want to use 720p with higher graphical settings. I have not said or hinted at that one bit.

Hey, I'm not trying to say that you were rooting for 720p, but you did draw it up as a theoretical that there'd be a group of people cheering for 720p, and that's what I'm disagreeing with you on. 1024x768 next to 1920x1080 was so dramatic a difference that no one in their right mind would argue for 720p... hence why I said I don't see the point in drawing up the theoretical circumstance.

If 720p were the mainstream resolution and only 2% had 1080p monitors, they'd add those extra effects to push 720p. I also bet you'd have a group of people telling us that 720p is superior to game on, because you can use higher graphical settings.
 

wilds

Platinum Member
Oct 26, 2012
The games I play are very easy to run at 4K60, and I have been loving 4K60 for a few years now. It really comes down to the game itself.

Elite Dangerous, Destiny 2, and Rainbow 6 are all very easy to run.

I don't play R6 at 60Hz anymore, but Elite and Destiny 2 are incredibly well optimized for 4K.

I would be far more concerned about 4K144 than about 4K60 at this point.
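Rough arithmetic for why (pixel throughput only, ignoring fixed per-frame CPU and engine costs, so take it as a lower bound):

```python
# Pixel throughput each target asks of the GPU (pure arithmetic, ignoring
# fixed per-frame costs, so this understates the real gap if anything).

targets = [
    ("1080p @ 60", 1920, 1080, 60),
    ("4K @ 60",    3840, 2160, 60),
    ("4K @ 144",   3840, 2160, 144),
]

base = 3840 * 2160 * 60  # 4K60 as the reference

for name, w, h, fps in targets:
    rate = w * h * fps
    print(f"{name}: {rate / 1e6:.0f} Mpx/s ({rate / base:.1f}x 4K60)")
```

4K144 needs roughly 2.4x the pixel throughput of 4K60, on top of a much tighter frame-time budget.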
 

Batboy88

Member
Jul 17, 2018
The games I play are very easy to run at 4K60, and I have been loving 4K60 for a few years now. It really comes down to the game itself.

Elite Dangerous, Destiny 2, and Rainbow 6 are all very easy to run.

I don't play R6 at 60Hz anymore, but Elite and Destiny 2 are incredibly well optimized for 4K.

I would be far more concerned about 4K144 than about 4K60 at this point.

I still haven't tried Destiny 2, though I thought it looked cool... I haven't played much lately, really. I played a lot of BF1 and Fallout 4, and got pretty far into ESO, heavily modded and all.
 

wilds

Platinum Member
Oct 26, 2012
I still haven't tried Destiny 2, though I thought it looked cool... I haven't played much lately, really. I played a lot of BF1 and Fallout 4, and got pretty far into ESO, heavily modded and all.

Fallout 4 is another game that runs GREAT at 4K60. I am entirely CPU limited in it, as well as in GTA V at 4K; both games are heavily modded for way more AI. I also have to add ARMA 3, which looks great at 4K and where I am again entirely CPU limited with lots of AI.

I absolutely love 4K60 in CPU-limited games, but in twitch shooters I can't stand 60Hz.

The worst-running game at 4K for me is Star Citizen, and I kinda feel like an idiot for spending $60 on Squadron 42...
 

Batboy88

Member
Jul 17, 2018
Fallout 4 is another game that runs GREAT at 4K60. I am entirely CPU limited in it, as well as in GTA V at 4K; both games are heavily modded for way more AI. I also have to add ARMA 3, which looks great at 4K and where I am again entirely CPU limited with lots of AI.

I absolutely love 4K60 in CPU-limited games, but in twitch shooters I can't stand 60Hz.

The worst-running game at 4K for me is Star Citizen, and I kinda feel like an idiot for spending $60 on Squadron 42...

Yeah, ARMA absolutely needed Intel... GTA was pretty demanding too. ESO was workable on other hardware early on but still CPU demanding; you just really needed to mod the hell out of it... then they changed and updated a bunch of stuff and really broke it.
 

bystander36

Diamond Member
Apr 1, 2013
Hey, I'm not trying to say that you were rooting for 720p, but you did draw it up as a theoretical that there'd be a group of people cheering for 720p, and that's what I'm disagreeing with you on. 1024x768 next to 1920x1080 was so dramatic a difference that no one in their right mind would argue for 720p... hence why I said I don't see the point in drawing up the theoretical circumstance.
You underestimate your fellow gamers. It may be hard to imagine now, but you have to realize that those who say 1080p is superior to 1440p and 4K, because you can play at higher detail settings, don't have anything better than 1080p monitors. They are making assumptions without having experienced both. I've been gaming for over 30 years, and this scenario has played out over and over from the beginning.
 

Batboy88

Member
Jul 17, 2018
You underestimate your fellow gamers. It may be hard to imagine now, but you have to realize that those who say 1080p is superior to 1440p and 4K, because you can play at higher detail settings, don't have anything better than 1080p monitors. They are making assumptions without having experienced both. I've been gaming for over 30 years, and this scenario has played out over and over from the beginning.

Exactly... I'm on 2K/ultrawide with a 1060 and a 7700K. Yeah, there's still quite a bit of stuff where I can't turn everything up or supersample without a good frame hit, or it just doesn't run right, yet it still looks better to me regardless, and there is a handful of stuff where I can. That's why I said it's realistic with a 1080 or something like a Vega 56; 2K and 4K are absolutely worth it.

It's tempting to go up to 4K... there are a lot of good monitor deals, and yeah, we are back to good deals on cards again. But it might be worth waiting for next gen too.

And it doesn't help that mine is another 4.8GHz chip, a voltage-hungry one that would need a delid to go past that... not a silicon lottery winner, this one.
 

cmdrdredd

Lifer
Dec 12, 2001
The games I play are very easy to run at 4K60, and I have been loving 4K60 for a few years now. It really comes down to the game itself.

Elite Dangerous, Destiny 2, and Rainbow 6 are all very easy to run.

I don't play R6 at 60Hz anymore, but Elite and Destiny 2 are incredibly well optimized for 4K.

I would be far more concerned about 4K144 than about 4K60 at this point.

Well, even in Destiny 2 with my 1080 Ti I don't get a locked 60fps 100% of the time with every setting maxed. I have to reduce depth of field, and even then there are times when a particular spot will not be 60fps but closer to 50. There's a difference between averages and locked: averages are no good if the lows drop to 25 in tense combat and you only average 60 because moving through a hallway runs at 150fps. Some public events in Destiny 2 get so hectic that my framerate tanks for a second or two. It could be the game, it could be something else, but my point stands, and I actually have to reduce the resolution scale to 93% of 3840x2160 to maintain 60fps in 99% of situations. That's still not quite locked either, if you want to be truthful.
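For reference, here's roughly what that 93% scale works out to (a quick sketch; I'm assuming the slider scales each axis, which is how most games implement it):

```python
# What a 93% resolution scale at 3840x2160 actually renders. Assumption: the
# scale applies to each axis (as most games implement it); if it applied to
# total pixel count instead, the saving would be smaller.

scale = 0.93
native_w, native_h = 3840, 2160

render_w, render_h = round(native_w * scale), round(native_h * scale)
native_px = native_w * native_h
scaled_px = render_w * render_h

print(f"Render target: {render_w}x{render_h}")
print(f"Pixel load: {scaled_px / native_px:.0%} of native 4K")
```

So even at "93%" you're only rendering about 86% of the native 4K pixel load.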

Personally, I have zero desire to play a game at medium just to have a higher resolution. I'd take 1080p + Ultra settings over 4K @ medium if that's what it came down to. For me the biggest benefit of going to a 4K TV was HDR; games look incredible in HDR too. I imagine this is what the OP was referring to: max settings at 4K with 60fps locked, because anyone can put the game on low and claim they are playing at 4K, but to be honest the game would look better at 1080p with everything maxed.
 

Batboy88

Member
Jul 17, 2018
My system can certainly play 4K games at 60Hz right now, but it depends on the settings and, of course, the game. Older CoD games: no problem. Far Cry 5: no way. I know 4K gets trashed here regularly because it's not over 60Hz, but even when I dip into the high 40s for FPS playing DOOM with everything on ultra, it is perfectly playable and looks great in single player. Plenty of enthusiasts use 4K monitors or televisions right now.

The bigger question, imho, is when the 4K/120Hz monitors arrive, what GPU will even come close to keeping up with that? For online shooters I can't see that being viable for years yet. If I were into online first-person shooters I certainly wouldn't consider any 4K monitor right now; however, if you mainly play single-player and can match your GPU to your monitor's method of adaptive sync, it can be very enjoyable.

If the OP is thinking the new cards will be able to maintain 60fps in every game, that won't happen anytime soon, I'm afraid.

Well, most of what I wanted to try out hasn't been that intensive or hard to run... so yeah, there have literally been zero issues with just about everything.
 

wilds

Platinum Member
Oct 26, 2012
Well, even in Destiny 2 with my 1080 Ti I don't get a locked 60fps 100% of the time with every setting maxed. I have to reduce depth of field, and even then there are times when a particular spot will not be 60fps but closer to 50. There's a difference between averages and locked: averages are no good if the lows drop to 25 in tense combat and you only average 60 because moving through a hallway runs at 150fps. Some public events in Destiny 2 get so hectic that my framerate tanks for a second or two. It could be the game, it could be something else, but my point stands, and I actually have to reduce the resolution scale to 93% of 3840x2160 to maintain 60fps in 99% of situations. That's still not quite locked either, if you want to be truthful.

Personally, I have zero desire to play a game at medium just to have a higher resolution. I'd take 1080p + Ultra settings over 4K @ medium if that's what it came down to. For me the biggest benefit of going to a 4K TV was HDR; games look incredible in HDR too. I imagine this is what the OP was referring to: max settings at 4K with 60fps locked, because anyone can put the game on low and claim they are playing at 4K, but to be honest the game would look better at 1080p with everything maxed.

There are a lot of settings I reduce in D2 to maintain 4K60. I hate depth of field in any game, so that is off. I also always disable ambient occlusion; it kills my 1070. Light shafts are lowered and shadows are down to high.

I absolutely prefer lower settings on a 4K display over ultra settings on a 1080p display. I actually play at lower settings on my 1080p monitor to maintain 180Hz/180fps. I love that we see things completely opposite; that's not a bad thing, since we can choose, unlike on a console!
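The frame-time budgets are what drive that trade-off for me (simple arithmetic, nothing game-specific):

```python
# Frame-time budget at each target framerate: the higher the target, the less
# time the GPU gets per frame, which is why settings come down.

for fps in (60, 144, 180):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# Prints:
#  60 fps -> 16.67 ms per frame
# 144 fps -> 6.94 ms per frame
# 180 fps -> 5.56 ms per frame
```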
 

Batboy88

Member
Jul 17, 2018
There are a lot of settings I reduce in D2 to maintain 4K60. I hate depth of field in any game, so that is off. I also always disable ambient occlusion; it kills my 1070. Light shafts are lowered and shadows are down to high.

I absolutely prefer lower settings on a 4K display over ultra settings on a 1080p display. I actually play at lower settings on my 1080p monitor to maintain 180Hz/180fps. I love that we see things completely opposite; that's not a bad thing, since we can choose, unlike on a console!

Yeah, frames should not be much of an issue at all. Like I said, it's god-level UFO freakshow stuff they're doing now, lol.
 

jpiniero

Lifer
Oct 1, 2010
I do think the focus with Turing is going to be 4K; in other words, the gap between the 1180 and the 1080 Ti will be much bigger at 4K than at 1080p. 128 ROPs, maybe?
 

Batboy88

Member
Jul 17, 2018
I do think the focus with Turing is going to be 4K; in other words, the gap between the 1180 and the 1080 Ti will be much bigger at 4K than at 1080p. 128 ROPs, maybe?

I do not know... they've kept next-gen ideas and whatever is planned locked up real tight. Everyone does want to know. It makes no sense to get a 1080 now, if you didn't get one already, when the better cards are right around the corner.