Will locked 4K gaming @ 60 FPS be a reality with the GTX 1180?

Organik

Member
Jul 15, 2018
58
2
6
Guys, I've been thinking about this a lot for a long time now. I got a GTX 1060 6GB; my card gives me 60 FPS, but I want to go to 4K res, and when I tried it I got 1 FPS with my card. I've heard 1080 Ti SLI can do 4K, but framerates drop to the 30s and 40s, I hear.

What will this GTX 1180 do for us gamers who have 4K monitors and powerful PCs and want to enjoy 4K gaming? I think it has to handle 4K the way the GTX 980 or GTX 1080 handled 1920x1080. We'll see, I guess. IMO it's all about 4K gaming; it's the future.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Until developers stop creating games to be run at 1080p and instead assume everyone has a 4K monitor, no video card will be able to play maxed settings at 4K and maintain 60 FPS. But nothing really changes for the 4K user: all the devs would be doing is reducing the visual quality of Ultra, something you can do now by simply lowering the settings they give you. The only difference is it won't have the label Ultra, even though it's the same settings.

If you want 4K at 60 FPS, lower your settings.
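To put rough numbers on why lowering settings is the lever, here's a back-of-the-envelope sketch in C++ (illustrative only; it assumes shading cost scales with pixel count, which real games only approximate since geometry, CPU, and bandwidth don't all scale that way):

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    // Pixels per frame at common gaming resolutions.
    const double p1080 = 1920.0 * 1080.0; // ~2.07 million
    const double p1440 = 2560.0 * 1440.0; // ~3.69 million
    const double p2160 = 3840.0 * 2160.0; // ~8.29 million

    // If per-pixel work dominated, a 4K frame would cost ~4x a 1080p frame.
    std::printf("4K vs 1080p: %.2fx the pixels\n", p2160 / p1080); // 4.00x
    std::printf("4K vs 1440p: %.2fx the pixels\n", p2160 / p1440); // 2.25x

    // Frame-time budgets: a locked target leaves 1000/fps ms per frame.
    for (int fps : {60, 90, 120})
        std::printf("%3d FPS -> %.2f ms per frame\n", fps, 1000.0 / fps);
    return 0;
}
```

So a card that just barely holds 1080p Ultra at 60 FPS has roughly a quarter of the per-pixel budget at 4K, and dropping from Ultra toward High/Medium is exactly what closes that gap.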
 
  • Like
Reactions: Organik

Organik

Member
Jul 15, 2018
58
2
6
Good to know, thanks. What do you think about 2x GTX 1180 SLI? Can that handle all settings maxed at 4K and give 1920x1080-type performance? I say there's a big chance, hopefully. 4K has been around for over 4 years now, so either the devs have to step it up or NVIDIA or ATI has to step it up and produce cards that can handle 4K like butter.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,199
126
I really don't know if I would go the SLI route. TBH, even NV seems to be supporting it less and less. It's also much more of a hassle to get working with DX12, since the support mostly falls on the game developer rather than just the driver developer (NV). So you might see in-game support for mGPU, and you might not. Some games might even support cross-vendor mGPU; I think Ashes did, as a tech-demo sort of thing. But I expect that to be fairly rare going forward, although I would love to be surprised and see it pop up more often.

Then again, you get games like Civ V that even crash if CFX is enabled. (I don't know how that game responds to SLI being enabled.)
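To illustrate the "support falls on the game developer" point: under DX12 explicit multi-adapter, the engine itself has to discover the GPUs and decide how to split work between them; there's no driver-side SLI/CFX profile doing it behind the scenes. Here's a minimal Windows-only C++ sketch of just the first step, enumerating adapters with the standard DXGI calls; everything after this (splitting the frame, cross-GPU copies, sync) is on the game:

```cpp
#include <windows.h>
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    // A DX12 title discovers GPUs through a DXGI factory.
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Skip the software rasterizer (WARP). With explicit multi-adapter,
        // the game, not the driver, decides whether and how a second GPU is used.
        if (!(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)) {
            wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

As I recall, Ashes' cross-vendor mode was built on exactly this kind of explicit enumeration, which is why mGPU support now exists only if each game bothers to write it.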
 
  • Like
Reactions: Organik

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I think an 1180 Ti will most likely do it.
Maybe an overclocked 1180.

Three months later 5K will be the new thing and you can chase that resolution next...

Then 8K after that...

Then someone will convince us that our eyes can tell the difference between 8K and 12K...

And so on and so on and so on...
 
  • Like
Reactions: Organik

alcoholbob

Diamond Member
May 24, 2005
6,343
412
126
I don't see the 1180 being faster than the Titan V since it's also being made on the 12nm process, so it's unlikely unless you mean medium-high vs. max details. In that case even a 1080 Ti can do 4K60 locked today.
 
  • Like
Reactions: Organik

Organik

Member
Jul 15, 2018
58
2
6
I don't see the 1180 being faster than the Titan V since it's also being made on the 12nm process, so it's unlikely unless you mean medium-high vs. max details. In that case even a 1080 Ti can do 4K60 locked today.

Yes, you're right; medium to high is good enough for me.
 

maddogmcgee

Senior member
Apr 20, 2015
400
380
136
If you want 60 FPS, wouldn't you be better off getting a 1440p screen? You could run higher settings at 1440p, so I would imagine it would look better overall than 4K at lower graphics settings. It's also plenty of resolution for a big monitor, AND you won't feel the need to get the very highest card of every generation.
 
  • Like
Reactions: Organik

Mopetar

Diamond Member
Jan 31, 2011
8,178
6,940
136
It'll probably be close, but I don't think all titles will reach that mark, and that's just with presently available games.

Judging by recent history, the 1180 should be similar to a 1080 Ti, so it seems unlikely. Maybe the 1180 Ti gets there, but it might take until the 1280 Ti before we've truly landed.

I guess it really comes down to how much you want to max settings. There are a lot of games that run a lot faster with very little visual difference between high and ultra settings. Even medium looks damn good in some games.
 
  • Like
Reactions: Organik

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Like others have said, you can already do 4K 60 FPS locked at medium settings in many games on an OC 1080 Ti or Titan V today.
 
  • Like
Reactions: Organik

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It'll probably be close, but I don't think all titles will reach that mark, and that's just with presently available games.

Judging by recent history, the 1180 should be similar to a 1080 Ti, so it seems unlikely. Maybe the 1180 Ti gets there, but it might take until the 1280 Ti before we've truly landed.

I guess it really comes down to how much you want to max settings. There are a lot of games that run a lot faster with very little visual difference between high and ultra settings. Even medium looks damn good in some games.

If the 1180 Ti came out tomorrow and it was 4 times faster than any video card today, within a year the developers would have games out with settings that drop it to less than 60 FPS at 4K. The devs have the ability to make any current video card crawl to a halt at any resolution.

AAA game devs pick the settings they do so that people with 1080p monitors can run the game at Ultra and have decent FPS, which means 4K is going to need the settings turned down. They have always had the ability to offer much higher-detail graphics that are much more demanding. They simply do not show us those settings, because people would cry that their game is poorly optimized. Instead, they just show us what can be used at 1080p, and no more.
 

NTMBK

Lifer
Nov 14, 2011
10,361
5,468
136
No, it probably won't. I expect you'll need to wait for 7nm graphics cards to get 4K60.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
As bystander let on, it'll take a culture shift. Devs can make games that would chug along at a slideshow even with perfect SLI scaling on tri-Titan V. Do you think Pixar movies run in real time on anything on this planet?

Devs seem content to have Ultra settings playable at 1080p60 with mid-range cards like a 1060, which means high-end cards are aimed more at 1440p60 than 2160p60.

The culture should shift eventually. 1024x768 was once the standard, and we wondered when we could get games running smoothly at 1600x1200.

You can shift your own expectations, though.

If you value PPI over graphical fidelity, then don't run everything at Ultra. Max your textures and lower everything else to Medium or so. Basically, you need 2013-level graphical fidelity. But perhaps not even that's enough... after all, 2013's Crysis 3 at 4K Very High will not hit a smooth 60 FPS even on an OC 1080 Ti.

I too run into the trap of wanting Ultra on everything, but the great thing about PC games is choosing what you prioritize: framerate, graphical fidelity, or PPI. You can't have it all, even if you have the latest and greatest (there are Titan V users with the new 4K 144Hz monitors... they are still making that compromise).
 
  • Like
Reactions: Organik

Mopetar

Diamond Member
Jan 31, 2011
8,178
6,940
136
If the 1180 Ti came out tomorrow and it was 4 times faster than any video card today, within a year the developers would have games out with settings that drop it to less than 60 FPS at 4K. The devs have the ability to make any current video card crawl to a halt at any resolution.

AAA game devs pick the settings they do so that people with 1080p monitors can run the game at Ultra and have decent FPS, which means 4K is going to need the settings turned down. They have always had the ability to offer much higher-detail graphics that are much more demanding. They simply do not show us those settings, because people would cry that their game is poorly optimized. Instead, they just show us what can be used at 1080p, and no more.

I think what you describe is more a result of consoles driving sales for most AAA titles (obviously there are some that are PC-only) and the need to hit 60 FPS on a TV with the console hardware. However, as 4K TVs get cheap and the next generation of consoles gets better hardware, there will be a shift to 4K.

Making a CPU/GPU crawl is always easy. Look at the large number of games with ultra settings that tank FPS for little or no improvement in visual quality. If you do something like crank the tessellation to ridiculous levels, it's incredibly easy. But I wouldn't classify that as useful.

Occasionally we do get a game that pushes the state of the art forward. The developers find new ways to use the game's graphics to enhance the visual aesthetic beyond simply increasing the polygon count. Then other games start to incorporate those things as well. Then you get games where medium isn't nearly as good, because even at ultra settings the GPU is busy with actual work for all the different effects, and spare power would speed that up rather than just cranking things to a pointless level.
 

HutchinsonJC

Senior member
Apr 15, 2007
466
205
126
Until developers stop creating games to be run at 1080p and instead assume everyone has a 4K monitor, no video card will be able to play maxed settings at 4K and maintain 60 FPS.

I'm sorry, but this is kind of putting the cart before the horse.

First, no amount of developer assumption that "everyone has a 4K monitor" is going to either a) change the fact that there are still a lot of monitors out there being used for gaming that are NOT 4K, or b) magically make graphics processing technology able to suddenly run games (across a huge variety of genres/titles) at 4K at 60+ FPS across the board.

Even if you could literally wipe out every non-4K monitor in existence overnight and force developers to look at 4K and beyond as the only option going forward, the graphics processing industry would still need time to improve its tech.

Next, the developers have nothing to gain by "stopping creating games to be run at 1080p." It's not as if making games playable at 1080p hinders them from also making games playable at 1440p or 4K. Games come out of the box able to be played at several resolutions, based on your hardware's capabilities.

OP,
If you want to enjoy 4K gaming, you'll have to wait until the GRAPHICS PROCESSING TECH is available to readily push 60+ FPS across a majority of genres and titles, or play those games that don't require all that much graphics processing power at 4K and play everything else with turned-down settings and/or resolutions.

I don't even know why people are looking at 4K gaming as a serious thing right now. 4K gaming is in its infancy on both the panel and the graphics processing sides of things. It'll be the same lot crying that they don't have any graphics processors to push their 4K panels to 144Hz the year following. Simply put, you kinda gotta wait for the tech to mature.
 

Batboy88

Member
Jul 17, 2018
71
2
11
Yeah, people already are and have been doing it, lol. The 1060 can touch some lighter/OK stuff in 4K pretty well. It's still really more of a 2K card though, IMO.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm sorry, but this is kind of putting the cart before the horse.

First, no amount of developer assumption that "everyone has a 4K monitor" is going to either a) change the fact that there are still a lot of monitors out there being used for gaming that are NOT 4K, or b) magically make graphics processing technology able to suddenly run games (across a huge variety of genres/titles) at 4K at 60+ FPS across the board.

Even if you could literally wipe out every non-4K monitor in existence overnight and force developers to look at 4K and beyond as the only option going forward, the graphics processing industry would still need time to improve its tech.

Next, the developers have nothing to gain by "stopping creating games to be run at 1080p." It's not as if making games playable at 1080p hinders them from also making games playable at 1440p or 4K. Games come out of the box able to be played at several resolutions, based on your hardware's capabilities.

OP,
If you want to enjoy 4K gaming, you'll have to wait until the GRAPHICS PROCESSING TECH is available to readily push 60+ FPS across a majority of genres and titles, or play those games that don't require all that much graphics processing power at 4K and play everything else with turned-down settings and/or resolutions.

I don't even know why people are looking at 4K gaming as a serious thing right now. 4K gaming is in its infancy on both the panel and the graphics processing sides of things. It'll be the same lot crying that they don't have any graphics processors to push their 4K panels to 144Hz the year following. Simply put, you kinda gotta wait for the tech to mature.

Have you ever downloaded Crysis 1-3 mods that allow you full access to the settings? How about Skyrim, or other games? There are a number of games with your standard Low-to-Ultra settings that also have sliders to take the settings much further than Ultra. These are all examples of how the devs chose their Low-to-Ultra range based on the speed of the hardware most people have available. They have always been capable of giving us spectacular visuals that drop a 1080 Ti to its knees at 720p, but they don't, as their game would get labeled as very unoptimized (optimization is mostly a matter of finding the right compromise between visuals and performance). At the same time, they don't make Ultra run at 60 FPS on 4K monitors either. They typically make it so the top-end video card can run 1440p at 60 FPS in their games.

If they made games run at Ultra on 4K monitors and maintain 60 FPS, they would be leaving visual quality on the table. People who choose 1080p and 1440p would have lower-quality visuals. This would result in the game getting labeled as having low graphics quality, and AAA games need the graphics-quality buzz to help sales.

Another note: have you noticed that, over the years, the top-end video card has always been able to maintain 60 FPS at the mainstream resolution and not higher, even as video cards keep getting faster and faster? Do you think that is a coincidence?

Hint to the last question: devs have told us over the years that they plan their graphics quality around the hardware predicted to be available when the game releases.
 
  • Like
Reactions: crisium

CakeMonster

Golden Member
Nov 22, 2012
1,555
717
136
The way things are (not) moving now, I wish there were more monitors with more resolutions available. 1440p is almost exclusively available at 27", which is way too small. 4K is available at 32", which is fine, but it runs too slow so far. 1600p (16:10) is available at 30", which I think is pretty much perfect, but those monitors have never gotten the *sync treatment and max out at 60Hz. Ultrawide is and will always be impractical outside of very narrow use cases and the bragging factor.

Why can't someone release a 30"+ 1600p *sync monitor, or maybe 1800p? There seems to be a gamer segment in that range, and it would be sweet for desktop use as well, where 4K pushes it too far with performance and with text size given bad Windows scaling.
 
  • Like
Reactions: Feld

HutchinsonJC

Senior member
Apr 15, 2007
466
205
126
bystander36,

Your argument, as near as I can tell,

Until developers stop creating games to be run at 1080p and instead assume everyone has a 4K monitor, no video card will be able to play maxed settings at 4K and maintain 60 FPS.

is basically that developers can add more bells and whistles and bring a GPU to its knees, and that if they weren't trying to do all these bells and whistles, the games would play fine at 4K. AKA, if we could magically get rid of 1080p for everyone, developers would be forced to turn off a lot of bells and whistles (a lower baseline) to get a game to run at 4K.

I'm curious what exactly the point of a higher-resolution screen is, where you'd be able to better see all these details, if you're just turning them all off to make it work at 4K.

Logically, I can see where you might turn off, or at least lower, MSAA or the like at the higher resolution, which frees up some processing power, but your overall point is lost on me.

They typically make it so the top-end video card can run 1440p at 60 FPS in their games.

There's a realistic reason for that: ownership statistics.

The Steam survey, being easily accessible, says that 1920x1080 is the primary display resolution for some 60.55% of folks. You can't not keep 60% of your gamers in mind. 4K looks like 1.21%; 1440p is 3.45%.

1366x768 has 14.47% | silly laptops...
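If you lay those shares out side by side (toy arithmetic, using only the percentages quoted above):

```cpp
#include <cstdio>

int main() {
    // Primary display resolution shares from the Steam survey figures
    // quoted above (mid-2018; the remainder is other resolutions).
    struct Share { const char* res; double pct; };
    const Share shares[] = {
        {"1920x1080", 60.55},
        {"1366x768",  14.47},
        {"2560x1440",  3.45},
        {"3840x2160",  1.21},
    };

    for (const Share& s : shares)
        std::printf("%-10s %6.2f%% of primary displays\n", s.res, s.pct);

    // The whole business case for a 1080p baseline in one number:
    std::printf("1440p + 4K combined: %.2f%%\n", 3.45 + 1.21);
    return 0;
}
```

1440p and 4K combined are under 5% of primary displays versus 1080p's ~60%, which is the whole reason the baseline sits where it does.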

If I were a AAA developer, I wouldn't be trying to write my games for 4K 144 FPS as the baseline.

If I were a hardcore gamer, I wouldn't want my AAA developer turning off all these bells and whistles to achieve 144 FPS at 4K, when the whole point of 4K is to SEE more detail.

There's a balance, and I'm sure developers know and realize that. Otherwise they would've lost the game a long time ago.
 

Mopetar

Diamond Member
Jan 31, 2011
8,178
6,940
136
I think that perhaps an even bigger upside of cards that can do 4K at 60 FPS is that they’ll likely be able to do VR at levels that people have been claiming are truly necessary for a fully immersive experience.

Some people have sworn that you really need 90 FPS and that you probably want 1440p displays at a minimum. I think that's what a lot of the next generation of headsets are targeting for minimum specs.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
bystander36,

Your argument, as near as I can tell,
...

My argument is simple. Developers design their games for the hardware available to players at the time of release. They dial the settings back so that the top-end cards can run their games at Ultra on mainstream monitors and beat 60 FPS. That often means 1440p can still manage it too, but 1080p ATM is really their target.

The above is a fact, and not open to debate.

The result of the above is that Ultra at 4K is always out of reach until the developers consider 4K the mainstream resolution.

Now here is where the mental game comes into play. All you have to do is lower your settings so that 4K can handle it, and you can play at 4K now; but can you handle the idea of not using maxed settings? Would you be happier if the developers took those higher details away so that 4K could run the games at 60 FPS on Ultra, even though you can play like that now, only without the Ultra label (many people on these forums can't handle that part)? Would you rather play at 1080p with higher graphical settings, or 4K at lower graphical settings? Would you rather play at 720p with even higher graphical quality?

At some point, we as gamers have to let go of "Ultra" and actually decide on the better balance between graphical settings and resolution crispness.

Summed up: I believe that if you want to play at 4K at 60 FPS, you can do it right now. You just have to be willing to reduce graphical settings. The idea that GPUs will catch up to Ultra at 4K is a false one, as the devs will just keep moving the bar.
 
  • Like
Reactions: crisium

CPUGuy

Junior Member
Nov 20, 2008
16
3
76
With next-gen consoles rumored to be more HDR compliant, I would think the same will apply to future PC titles. Therefore, I think 4K HDR at 60 FPS will be the norm in a few years. The issue now is the substandard HDR seen today in some of the monitors that offer it: limited HDR zones, low nits, narrow gamut, etc.

Once that's all addressed and costs less than $1,000, we should be looking at a new standard.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
My argument is simple. Developers design their games for the hardware available to players at the time of release. They dial the settings back so that the top-end cards can run their games at Ultra on mainstream monitors and beat 60 FPS. That often means 1440p can still manage it too, but 1080p ATM is really their target.

The above is a fact, and not open to debate.

The result of the above is that Ultra at 4K is always out of reach until the developers consider 4K the mainstream resolution.

Now here is where the mental game comes into play. All you have to do is lower your settings so that 4K can handle it, and you can play at 4K now; but can you handle the idea of not using maxed settings? Would you be happier if the developers took those higher details away so that 4K could run the games at 60 FPS on Ultra, even though you can play like that now, only without the Ultra label (many people on these forums can't handle that part)? Would you rather play at 1080p with higher graphical settings, or 4K at lower graphical settings? Would you rather play at 720p with even higher graphical quality?

At some point, we as gamers have to let go of "Ultra" and actually decide on the better balance between graphical settings and resolution crispness.

Summed up: I believe that if you want to play at 4K at 60 FPS, you can do it right now. You just have to be willing to reduce graphical settings. The idea that GPUs will catch up to Ultra at 4K is a false one, as the devs will just keep moving the bar.

I want to take a shot at making this smaller.

Creating a game requires trade-offs. You could add more bells and whistles if your target resolution were 720p @ 60 FPS. But those bells and whistles are not the optimal trade-off, because their benefit diminishes and you gain less than if you targeted 1080p with fewer bells and whistles.

You could target 4K instead of 1080p, as you would gain more just as you do when you bump from 720p to 1080p, but because so few have 4K, why spend the time?

Fair representation?
 
  • Like
Reactions: crisium

nOOky

Diamond Member
Aug 17, 2004
3,144
2,154
136
My system can certainly play 4K games at 60Hz right now, but it depends on the settings, and of course the game. Older CoD games, no problem; Far Cry 5, no way. I know 4K gets trashed here regularly because it's not over 60Hz, but even when I dip into the high 40s for FPS playing DOOM with everything on ultra, it's perfectly playable and looks great in single-player. Plenty of enthusiasts use 4K monitors or televisions right now.

The bigger question IMHO is, when the 4K/120Hz monitors arrive, what GPU will even come close to keeping up with that? For online shooters I can't see that being viable for years yet. If I were into online first-person shooters I certainly wouldn't consider any 4K monitor right now; however, if you mainly play single-player and can match your GPU to your monitor's method of adaptive sync, it can be very enjoyable.

If the OP is thinking the new cards will be able to maintain 60 FPS in every game, that won't happen anytime soon, I'm afraid.