GeForce GTX 1180, 1170 and 1160 coming in August. Prices inside

Page 18
Status
Not open for further replies.

24601

Golden Member
Jun 10, 2007
Listed clocks don't matter that much.

What will really determine end-user clocks will be how low they limit end-user voltages.

We already know that Turing will hit at minimum 2050 MHz at Pascal voltages.

Will be annoyed if none of the AIB cards have a Dual-link DVI-D port though.

RIP Korean Overclocked IPS/PLS panels

The ASUS RX Vega had one, so there's a slim hope the Nvidia release will.
 

sze5003

Lifer
Aug 18, 2012
I'd rather wait for some benchmarks. No point pre-ordering when all we know so far suggests it's not that big of an upgrade.
 

Brahmzy

Senior member
Jul 27, 2004
If you’re running 4K like me, you upgrade regardless. Been waiting for this card for a LONG time. Even 25% over 1080Ti is 25% over 1080Ti.
 

sze5003

Lifer
Aug 18, 2012
Brahmzy said:
If you’re running 4K like me, you upgrade regardless. Been waiting for this card for a LONG time. Even 25% over 1080Ti is 25% over 1080Ti.
Nah, I never upgrade unless it's closer to 40% more than what I already have. I'm on 1440p still, so unless I see some performance results and comparisons, I can wait. $1k for just 20% doesn't justify it for me. I don't mind spending that much, but I have to have a reason for it.
 

Brahmzy

Senior member
Jul 27, 2004
^^ Did you not read my post clearly? You’re not running 4K, so your reply was essentially meaningless. Think about it.
If I was running 1440 with a 1080Ti I wouldn’t upgrade either. 4K is a completely different story, thus my part about 4K.
 

sze5003

Lifer
Aug 18, 2012
Brahmzy said:
^^ Did you not read my post clearly? You’re not running 4K, so your reply was essentially meaningless. Think about it.
If I was running 1440 with a 1080Ti I wouldn’t upgrade either. 4K is a completely different story, thus my part about 4K.
Yes, I read correctly. Even if I was doing 4K, I'd want something a bit better than 20-25%; it's not the money that's the issue. Of course, we are basing all this without seeing any benchmarks.
 

Brahmzy

Senior member
Jul 27, 2004
25% is huge for frame rate minimums @ 2160p60.
If you’re dipping into 30s n 40s and now you dip into the 40s n 50s, that’s a massive difference in playability.
Again, you don’t have 4K, so you’re talking about things you don’t know about.
Anybody running large-screen 4K with any AA knows what I’m talking about.
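The claim above is just arithmetic on frame rates and frame times. A quick illustrative sketch (the fps numbers are hypothetical, not benchmarks):

```python
# Illustrative only: how a flat 25% GPU uplift moves 4K framerate minimums.

def uplifted_fps(fps, uplift=0.25):
    """Scale a framerate by a fixed GPU performance uplift."""
    return fps * (1 + uplift)

def frametime_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

for minimum in (30, 40):
    new = uplifted_fps(minimum)
    print(f"{minimum} fps -> {new:.1f} fps "
          f"({frametime_ms(minimum):.1f} ms -> {frametime_ms(new):.1f} ms per frame)")
```

A 40 fps dip becomes 50 fps, i.e. 25 ms per frame drops to 20 ms, which is why a fixed percentage matters most exactly where framerates are lowest.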
 

sze5003

Lifer
Aug 18, 2012
I use mine mainly for military simulator software I've been working on in VR, so I'd be interested in what improvements it would offer in that area. I haven't had the need to go to 4K yet, since I don't feel like getting a new G-Sync monitor right now.
 

Timmah!

Golden Member
Jul 24, 2010
I have to say I am mightily surprised regarding the rumored 2080Ti - did not really see that coming. With the first leaks of those box arts yesterday I got confused for a moment and thought the Ti might actually be just the full, uncut 104 die, while the regular 2080 would be the one with the rumored 2944 cores. But it seems the Ti is truly going to boast the big 102 chip. Most interesting... Hopefully it's not more expensive than the 1080Ti, cause I am going to want 2 :-O

Anyway, I think this only means that the 2080 (the 104 chip) with its 2944-core count is not really faster than the 1080Ti, at least not when you leave ray tracing out of the equation - which for most users will be the majority of the time. And that leaked GPU beating the Titan V in Ashes is probably not the 2080, but the Ti version.
 

Justinus

Diamond Member
Oct 10, 2005
Brahmzy said:
25% is huge for frame rate minimums @ 2160p60.
If you’re dipping into 30s n 40s and now you dip into the 40s n 50s, that’s a massive difference in playability.
Again, you don’t have 4K, so you’re talking about things you don’t know about.
Anybody running large-screen 4K with any AA knows what I’m talking about.

I've been gaming exclusively in 4k since 2015. I will absolutely not pay any amount of money for a mere 20-25% increase.

It's going to have to be a 50% increase across the board for me to even think about it. I suspect that's going to mean waiting for the next series after this.
 

Brahmzy

Senior member
Jul 27, 2004
Justinus said:
I've been gaming exclusively in 4k since 2015. I will absolutely not pay any amount of money for a mere 20-25% increase.

It's going to have to be a 50% increase across the board for me to even think about it. I have the idea that's going to mean waiting for the next series after this.

Me too - gaming since early 2015 on 40"+ displays. Absolutely I'll upgrade for 25%. Different strokes / priorities / budgets. I won't be happy until I can hold a solid 60 FPS with no dips, with AA on, in the titles I play. I'll take any step closer to that I can, and as I told people over a year ago, this won't be the generation of card to deliver that yet. It'll be a lot closer, though.
 

Justinus

Diamond Member
Oct 10, 2005
Brahmzy said:
Me too - gaming early 2015 on 40"+ displays. Absolutely I'll upgrade for 25%. Different strokes / priorities / budgets. I won't be happy until I can hold a solid 60FPS with no dips with AA on the titles I play. I'll take any step closer to that I can, and as I've told people over a year ago, this won't be the generation card to deliver that yet. It'll be a lot closer though.

I've been using a Dell 24" 4k60 display since 2015. I do get the luxury of much higher PPI than you do, so frequently I can turn down or off AA with very little visual impact (some exceptions apply).

I also have my 1080ti in a custom loop and it runs 2088/12400 daily stable, which helps out quite a bit from stock or FE.

I honestly didn't think the 1080Ti was a big enough jump from my 980Ti to upgrade, until I saw that in some newer games it could average as much as 80-90% more framerate than the 980Ti. That made me throw my money down.

I can run practically every game that isn't poorly optimized (quantum break, monster hunter world, hellblade) at 4k60 with some tweaking from ultra/max settings (some exceptions apply).

I don't want to make it a bit easier to get 4k60 or tune up the minimums, I want to demolish 4k60 and get a high refresh rate 4k display so I can enjoy 60-100+ FPS, and +25% on a 1080ti ain't gonna do it. +50-70% would have been a start.
 

Brahmzy

Senior member
Jul 27, 2004
Sorry about the size - definitely not a luxury. Zero desire to play on a 24” screen.
I was playing on a 32 1440p and a 30 1600p before the 40 4K, now 43 4K, 75 4K for HT.
There’s zero immersion factor in a 24” screen. -Far from a luxury, just sayin’.
But, it does require some level of AA and/or supersampling.
Never been into small screens! Again, different strokes.
 

Justinus

Diamond Member
Oct 10, 2005
Brahmzy said:
Sorry about the size - definitely not a luxury. Zero desire to play on a 24” screen.
I was playing on a 32 1440p and a 30 1600p before the 40 4K, now 43 4K, 75 4K for HT.
There’s zero immersion factor in a 24” screen. -Far from a luxury, just sayin’.
But, it does require some level of AA and or SuperSampling.
Never been into small screens! Again, different strokes.

I have bad eyes, so I have to be really close to my display. The high PPI is necessary for me due to the viewing distance. I can see perfectly only to 14" and wearing glasses using a monitor sucks.
 

PeterScott

Platinum Member
Jul 7, 2017
Justinus said:
I don't want to make it a bit easier to get 4k60 or tune up the minimums, I want to demolish 4k60 and get a high refresh rate 4k display so I can enjoy 60-100+ FPS, and +25% on a 1080ti ain't gonna do it. +50-70% would have been a start.

50%-70% almost never happens.

Really, 25-30% is the typical generational gain, and it will be enough for lots of people; not everyone is on 1000-series cards.

This generation packs something beyond the typical performance upgrade. The most interesting thing will be what happens with the RT/Tensor HW going forward.
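The two positions in this exchange are actually consistent: if ~25-30% is the per-generation norm, the 50%+ jump some posters want is roughly what two generations compound to. A back-of-envelope sketch (the percentages are the thread's rumored figures, not measurements):

```python
# Back-of-envelope: per-generation uplifts compound multiplicatively,
# so two "typical" generations land in the 50-70% range posters want.

def compounded_uplift(per_gen, generations):
    """Total speedup after several generations of a fixed per-gen uplift."""
    return (1 + per_gen) ** generations - 1

for per_gen in (0.25, 0.30):
    total = compounded_uplift(per_gen, 2)
    print(f"{per_gen:.0%} per gen -> {total:.0%} over two generations")
```

Two generations at 25% compound to about 56%, and at 30% to about 69%, which is why skipping a generation tends to deliver the jump a single generation rarely does.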
 

piesquared

Golden Member
Oct 16, 2006
285W TDP, ouch. That's probably a best-case-scenario number; we're probably looking at well over 300W power draw. All evidence is still pointing to hot and power-hungry, which explains why the only leaks so far show beefy coolers or water-cooling designs.
 

moonbogg

Lifer
Jan 8, 2011
Justinus said:
I want to demolish 4k60 and get a high refresh rate 4k display so I can enjoy 60-100+ FPS

Nvidia has worked to make this possible. They now have high-refresh 4K HDR monitors, and these new GPUs coming out are rumored to have some kind of improved SLI function, something to do with NVLink (rumored). If true, and even using old SLI, it should be possible to get two Titans or 2080Tis and a new 4K HDR monitor and have high framerates, HDR, as well as ray tracing.

Considering it's 4K @ 98Hz+ we're talking about, new Titans should really be used over the 2080Tis. The more power, the better.

The new setup can be yours for only $8000. If you are willing to settle for the 2080Tis (highly unrecommended), then the setup is yours for only $4000. If you insist on only using a single card (plebeian and highly unrecommended), then you must wait 12 months as well as offer a modest donation of $3000 to your benefactors.
 

Karnak

Senior member
Jan 5, 2017
piesquared said:
285W TDP ouch. That's probably a best case scenario number, we're probably looking at well over 300W power draw. All evidence is still pointing to hot and power hungry, which explains why the only leaks so far are showing beefy coolers or water cooling designs.
Remember that the PNY card is a factory-overclocked one. Probably around 250W for the 1080Ti FE; don't know about the 2080 though.
 

Justinus

Diamond Member
Oct 10, 2005
moonbogg said:
Nvidia has worked to make this possible. They now have high refresh 4K HDR monitors and these new GPU's coming out are rumored to have some kind of improved SLI function; something to do with NvLink (rumored). If true, and even using old SLI, it should be possible to get two Titans or 2080Ti's and a new 4K HDR monitor and have high framerates, HDR as well as Ray Tracing.

Considering it's 4K @ 98hz+ we're talking about, new Titans should really be used over the 2080ti's. The more power, the better.

The new setup can be yours for only $8000. If you are willing to settle for the 2080Ti's (highly unrecommended) then the setup is yours for only $4000. If you insist on only using a single card (plebeian and highly unrecommended) then you must wait 12 months as well as offer a modest donation of $3000 to your benefactors.

I fully appreciate your tone. It seems like this is the unfortunate truth. I'd gladly drop $1000 for a 4k high refresh rate display and $800-1000 for a card that was fast enough to drive it, but nvidia is pricing all of the things we could want into the stratosphere.
 

alcoholbob

Diamond Member
May 24, 2005
Justinus said:
I've been gaming exclusively in 4k since 2015. I will absolutely not pay any amount of money for a mere 20-25% increase.

It's going to have to be a 50% increase across the board for me to even think about it. I have the idea that's going to mean waiting for the next series after this.

There's a Titan V CEO Edition on eBay right now at a $5,500 bid...

900GB/s memory bandwidth, 5120 shader cores, and 128 ROPs will beat whatever is coming out on Monday :D
 

Azix

Golden Member
Apr 18, 2014
I've seen regular custom coolers.

piesquared said:
285W TDP ouch. That's probably a best case scenario number, we're probably looking at well over 300W power draw. All evidence is still pointing to hot and power hungry, which explains why the only leaks so far are showing beefy coolers or water cooling designs.

I should have taken bets when I told people it's possible for AMD and Nvidia to switch power-consumption leadership. It's not certain that they have switched roles; they may just have similar power consumption once the new AMD cards hit. But if they do switch, I am looking forward to all the people who criticized AMD's power consumption on comparable-performance cards changing their tune to "it doesn't really matter." At minimum these cards look to be using significantly more than the previous generation, which is what I suspected would happen if Nvidia made major changes to their architecture beyond what they did with Pascal.

I also said it's possible for performance leads to switch. New architectures and new manufacturing processes boost the chances of this happening. People swear AMD is a mile behind Nvidia, but that may not be true. Vega 64, for example, had the horsepower but didn't manage to push the pixels fast enough. New processes and architectures can let engineers address whatever issues there were in the past.

Things could get interesting.

PeterScott said:
50%-70% almost never happens.

Really 25-30% is the typical generation and it will be enough for lots of people, and not everyone is on 1000 series cards.

This generation packs something beyond typical performance upgrade. The most interesting thing will be what happens with the RT/Tensor HW going forward.

The issue here is that whatever they have added to increase the value might have delayed relevance, like GCN suffered. Raw performance increases in DX12, Vulkan and DX11 are more important.
 