GeForce GTX 1180, 1170 and 1160 coming in August. Prices inside

Page 10 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Status
Not open for further replies.

piesquared

Golden Member
Oct 16, 2006
No, just high clocks. Architecture is going to be at least as efficient as Pascal clock for clock and on a refined process with a significant clock boost.

Well, high clocks or not, it doesn't change the fact that indications so far are that these chips will run power-hungry and hot. We'll have to see whether reference cards are any different, but aftermarket cards certainly appear to run hotter than prior generations, judging by the cooling the AIBs are showing off. I also doubt the 12nm process is going to see much refinement beyond minor tweaks; the cooling seems to indicate the ~15% higher clocks come only from that power increase, which could likely be achieved by overclocking the current cards with the same parameters.
 

piesquared

Golden Member
Oct 16, 2006
Correct, the numbers I'm calculating are: GTX 1080 at 180 W, RTX 2080 50% faster at 200 W.
So yes, a fairly efficient card in performance per watt; more efficient than Pascal.

I believe earlier in this thread I stated 40% better performance per watt and people thought I was crazy.
Well, I was close, it seems.
We shall see next week.

50% faster in all games, then? Or just the ones coded specifically for the architecture?
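The perf-per-watt claim in the quoted numbers is easy to sanity-check. A quick sketch, using only the rumored figures from the post (180 W, 200 W, +50% performance), none of which were confirmed at the time:

```python
# Rough perf-per-watt check using the quoted rumor numbers
# (GTX 1080 at 180 W; RTX 2080 assumed 50% faster at 200 W).
# All three figures are the poster's estimates, not confirmed specs.
pascal_power_w = 180.0
turing_power_w = 200.0
speedup = 1.50  # rumored relative performance

# perf/W ratio = (performance ratio) / (power ratio)
perf_per_watt_gain = speedup / (turing_power_w / pascal_power_w) - 1.0
print(f"perf/W improvement: {perf_per_watt_gain:.0%}")  # 35%
```

So the quoted numbers work out to roughly 35% better performance per watt, which is indeed close to the 40% claimed earlier in the thread.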
 

Dribble

Platinum Member
Aug 9, 2005

tviceman

Diamond Member
Mar 25, 2008
It's interesting that all the leaks so far are of cards with pretty beefy cooling, which would seem to indicate a power-hungry, hot-running architecture.

Or maybe they are going for a quieter-running card, because the biggest complaint about all closed-cooler cards is noise.
 

PeterScott

Platinum Member
Jul 7, 2017
No, it's just that Wells Fargo was looking stupid, having had Nvidia at $140 and "Underperform" for the last two years. Nvidia hasn't underperformed, and the share price is more like $260, which shows they are basically clueless.

Stock analysts are trailing indicators. They only react after the obvious has happened.
 

DooKey

Golden Member
Nov 9, 2005
If they release it now it will sell for a lot more than $649.
7nm is coming next year and might allow a new 30xx generation.
Nvidia won't leave AMD alone on 7nm for a whole year if they stick to a two-year cycle on 12nm.

This 14-16 nm > 7 nm transition will screw up recent traditions. Don't be a slave to the past.

Agreed. 7nm next year is going to limit the sales life of these 12nm cards. If there's going to be a Ti it will come out no later than the holidays or it won't come out at all.
 

maddie

Diamond Member
Jul 18, 2010
NVidia is so far ahead of AMD that they could just stick with 12nm all of next year.
Haven't you been the one who has always argued that AMD is pretty irrelevant and Nvidia needs to compete with itself for replacement sales, so it will keep pushing regardless of being dominant? Now you say this.
 

ozzy702

Golden Member
Nov 1, 2011
NVidia is so far ahead of AMD that they could just stick with 12nm all of next year.

They could stick with 12nm longer than that and still hold the lead. AMD is a good three to four years away from competing, and that's only if they throw some of their CPU gains at their GPU division. I fully expect near-zero competition in the GPU space until at least 2021, and by then Intel may be a competitor.
 

ozzy702

Golden Member
Nov 1, 2011
Well, high clocks or not, it doesn't change the fact that indications so far are that these chips will run power-hungry and hot. We'll have to see whether reference cards are any different, but aftermarket cards certainly appear to run hotter than prior generations, judging by the cooling the AIBs are showing off. I also doubt the 12nm process is going to see much refinement beyond minor tweaks; the cooling seems to indicate the ~15% higher clocks come only from that power increase, which could likely be achieved by overclocking the current cards with the same parameters.

12nm will allow significantly higher clocks than Pascal could hit even overclocked to the hilt, at lower power consumption per clock. Is it a large jump like 7nm? Nope, but it will give NVIDIA another round of top-tier cards in a market where they are only competing with themselves. 12nm should be cheap to mass-produce at this point, so pricing may be surprisingly low despite the lack of competition, since NVIDIA knows it needs to get GPUs out the door ASAP to compete with itself again on 7nm.
 

sze5003

Lifer
Aug 18, 2012
The last time I had an AMD card was when there was a bit of competition: the 7970 GHz Edition. That was a nice card and carried me through a lot. Since then I've not seen anything from AMD compete with a 1080 or 1080 Ti, which is a shame, because I always went for the best performance for the price, but now I can't really do that.
 

ozzy702

Golden Member
Nov 1, 2011
The last time I had an AMD card was when there was a bit of competition: the 7970 GHz Edition. That was a nice card and carried me through a lot. Since then I've not seen anything from AMD compete with a 1080 or 1080 Ti, which is a shame, because I always went for the best performance for the price, but now I can't really do that.

NVIDIA is still the best performance for the price on the high end... because they're the only option. LOL

Yeah, competition is king for the consumer and in a market with zero competition, we suffer.
 

maddie

Diamond Member
Jul 18, 2010
The last time I had an AMD card was when there was a bit of competition: the 7970 GHz Edition. That was a nice card and carried me through a lot. Since then I've not seen anything from AMD compete with a 1080 or 1080 Ti, which is a shame, because I always went for the best performance for the price, but now I can't really do that.
That's a bit harsh. Mid-range and low-end, AMD competes fairly well. It's only at the extreme top, Ti level, where they don't. The mining phenomenon skewed competition a lot; if prices had stayed stable, I still think the Vega series would have been in the running. $399 for a Vega 56 is not a bad price.

If you want the absolute best-performing card, then Nvidia is the default choice, but most buyers go mid-range, so I disagree with you on there being no competition.
 

PeterScott

Platinum Member
Jul 7, 2017
... pricing may be surprisingly low given the lack of competition since NVIDIA knows they need to get GPUs out the door asap for them to compete with themselves again on 7nm.

If NVidia is only competing with themselves, they don't need to rush anything. They could give the 12nm 2000 series GPUs two years on the market, just like the 16nm 1000 series.

I think NVidia is very plugged in on the process side; after all, 12nm FFN is tailor-made for them. They likely know that 7nm won't be as great for GPUs in 2019 as many people assume.

They can ride out the 7nm rough patch (lower yield, higher cost) on a very mature and proven 12nm process.
 

sze5003

Lifer
Aug 18, 2012
That's a bit harsh. Mid-range and low-end, AMD competes fairly well. It's only at the extreme top, Ti level, where they don't. The mining phenomenon skewed competition a lot; if prices had stayed stable, I still think the Vega series would have been in the running. $399 for a Vega 56 is not a bad price.

If you want the absolute best-performing card, then Nvidia is the default choice, but most buyers go mid-range, so I disagree with you on there being no competition.
Yeah, but why not have an alternative at every level? My favorite part of being a consumer of any product or company is choice. I was considering an RX 580 or one of that series, but if I want to max everything smoothly at 1440p I couldn't really do that, and then there's also VR, which I use for simulations.
 

VirtualLarry

No Lifer
Aug 25, 2001
NVidia is so far ahead of AMD that they could just stick with 12nm all of next year.
Didn't AMD do exactly that with the 65nm (90nm?) Opterons, and thought that they were so far ahead of Intel, that they didn't need to focus on process development for a while? And then, Intel came along, and ate their shirt, for like 10 years in a row after that?

NV isn't stupid.
 

PeterScott

Platinum Member
Jul 7, 2017
Didn't AMD do exactly that with the 65nm (90nm?) Opterons, and thought that they were so far ahead of Intel, that they didn't need to focus on process development for a while? And then, Intel came along, and ate their shirt, for like 10 years in a row after that?

That had little to do with process, and almost everything to do with Intel having superior architecture for almost a decade. Ryzen is a big deal, not because of process, but because of AMD finally having a competitive architecture again.

7nm may not turn out to be the panacea that you think it will be and NVidia is in a much better position to judge than any of us.
 

n0x1ous

Platinum Member
Sep 9, 2010
What are these companies going to do after Jim Keller dies? There will be no one left to come back and save them!
 

PeterScott

Platinum Member
Jul 7, 2017
That's a bit harsh. Mid-range and low-end, AMD competes fairly well. It's only at the extreme top, Ti level, where they don't. The mining phenomenon skewed competition a lot; if prices had stayed stable, I still think the Vega series would have been in the running. $399 for a Vega 56 is not a bad price.

Theoretically they compete, from a customer-facing perspective.

But look at what Vega 64/56 costs to build: it has a die bigger than the Pascal Titan/1080 Ti, yet it only competes against the lower-tier 1080/1070.

AMD's designs need bigger dies, more expensive memory, and more power (and likely more expensive VRM circuitry) to compete with NVidia's parts.

That makes for much worse margins, so really they are not that competitive. AMD is lucky that mining kept prices high, and their margins up along with it.

Before Vega and mining, NVidia had already cut the GTX 1070 MSRP to $349, and without mining it probably would have gone lower, which would have crushed AMD's margins even more. But thankfully for AMD, mining made any price pressure irrelevant.
 

crisium

Platinum Member
Aug 19, 2001
I don't think Nvidia will leave its two-year cadence unless AMD pulls a Zen in the GPU space.

Turing might be like Maxwell v2, though. GTX 980 in September 2014, GTX 1080 in May 2016: less than two full years. I can see a GTX 3080 in March-May 2020.

Why would they force 7nm cards out in 2019 and risk a huge gap before the next generation? We take Nvidia's tight cadence for granted sometimes, but they can't keep delivering improvements indefinitely. If a GTX 4080 has to wait until 2022 due to process-node and architecture engineering time, then a 7nm GTX 3080 launched in 2019 would have to carry three years.

Waiting until H1 2020 to launch also gives the 7nm process time to mature. Let the phone chips hog it for a while.

I guess they could do a 2019 7nm Quadro or Titan, but I would never expect GeForce gaming cards on 7nm in 2019 unless Nvidia has corporate espionage indicating AMD is hiding something revolutionary.
 

VirtualLarry

No Lifer
Aug 25, 2001
That had little to do with process, and almost everything to do with Intel having superior architecture for almost a decade.
Not a student of History, I see.

Back when AMD introduced the Athlon 64 architecture, it WAS more advanced than anything Intel had on the market at the time.
AND Intel was behind on its 90nm generation while AMD was ahead; then AMD decided to slow down its process development, because they felt they were so far ahead of everyone. Then Intel introduced 65nm Core 2, and AMD was left high and dry.
 

JasonLD

Senior member
Aug 22, 2017
Not a student of History, I see.

Back when AMD introduced the Athlon 64 architecture, it WAS more advanced than anything Intel had on the market at the time.
AND Intel was behind on its 90nm generation while AMD was ahead; then AMD decided to slow down its process development, because they felt they were so far ahead of everyone. Then Intel introduced 65nm Core 2, and AMD was left high and dry.

Intel released its 90nm parts six months before AMD's. The problem was their NetBurst architecture, but they never fell behind on process node.
 

vailr

Diamond Member
Oct 9, 1999
If top-end Nvidia GPUs are so much better (at performance per watt) than AMD's, then why did Apple decide to drop the Nvidia option from all of their Macs?
Or maybe the yet-unreleased "modular Mac Pro 2019" will again allow an Nvidia option?
Or at least external GPU boxes.
 