GeForce GTX 1180, 1170 and 1160 coming in August. Prices inside


crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I don't think AMD resting on their laurels on process nodes had anything to do with Core 2 rocking their boat. It took Phenom II on 45nm in 2009 for AMD to properly compete with 65nm Core 2.

Of course, by the time it came out they were competing with 45nm Core 2 and a vastly superior Nehalem.

45nm in 2006 was not feasible, so I don't think the claim that they erred by not pursuing anything smaller than 65nm holds any truth at all.

Phenom (I, on 65nm) was inferior to Core 2, but at least it competed better than K8 did. If they had launched it in 2006 instead of 2008, they would have done better. But I don't think they took anything for granted. They had serious revenue problems, despite having a superior product, due to Intel's illegal anti-competitive practices. They never had the R&D budget, so them falling behind is logical. Zen somewhat catching up to the Lake cores is a far bigger shocker than this.
 

maddie

Diamond Member
Jul 18, 2010
5,151
5,537
136
I don't think Nvidia will leave its two-year cadence unless AMD pulls a Zen in the GPU space.

Turing might be like Maxwell v2 though. GTX 980 September 2014, GTX 1080 May 2016. Less than two full years. I can see a GTX 3080 in March-May 2020.

Why would they force 7nm cards out in 2019 and risk a huge gap before the next gen? We take Nvidia's tight cadence for granted sometimes, but they can't keep delivering big improvements indefinitely. If the GTX 4080 lands in 2022 due to process-node and architecture engineering time requirements, then you have 3 years on a 7nm GTX 3080 if it launches in 2019.

Waiting until H1 2020 to launch also gives the 7nm process time to mature. Let the phone chips hog it for a while.

I guess they could do a 2019 7nm Quadro or Titan, but I would never expect GeForce gaming cards on 7nm in 2019, not unless Nvidia has corporate espionage indicating AMD is hiding something revolutionary.
The lead time between commencing a design and marketing it rebuts that argument. You can't wait to see what your competitor introduces and then immediately counter it on performance. 7nm and its derivatives will be a long-lived node, as 14nm was. Time for many advances.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
If top-end Nvidia GPUs are so much better (in performance per watt) than AMD's, then why did Apple decide to drop the Nvidia option from all of their Macs?
Or maybe the yet-unreleased "modular Mac Pro 2019" will again allow the Nvidia option?
Or at least for external GPU boxes.
Because Apple never offers top-end GPU options in any of their systems, so it's irrelevant to them.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Not a student of History, I see.

Back when AMD introduced the Athlon 64 architecture, it WAS more advanced than anything Intel had on the market at the time.
AND Intel was behind on their 90nm generation while AMD was ahead; then AMD decided to slow down their process development because they felt they were so far ahead of everyone. Then Intel introduced 65nm Core 2, and AMD was left high and dry.

I remember my history well. In that period I had a Pentium Pro, followed by an Athlon, followed by a Core 2.

AMD was ahead, and Prescott behind, not because Prescott was on 90nm, but because of Prescott's NetBurst architecture, an architectural mistake that Intel recognized and completely abandoned.

Intel stayed ahead because initially they rapidly switched architectures. NetBurst only lasted about 2 generations; then they actually went back to a revamped P6 architecture for Core and Core 2, so about 2 more generations. Then Nehalem was a huge improvement, and Sandy Bridge was a nice improvement again (then the stagnation set in).

Meanwhile AMD didn't really improve much on the architecture side. They stagnated while Intel created Core, then Nehalem, then Sandy Bridge, and if anything, AMD arguably regressed with Bulldozer.

The story of Intel dominating that last decade is more about AMD's architectural stagnation than it is about process.
 
  • Like
Reactions: GodisanAtheist

JasonLD

Senior member
Aug 22, 2017
488
447
136
If top-end Nvidia GPUs are so much better (in performance per watt) than AMD's, then why did Apple decide to drop the Nvidia option from all of their Macs?
Or maybe the yet-unreleased "modular Mac Pro 2019" will again allow the Nvidia option?
Or at least for external GPU boxes.

Two proprietary-loving companies won't get along well, lol. More than likely, AMD gave them far more attractive prices on those GPUs than Nvidia was willing to offer.
 

maddie

Diamond Member
Jul 18, 2010
5,151
5,537
136
Theoretically it competes, from a customer-facing perspective.

But look at what Vega 64/56 costs to build. It has a die bigger than the Pascal Titan/1080 Ti's, yet it only competes against the lower-tier 1080/1070.

AMD designs need bigger dies, more expensive memory, and more power (likely more expensive VRM circuitry) to compete with NVidia parts.

That makes for much worse margins, so really they are not so competitive. AMD is lucky that mining kept pricing high, and their margins up along with it.

Before Vega and mining, NVidia had already cut the GTX 1070 MSRP to $349, and without mining it probably would have gone lower, which would have crushed AMD's margins even more. But thankfully for AMD, mining made any kind of price pressure irrelevant.
You're conflating things. From the consumer's perspective, they're competitive in the mid and low range. Profit margins relate to internal company decisions and don't affect my ability to afford a product. Retail price and performance are what matter to me. I feel fairly certain that is the case for most people, and there's nothing theoretical about that.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
You're conflating things. From the consumer's perspective, they're competitive in the mid and low range. Profit margins relate to internal company decisions and don't affect my ability to afford a product. Retail price and performance are what matter to me. I feel fairly certain that is the case for most people, and there's nothing theoretical about that.

Throughout the life of these products, the GTX 1080 has also been more affordable than Vega 64 while performing slightly better, so those production costs matter.
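
To put rough numbers on that die-cost argument, here is a back-of-the-envelope sketch. The die areas are approximate published figures (~486 mm² for Vega 10 vs ~314 mm² for GP104), and the formula is just the standard gross-dies-per-wafer estimate, so treat it as illustrative, not as anyone's actual costing.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard gross-die estimate: wafer area over die area,
    minus an edge-loss term for partial dies at the rim."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Approximate published die sizes, in mm^2 (assumed for illustration).
vega10, gp104 = 486, 314
print(f"~{dies_per_wafer(vega10):.0f} candidates at {vega10} mm^2")  # ~115
print(f"~{dies_per_wafer(gp104):.0f} candidates at {gp104} mm^2")  # ~188
```

Same wafer, roughly 40% fewer die candidates before any defect losses: that gap is the margin problem in a nutshell.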
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
The lead time between commencing a design and marketing it rebuts that argument. You can't wait to see what your competitor introduces and then immediately counter it on performance. 7nm and its derivatives will be a long-lived node, as 14nm was. Time for many advances.

I'm not a simpleton expecting, say, Navi to launch in July 2019 and wreck Turing, with Nvidia then going into panic mode and suddenly designing, developing, and releasing a 7nm card a few months later. That's impossible and ludicrous. What do you take me for?

The two-year cadence is not an accident. And AMD's upcoming performance is not unpredictable. Put the two together, and they will only violate their heavily researched and tested two-year cadence if they know AMD has a miracle (which is what it would take for Turing to fall behind AMD in energy efficiency and die-size efficiency in 2019).

Also, the extra time doesn't need to be all sitting on their laurels. Why release a 7nm die-shrunk Turing in 2019 when you can release a faster 7nm new architecture in 2020? The faster chip, combined with a longer time between product releases, will move more units. They won't do both, as they would really begin cannibalizing their own sales at that point and diminishing the generational gains to where it's like CPUs.
 
  • Like
Reactions: PeterScott

maddie

Diamond Member
Jul 18, 2010
5,151
5,537
136
I'm not a simpleton expecting, say, Navi to launch in July 2019 and wreck Turing, with Nvidia then going into panic mode and suddenly designing, developing, and releasing a 7nm card a few months later. That's impossible and ludicrous. What do you take me for?

The two-year cadence is not an accident. And AMD's upcoming performance is not unpredictable. Put the two together, and they will only violate their heavily researched and tested two-year cadence if they know AMD has a miracle (which is what it would take for Turing to fall behind AMD in energy efficiency and die-size efficiency in 2019).

Also, the extra time doesn't need to be all sitting on their laurels. Why release a 7nm die-shrunk Turing in 2019 when you can release a faster 7nm new architecture in 2020? The faster chip, combined with a longer time between product releases, will move more units. They won't do both, as they would really begin cannibalizing their own sales at that point and diminishing the generational gains to where it's like CPUs.
Never meant that, apologies if I insinuated it. But why do you say cannibalizing their own sales? In the old [?] days we had quicker releases, so if it happens once now, what's the big anomaly? Gamers were always ready to upgrade then and will be again if the jump is substantial. I suggest the jump to 7nm is huge and allows this. For all you know, it will let them quickly cement their raytracing lead to their benefit.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I'm not a simpleton expecting, say, Navi to launch in July 2019 and wreck Turing, with Nvidia then going into panic mode and suddenly designing, developing, and releasing a 7nm card a few months later. That's impossible and ludicrous. What do you take me for?

The two-year cadence is not an accident. And AMD's upcoming performance is not unpredictable. Put the two together, and they will only violate their heavily researched and tested two-year cadence if they know AMD has a miracle (which is what it would take for Turing to fall behind AMD in energy efficiency and die-size efficiency in 2019).

Also, the extra time doesn't need to be all sitting on their laurels. Why release a 7nm die-shrunk Turing in 2019 when you can release a faster 7nm new architecture in 2020? The faster chip, combined with a longer time between product releases, will move more units. They won't do both, as they would really begin cannibalizing their own sales at that point and diminishing the generational gains to where it's like CPUs.

I expect the 2000 series will have a two-year run, though they might have some 7nm die shrinks as part of the mix when it makes economic sense.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Even with NV's enormous profits/R&D etc., they'll presumably want some time gap between their inevitable 7nm compute chip(s) (fairly soon, one would imagine) and the die-shrunk gaming cards.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
449
126
Even with NV's enormous profits/R&D etc., they'll presumably want some time gap between their inevitable 7nm compute chip(s) (fairly soon, one would imagine) and the die-shrunk gaming cards.

Plus they aren't Intel, which is facing a competitive product in Zen. Nvidia is so far ahead in the GPU game they can wait until yields improve before moving to 7nm.
 

Karnak

Senior member
Jan 5, 2017
400
773
136
Creating an article out of TPU database placeholders for unreleased products, when that's just what TPU does for every upcoming GPU. Didn't expect that, not even from wtftech.
 

sze5003

Lifer
Aug 18, 2012
14,304
675
126
Interesting if true, but from previous experience they usually sell the same card for $1200 first (as a Titan), then come out with a nice-performing Ti edition. This would be the first time they launched it at the same time as the base models, too. My brother is praying they do, lol, but I doubt it; he knows he's getting my 1080 Ti if that's the case.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Creating an article out of TPU database placeholders for unreleased products, when that's just what TPU does for every upcoming GPU. Didn't expect that, not even from wtftech.

The most reliable leads we have are from NVidia itself. We have the specs of real Quadro cards, and the Easter-egg video for the Monday reveal, which points to an RTX 2080 consumer ray-tracing card.

Given those two things, it's very easy to extrapolate probable consumer variations on the Quadro cards, and that is what we are getting now: extrapolations from NVidia's info.

The latest WCCFtech rumor is just a regurgitation of earlier Videocardz info.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Would be amazing to get a 2080 Ti card on day 1; however, wasn't the top Turing card Nvidia showed something like 750 mm²?

I highly doubt a consumer card makes it out at this size. But it would be AMAZING if Nvidia released the 2080 at like $599 and a 2080 Ti at something like $799.
 

sze5003

Lifer
Aug 18, 2012
14,304
675
126
Would be amazing to get a 2080 Ti card on day 1; however, wasn't the top Turing card Nvidia showed something like 750 mm²?

I highly doubt a consumer card makes it out at this size. But it would be AMAZING if Nvidia released the 2080 at like $599 and a 2080 Ti at something like $799.
Looking at those specs, it does look like an $800 card. I mean, I'm fine with waiting; it's not like I'd be able to get one on day 1 either. I'd probably go for another Asus model.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Looking at those specs, it does look like an $800 card. I mean, I'm fine with waiting; it's not like I'd be able to get one on day 1 either. I'd probably go for another Asus model.

Looking at those specs, the RTX 2080 will be the $800 (or higher) card, and the RTX 2080 Ti will be more like $1500 or higher.
 
  • Like
Reactions: beginner99

sze5003

Lifer
Aug 18, 2012
14,304
675
126
Looking at those specs, the RTX 2080 will be the $800 (or higher) card, and the RTX 2080 Ti will be more like $1500 or higher.
I don't think so; that would make it the most expensive Ti card ever, which means the new Titan will be pushing near $2k and up, counting the prices being inflated by demand in the first few months after release. That's more than I would want to spend, so I'll wait if that's the case. I'm expecting $800-900.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I don't think so; that would make it the most expensive Ti card ever, which means the new Titan will be pushing near $2k and up, counting the prices being inflated by demand in the first few months after release. That's more than I would want to spend, so I'll wait if that's the case. I'm expecting $800-900.

It would also be the largest die ever in a Ti card (and price per area is increasing), and NVidia has no real competition, so it only makes sense that it would be the most expensive Ti card ever. Titan V is $3000, so a Titan T at $2000 or more is also expected.

Anyone expecting a revolutionary new card, with a massive 754 mm² die, for $800 is setting themselves up for disappointment.

If the spec leaks are correct, and the 2080 Ti is using the Quadro RTX 8000 die and the 2080 the RTX 5000 die (~500 mm²), then I expect massive price increases across the board.
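
To see why a 754 mm² die should land far above previous Ti pricing, here's a rough yield sketch. The wafer cost and defect density below are pure assumptions for illustration; nothing here is a disclosed NVidia or TSMC figure.

```python
import math

WAFER_COST = 6000.0  # assumed $ per 300 mm wafer (illustrative only)
D0 = 0.1             # assumed defect density in defects/cm^2 (illustrative only)

def gross_dies(area_mm2, wafer_d_mm=300):
    # Wafer area over die area, minus an edge-loss term for partial dies.
    r = wafer_d_mm / 2
    return math.pi * r**2 / area_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * area_mm2)

def cost_per_good_die(area_mm2):
    # Simple Poisson yield model: fraction of dies with zero defects.
    yield_frac = math.exp(-D0 * area_mm2 / 100)  # /100 converts mm^2 to cm^2
    return WAFER_COST / (gross_dies(area_mm2) * yield_frac)

for area in (754, 500):  # rumored 2080 Ti die vs the ~500 mm^2 die
    print(f"{area} mm^2: ~${cost_per_good_die(area):.0f} per good die")
# prints roughly $184 vs $89 under these assumptions
```

Under those made-up inputs, the 754 mm² die costs roughly twice as much per good die as the ~500 mm² one: fewer candidates per wafer AND a lower yield fraction. That's the economics behind expecting across-the-board price increases.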
 

sze5003

Lifer
Aug 18, 2012
14,304
675
126
Yeah, I could see that happening if it's announced soon, but I don't think it will be. That gives me more than enough time to have the money ready. I guess I can justify it; I've spent more on watches, lol.
 