NVIDIA preparing four Maxwell GM204 SKUs (VideocardZ via S/A)


AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The AMD 7000 series was horribly priced. It offered zero increase in performance per dollar over what the 6000 series launched at. In fact, with the 7970 you literally got LESS card for your money, as the 7970 was 50% more expensive than what the 6970 launched at yet only 40% faster. What's funny is that hardly anyone complained about that. So AMD fans had no issue getting nearly 10% LESS performance per dollar, but criticized Nvidia for actually giving a 35% performance increase over the 580 for the same amount of money. And hell, you got a much more efficient card too.

So yeah, if AMD had not gouged the hell out of pricing, then Nvidia would not have had the chance to release its mid-range card at previous flagship prices. And yes, I am aware that AMD priced it against the 580, but again, it was crazy to give less performance for your money than the previous gen. I don't recall that ever happening with a next-gen launch.
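For what it's worth, the perf-per-dollar math in that quote roughly checks out. A minimal sketch, assuming the commonly cited launch prices ($369 for the 6970, $549 for the 7970) and the quoted 40%-faster figure:

```python
# Perf-per-dollar change, 7970 vs. 6970 -- launch prices are assumptions
# ($369 and $549 are the commonly cited MSRPs); the perf ratio is from the quote.
price_6970, price_7970 = 369.0, 549.0
perf_ratio = 1.40  # "only 40% faster"

change = perf_ratio / (price_7970 / price_6970) - 1
print(f"perf/$ change: {change:+.1%}")
# -> about -6% on these numbers; closer to -7% if you round the price gap
#    up to a clean 1.5x (1.40 / 1.50 ~= 0.93), which is where the
#    "nearly 10% less" claim comes from.
```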


I still remember myself, RussianSensation, and others here complaining about the prices of the new 28nm graphics cards. It took more than a year for the prices to finally come down to previous-gen levels.

When NV and AMD go to 16nm, the 980 Ti could be $999 or more at launch :whistle:

03-05-2012
You people who believe AMD's Radeon prices are high due to lack of competition from NV are about to witness the worst GPU release (performance/price) in years.
 

sao123

Lifer
May 27, 2002
12,656
207
106
Sorry, I didn't mean to come across as rude. My point is: I don't get this labeling of a GPU as high-end or mid-range. You are getting a new GPU which is faster than the top model of the previous gen. By definition this is a high-end card. It doesn't matter that they could release an even bigger die; it's a question of price. These huge dies cost a lot to manufacture, so it makes sense for Nvidia to release them later, once they've produced enough for the high-end Tesla cards.

So you would be OK with the following...

Intel releases the Core i3 model at the i7 premium price, then later releases the same-architecture Core i5 at the i7 premium price, and eventually the Core i7 as the flagship, at the i7 price...
instead of launching them all side by side, as is done now, with the i7 as the flagship processor?

This is the very equivalent of what Nvidia is doing with GM204/207/210.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
So you would be OK with the following...

Intel releases the Core i3 model at the i7 premium price, then later releases the same-architecture Core i5 at the i7 premium price, and eventually the Core i7 as the flagship, at the i7 price...
instead of launching them all side by side, as is done now, with the i7 as the flagship processor?

Would we be OK with a 35% performance increase per year?
With an i3 giving 35% over last year's i7-x960
(even though 6 months from now an even faster chip will be released)?

uhmm... :hmm:

WHERE DO I SIGN?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Prediction time from me. I feel good about the following:

Full GM204 / GTX 880 Ti:
2560 CUDA cores @ 1080MHz (max turbo clock)
20 SMM
4GB VRAM @ 1500MHz - 223GB/s bandwidth (see the sketch below)
TDP: around 225W

GTX 880 Ti - full GM204
Performance: 30% faster than GTX 780 Ti
(Maxwell: +35% performance per core vs. Kepler)

GTX 880:
2304 CUDA cores
18 SMM
256-bit
About 15% faster than GTX 780 Ti.

GTX 870:
2048 CUDA cores
16 SMM
192-bit

GTX 880MX mobile graphics card:
Exact same specifications as the GTX 880, but perhaps slightly lower clocks.
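A quick sanity check on that bandwidth line: GDDR5 bandwidth is just bus width times effective data rate. A minimal sketch, where the 7 Gbps effective rate is my assumption; it is what a 256-bit bus needs to reach ~224 GB/s, whereas a literal 1500MHz (6 Gbps effective) clock would give 192 GB/s:

```python
# GDDR5 bandwidth = bus width in bytes * effective data rate (GT/s).
def bandwidth_gbs(bus_bits: int, gbps_effective: float) -> float:
    return bus_bits / 8 * gbps_effective  # bytes per transfer * GT/s = GB/s

print(bandwidth_gbs(256, 7.0))  # 224.0 -> matches the ~223 GB/s prediction
print(bandwidth_gbs(256, 6.0))  # 192.0 -> what 1500MHz (6 Gbps effective) gives
print(bandwidth_gbs(192, 7.0))  # 168.0 -> the predicted GTX 870's 192-bit bus
```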

So you think GM204 will only be 40-45% more efficient than GK110, even though GM107 is 70% more efficient than all of Kepler?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Nvidia beat AMD (in performance) when it released the GTX 680 and undercut AMD in the process. Had AMD released the 7970 at $399, we most surely would not have seen the GTX 680 at $499. So is it AMD's fault that Nvidia's "midrange" went up in price? No, of course not. But it is one factor to weigh in this discussion, and I think it just shows that prices were heavily inflated on 28nm vs. 40nm. Both companies are guilty of milking situations; Nvidia is usually in a better position to do so.

I don't disagree that both companies are to blame. However, AMD tried to offer excellent price/performance during the 4870/5870/6970 eras and consumers as a whole didn't care for it. Hence AMD raised prices to compensate for lack of volume. I personally bought a 4890, GTX 470s, and a 6950. I supported both firms during those eras. What I don't do is blindly buy NV cards gen after gen and then complain that AMD raised prices when they had no other choice.

What if AMD releases a new 385mm^2 die using GCN 2.0 on 28nm in January and beats the 290X by 15-20%? Are you also going to label it "mid-range" since it's smaller than Hawaii?

100% I would. How can a real next-generation flagship chip only outperform the previous gen by 15-20%? That's a failure in my book.

You also have to consider this might be Nvidia's best chip for the next 12 months. Is it still mid-range then?

Yes, if the codename is GM204 and NV follows up with the real large monolithic GM200/210 die in 12 months. It would have meant milking the midrange.

Would you say the 7900 GT was a flagship chip for NV that generation if they purposely delayed the 7900 GTX?

Regarding increased R&D costs for the new generation: even if true, NV is sitting on record gross margins, far higher than during the GTX 200 or Fermi generations. As I already said, if gamers are OK with this, then that's the direction the industry moves. We will just have to wait longer than in the past for a 2x performance/$ increase, and also longer before the flagship chip launches. I suppose this is the new price to pay for lower nodes and more expensive fabs/wafers.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Yes, if the codename is GM204 and NV follows up with the real large monolithic GM200/210 die in 12 months. It would have meant milking the midrange.

No, I mean literally no new GPU exists for 12+ months. If GM210 were ready to go, we all know Nvidia wouldn't hold back from releasing it to the professional market. But I am saying that if Charlie is right, and Nvidia is planning to entirely skip 20nm, then GM204 may be the fastest graphics card in existence from Nvidia for 12+ months. GM210 might not even tape out on 16nm until this time next year (or later).

If that is the case, do you still think GM204 is mid-range and should be priced as such?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
If the best GM204 card is sub-20% faster than the 780 Ti, then as a PC enthusiast/hobbyist, that is a failure to me. If the improvement is so small it doesn't motivate a purchase, that is a fail. It will become like Intel and CPUs: no point upgrading more than every couple of generations because the performance improvement is so small. In Intel's case that is their intended choice, not to focus on major performance gains. Nvidia and AMD, on the other hand, are at the mercy of TSMC and have nowhere near the resources or market dominance Intel has. There is nothing exciting about perf/W in the GeForce/Radeon market unless it means big performance increases as well. The choice between saving $2 a month in power or a 400W monster that is 100% faster is obvious. GPUs have a huge impact on gaming, and we need big jumps consistently.
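That "$2 a month" figure is about right for typical usage. A minimal back-of-envelope sketch, where the 100W delta, the 4 hours/day, and the $0.12/kWh rate are all assumptions:

```python
# Monthly cost of a GPU power delta -- every input here is an assumption.
watts_delta = 100        # extra draw of the faster card, in watts
hours_per_day = 4        # daily gaming time
usd_per_kwh = 0.12       # typical US residential electricity rate

kwh_per_month = watts_delta / 1000 * hours_per_day * 30
print(f"${kwh_per_month * usd_per_kwh:.2f}/month")  # -> $1.44
```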

20-30% is decent and 40%+ is good, making allowances for the lack of a node shrink. Really, 70%+ is what you expect in performance increase from one flagship to the next. See the 580 to the 780 and the 6970 to the 290X: as much as 100% faster. Same story with the 285 to the 480, the 4890 to the 5870, etc. If they're having trouble delivering solid improvements from one generation to the next, consistent with the past, then this is bad news for discrete GPUs.

The tactic now is to split the mid-range and high-end of each new generation apart and sell each as a generation unto itself, staggered out. The best strategy is to buy them that way: skip the mid-range 'high-end' and wait for the real high-end, or wait for the mid-range 'high-end' to drop to a mid-range price when the real top part arrives, if that is your card.

I think the high prices are here to stay. Slower release cycles with less exciting product line-ups, massive cost increases on the new nodes, and fewer discrete GPUs sold. $1000 could well become standard pricing on halo 20nm/16nm cards, whatever they'll be. Really, we are getting 4-5 years of nothing but 28nm GPUs if this is accurate?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
If the best GM204 card is sub-20% faster than the 780 Ti, then as a PC enthusiast/hobbyist, that is a failure to me. If the improvement is so small it doesn't motivate a purchase, that is a fail. It will become like Intel and CPUs: no point upgrading more than every couple of generations because the performance improvement is so small. In Intel's case that is their intended choice, not to focus on major performance gains. Nvidia and AMD, on the other hand, are at the mercy of TSMC and have nowhere near the resources or market dominance Intel has. There is nothing exciting about perf/W in the GeForce/Radeon market unless it means big performance increases as well. The choice between saving $2 a month in power or a 400W monster that is 100% faster is obvious. GPUs have a huge impact on gaming, and we need big jumps consistently.

20-30% is decent and 40%+ is good, making allowances for the lack of a node shrink. Really, 70%+ is what you expect in performance increase from one flagship to the next. See the 580 to the 780 and the 6970 to the 290X: as much as 100% faster. Same story with the 285 to the 480, the 4890 to the 5870, etc. If they're having trouble delivering solid improvements from one generation to the next, consistent with the past, then this is bad news for discrete GPUs.

The tactic now is to split the mid-range and high-end of each new generation apart and sell each as a generation unto itself, staggered out. The best strategy is to buy them that way: skip the mid-range 'high-end' and wait for the real high-end, or wait for the mid-range 'high-end' to drop to a mid-range price when the real top part arrives, if that is your card.

I think the high prices are here to stay. Slower release cycles with less exciting product line-ups, massive cost increases on the new nodes, and fewer discrete GPUs sold. $1000 could well become standard pricing on halo 20nm/16nm cards, whatever they'll be. Really, we are getting 4-5 years of nothing but 28nm GPUs if this is accurate?

People like to blame TSMC for the lack of performance increases between new GPU generations. But you are missing the bigger picture, and perhaps not wanting to accept how times have changed.

It all comes down to the financial situation and today's economy. The cost of developing/designing a new architecture is simply becoming much higher, with lower returns per generation. On top of that, GPUs are getting more complicated each year, semiconductor process technology is getting much harder to polish/mature, and the market itself is shrinking due to newer markets/changing trends. The returns that many CPU/GPU manufacturers reaped in the early '00s are just not there anymore.

So from a business point of view, what they are doing is the right thing, or else AMD/Intel/nVIDIA won't survive. We as consumers will have to deal with it, because without sales/profits there won't be products to buy to begin with.

Longer GPU architecture cycles, the tick-tock approach, rising prices (incl. inflation), and perhaps new APIs like Mantle so that you don't need to rely on new architectures to bring performance improvements (just updating the GCN architecture little by little). It's going to get worse and worse from here on out.

I guess this process was accelerated by the rise of consoles and mobile technology, along with less and less competition, meaning that launch dates are more relaxed than 5-10 years ago, when even a few months of delay would mean handing market share to your nearest competitor. Those days were good, but we have to move on. This doesn't just apply to dGPUs, either.

I also happen to think that when games were going 3D and it was a real boom back then, the demand for 3D accelerators was huge. But nowadays not many games push the boundaries, due to consoles and the lack of return on PC games. Thinking about it, it's been a while since we had a really good game that also pushed graphical limits... something that could help revitalize the dGPU business.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It simply seems that dGPU makers can't afford the latest node anymore. It's only gonna go downhill from here.

I don't think economics are what's dictating that they stay on 28nm. It's the lack of a viable alternative. The way nVidia and AMD compete, if one decided to skip a node, the other would jump on it so fast the first wouldn't have time to realize how badly they'd screwed up.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
So you would be OK with the following...

Intel releases the Core i3 model at the i7 premium price, then later releases the same-architecture Core i5 at the i7 premium price, and eventually the Core i7 as the flagship, at the i7 price...
instead of launching them all side by side, as is done now, with the i7 as the flagship processor?

This is the very equivalent of what Nvidia is doing with GM204/207/210.

No, Nvidia is only doing what Intel has been doing for years: they design one architecture but several chips around it. You can get a 16-core Ivy Bridge CPU, for example, but that will cost you an arm and a leg. Those chips are meant for the business sector, and such was the case with the GK110 chips. They were meant for Tesla cards first and foremost.

IMHO they only released the big GPUs because a) they couldn't offer anything else, since 20nm has been postponed for so long by TSMC, and b) they perhaps produced more than they could sell on the high end.

You are essentially complaining that Intel doesn't sell you their 16-core CPU for 300€. It's the exact same situation, but I see no one complaining.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No, Nvidia is only doing what Intel has been doing for years: they design one architecture but several chips around it. You can get a 16-core Ivy Bridge CPU, for example, but that will cost you an arm and a leg. Those chips are meant for the business sector, and such was the case with the GK110 chips. They were meant for Tesla cards first and foremost.

IMHO they only released the big GPUs because a) they couldn't offer anything else, since 20nm has been postponed for so long by TSMC, and b) they perhaps produced more than they could sell on the high end.

You are essentially complaining that Intel doesn't sell you their 16-core CPU for 300€. It's the exact same situation, but I see no one complaining.

I'm not sure what you are getting at. nVidia and AMD both always sell their top GPU to the public. The biggest reason you don't see top-end CPUs sold to the public is that the public has no use for them.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
I'm not sure what you are getting at. nVidia and AMD both always sell their top GPU to the public. The biggest reason you don't see top-end CPUs sold to the public is that the public has no use for them.

Yes, but with Kepler Nvidia decided to change their practice and have two chip designs, one aimed at gaming and one aimed at GPGPU. They have/had much higher profit margins on the latter chip. It only makes sense for them to sell it at a higher price as a GPU to normal customers, because those customers directly eat into the profit margin they could get selling those chips to businesses as Tesla cards.

My point is: we really can't see where they are going. I don't like that prices went up either, but the fact is that those GK110 chips are expensive to make, as they are much bigger, and price rises exponentially with die size. I have no problem with them releasing a smaller-die GM204 chip if I can get it at the "normal" high-end 350-400€ mark. I can then wait out the 20% gain they get from the bigger GM200/210 die in the next generation and get the GP304 chip. That's the point I'm making. People first complained that they wanted the big GPU that was made for Tesla, and now they complain that it is more expensive. That's like people complaining that they wanted the big Xeon parts on the desktop too, but don't want to pay 2011-platform prices.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
That's inaccurate. With Fermi there were GF104 and GF110 as well; nothing has changed, apart from the two iterations of the same architecture being bifurcated from mid-range and high-end into being marketed as two different generations.

It has nothing to do with one being the part used in their professional cards. They've always sold the chip used in their professional cards as a GeForce card as well. You're labeling what has always been the case as something novel and using that to explain the pricing. What is actually novel is, as I stated, splitting up the two iterations of a single architecture and presenting them both as high-end by staggering their release.

I think this is just the new reality. AMD and Nvidia can no longer afford to stay current on the latest process tech because of increasing costs, and they can't cost-effectively deliver meaningful performance increases as quickly as they could before. So now we get small increments staggered out, and drastically increasing prices in a short time span. 28nm cards have been significantly more expensive than 40nm cards were, whereas 40nm cards were very similar in price to the previous 55nm cards and delivered huge performance increases immediately.

This is going to have an effect on sales. When new releases lack the excitement of the traditional huge performance leaps we were used to seeing, I think fewer cards will be sold to the traditionally persistent upgraders. It's starting to just not be worth it, much like CPUs, and we'll have to skip a generation to get a meaningful upgrade.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
That's inaccurate. With Fermi there were GF104 and GF110 as well; nothing has changed, apart from the two iterations of the same architecture being bifurcated from mid-range and high-end into being marketed as two different generations.

It has nothing to do with one being the part used in their professional cards. They've always sold the chip used in their professional cards as a GeForce card as well. You're labeling what has always been the case as something novel and using that to explain the pricing. What is actually novel is, as I stated, splitting up the two iterations of a single architecture and presenting them both as high-end by staggering their release.

You are correct, but GF104 had even fewer transistors than GK104, and GK110 more than doubles the transistor count. More transistors mean more chances for a die to be defective, which drives up the price. Add a possible price increase for the 28nm process itself, and my point still stands: Nvidia projected that GK110 was simply too expensive to make to sell at the same high-end prices we were accustomed to in previous generations.
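That die-size/price relationship can be made concrete with a toy Poisson yield model. A minimal sketch, where the defect density, wafer cost, and die sizes (roughly GK104- and GK110-sized) are illustrative assumptions, not actual TSMC figures:

```python
import math

# Toy Poisson yield model: yield = exp(-defect_density * die_area).
# All numbers are illustrative assumptions, not real TSMC 28nm data.
DEFECT_DENSITY = 0.25    # defects per cm^2 (assumed)
WAFER_COST = 5000.0      # USD per 300mm wafer (assumed)
WAFER_AREA = 706.9       # cm^2 for a 300mm wafer (pi * 15^2)

def cost_per_good_die(die_area_cm2: float) -> float:
    """Spread the wafer cost over the dies that actually work."""
    dies_per_wafer = WAFER_AREA / die_area_cm2           # ignores edge loss
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_cm2)
    return WAFER_COST / (dies_per_wafer * yield_rate)

print(cost_per_good_die(2.94))  # ~294mm^2, GK104-sized: about $43
print(cost_per_good_die(5.61))  # ~561mm^2, GK110-sized: about $160
```

On these assumed numbers, roughly doubling the die area almost quadruples the cost per good die: the exponential yield term is exactly why price grows much faster than area.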

I think this is just the new reality. AMD and Nvidia can no longer afford to stay current on the latest process tech because of increasing costs, and they can't cost-effectively deliver meaningful performance increases as quickly as they could before. So now we get small increments staggered out, and drastically increasing prices in a short time span. 28nm cards have been significantly more expensive than 40nm cards were, whereas 40nm cards were very similar in price to the previous 55nm cards and delivered huge performance increases immediately.

This is going to have an effect on sales. When new releases lack the excitement of the traditional huge performance leaps we were used to seeing, I think fewer cards will be sold to the traditionally persistent upgraders. It's starting to just not be worth it, much like CPUs, and we'll have to skip a generation to get a meaningful upgrade.

That I agree on and it was part of my point to begin with ;-)
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I think they can still stay on the latest nodes. A GPU is $100+, while the A7 SoC apparently costs only $20.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Sure it did! AMD regained discrete leadership with discrete market share over nVidia!

AMD never had market-share leadership in the discrete market. When I look back at the numbers I can find, there is only a few percent of change back and forth. The most positive figure I can find for AMD is when they had a 40.9% share.

If you've got other numbers, please provide them.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
The 5870 would've given them discrete market-share leadership if they had produced enough; the thing was sold out for ages.