"Too Little, Too Late....Too Expensive" Did you agree?


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Depending on power and performance metrics, this card might be more comparable to a GTX 680 than GTX 480:
Late (3.5 months vs 4.5 months)
Cheaper (R9 290X is most likely going to be cheaper than GTX 780/Titan)
Faster (possibly, I guess; marginally faster, à la the 680 versus the 7970)
Power (if it uses less power than 780/Titan, well that's another notch more towards a GTX 680 comparative)

EDIT: And everyone seems to have loved the GTX 680.

This is a great comparison and generally all very true. My problem with the entire situation (this card and all cards) is that 1) prices have been driven up considerably on 28nm for top-tier cards, and 2) the performance level of this card has been available (dual-GPU cards aside) since Titan, and even more recently with the 780s, so all in all it's just not very exciting - ESPECIALLY if it's $649-699.

Everyone blamed AMD for the initial higher prices (as did I), but Nvidia followed right in line and kept pushing them up. $500 flagship cards at launch now seem to be permanently a thing of the past. I won't be excited until Maxwell and/or 20nm cards start popping up, and even then, if perf/$ doesn't improve noticeably over the current crop, everything to me is too little, too late.
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
If it's 20nm only, then it's more than a year away - unless they do a pure paper launch or a very limited run of a $1,500+ top Maxwell Titan 2-like card. Then maybe 12 months, but I would still doubt it.

Does anyone have any links to why they think 20nm is Q4 2014?

Everything I can find shows TSMC 20nm will be ramped up to mass production in Q1, with some of their customers already having samples.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The problem with Fermi wasn't that it was late; it was that it was late and it sucked. It wasn't nearly the worst GPU of all time (the FX 5800 probably takes that cake), but at launch it was maybe 5-15% faster than the 5870 with insane power consumption. An overclocked 5870 could keep pace with it and consume less power, so it was an all-around failure.

Preemptively making this thread makes it seem like the butthurt still runs deep in fanboys, and the agenda is quite clear. I'll take the opportunity to remind everyone, as I always do in these discussions, that it will depend on the specifics. If the 290 is released and it performs similarly to a 780 in the games I play, I probably won't get one, because I already didn't think the 780 was worth it. If it performs like a Titan AND overclocks like a dream, then I'll be tempted. I know the GK110-based GPUs have been having luck recently with BIOS voltage mods (better late than never), but I honestly don't have much faith in the robustness of nvidia's reference designs, as I have had a handful of nvidia GPUs die on me. However, I've been overvolting AMD cards since the 4870 and have yet to have one fail, and that always weighs on my decision.
Depending on power and performance metrics, this card might be more comparable to a GTX 680 than GTX 480:
Late (3.5 months vs 4.5 months)
Cheaper (R9 290X is most likely going to be cheaper than GTX 780/Titan)
Faster (possibly, I guess; marginally faster, à la the 680 versus the 7970)
Power (if it uses less power than 780/Titan, well that's another notch more towards a GTX 680 comparative)

EDIT: And everyone seems to have loved the GTX 680.
Excellent post. That's another way to look at it, and it may be a better comparison. Time will tell, of course, but I wonder why this wasn't the comparison made in the OP? :confused:
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Does anyone have any links to why they think 20nm is Q4 2014?

Everything I can find shows TSMC 20nm will be ramped up to mass production in Q1, with some of their customers already having samples.

TSMC's 20nm is in risk production right now. TSMC's 28nm was in risk production in 2009.

I think you can extrapolate from there. 20nm won't be here until 2H 2014 and will have prohibitively high wafer costs. On that note, anyone thinking 20nm will be cheaper is going to be in for a hell of a rude awakening. 20nm will be far more expensive than 28nm products.
 
Last edited:

LegSWAT

Member
Jul 8, 2013
75
0
0
This slogan, as it stands, invites monopolistic (and in my view plain stupid) consumer behavior. No matter what a product is called or which company produces it, as consumers we should, in all market situations, embrace the variety of products, their respective features, flavors, and ultimately price points. Variety of offers is what constitutes a market. Do one of those car comparisons and you'll see what I mean. I've never heard anyone complain that VW put the Touareg on the market when you could buy the Porsche Cayenne (both cars share development and internal mechanics, btw; just imagine THAT with GPUs :D )
 

LegSWAT

Member
Jul 8, 2013
75
0
0
No new node=no advance really for GPUs. And as transistor cost goes up, so does price. Not to mention IGP eroding the value segment completely and starting to touch the mainstream segment of dGPUs.
Transistor cost is the least thing to worry about when talking prices. The issue with 20nm is far-higher-than-before fab upgrade and node development costs, wafer costs, and lithography costs. And then transistors are projected to be merely around 35% faster, unless upgraded to FinFET, which is even more expensive in development and production.
 
May 13, 2009
12,333
612
126
The saying "a butt for every seat" applies here. I, like others, love to complain about the state of GPU prices and development. Truth be told, there is a card out there for you regardless of your price range. Heck, I found a used 7850 2GB for $100. It will work just fine for me. Why should I give a flip if some overpriced new tech is out there? If someone is willing to drop $600 or $700 on the latest tech, then I say have at it. Even at $300, if a 7970/280X isn't enough GPU for the money, then there is no pleasing you.
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
TSMC's 20nm is in risk production right now. TSMC's 28nm was in risk production in 2009.

I think you can extrapolate from there. 20nm won't be here until 2H 2014 and will have prohibitively high wafer costs. On that note, anyone thinking 20nm will be cheaper is going to be in for a hell of a rude awakening. 20nm will be far more expensive than 28nm products.

That's all fine and dandy, but you can't base 20nm off of 28nm. 28nm had some notorious issues. This is from August, so if there is something more recent, I would be interested:



“On 20nm [process technology], we see little competition. The risk production has started in the first quarter [2013] and the volume production will start in early 2014 next year […] the equipment [is] already being installed, the equipment [is] streaming in and [is] being installed. […] The volume production will start in early 2014,” said Morris Chang, the head of TSMC, during quarterly conference call with financial analysts.

http://www.xbitlabs.com/news/other/...ng_20nm_Process_Technology_to_Early_2014.html


BSing investors causes major problems...
 

seitur

Senior member
Jul 12, 2013
383
1
81
Does anyone have any links to why they think 20nm is Q4 2014?

Everything I can find shows TSMC 20nm will be ramped up to mass production in Q1, with some of their customers already having samples.
What blackened23 wrote, plus early 20nm non-risk production won't be used for gaming-market dGPUs.

A few months ago I thought we might see a 20nm dGPU line on shelves in 2014, but not since Intel switched its priority cutting-edge fab production plans to mobile chips at the cost of desktop chips.

That will ensure that the other mobile players (most importantly Apple, of course) pick up the glove, and they will not leave enough 20nm volume for dGPUs. Maybe for the professional or HPC market, but definitely not for gaming GPUs. Unless yields are extraordinarily good - then maybe top-of-the-line GPU chips for a completely bonkers amount of money.


btw. I wonder if we'll see a 28nm GPU with GDDR6?
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sorry, that doesn't say anything at all.

"Intel will ship 14nm SoCs and processors in 2H 2014 and this is the time when we expect to see 20nm high performance chips from TSMC"




That means absolutely nothing...

GPUs fall under the "high-performance chips" category when it comes to fabs. 20nm volume production is scheduled to start in February of 2014, but that is for less complex chips; high-performance chips (GPUs, SoCs, etc.) are still second half of 2014. And Apple has apparently already bought up a large amount of the volume for what will most likely be its A8.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
TSMC's 20nm is in risk production right now. TSMC's 28nm was in risk production in 2009.

I think you can extrapolate from there. 20nm won't be here until 2H 2014 and will have prohibitively high wafer costs. On that note, anyone thinking 20nm will be cheaper is going to be in for a hell of a rude awakening. 20nm will be far more expensive than 28nm products.

Yep, 20nm is going to be VERY expensive for high-end vid cards. That's why I'm hoping first gen Maxwell is on a mature 28nm process.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Well, if that's a genuine statement - performance gains are absolutely still possible on 28nm, and Maxwell does have some interesting architectural enhancements that make it quite different from anything that has preceded it. Case in point: Apple's A7 mobile SoC has basically blown away everything else in the ARM SoC race despite being on the same lithography process, so there's definitely room for improvement. The only thing that cannot improve is density, but that may not be a big deal considering the trade-offs. 28nm's trade-off is that it is mature and reasonable in price; 20nm will be extremely expensive. I know I saw this figure somewhere before, but the "break even" point for any company using 20nm from the get-go will be something like 500-750k units. That is quite a doozy, and very much more expensive than 28nm.

As well, you must keep in mind that TSMC's marketing always makes everything sound rosy when they describe their process enhancements. The facts are: TSMC has never been truthful about its transitions; its 20nm will be considerably worse than Intel's 22nm with FinFETs; TSMC is still in 20nm "risk production", i.e. it is NOT READY; and TSMC's risk production for 28nm took nearly 1.5 years while they worked out the kinks. Besides which, even their board of directors basically stated that May 2014 is when volume production starts (if you believe them - they had very early estimates for 28nm as well), so if volume production starts then, products ship in 2H 2014. And those products will be incredibly expensive.

It is what it is, and this will only get worse with time. Videocardz.com had an article regarding Maxwell a couple of weeks ago, and the word was that Maxwell is being introduced at 28nm and will be refreshed on 20nm when the process is mature. Personally I think that is fine and wouldn't read into it too much - gains are still *very much* possible on 28nm.

Personally, I think Apple and Qualcomm will be fighting over 20nm wafers from the outset just because they have oodles of money, way more than nvidia or AMD combined. From there, it will take some time for the other players to catch up. That's just the way I see it. Certainly nvidia is in a better position than AMD to pay for wafer costs, but it won't be easy for NV by any means IMHO. We'll see what happens in any case.

If I'm wrong - and I hope I am - I would welcome it. But I can't see 20nm being a welcome sight for anyone if it does somehow show up in NV GPUs, because the wafer costs are pretty ridiculous. Then again, NV does have the ability to charge more for newer technology. Again, we'll see what happens.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Personally, I think Apple and Qualcomm will be fighting over 20nm wafers from the outset just because they have oodles of money, way more than nvidia or AMD combined. From there, it will take some time for the other players to catch up. That's just the way I see it. Certainly nvidia is in a better position than AMD to pay for wafer costs, but it won't be easy for NV by any means IMHO. We'll see what happens in any case.

What if, just WHAT IF, Apple or Qualcomm do not have their chips ready for final silicon when 20nm is ready? Hahaha that would be funny.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Regarding whether 2H 2014 for 20nm GPUs is pessimistic or realistic:

http://focustaiwan.tw/news/aeco/201309020030.aspx

"Commercial Times:

JP Morgan Chase said Qualcomm will place three to four orders with TSMC for semiconductors using 20 nm-class process technology next year.

In the second half of 2014, Apple Inc.'s monthly order with TSMC for chips using 20-nm-class technology will reach 45,000 to 50,000 units, it forecast.

These orders will help push up TSMC's 2014 revenues from products using 20 nm technology by 20 percent, the financial institution said."

So volume production for Apple is expected in 2H 2014; I can't imagine many other companies beating them, outside of very expensive, lower-volume chips.
 

sandorski

No Lifer
Oct 10, 1999
70,101
5,640
126
Meh, AMD and Nvidia have yet to match the Virge DX and that is 18 years of continued domination.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Depending on power and performance metrics, this card might be more comparable to a GTX 680 than GTX 480:
Late (3.5 months vs 4.5 months)
Cheaper (R9 290X is most likely going to be cheaper than GTX 780/Titan)
Faster (possibly, I guess; marginally faster, à la the 680 versus the 7970)
Power (if it uses less power than 780/Titan, well that's another notch more towards a GTX 680 comparative)

EDIT: And everyone seems to have loved the GTX 680.

OP, what is your opinion on the 680 - was it too little, too late... too expensive? It seems a valid comparison, as the 290X situation is shaping up very similarly. Of course we don't know the final price yet (though it will likely be less than the NV counterpart's) - but your thread implies you do.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
The problem with Fermi wasn't that it was late; it was that it was late and it sucked. It wasn't nearly the worst GPU of all time (the FX 5800 probably takes that cake), but at launch it was maybe 5-15% faster than the 5870 with insane power consumption. An overclocked 5870 could keep pace with it and consume less power, so it was an all-around failure.

Preemptively making this thread makes it seem like the butthurt still runs deep in fanboys, and the agenda is quite clear. I'll take the opportunity to remind everyone, as I always do in these discussions, that it will depend on the specifics. If the 290 is released and it performs similarly to a 780 in the games I play, I probably won't get one, because I already didn't think the 780 was worth it. If it performs like a Titan AND overclocks like a dream, then I'll be tempted. I know the GK110-based GPUs have been having luck recently with BIOS voltage mods (better late than never), but I honestly don't have much faith in the robustness of nvidia's reference designs, as I have had a handful of nvidia GPUs die on me. However, I've been overvolting AMD cards since the 4870 and have yet to have one fail, and that always weighs on my decision.

Excellent post. That's another way to look at it, and it may be a better comparison. Time will tell, of course, but I wonder why this wasn't the comparison made in the OP? :confused:

Idk, I think the FX 5800 was disappointing to be sure, but it's not as bad as people remember:

http://techreport.com/review/4966/nvidia-geforce-fx-5800-ultra-gpu

Just overpriced, late, and overhyped - but it did actually deliver competitive performance with the rival cards at release. Nvidia even had time to release the proper FX card, the 5900 (256-bit bus), before ATI finally got around to finishing the X800.

http://ixbtlabs.com/articles2/gffx/5900u.html#p25

The thing about the FX 5800 and 5900 wasn't that they were terrible cards, really; it's just that they were SOOOOO OVERPRICED. The loud stock fan on 5800 reference cards was stupid as well. But go back and look at the benches - they were fast enough to be, on average, the fastest thing available at the time.

The 9800 Pro was a much better buy though, that's definitely the truth: 85-110% of the performance, depending on the game, at sometimes just over half the cost.

Worst GPU release I can think of in the past decade:

http://www.anandtech.com/show/2231

LATE, SLOW, EXPENSIVE, CRAP DRIVERS - oh man. They were really, really terrible. At least the FX 5800 came out with competitive performance. The 2900 was just a total abomination, annihilated by the then-ancient 8800s.

I didn't buy the FX or the 2900, of course. The worst card I've ever owned, bar none, is this though:

http://www.anandtech.com/show/438

What a stunningly horrible card that was. Bought it blind with store credit, and was amazed at how terrible it was in every respect.
 

seitur

Senior member
Jul 12, 2013
383
1
81
I don't think GDDR6 is anywhere near done.
Probably. Although GDDR development was always pretty quiet and not reported to the public as much as DDR, as far as I can remember from years ago when GDDR2 was king, heh
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
The NV30 was probably the worst GPU launch in the last decade or so. It was overpriced, consumed way too much power, the competition had faster cards, and its DX9 implementation sucked.

The worst thing was that the cards down the range sucked even more, which meant ATI could hold prices on their midrange cards. The whole first-generation FX series had major performance and image quality issues with games like HL2, for example.
 
Last edited: