Nvidia Kepler Yields Lower Than Expected –CEO. Fermi 2.0?


bryanW1995

Lifer
May 22, 2007
Cheapest 3gb GTX 580 I see there is 549.99. Same price as the 7970. I guess that would be helpful for anyone doing 3d surround resolutions eh.

The VAST majority of gtx 580 sales are of 1.5gb models. The gtx 580 3gb was kind of an answer to a question that nobody asked, wasn't it? As is 3gb on a 7950; they just can't release a 7950 with less ram than a 6950 or sales would suffer. Maybe for tri-sli or tri-fire at 7680x1600 or something crazy like that, I guess...

Isn't that almost always the case? With the exception of duds like the P4, which was slower than its predecessor, every new GPU or CPU architecture was "the best ever made" when released.

Normally companies cancel a failed project rather than releasing a product inferior to what they already have on the market (other than Bulldozer)...
Cases like the P4, where marketing execs get too full of themselves and think they can spin a worse new product, are few and far between.

FTFY
 

bryanW1995

Lifer
May 22, 2007
You do know the top GPU featuring GK104, the one that will probably be launched in April, will also probably be called the GTX 680, right?

There's no reason to call it GTX 660 when you're not gonna release anything higher than it for 6+ months.

^ In bold.
It means you Fail. ;)


We can play this game all night long, but at the end of the day only one of us gets paid, or receives "gifts", in some way by a graphics card manufacturer.

Seriously guys, the PM button is there for a reason...
 

bryanW1995

Lifer
May 22, 2007
How many people here believe that Kepler was supposed to launch with Tahiti on January 9th, 2012? Raise your hands. Because all this talk of Kepler being late is kind of BS. Kepler is on its OWN schedule and has NOTHING to do with AMD's schedule. I hope this is realized sooner rather than later. There are no official launch dates, and there never were any in the past to give anyone the notion that Kepler is late. So stop acting like it is. Kepler gets here when it is good and ready. That is all. hehe.

Didn't we go round and round on this when the 5870 came out? Kepler is "late" because the competition beat them to market. It doesn't matter if NV wasn't trying to win the race; it's still a race, and AMD is currently in first. Last round, NV fixed their issues and came out with a great hammer blow that garnered them some revenge (gtx 460 in the summer, then gtx 580/70 in time for the holiday shopping season), but that first round at 40nm was clearly in AMD's favor. Without some sort of positive news/buzz soon, I think that more and more we will be comparing this launch to the gtx 480 launch instead of the gtx 580 launch.
 

Lonyo

Lifer
Aug 10, 2002
Are you kidding????

nvidia is making tons of money off these complex designs. Something AMD has not been able to do. Can you not wrap your heads around this? It's not even remotely complicated. These designs are making Nvidia cash AMD can only dream of.

AMD's GPU division is not even close to matching nvidia's success. Any claim that AMD is doctoring the books to hide their GPU profits is completely imaginary.

These are cold hard facts. You would have to be very irrational not to see this. To not understand this really makes me wonder????

You could say you prefer AMD over nvidia, but to say that it's the wrong approach or a bad design? That's complete insanity. It's like a madhouse!!!!

NV has a good general design, which doesn't mean it's the best gaming design from an efficiency standpoint.
It's different ways of making things (well, it has been in the past).
NV has a product they can market to higher-margin areas, while AMD has focused on consumer products up until now.
They haven't been competing in the same arena, so it's easy to say that AMD has the best design.
It's also easy to say NV has the best design. And both statements are true.

What works for NV is not aiming for the consumer market, but still being able to sell products in that market. This was done by focusing on GPGPU and workstation cards. That's where the vast majority of their profits come from.
What worked for AMD was making consumer GPUs; what will work for AMD in the future is making GPGPU-oriented GPUs.

And you can say AMD "doctored" the books to hide GPU profits, because we don't know the breakdown of expenses allocated against GPU revenue. GPU profits could be worse than they appear on the surface, if AMD allocated R&D to the CPU side with respect to Trinity/etc, or they could be better, if AMD allocated all APU graphics expenses to the GPU segment. It's not doctoring the books; it's just us not having a true picture, because they can legitimately allocate things in any way they want.
Equally, NV rolled their chipset business into their GPU segment in their accounts, muddying the numbers for their consumer GPU products.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
NVIDIA 8800GTX 11/08/2006 vs ATI 2900XT 5/14/2007 (NVIDIA launched first)
NVIDIA 8800GT 10/29/2007 vs ATI HD3870 11/15/2007 (NVIDIA launched first)
NVIDIA GTX280 6/16/2008 vs ATI HD4870 6/25/2008 (NVIDIA launched first)
NVIDIA GTX285 1/15/2009 vs ATI HD4890 4/2/2009 (NVIDIA launched first)
NVIDIA GTX480 3/26/2010 vs ATI HD5870 9/23/2009 (AMD launched first)
NVIDIA GTX580 9/11/2010 vs AMD HD6970 12/15/2010 (NVIDIA launched first)
NVIDIA GTX680 00/00/2012 vs AMD HD7970 01/09/2012 (AMD launched first)

NVIDIA launched 5 times ahead of ATI/AMD in the last 7 GPU releases.

Wow, nice research there. This is, how did they put it, inexcusable? :cool:

Except, given your obvious... "preference..." in favor of NVIDIA, you and your friend forgot some simple facts.

1. The gap between the launch of the HD 3870 and 8800GT was 17 days.

2. The gap between the launch of the HD 4870 and GTX 280 was 9 days.

3. The GTX 280 and HD 4870 are part of the same series as the GTX 285 and HD 4890, so the latter comparison is invalid.

4. The GTX 580 launched in November, not September. That puts the difference between the HD 6970 and GTX 580 at a whopping month.

Given this, the only comparisons that are actually valid when it comes to AMD/ATI being late are the 8800GTX vs. HD 2900XT and the GTX 580 vs. HD 6970. That's two. On one occasion ATI was late by six months, and on the other AMD was late by one month. Contrast that with NVIDIA, whose most recent release will be 4-5 months late (Kepler; GK104), and whose release of the 400 series was 6 months late.

If you actually set your preference aside, you can see AMD has been better than NVIDIA overall for the past two years or so when it comes to product launches.
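For what it's worth, the gaps argued over above are easy to check with a few lines of date arithmetic. This is just a quick sketch using the launch dates quoted in this thread (with the GTX 580 date corrected to 11/9/2010, as pointed out later):

```python
from datetime import date

# Launch-date pairs quoted in the thread: (matchup, NVIDIA date, ATI/AMD date)
launches = [
    ("8800GTX vs HD 2900XT", date(2006, 11, 8),  date(2007, 5, 14)),
    ("8800GT vs HD 3870",    date(2007, 10, 29), date(2007, 11, 15)),
    ("GTX 280 vs HD 4870",   date(2008, 6, 16),  date(2008, 6, 25)),
    ("GTX 580 vs HD 6970",   date(2010, 11, 9),  date(2010, 12, 15)),
]

for matchup, nv, amd in launches:
    gap = (amd - nv).days  # positive = ATI/AMD launched later
    print(f"{matchup}: {gap} days")
```

This gives roughly 187, 17, 9 and 36 days respectively, which matches the point being made: only the 2900XT (about six months) and the HD 6970 (about one month) gaps are substantial; the others are within a couple of weeks.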
 

bryanW1995

Lifer
May 22, 2007
It's inexcusable regardless. How is a smaller team with less $$ to work with (AMD) beating the mighty Nvidia to the punch? Not just this round but with the 5XXX series too. It's inexcusable IMO. I'm being as patient as I can, but now it's April and not March. I don't appreciate the dead silence either. If AMD hadn't priced Tahiti so high I might have one.

IMO if the nvidia engineers or developers or whoever can't get the job done then it's time to get someone that can. I know this crap wouldn't cut it at any other job so why should it be tolerated in this case?

That is a severely misguided outlook. "It's inexcusable"?? What is the "it"? A lot of people would have us believe that Kepler is late because Tahiti launched already. That isn't the case at all. Nvidia is marching to its own drum, as I have said before. It was also rumored (no idea if it's true or not) that Tahiti was brutally rushed to market by AMD. I don't know if people can attest, or are willing to attest, to that and to the quality of the launch drivers or lack thereof. I have no idea, so I'll leave it to others to confirm or debunk on their own accord. As far as Kepler's tardiness? That is a fabrication.

I disagree with both of you. Oilfieldtrash, this isn't car manufacturing, oil processing, etc etc. This is incredibly complex manufacturing being performed at unbelievably tiny scales. You can push/yell/scream/etc only so much, if you go too hard you'll lose the people who gave you the competitive advantage in the first place. Keysplayr, that is clearly a lame attempt to discredit AMD's launch schedule. AMD isn't doing anything different than NV is right now, they just have smaller dies that are easier to move to a smaller manufacturing process. Is it possible that they pushed their guys harder than usual to try to get GCN cards out by xmas? Absolutely, I'd be surprised if they didn't. Does NV do the exact same thing all the time, possibly even right now that AMD has a new card out and NV doesn't? Of course they do, we both know that jhh is a tyrant (though a very very effective one).
 

Grooveriding

Diamond Member
Dec 25, 2008
All good points, in fact I've made many of them over the past few years. I have a couple of comments:

1. As IDC strongly showed us on pgs1-2, a smaller die is inherently better for early adoption on a new node. This is likely why both camps used to do the original die shrink on low or mid-end cards before bringing it out to the high end. leading to:
2. If NV does that exact thing this time, aren't they just going back to what has historically worked well for them? And really, if their midrange "gtx 660ti++" or whatever they call it ends up competitive with 7970, won't that offer a viable alternative to NV fans for the early part of the year?
3. That's probably a smart strategy for NV, as instead of the "heat" that they got from rushing out gtx 480 before it was ready they would instead get lauded for competing with AMD using a midrange part. Heck, NV could even be 5-10% slower on average but still kick ass in games like civ5 and bf3 and they'd still be ok as long as the high end gets here by the fall.

I believe they'll put out a mid-range card, or rather a mid-sized die (for nvidia), first. Every meager scrap of a rumour has stated it's a 3XXmm2 die and that their big chip is going to be late this year.

I think it will compete with the 7970, assuming the rumoured die size is accurate. That's not necessarily a given, though; the GTX 460 was almost the same size as the 5870, but noticeably slower. It will at least be in the ballpark one way or the other, I would think. I don't think we'll be lauding them for competing with a midrange card, likely just for bringing prices down on halo cards by $50 or so.

I think the actual mid-range will be AMD's 7870/7850 and some iteration of the GK104 from nvidia with disabled areas, with the full version being called a 680.

I think the card will come out as a 680 and be aimed squarely at the 7950 or the 7970. I doubt they will want to just release a card that amounts to a GTX 580 with a better price on a new process. I mean, they want to release a mid-range card and make money with it of course, but I can't see them wanting to play second fiddle right out the gate and not be seen as a performance contender.

I'm wagering we will see another 480-to-580 type situation with a 6 month window between the two. Release the 'GK104' we've all been hearing about as a 680 plus a few scaled-back versions - 670/660? - then six months later, when they are ready, drop the big-die chip as a 780/770. The difference this time being that it is more of a chip unto itself, rather than a corrected and mildly improved chip - think GF104 to GF100 happening in reverse, a similar chip but not one being the same as the other with disabled areas.

All guesswork of course. I want to see the 450-500mm2 monster die. Going on what AMD is doing with 350mm2 in the 7970, another 100-150mm2 is going to bring a heap of performance if the card can still be clocked fast without getting too hot. I just don't think we are seeing it until the fall though.

Right now the 7970 is sitting on 30% more performance just by turning up the clock speeds. I want to know what AMD is going to do with it: just go for a 7970 'ultra' and put out a highly clocked version of Tahiti, or actually increase the die size/shaders with a higher clock speed to boot? The more I think on it, the more I am inclined to wait until Nov/Dec of this year and see what is going on then. I don't even really need an upgrade, I just want to use fewer than three cards, and I think I can wait to do that.

I just don't have much faith in nvidia on new nodes after 40nm, especially as they are delayed again and not ready to compete against AMD. I think they'll play it smart and put out this small-die 'mid-range' card, but it sure is not going to wow anyone but fanboys performance-wise compared to the 7970. My bet is that if they had pushed for their large chip we'd see the GTX 480 all over again, so in effect, to put it in terms of their 40nm cards, they are aiming to release the GTX 460 first and then release the GTX 580 six months later without ever having released the 480. Smart, but it sure does suck if you want some options at the high end.
 

Stuka87

Diamond Member
Dec 10, 2010
http://www.newegg.com/Product/Product.aspx?Item=N82E16814162073

$423 shipped AR, that beats your card's price by $56. That cooler is ridiculous, but I'd bet that it's pretty effective. And I don't know about you, but a 3 slot cooler is no inconvenience for me, even with my mATX mobo.

Here in the US that card is $465; you can get a 7950 for $450, or an OC version for $470ish. It may be cheaper in Argentina (I think that's what you mean by AR?!), but that's not here.
 

bryanW1995

Lifer
May 22, 2007
NVIDIA 8800GTX 11/08/2006 vs ATI 2900XT 5/14/2007 (NVIDIA launched first)
NVIDIA 8800GT 10/29/2007 vs ATI HD3870 11/15/2007 (NVIDIA launched first)
NVIDIA GTX280 6/16/2008 vs ATI HD4870 6/25/2008 (NVIDIA launched first)
NVIDIA GTX285 1/15/2009 vs ATI HD4890 4/2/2009 (NVIDIA launched first)
NVIDIA GTX480 3/26/2010 vs ATI HD5870 9/23/2009 (AMD launched first)
NVIDIA GTX580 9/11/2010 vs AMD HD6970 12/15/2010 (NVIDIA launched first)
NVIDIA GTX680 00/00/2012 vs AMD HD7970 01/09/2012 (AMD launched first)

NVIDIA launched 5 times ahead of ATI/AMD in the last 7 GPU releases.

gtx 580 launched on November 9, not September 11.

8800 gtx vs 2900xt was awesome for NV. 5870 vs gtx 480 was awesome for AMD. gtx 285 over 4890 wasn't too shabby, either. As for the other 3, they're all within a couple of weeks of each other and are very nearly a tie. The current situation looks to come in somewhere between the gtx 285/4890 launch and the 5870/gtx 480 launch. The longer NV takes to bring out something competitive, the more the pressure on them mounts.
 

bryanW1995

Lifer
May 22, 2007
I'd give you credit for your "research", if it meant anything. AMD is killing Nvidia in GPU design, period. Keep waiting for Kepler. It'll be worth it. :)

Are you a stock broker or financial analyst? I'm speaking for a majority of enthusiasts, not investors, and not those who trot out fallacious strawman arguments pointing to the financials of companies with much more diverse businesses than just GPUs.

When you are sitting on <insert any (inferior) Nvidia product here>, when your buddies are using Radeon 7970s, do profit reports make you feel better about your purchase? Just curious, because the guys here with 7970s are sh*t-stomping whatever Nvidia hardware you might be using. ;)

Sorry dude, but I agree with the other posters. NV is generally considered superior to AMD in discrete desktop GPUs, though AMD has significantly closed the gap in the past few years.
 

wahdangun

Golden Member
Feb 3, 2011
All good points, in fact I've made many of them over the past few years. I have a couple of comments:

1. As IDC strongly showed us on pgs1-2, a smaller die is inherently better for early adoption on a new node. This is likely why both camps used to do the original die shrink on low or mid-end cards before bringing it out to the high end. leading to:
2. If NV does that exact thing this time, aren't they just going back to what has historically worked well for them? And really, if their midrange "gtx 660ti++" or whatever they call it ends up competitive with 7970, won't that offer a viable alternative to NV fans for the early part of the year?
3. That's probably a smart strategy for NV, as instead of the "heat" that they got from rushing out gtx 480 before it was ready they would instead get lauded for competing with AMD using a midrange part. Heck, NV could even be 5-10% slower on average but still kick ass in games like civ5 and bf3 and they'd still be ok as long as the high end gets here by the fall.

1. Yes it is, but if they used the old strategy then nvidia would still be using 40nm for the high-end Kepler and 28nm for the midrange Kepler GPUs.
2. Yeah, but nvidia never releases a midrange GPU first; what they do is use the old process for the flagship GPU and the new process for the midrange GPU.
3. That's quite a good strategy, but I doubt JHH will do something like that.
 

bryanW1995

Lifer
May 22, 2007
AR is not the upfront price. You still have to pay full price when you buy and hope you get the rebate check two to three months later after you send it in.

Way to change the parameters of the discussion when they don't suit your point. Even if you only value AR prices as a % of the total rebate amount (which I admit I typically do when I'm buying something), the gtx 580 is still somewhat cheaper for nearly the same performance.

I'm sorry that you haven't bothered to read news articles for... what... a month now?

GK104 in April and GK110 in Q3 or Q4 2012, simple.

Why no links to your proof? Were you having trouble finding anything from a reputable site? Are you embarrassed to use Fuad for your "proof"? Even Kyle at [H], a notorious NV hater, states that there has been no news from NV at all on Kepler. Go ahead, I'll wait for you to actually find something concrete. There might actually be something out there, but everything I've read thus far has used the BSN/fudzilla/etc writing style of "kepler looks like it could end up being something maybe sorta like...." instead of "JHH at NV has confirmed to us today that kepler will launch with the gk104 technology...".
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Way to change the parameters of the discussion when they don't suit your point. Even if you only value AR prices as a % of the total rebate amount(which I admit that I typically do when I'm buying something), the gtx 580 is still somewhat cheaper for nearly the same performance.

Change what parameters? It's you that's doing that. You still have to pay full price upfront; everyone does. And you still have the chance of not getting the rebate check even if you submitted everything correctly. Given that, upfront price is the real price, and after rebate (AR) is a bonus.

Saying otherwise is dishonest.
 

bryanW1995

Lifer
May 22, 2007
Change what parameters? It's you that's doing that. You still have to pay full price upfront; everyone does. And you still have the chance of not getting the rebate check even if you submitted everything correctly. Given that, upfront price is the real price, and after rebate (AR) is a bonus.

Saying otherwise is dishonest.

You said nothing about not using rebates, and rebates are a standard industry practice, so any reasonable person would have assumed they were fair game in your price comparison. I fail to see how including a rebate discount in a price clearly marked "AR" is dishonest. However, it's easy to see how it's dishonest to claim that the 7950 is cheaper than the gtx 580, then change the parameters of the argument to exclude rebates after your point is refuted.
 

AnandThenMan

Diamond Member
Nov 11, 2004
Even Kyle at [H], a notorious NV hater, states that there has been no news from NV at all on Kepler.
Kyle is not an Nvidia hater; he ran his GTX 580s in SLI for quite a while, I believe. He goes with whatever is fastest. My guess is he is running two 7970s currently.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
You said nothing about not using rebates, and that is a standard industry practice, so any reasonable person would have assumed that rebates were ok in your price comparison. I fail to see how including a rebate discount in a price with an "AR" after it is dishonest. However, it's easy to see how it's dishonest to claim that 7950 is cheaper than gtx 580, then to change the parameters of the argument to exclude rebates after your point is refuted.

Upfront you still have to pay more, so the real price is still more. And even if you include the rebate, there are lots of people who don't bother with them, and others who submit them and never get the rebate check back.

So no, it's dishonest to present it as the final price. Only count instant rebates or after coupon/code (AC) prices as final, because those change the price everyone has to pay.
 

notty22

Diamond Member
Jan 1, 2010
More Apple SKUs that Kepler is going into.
Latest Apple rumor: Mac Pro desktop to be updated with Ivy Bridge-E processors, Nvidia Kepler graphics


In particular, M.I.C. Gadget is reporting that the workstation will be graduating to Intel's new eight-core Ivy Bridge-E (for "Extreme") CPUs when they launch in a couple of months. Since you can currently get a pair of six-core processors for the Mac Pro, this update will presumably give you the option of a 16-core pairing, albeit one that will set you back about $5,000. Ivy Bridge should not only bring performance improvements, but also power savings thanks to the greater efficiency of its 22nm manufacturing process.
In addition, the same site reports that Apple will dump ATI graphics for this refresh, opting instead to throw its hat in with Nvidia's forthcoming Kepler lineup. AMD reportedly had difficulties with the drivers for the previous Mac Pro graphics, so Apple is jumping to its competitor, whose CUDA GPGPU cores should be useful to the workstation crowd.

HPC Customers Are Waiting for Nvidia "Kepler" Processor - Company.


One of the main advantages of the Kepler architecture for supercomputers is over two times higher double-precision GFLOPS performance per watt compared to the Fermi architecture in use today. Some of the technologies that Nvidia has promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include virtual memory space (which will allow CPUs and GPUs to use "unified" virtual memory), pre-emption, an enhanced ability of the GPU to autonomously process data without the help of the CPU, and so on.
Although Nvidia started to ramp up manufacturing of chips based on the Kepler architecture in calendar 2011, the company will only release actual products powered by those chips in March, April or even later. In fact, Nvidia's interim chief financial officer even implied that the next-gen products will be launched in the coming months, but did not directly state that Kepler will be available in the first quarter of the company's fiscal 2013 (which ends in late April '12).
 

SirPauly

Diamond Member
Apr 28, 2009
Kyle is not an Nvidia hater, he ran his GTX580's in SLI for quite awhile I believe. He goes with whatever is fastest, my guess is he is running two 7970's currently.

I agree, and he's quite consistent. He was very positive about GTX 480 surround gaming, as he was positive about the added RAM on the 6970 for EyeFinity.