AMD Q3 results: even worse than revised expectations


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
They always bring their pro cards out later. They would have launched their high-end chip if AMD had not laid an egg.

Made up by fanbois who can't admit to themselves that nVidia wasn't capable of opening up a can of whoopass on lowly little AMD.

So, GK110 is for the commercial market by choice? Then who's selling them? Where are they?

I've read of one client being shipped a very small percentage of their order of these chips which, in theory, could be made in quantities large enough to supply the consumer market if nVidia wanted to.

LINK
"DOE target date for the Keplers being available to our users is March 2013. We will change the name (from Jaguar to Titan) once we have gotten through acceptance (most likely sometime between December 2012 and March 2013)."
Why are they waiting so long? If they could make them in quantity to compete with AMD if they needed to, then surely they could supply 14,592 of them to Oak Ridge.

This segment is currently being transformed into a number crunching monster capable of 20 petaflop performance, possibly gaining the crown of fastest supercomputer once it's officially benchmarked and listed in the TOP500 list of the world's fastest supercomputers. The deadline of inclusion is November, but it's not clear at this point whether that will be met, since it still needs to be built and tested.
Seems like a pretty poor business decision to miss the deadline for getting your product into the computer that will become the world's fastest. I suppose that's AMD's fault as well?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
They always bring their pro lineup later

Not this time. Oak Ridge has received a small percentage of what they've ordered. There is no consumer GK110. Seems to me that does more to dismiss the argument that nVidia could release GK110 to the consumer market if they had to than to confirm it.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The burden of proof first comes from the claim of existence/intention.

No one would say something did not exist or was not true unless it was claimed to be so first.
No one could ever say god does not exist if it were not for the fact that it was said he does first.

Claims of existence can come out of thin air. Claims of non-existence can also come out of thin air, but a claim of non-existence is impossible unless the claim of existence is there first.

If you can't or won't give proof, then people don't have to accept it as fact.
In the context of what was said, the same tool that is meant to prove a claim is the same one that will disprove it. If there are no links to back it up, then there is no need for links to disprove and counter it. That does not mean the disproof and counter cannot be used, if feasible, if a person wants to do the legwork - but why should that person bother when the person making the claim in the first place can't be bothered, or simply can't?
I think it's even more basic than that: it isn't about existence/intention, it's just a matter of making a claim.

If you claim something, the burden is on you to support your claim.

Generally, when you claim something that is so basic and obvious, we all excuse the need for support. If I claim the sun will rise tomorrow, we all just accept that without needing proof. But it's still a claim.

So I guess when people make a claim that is different from what most people would expect, most people would be more likely to expect support for that claim.

Also, I think there is an effect where people get familiar with earlier posts. It's a common thread around here that NVidia and AMD compete against each other, time the release of their cards based on what the other is doing, and adjust pricing etc. with those market effects in mind. So a person familiar with that may make a claim that relies on that support without specifically mentioning it. That's bad form; even if most readers already know the support, it's still an unsupported claim unless you mention that support.
Great posts. I agree, a little accountability would bring this forum up a notch. :thumbsup:

As for the news, not really surprised. How long can a company be chronically mismanaged before it tanks? We'll soon find out. :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wrong, wrong, wrong. :rolleyes:

They gained in the discrete desktop market because nVidia stopped shipment of nearly all 40nm products. In the end they lost share in the overall discrete business.

As usual, always defending NV and missing the context.

Since people are pinning AMD's financial problems on the HD7000 series, I thought you'd want to read the data again showing that AMD moved discrete GPUs last quarter - you know, they sold, they didn't sit on store shelves.

Q2 2012 Discrete GPU market share
AMD = 37.8%
NV = 61.9%

Q3 2012 Discrete GPU market share
AMD = 40.3%
NV = 59.3%

Since hardly anyone in this sub-forum cares about mobile GPUs, I'm not sure what your point is. Sorry, AMD gained market share at NV's expense; no need to make excuses for why NV lost it. Check this chart.

The $100 million write-down was attributable to CPUs, specifically Llano inventory based on this article.

At least BenSkywalker explained, logically and to the point, why the ATI acquisition has hurt AMD financially, but these claims that the desktop HD7000 series was a failure and that AMD had millions of outdated 40nm discrete GPUs to write down are a fantasy.

ATI was showing consumer growth potential, impressive margins, with their mobile handheld (Imageon) and DTV (Xilleon) products and chipsets! Very vocal about GPU processing and even GPU physics -- mirroring nVidia. They were hitting record territory with overall revenue and profits.

Whatever ATI was -- is no longer, to me.

I never knew that all those things you listed above had any relation to how we should actually be judging the AMD/ATI graphics product. ATI made kick-ass GPUs that beat NV from time to time, ironically much like HD7970 GE beat GTX680 this generation in performance. Strictly from a product-based perspective, HD7000 series is very similar to X800/X850, X1900/1950XT series.

In fact, I don't remember any generation of ATI where they had so many competitive products at the same time. Outside of the GTX670 and 690, NV has had little of anything worth buying on performance or price/performance since at least June (and to a non-biased, non-marketing-brand-washed consumer, NV had nothing worth buying at the sub-$300 level from January to August 2012). Even now HD7750 > GT640, HD7770 > GTX650, HD7850 > GTX650Ti, HD7870 > GTX660, HD7950 > GTX670, GTX670 ~ HD7970 (but loses to the 1GHz 7970) and 7970 GE > GTX680.

The AMD/ATI graphics lineup hasn't looked this strong since at least the X1950XT series, as in one generation AMD managed to win both the price/performance and the single-GPU performance crown, for more of 2012 than NV did, and across more price segments. When did ATI ever do that?

Also, you mention ATI was very vocal about GPU processing and physics? It was mostly talk. AMD graphics has the most capable consumer GPUs today for GPU processing applications; ATI cards offered little of anything in this area.

If ATI had released the HD7970 GE series, it would have been $550 and people would have bought it like hot cakes. AMD does it and people ignore its OpenCL, bitcoin mining, double-precision compute and overclocking capabilities.

AMD has a brand-value problem, and ATI graphics suffered because of the AMD brand. People now expect a flagship HD7970 GE SKU to cost $300 when it's faster than the $500 GTX680. Makes sense....

The actual product line of HD7850/7870/HD7950/7970/7970GE is no worse than ATI cards of the past. What kind of GPGPU processing does GTX680 offer to consumers? After the exposed hack in BL2, we got 100% proof that PhysX is just a marketing gimmick and NV has 0 interest in bringing physics to all of us gamers. All it cares about is using PhysX to sell NV cards, period. It has no desire to collaborate on an open physics standard all gamers can enjoy. Sorry, not everyone drinks the "AMD drivers don't work and PhysX is awesome" Kool-Aid.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Made up by fanbois who can't admit to themselves that nVidia wasn't capable of opening up a can of whoopass on lowly little AMD.

So, GK110 is for the commercial market by choice? Then who's selling them?

NV couldn't have launched a consumer GK110 GPU in large enough volumes at profitable levels in 2012. Given that they can't even meet the mere 15,000-unit K20 order for Oak Ridge by March 2013, it's more obvious than ever that GK104 was the best NV had for consumer GPUs. At first, the theory that NV held back the real flagship sounded good on paper, but since NV is only starting to ramp up volume production of GK110 in November-December of this year, still clinging to the view that GTX680 was always meant to be GK110 is just fanboy talk now.

If GTX780 is not GK110, it'll be the final nail in the coffin for the GK110 theory. The very idea that NV can somehow overcome the laws of physics and launch a 1GHz GK110 chip with a 500-600mm^2 die, when AMD is already running into the 250W TDP ceiling on its 365mm^2 chip, should immediately give anyone pause: how would NV trick the laws of physics on the same 28nm node?

I dug through some VR-Zone articles, and even as early as last year they already had info that NV's highest-end flagship for 2012 was a dual-GK104 chip (which VR-Zone mistakenly labelled GK110). VR-Zone said NV was going to switch to a small-die strategy for 2012 as early as November 2011:

"Following right after GK104 will be GK110 - a dual GK104 flagship, thus completing NVIDIA's line-up for most of 2012 - remarkably similar to AMD's sweet spot strategy." ~ November 26, 2011

That means VR-Zone already knew NV was going with dual-GK104 for its flagship 2012 card before Tahiti XT even launched. This means NV had no idea how fast Tahiti XT would be when they planned the GK104 --> dual-GK104 flagship strategy. It appears NV never even planned to launch a 500-600mm^2 GTX680, and most of us missed this article among a sea of other rumors.

The most ironic self-pwnage is that the same NV fans who claim GTX670/680 are just mid-range cards are the ones who ended up dropping $400-500 on what they themselves continue to call a $250-300 GPU at best. Talk about face-palming yourself by admitting you're willing to pay $400-500 for a mid-range GPU.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
NV couldn't have launched a consumer GK110 GPU in large enough volumes at profitable levels in 2012. Given that they can't even meet the mere 15,000-unit K20 order for Oak Ridge by March 2013, it's more obvious than ever that GK104 was the best NV had for consumer GPUs. At first, the theory that NV held back the real flagship sounded good on paper, but since NV is only starting to ramp up volume production of GK110 in November-December of this year, still clinging to the view that GTX680 was always meant to be GK110 is just fanboy talk now.

If GTX780 is not GK110, it'll be the final nail in the coffin for the GK110 theory. The very idea that NV can somehow overcome the laws of physics and launch a 1GHz GK110 chip with a 500-600mm^2 die, when AMD is already running into the 250W TDP ceiling on its 365mm^2 chip, should immediately give anyone pause: how would NV trick the laws of physics on the same 28nm node?

I dug through some VR-Zone articles, and even as early as last year they already had info that NV's highest-end flagship for 2012 was a dual-GK104 chip (which VR-Zone mistakenly labelled GK110). VR-Zone said NV was going to switch to a small-die strategy for 2012 as early as November 2011:

"Following right after GK104 will be GK110 - a dual GK104 flagship, thus completing NVIDIA's line-up for most of 2012 - remarkably similar to AMD's sweet spot strategy." ~ November 26, 2011

That means VR-Zone already knew NV was going with dual-GK104 for its flagship 2012 card before Tahiti XT even launched. This means NV had no idea how fast Tahiti XT would be when they planned the GK104 --> dual-GK104 flagship strategy. It appears NV never even planned to launch a 500-600mm^2 GTX680, and most of us missed this article among a sea of other rumors.

The most ironic self-pwnage is that the same NV fans who claim GTX670/680 are just mid-range cards are the ones who ended up dropping $400-500 on what they themselves continue to call a $250-300 GPU at best. Talk about face-palming yourself by admitting you're willing to pay $400-500 for a mid-range GPU.

If the GTX 780 is GK114 instead of GK110, it will still be AMD's fault. It's gonna be $500, and that'll be AMD's fault. War in the Middle East? Also AMD's fault. :rolleyes:
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
NV couldn't have launched a consumer GK110 GPU in large enough volumes at profitable levels in 2012. Given that they can't even meet the mere 15,000-unit K20 order for Oak Ridge by March 2013, it's more obvious than ever that GK104 was the best NV had for consumer GPUs. At first, the theory that NV held back the real flagship sounded good on paper, but since NV is only starting to ramp up volume production of GK110 in November-December of this year, still clinging to the view that GTX680 was always meant to be GK110 is just fanboy talk now.

If GTX780 is not GK110, it'll be the final nail in the coffin for the GK110 theory. The very idea that NV can somehow overcome the laws of physics and launch a 1GHz GK110 chip with a 500-600mm^2 die, when AMD is already running into the 250W TDP ceiling on its 365mm^2 chip, should immediately give anyone pause: how would NV trick the laws of physics on the same 28nm node?

I dug through some VR-Zone articles, and even as early as last year they already had info that NV's highest-end flagship for 2012 was a dual-GK104 chip (which VR-Zone mistakenly labelled GK110). VR-Zone said NV was going to switch to a small-die strategy for 2012 as early as November 2011:

"Following right after GK104 will be GK110 - a dual GK104 flagship, thus completing NVIDIA's line-up for most of 2012 - remarkably similar to AMD's sweet spot strategy." ~ November 26, 2011

That means VR-Zone already knew NV was going with dual-GK104 for its flagship 2012 card before Tahiti XT even launched. This means NV had no idea how fast Tahiti XT would be when they planned the GK104 --> dual-GK104 flagship strategy. It appears NV never even planned to launch a 500-600mm^2 GTX680, and most of us missed this article among a sea of other rumors.

The most ironic self-pwnage is that the same NV fans who claim GTX670/680 are just mid-range cards are the ones who ended up dropping $400-500 on what they themselves continue to call a $250-300 GPU at best. Talk about face-palming yourself by admitting you're willing to pay $400-500 for a mid-range GPU.

RS. You're more interested in, or at this point I probably need to say "obsessed" with, the fanboys than you are in the technology discussions lately. This may be entertaining to you, but I find it's starting to grind on my nerves a bit, and I'm not alone. Would you kindly back off the fanboy targeting for just a little while? It's gone past old and is rapidly approaching ancient. Much appreciated.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The very idea that NV can somehow overcome the laws of physics and launch a 1GHz GK110 chip with a 500-600mm^2 die, when AMD is already running into the 250W TDP ceiling on its 365mm^2 chip, should immediately give anyone pause: how would NV trick the laws of physics on the same 28nm node?

Much like we should acknowledge right now that AMD putting out the 8970 at a 4.5GHz core clock with 12,568 cores that only uses 55 watts is just a ludicrous dream by the AMD faithful?

I keep seeing you post your own dream specs for the GK110, and then saying they aren't possible.

Just under 300 watts *more* power draw than a 5770? Yeah, nVidia has released a part just like that-

http://www.bit-tech.net/hardware/graphics/2011/03/24/nvidia-geforce-gtx-590-3gb-review/8

I'm not saying they will or won't release a monster-spec GK110 part, but it is beyond idiotic to claim that nV won't release a part that shatters 250 watts. Even if we stick to just this generation, 111 watts more than a GTX 680? Yeah, they did that too-

http://www.anandtech.com/show/5805/...view-ultra-expensive-ultra-rare-ultra-fast/16

nVidia has never given the impression they care in any way about your 250 watt power barrier. Maybe with utterly inept engineers it is some huge obstacle, but they have continued to release parts that significantly exceed the RussianSensation-deemed limit for graphics cards.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
As usual, always defending NV and missing the context.

Since people are pinning AMD's financial problems on the HD7000 series, I thought you'd want to read the data again showing that AMD moved discrete GPUs last quarter - you know, they sold, they didn't sit on store shelves.

Q2 2012 Discrete GPU market share
AMD = 37.8%
NV = 61.9%

Q3 2012 Discrete GPU market share
AMD = 40.3%
NV = 59.3%

Since hardly anyone in this sub-forum cares about mobile GPUs, I'm not sure what your point is. Sorry, AMD gained market share at NV's expense; no need to make excuses for why NV lost it. Check this chart.

The $100 million write-down was attributable to CPUs, specifically Llano inventory based on this article.

At least BenSkywalker explained, logically and to the point, why the ATI acquisition has hurt AMD financially, but these claims that the desktop HD7000 series was a failure and that AMD had millions of outdated 40nm discrete GPUs to write down are a fantasy.



I never knew that all those things you listed above had any relation to how we should actually be judging the AMD/ATI graphics product. ATI made kick-ass GPUs that beat NV from time to time, ironically much like HD7970 GE beat GTX680 this generation in performance. Strictly from a product-based perspective, HD7000 series is very similar to X800/X850, X1900/1950XT series.

In fact, I don't remember any generation of ATI where they had so many competitive products at the same time. Outside of the GTX670 and 690, NV has had little of anything worth buying on performance or price/performance since at least June (and to a non-biased, non-marketing-brand-washed consumer, NV had nothing worth buying at the sub-$300 level from January to August 2012). Even now HD7750 > GT640, HD7770 > GTX650, HD7850 > GTX650Ti, HD7870 > GTX660, HD7950 > GTX670, GTX670 ~ HD7970 (but loses to the 1GHz 7970) and 7970 GE > GTX680.

The AMD/ATI graphics lineup hasn't looked this strong since at least the X1950XT series, as in one generation AMD managed to win both the price/performance and the single-GPU performance crown, for more of 2012 than NV did, and across more price segments. When did ATI ever do that?

Also, you mention ATI was very vocal about GPU processing and physics? It was mostly talk. AMD graphics has the most capable consumer GPUs today for GPU processing applications; ATI cards offered little of anything in this area.

If ATI had released the HD7970 GE series, it would have been $550 and people would have bought it like hot cakes. AMD does it and people ignore its OpenCL, bitcoin mining, double-precision compute and overclocking capabilities.

AMD has a brand-value problem, and ATI graphics suffered because of the AMD brand. People now expect a flagship HD7970 GE SKU to cost $300 when it's faster than the $500 GTX680. Makes sense....

The actual product line of HD7850/7870/HD7950/7970/7970GE is no worse than ATI cards of the past. What kind of GPGPU processing does GTX680 offer to consumers? After the exposed hack in BL2, we got 100% proof that PhysX is just a marketing gimmick and NV has 0 interest in bringing physics to all of us gamers. All it cares about is using PhysX to sell NV cards, period. It has no desire to collaborate on an open physics standard all gamers can enjoy. Sorry, not everyone drinks the "AMD drivers don't work and PhysX is awesome" Kool-Aid.
QED. Well put :thumbsup:
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I would agree. But then it also matters if someone claims something as an opinion or as a fact. In this instance and particular argument, it couldn't matter less.

Indeed an opinion is fine, but that's not how it was put across.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
My post was just to invoke thought. The specific subject matter was the very short-sighted view about nvidia being late with their full 28nm lineup and such.

You discussed why NV waited longer to launch GTX600 series but as gamers we don't care. When technology is late, it's late and NV shouldn't get a pass from us gamers for being late and allowing AMD to overcharge gamers because they couldn't offer any competition for more than half a year.

I don't need excuses for why NV needed 8 months to launch its sub-$300 line when AMD launched the HD7970 in December of 2011 and completed the entire single-GPU line by March. As a consumer, NV was late, and it hurt the gaming market; hence the criticism. It was also criticized for not moving the price/performance curve or bringing new features (unlike the Fermi cards). This was explained in detail in threads in the past. Fermi offered superior DX11 performance, more VRAM, more overclocking headroom. What did GTX600 offer over the HD7000 series? None of these.

Anyway, the post was more food for thought as to why nvidia would hold off launching chips they already had. Clearing out inventory was brought into my discussion, and I brought in AMD as an example to try to relate the concept and bring forth understanding.

That's the problem. You brought inventory into the discussion as it relates to GPUs, but AMD has a CPU inventory problem, with Llano and so forth. So what was your comment regarding 40nm parts AMD had lying around based on? It's made up!

My post was just to provoke thought, but since you want to get specific then let's talk about some of your key points.

Right, but your post regarding inventory issues related to GPUs cannot provoke thought since, unlike the millions of Fermi parts that needed to be cleared from the channel, AMD didn't have this problem - if you actually read what the write-down is related to. Thus, your example doesn't carry any meaning to provoke thought. NV and AMD executed their GPU strategies totally differently; it is NV who had a 28nm wafer shortage that forced them to delay their GTX600 low and mid-range parts by 6 months in order to be able to fulfill their mobile Kepler GPU contracts first.

First off you say " it's financially impossible" that AMD's GPU division is only 8%.

That's not what I said at all. I see there is a continuous reading comprehension problem on this forum.

I said since AMD's desktop discrete GPU division only brings less than 10% of the firm's actual cash flows, AMD's main issues and the decline in the stock price are not attributable to its performance as it's financially impossible for such a small unit of the company to drive the stock price 60% down.

Your implying their GPU division could do nothing and couldnt effect their profits? What?? What your using as proof doest make any sense at all. Your concept here is completely unacceptable and false.

I never once said their "GPU division could do nothing, couldn't effect their profits". First of all, it's affect, not effect. Second of all, what I am saying is their desktop discrete GPU division could evaporate to 0 and it would lose less than 10% of the stock price - meaning AMD is primarily a CPU company, not a GPU company, in a financial sense. What does that mean? It means that AMD hasn't made a lot of $ from their graphics division in a long time, not just with the HD7000 series, but a long time. In other words, something else has kept the stock alive all these years and is no longer performing well.

What are those other product lines? CPUs, APUs and servers. All of those are now suffering more than ever, while the GPU business is fairly flat and not performing much worse than expectations. Since the discrete desktop GPU business is even smaller than AMD's mobile GPU business, it hardly has an impact on AMD's financial performance. Even if AMD sold 2x as many or 2x fewer HD7000 desktop cards, it would hardly move the stock up or down by more than 10%, because it's such an immaterial product line in the context of the positive cash flows generated (or in this case not generated) by the rest of the company.

This is why this forum should stick to non-finance-based stuff, and the finance stuff should be left to people who understand it or work in the field, with the rest just listening and learning. I actually do understand the concepts because it's my job. You are just doing it from an amateur perspective.

Why? well....... if AMD made 100million more in the graphics division for Q2 then they wouldve reported 130+million for the Quarter. This wouldve been an out of this world quarter for AMD. Every extra dime made by their GPU division is a dime for the whole AMD. ITs the whole AMD, and your prentending they dont go together. Their profits are a collection of the entire AMD performance. The better their GPU division does, the better AMD does. Its that simple.

See, you are again not understanding the context. If Coca-Cola released a new energy drink, it wouldn't expect it to keep the company afloat. Coca-Cola's main business is not energy drinks. AMD's main business has not been graphics for a LONG time now (at least 5 years). Blaming AMD for how little $ their GPUs make is not seeing the big picture -- the consumer GPU business has never really been very profitable, and the HD7000 series is no different in this regard from the HD4000/5000/6000 series, or the GTX400 series for that matter. This is especially true for the desktop discrete GPU business, which became a small fraction of both NV and AMD at least 3-4 years ago. You can even say NV's desktop GPU business is a cost center for their professional graphics, nothing more.

Even if AMD's graphics division made $100 million more in one quarter, it would hardly turn the company around. Besides, I told you already, AMD's graphics division is performing as expected or close to it. So why would it suddenly make an extra $100 million? That's like asking Coca-Cola to make 2x more $ than it normally makes. Out of where? Thin air?

The consumer graphics business does not make a lot of $ in today's economy, for either AMD or NV. AMD needed the CPU business to keep the company afloat. Even NV's entire net income for Q2 and Q3 was hardly a lot, and they sell Tegra and have 95% market share in professional graphics, which AMD doesn't have.

I am telling you the actual business of consumer graphics is not very profitable to begin with, and it's also not where NV makes the most $ either. Even NV knows this, which is why they make most of their $ in professional graphics and are moving into mobile CPU development. If you remove the Tegra, Quadro and Tesla lines from NV, its discrete GPU business is hardly making more $ than AMD's in a way that would actually matter to AMD's bottom line.

Which is when AMD decided to drop the 79XX series and didnt expect nvidia to be able to counter.

Do you work for AMD? Of course when AMD dropped the HD7000 series early they expected NV to counter. That was the whole point of AMD releasing the HD7000 series early with lower clocks - to beat NV to market and command higher profit margins.

AMD has had to lower prices to very, very low and ultra-competitive levels across their 28nm lineup. Ask yourself this: if AMD was able to get 50% more out of their GPUs, would that help AMD's profits? Of course it would; if you can't see it then I worry about you. If AMD could sell their GPUs for a higher markup they would in a heartbeat. It would benefit AMD tremendously if they could get more out of their GPUs. Their gross margin would average higher if AMD could get more out of their GPUs. But all this is beside my point.

It would hardly matter. 10% of a product segment cannot fix 90% of the firm. It's pure finance/mathematics. 10% of a firm cannot drop margins from 44% to 31%. That is attributable not to lower AMD graphics margins but to the $100 million write-down related to CPUs. But neither you nor people in this thread get this, because you neither work in finance nor understand what the numbers mean or how to calculate them.

Here, I'll do a quick example for you just so you can see that you are missing the big picture.

Q2 2012
90% of the firm's product lines have 44% margins (the rest)
10% of the firm's other product lines have 50% margins (AMD desktop graphics)
The firm's average margins would be 44.6%

Let's say in Q3 2012
90% of the firm's product lines have dropped margins to 32% because of 100 million write-down for Llano inventory
10% of the firm's other product lines have increased margins to 60%
The firm's average margins would be 34.8% (a huge drop!)
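In code form, here's a minimal sketch of the same weighted-average arithmetic (plain Python; the revenue weights and margins are the illustrative figures above, not AMD's actual segment data):

```python
def blended_margin(segments):
    """Weighted-average gross margin across (revenue_share, margin) pairs."""
    return sum(share * margin for share, margin in segments)

# Q2 2012 illustration: 90% of the firm at 44% margins, 10% at 50%
print(f"Q2: {blended_margin([(0.90, 0.44), (0.10, 0.50)]):.1%}")  # 44.6%

# Q3 2012 illustration: the 90% slice drops to 32% margins after the
# Llano write-down, even while the small 10% slice improves to 60%
print(f"Q3: {blended_margin([(0.90, 0.32), (0.10, 0.60)]):.1%}")  # 34.8%
```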

See, you are assuming AMD's desktop products are killing the firm, but you're missing 90% of the company. Even if AMD made more $ in desktop GPUs, it hardly matters. It's pure math. When 85% of the firm is losing $, the GPU division cannot sustain the firm.

I wasnt writing about the truth of what happened to AMD. I was offering a line of thinking and it was specifically related to the Nvidia being late comments we keep hearing over and over. I was just offering a larger picture and using AMD as an example to this. You may not understand this but i meant no harm to AMD (or you) in anyway.

When I criticized NV for being late this year, it was from the view of a consumer, not an investor. If you own NV stock, sure, you may care that NV delayed its products by 6 months. NV delivered its sub-$300 product line late and offered little to the consumer from a price/performance or overclocking perspective that HD7700/7800/7950 didn't already offer for months. All AMD had to do was drop prices, and it neutralized NV's sub-$300 line as worth buying. When Fermi launched, even if AMD dropped prices, Fermi was still superior in many areas - VRAM, overclocking, DX11 performance. Fermi was superior technologically for us overclockers and gamers; Kepler is not. Therefore, for us consumers, there is no excuse for why NV was late by 6-8 months. Generally speaking, this forum has been critical of companies that launch late and bring little to the table, and unlike GTX670/680/690, the rest of NV's line brought very little to the table despite being late.

GTX650/650Ti/660/660Ti are inferior to 7770/7850/7870/7950 in performance, price/performance and overclocking. Not at all like Fermi. Fermi was late, but it brought it. Sub-$300 Kepler was late, and it brought nothing worth talking about to us consumers besides PhysX. That's why, as a consumer, I criticized NV for being late by 6+ months: it allowed AMD to dictate high prices, which hurt us, thanks to NV's lackluster GTX600 launch.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
RS. You're more interested in, or at this point I probably need to say "obsessed" with, the fanboys than you are in the technology discussions lately. This may be entertaining to you, but I find it's starting to grind on my nerves a bit, and I'm not alone. Would you kindly back off the fanboy targeting for just a little while? It's gone past old and is rapidly approaching ancient. Much appreciated.

I think our forum would be better if it was more objective. If someone makes a claim that GK104 was meant to be mid-range and that AMD is failing hard because all NV needed was a mid-range GPU against them this round, then why do you have a problem when I disagree with that and provide supporting evidence that shows it was not meant to be mid-range? Why do you find this offensive?

What technology would you like to discuss in the "AMD Q3 results" thread, now that AMD and NV have launched their 1st-generation 28nm GPU line-ups?

Ok let's discuss then. If NV held back GK110 on purpose, why would they let AMD have the fastest single-GPU this generation? If NV held back GK110 on purpose, why is it taking them until Q4 to start volume production on GK110 (K20) and another quarter at least to fill the Spring 2012 corporate client customer pre-orders for K20 chips? If NV held back GK110 on purpose, why are there rumors now that NV will use GK114 / 114-GX because it's also concerned about die size and power consumption issues? If NV held back GK110 on purpose, why couldn't they launch sub-$300 desktop GPUs and were forced to prioritize mobile GPU production due to 28nm wafer constraints for 1st half of 2012? If NV held back GK110 on purpose, why did NV concede the single-GPU performance crown this generation?

You know JHH loves to win, and he wants the single-GPU performance crown. Why then, if NV can launch a much faster card than GTX680 at any time, haven't we seen it since the 7970 GE launched in June?

Share your thoughts on this then.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
I think our forum would be better if it was more objective. If someone makes a claim that GK104 was meant to be mid-range and that AMD is failing hard because all NV needed was a mid-range GPU against them this round, then why do you have a problem when I disagree with that and provide supporting evidence that shows it was not meant to be mid-range? Why do you find this offensive?

What technology would you like to discuss in the "AMD Q3 results" thread, now that AMD and NV have launched their 1st-generation 28nm GPU line-ups?

Ok let's discuss then. If NV held back GK110 on purpose, why would they let AMD have the fastest single-GPU this generation?

You know JHH loves to win, and he wants the single-GPU performance crown. Why then, if NV can launch a much faster card than GTX680 at any time, haven't we seen it since the 7970 GE launched in June?

Share your thoughts on this then.

Thank you.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Great posts. I agree, a little accountability would bring this forum up a notch. :thumbsup:

As for the news, not really surprised. How long can a company be chronically mismanaged before it tanks? We'll soon find out. :)

My only sympathy would be for the workers, as I can't stand bad management.

Where I work, a particular manager lost contracts worth hundreds of thousands because he could not be bothered to go upstairs and check an email or make a phone call when asked - so the job was not done to the right specifications - or he gave the job to unqualified staff who ended up screwing it up.

Our involvement with Max Payne 3 was a close call; we nearly missed the deadline because of managers not listening to the staff.

We just finished our involvement with Football Manager 2013; I just hope there is no screw-up there.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Much like we should acknowledge right now that AMD putting out the 8970 at a 4.5GHz core clock with 12,568 cores that only uses 55 watts is just a ludicrous dream by the AMD faithful?

I don't know anyone who claimed the HD8970 will be more than 30-40% faster than the HD7970 GE. Do you have a link for that? Most people actually understood that AMD has power consumption and TDP limits. This is why even 30-40% sounded too optimistic to many, and with recent claims of just 15% faster, it's further proof that power consumption and die-size limits may be a real problem for viable production. You keep denying this, pointing to 300W dual-GPU cards as support. Cooling two 150W GPUs on one board with two separate heatsinks is not the same as cooling a 250-300W GPU with one heatsink.

What most people are saying is that if AMD is running into power consumption problems with Sea Islands, why would nV not have the same issues on 28nm? It's not like NV has an alien 28nm node, or access to some alien heatsinks or fans.

I keep seeing you post your own dream specs for the GK110, and then saying they aren't possible.

Those aren't my dream specs. The 2880 SP, 240 TMU, 384-bit bus specs come from the full 15 SMX clusters of the entire K20 chip. The discussion of GTX780 specs on our forum in earlier months hypothesized whether or not NV would launch a fully unlocked GK110-based consumer GeForce card. Those are not specs I made up; they are K20's fully unlocked technical specifications. Although we haven't gotten confirmation that K20 itself has the full 2880 SPs, as some rumors have stated it would have between 2,496 and 2,880 SPs to maximize yields.

Just under 300 watts *more* power draw than a 5770? Yeah, nVidia has released a part just like that-
http://www.bit-tech.net/hardware/graphics/2011/03/24/nvidia-geforce-gtx-590-3gb-review/8

2x 300mm^2 dies is not the same as having 1 chip at 600mm^2. I'm not even sure why you are posting the GTX590's power consumption there, unless you are now saying GTX780 will be a dual-GPU card? I never said anything about dual-GPU cards not being able to use more than 250-270W.

It's simple: at 365mm^2, Tahiti XT2 @ 1.05GHz uses more than 200W of power. Kepler boosts to 1.2GHz. So it's obvious to just about anyone that if AMD is having problems enlarging Sea Islands to 400-420mm^2, NV would have to deal with the exact same power consumption issues if it made a 420mm^2 Kepler chip, never mind a 600mm^2 one.
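Back-of-the-envelope, here's a minimal sketch of that scaling argument (plain Python, crudely assuming power grows roughly in proportion to die area at comparable clocks on the same node; leakage, voltage and architecture all shift this in practice):

```python
# Crude linear-in-area power scaling on the same 28nm node
# (illustrative assumption only, not a real power model).
TAHITI_AREA_MM2 = 365
TAHITI_POWER_W = 200  # Tahiti XT2 @ ~1.05 GHz, per the figure above

for area_mm2 in (420, 600):
    est_w = TAHITI_POWER_W * area_mm2 / TAHITI_AREA_MM2
    print(f"{area_mm2} mm^2 -> ~{est_w:.0f} W")
# 420 mm^2 -> ~230 W; 600 mm^2 -> ~329 W, well past a 250 W envelope
```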

Of course, because NV doesn't carry the compute fat, they don't need to make a 600mm^2 chip, as even a 420mm^2 Kepler would be faster than a 420mm^2 Sea Islands, because the compute transistors waste space in the AMD chip without really helping in games. This actually gives even more reason why NV can get away with a 450mm^2 or smaller GTX780 and still beat HD8970 easily.

I'm not saying they will or won't release a monster-spec GK110 part, but it is beyond idiotic to claim that nV won't release a part that shatters 250 watts. Even if we stick to just this generation, 111 watts more than a GTX 680?

No one said anything about NV not releasing "a part" - GTX780 is rumored to be a single-GPU, not a dual-GPU product. How did it work out for NV and its 250W-average, 270W-peak GTX480 card? Why would NV go back to making a card most of the industry criticized for being hot, loud and power-hungry? Why, just because BenSkywalker says NV should blow past a 250W average - because that's what he wants - would NV suddenly throw out the performance/watt advantage Kepler has?

nVidia has never given the impression they care in any way about your 250 watt power barrier. Maybe with utterly inept engineers it is some huge obstacle, but they have continued to release parts that significantly exceed the RussianSensation-deemed limit for graphics cards.

For single GPUs, NV publicly stated they wanted to focus on performance/watt in every press release, at the launch of Kepler, and in the presentation of the GTX690. JHH keeps talking a lot about performance/watt and the criticism the firm got from its customers regarding Fermi. Again, I never said 250W is a barrier NV's engineers cannot physically cross. They can cross it, and they did with GTX480, but your claim that NV is likely to do this out of the blue, after they did a 180 to focus on performance/watt instead, runs counter to their own Kepler strategy.

Even the idea of a GK114 lends more credibility to the view that NV is very much keeping performance/watt as an important consideration, since GK110 has a ton of compute features most gamers don't want, and a GTX780 based on GK110 would penalize the product on power consumption without much benefit for games.

Again, I never said NV won't beat HD8970, but you have continuously made claims that GTX780 could be 50-100% faster than HD8970, which is mathematically impossible unless NV launches some mythical 1GHz 2880 SP part.

The other problem with your projection is that it looks only at the most extreme case. When someone makes a reasonable projection, you should look at low, mid and high scenarios (or low-probability and high-probability scenarios, and average them to arrive at a mid-point). In your case, the 50-100%-faster-than-8970 claims are so far out at the extreme end that your scenario allows for no compromises. It means NV would release the fastest possible GPU and throw TDP out the window, throw noise levels out the window, throw yields out the window, throw margins out the window, etc. Also, strategically it would simply make Maxwell look bad. GTX780 will launch in 2013, most likely a year still plagued with console ports, while in 2014 we could see more next-generation games, so it would make more sense to leave more performance gains to Maxwell as a selling point, especially since it will also benefit from a lower process node. For the 780 to beat HD8970 by 50-100%, the 8970 would have to be a total failure and at the same time NV would have to drop a huge monolithic die.

I don't think the math can even work for the greater-than-50%-faster scenario, and I already explained to you before why this is.

The HD7970 GE is already faster than the 680. If the HD8970 is 15% faster than the 7970 GE, it'll be at least 20% faster than the GTX680. So just for your projection of 50% faster than the 8970 to come true, the GTX780 would need to be roughly 80% faster than the GTX680. How is that possible without a 1GHz 2880 SP, 240 TMU, 384-bit GK110? Explain.
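The compounding here is just two multiplications - a quick sketch (the ~5% lead of the 7970 GE over the GTX680 is my illustrative assumption; the 15% and 50% figures are the rumors and projections quoted above):

```python
# Relative single-GPU performance, normalized to GTX680 = 1.00.
gtx680 = 1.00
hd7970_ge = 1.05           # assumed ~5% lead over GTX680 (illustrative)
hd8970 = hd7970_ge * 1.15  # rumored +15% -> ~1.21, i.e. ~20% over GTX680

gtx780 = hd8970 * 1.50     # low end of the 50-100% faster projection
print(f"GTX780 would need to be {gtx780 / gtx680 - 1:.0%} faster than GTX680")
# -> ~81% faster, hence the 'roughly 80%' figure above
```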

With rumors that GK114 will be 15-30% faster than GTX680, your projection loses more credibility with every new piece of information, and the points I brought up earlier regarding die-size limitations, from both profitability and power consumption perspectives, are being cited in the same rumors to explain why NV won't launch GK110 as a consumer GPU.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
My only sympathy would be for the workers, as I can't stand bad management.

Yes, it does suck for them. Maybe with some luck Nvidia or Intel will pick some of them up. That's a lot of people though. Almost 2,500, was it?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I never knew that all those things you listed above had any relation to how we should actually be judging the AMD/ATI graphics product.


Actually, the thread is not about judging products but financials -- purchasing ATI was raised, so I added some past financial info from around mid-2006, when official info on the ATI purchase was offered. Not only do you feel the need to judge the product against nVidia's, but, oddly, against ATI's past products too.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
You discussed why NV waited longer to launch GTX600 series but as gamers we don't care. When technology is late, it's late and NV shouldn't get a pass from us gamers for being late and allowing AMD to overcharge gamers because they couldn't offer any competition for more than half a year.

<snip>.

Good luck with your "everyone boycott nvidia because they were late" campaign. It's really quite amusing. Let's see how that works out for you!!!

Anyway, you keep posting Q2 results as some kind of proof that things are swell in Q3??? However you think that works. The reality is that this thread is about Q3. You know, in the title. You would think that something changed between Q2 and Q3, or this thread wouldn't be here.

Moving on, the rest of your post is pretty much exactly my point. I know as a gamer you don't care about the reasons, blah blah blah. You clearly acknowledge the vastly different routes AMD and nvidia have gone lately. Now we can see how each one pays off in the end.

You know, I was just offering some deeper thought on your "nvidia is late" campaign. And of course you wouldn't want to hear it. But when you say "as gamers we don't care", it kinda makes me chuckle. You know what this thread is about, don't you? As a gamer you somehow have a vested interest in it. It's all about AMD and profits, something their investors should be greatly interested in. So when it comes to nvidia strategies that may help them actually turn a profit, you don't want to hear it. Really, the only reason I bring this in is because of your endless "nvidia is late" posts. I'm just offering some thoughts on the different strategies and how they pay off in the long run.

Anyway, clearly you're not interested in this stuff, especially when it gives a deeper rationale against your attempts to mindlessly deface. Your post indicates that, as a gamer, you do not care why nvidia might do things the way they do, so let it be. My thoughts on the matter weren't for you, obviously.

Also-
Please let's wait to see the actual results of Q3. You know, then you will have more grounds to debate on, because Q2 is not at all what any of this is about. And as bad as it was, Q3 is gonna be far worse. Q3 is the one that's concerning everyone. It's the one that will see up to 30% of AMD's workforce cut. And the true report is yet to come. So it's kinda ironic.
 

NIGELG

Senior member
Nov 4, 2009
852
31
91
No one can answer why Nvidia fans would buy this supposed 300 dollar midrange card for 500 dollars.


Anyhow, whatever happens happens. Life will go on... We'll see if AMD can survive in some form or other. I've had so many bad experiences with Nvidia cards, whereas ATi cards have been good to me from the 9800 Pro to the 5850.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
No one can answer why Nvidia fans would buy this supposed 300 dollar midrange card for 500 dollars.

Whether the GK-104 was ever intended to replace the original price points of the GF-114 or GF-104 is irrelevant; 28nm price/performance was already set.

I'm not a nit-picker or a blame-poster by nature, but I was vocal about the merely evolutionary, incremental price/performance from AMD and nVidia at 28nm, considering the node and the respective arches were substantial and significant.

Thankfully, with more choice and competition, gamers may find more 28nm value for their wallet or budget.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No one can answer why Nvidia fans would buy this supposed 300 dollar midrange card for 500 dollars.


Anyhow, whatever happens happens. Life will go on... We'll see if AMD can survive in some form or other. I've had so many bad experiences with Nvidia cards, whereas ATi cards have been good to me from the 9800 Pro to the 5850.

Where have you been? It's AMD's fault they did that. So, it only makes sense to make sure you don't buy an AMD card, and that leaves you with very little choice except to buy one of those overpriced nVidia midrange cards.