
Nvidia Q3 Financial Results


Ajay

Lifer
Jan 8, 2001
GF100 was bad, and only GF100. Its poor efficiency was the price of needing to get Teslas and Quadros out the door as fast as possible. They will need to make power efficiency a very high priority going forward, but everything since GF104 has been at least good enough, and the FUD was generally based on the expectation that everything after GF100 would be nothing but cut-down GF100.

In the conference call JHH specifically mentioned that the GPU engineers put a lot of hard work into maximizing power efficiency. On the mobile Keplers, they are seeing better thermals and performance and getting more design wins.

Personally, I doubt they'll bother with 28nm PC GPUs this year. They can already get premiums for their 40nm GPUs, and they really can't afford the PR of another late launch like that. Better to have a relaxed schedule and ship early or close to on time than an aggressive schedule and fumbles, even if AMD can radically beat them in performance per watt for several months. They need Fermi++ to get out the door perfectly, much more than they need it to get out the door quickly.

I was thinking 2012, not 2011 - so you are correct and the babble below was produced due to a random bit error in my brain :oops:

Of course they'll put out desktop Keplers this year. Since the implementation of Kepler is going so well, they will definitely want to claw back some market share from AMD. Though, NV will do that while making good profits as well (unlike ATI/AMD). Kepler seems poised to give NV a solid performance bump with lower thermals, which would keep them in the lead performance wise and negate ATI's only advantage (better thermals).
 
Last edited:

peonyu

Platinum Member
Mar 12, 2003
Fermi, like the NV30, came out late, in the spring after an AMD/ATI release in the fall that put the competition ahead. Fermi, like NV30, was large, hot, loud, and consumed a lot of power. Fermi, like NV30, was a new arch that pushed the process envelope and was also bitten by process issues. Fermi, like NV30, appears set to be validated by subsequent iterations of the arch (NV30 -> NV35 -> NV40) = (GF100 -> GF110 -> Kepler). Ultimately it sounds like Fermi's legacy could be like NV30's: it set in motion a design that gave Nvidia a long series of success stories and generated profits.


Not even close. There are similarities to the FX 5800, but the differences are huge. For one, the FX was loud, hot and... SLOW compared to the Radeon 9700 Pro. The 9700 Pro was quiet, ran cool, and was the fastest card bar none at the time.

Fermi is/was loud and hot, but it is the fastest card there is at the top bracket, so the tradeoff is a worthwhile one for Nvidia. The FX 5800 was just a disaster for Nvidia. Fermi isn't one at all.
 

Mopetar

Diamond Member
Jan 31, 2011
EXTRA CASH = Tegra 3

once again the fastest GPU guaranteed for Nvidia

From everything I've read the Tegra 3 GPU isn't overly impressive. The biggest advantage it has over other SoCs is the two additional cores and higher clock speed. We'll have a clearer picture when the Transformer Prime benchmarks come out, but I won't be surprised if the chip isn't the graphics powerhouse that it's been made out to be.
 

tviceman

Diamond Member
Mar 25, 2008
From everything I've read the Tegra 3 GPU isn't overly impressive. The biggest advantage it has over other SoCs is the two additional cores and higher clock speed. We'll have a clearer picture when the Transformer Prime benchmarks come out, but I won't be surprised if the chip isn't the graphics powerhouse that it's been made out to be.

The GPU on paper is 3x more powerful than the GPU in Tegra 2. Even if it's "only" matching the iPad 2's GPU, it's got 2 more CPU cores to work with and should be faster in just about everything. It's funny, though, that a company which makes the biggest and baddest GPUs in the world is a half step behind the competition's GPUs in the ARM space.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Fermi, like the NV30, came out late, in the spring after an AMD/ATI release in the fall that put the competition ahead. Fermi, like NV30, was large, hot, loud, and consumed a lot of power. Fermi, like NV30, was a new arch that pushed the process envelope and was also bitten by process issues. Fermi, like NV30, appears set to be validated by subsequent iterations of the arch (NV30 -> NV35 -> NV40) = (GF100 -> GF110 -> Kepler). Ultimately it sounds like Fermi's legacy could be like NV30's: it set in motion a design that gave Nvidia a long series of success stories and generated profits.

NV30 didn't beat ATI/AMD in performance... Fermi did.
A bad analogy is a bad analogy.
 

Cerb

Elite Member
Aug 26, 2000
From everything I've read the Tegra 3 GPU isn't overly impressive. The biggest advantage it has over other SoCs is the two additional cores and higher clock speed. We'll have a clearer picture when the Transformer Prime benchmarks come out, but I won't be surprised if the chip isn't the graphics powerhouse that it's been made out to be.
Even if it's close to the paper improvements, it won't be a powerhouse; not with that memory bus. It should, however, be fast enough to finally give PowerVR some serious competition. They've held the performance-per-watt crown since back when they made PC video cards, and if Tegra can get anywhere near its paper improvements (very likely), they'll finally have some real competition.
 

sontin

Diamond Member
Sep 12, 2011
The GPU on paper is 3x more powerful than the GPU in Tegra 2. Even if it's "only" matching the iPad 2's GPU, it's got 2 more CPU cores to work with and should be faster in just about everything. It's funny, though, that a company which makes the biggest and baddest GPUs in the world is a half step behind the competition's GPUs in the ARM space.

nVidia can't build huge SoCs. Apple and Samsung use their own SoCs in their own products, but nVidia must sell theirs to OEMs, so they design relatively small dies with great performance. Look at Tegra 2 and the A4: Tegra 2 is a little bit smaller but overall about 2x faster. The A5 is more than twice the size of the A4, while Tegra 3 is only 64% larger than Tegra 2 with up to 3x the GPU performance, up to 2.6x the CPU performance (1.4x one thread, 1.3x two threads, 1.95x three threads), a fifth low-power core, an improved video decoder and image processing processor, and better power management (separate clock and power gating for every CPU). Tegra 3 is a huge improvement over Tegra 2 on the same process. It would be very easy for them to bring the GPU performance up to the A5's, but that would cost them die size and power.

I think Tegra 3 is the better system-on-chip, even for gaming, because games don't depend only on GPU performance.
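The die-size-vs-performance argument above can be sanity-checked with quick arithmetic. This is just a sketch using the multipliers quoted in the post (the poster's claims, not measured data):

```python
# Rough perf-per-area comparison, using the figures quoted in the post above.
die_growth = 1.64   # Tegra 3 die claimed ~64% larger than Tegra 2
gpu_gain   = 3.0    # "up to 3x" GPU performance over Tegra 2
cpu_gain   = 2.6    # "up to 2.6x" multithreaded CPU performance

# Performance gained per unit of die area spent
gpu_per_area = gpu_gain / die_growth
cpu_per_area = cpu_gain / die_growth

print(f"GPU perf per unit area: {gpu_per_area:.2f}x Tegra 2")
print(f"CPU perf per unit area: {cpu_per_area:.2f}x Tegra 2")
```

Even in the best "up to" case, the per-area gains (~1.8x GPU, ~1.6x CPU) are much smaller than the headline multipliers, which is consistent with the post's point that matching the A5's GPU would cost die size and power.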
 

happy medium

Lifer
Jun 8, 2003
Funny, 2 years ago I saw tons of "NVIDIA IS DOOOOOMED" statements splashed all across the interwebs. Let's at least check out 28nm from both camps before making our decisions, shall we?

There is a big difference, Nvidia was still making money.
The 5xxx series was AMD's best chance and they blew it.
Their CPUs are outclassed and 3 years behind; their desktop GPUs are just about breaking even.
Their pro GPUs are crap, or at least making no money.
The notebook space will soon (15 months) be taken over by 8-core, 3-watt ARM chips with Xbox 360-like graphics, 48-hour battery life and Windows 8.

Intel, Nvidia and ARM are gonna crush the company if they don't do something now.
 
Feb 19, 2009
ARM is not magical. Please don't worship it. It's good for what it is: low-power consumer devices.

As soon as you move into real computing, it's not anywhere near x86 CPUs in performance. Can it run Windows 8? Sure; so what, any crap POS will run it.

Come back when ARM CPUs actually compete performance-wise.
 

SickBeast

Lifer
Jul 21, 2000
What is interesting is seeing nVidia make all kinds of money while AMD's graphics division only pocketed $14 million during the same period. What is AMD doing wrong? Their products are fine and they perform well. nVidia is earning boatloads of money.

Perhaps it's the workstation market. I never hear about FireGL cards any more.
 
Feb 19, 2009
NV's HPC earned a huge amount of money with 1000% margins. You can put two and two together and figure out that most of their "profit" this quarter is purely from that segment.

Net income (profit) ~$200M.

HPC revenue ~ $230M.

Their professional cards sell for $3000 - $5000 ea. You can do the maths. :D

This is where AMD is screwing up with no competition.
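The "do the maths" invitation in the post above can be sketched out explicitly. All figures are the poster's round estimates, not audited financials:

```python
# Back-of-the-envelope from the figures quoted in the post above.
hpc_revenue = 230e6            # ~$230M quarterly HPC/professional revenue
net_income  = 200e6            # ~$200M quarterly net income
price_low, price_high = 3000, 5000   # quoted per-card price range

# Implied unit volume at each end of the price range
units_min = hpc_revenue / price_high
units_max = hpc_revenue / price_low

print(f"Implied units shipped: {units_min:,.0f} to {units_max:,.0f} cards")
print(f"HPC revenue vs. net income: {hpc_revenue / net_income:.0%}")
```

On these numbers, a few tens of thousands of professional cards per quarter would be enough to exceed the entire quarter's net income, which is the poster's point about where the profit comes from (note that revenue is not the same as profit, so this is an upper bound).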
 
Last edited:

SickBeast

Lifer
Jul 21, 2000
Interesting.

AMD has always been able to engineer incredible hardware. Software has always been their weakness.

I'm surprised people are willing to pay so much for HPC. I guess for certain tasks the GPU really is the ultimate piece of hardware. I would think AMD could compete in this realm as well. In many cases their hardware outperforms nVidia's in GPGPU stuff.
 

Cerb

Elite Member
Aug 26, 2000
Their professional cards sell for $3000 - $5000 ea. You can do the maths. :D
No, they start at just under $200. But, that $200 one is equivalent to a $30-50 consumer card, and there's no way it's costing them much more, even counting in separate and added software support.
 

Cerb

Elite Member
Aug 26, 2000
Interesting.

AMD has always been able to engineer incredible hardware. Software has always been their weakness.

I'm surprised people are willing to pay so much for HPC. I guess for certain tasks the GPU really is the ultimate piece of hardware. I would think AMD could compete in this realm as well. In many cases their hardware outperforms nVidia's in GPGPU stuff.
Nothing is ever fast enough, and HPC typically involves problems that are too much for any one physical computer to be able to even theoretically work on. GPUs are not necessarily perfect, but are one upcoming option, and can be insanely efficient, compared to CPUs, if and only if data locality can be guaranteed. Fujitsu isn't exactly shaking in their boots over nVidia's progress. Energy companies, for instance, are a historically big user of clustered computers for large sets of calculations--it's not like they are spending money and not expecting a return on it.

Finally, correct results have been worth paying for, for decades, and AMD does not have the GPU hardware to come close to that on a large scale (racks of GPUs), even if you completely discount the software support advantages that nVidia has over AMD. People have been paying for hardware with high-quality validation and a good ability to detect faults since the 70s, and they've been paying a ton extra for x86 CPUs that do that since the late 90s.
 
Last edited:

Nemesis 1

Lifer
Dec 30, 2006
So how does the money NV receives from Intel for the license agreement figure into this? I failed to see it written in the above article. $1.5 billion over five years is a lot of money: $300,000,000 a year. I failed to see that amount in the above. That would be $75,000,000 a quarter.
 

SickBeast

Lifer
Jul 21, 2000
Besides the X2 5000+ Black Edition and the 5850 GPU, I can't think of any other "incredible" hardware from AMD.
Llano seems like it might be a winner.
All of AMD's GPUs are competitive with nVidia.

I could make a list of all of AMD's incredible hardware over the years, but I don't have time. They had a really good run in terms of CPUs and GPUs over the years. Right now they're not competing at the high end, but that doesn't mean that they can't. They're choosing not to.
 

ocre

Golden Member
Dec 26, 2008
So how does the money NV receives from Intel for the license agreement figure into this? I failed to see it written in the above article. $1.5 billion over five years is a lot of money: $300,000,000 a year. I failed to see that amount in the above. That would be $75,000,000 a quarter.

A good breakdown:

http://www.xbitlabs.com/news/graphi...oducts_Lift_Nvidia_s_Revenue_and_Profits.html

Quote,
"Nvidia's GeForce business – which includes desktop and notebook GPUs, memory, chipset products and license revenue from the company's patent cross license agreement with Intel – was up 1.0% compared with the previous quarter, at $644.8 million as chipset revenue declined $47.3 million to $22.4 million in the third quarter. In overall, the situation is not bad and the company manages to offset declines in chipset revenues with sales of discrete GPUs and other related products."

So it's supposedly factored in here. It comes to $75 million a quarter if you divide it up over the 5 years. This agreement was actually based on Intel licensing the rights to GeForce technologies.
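The division discussed above is worth writing out, since it's easy to slip on. A minimal sketch, using only the $1.5B / 5-year figure from the posts:

```python
# Intel cross-license payment arithmetic from the posts above.
total_payment = 1.5e9   # $1.5 billion over the life of the agreement
years = 5

annual    = total_payment / years   # per-year license revenue
quarterly = annual / 4              # per-quarter license revenue

print(f"Annual:    ${annual / 1e6:.0f}M")
print(f"Quarterly: ${quarterly / 1e6:.0f}M")
```

That gives $300M a year and exactly $75M a quarter, a small but steady contribution next to the ~$644.8M quarterly GeForce revenue quoted in the xbitlabs article.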


NV's HPC earned a huge amount of money with 1000% margins. You can put two and two together and figure out that most of their "profit" this quarter is purely from that segment.

Net income (profit) ~$200M.

HPC revenue ~ $230M.

Their professional cards sell for $3000 - $5000 ea. You can do the maths. :D

This is where AMD is screwing up with no competition.

Nvidia puts tons of the cash from this segment back into it. It's not just pure profit; they have to work very hard to make it happen. This is a market they worked for. The margins are good, but a lot goes into the large sales that make up the majority of the revenue. A lot of cash to make it happen.

When it's all figured in, I'm pretty sure that about 1/3 of their profits come from the professional markets. That's the ratio I believe they still hold to.

Nvidia spends a heck of a lot of their revenue pushing into these new markets. They have tons of projects going on all at once; some of the results we see today, some are yet to be seen. They are branching out in many directions and have no debt. They are managing extremely well, and it all stemmed from one source of cash. Lots of their cash goes into the push into these other markets, and they are doing all this while still making cash to spare. All things considered, it's a hell of a task.
 
Last edited:

ocre

Golden Member
Dec 26, 2008
All of AMD's GPUs are competitive with nVidia.

I could make a list of all of AMD's incredible hardware over the years, but I don't have time. They had a really good run in terms of CPUs and GPUs over the years. Right now they're not competing at the high end, but that doesn't mean that they can't. They're choosing not to.

LOL! I am 100% sure they aren't competing in the high end just because they choose not to. That's such a lame thing to say. Bulldozer is not a chip they made just to compete in the mid-range. If AMD could compete in the high end, they very well would. Period!

As far as GPUs go, they have a better chance to stay competitive in the high end if they don't blow it. Nvidia is pushing very hard to survive; AMD must also. The high end is more important than you might think. There is no doubt that AMD wanted BD to bulldoze in Intel's direction. They would love to be on top.

Besides, what was all this 6990 stuff from AMD if they didn't want the high end?
 

SickBeast

Lifer
Jul 21, 2000
AMD deliberately went with a small-die design approach focused on efficiency; this is a fact. In the process, they were not far off at all from competing at the high end of the market. If they can actually get the 7000 series out the door ASAP, we may witness a changing of the guard and the dethroning of nVidia.
 

NIGELG

Senior member
Nov 4, 2009
AMD deliberately went with a small-die design approach focused on efficiency; this is a fact. In the process, they were not far off at all from competing at the high end of the market. If they can actually get the 7000 series out the door ASAP, we may witness a changing of the guard and the dethroning of nVidia.
I hope to see competition always... but I doubt there will be any 'dethroning'. AMD has a lot of work to do...
 

SickBeast

Lifer
Jul 21, 2000
Even if there is a dethroning it will be very brief in all likelihood seeing as nVidia is trying to hammer Kepler out the door.

AMD wins in performance-per-watt right now, so in essence they are already "winning" as we speak in certain respects.
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
There is a big difference, Nvidia was still making money.
The 5xxx series was AMD's best chance and they blew it.
Their CPUs are outclassed and 3 years behind; their desktop GPUs are just about breaking even.
Their pro GPUs are crap, or at least making no money.
The notebook space will soon (15 months) be taken over by 8-core, 3-watt ARM chips with Xbox 360-like graphics, 48-hour battery life and Windows 8.

Intel, Nvidia and ARM are gonna crush the company if they don't do something now.

Oh please. ARM is not going to match x86. AMD is already in the laptop space with the various Fusion chips. The higher-end ones (Llano) are already as good as or better than the 360, and the less expensive ones (Zacate and Ontario) are better than current ARM chips. In 15 months ARM may have improved, but AMD inevitably will have as well. Since Llano proved a success for AMD and they've already got newer CPU architectures, a follow-up is all but inevitable. ARM might have the edge in TDP -- emphasis on might, as the 40nm dual-core C-50 chip is already down to a 9 W TDP -- but they won't beat x86 in processing power in the foreseeable future.


What is interesting is seeing nVidia make all kinds of money while AMD's graphics division only pocketed $14 million during the same period. What is AMD doing wrong? Their products are fine and they perform well. nVidia is earning boatloads of money.

Perhaps it's the workstation market. I never hear about FireGL cards any more.

The GPU department put a lot of research dollars into Fusion while the revenue technically all went to the CPU department, basically.
 

SickBeast

Lifer
Jul 21, 2000
ARM is good enough for tablets and perhaps netbooks. They aren't there yet in terms of performance for an actual PC. They might want a proper office suite and some serious apps before they try to go down that road.

Are they allowed to emulate x86? I'm pretty sure I've read somewhere that emulation is legal.
 
Last edited: