
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


sontin

Diamond Member
Sep 12, 2011
3,273
149
106
They're not losing money with Tegra. Gross margins are 50%, much higher than on their OEM GeForce products.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Idontcare said:
The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products the game will be over for them as their amortized costs will spiral out of control as the volumes sold get lower and lower.

Discrete GPU's are evolving in order to stay relevant and to avoid becoming commoditized. In the future, GPU's will have CPU cores integrated on die (among other things). And the R&D investments made for GPU's will be leveraged across all lines of business, including mobile.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76

What's there? The $900MM isn't for GPU only as you stated, but for GPU, software, VLSI, content and process R&D. Of those, Tegra benefits from the software, VLSI, content and process R&D, on top of the $300MM exclusively dedicated to Tegra.

When a product is responsible for less than 20% of your revenues, has negative operating margins, eats 25% of your R&D budget and benefits from a sizable portion of the other 75%, that is where you are betting the farm. And of course Nvidia is spending record amounts on R&D. They need to break into other markets, and they have to pay for that.

But... you have to be very careful when reading Nvidia investor material. Reading Nvidia material is the business equivalent of racing against Dick Dastardly.

Have a look at slide 15. Did you notice the nice CAGR number? Nice, isn't it? Did you notice where the higher numbers come from? Yes, not GPU. But this is just the beginning. Did you notice that they chose FY10 (2009) as the baseline? Yes, that is the first shenanigan here, because it was a very bad year for the GPU business, far worse than 2007 or 2008. Put those two together and you may get close to 0% CAGR.

But there is more. Still on slide 15, what about that CAGR number for GeForce Gamer and GeForce OEM? Sounds like a healthy market, doesn't it? But once you cross-reference the numbers with Nvidia's overall market share and the overall market size, you can see how they get those nice numbers. Nvidia is eating AMD's market share, as you can see in slide 19.

On the same slide 19 they show us something new. They mention a 12% CAGR for their GPU business and the $3.2 billion in revenues. This $3.2 billion in GPU revenues includes $300 million from the Intel settlement, and that revenue is practically all net. Purge it and the CAGR becomes far smaller.

Once we purge these revenues we can verify that the overall market has slightly shrunk since 2009. Purge the revenues from the PSB business and you can see that the GeForce business shrank even more.
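The baseline sensitivity is easy to check with a quick sketch. All the revenue figures below are hypothetical placeholders; only the $3.2B end point and the ~$300MM settlement come from the slides being discussed:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two revenue figures."""
    return (end / start) ** (1 / years) - 1

# Hypothetical GPU revenues in $B against an end-point revenue of $3.2B.
# A weak FY10 baseline flatters the growth rate:
print(f"vs weak FY10 base: {cagr(2.3, 3.2, 3):.1%}")       # ~11.6%
print(f"vs stronger FY08 base: {cagr(3.1, 3.2, 5):.1%}")   # ~0.6%

# Purging a ~$0.3B settlement from the end-point revenue:
print(f"FY10 base, settlement purged: {cagr(2.3, 2.9, 3):.1%}")  # ~8.0%
```

The same end point read against a trough baseline yields double-digit growth; against a normal year it's roughly flat, which is the whole trick.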

As I said, the downward trend in the dGPU market is there for everyone to see. Nvidia management is doing the right thing in running away from GPU. The market is shrinking, and the attack it is going to face from Intel and AMD isn't something they can withstand.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
They're not losing money with Tegra. Gross margins are 50%, much higher than on their OEM GeForce products.

Gross margins don't mean anything alone. They could have 99% gross margins and still lose money. In Tegra's case, they didn't disclose gross margins (and I doubt they are over 50%), but they lost $157MM on this segment in 2012. Better than the $200MM lost in 2011, but it is still a loss. They didn't say anything about breaking even on Tegra, meaning that we'll have to get used to red ink when talking about Tegra.
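To put numbers on the "gross margins don't mean anything alone" point: everything below is hypothetical except the reported ~$157MM operating loss.

```python
# Illustrative segment P&L in $MM (figures are hypothetical; only the
# ~$157MM operating loss matches what the filings reported).
revenue = 750.0
gross_margin = 0.30                      # sub-50%, as speculated above
gross_profit = revenue * gross_margin    # 225.0
opex = 382.0                             # R&D + SG&A charged to the segment
operating_income = gross_profit - opex   # -157.0

print(f"operating income: {operating_income:+.0f}MM")  # -157MM
```

A segment can clear a healthy gross margin on every unit and still bleed money once operating expenses are charged against it.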
 

ams23

Senior member
Feb 18, 2013
907
0
0
mrmt said:
As I said, the downward trend in the dGPU market is there for everyone to see. Nvidia management is doing the right thing in running away from GPU. The market is shrinking, and the attack it is going to face from Intel and AMD isn't something they can support.

I don't think you get the big picture here. In the future, what is currently known as a discrete GPU will be a much more integrated device (including but not limited to having the CPU integrated on GPU die). At some point in the future, each and every product that NVIDIA makes will be a Tegra variant.
 

NTMBK

Lifer
Nov 14, 2011
10,455
5,842
136
I don't think you get the big picture here. In the future, what is currently known as a discrete GPU will be a much more integrated device (including but not limited to having the CPU integrated on GPU die). At some point in the future, each and every product that NVIDIA makes will be a Tegra variant.

Sooo... how on earth are PC games, running on x86, going to be able to make use of those ARM cores? :confused:
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I don't think you get the big picture here. In the future, what is currently known as a discrete GPU will be a much more integrated device (including but not limited to having the CPU integrated on GPU die). At some point in the future, each and every product that NVIDIA makes will be a Tegra variant.

This is where everyone is heading, no? Everyone is heading to SoC, not fragmented components anymore. Nvidia is just one more SoC designer, and one far smaller than Intel, Samsung, Qualcomm or even TI for that matter.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Sooo... how on earth are PC games, running on x86, going to be able to make use of those ARM cores? :confused:

As I mentioned earlier, Android is winning the OS war so Nvidia still has a chance.

Whether or not they can develop a strong enough CPU to drive enthusiast gaming is another story.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
GT3e doesn't come close to the PS4 though. Even if Intel makes yearly doubling of their graphics power, it would take 3 years before I'd expect any Intel IGP to deliver reasonably comparable visuals.

The first sentence is true, but it's not as far away as you think. Haswell GT3e running at 1.3GHz has 832 GFLOPS of compute, compared to 1.84 TFLOPS for the PS4. Kaveri should have 922 GFLOPS at 900MHz.
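Those peak figures are just shaders × FLOPs per clock × clock. A quick sanity check; the 16 FLOPs/clock per Haswell EU and the 512-shader/900MHz Kaveri configuration are assumptions based on what was being reported at the time:

```python
def peak_gflops(units, flops_per_clock, mhz):
    """Theoretical peak single-precision throughput in GFLOPS."""
    return units * flops_per_clock * mhz / 1000.0

# Haswell GT3e: 40 EUs, each with two 4-wide FMA pipes = 16 FLOPs/clock
print(peak_gflops(40, 16, 1300))   # 832.0
# PS4: 1152 GCN shaders, FMA = 2 FLOPs/clock, 800 MHz
print(peak_gflops(1152, 2, 800))   # 1843.2
# Kaveri (rumored config): 512 shaders at 900 MHz
print(peak_gflops(512, 2, 900))    # 921.6
```

These are marketing-peak numbers, of course; sustained throughput depends on bandwidth and occupancy, which is where the Crystalwell eDRAM matters.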
 

ams23

Senior member
Feb 18, 2013
907
0
0
mrmt said:
They didn't say anything about breaking even on Tegra, meaning that we'll have to get used to red ink when talking about Tegra.

Actually NVIDIA did recently say that Tegra is essentially a profitable business because they can heavily leverage their latest GPU R&D investments. The incremental R&D investment for Tegra is about $300 million for fiscal yr 2014 IIRC. Since Tegra revenues are projected to be well north of that, and depending on how expenses are allocated across all lines of business, Tegra could actually end up being a positive contributor to the bottom line.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Sooo... how on earth are PC games, running on x86, going to be able to make use of those ARM cores? :confused:

The better question is: why would PC games of the future not make use of ARM CPU cores, considering ARM's prevalence in the mobile market today?
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Actually NVIDIA did recently say that Tegra is essentially a profitable business because they can heavily leverage their latest GPU R&D investments. The incremental R&D investment for Tegra is about $300 million for fiscal yr 2014 IIRC. Since Tegra revenues are projected to be well north of that, and depending on how expenses are allocated across all lines of business, Tegra could actually end up to be a positive contributer to the bottom line.

If someone at Nvidia said that, I'll consider it yet another Dick Dastardly moment of theirs, because what they reported in their SEC filings is a $200MM loss in 2011 and a $157MM loss in 2012, and given that the Tegra business is cash intensive, it should be a cash-negative business too.

And if we are to believe their investor slides, Tegra should be losing a lot more money, because a lot of their "core" R&D spending benefits the Tegra lineup and the GPU business ends up funding the bill. It is Tegra that is getting a free lunch, not the other way around.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The better question is: why would PC games of the future not make use of ARM CPU cores, considering ARM's prevalence in the mobile market today?

Because they are 10 years behind x86 in performance :)
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
The better question is: why would PC games of the future not make use of ARM CPU cores, considering ARM's prevalence in the mobile market today?

The problem is that we are many years away from ARM cores powerful enough to run near current enthusiast standards. Nvidia might not survive long enough to see it. ARM and Android could supplant Wintel eventually, but it's more likely to be under Qualcomm's and Samsung's leadership than Nvidia's.
 

ams23

Senior member
Feb 18, 2013
907
0
0
This is where everyone is heading, no? Everyone is heading to SoC, not fragmented components anymore. Nvidia is just one more SoC designer, and one far smaller than Intel, Samsung, Qualcomm or even TI for that matter.

They are just "one more SoC designer" that has an incredible amount of expertise in graphics and GPU computing at a time when the GPU is becoming more and more important in SoC designs.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Because they are 10 years behind x86 in performance :)

We don't know how far behind (or otherwise) ARM is at the desktop level, as they don't design CPUs for that market. Given 100W or so they certainly wouldn't be 10 years behind. Somebody needs to develop one first.
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
As I mentioned earlier, Android is winning the OS war so Nvidia still has a chance. Whether or not they can develop a strong enough CPU to drive enthusiast gaming is another story.

You give too much hope to NVIDIA. By the time NVIDIA pulls off their custom 64-bit SoC, Apple and Qualcomm will already have theirs out. Heck, isn't a 64-bit Snapdragon already expected in 2014? And Samsung pretty much uses stock ARM designs, so those will be out even sooner.

Hey, I wanna see what NVIDIA comes up with, but they aren't near any of the other guys.

Because they are 10 years behind x86 in performance

And does that matter to the $2.2 billion in app store sales this first quarter?

They are just "one more SoC designer" that has an incredible amount of expertise in graphics and GPU computing at a time when the GPU is becoming more and more important in SoC designs.

Qualcomm has many years of GPU expertise, since they bought part of ATI years ago.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
nVidia offsets the missing low end with Tegra. Not only that, they have much higher margins than with a low-end card.

Revenue != profit. Revenue and profit are two totally different things; ideally one creates the other. Unfortunately, that just isn't the case for Tegra. As has been mentioned before, the Tegra division has only produced net losses quarter after quarter, with no profit yet. This is plainly visible if you read nvidia's balance sheets and income statements (among other quarterly documents) in their entirety, specifically the divisional breakdowns between the Tegra division and the GeForce consumer division.

I understand Tegra is a long-term proposition, but your wording seems to indicate that nvidia is making beaucoup money from Tegra, when the opposite is the case. Since you seemingly spend your entire day studying all matters related to nvidia, might I suggest learning how to interpret a quarterly income statement. Understanding these things better would serve you well. I assume you'd like to argue with facts instead of spreading meaningless drivel about how x86 is doomed and Tegra 4 is going to be the only relevant chip in 2014.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Because they are 10 years behind x86 in performance :)

LOL, right...at a fraction of the power, die size, and price too. Anyway, we should revisit in one or two years when new 64-bit custom ARM processors come to market that actually attempt to reach beyond ultra low TDP's.
 

ams23

Senior member
Feb 18, 2013
907
0
0
mrmt said:
If someone at Nvidia said that, I'll consider it yet another Dick Dastardly moment of theirs, because what they reported in their SEC filings is a $200MM loss in 2011 and a $157MM loss in 2012, and given that the Tegra business is cash intensive, it should be a cash-negative business too.

You don't get it. The reported loss was due in part to the way company-wide expenses were allocated across lines of business, and also due in part to expenses incurred on new products (such as Icera i500 modem, T4i, etc.) that have yet to achieve revenue. The important point to note is that incremental R&D expense in Tegra is $300 million for FY2014, and Tegra as a whole (including auto, embedded, tablets, smartphones) brings in much more revenue than that, and more than enough to offset expenses for FY2014.
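The disagreement here is really incremental vs. fully allocated costing. A sketch with hypothetical figures; only the ~$300MM incremental R&D number comes from the discussion above:

```python
# Hypothetical Tegra FY2014 figures in $MM; only the ~$300MM
# incremental R&D number is from the discussion above.
revenue = 600.0
cost_of_goods = 250.0
incremental_rnd = 300.0    # R&D spent only because Tegra exists
allocated_shared = 200.0   # charged share of company-wide GPU/VLSI/software R&D

# Incremental view: Tegra covers the costs it alone causes.
incremental_profit = revenue - cost_of_goods - incremental_rnd   # +50
# Fully allocated view: Tegra also carries shared R&D, and runs red.
allocated_profit = incremental_profit - allocated_shared         # -150

print(incremental_profit, allocated_profit)
```

The same segment can look modestly profitable or deeply loss-making depending purely on how the shared R&D line is allocated, which is exactly what the two sides of this thread are arguing about.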
 

NTMBK

Lifer
Nov 14, 2011
10,455
5,842
136
The better question is: why would PC games of the future not make use of ARM CPU cores, considering ARM's prevalence in the mobile market today?

Because you lose the decades of backwards compatibility that x86 and Windows gives you. Because PC gaming is entrenched in x86 and growing rapidly, and it would take a massive incentive to move it from there.
 
Aug 11, 2008
10,451
642
126
Because you lose the decades of backwards compatibility that x86 and Windows gives you. Because PC gaming is entrenched in x86 and growing rapidly, and it would take a massive incentive to move it from there.

Not to mention that almost every business, manufacturing, educational, research, health care, etc, etc, institution runs on x86.


People that say android is overtaking windows are only looking at part of the picture--the consumer space which is obsessed with content consumption devices and little tablet toys.

When there is real work to be done, call in x86 and Windows. Could that change, of course, but I think it is a long ways off.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Low-end sub-$100 discrete GPUs are losing ground. Cards like the HD 6450 and GT 210/220 are not selling millions of units like they used to. That's the reason the dGPU market is shrinking.

So, more PCs may have been shipped, but the majority of those low-end PCs now use iGPUs and not the low-end sub-$100 dGPUs as before.

Thanks, you just proved my point.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
You don't get it. The reported loss was due in part to the way company-wide expenses were allocated across lines of business, and also due in part to expenses incurred on new products (such as Icera i500 modem, T4i, etc.) that have yet to achieve revenue. The important point to note is that incremental R&D expense in Tegra is $300 million for FY2014, and Tegra as a whole (including auto, embedded, tablets, smartphones) brings in much more revenue than that, and more than enough to offset expenses for FY2014.

I don't think you realize how absurd what you are saying is.

Intel's CPU business isn't reporting losses because Haswell or Broadwell aren't generating revenues yet, and the same goes for Nvidia's GPU business: they aren't reporting losses because Maxwell isn't generating revenues. This is to be expected, no? You fund R&D for future products with revenues from current products.

And you can clearly see that in Nvidia's financial statements. Nvidia didn't report the $157MM operating loss for every business except GPUs, but for the Tegra processor business only, meaning that only what is directly related to Tegra is included as an operating expense in that number.

With the data they provided you can also get a glimpse of their sub-30% gross margins in the Tegra business... but let's leave that for another thread.