Nvidia touts record design wins for Sandy Bridge PCs


StrangerGuy

Diamond Member
May 9, 2004
What use does an IGP embedded in the CPU have for the "enthusiast" when all it's going to do is create more heat?

Never heard of switchable graphics? Having the IGP actually reduces your idle/light-load power consumption dramatically. If there's anything Intel IGPs excel at, it will be their sheer minimal power draw. You don't want your "insert dedicated mobile GPU" idling away for web surfing or office work and stuff; it kills your battery life really quickly.
 

Red Storm

Lifer
Oct 2, 2005
What use does an IGP embedded in the CPU have for the "enthusiast" when all it's going to do is create more heat?

You know how a certain graphics company is telling everyone to use a 2nd mid-range card for "physics" calculations to enhance graphics? And how video cards are slowly being utilized more and more to perform certain duties much faster than a CPU ever could? Well, in a few years you'll have that mid-range number-crunching chip embedded into the CPU itself. It's tough to take a proprietary solution that only half the market has access to and make it mainstream. But soon enough, everyone who buys a new CPU is going to get a graphics chip packaged along with it. That is (hopefully) how things like physics and other goodies will become more accessible, and as a result, more utilized by developers.

You have to look beyond the performance figures of the first gen APUs and think about where it will take us down the road. I personally can't wait. :)
 

Meghan54

Lifer
Oct 18, 2009
More heat+5450 graphic power? No thanks.

What use does an IGP embedded in the CPU have for the "enthusiast" when all it's going to do is create more heat?

What you're oblivious to in all this is that the vast majority of "wins" Nvidia is claiming are in notebooks, and those are relegated to the upper end....desktop replacement/gaming units.

Intel, on the other hand, is focusing the upcoming SB's IGP towards the lower end.....a market segment much, much larger than the 1% enthusiast market you seem so concerned about.

Intel's IGP coming with SB is going to shove Nvidia right out of low-to-mid-range notebooks and possibly desktops. And in those cases, esp. with notebooks, LESS heat will be produced, since the IGP is integrated into the CPU, leaving only one significant heat source to be cooled instead of two separate and significant heat sources.....a CPU and a discrete video processor. So, a win-win situation for Intel....it's easy for the OEM to include the video capability most buyers need (playing videos, games on social networking sites, email, etc.)....and cheaper, simpler cooling solutions can be designed.

At the upper end, there will always be demand for discrete GPUs....that won't change. But then, the IGP in the SB CPU can be disabled, so there's no additional heat.
 

Rayb

Member
Dec 31, 2008
Shame really. Would be nice if you could use a discrete GPU and the IGP concurrently. Maybe such will be the case with fusion...

What would be the point? If you need a much better GPU, choose one according to your needs. Keep in mind, we're talking about notebooks for road warriors; maximizing battery life is the objective here. This technology is readily available today, not if and when Fusion is ready.
 

Gloomy

Golden Member
Oct 12, 2010
What would be the point? If you need a much better GPU, choose one according to your needs. Keep in mind, we're talking about notebooks for road warriors; maximizing battery life is the objective here. This technology is readily available today, not if and when Fusion is ready.

Crossfire between IGP and discrete on a notebook would be pretty great.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
What would be the point? If you need a much better GPU, choose one according to your needs. Keep in mind, we're talking about notebooks for road warriors; maximizing battery life is the objective here. This technology is readily available today, not if and when Fusion is ready.

It's just a waste, having GPU power available that has to go dormant. Both gaming-on-the-go and desktop users could benefit, even if only by a small margin in its current state.
 

Wreckage

Banned
Jul 1, 2005
I think just like "Fusion" people were expecting way too much out of Sandy Bridge. Sure it's the "next generation" of integrated graphics, but it's still just as far behind the next generation of discrete graphics as it was last round.

It's just a cheaper way to make a CPU+GPU, nothing more. Anyone who thought otherwise either did no research or did a lot of wishful thinking. This is why these companies are throwing tons of cash at NVIDIA.
 

cotak13

Member
Nov 10, 2010
I think just like "Fusion" people were expecting way too much out of Sandy Bridge. Sure it's the "next generation" of integrated graphics, but it's still just as far behind the next generation of discrete graphics as it was last round.

It's just a cheaper way to make a CPU+GPU, nothing more. Anyone who thought otherwise either did no research or did a lot of wishful thinking. This is why these companies are throwing tons of cash at NVIDIA.

First, design wins do not mean assured cash. Nvidia only makes money for each device sold. No one's thrown money at Nvidia. That would be one messed-up universe, where the customer pays the vendor for the parts plus pays for access in the first place. No, in electronics the customer gets free samples, cheap subsidized or free tools, and free support. I speak from experience.

A bit of background info on CPU/GPU integration:

Fusion is not just about sticking a CPU and GPU onto the same die to save costs. If it were that simple, Fusion would have been quick to market. I mean, if you buy a newly produced Xbox, you already have that. AMD wouldn't have had to buy ATI; they would just have had to partner.

What is Fusion really about? Heterogeneous computing.

Let me quote some stuff from AMD:

"To really answer the question “will discrete GPUs die out?” we need to look at the quantum level. Despite having the power budget for nearly unbounded performance, one of the bottlenecks for discrete GPUs is PCI Express, the interface to the system. In the case of graphics workloads, normally PCI Express does not present a constant bottleneck, however, everyone and their dog has heard of parallel computing. This is the case where there is a lot of traffic between the discrete GPU, CPU and main memory, and PCI Express becomes a liability in terms of bandwidth and latency.

Many people do not realize this, but the “life” of a CPU is dreadful and boring. It exists primarily waiting for data. The use of discrete GPUs for parallel computing through PCI Express will not improve the “quality of life” of the CPU. While some discrete GPUs will offer graphics and pure FLOP performance over an APU, the performance will be limited, in some cases, by the PCI Express interconnect.

This is where AMD Fusion APUs will shine. AMD Fusion APUs have not only been designed to offer great graphics performance, they also have been designed to offer great parallel compute performance. The fact that the CPU core resides next to the GPU core connected by a bus of mere nanometers, helps diminish the bandwidth and latency issues presented to parallel computing on a PCIE bus.

The design plan for successive generation of AMD APUs includes architectural innovation, as well as tighter and faster interconnects between the CPU cores and the GPU cores. One goal is to advance the parallel compute capabilities without sacrificing x86 and graphics performance."

If you've ever actually used GPGPU, you'll know that while it's powerful, it's limited and difficult to program for. Done badly, it's no better than using a powerful CPU. Particularly problematic are things that CPUs do very well, like looping and branching. Also, as quoted above, moving data between system memory and the GPU is currently limited by PCIe. That, among various other reasons, is why Fusion is a useful idea for moving GPGPU from something niche, useful for very specific applications (and requiring niche programming skills to draw the best out of), into something more general purpose.
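The PCIe transfer bottleneck described above is easy to see with back-of-the-envelope arithmetic. Here's a rough sketch; the bandwidth and FLOP figures are illustrative assumptions for roughly 2010-era hardware, not measurements:

```python
# Back-of-the-envelope model: when does PCIe transfer time dominate
# GPU compute time? All figures below are illustrative assumptions.

PCIE_BW = 8e9       # effective one-way PCIe 2.0 x16 bandwidth, ~8 GB/s
GPU_FLOPS = 1e12    # rough peak throughput of a fast discrete GPU, FLOP/s

def offload_worthwhile(bytes_moved, flops_needed):
    """True when GPU compute time exceeds PCIe transfer time,
    i.e. the offload is not purely transfer-bound."""
    transfer_s = bytes_moved / PCIE_BW
    compute_s = flops_needed / GPU_FLOPS
    return compute_s > transfer_s

# Element-wise add of two 100M-float vectors: 3 arrays cross the bus,
# but only 1 FLOP per element is done on the GPU.
n = 100_000_000
print(offload_worthwhile(3 * 4 * n, n))             # False: transfer-bound

# 4096x4096 dense matmul: O(n^3) FLOPs against O(n^2) bytes moved.
m = 4096
print(offload_worthwhile(3 * 4 * m * m, 2 * m**3))  # True: compute-bound
```

The pattern matches cotak13's point: workloads that do little arithmetic per byte moved gain nothing from a discrete GPU behind PCIe, which is exactly the class of code an on-die GPU with a nanometer-scale interconnect could serve.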

I have an entire research seminar presentation open right now, done by Altera, where they investigated GPGPU vs. ASICs vs. CPUs. The conclusion is that, wow, GPGPU is powerful, but it needs TLC when it comes to coding and how you present the data for computation. That is what Fusion is trying to solve. It's not as simple as putting a CPU and GPU together to save cost. It's about combining the best of the GPU and CPU to get a chip that can do well at things CPUs excel at and things that GPUs or SIMD processors are good at.
 

Obsoleet

Platinum Member
Oct 2, 2007
I've been waiting a while now for a SB laptop. The performance of SB is fine, and it will probably be the first laptop that doesn't turn into a fireball while gaming or drain the battery an exorbitant amount (it will still need to be plugged in, I'm sure), while having all the performance a laptop could possibly need. At least in my book, it's a laptop, for light gaming at most. I'm willing to bet SB itself gets plenty hot without tossing in some hodgepodge GPU.

As soon as I can get my hands on Sandy Bridge, I'm ordering one, and there is no way that I want an Nvidia GPU messing it up, raising the price, or being near it. No "design win" for me.
 

happy medium

Lifer
Jun 8, 2003
The performance of SB is fine,

For who, you, and for what purposes?

and will probably be the first laptop that doesn't turn into a fireball while gaming,

For what games? I'm sure most laptops won't turn into a fireball with the games they're meant to play.

or drain the battery an exorbitant amount (will still need plugged in I'm sure),

If it will still need a plug, what's the difference? Optimus saves battery life.

while having all the performance a laptop could possibly need.

Says who?
it's a laptop, for light gaming at most.
You got one..:)

I'm willing to bet SB itself gets plenty hot
Contradict yourself much?

there is no way that I want an Nvidia GPU messing it up,

How is it gonna mess it up?

raising the price

You know the prices already? Please share.

No "design win" for me
Unless you own Nvidia stock.
 

Dadofamunky

Platinum Member
Jan 4, 2005
Just saying, most pre-configured desktops with discrete graphics don't have high-end GPUs in them. I doubt that'll change with Sandy Bridge.

What seems likely is that Intel may start to strangle sales of lower-end discrete GPUs - which neither Nvidia nor AMD wants, to be sure. Particularly on the mobile side, where a lot of their bills get paid.

That's kinda scary from the standpoint of wanting SOME differentiation and competition in the market. However, the GPU folks still have some ammo. But at the least, this may hurt their margins.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
You'd have a hard time finding discrete graphics in most laptops nowadays anyhow. Intel's new IGPs are great news, because instead of having to choose between mediocre processors (AMD) or flat-out terrible GPUs (Intel), 2011 is going to be the year of pretty good all around, regardless of which team you're rooting for.

Discrete graphics are going to become more prevalent in laptops as they take over more and more market share from desktops, and hence are required to do the same things people's desktops did prior to this sea change.
 

Obsoleet

Platinum Member
Oct 2, 2007
For who, you, and for what purposes?
Nah I was thinking of getting President Obama one next Christmas. It's for running military operations out of the White House. I'm glad you asked. Maybe he'll do some light gaming, and everything else you'd use a laptop for ya derp.
For what games? I'm sure most laptops won't turn into a fireball with the games they're meant to play.
Maybe some TF and LoL. Haven't really decided yet! I'll let you know right before installing. I also play the drums, read the book World War Z, and enjoy long walks on the beach. Anything else you're curious about?
If it will still need a plug, what's the difference? Optimus saves battery life.
Long-term gameplay will always require being plugged in. There's a difference, in battery life. You're a quick one.
Says who?
You got one..:)
Contradict yourself much?
Isn't there a minimum maturity level to post here? SB will probably get plenty warm by itself. I can't imagine some ungodly NV GPU hobbled onto it. No thanks. Believe that I'm not the only one with this viewpoint.
How is it gonna mess it up?
Good, quality Intel engineering, that is just fine the way it is. Intel > Nvidia all day my boy.
You know the prices allready? please share.
Sorry, I was under the assumption a hacked-on NV GPU would make my laptop cheaper. Thanks for the protip.
 

Gloomy

Golden Member
Oct 12, 2010
Nvidia doesn't have IGPs anymore though, Wreckage. They don't like to see a bridge that isn't aflame.
 

exar333

Diamond Member
Feb 7, 2004
What seems likely is that Intel may start to strangle sales of lower-end discrete GPUs - which neither Nvidia nor AMD wants, to be sure. Particularly on the mobile side, where a lot of their bills get paid.

That's kinda scary from the standpoint of wanting SOME differentiation and competition in the market. However, the GPU folks still have some ammo. But at the least, this may hurt their margins.

This is FANTASTIC, in my opinion. Both AMD and NV have gotten very lazy with sub-par discrete mobile GPUs, and this will put the pressure on. Why get a low-end discrete GPU that is marginally better than crappy Intel GMA and uses more power? The SB GPU should be good enough for very light gaming and have awesome power usage (which it definitely will). This means that if an AMD or NV discrete part is added, it had better be decent. Mobile GPUs were pretty strong 4-5 years ago, but have really slacked off compared to desktop variants recently. I would love to see the trend reverse itself, and SB/Fusion should hopefully be able to do just that.
 

exar333

Diamond Member
Feb 7, 2004
It's just a waste, having GPU power available that has to go dormant. Both gaming-on-the-go and desktop users could benefit, even if only by a small margin in its current state.

You are definitely right in theory, but this is not very likely in practice. Take this example: the integrated GPU is 1/4 the power of the discrete one (say a SB GPU plus a pretty fast discrete card). It's not very easy to load-balance rendering between 2 uneven GPUs; that is why SLI/CF is done with identical hardware. The scaling of a 1:1 and a 1:4 part would probably end up with performance around (or even slower than) just the discrete alone, PLUS you're adding in extra heat because you are using 2 GPUs.
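The 1:4 scaling argument can be sketched with a toy model. The frame times below are illustrative assumptions (fast discrete GPU: 10 ms per frame; integrated GPU at 1/4 the speed: 40 ms), and the two functions model the common multi-GPU modes in an idealized way:

```python
# Toy model of pairing two uneven GPUs (1:4 speed ratio, as above).
# Frame times are illustrative assumptions, not benchmarks.

FAST_MS = 10.0   # fast discrete GPU: ms per frame
SLOW_MS = 40.0   # integrated GPU at 1/4 the speed: ms per frame

def afr_fps(fast_ms, slow_ms):
    """Alternate-frame rendering: each GPU renders every other frame,
    so each GPU must finish within two frame intervals. The slower GPU
    paces the pair: frame interval = max(frame times) / 2."""
    return 2000.0 / max(fast_ms, slow_ms)

def sfr_fps(fast_ms, slow_ms):
    """Split-frame rendering with a perfect split proportional to each
    GPU's speed: the combined rate is the sum of the two frame rates."""
    return 1000.0 / fast_ms + 1000.0 / slow_ms

print(1000.0 / FAST_MS)           # discrete alone: 100.0 fps
print(afr_fps(FAST_MS, SLOW_MS))  # AFR pair: 50.0 fps, worse than alone
print(sfr_fps(FAST_MS, SLOW_MS))  # ideal SFR: 125.0 fps, only +25%
```

Even under perfectly balanced split-frame rendering, a 1/4-speed helper adds at most 25%, and the simpler alternate-frame scheme is actually slower than the discrete GPU on its own, which is exactly why SLI/CF pairs identical hardware.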