Future of IGP graphics and discrete GPUs with upcoming IB!

Feb 19, 2009
10,457
10
76
Llano already obsoletes low-end discrete.

That's fine, since everything moves up a notch. Discrete always maintains an edge. The discussion point would be when "enough" is enough for the average user.

Well, that's assuming software stagnates and stays the same. Or that the "average" user is happy to game at low res indefinitely (unlikely). Given that laptops all have HDMI out, the average user may be inclined to plug their notebook in to output at 1080p for some casual gaming. If that's the case, then iGPUs have a long way to go to be acceptable for TODAY's games, not even factoring in next-gen game requirements.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Rumors suggest it's even less than 60%. The 4x performance jump is just in synthetic benchmarks....cofcof.nvidia.cofcof

http://forum.beyond3d.com/showpost.php?p=1584795&postcount=109

If you actually read more of that thread, though, you'd notice that Ivy Bridge's specifications are as follows:

2.2GHz CPU core
400-900MHz GPU

versus

3.4GHz with Turbo Mode for 2600K
850-1350MHz GPU

There it's showing 30% higher performance than the HD 3000, which clocks 50% higher and is paired with a CPU that's practically twice as powerful.*

*Actually, Ivy Bridge will clock about 15% less than the HD 3000 in the 2600K, but that's mitigated by the fact that every other desktop CPU's iGPU is clocked about the same as Ivy Bridge's. 1150MHz is 28% higher than 900MHz; add the 30% performance advantage shown by the preliminary sample and that's how you get ~60% gains.
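A quick back-of-the-envelope of that scaling argument in Python (the 1150MHz desktop clock is the rumored figure from this thread, and linear scaling with GPU clock is assumed, so treat this as a sketch rather than a benchmark):

Code:
# Projected gain of desktop Ivy Bridge's iGPU over the 2600K's HD 3000,
# combining the rumored desktop clock with the ~30% advantage the
# preliminary 900MHz sample already showed. Figures from this thread.
ib_sample_mhz = 900       # preliminary IB sample, max GPU clock
ib_desktop_mhz = 1150     # rumored desktop IB max GPU clock (assumption)
arch_gain = 1.30          # sample was ~30% faster than the HD 3000

clock_scaling = ib_desktop_mhz / ib_sample_mhz   # ~1.28x
total_gain = arch_gain * clock_scaling           # ~1.66x over HD 3000

print(f"clock scaling: {clock_scaling:.2f}x")
print(f"projected gain over HD 3000: {total_gain:.2f}x")  # roughly 60%+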
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
Is that what Sandy Bridge E is for, then? For those enthusiasts who only want raw CPU power and nothing else?

Not really, because games that like single-threaded performance would be faster on my imaginary quad @ 8GHz than on a hex at 4.6GHz.

SB-E is e-peen in almost every gaming situation, with certain exceptions at uber-high resolutions or tri+ SLI/CrossFire setups.
 
Feb 19, 2009
10,457
10
76
Not really, because games that like single-threaded performance would be faster on my imaginary quad @ 8GHz than on a hex at 4.6GHz.

SB-E is e-peen in almost every gaming situation, with certain exceptions at uber-high resolutions or tri+ SLI/CrossFire setups.

Show me a bench of SB-E OC vs 2600K OC in gaming at high res/CF/SLI that has SB-E faster.

It's a gimmick server CPU being sold as a consumer CPU, with no benefits for gaming, period.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If you actually read more of that thread, though, you'd notice that Ivy Bridge's specifications are as follows:

2.2GHz CPU core
400-900MHz GPU

versus

3.4GHz with Turbo Mode for 2600K
850-1350MHz GPU

There it's showing 30% higher performance than the HD 3000, which clocks 50% higher and is paired with a CPU that's practically twice as powerful.*

*Actually, Ivy Bridge will clock about 15% less than the HD 3000 in the 2600K, but that's mitigated by the fact that every other desktop CPU's iGPU is clocked about the same as Ivy Bridge's. 1150MHz is 28% higher than 900MHz; add the 30% performance advantage shown by the preliminary sample and that's how you get ~60% gains.

Those are not the specifications. IB desktop parts will exceed current SB clockspeeds; what you posted is for the mobile parts.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Show me a bench of SB-E OC vs 2600K OC in gaming at high res/CF/SLI that has SB-E faster.

It's a gimmick server CPU being sold as a consumer CPU, with no benefits for gaming, period.

Have you paid attention to extreme processors for the past 10 years?

No benefits for gaming? No joke :) It is NOT designed for gamers!

But just to spark the flame: did you see the SLI/CrossFire scaling benchmarks for X79 a while back? SLI and CrossFire seem to benefit quite a bit from the new platform; *many* games were showing a 15-35 framerate increase on the X79 platform in dual-GPU configurations in comparison to Sandy Bridge.

I'd say 15-30 fps is not trivial :) Of course the amount varies per game, and is it worth $500? Probably not, especially not for a gamer. But I'd say EE processors are doing just fine for their intended demographic.
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
Let's say that the IB iGPU has performance close to, let's say...a 5850.

And let's also say it runs on dragon farts. :D

You're never going to get that out of an IGP, at least not until system RAM has become as fast as the dedicated memory in the 5850...which might happen in a decade or so.
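To put numbers on that gap, here's a rough Python sketch (DDR3-1600 dual channel assumed for the system side; these are theoretical peaks, and an iGPU has to share its side with the CPU):

Code:
# Approximate peak memory bandwidth: dual-channel DDR3 vs. HD 5850 GDDR5.
ddr3_mts = 1600                # mega-transfers/s (DDR3-1600, assumed)
channel_bytes = 8              # 64-bit channel = 8 bytes per transfer
channels = 2                   # dual channel

system_gbs = ddr3_mts * channel_bytes * channels / 1000  # ~25.6 GB/s

hd5850_gbs = 128.0             # HD 5850: 256-bit GDDR5 @ 4 GT/s effective

print(f"dual-channel DDR3-1600: ~{system_gbs:.1f} GB/s")
print(f"HD 5850 GDDR5:          ~{hd5850_gbs:.0f} GB/s")
print(f"gap: ~{hd5850_gbs / system_gbs:.0f}x")           # ~5x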
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
And let's also say it runs on dragon farts. :D

You're never going to get that out of an IGP, at least not until system RAM has become as fast as the dedicated memory in the 5850...which might happen in a decade or so.

Don't say never. I remember when people said 17" CRT monitors would never be cheaper than $500. LCDs would never be mainstream. 640K of RAM is more than anyone needs.

So on and so forth. It'll happen in time!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Don't say never. I remember when people said 17" CRT monitors would never be cheaper than $500. LCDs would never be mainstream. 640K of RAM is more than anyone needs.

So on and so forth. It'll happen in time!

The bolded part (the "640K" quote) is a stupid lie/myth and needs to die...
 

cotak13

Member
Nov 10, 2010
129
0
0
1) No matter how good Intel's IGP gets, they can only fit so much on a CPU with a TDP of 95W.

2) Also take into account that dual-channel DDR3 memory will be a bottleneck for IGPs.

1) If leaks are to be believed, Intel has power headroom to grow into for IB on 22nm. They'll lean on their process advantage; you can bet your bottom dollar on that.

2) http://www.micron.com/innovations/hmc.html is still a tech-lab sort of deal right now; everyone has talked about it, but who knows when we'll be able to buy it. Still, it holds some promise.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
A GPU inside a CPU will never replace a dedicated video card.

Disable the onboard GPU and use a video card. Who cares if it's 300 percent faster... it will never be as fast as an actual video card. It's for people that want video and CPU in one chip instead of buying a dedicated video card. It's great for the consumer that doesn't play games. Not great for us gamers... Ivy Bridge will be nice, but disable the onboard GPU... pointless. thx gl
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
A GPU inside a CPU will never replace a dedicated video card.

Disable the onboard GPU and use a video card. Who cares if it's 300 percent faster... it will never be as fast as an actual video card. It's for people that want video and CPU in one chip instead of buying a dedicated video card. It's great for the consumer that doesn't play games. Not great for us gamers... Ivy Bridge will be nice, but disable the onboard GPU... pointless. thx gl

You wish.

It won't be long before "Good Enough®" people will appear.

They will argue that games run fine at reduced settings.
They will argue that since consoles run games "Good Enough®" at upscaled SD with no AA, no AF, and all the other defects of current consoles, a Fusion/APU/IGP will be fine for you!

They will argue that since Angry Birds runs "Good Enough®" on mobiles...you don't need a GPU.

It would be silly....they will argue.

They are already appearing on this board, posting that their new Llano rig plays games "Good Enough®".

Welcome stagnation...bye bye innovation.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
No, I think that lie is annoying as hell.
But it is a good way to gauge which other debaters actually do fact-finding before posting...and who is just running on auto-pilot ;)

There aren't enough hours in a day to fact-check everything, and this quote is both trivial (not worth your time to fact-check) and widespread (everyone says so, so it's likely to be true).

Being only human, we must prioritize what we fact-check, and this simply ranks far too low, for the above two reasons, for anyone to bother... at least until someone tells them it is a myth, and then they have reason to check it... or, most likely, to just reclassify it in their memory from "real quote" to "uncertain: commonly believed, but I was told it's a myth, no time to check".
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
You wish.

It won't be long before "Good Enough®" people will appear.

They will argue that games run fine at reduced settings.
They will argue that since consoles run games "Good Enough®" at upscaled SD with no AA, no AF, and all the other defects of current consoles, a Fusion/APU/IGP will be fine for you!

They will argue that since Angry Birds runs "Good Enough®" on mobiles...you don't need a GPU.

It would be silly....they will argue.

They are already appearing on this board, posting that their new Llano rig plays games "Good Enough®".

Welcome stagnation...bye bye innovation.

Good point, and that's scary as hell too.

Americans only care about art quality and gameplay rather than technical quality. That's evidenced by so many people so loudly demanding that next-gen console back buffers be 1080p, rather than back buffers with more RGBA precision and at least 4x RGMSAA or 2x RGSSAA. That's why I don't fool with consoles: their graphics suck. Unfortunately, that's becoming acceptable even with PCs.

I think they should charge lower royalty fees and more for the hardware, and make certain formats mandatory (if I believed in IP and were head of a console company, that's what I'd do).
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Welcome stagnation...bye bye innovation.

No man, IGPs will be the new force of innovation.
It will be painful at the beginning for sure, but then we might see some crazy stuff, like out-of-order GPUs, to get around the bandwidth wall.

Sure, it's just an example; I don't really know if it's actually possible to create an out-of-order GPU.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
With GPU compute going the way it is, with shared memory space, there is a very real chance that at some point the advantage will switch back to the CPU, because it's within the same package and cache. Sitting on the end of a bus like PCIe means less memory bandwidth, and we may find that game engines of the future run on compute and not DX. Far, far out, all sorts of mad things can happen to change the face of the discrete graphics card.

However, I agree that for now there is no way any IGP will be remotely interesting to an actual gamer; they are simply too slow, and there isn't enough power budget or memory bandwidth available to do much about it. But after 10 years of developing this technology, who knows what we'll see.
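For a rough sense of the bus constraint, a small Python sketch (PCIe 2.0 x16 assumed for the discrete card; all figures are theoretical peaks):

Code:
# Bandwidth for CPU<->GPU data exchange: a discrete card sits behind
# PCIe, while an on-package GPU talks to system memory directly.
pcie2_lane_gbs = 0.5                    # ~500 MB/s per lane (PCIe 2.0)
pcie2_x16_gbs = 16 * pcie2_lane_gbs     # ~8 GB/s, per direction

ddr3_dual_gbs = 25.6                    # dual-channel DDR3-1600 (assumed)

print(f"PCIe 2.0 x16:           ~{pcie2_x16_gbs:.0f} GB/s")
print(f"dual-channel DDR3-1600: ~{ddr3_dual_gbs} GB/s")
# On top of the raw numbers, an on-package GPU can share the CPU's
# last-level cache, so CPU<->GPU handoffs avoid a round trip over the bus.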
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
http://www.theverge.com/2011/12/7/2616348/3d-dram-ibm-micron-128-gbps
3D DRAM created by IBM and Micron, 128 GB/s memory on the way.

Smaller and faster RAM with up to 128 GB/s transfer speeds could be on the way thanks to research from IBM and Micron. The companies have developed three-dimensional memory, stacking individual DRAM chips vertically that would normally have to be placed side-by-side, though the efficiencies gained aren't solely down to space. Communication between the stacked chips and the host device is achieved through a new creation known as through-silicon vias, or TSVs, which run vertically through the pile of chips and act as conduits to the host device. Thanks to the TSVs, in testing the memory has reached data speeds of 128 GB/s, ten times faster than current memory. On top of this, IBM claims the chips are 70 percent more power efficient than today's DRAM.
Yes please, can we get 2 × 128 GB/s memory-bandwidth DRAM sticks?
Suddenly your CPU's IGP has 256 GB/s memory bandwidth (a Radeon 6970 *only* has 176 GB/s).
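A quick sanity check on those numbers in Python (assuming the stacked parts ship as two independent 128 GB/s modules, which is speculation on my part):

Code:
# Hypothetical: two 3D-stacked DRAM modules at 128 GB/s each vs. the
# memory bandwidth of a Radeon HD 6970. Figures from the post above.
stacked_module_gbs = 128.0
modules = 2

igp_gbs = stacked_module_gbs * modules   # 256 GB/s
hd6970_gbs = 176.0                       # Radeon 6970 (256-bit GDDR5)

print(f"hypothetical IGP: {igp_gbs:.0f} GB/s")
print(f"Radeon HD 6970:   {hd6970_gbs:.0f} GB/s")
print(f"ratio: {igp_gbs / hd6970_gbs:.2f}x")   # ~1.45x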

Can you imagine, say, 1-2 years from now, your IGP having as much memory bandwidth as your current GPU does? And if IGPs keep getting better, we might just see discrete cards die off altogether.

(If the low end dies off, it might become too expensive to produce high-end GPUs that very few buy, so they'd stop making them, except for professional users.)
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Ha, Intel IGPs and their crap drivers. There's no way Intel can catch AMD or NV in the GPU market; they even had trouble enabling TnL, and then there was the Larrabee failure.