Zodiark1593
Platinum Member
I even played PSX emulators on my AMD 780G chipset IGP, which is, I think, around the level of a Radeon HD 5450, or slower.
Not exactly much of a challenge for that IGP. ePSXe runs even on a VIA UniChrome.
Can you imagine a Skylake CPU that's the same size as the current i5/i7, ~120mm^2, but instead of 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?
Wouldn't that be better for PC users/gamers? Heck yes.
But it's bad for Intel, because they wouldn't be able to rake in insane profits like selling the 5960X 8c/16t for $999.
Why would Intel sell it at the same price? People who want 8 cores are willing to pay more for 8 cores than for 4, so Intel prices accordingly.
Just so you know, Intel's 4-core CPUs with iGPUs are still much smaller than their 6- and 8-core CPUs without:
http://www.anandtech.com/show/8426/...review-core-i7-5960x-i7-5930k-i7-5820k-tested
Ivy Bridge 4C + iGPU: 160mm^2
Ivy Bridge 6C (no iGPU): 257mm^2
Haswell 4C + iGPU: 177mm^2
Haswell 8C (no iGPU): 356mm^2
Despite the iGPU taking up about half of the quad-core die, leaving it out and doubling the core count more than doubles the die size, suggesting eight cores take up more than four times the area of four cores.
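To put numbers on that, here's a quick back-of-the-envelope check in Python (a sketch; the "iGPU is half the die" share is the poster's estimate, not a measured figure):

```python
# Back-of-the-envelope check of the die-area argument above.
# Die sizes come from the linked AnandTech review; the assumption
# that the iGPU occupies roughly half of the quad-core die is the
# poster's estimate, not a measured number.

haswell_4c_with_igpu = 177.0  # mm^2
haswell_8c_no_igpu = 356.0    # mm^2

igpu_fraction = 0.5  # assumed share of the quad-core die
four_core_area = haswell_4c_with_igpu * (1 - igpu_fraction)  # ~88.5 mm^2

ratio = haswell_8c_no_igpu / four_core_area
print(f"8C die is {ratio:.2f}x the estimated 4C core area")  # ~4.02x
```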
Before IGPs, CPUs cost about the same per unit. The IGP is awesome because there's no need for bad motherboard IGPs or discrete cards where they aren't needed. The IGP comes at no extra cost initially, and in use it adds probably just a few watts of draw on top of the rest of the CPU. While my main rig has no IGP, I do use it in all my other rigs; it's the best invention ever, and if you're not a hardcore gamer it's a no-brainer to have one. I'm glad that by now every CPU has one.
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.
I love the IGP as a spare or second video card. It's good for testing purposes, or to get you by if your dGPU dies on you.
Now with DX12, we may get some more use out of it.
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.
Except that even 2D games - hell, even UI effects in your OS - need more GPU horsepower than that, and those requirements are always going up.
Not likely to get any more use out of it. DX12, when used properly, makes CPU usage more efficient, and the IGP is nowhere near powerful enough to make your CPU the bottleneck when gaming.
I do agree with your first point, though. I rather like having it. I have my secondary monitor running off the IGP, which frees up a bit of VRAM on my card, and it also gives me QuickSync, which I use often.
McMullen showcased the benefits of a hybrid configuration using the Unreal Engine 4 Elemental demo. Splitting the workload between unnamed Nvidia discrete and Intel integrated GPUs raised the frame rate from 35.9 FPS to 39.7 FPS versus targeting only the Nvidia chip. In that example, the integrated GPU was relegated to handling some of the post-processing effects.
It was already demonstrated to help out in multi-adapter mode. And that was presumably an earlier, less capable Intel IGP than what we have now.
http://techreport.com/news/28196/di...-shares-work-between-discrete-integrated-gpus
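For scale, the gain in that demo works out to roughly ten percent (a quick check using the article's numbers):

```python
# Frame rates quoted above from the Elemental demo: Nvidia dGPU alone
# vs. the same dGPU with post-processing offloaded to the Intel IGP
# via DX12 explicit multi-adapter.
nvidia_only_fps = 35.9
hybrid_fps = 39.7

gain = hybrid_fps / nvidia_only_fps - 1
print(f"Hybrid gain: {gain:.1%}")  # ~10.6%
```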
Again, not likely to happen, certainly not to an extent that will matter. A lot of things were demonstrated with DX12, and we are starting to see just how difficult it is to get that extra performance out of it, and that's before we get into the complication that is multi-adapter mode. It's the difference between what CAN be done and what WILL be done.
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.
Except that even 2D games - hell, even UI effects in your OS - need more GPU horsepower than that, and those requirements are always going up.
I haven't encountered a need for anything more powerful than the Intel IGPs since Ivy Bridge, even when running dual 1080p displays for basic desktop needs. I'm not only speaking for myself but for the hundreds of users I've supported over the years, a handful of whom run three displays.
