Question Which ULV is better for general tasks -- quad-core with standard graphics or dual-core with Iris graphics?

zliqdedo

Member
Dec 10, 2010
Intel’s 15W 8th gen offerings have four cores, but only standard graphics, whereas their 7th gen options only have two cores, but come with better Iris graphics and 64 MB of eDRAM, which also acts as an L4 cache for non-video tasks, essentially giving you 6266 MHz RAM for the first 64 MB.
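A rough sanity check of that "6266 MHz" figure, using the ~50 GB/s per-direction eDRAM bandwidth cited later in this thread (my own back-of-envelope arithmetic, not an Intel spec):

    # What transfer rate a single 64-bit channel would need to match the eDRAM link.
    edram_bw = 50e9            # bytes/s each direction, per the AnandTech figure quoted below
    channel_width = 8          # bytes per transfer on a 64-bit channel, like one DDR channel
    print(edram_bw / channel_width / 1e6)   # ~6250 MT/s, i.e. roughly "6266 MHz"-class RAM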

I know quad-core CPUs are beneficial for certain tasks, but which of the aforementioned options would be better at general tasks such as multitasking about 5 GB worth of apps like browser tabs, email client, Word, Excel, chat, light photo-editing software, and streaming music and video?

I can’t help but feel the dual-core option with better graphics and an L4 cache would perform better for this use-case, but I could be very wrong.
 

Insert_Nickname

Diamond Member
May 6, 2012
Intel’s 15W 8th gen offerings have four cores, but only standard graphics, whereas their 7th gen options only have two cores, but come with better Iris graphics and 64 MB of eDRAM, which also acts as an L4 cache for non-video tasks, essentially giving you 6266 MHz RAM for the first 64 MB.

I can’t help but feel the dual-core option with better graphics and an L4 cache would perform better for this use-case, but I could be very wrong.

I think you're confusing Skylake with Broadwell. Intel changed the layout on SKL/KBL/CFL so the on-chip eDRAM only benefits graphics.

Broadwell was a bit of a one-off in that regard.

Unless you're running demanding triple-A titles on your laptop, you won't benefit at all from Iris Graphics. The "basic" UHD 6xx series has all the hardware video acceleration you'll need.

I know quad-core CPUs are beneficial for certain tasks, but which of the aforementioned options would be better at general tasks such as multitasking about 5 GB worth of apps like browser tabs, email client, Word, Excel, chat, light photo-editing software, and streaming music and video?

I wouldn't go below a quad-core today. Too many things are now multithreaded, and even the 15W quads have decent turbos. Get as much RAM and as large an SSD as you can within your budget.
 

coercitiv

Diamond Member
Jan 24, 2014
I know quad-core CPUs are beneficial for certain tasks, but which of the aforementioned options would be better at general tasks such as multitasking about 5 GB worth of apps like browser tabs, email client, Word, Excel, chat, light photo-editing software, and streaming music and video?

I can’t help but feel the dual-core option with better graphics and an L4 cache would perform better for this use-case, but I could be very wrong.
You would be very wrong indeed (but understandably so): the L4 cache helps to keep the cores fed especially on laptops with slow RAM, but that advantage simply pales in comparison with the raw throughput a quad-core brings.

As long as gaming performance is not on the list, the quad-core will win hands down in all workloads.
 

zliqdedo

Member
Dec 10, 2010
I think you're confusing Skylake with Broadwell. Intel changed the layout on SKL/KBL/CFL so the on-chip eDRAM only benefits graphics.

Broadwell was a bit of a one-off in that regard.

For the eDRAM implementation, Intel is still using their second generation eDRAM implementation whereby the eDRAM acts as a L4 buffer for supplying the L3 from DRAM through the System Agent – this is compared to the first generation where the eDRAM was a victim cache. This methodology allows the eDRAM to speed up more use cases than just graphics, and the 50 GBps bidirectional bandwidth is certainly a big leap over main DRAM bandwidth (that some OEMs run in single channel mode anyway).
- Ian Cutress
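To put that "big leap" in numbers (my own arithmetic, assuming DDR4-2133, a common speed on these laptops):

    # One 64-bit DDR4-2133 channel vs. the ~50 GB/s per-direction eDRAM link.
    single_channel = 8 * 2133e6 / 1e9    # ~17.1 GB/s for single-channel DDR4-2133
    edram_link = 50.0                    # GB/s each direction, per the quote above
    print(single_channel, edram_link)    # the eDRAM link is roughly 3x a single DDR channel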

You would be very wrong indeed (but understandably so): the L4 cache helps to keep the cores fed especially on laptops with slow RAM, but that advantage simply pales in comparison with the raw throughput a quad-core brings.

But do I really need said raw throughput for the basic tasks I intend to do? I'm afraid the slower GPU would struggle to render OS animations on the high-res laptops I'm looking at, and that it would struggle to render and scroll through web pages smoothly, etc.
 

Insert_Nickname

Diamond Member
May 6, 2012
It seems I misread something about the SKL L4 cache setup. Broadwell's L4 was indeed a victim cache.

https://www.anandtech.com/show/9582/intel-skylake-mobile-desktop-launch-architecture-analysis/5

Broadwell:
[diagram: eDRAM as a victim cache behind the L3]

Skylake:
[diagram: eDRAM as a memory-side buffer in front of the memory controller]


Rather than acting as a pseudo-L4 cache, the eDRAM becomes a DRAM buffer and automatically transparent to any software (CPU or IGP) that requires DRAM access. As a result, other hardware that communicates through the system agent (such as PCIe devices or data from the chipset) and requires information in DRAM does not need to navigate through the L3 cache on the processor. Technically graphics workloads still need to circle around the system agent, perhaps drawing a little more power, but GPU drivers need not worry about the size of the eDRAM when it becomes buffer-esque and is accessed before the memory controller is adjusted into a higher power read request. The underlying message is that the eDRAM is now observed by all DRAM accesses, allowing it to be fully coherent and no need for it to be flushed to maintain that coherence. Also, for display engine tasks, it can bypass the L3 when required in a standard DRAM access scenario. While the purpose of the eDRAM is to be as seamless as possible, Intel is allowing some level on control at the driver level allowing textures larger than the L3 to reside only in eDRAM in order to prevent overwriting the data contained in the L3 and having to recache it for other workloads.

(emphasis mine)

It seems SKL can indeed use the L4 cache for all memory access. My mistake. It's easy to get confused when trying to keep up with everything. :(
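To make the victim-cache vs. memory-side-buffer distinction concrete, here is a tiny toy model (my own illustration of the behaviour described above, not actual Intel logic):

    class VictimCache:
        """Broadwell-style: only lines evicted from the L3 ever land here."""
        def __init__(self):
            self.lines = set()
        def on_l3_eviction(self, addr):
            self.lines.add(addr)
        def lookup(self, addr):
            return addr in self.lines

    class MemorySideBuffer:
        """Skylake-style: sits in front of the memory controller, so every DRAM
        access (CPU, iGPU or I/O via the system agent) can hit it or fill it."""
        def __init__(self):
            self.lines = set()
        def access(self, addr):
            hit = addr in self.lines
            self.lines.add(addr)    # fill on miss
            return hit

    # A display-engine / PCIe-style stream that never goes through the CPU's L3:
    stream = [0x1000, 0x2000, 0x1000, 0x2000]
    victim, buf = VictimCache(), MemorySideBuffer()
    print([victim.lookup(a) for a in stream])   # all False: nothing was ever evicted from L3
    print([buf.access(a) for a in stream])      # [False, False, True, True]: re-reads hit eDRAM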

--------------------------------------

That said, I still don't think the L4 cache is worthwhile. Dual-channel 2400 MHz DDR4 already provides 38.4 GB/s of bandwidth, across the entire memory pool. Not just the first 64/128 MB of it.
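(That 38.4 GB/s is just the standard channel math; a quick sketch, assuming two 64-bit channels of DDR4-2400:)

    channels, bus_bytes, rate = 2, 8, 2400e6    # two 64-bit channels of DDR4-2400
    print(channels * bus_bytes * rate / 1e9)    # 38.4 GB/s, available to the whole memory pool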
 
  • Like
Reactions: wilds and NTMBK

zliqdedo

Member
Dec 10, 2010
It seems I misread something about the SKL L4 cache setup. Broadwell's L4 was indeed a victim cache.

https://www.anandtech.com/show/9582/intel-skylake-mobile-desktop-launch-architecture-analysis/5

Broadwell:
[diagram: eDRAM as a victim cache behind the L3]

Skylake:
[diagram: eDRAM as a memory-side buffer in front of the memory controller]

It seems SKL can indeed use the L4 cache for all memory access. My mistake. It's easy to get confused when trying to keep up with everything. :(

--------------------------------------

That said, I still don't think the L4 cache is worthwhile. Dual-channel 2400 MHz DDR4 already provides 38.4 GB/s of bandwidth, across the entire memory pool. Not just the first 64/128 MB of it.

So you'd still say the quad-core with slower graphics would perform better for the basic tasks I've listed, and that there wouldn't be any GUI slow-downs, patchy web page scrolling, etc.? I won't be doing any gaming, but I do plan on docking the laptop to a 4K monitor in the future. You're positive the 24 EUs in the 620 with a meager 34 GB/s (actually 29.9 GB/s in the laptop I'm looking at) would be capable of pushing around 8 million pixels smoothly?
 

Insert_Nickname

Diamond Member
May 6, 2012
So you'd still say the quad-core with slower graphics would perform better for the basic tasks I've listed, and that there wouldn't be any GUI slow-downs, patchy web page scrolling, etc.? I won't be doing any gaming, but I do plan on docking the laptop to a 4K monitor in the future. You're positive the 24 EUs in the 620 with a meager 34 GB/s (actually 29.9 GB/s in the laptop I'm looking at) would be capable of pushing around 8 million pixels smoothly?

For desktop work, certainly. It only becomes very iffy if you intend any kind of 3D workload. We have some laptops with the similar (24 EU) but slightly faster (U)HD 630, and I haven't heard anyone complain when docked to a 4K monitor. (yet... ;))

Even the very basic Vega3 in my HTPC can do 4K desktop work, and push 4K60 VP9 video. The (U)HD620 shouldn't be different in that regard. As mentioned in the Athlon 200GE thread, even these very basic IGPs are as fast as mid-range graphics cards from 10 years ago.
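For a rough sense of scale (my own estimate, not a benchmark), plain desktop composition at 4K is not especially bandwidth-hungry:

    # One full-screen 4K layer at 32-bit colour, redrawn 60 times per second.
    width, height, bytes_per_px, fps = 3840, 2160, 4, 60
    per_frame = width * height * bytes_per_px   # ~33 MB per full-screen surface
    print(per_frame * fps / 1e9)                # ~2.0 GB/s
    # Real compositing reads and writes each surface a few times over, but it
    # still stays far below the ~30 GB/s of DRAM bandwidth mentioned above.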
 

coercitiv

Diamond Member
Jan 24, 2014
But do I really need said raw throughput for the basic tasks I intend to do?
Yes, you do. For example, all modern browsers are well multi-threaded these days and will easily use the more efficient quad-core configuration. This alone is reason enough.

I'm afraid the slower GPU would struggle to render OS animations on the high-res laptops I'm looking at, and that it would struggle to render and scroll through web pages smoothly, etc.
This is not a case of decisions based on feelings - you either have the data at hand or experience with modern devices or you don't. Simply postulating that OS animations or GPU accelerated scrolling would prove difficult for HD620 does not turn fear into fact.

Go into a store with modern Intel laptops. Look for a demo model with iGPU only, 4k screen and a SSD. Use the browser, run some apps. See for yourself.
 
  • Like
Reactions: Charlie22911

LTC8K6

Lifer
Mar 10, 2004
Intel IGP has supported 4K60 for the last several generations.

A Haswell chip with HD4000 supports it.

What you really need to check is whether the laptop you are buying will support 4K output to an external display.
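A rough link-rate check (my own numbers) of why the port matters more than the IGP here:

    # Uncompressed 4K60 at 8-bit RGB, ignoring blanking intervals.
    pixel_bits = 3840 * 2160 * 60 * 24
    print(pixel_bits / 1e9)     # ~11.9 Gbit/s of pixel data
    # DisplayPort 1.2 carries ~17.28 Gbit/s after 8b/10b coding, so 4K60 fits;
    # HDMI 1.4 tops out around 10.2 Gbit/s raw, which is why laptops with only an
    # older HDMI output are stuck at 4K30 on an external screen.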
 

zliqdedo

Member
Dec 10, 2010
For desktop work, certainly. It only becomes very iffy if you intend any kind of 3D workload. We have some laptops with the similar (24 EU) but slightly faster (U)HD 630, and I haven't heard anyone complain when docked to a 4K monitor. (yet... ;))

Even the very basic Vega3 in my HTPC can do 4K desktop work, and push 4K60 VP9 video. The (U)HD620 shouldn't be different in that regard. As mentioned in the Athlon 200GE thread, even these very basic IGPs are as fast as mid-range graphics cards from 10 years ago.

That sounds great; hopefully, that's the case. I might sound paranoid here, but experience tells me most people can and do put up with bad performance, sometimes without even realizing it.

Yes, you do. For example, all modern browsers are well multi-threaded these days and will easily use the more efficient quad-core configuration. This alone is reason enough.

Thanks, that's useful information, but I still wonder whether it would matter for me, since the 15W 7th gen chip seems to render pages instantly.

This is not a case of decisions based on feelings - you either have the data at hand or experience with modern devices or you don't. Simply postulating that OS animations or GPU accelerated scrolling would prove difficult for HD620 does not turn fear into fact.

I agree, that's why I'm asking; I have bad IGP-related experience. I did go to a store, and it seems to run fine, but I can't really know until I have my full workload going, and I simply can't do that at a store.

Intel IGP has supported 4K60 for the last several generations.

A Haswell chip with HD4000 supports it.

What you really need to check is whether the laptop you are buying will support 4K output to an external display.

Support doesn't necessarily translate to smooth performance. I don't know about the HD 4000, but the HD 3000 struggles with 1080p in Windows 10; sure, it doesn't have official support for Windows 10, but still. And since the HD 3000 has 12 EUs, and struggles with 1080p, I fear the UHD 620, which has 24 EUs, might struggle with 2K, let alone 4K. Sure, it's a more modern architecture with official driver support, but I can't know for sure.
 

LTC8K6

Lifer
Mar 10, 2004
EUs are not the same between generations; each EU is faster in the newer generation.
EU for EU, the HD 620 is way faster than the HD 3000.
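As a very rough illustration of that gap (my assumptions: 8 FLOPs per EU per clock for the Gen6 EUs in HD 3000, 16 for the Gen9.5 EUs in UHD 620, and typical max clocks):

    hd3000_gflops = 12 * 8 * 1.35    # 12 EUs at ~1.35 GHz -> ~130 GFLOPS peak
    uhd620_gflops = 24 * 16 * 1.00   # 24 EUs at ~1.0 GHz  -> ~384 GFLOPS peak
    print(hd3000_gflops, uhd620_gflops)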

Besides, all of this info is readily available on the web. You don't even need to ask here.
Plus, it's the specifications of the actual machine that matter.

HD620 has no trouble with 4K60 at all.

It is highly capable:

https://en.wikichip.org/wiki/intel/hd_graphics_620
 

jpiniero

Lifer
Oct 1, 2010
If it's Whiskey Lake-based, you would also get the benefit of higher turbo clocks compared to the 28W Kaby Lake parts, although whether you can make use of them depends on the model. Even with the Windows 10 spyware running in the background, you likely won't ever really use the extra cores, but they're another reason to go with the quad.
 

coercitiv

Diamond Member
Jan 24, 2014
Thanks, that's useful information, but I still wonder whether it would matter for me, since the 15W 7th gen chip seems to render pages instantly.
Your original question was which product was better for general tasks, not which one is enough today.

The quad-core is faster and will age better. If you like the eDRAM product more... then by all means go with that - it's your money and it's important you feel good about the purchase, but you won't get validation for this choice here.
 

Insert_Nickname

Diamond Member
May 6, 2012
That sounds great; hopefully, that's the case. I might sound paranoid here, but experience tells me most people can and do put up with bad performance, sometimes without even realizing it.

Thankfully, we're not at Intel "Extreme" Graphics (2) or Graphics Media Decelerator levels of bad anymore. Intel's IGPs have improved massively since then.

I agree, that's why I'm asking; I have bad IGP-related experience. I did go to a store, and it seems to run fine, but I can't really know until I have my full workload going, and I simply can't do that at a store.

Support doesn't necessarily translate to smooth performance. I don't know about the HD 4000, but the HD 3000 struggles with 1080p in Windows 10; sure, it doesn't have official support for Windows 10, but still. And since the HD 3000 has 12 EUs, and struggles with 1080p, I fear the UHD 620, which has 24 EUs, might struggle with 2K, let alone 4K. Sure, it's a more modern architecture with official driver support, but I can't know for sure.

The real reason for Sandy Bridge's (HD 2000/3000) poor IGP performance under Win10 isn't the hardware itself, it's the very poor driver support. The bundled driver nukes OpenGL support completely.

You more or less have to force-install the Windows 7 driver to get acceptable performance.
 
  • Like
Reactions: wilds and coercitiv

wilds

Platinum Member
Oct 26, 2012
I love the concept of the L4 cache and I still use my 5775C for my main CPU.

I would forget Iris and go with the standard quad-core and normal iGP graphics over the dual-core. Iris Pro is really impressive compared to other Intel graphics, but it will not be faster for multitasking.

Having the L4 as extra cache is why I bought this CPU, but I would instantly go with a CPU with more cores.