Question CPU for Surface Laptop 3.

Page 3 - AnandTech Forums

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
So I went ahead and ordered a Surface Laptop 3 with the Ryzen CPU, 16GB of RAM and a 512GB SSD. But I'm seeing conflicting reports about the Intel and Ryzen CPUs. Microsoft is saying it's the "fastest mobile CPU/GPU on the market". But then some articles talk about the speed advantage of the i7, and it appears to be the better all-around CPU, though maybe not as strong as the Ryzen's GPU. Battery life on both is said to be the same, so I'm not taking the 25W vs. 15W into account.

Plus, from what I've seen, the business version is strictly Intel. Now I'm on the fence and just wondered what everyone's opinion is.

FYI: This will not be for gaming. It's simply for light video production and graphics (travel editing, reviewing, and backing up 4K material and photos). I currently have a MacBook Pro (2017) and a Surface Pro 4. Love them both and they are great, but I want to try the Surface Laptop 3 for fun. Love new toys, guilty as charged :D

The other advantage of the business machines is that they go up to 32GB and a 1TB SSD. But my primary concern is the CPU and whether I should order an Intel one just to be safe. I can always cancel one of them.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
This! I constantly see the erroneous claim that the IPC of Zen 2 and Skylake is equal.

I think this is mostly because Intel is faster in gaming, but that is almost exclusively due to much better memory latency, not clock speed. The 9900K is often a tiny bit faster than Zen 2 even when both are fixed at 4 GHz.

This in turn makes people assume that the IPC is "about equal". However, in the majority of production workloads (where latency is less of an issue) this is not the case, and Zen 2 is measurably faster per clock.

Just look at Photoshop (image below). The 3800X is equal to the 9900K, despite the latter having a 4.7 GHz all-core boost and a 5.0 GHz dual-core boost. The 3800X's max boost is 4.5 GHz and its all-core boost is somewhere between 4.2-4.3 GHz (based on the TechPowerUp clock analysis of the 3900X), therefore it has 9-11% more IPC across all the aggregated Photoshop tests Puget ran:



The same is true for a lot of other production workloads, especially rendering and encoding, where even the 3700X is often neck and neck with the 9900K despite having an all-core boost somewhere between 4.0-4.2 GHz (at an 88W power limit, mind you).
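A quick sketch of the implied-IPC arithmetic from the Photoshop comparison above. The clock figures are the estimates quoted in the post, not measured values:

```python
# Implied per-clock advantage if both CPUs post the same Photoshop score.
# Clocks are the estimates quoted above, not measured values.
boost_9900k = 4.7                # GHz, 9900K all-core boost
boost_3800x = (4.2, 4.3)         # GHz, estimated 3800X all-core boost range

# Equal scores at unequal clocks imply the slower-clocked chip does
# proportionally more work per clock.
ipc_gain = [boost_9900k / clk - 1.0 for clk in boost_3800x]

print(f"implied Zen 2 IPC advantage: {ipc_gain[1]:.1%} to {ipc_gain[0]:.1%}")
```

This lands in roughly the 9-11% range the post quotes, with the spread coming entirely from the uncertainty in the 3800X's all-core boost.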

As for gaming, the saving grace for Intel is the fact that Zen 2's L3 cache is not shared, being 2 x 16MB instead of a unified 32MB.

This doesn't matter for workloads where all cores work on separate datasets (e.g. Cinebench); in those, the entire L3 cache is already used pretty much optimally. For games this is almost never the case. It's really hard to make games truly multi-threaded, and even when they are, those threads are still mostly processing draw calls (of the same input data). Therefore, for games, most of the data in those 16MB CCX L3 caches ends up being duplicated.

What it means: despite having 32MB of L3, Ryzen 3xxx still has almost the same amount of effective L3 as the 9900K for games (but not for most production workloads). Zen 3 will fix this, and twice the effective cache will make cases like CS:GO much more common.
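The effective-L3 argument above can be put into a toy model. This is illustrative only; real cache behaviour is far more complex than a single duplication fraction:

```python
# Toy model of the split-L3 argument: data duplicated across both CCX
# caches doesn't add effective capacity.
l3_total = 32                        # MB of L3 across the chip (Zen 2)
ccx_count = 2
l3_per_ccx = l3_total / ccx_count    # 16 MB visible to each CCX

def effective_l3(duplication):
    """duplication = fraction of each CCX's cached data that is a copy of
    data also cached in the other CCX (0 = disjoint, 1 = identical)."""
    unique_per_ccx = l3_per_ccx * (1 - duplication)
    return l3_per_ccx + unique_per_ccx * (ccx_count - 1)

print(effective_l3(0.0))   # separate datasets (Cinebench-like): 32.0 MB
print(effective_l3(1.0))   # fully shared dataset (game-like): 16.0 MB
```

At full duplication the 32MB chip behaves like a 16MB one, which is the point being made about games versus the 9900K's unified 16MB.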

TL;DR
Zen 2 and Skylake IPC is not equal. Zen 2 is often 10-15% faster per clock in production workloads. "Gaming IPC" (God, I hate that term) is mostly worse due to memory latency, bar some anomalies like CS:GO.
Zen 2 has a much better SMT coefficient than Coffee Lake (roughly 37% vs. 28%); otherwise they are pretty much equal.
 

zinfamous

No Lifer
Jul 12, 2006
111,695
31,043
146
Disappointing? It sure looked like a good laptop to me. I wouldn't say a 10, but not bad at all.

The modem looks to be rather terrible (it would be adequate for me, but compared to the industry standard it seems like a very poor decision), and the battery life is terrible compared to the Intel version. I've had a Surface Pro for 4 years now, and while those aren't issues that I normally care about, they are probably the two most important issues for how I use my Surface. I was ~95% sure I would grab one of these Picasso Surface Laptops, but after that review I am going to try to hold out for a 7nm upgrade...if it happens. I assume it will? I guess I'll also wait to see what the Surface Pros and Books offer, though I wanted to go back to a proper laptop to replace my Surface Pro.

I think one of the major issues here is the lack of LPDDR support in Picasso. That really hurts this Zen+ mobile chip in exactly the areas that get amplified in ultraportables: power use, heat, and of course performance.

Still no news on when Renoir will be available, right? Let alone when it will start popping up in OEM devices?
 

zinfamous

No Lifer
Jul 12, 2006
111,695
31,043
146
There are more assumptions and speculation in there than at a flat-earther convention (sorry, couldn't think of anything funny :confused: ). Renoir shouldn't take too much longer to get out, and if AMD has a product that can hit better margins in mobile, they could probably speed up Zen 3 mobile to come out around the same time as Tiger Lake. AMD hasn't focused on mobile at all because they haven't had many design wins. That is changing, so maybe they will put more effort into it.

Microsoft wouldn't make this move without reason. Maybe they expect AMD to offer superior parts. Maybe they are worried about the renewed rumors of Intel shortages and wanted a second source. Either way, MS thought AMD was worth putting into their premium products.

I'm mostly curious whether Renoir is going to be using Navi CUs and not Vega...which has probably already been confirmed and I just forgot? That would be a huge improvement in efficiency, I think, especially on top of how good first-gen 7nm Zen 2 already is. I expect it to leapfrog these Ice Lake chips, and Tiger Lake will be indistinguishable in comparison, maybe even obsolete as well, when it releases.
 

Gideon

Platinum Member
Nov 27, 2007
2,013
4,992
136
I'm mostly curious whether Renoir is going to be using Navi CUs and not Vega...which has probably already been confirmed and I just forgot? That would be a huge improvement in efficiency, I think, especially on top of how good first-gen 7nm Zen 2 already is. I expect it to leapfrog these Ice Lake chips, and Tiger Lake will be indistinguishable in comparison, maybe even obsolete as well, when it releases.

Based on Linux drivers, Renoir should still be Vega, but with 20 CUs and LPDDR4X support. That is surely leaving some performance on the table, but it's still 7nm Vega with nearly 2x the memory bandwidth (which is by far the most limiting factor for current chips).

So it will surely beat Ice Lake noticeably. I'd say by at least 50% (nearly doubling the CUs and bandwidth). If it were Navi you could add another >=20% on top, but sadly no :(
 

zinfamous

No Lifer
Jul 12, 2006
111,695
31,043
146
Based on Linux drivers, Renoir should still be Vega, but with LPDDR4X support. That is surely leaving some performance on the table, but it's still 7nm Vega with probably nearly 2x the memory bandwidth (which is by far the most limiting factor for current chips), so it will surely beat Ice Lake noticeably. I'd say by 50%. If it were Navi you could add another 25% on top, but sadly no :(

well, that blows, even though 7nm Vega is still quite a bit better than 12nm. ....but damn, AMD: why you so dumb? :\
 

Gideon

Platinum Member
Nov 27, 2007
2,013
4,992
136
well, that blows, even though 7nm Vega is still quite a bit better than 12nm. ....but damn, AMD: why you so dumb? :\

Yeah, I also didn't get it, especially considering that the Trinity APU was released with a new architecture months before it hit discrete graphics cards. My theory is that this is because Renoir was planned to be released initially on GloFo's 7nm process and to be ready before Navi, or at about the same time.

Releasing an APU with an older-gen GPU, despite the same architecture being available on discrete graphics cards (and on the very same process) for at least 6 months, really does seem dumb (especially if you compare it to the Trinity example).

EDIT:
It seems my memory failed me. In May 2012 Trinity released the 2nd-gen Bulldozer CPU core first (in mobile CPUs), not the Northern Islands VLIW4 GPUs (which had already been available since 2010). But still, half a year seems like a big enough gap.
 

Abwx

Lifer
Apr 2, 2011
11,835
4,789
136
Zen 2 has a much better SMT coefficient than Coffee Lake (roughly 37% vs. 28%); otherwise they are pretty much equal.

That would amount to 7% better MT IPC if ST IPC were equal, but Zen 2 also has 5-6% better perf per clock in this regard, hence the ~13% better MT IPC at the end of the day...
 

naukkis

Golden Member
Jun 5, 2002
1,004
849
136
Yeah, I also didn't get it, especially considering that the Trinity APU was released with a new architecture months before it hit discrete graphics cards. My theory is that this is because Renoir was planned to be released initially on GloFo's 7nm process and to be ready before Navi, or at about the same time.

The Trinity APU was released about one and a half years after that architecture appeared in a discrete card (HD 6970)... Obviously it takes much more time to finish an APU than a pure graphics card.
 

LightningZ71

Platinum Member
Mar 10, 2017
2,307
2,897
136
Based on Linux drivers, Renoir should still be Vega, but with 20 CUs and LPDDR4X support. That is surely leaving some performance on the table, but it's still 7nm Vega with nearly 2x the memory bandwidth (which is by far the most limiting factor for current chips).

So it will surely beat Ice Lake noticeably. I'd say by at least 50% (nearly doubling the CUs and bandwidth). If it were Navi you could add another >=20% on top, but sadly no :(

This could be interesting. There were some minor internal improvements to the Vega 20 CUs. They also made big improvements to the FP64 capabilities and some of the integer capabilities. I think that, with the improved memory bandwidth of LPDDR4X, and by potentially doing what they did with Bristol Ridge and not crippling some of the higher-end capabilities of the CUs, Renoir could be VERY interesting for pro versions of these notebooks.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
That would amount to 7% better MT IPC if ST IPC were equal, but Zen 2 also has 5-6% better perf per clock in this regard, hence the ~13% better MT IPC at the end of the day...

It is pretty much equal: sometimes better, sometimes worse.

The MT scaling is much better with Ryzen.

Just don't make it out to be better than it is.

Of course, if you are a cinepeenbench runner, it is better.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Did someone measure actual wall power on the Ryzen Surfaces vs. the Core Surfaces?

Idle and load?

That battery life can't just be down to idle power...it doesn't make sense.
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
Sure, I can help with that. I meant to do some testing when I received them yesterday, but my son had a game and I didn't have time to open them until about an hour ago. All I've done is install all updates on both the AMD and Intel versions. I have a Kill-A-Watt that I can hook up to both. Though I've never tried it, I will certainly try.

Sorry for the quick image. This is actually my audio rig and I have a rack right next to this desk, so it's not the cleanest look, lol.

Right to left.

AMD Surface Laptop 3 15", Intel Surface Laptop 3 15" (i7), MacBook Pro 2017 15" (i7). JBL 3mkii in the background and a 27" LG 4K monitor. Great for Cubase 10 Pro and my Axe-FX 3!!

(Attached: 20191023_134207.jpg)
 

DrMrLordX

Lifer
Apr 27, 2000
22,696
12,650
136
I think you mean MS :p . I think this was done to work out any kinks that may arise, ensuring a smooth Renoir launch.

Duh yeah I meant MS. Sorry, brain malfunction.

Zen 2 has a much better SMT coefficient than Coffee Lake (roughly 37% vs. 28%); otherwise they are pretty much equal.

Coffee Lake's SMT coefficient is worse, yes. But ST performance is all over the map. For example, Coffee Lake kills Matisse in stuff like SuperPi, but not in Cinebench R15 ST.
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
This was a very interesting test and I'm glad you asked me to do it. This test is ALL about how the battery is charging and the power levels.

When I first did the test, the AMD was at 40.3 watts (as shown in the photo). I had the "Watts Up Pro" set in between the laptops and used the same charger for both tests. The next test was the Intel version. It hit around 54 watts, which isn't a lot, but it is what it is. So I went to post the photos and realized the screen was off on the Intel SL3. I was like, wait, maybe that wasn't a good test. Turns out I was right. I did the test again and had the screen on for the Intel laptop this time (it had gone to sleep during the first test). Oddly enough, the Intel was around the same watts as the AMD. I think this is because the laptop/charger is pumping more watts to recharge quickly. They claim an 80% charge in 1 hour, which makes sense. With both of them on, they are within 1 watt of each other.

(Attached: AMD SL3.jpg, 20191023_141712.jpg)
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,094
16,014
136
This was a very interesting test and I'm glad you asked me to do it. This test is ALL about how the battery is charging and the power levels.

When I first did the test, the AMD was at 40.3 watts (as shown in the photo). I had the "Watts Up Pro" set in between the laptops and used the same charger for both tests. The next test was the Intel version. It hit around 54 watts, which isn't a lot, but it is what it is. So I went to post the photos and realized the screen was off on the Intel SL3. I was like, wait, maybe that wasn't a good test. Turns out I was right. I did the test again and had the screen on for the Intel laptop this time (it had gone to sleep during the first test). Oddly enough, the Intel was around the same watts as the AMD. I think this is because the laptop/charger is pumping more watts to recharge quickly. They claim an 80% charge in 1 hour, which makes sense. With both of them on, they are within 1 watt of each other.

View attachment 12323, View attachment 12332
OK, so you are saying that both use the same amount of power at idle, and they both have the same size battery, but the Intel has a significantly longer battery life? This still makes no sense. Something is wrong with the testing, or the batteries. Tom's testing is what I question, as you have both right in front of you, and we can see the results ourselves. Based on your testing, the battery life should be within the margin of error.

However, try a full load test on both using the same software (like Prime95 or something) and see what the loaded power is on both?
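To make the "makes no sense" point concrete, here is a minimal sketch of the arithmetic. The battery capacity and idle-draw numbers are hypothetical placeholders, not the Surface Laptop 3's actual specs:

```python
# If idle draw and battery capacity match, idle runtime must match too.
# All numbers below are hypothetical placeholders.
battery_wh = 46.0          # assumed battery capacity, watt-hours
idle_w_amd = 6.0           # assumed idle draw, AMD unit
idle_w_intel = 6.0         # assumed idle draw, Intel unit

hours_amd = battery_wh / idle_w_amd
hours_intel = battery_wh / idle_w_intel

# Identical idle power implies identical idle runtime, so a large
# battery-life gap would have to come from load behaviour or test error.
print(f"AMD: {hours_amd:.1f} h, Intel: {hours_intel:.1f} h")
```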
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
You have been shown multiple reasons why AMD may be having issues with battery life. Additionally, I NEVER said Intel has significantly longer battery life. The mods need to address these false accusations. Period.
 

dahorns

Senior member
Sep 13, 2013
550
83
91
OK, so you are saying that both use the same amount of power at idle, and they both have the same size battery, but the Intel has a significantly longer battery life? This still makes no sense. Something is wrong with the testing, or the batteries. Tom's testing is what I question, as you have both right in front of you, and we can see the results ourselves. Based on your testing, the battery life should be within the margin of error.

However, try a full load test on both using the same software (like Prime95 or something) and see what the loaded power is on both?

It doesn't really tell us anything if both batteries aren't at 100% prior to the test, right? I'm assuming you can't remove the battery to run a pure test, but I suppose I could be wrong. It sounds like we could just be getting numbers for the chargers as they charge the batteries. In fact, from the first image, we can see the battery is charging and not at 100%.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,094
16,014
136
You have been shown multiple reasons why AMD may be having issues with battery life. Additionally, I NEVER said Intel has significantly longer battery life. The mods need to address these false accusations. Period.
I didn't say YOU said they have longer battery life; Tom's Hardware (I think that's where I read it) said that.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,094
16,014
136
It doesn't really tell us anything if both batteries aren't at 100% prior to the test, right? I mean, I'm assuming you can't remove the battery to run a pure test, but I suppose I could be wrong. It sounds like we could just be getting numbers for the chargers as they charge the batteries.
Good point. I am interested in all manner of tests on this subject.
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
It doesn't really tell us anything if both batteries aren't at 100% prior to the test, right? I mean, I'm assuming you can't remove the battery to run a pure test, but I suppose I could be wrong. It sounds like we could just be getting numbers for the chargers as they charge the batteries.

This is EXACTLY what I mentioned. I noticed a difference in the charging rate due to the screens being on/off. I NEVER claimed that Intel had significantly better battery life. The only thing I pointed out is that the charge rate is supposed to reach 80% in 1 hour, which would change the statistical difference, because it all depends on the machine's charge rate and battery life.

I thought it was a good suggestion, but I noticed the difference because I wasn't testing either computer with a 100% charged battery and the same screen state. I admit this was my fault and I pointed it out.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,413
136
I was interested in getting the Ryzen version, but the battery figures sort of put me off. I'm actually leaning more and more towards waiting for Tiger Lake and Renoir. It seems likely that there will be a quite disproportionate performance increase from both the Intel and AMD camps within 12 months. This Broadwell laptop I'm writing on will do for another short while.

Side note: The prices for higher RAM and storage tiers on the Surface Laptop really are pretty awful. And inconsistent. Examples (using the business models):
  • Going from 512GB to 1TB on the i7 model costs ~450 USD. Comparing price per GB to a Samsung PM981 drive available at retail, the price is 6x higher. As a consumer, I can buy three 1TB Samsung PM981 drives for the same price as the 512GB->1TB upgrade of the Surface Laptop.
  • Regarding consistency: Upgrading the i5 model from 128GB to 1TB costs ~290 USD. Upgrading the i7 model from 256GB to 1TB, i.e. a smaller upgrade, costs 850 USD (!!!).
These are prices for the business models, which are available in some configurations not available to consumers. Just looking at Microsoft's consumer page and playing with the configurations shows some weird stuff as well:
  • 128GB -> 256GB: 300 USD
  • 256GB -> 512GB: 400 USD
  • 512GB -> 1TB: 400 USD
I know pricing like this is pretty common for devices that are not easily upgradeable by the end user. It gets pretty absurd when you compare to devices that are upgradeable, though. Getting a good 1TB M.2 drive and 32GB of DDR4 memory is less than 300 USD in total. Makes buying one of the higher tier Surface Laptop models a really hard pill to swallow.

EDIT: Playing with the configuration tool, upgrading from 16GB to 32GB RAM seems to cost 300 USD.
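The inconsistency in the consumer upgrade prices quoted above is easier to see as cost per added gigabyte (a quick sketch using only the prices listed):

```python
# Cost per added GB for each storage upgrade step quoted above.
upgrades = [
    ("128GB -> 256GB", 128, 300),   # added GB, price in USD
    ("256GB -> 512GB", 256, 400),
    ("512GB -> 1TB",   512, 400),
]

per_gb = {name: usd / added for name, added, usd in upgrades}
for name, rate in per_gb.items():
    print(f"{name}: ${rate:.2f} per added GB")
# The added capacity doubles at each step while the price barely moves,
# so the per-GB rate falls from ~$2.34 to ~$0.78.
```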

That might be a prudent decision. Considering Renoir will have LPDDR4X that is much faster than what Picasso has, it should save more power at idle and gain a huge boost in certain workloads.

Based on Linux drivers, Renoir should still be Vega, but with 20 CUs and LPDDR4X support. That is surely leaving some performance on the table, but it's still 7nm Vega with nearly 2x the memory bandwidth (which is by far the most limiting factor for current chips).

So it will surely beat Ice Lake noticeably. I'd say by at least 50% (nearly doubling the CUs and bandwidth). If it were Navi you could add another >=20% on top, but sadly no :(

Beat me to it. I'd say Renoir will have a sizable CPU uplift and a huge GPU boost based on the memory alone.
 

dahorns

Senior member
Sep 13, 2013
550
83
91
You have been shown multiple reasons why AMD may be having issues with battery life. Additionally, I NEVER said Intel has significantly longer battery life. The mods need to address these false accusations. Period.

I don't think he meant it like that. At any rate, see my post above, you have to make sure the batteries are fully charged before testing. Otherwise, you're just telling us what the chargers can do (which would explain why your numbers were exactly the same).
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
I did make mention of that. I tested it based on the request and noticed the error after the fact. You are completely right. Once I tested them both with the same screen usage and charging state, they were within a 1-2 watt difference. That's all I can say.
 

dahorns

Senior member
Sep 13, 2013
550
83
91
I did make mention of that. I tested it based on the request and noticed the error after the fact. You are completely right. Once I tested them both with the same screen usage and charging state, they were within a 1-2 watt difference. That's all I can say.

Sorry, your original post wasn't entirely clear to me. But thank you for doing this. It would be great if you could re-run the test once both laptops are at 100%. My suggestions/requests to start:

* Idle, screens on (same brightness)
* Idle, screens off
* During a short but heavy load (e.g., a Geekbench 4 run), ideally charting the variation as the run progresses, plus scores
* While watching a movie from the drive
* While streaming a movie from Netflix/Amazon
 

LikeLinus

Lifer
Jul 25, 2001
11,518
670
126
Completely agree. I can do so ASAP. My apologies for not already having run these tests. Between work and family since yesterday, it's been hectic! I only ran the quick test as a favor to the curiosity of the other member.

I have the laptops on my desk and they are probably fully charged by now (they'd better be!). I can install Geekbench 4 and run it while they are plugged in.