Discussion Intel current and future Lakes & Rapids thread

Page 875

Khato

Golden Member
Jul 15, 2001
1,279
361
136
Apple has in excess of 20% of the laptop market (~25% last time I looked it up). The 12% share is compared to all PCs.

Whew, way behind in this thread.
Depends on the market - 20%+ might be true in the US? IDC global numbers for Q1 2023 have laptops at 40M and desktops at 16M - https://www.idc.com/promo/pcdforecast

They report Apple at 4.1M units shipped in the 'traditional PCs' market, so even assuming all of those were laptops, that's only a ~10% market share - https://www.idc.com/getdoc.jsp?containerId=prUS50565723

Meanwhile, according to the Mercury Research figures quoted in this article - https://www.tomshardware.com/news/a...et-share-report-recovery-looms-on-the-horizon - Intel has 83.8% of x86 notebook unit share. So Intel would end up at around 75% market share on notebooks, apparently? Plenty of room for growth.
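A quick back-of-the-envelope check of that arithmetic, using only the figures quoted above and the same upper-bound simplification that all 4.1M Apple units were laptops (a rough sketch, not official share data):

```python
# Rough sanity check of the laptop share figures quoted above.
# Assumptions: all 4.1M Apple 'traditional PC' units were laptops,
# and all non-Apple laptops are x86 (both are simplifications).

laptops_total = 40.0              # IDC Q1 2023 global laptop shipments, millions
apple_units = 4.1                 # IDC Apple shipments, millions
intel_x86_share = 0.838           # Mercury Research x86 notebook unit share

x86_laptops = laptops_total - apple_units        # ~35.9M non-Apple laptops
intel_laptops = intel_x86_share * x86_laptops    # ~30.1M Intel-based laptops

print(f"Apple share of all laptops: {apple_units / laptops_total:.1%}")    # ~10%
print(f"Intel share of all laptops: {intel_laptops / laptops_total:.1%}")  # ~75%
```

Which lands right around the ~75% figure above, with the obvious caveat that not every non-Apple laptop is x86.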
 

A///

Diamond Member
Feb 24, 2017
4,351
3,160
136
Intel doesn't make laptops. Intel-powered Apple laptops exist. I think there's some overlap there.
 

Mopetar

Diamond Member
Jan 31, 2011
8,489
7,735
136
The amount of butthurt and petty intellectual indigence in this thread over the last 10 pages is sad. Games have for any number of reasons run better on one CPU or GPU over another since time immemorial.

Is it that surprising that Intel might have better performance in a game, and do the other titles where this already happens not count for some reason?

Everyone acting like this is some massive world breaking deal one way or the other needs to go outside and touch grass. You all look like a troupe of baboons fighting over a tire swing.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,250
16,108
136
The amount of butthurt and petty intellectual indigence in this thread over the last 10 pages is sad. Games have for any number of reasons run better on one CPU or GPU over another since time immemorial.

Is it that surprising that Intel might have better performance in a game, and do the other titles where this already happens not count for some reason?

Everyone acting like this is some massive world breaking deal one way or the other needs to go outside and touch grass. You all look like a troupe of baboons fighting over a tire swing.
I was not going to comment. But after this, let me just say that ONE game seeming to run faster for ONE reviewer, while all the other review sites say otherwise, seems a little bit WRONG. And multiple pages arguing about this? Leave me out of it.
 
  • Haha
Reactions: reb0rn

H433x0n

Golden Member
Mar 15, 2023
1,224
1,606
106
The amount of butthurt and petty intellectual indigence in this thread over the last 10 pages is sad. Games have for any number of reasons run better on one CPU or GPU over another since time immemorial.

Is it that surprising that Intel might have better performance in a game, and do the other titles where this already happens not count for some reason?

Everyone acting like this is some massive world breaking deal one way or the other needs to go outside and touch grass. You all look like a troupe of baboons fighting over a tire swing.
Yes. It's been overblown on both sides. There's also this notion that the 7800X3D had some massive performance advantage in gaming. It has a big, noticeable advantage in perf/watt, but it never had a large lead in raw performance, with the aggregate of reviewers showing a performance advantage of <=5%. Just this year alone there have been numerous titles where it either loses or ties in raw performance (TLOU, Jedi Survivor RT, Ratchet & Clank, Starfield, Callisto Protocol). It has a smaller lead over the 13900K than the 13900K had over the 7700X for the 6 months prior to the X3D releases.

It’s all basically negligible and imperceptible to the average user.

I was not going to comment. But after this, let me just say that ONE game seeming to run faster for ONE reviewer, while all the other review sites say otherwise, seems a little bit WRONG. And multiple pages arguing about this? Leave me out of it.
It’s not one reviewer. It’s ComputerBase, GamersNexus & Hardware Unboxed all showing the same results.
 
Last edited:

itsmydamnation

Diamond Member
Feb 6, 2011
3,074
3,901
136
Yes. It's been overblown on both sides. There's also this notion that the 7800X3D had some massive performance advantage in gaming. It has a big, noticeable advantage in perf/watt, but it never had a large lead in raw performance, with the aggregate of reviewers showing a performance advantage of <=5%. Just this year alone there have been numerous titles where it either loses or ties in raw performance (TLOU, Jedi Survivor RT, Ratchet & Clank, Starfield, Callisto Protocol). It has a smaller lead over the 13900K than the 13900K had over the 7700X for the 6 months prior to the X3D releases.

It’s all basically negligible and imperceptible to the average user.


It’s not one reviewer. It’s ComputerBase, GamersNexus & Hardware Unboxed all showing the same results.
Unless you play late-stage simulation games, but not many people benchmark that.

But that behaviour has existed since Crystalwell.

This is well documented in individual communities like r/Stellaris.
 
  • Like
Reactions: lightmanek

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
Depends on the market - 20%+ might be true in the US? IDC global numbers for Q1 2023 have laptops at 40M and desktops at 16M - https://www.idc.com/promo/pcdforecast

They report Apple at 4.1M units shipped in the 'traditional PCs' market, so even assuming all of those were laptops, that's only a ~10% market share - https://www.idc.com/getdoc.jsp?containerId=prUS50565723

Meanwhile, according to the Mercury Research figures quoted in this article - https://www.tomshardware.com/news/a...et-share-report-recovery-looms-on-the-horizon - Intel has 83.8% of x86 notebook unit share. So Intel would end up at around 75% market share on notebooks, apparently? Plenty of room for growth.
Seems odd. The reporting I saw (can't remember where) was from sometime in 2022. Eh, if Apple is fine with their share, good for them. If not, well, they have work to do. Would kind of like to see the net profits on 'PCs' from Apple compared to Wintel stuff.
 
  • Like
Reactions: poke01

A///

Diamond Member
Feb 24, 2017
4,351
3,160
136
GR and SRF packages were shown off this past week. Nice and shiny. And big.
 

AMDK11

Senior member
Jul 15, 2019
473
407
136
This is not GraniteRapids-AP on LGA7529! This is GraniteRapids-SP on the new LGA4710 with the physical size of LGA4677. Compare the cutouts in the PCB.

If GraniteRapids-SP on LGA4710 has 3 compute tiles, how many will GraniteRapids-AP on LGA7529 have? 4-5?
 

SiliconFly

Golden Member
Mar 10, 2023
1,924
1,284
106
At this point, we have some info about the ADM L4 cache in MTL (but not much). Two months back, there was a leak which showed that at least one MTL CPU has 128MB of ADM L4 cache. Beyond that, no one seems to know!

(1) Is it present only in specific skus or the entire MTL range?
(2) Is ADM exclusive to tGPU?
(3) If not, does it have any impact on tCPU single-threaded performance?
(4) If exclusive to tGPU, what tGPU performance boost are we looking at?

Only 10 more days to go for the reveal, and only bits and pieces of info about ADM are available as of now. Sucks! :(
 

jpiniero

Lifer
Oct 1, 2010
16,816
7,258
136
Yes, we know that the 192EU GPU tile was cancelled.
However, Intel could be planning MTL products for...

GT1 => 32EU (Does MTL-S exist?)
GT2 => 64EU
GT3 => 128EU

Is it?

IIRC Meteor S and Arrow S share the same PCH, with the IGP included in the PCH (and not on its own tile). The PCH is also fabbed on an Intel 10 nm node.
 
Last edited:

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
I'll just be glad to see real benchmarks/power efficiency, etc. and put the speculation to bed on this one.
 
  • Like
Reactions: Geddagod

A///

Diamond Member
Feb 24, 2017
4,351
3,160
136
If MTL is only available on laptops, what is recommended for a new desktop build?
14900K or 7950X.

Personally I'd wait until the holiday period to get a deal, or for Zen 5. Intel will have more socket maturity.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,250
16,108
136
14900K or 7950X.

Personally I'd wait until the holiday period to get a deal, or for Zen 5. Intel will have more socket maturity.
If you care about efficiency, the 7950X. At ~140 watts, it's awesome.


This is NOT a Zen4 thread.
Posts like this need a valid reason for being brought up, or take it to a Zen4 thread.
And this post is very misleading, as with a full load on all cores, it's way more than 140W even before boosting.
The TDP on a 7950X is 170W even.

Consider this your last freebie.

Moderator Aigo.
 
Last edited by a moderator:
  • Like
Reactions: A///