Question Alder Lake - Official Thread


noway069111

Member
Apr 3, 2022
Nope.

What you do with the CPU is up to the end user. The OS is the responsibility of the OEM. If you need GPU support, then you need the proper OS / drivers, plus probably OpenCL.
Agreed, wholeheartedly. But you can't complain that it doesn't perform when it's used in a manner it was not intended for.
 

igor_kavinski

Platinum Member
Jul 27, 2020
Ryzens are multicore workload monsters. ADL doesn't stand a chance in embarrassingly parallel workloads anyway, at least not with a maximum of 8 P-cores. Intel has to try a lot harder to pack as many cores into their dies as AMD does before this can be a fair fight.
 

noway069111

Member
Apr 3, 2022
Ryzens are multicore workload monsters. ADL doesn't stand a chance in embarrassingly parallel workloads anyway, at least not with a maximum of 8 P-cores. Intel has to try a lot harder to pack as many cores into their dies as AMD does before this can be a fair fight.
Well, that was completely off-topic. Who said anything about a competition between AMD and Intel? This is an ADL thread, as far as I can tell.

Curious, how many AMD CPUs do you own?
 

noway069111

Member
Apr 3, 2022
There are no intentions implied by Intel regarding which HW / SW / GPU you will use with their chips.
Same as how car makers don't sell sports cars and then tell you not to go off-roading in them. I mean, most of these CPUs are sold in OEM laptops and desktops in large volumes. They don't need to call out such a specific, small niche usage market.

You also have to remember that the CPU is still fully functional and works in Linux under normal usage. But these comments are specific to DC and the like. It seems a bit silly to expect everything to be the same across the board. No CPU is.
 

VirtualLarry

No Lifer
Aug 25, 2001
Maybe because the CPU wasn't optimized or built to run Linux for DC purposes? I mean, you're basically taking a sedan and saying, "Oh, it doesn't do off-roading and got stuck in the mud!" The CPU wasn't intended for that, and it's odd that you expect it to excel at something it was never built for.

I mean, you're basically saying the reviews are not commenting on it "intelligently", but you are testing outside the normal parameters (a specific, minimal-usage case) and saying you are speaking on it intelligently? Was ADL designed for that??
It's called a "general purpose microprocessor" for a REASON. Because the specified workload is NOT pre-defined.

What @Markfw is using it for is NOT out-of-spec. Just a bit on the bleeding edge for the Linux software guys.
 

noway069111

Member
Apr 3, 2022
It's called a "general purpose microprocessor" for a REASON. Because the specified workload is NOT pre-defined.

What @Markfw is using it for is NOT out-of-spec. Just a bit on the bleeding edge for the Linux software guys.
Thanks for making my point. GENERAL PURPOSE does not mean it will perform better at a single usage point. It's a bit funny that Intel people have been claiming the best gaming CPU for years. But, oddly enough, AMD wants to claim the "fastest gaming CPU" too. Is that general purpose, or are they going after a specific function? You can't have it both ways. If it's all "general purpose", why are there differences between laptop, desktop, and server CPUs? They are NOT general purpose. Not all CPUs support ECC. Not all CPUs are built the same.

It's like saying all cars are general purpose because they have four wheels and a motor, lol. Any intelligent person knows that a lot of cars are built for a specific function, regardless of whether they use the same overall design.

Mark's CPU isn't "out of spec"; it is able to perform the operation. But he is going on about how his EPYC CPU is able to do it faster. If we are being completely honest here, what does that have to do with the thread? This is about ADL and not any other CPU. The rules are a bit confusing as a new member, but it seems like you're not supposed to thread-crap, but then you can thread-crap? Heck, who knows.

Anyways, ADL was not built for DC, and I have no clue why people are comparing it to other CPUs or even talking about it. The CPU is awesome for normal usage and gaming! Let's just talk about ADL in this thread.
 

noway069111

Member
Apr 3, 2022
I bought it so I could comment on it intelligently. If you don't own one, then it's all feedback from web pages you have read. Nobody ever wrote about ADL on Linux or DC performance, so how would I know? At least I never saw it. Maybe I will find a better use for it than as an F@H host for a 3070 Ti.
I guess my issue is that you want to comment on it "intelligently". How exactly are you commenting on it intelligently when you are limiting it to a specific "test", with results you already expect to be good or bad? You are not an independent testing bench, and how are we to know what you are really using? You've posted no actual benches, and AT hasn't posted anything you tested. Are we all supposed to just go with what you say? I mean, it's nice when end users do some research, but you just seem to reply without providing any info or screenshots.

It's all specific, non-standard usage of the CPU. But this is the most telling comment...

" lets just say I found out what I needed to about Alder lake by buying the 12700F. "

What did you find out? Because that CPU is fantastic at everyday tasks, games, and general usage. Would you agree that it's a great everyday-usage CPU?
 

Tech Junky

Senior member
Jan 27, 2022
" lets just say I found out what I needed to about Alder lake by buying the 12700F. "

What did you find out? Because that CPU is fantastic at everyday tasks, games, and general usage. Would you agree that it's a great everyday-usage CPU?
Found out the F isn't helpful when your GPU drivers spaz out on a kernel upgrade, resulting in no display.
 

Hulk

Diamond Member
Oct 9, 1999
I bought it so I could comment on it intelligently. If you don't own one, then it's all feedback from web pages you have read. Nobody ever wrote about ADL on Linux or DC performance, so how would I know? At least I never saw it. Maybe I will find a better use for it than as an F@H host for a 3070 Ti.
It's a great chip for day-to-day usage with some audio/video/photo editing here and there.
 

noway069111

Member
Apr 3, 2022
Found out the F isn't helpful when your GPU drivers spaz out on a kernel upgrade, resulting in no display.
Which has nothing to do with ADL. Sorry, but it seems some people are trying to blame ADL for issues that are not its fault and have nothing to do with it (or what it was designed for). It was designed for Windows, which makes up 98%+ of general usage.
 

noway069111

Member
Apr 3, 2022
It's a great chip for day-to-day usage with some audio/video/photo editing here and there.
Agreed. It's funny because of the similarities to cars. I know people have heard the analogy before, but it's so true! It's like saying, "My car does 0-60 in 3.2 seconds and yours only does it in 3.4!" Honestly, no normal person would EVER notice the difference. Yet, for some weird reason, people want to make a huge deal about it and act like certain chips are just trash. It makes no sense at all.
 

Tech Junky

Senior member
Jan 27, 2022
Which has nothing to do with ADL. Sorry, but it seems some people are trying to blame ADL for issues that are not its fault and have nothing to do with it (or what it was designed for). It was designed for Windows, which makes up 98%+ of general usage.
You didn't get the F issue.

When upgrading the kernel, there's an exception for the NVIDIA drivers: you have to run a special command to install them against the specific kernel, with an option at the end of the install command. They don't auto-compile for new kernel releases beyond what's in their manifest.

The F doesn't have an iGPU, and no NVIDIA drivers = no display to troubleshoot with to fix the issue.
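For anyone who hits this, a rough sketch of the recovery from a text console (Ctrl+Alt+F3), assuming an Ubuntu/Mint-style system where the driver was installed through DKMS; the module version string `nvidia/510.60.02` and the driver being DKMS-registered at all are assumptions, so check `dkms status` first:

```shell
# After a kernel upgrade leaves no display (the 12700F has no iGPU to fall back on),
# rebuild the NVIDIA kernel module against the newly running kernel.
dkms status                                           # which module versions exist, and for which kernels
sudo dkms install nvidia/510.60.02 -k "$(uname -r)"   # build/install for the running kernel (version is a placeholder)
sudo update-initramfs -u                              # refresh the initramfs so boot picks it up
sudo reboot
```

If the module isn't registered with DKMS at all, reinstalling the distro's driver package (or re-running NVIDIA's installer) against the new kernel's headers is the fallback.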
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
You didn't get the F issue.

When upgrading the kernel, there's an exception for the NVIDIA drivers: you have to run a special command to install them against the specific kernel, with an option at the end of the install command. They don't auto-compile for new kernel releases beyond what's in their manifest.

The F doesn't have an iGPU, and no NVIDIA drivers = no display to troubleshoot with to fix the issue.
Actually, I did have a screen; it just said "check your video drivers". I did what it said, and it did not fix the issue.

UPDATE: This is to all who helped me, and I thank you. All of this comparing to other CPUs was to fix an issue: Alder Lake should have been stronger and faster than any other cores, but it was 4 times slower. Well, I turned off the E-cores, which should have eliminated the scheduling issue and fixed the problem. It didn't. I tried the new kernel version; that caused more problems than it fixed, but it was a good idea and will be needed in the future. So that's where I left it.

Fast forward to today. I was talking to my DC buddies, and they suggested deleting all processing units, running the BOINC benchmarks, updating the project in BOINC, and then getting new units. Well, guess what: exactly as I expected, Alder Lake (well, the P-cores) is faster per core than any other current cores, there are just fewer of them!!! So the problem was NOT Alder Lake (as I thought anyway), and it was NOT the OS (which I thought it was); it was the application not being set up to utilize the cores optimally! Now it does take more electricity per core than Ryzen, but IT IS FASTER PER CORE.
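For anyone wanting to try the same reset, a minimal sketch using the `boinccmd` CLI; `PROJECT_URL` is a placeholder for whatever project is attached, and stuck tasks are assumed to have been aborted from the BOINC Manager first:

```shell
# Rough sketch of the reset sequence: stop old work, re-benchmark, then fetch fresh units.
PROJECT_URL="https://www.example-project.org/"   # placeholder: your attached project's URL
boinccmd --project "$PROJECT_URL" suspend   # stop running the old, mis-scheduled work units
boinccmd --run_benchmarks                   # re-run CPU benchmarks so estimates reflect the P-cores
boinccmd --project "$PROJECT_URL" update    # contact the server: report tasks, refresh project state
boinccmd --project "$PROJECT_URL" resume    # allow fresh work units to download and run
```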

I will ignore any replies that are not constructive. This whole process was a learning experience, and it proved what I have said before: the P-cores are strong but use more power. Yes, for general desktop use that is not heavily threaded, they are great. For gaming they are great, but they now have a new contender (the 5800X3D). For really heavy productivity, they are not that great. I hope you all have learned something, as I have.
 

Tech Junky

Senior member
Jan 27, 2022
That's a gimmick though. The only major difference from the X version is the bump in cache to 96MB, the cores are slower, and you can't OC it even if you wanted to.

I still think you can wrangle some more HP out of the 12700 with the updated kernel, but we won't know for sure until you try again with the additional GPU driver steps we discussed. Just make an image of your current running system beforehand so recovery is quicker, without having to set everything up from scratch again. An image only takes a few minutes to make and restore.
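One simple way to do that, as a sketch: image the whole disk with `dd` from a live USB. The device name `/dev/nvme0n1` and the backup mount point `/mnt/backup` are placeholders; double-check both before running, since `dd` to the wrong device is destructive.

```shell
# Image the system disk to a compressed file (run from a live USB, disk unmounted).
sudo dd if=/dev/nvme0n1 bs=4M status=progress | gzip -c > /mnt/backup/12700-system.img.gz

# To restore, reverse the pipe:
# gunzip -c /mnt/backup/12700-system.img.gz | sudo dd of=/dev/nvme0n1 bs=4M status=progress
```

Tools like Clonezilla or Timeshift do the same job with less risk of typing the wrong device.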
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
That's a gimmick though. The only major difference from the X version is the bump in cache to 96MB, the cores are slower, and you can't OC it even if you wanted to.

I still think you can wrangle some more HP out of the 12700 with the updated kernel, but we won't know for sure until you try again with the additional GPU driver steps we discussed. Just make an image of your current running system beforehand so recovery is quicker, without having to set everything up from scratch again. An image only takes a few minutes to make and restore.
Well, with the cores working well right now, I will just leave it alone for the time being; the 8 P-cores are screaming now. And I'll wait for Mint to support a newer kernel.

Thanks for all the suggestions; they have helped! Even though the fix was something else.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
I don't use the F or a GPU since I went with the K, and it's a headless system until I need to recover it when something breaks. That K upgrade is worth the extra $20 for insurance purposes.
I paid $312 for my F. The K was $394 at the time. Since I run a big GPU for F@H, to me it was not worth the extra. My 12700F has a 3070 Ti.
 
