Question Snapdragon 8 Gen 2 announced


hemedans

Member
Jan 31, 2015
69
22
81
The L2 configurations are not mentioned there, right?



yeah, sad :(
They do mention it: a private L2 cache for each cluster.


  1. The single big core is a Cortex-X2, running at 3.0 GHz with 1 MiB of private L2 cache.
  2. The middle cores are Cortex-A710, running at 2.5 GHz with 512 KiB of private L2 cache.
  3. The four efficiency cores are Cortex-A510, running at 1.8 GHz and an unknown amount of L2 cache. These four cores are arranged in pairs, with L2 cache being private to a pair.
  4. On top of these cores is an additional 6 MiB of shared L3 cache and 4 MiB of system level cache at the memory controller, which is a 64-bit LPDDR5-3200 interface for 51.2 GB/s theoretical peak bandwidth.
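For what it's worth, the 51.2 GB/s figure follows directly from the interface width and transfer rate. A minimal sketch, assuming "LPDDR5-3200" denotes the 3200 MHz clock (6400 MT/s effective, since LPDDR5 is double data rate):

```python
# Peak theoretical bandwidth of the quoted memory configuration.
bus_width_bits = 64
transfers_per_s = 6400e6               # 3200 MHz clock x 2 (DDR)
bytes_per_transfer = bus_width_bits / 8  # 8 bytes per transfer
peak_gb_s = transfers_per_s * bytes_per_transfer / 1e9
print(peak_gb_s)  # 51.2
```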
 
  • Like
Reactions: Tlh97 and scineram

lopri

Elite Member
Jul 27, 2002
13,180
564
126
I currently have an S21 Ultra with a Samsung-fabbed Snapdragon 888 which has not been as bad as I thought it would be; granted, I'm not doing anything CPU-intensive with it right now, so perhaps that is part of what's making it livable. But during casual use it doesn't have the overheating problems that I remembered seeing in AnandTech's review of the Snapdragon 888 and Exynos 2100. Tbh my old ROG Phone 2 overheated more often on the same workloads. It exhibited some odd behavior, though.
S22 heats up with casual use. Downloading/installing apps from the Play Store, or even just rebooting the system, makes the phone's metal frame scorching hot. Network handoff (from Wi-Fi to 4G) also doesn't work sometimes, and when that happens I have to reboot the phone. Shooting video or even watching YouTube HDR does the same thing (heat). Battery life naturally suffers as well; 2.5~3 hrs of SoT makes no sense for a 2022 phone.

I mean, Qualcomm switched the fab to build the same exact chip. I definitely understand where they're coming from.
 

DrMrLordX

Lifer
Apr 27, 2000
20,518
9,605
136
S22 heats up with casual use. Downloading/installing apps from the Play Store, or even just rebooting the system, makes the phone's metal frame scorching hot. Network handoff (from Wi-Fi to 4G) also doesn't work sometimes, and when that happens I have to reboot the phone. Shooting video or even watching YouTube HDR does the same thing (heat). Battery life naturally suffers as well; 2.5~3 hrs of SoT makes no sense for a 2022 phone.

I mean, Qualcomm switched the fab to build the same exact chip. I definitely understand where they're coming from.
Interesting perspective. Is that behavior widespread?
 

Jimminy

Senior member
May 19, 2020
220
83
71
At first glance I was tempted to disagree, in the sense that smartphones still have a long way to go in terms of performance. However, if I had to choose between "mostly performance improvements" and "mostly efficiency improvements" then I fully agree with you; I would go for efficiency at this point.
I don't understand why a phone has to be as thin as a razor blade. After all, the other dimensions are approaching that of a small dinner plate. It could be made a few mm thicker, with a larger battery, giving longer run time.

Phones have 2 uses: (1) to make a phone call (or text), and (2) to function as a toy (or pacifier) at other times. They currently cannot replace a desktop computer for anything else. Even an El-cheapo laptop works a lot better.

Actually, there is a third function: They give folks the opportunity to squabble endlessly over which phone is best. :)
 

Doug S

Golden Member
Feb 8, 2020
1,506
2,197
106
I don't understand why a phone has to be as thin as a razor blade. After all, the other dimensions are approaching that of a small dinner plate. It could be made a few mm thicker, with a larger battery, giving longer run time.

Phones have 2 uses: (1) to make a phone call (or text), and (2) to function as a toy (or pacifier) at other times. They currently cannot replace a desktop computer for anything else. Even an El-cheapo laptop works a lot better.

Actually, there is a third function: They give folks the opportunity to squabble endlessly over which phone is best. :)

Your understanding of what a 'phone' is is around 15 years out of date, or you are being willfully ignorant.

No one buys a "phone" these days, they buy a jack of all trades pocket computer that makes phone calls as one (lesser and lesser used) of its many functions. A phone can replace a desktop computer for almost anything, except stuff that needs a big display or a full keyboard (though I've seen teenagers who can type accurately on an iPhone as fast as almost anyone can work a keyboard) or the niche (sorry but it is true, "power users" / hardcore gamers are a clear minority of PC owners) applications that need more power than a phone SoC can bring to bear - which, keep in mind, is more in the latest phones than even a high end desktop PC could muster in 2010.

The "typical PC user" tasks that require a desktop/laptop PC are stuff like writing term papers, doing spreadsheets, creating powerpoints - i.e. MS Office type tasks that would be needlessly painful on a 6" display. Web browsing, social media, email, etc. can all be done just as well on a phone as on a PC, and like it or not that pretty much covers the "computer" needs for the majority of people. That's one reason fewer PCs are sold every year (which, ignoring the pandemic bubble, is reverting back to the decade-plus trend of declining sales), and people who never owned a PC tend to mostly not acquire one even when they can easily afford it. Most people in China use a smartphone as their only computing device, and that's true over most of the world except the US and Europe.

The reason they don't make them thicker with a larger battery is because once the battery lasts all day having it last longer is pointless - because humans require sleep. For the niche crowd that wants a bigger battery they can buy a case with a built in battery to boost capacity. Then only those who want that capacity have to accept the excess weight.
 

FlameTail

Senior member
Dec 15, 2021
210
54
61
Most people in China use a smartphone as their only computing device, and that's true over most of the world except the US and Europe.
I was actually surprised to learn recently that mobile gaming is a huge thing in China. No wonder I see a massive crowd of Chinese players in almost every MMO I have ever played :)
 

FlameTail

Senior member
Dec 15, 2021
210
54
61
A16 Bionic vs SD 8 gen2 Geekbench.

Seems like Qualcomm has 'caught up' with Apple in terms of multicore performance. But the Snapdragon is definitely pulling more power, with such a beefed up CPU configuration. I wonder how the thermals are.
 


hemedans

Member
Jan 31, 2015
69
22
81
A16 Bionic vs SD 8 gen2 Geekbench.

Seems like Qualcomm has 'caught up' with Apple in terms of multicore performance. But the Snapdragon is definitely pulling more power, with such a beefed up CPU configuration. I wonder how the thermals are.
Check Geekerwan's Chinese channel. CPU-wise it's not good, and it uses a lot of power.

It's the GPU where the SD 8 Gen 2 shines: more performance than the A16 while using 8 W, and it can sustain 75% of its performance at a clock speed of just 680 MHz.

If Snapdragon gets the needed performance cores from Nuvia, I won't be surprised if it surpasses the Bionic in both CPU and GPU by 2024.
 

FlameTail

Senior member
Dec 15, 2021
210
54
61
It's the GPU where the SD 8 Gen 2 shines: more performance than the A16 while using 8 W, and it can sustain 75% of its performance at a clock speed of just 680 MHz.
Excellent. Excellent.

Looks like we're about to get superb laptop SoCs from Qualcomm. This is the most exciting thing for me. Just like Apple does with their M-series chips for Macs, Qualcomm can scale up their Adreno GPUs and put them in laptop SoCs.

Exciting times ahead.
 
  • Like
Reactions: Tlh97 and hemedans

gdansk

Golden Member
Feb 8, 2011
1,304
1,282
136
A16 Bionic vs SD 8 gen2 Geekbench.

Seems like Qualcomm has 'caught up' with Apple in terms of multicore performance. But the Snapdragon is definitely pulling more power, with such a beefed up CPU configuration. I wonder how the thermals are.
Close as it has been in generations
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,731
723
126
Honestly Samsung offer such amazing trade-in deals I'll probably update my 22 Ultra to the 23 Ultra. Won't cost much and I get a new warranty and from what I can see, definitely a better SoC.
 
  • Like
Reactions: Tlh97 and Lodix

Tup3x

Senior member
Dec 31, 2016
751
656
136
Honestly Samsung offer such amazing trade-in deals I'll probably update my 22 Ultra to the 23 Ultra. Won't cost much and I get a new warranty and from what I can see, definitely a better SoC.
They sure do. That being said, I've been rather disappointed in camera performance (S22 Exynos version here). An upgrade to Snapdragon would be a huge improvement, but if image processing doesn't make large improvements, I'm not sure what to do...
 

FlameTail

Senior member
Dec 15, 2021
210
54
61
Apparently part of the reason why the GPU in the 8 Gen 2 is so powerful/efficient is that Qualcomm increased the number of ALUs to 1536 (up from 1024 in the 8G1) and reduced the clock speed by ~30% compared to the 8G1. Source: Geekerwan video.

A commenter on said video stated that GPUs love to run wide and slow, and that this is the reason for the 8G2 GPU's excellent perf/watt.

This gave me some food for thought. I assume GPUs designed to run at high clocks use the HP (high performance) library of the process node? In that case, if you are going to add more cores but reduce the clock speeds, it makes sense to use an HD (high density) library to fabricate the GPU? HD libraries have limited max clock speeds but are more efficient, and also save die area.

I am no expert, but can someone confirm the validity of my speculation? I am very curious to know 🤔😃
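For what it's worth, the "wide and slow" intuition can be put in numbers using the thread's figures (1536 vs 1024 ALUs, ~30% lower clock) and the usual dynamic-power approximation P ∝ width · f³. This is an idealized sketch with assumed linear scaling, not a claim about the actual Adreno design:

```python
# Relative throughput and dynamic power for "wider but slower",
# using P ~ width * f^3 (f ~ V approximation) as a rough model.
width_ratio = 1536 / 1024   # 1.5x more ALUs
clock_ratio = 0.7           # ~30% lower clock

throughput_ratio = width_ratio * clock_ratio      # raw ALU throughput
power_ratio = width_ratio * clock_ratio ** 3      # dynamic power
perf_per_watt = throughput_ratio / power_ratio

# ~5% more throughput at roughly half the power -> ~2x perf/watt.
print(round(throughput_ratio, 2), round(power_ratio, 2), round(perf_per_watt, 2))
```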
 
Last edited:
Feb 17, 2020
89
213
76
Apparently part of the reason why the GPU in the 8 gen2 is so powerful/efficient is that Qualcomm increased the number of ALUs to 1536 ( up from 1024 in 8g1), and reduced the clock speed by ~30% compared to 8g1. Source: Geekerwan video.

I commenter on the said video stated that GPUs love to run wide and slow, and that is the reason for the 8g2 GPU's excellent perf/watt.

This gave me some food for thought. I assume GPU's designed to run at high clocks use the HP (high performance) library of the process node? In that case, if you are going to add more cores but reduce the clock speeds, it makes sense to use an HD (High Density) library to fabricate the GPU ? HD libraries have limited max clock speeds but are more efficient, and also save die area.
Expert here (chip designer).

That's the V/F curve. Real basic <redacted>. Power ~ switching cap * frequency * voltage^2, but frequency scales with voltage, so really it's Power ~ switching cap * frequency^3. So if you cut frequency in half but make the design twice as wide, assuming you get linear scaling, power gets cut by 75% (2 * (1/2)^3 = 1/4).

As for HP/HD, literally nobody (other than Intel, because they're stupid) actually uses HP libraries. It's all HD.

Edit: late night brain dumb, 1/4 not 1/2
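The arithmetic above can be written out as a tiny sketch, under the same linear f~V simplification:

```python
# P ~ switching cap * f * V^2, and with f ~ V (linear approximation of
# the V/F curve), P ~ switching cap * f^3. Widening the design scales
# switching cap; slowing it scales f.
def relative_power(width_scale, freq_scale):
    """Relative dynamic power under the linear f~V approximation."""
    return width_scale * freq_scale ** 3

# Twice as wide, half the frequency: 2 * (1/2)^3 = 1/4 the power.
print(relative_power(2.0, 0.5))  # 0.25
```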

Profanity is not allowed in the tech areas
Markfw
Anandtech Moderator
 
Last edited by a moderator:

FlameTail

Senior member
Dec 15, 2021
210
54
61
As for HP/HD, literally nobody (other than Intel, because they're stupid) actually uses HP libraries. It's all HD.
This I am not sure of. For one, I have heard that the prime core in chips like the SD 865 uses 'relaxed HP libraries'. The SD 865 has 4× A77 cores, but one of them uses said HP library to enable higher peak frequencies and thus acts as the 'prime core'.

In addition I am also aware that some nodes have such a thing as a UHD library (Ultra High Density). I will try to post the source if I can find it.
 

Thala

Golden Member
Nov 12, 2014
1,355
652
136
That's the V/F curve. Real basic <redacted>. Power ~ switching cap * frequency * voltage^2, but frequency scales with voltage, so really it's Power ~ switching cap * frequency^3. So if you cut frequency in half but make the design twice as wide, assuming you get linear scaling, power gets cut in half ( 2 * (1/2)^3).
A few comments. The cubic scaling only holds locally, because the F/V curve is not linear - it is rather a hyperbolic function, which has a zero/pole roughly at Vth. This means at half the frequency you might need more or less than half the voltage, depending on what your reference point is.
Finally, even if we assume that your calculation is correct, (2 * (1/2)^3) = 1/4 and not just half.
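To illustrate why the F/V curve is far from linear near Vth, here is a sketch using the alpha-power law approximation f ∝ (V − Vth)^α / V. The Vth and α values are made up purely for illustration; only the shape matters:

```python
# Alpha-power law sketch: frequency collapses as V approaches Vth,
# so "half the voltage" is nowhere near "half the frequency" there.
def freq(v, vth=0.35, alpha=1.3):
    """Relative frequency at supply voltage v (illustrative constants)."""
    return (v - vth) ** alpha / v

f_hi = freq(1.0)   # frequency at 1.0 V
f_lo = freq(0.5)   # frequency at 0.5 V, much closer to Vth
# Halving the voltage cuts frequency by far more than half.
print(round(f_lo / f_hi, 3))
```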
 
Last edited by a moderator:
Feb 17, 2020
89
213
76
Few comments. The cubic scaling only holds locally, because the F/V curve is not linear - it is rather a hyperbolic function, which has a zero/pole roughly at Vth. This means at half the frequency you might need more or less than half the voltage - depending on what your reference point is.
Finally, even if we assume, that your calculation is correct, ( 2 * (1/2)^3) = 1/4 and not just half.
Late night brain dumb on the 1/4 vs 1/2 part. As for treating it as linear, it's a simplification that's close enough for teaching a casual person. It's just not realistic to measure all the components that go into a V/F curve, so you need a simplified model. Vth on its own can vary >100 mV chip-to-chip, and the curve also depends heavily on the design and process. Once you get to a certain point, the curve inflects due to paths becoming wire-dominated and becomes way worse than linear. For example, the A12 V/F curve shows a 7x increase in power going from 1.2 GHz (or just a bit less) to 2.4 GHz (or just a bit less), which is practically cubic. Then once you get above 0.9 V it explodes. To me, this indicates the core team designed primarily around the 0.6 V and 0.9 V corners and didn't do much if any work on anything above that, so the design's routes just don't let frequency scale higher.
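The "practically cubic" claim can be checked by fitting P ∝ f^k to the two quoted A12 points; a rough sketch using only the numbers quoted above:

```python
import math

# Fit the exponent k in P ~ f^k from the quoted data points:
# frequency doubles (1.2 GHz -> 2.4 GHz) while power rises ~7x.
k = math.log(7.0) / math.log(2.4 / 1.2)
print(round(k, 2))  # ~2.81, close to the ideal cubic exponent of 3
```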

This I am not sure of. For one, I have heard that the prime core in chips like the SD 865 use 'relaxed HP libraries'. The SD 865 has 4× A77 cores. But one A77 core uses the said HP library to enable higher peak frequencies and thus act as a 'prime core'.

In addition I am also aware that some nodes have such a thing as a UHD library (Ultra High Density). I will try to post the source if I can find it.
I HIGHLY doubt the SD865 has any 3-fin cells in it. If it does, Qualcomm's dumb. They might have synthesized at a tighter frequency and used 8T SRAM instead of 6T, but that's an entirely different thing altogether.

As for UHD, that's a new thing for N3E. N3E has UHD (1+2 fin), HD (2-fin), and HP (3+2 fin) arrangements. UHD and HP are hybrid row, meaning they alternate between 2-fin and 1-fin (UHD) or 3-fin (HP). Hybrid row introduces a lot of design challenges, mainly focused around legalization, multi-row cells, and achieving high utilization (the tool may pack certain rows but leave others empty). As a result, you get dramatically higher runtime on the UHD or HP libraries compared to regular HD. Anyone chasing maximum power/area scaling will still want to use UHD, but for performance-focused parts there will still be plenty of demand for HD. Anyone using HP is an idiot because it doesn't give that much performance, and with HD you can spin your design more times to get better convergence on timing.
 

Shivansps

Diamond Member
Sep 11, 2013
3,657
1,331
136
Excellent. Excellent.

Looks like we're about to get superb laptop SoCs from Qualcomm. This is the most exciting thing for me. Just like Apple does with their M-series chips for Macs, Qualcomm can scale up their Adreno GPUs and put them in laptop SoCs.

Exciting times ahead.
I thought Qualcomm had the 8cx Gen 3 for laptops; that one should have 4× Cortex-X1 and 4× Cortex-A78, without any in-order small cores acting as a handbrake for Windows. And I think this is the way to go: if ARM wants to enter the laptop market, they can't do it with phone SoCs, and unfortunately that's what the Snapdragon 8 Gen 2 is, with those 3 A510s. Laptops and phones have different requirements. But it remains to be seen how much more energy efficient ARM is running only big cores with all the I/O that you do need for a laptop.
 

FlameTail

Senior member
Dec 15, 2021
210
54
61

Excellent video.

One interesting point I noticed is that the Cortex-A715 is barely better than the A710 in power efficiency, which in itself was barely an improvement over the A78.

It seems like we really are hitting a wall of diminishing returns with Austin's core design.
 
