
News [CNBC] AMD, Samsung partner on mobile graphics tech

15W is still a lot more than 6W. Can they still have top class performance rivalling the established players in the mobile space?

And? 15W is including power hungry (in mobile terms) Zen2/3 CPUs. Steam Deck is already claiming 4-15W APU power usage and 1.0-1.6GHz 4WGP RDNA2 to go with that.
 
And? 15W is including power hungry (in mobile terms) Zen2/3 CPUs. Steam Deck is already claiming 4-15W APU power usage and 1.0-1.6GHz 4WGP RDNA2 to go with that.
Steam Deck likely has a bigger battery than the typical smartphone. It's supposed to be 40Whr which a Google search says is impossible to convert into mAh (possibly due to unknown voltage of the Steam Deck's battery).
 
Steam Deck likely has a bigger battery than the typical smartphone. It's supposed to be 40Whr which a Google search says is impossible to convert into mAh (possibly due to unknown voltage of the Steam Deck's battery).

You don't seem to understand basic concepts.
First of all, mAh is a useless measurement when you already have Wh. Why would you want to convert back to a less useful unit? 40 Wh tells you ALL the information you need: it can power a 4 W APU plus a 2 W screen etc. for around 6 hours 40 minutes. What does 8000 mAh tell you? NOTHING. Well, maybe not nothing: it tells you charging at 8 amps (1C) is considered safe, and >24 amps (3C) is possible.
Also, what does battery size have to do with power or performance? The Deck's APU is already firmly within smartphone territory; that's the whole point.
You don't expect smartphones to play games for as long as a portable game console. Larger battery = longer play time. The Deck perhaps has slightly better performance given it's actively cooled, but that has nothing to do with battery size. They could also have used a built-in 19 Wh battery (flagship smartphone battery capacity) and it would just have a shorter battery life (and be lighter). Nothing else would change.
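The Wh-versus-mAh point above can be checked with simple arithmetic. A minimal sketch; the 5 V nominal voltage used below is an assumption chosen only to match the poster's 8000 mAh example, not a confirmed Steam Deck spec:

```python
# Battery arithmetic from the post above: Wh is the directly useful unit,
# while mAh only maps back to energy once you know the nominal voltage.

def runtime_hours(battery_wh: float, load_w: float) -> float:
    """Hours of runtime for a given sustained load in watts."""
    return battery_wh / load_w

def wh_to_mah(battery_wh: float, nominal_v: float) -> float:
    """Convert Wh to mAh; requires knowing the nominal voltage."""
    return battery_wh / nominal_v * 1000

# 40 Wh pack feeding a 4 W APU plus a 2 W screen:
print(f"{runtime_hours(40, 4 + 2):.2f} h")   # 6.67 h, i.e. about 6 h 40 min

# The same 40 Wh expressed in mAh at an assumed 5 V nominal:
print(f"{wh_to_mah(40, 5):.0f} mAh")          # 8000 mAh
```

At 8000 mAh, the post's charging figures also check out: 1C corresponds to 8 A and 3C to 24 A.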
 
Last edited:
They could have used a built-in 19 Wh battery (exactly flagship smartphone battery capacity) and it would just have a shorter battery life (and be lighter). Nothing else would change.
So you are fine with a smartphone that dies after one hour of gaming on the mobile RDNA2 GPU?
 
Initial YT reviews suggest that playing a AAA title may drain Steam Deck's battery in just 90 minutes! So serious gamers will need to carry a hefty power bank with them.

The future isn't quite here yet for RDNA2 in terms of good battery life. Maybe RDNA3...
 
It's not like battery life is long while gaming on mobile phones as it is, and AAA PC games still have significantly more complex graphics than even modern mobile phone games. Furthermore, I don't think the efficiency of RDNA2 in Van Gogh is directly comparable to that in the Exynos 2200 (different manufacturer, different node, and likely different optimizations).
 
Initial YT reviews suggest that playing a AAA title may drain Steam Deck's battery in just 90 minutes! So serious gamers will need to carry a hefty power bank with them.

The future isn't quite here yet for RDNA2 in terms of good battery life. Maybe RDNA3...
This is a thread about the Exynos 2200, not about the Steam Deck. I wouldn't draw conclusions about the Exynos 2200 based on Van Gogh.

You also didn't provide any link or screenshot for that 90-minute battery life claim.
I just checked out a review done by Linus
Screenshot_1.png
That's ~12 W, and it has a longer battery life than the much costlier AYANEO 2021 Pro or OneXPlayer Mini.

Perhaps you watched the review made by Gamers Nexus.
Screenshot_2.png
Here it managed only 87 minutes in DMC5. That's ~27 W power consumption for the whole device with a 40 Wh battery.
Is this really a bad result? Against which competition?
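The ~27 W figure follows directly from the measured runtime: average draw is battery energy divided by hours. A quick back-of-envelope sketch:

```python
# Average device power implied by draining a battery in a given time.

def avg_power_w(battery_wh: float, runtime_min: float) -> float:
    """Average draw in watts when battery_wh is drained in runtime_min minutes."""
    return battery_wh / (runtime_min / 60)

# 40 Wh drained in 87 minutes (the DMC5 run mentioned above):
print(f"{avg_power_w(40, 87):.1f} W")  # 27.6 W
```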
 
Last edited:
True, it has no competition. However, it's not quite the serious gaming on the go device it was hyped up to be. Maybe a die-shrink will fix that, if it sells enough units to justify that.
 
True, it has no competition. However, it's not quite the serious gaming on the go device it was hyped up to be. Maybe a die-shrink will fix that, if it sells enough units to justify that.
It will sell well. It's a PC in a tiny chassis, and the price is pretty good.
The battery is not very big, so even a die shrink won't help that much. Even if the whole device consumed only 10 W, a 40 Wh battery allows only 4 hours of nonstop gaming, which is also not much. It needs a bigger battery, but there is no space, and in my opinion the device is already quite large.
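The point that a die shrink alone helps only so much can be illustrated with a quick sweep over hypothetical device power draws against the fixed 40 Wh pack (the wattage values below are illustrative, not measured):

```python
# Runtime of a fixed 40 Wh battery at various sustained device draws.
BATTERY_WH = 40

for draw_w in (27, 15, 10, 8):
    print(f"{draw_w:>2} W -> {BATTERY_WH / draw_w:.1f} h")
# Even cutting draw to 8 W only yields 5 h: the pack size,
# not the silicon, ultimately caps play time.
```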

Back on topic. I thought we would see some performance results for the Exynos 2200 today, but it looks like I was mistaken. I am very interested in what they managed with the Exynos 2200.
 
Last edited:
True, it has no competition. However, it's not quite the serious gaming on the go device it was hyped up to be. Maybe a die-shrink will fix that, if it sells enough units to justify that.

How many hours do you need in a day or before you can charge again? If you've got a 60 minute commute by bus or train and you recharge at work then this has you covered with time to spare.

Sure it doesn't work for a 3 hour flight, but if there is a game you can play that will last that long then you can always play that instead. If you have to have something that can do 4 hours straight in the absolute worst case it's probably overbuilt for everything else.
 
I'm guessing battery life can be improved through Steam Deck specific optimizations. The screen is small so not that much need to have HQ textures and maybe the LOD bias can be tweaked too. Also, no need for anisotropic filtering more than 4X?
 
For mobile use of the Deck, Valve suggests limiting FPS (and screen refresh) to 30 Hz. DMC5 at high settings with no vsync may well be what constitutes a burner benchmark. It would be interesting to know at what FPS that ran (DMC5 at high with vsync was 60 Hz).
 
Isn't the S22 out yet?
What was the result of this partnership? Did it fail or not?

Anyway... I can't help but think it's a bit of a failure. RDNA2 CUs are not small, yet Qualcomm keeps delivering good, comparable GPU performance with much smaller GPUs.
 

Yup, it looks like bringing high-performance parts over to Samsung's process is not an easy task. RDNA2 might be absurdly power efficient, but something seems to be holding it back at the power envelope it needs to operate in as a phone GPU.

Likely it isn't clocking as high as it "should", given process difficulties, if I were to venture a guess.
 

Yup, it looks like bringing high-performance parts over to Samsung's process is not an easy task. RDNA2 might be absurdly power efficient, but something seems to be holding it back at the power envelope it needs to operate in as a phone GPU.

Likely it isn't clocking as high as it "should", given process difficulties, if I were to venture a guess.
I'm unsure what Samsung's requirements were, but for discrete RDNA2 GPUs and consoles, AMD's approach seems to have been a small die and clocks, clocks, clocks. On mobile, more area might make more sense. Apple's chips are huge but do not require running beyond a process's perf/watt sweet spot.
 
I'm unsure what Samsung's requirements were, but for discrete RDNA2 GPUs and consoles, AMD's approach seems to have been a small die and clocks, clocks, clocks. On mobile, more area might make more sense. Apple's chips are huge but do not require running beyond a process's perf/watt sweet spot.
Yeah, if Samsung wanted higher performance with that config, it should have clocked higher. Otherwise the config is just too small. I guess there was some truth to the rumor that higher clocks were planned originally but turned out to be too inefficient on the node used.
 
So it was a bad combination from the start?
A GPU designed to clock as high as possible to perform well, fabbed by a foundry with recent problems achieving target clocks?

Again, I'm glad for AMD, but Samsung... maybe this experience can be shared with AMD to help them design even lower-power GPUs?

edit: but I did see a review testing the Exynos SoC with actual games and witnessing it throttle much less, offering a much more stable framerate.
 
Last edited:
Not absolutely terrible performance-wise, but the Exynos buyers are getting screwed: they are basically getting a refresh of last year's phone, while the QC buyers are getting a substantial performance upgrade.
 
Yeah, if Samsung wanted higher performance with that config, it should have clocked higher. Otherwise the config is just too small. I guess there was some truth to the rumor that higher clocks were planned originally but turned out to be too inefficient on the node used.

You should not blame the node if the Adreno on the same node performs much better in the same power envelope. I mean, does this surprise anyone? AMD's RDNA2 might be power efficient compared to Nvidia, but better not to compare it to any ultra-mobile GPU architecture. Even ARM Mali would have been a better choice.
 
You should not blame the node
I'm blaming Samsung. They should have known all the parameters: RDNA2's performance, their own node's performance, and how many CUs or how high a frequency they needed for the real-life performance they desired.

Considering this Exynos looks to be not much of an upgrade over the previous one (so far, anyway; some people still hope for significant changes through driver updates, and power-usage-wise it already does seem to be a big upgrade), something somewhere in the planning is obviously amiss.
 
I'm blaming Samsung. They should have known all the parameters: RDNA2's performance, their own node's performance, and how many CUs or how high a frequency they needed for the real-life performance they desired.

This is correct in theory. In practice, when licensing third-party IP, particularly IP that has to be modified, you get the numbers relatively late. So you start with an area budget, because that is the most trivial thing to estimate, and continue with a target clock frequency. That's what Samsung did; however, they had to reduce the frequency several times, which led to the final performance of the product.
 
This is correct in theory. In practice, when licensing third-party IP, particularly IP that has to be modified, you get the numbers relatively late. So you start with an area budget, because that is the most trivial thing to estimate, and continue with a target clock frequency. That's what Samsung did; however, they had to reduce the frequency several times, which led to the final performance of the product.
And that's down to the node.
 
I also blame the Samsung LSI Exynos team primarily, because they keep managing to make their CPUs worse than Qualcomm's despite manufacturing both and the architectures being (nominally) the same. Something seems off there.
 
Currently there are driver issues too, so some benchmarks are even a bit glitchy. Vulkan seems to work best at the moment.

To be honest, I'm not sure why Samsung did this. The latest Mali would have been at least as good.
 