
Question: x86 and ARM architectures comparison thread

Page 27
Partly because every single keyboard, mouse, and proper (speedy) USB key is still stupidly USB-A.

Also because keyboards/mice are low-speed peripherals. Unless you want to designate a few USB-C ports as low speed, you're going to waste I/O links providing a bunch of full-speed USB-C ports only to have them used for slow HID devices. If you took the capacity for one USB-C port, you could split it into three USB-A ports for a storage drive, keyboard, and mouse, and because the keyboard/mouse are so slow, your storage drive will still run at full speed.
 

Though a full-speed USB-C connector has 24 wires (18 used) and a USB-A 3.0 connector has 9-12 wires (7-8 used), if you split all the wires and counted up the bandwidth and devices it wouldn't matter, because you probably bought a charging cable.
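The bandwidth-splitting argument above is easy to put numbers on. A toy sketch (all figures illustrative, not from any USB spec table): hanging a couple of slow HID devices off a shared link barely dents what's left for a storage drive.

```python
# Toy model of the port-splitting argument: slow HID devices
# (keyboard, mouse) leave a shared link's bandwidth essentially
# untouched for a storage drive. Numbers are illustrative only.

LINK_GBPS = 10.0   # one USB 3.2 Gen 2 link's nominal rate
HID_MBPS = 12.0    # a full-speed HID device (keyboard or mouse)

def remaining_for_storage(link_gbps, hid_devices, hid_mbps=HID_MBPS):
    """Bandwidth left on the link after the HID devices take their share."""
    used = hid_devices * hid_mbps / 1000.0   # convert Mbps -> Gbps
    return link_gbps - used

left = remaining_for_storage(LINK_GBPS, hid_devices=2)  # keyboard + mouse
print(f"{left:.3f} Gbps left for the storage drive")    # 9.976 Gbps
```

Even with both HID devices active, the drive loses well under 1% of the link.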
 
USB-C is still a stellar way to connect to a docking station. Full spec USB 3.1 and up gives you most everything you need in 95% of use cases. However, when I'm on the road, I want at least one USB-A for a mouse or file transfer.
 
Don't remember if this has been talked about yet, but placing an X925 in a mobile/DT solution (GB10) nets it >7% higher IPC than the same core in a smartphone (Xring O1) thanks to uncore changes.
[attachment: IPC comparison chart]
The result of this is that Zen 5 DT is only 5% faster than the X925, while also having a ~13% lead in perf over Zen 5 high-power mobile (Strix Halo).
The X925 is last gen too.
 
Bweeeeee it's n3 opinion discarded
 
Also, how is Apple still making do with Everest and Sawtooth rehashes??

They have to move on from that base soon
 
Different cache configs are likely the only significant change.

Also GB10 is still only as good as the platform.

To play any decent games you are going to need emulation, and even with all the work that's gone into FEX emu you are still going to get a significant emulation tax in both performance and power relative to a native hw solution.

Especially if it lacks the x86-style memory-ordering (TSO) mode that Apple's M-series CPUs have, which mitigates the tax of x86 -> ARM64 translation.
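To caricature that ordering tax: without hardware TSO, a conservative translator has to emit an extra barrier around x86 memory ops to preserve x86's stronger ordering on ARM64's weaker model, roughly doubling the memory-op instruction count. A toy counter (purely illustrative; not how FEX or Rosetta actually translate):

```python
# Toy model of the "memory ordering tax" in x86 -> ARM64 translation.
# x86 guarantees TSO (total store order); plain ARM64 does not, so a
# naive translator adds a barrier per memory op. With a hardware TSO
# mode (as on Apple's M-series), the barriers disappear.

def translated_length(ops, hw_tso):
    """Count emitted ARM64 instructions for a stream of guest ops."""
    count = 0
    for op in ops:
        count += 1                                   # the op itself
        if not hw_tso and op in ("load", "store"):
            count += 1                               # conservative barrier
    return count

ops = ["load", "store", "load", "store", "store"]
print(translated_length(ops, hw_tso=True))   # 5
print(translated_length(ops, hw_tso=False))  # 10
```

Real translators are far smarter about eliding barriers, but the direction of the cost is the point.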
 
It seems some games can be made to run at an acceptable speed: https://forums.anandtech.com/thread...nning-of-transformation.2617658/post-41535822

I agree it's a single example and it might not be representative of other games.

As an aside, back when I was playing on PC 10 years ago, I had to switch back to Windows from Linux, as Wine was not always working as expected (in particular with MT games). Beyond the processor, the OS was an issue. If I ever decided to return to gaming on PC, I'd stick to Windows + x86.
 
And even on PC with Windows, you should mostly use Intel, since most software was fully optimized for that brand... and even for certain generations like Sandy Bridge or Skylake.
 
I hope that a game that was optimized for Sandy Bridge or Skylake works well on any recent CPU, be it from AMD or Intel 🙂
 
This is no longer Everest or Sawtooth. It's funny that some people in the industry continue to refer to Apple's new P and E cores by the same names from the A16 era simply because Apple has stopped giving them code names. The cores from the A15 and A16 have different code names, but the changes in architecture are minimal.
 
this is news to me

laptops are fun
What I find really funny is what they write:
This does not apply to Windows, though.
Remove the capital and it fully applies to windows (vs fullscreen) 🙂
 
Would this apply to current hardware? I figured this was an issue of the GPU not having to be concerned about what's behind the window and keeping it updated. But the comments imply this is related to how work is divided between the iGPU and dGPU, if I'm understanding correctly.
 
That comment looks stupid: there's no dGPU on recent Macs.

And then I realized the article is 8 years old... @poke01 I hate you 😀

EDIT: the comment that looks stupid is the one on the site, just to clarify.
 
🤣 Sorry, here’s something more recent on iGPUs.

Video playback 1080p on Chinese streaming site. Motherboard power consumption:

[attachment: motherboard power consumption chart]
 

It is probably bypassing all the GUI software layers and writing directly to the framebuffer.
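On Linux, "writing directly to the framebuffer" usually means mmapping the fbdev device and storing pixels yourself, skipping the compositor entirely. A hedged sketch below, with a temporary file standing in for /dev/fb0 so it runs anywhere; on real hardware you'd open the device node and match its actual resolution and pixel format:

```python
# Sketch of direct framebuffer writing, as a player bypassing the GUI
# stack might do. A temp file stands in for /dev/fb0 here so the
# example runs without root or real framebuffer hardware.
import mmap
import struct
import tempfile

WIDTH, HEIGHT, BPP = 64, 48, 4            # tiny fake display, 32-bit BGRA

with tempfile.TemporaryFile() as fb:      # real code: open("/dev/fb0", "r+b")
    fb.truncate(WIDTH * HEIGHT * BPP)     # size the "VRAM" region
    buf = mmap.mmap(fb.fileno(), WIDTH * HEIGHT * BPP)
    # Paint one red pixel at (x, y) = (10, 5); no toolkit involved.
    offset = (5 * WIDTH + 10) * BPP
    buf[offset:offset + BPP] = struct.pack("<BBBB", 0, 0, 255, 0)  # B,G,R,A
    buf.flush()
    assert buf[offset + 2] == 255         # red channel landed in the buffer
    buf.close()
```

The upside is exactly what the quoted power numbers suggest: no per-frame compositing work, just memory stores.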
 
All of my keyboards and mice came with USB-C, and my fastest USB drives are USB-C.

I think it is mostly due to the dedicated lanes needed for higher speeds. Adding a bunch of 20-40 Gbps ports, regardless of type, is complex and expensive.
 
Apple ditched USB-A on laptops NINE years ago. You'd think if this was a problem there'd be some backlash on it.
Laptops are then fine to come with no USB-A, as they have a built-in keyboard and trackpad and will most likely be docked via USB-C, but desktops are a big no.

Apple still ships USB-A on its professional desktops.
 