Discussion: NV to Re-Enter ARM PC Market in Q1 2026?


marees

Platinum Member
Apr 28, 2024
2,092
2,704
96
It apparently taped out last December.

But Charlie's leak was only published this April.

What were NV, MediaTek, and Microsoft doing in those 90 days?
 
  • Like
Reactions: igor_kavinski

poke01

Diamond Member
Mar 8, 2022
4,627
5,940
106
marees said:
It apparently taped out last December.

But Charlie's leak was only published this April.

What were NV, MediaTek, and Microsoft doing in those 90 days?
I mean, in three years, replace MediaTek with Intel too.

This is why you need to own your hardware stack like AMD and Apple do. Intel owns it now too.

Intel replacing its iGPU IP with Nvidia's will for sure result in delayed products.
 
  • Like
Reactions: Tlh97 and marees

Tup3x

Golden Member
Dec 31, 2016
1,291
1,423
136
What's the problem with the display block? Does it suck compared to MediaTek IP?
If I had to guess, they're having issues integrating that NVIDIA IP, and bugs are making the thing unstable.

Or it's all just guesswork and there are no hardware issues at all. I wouldn't be surprised if the real issue is Microsoft, or that the drivers just aren't ready yet.
 

regen1

Senior member
Aug 28, 2025
268
330
96
Starting Wednesday, Oct. 15, DGX Spark can be ordered on NVIDIA.com. Partner systems will be available from Acer, ASUS, Dell Technologies, GIGABYTE, HP, Lenovo, and MSI, as well as from Micro Center stores in the U.S. and NVIDIA channel partners worldwide.
 

MS_AT

Senior member
Jul 15, 2024
913
1,833
96
So the bit about controlled benchmarks from selected reviewers was true. I guess everybody got the same reviewer guide; unless I have missed it, nobody ever talks about the CPU part of this thing [how it compares in CPU-only inference, how fast it can compile new libs, what the power draw is, whether it works outside the two Linux distros they officially mention].
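For anyone who does get hands-on time, something like the sketch below would answer the CPU-only inference question; llama-bench ships with llama.cpp, and the model path and thread counts here are just placeholders:

```python
# Minimal sketch: CPU-only inference benchmark via llama.cpp's llama-bench.
# Assumes llama.cpp is built locally and a GGUF model exists at MODEL
# (both paths are placeholders, not confirmed for this machine).
import subprocess

MODEL = "models/llama-3-8b-q4_k_m.gguf"  # hypothetical model file

for threads in (4, 8, 16, 20):
    # -ngl 0 offloads zero layers to the GPU, forcing a pure CPU run
    result = subprocess.run(
        ["./llama-bench", "-m", MODEL, "-ngl", "0", "-t", str(threads)],
        capture_output=True, text=True, check=True,
    )
    print(f"--- {threads} threads ---")
    print(result.stdout)
```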
 

MS_AT

Senior member
Jul 15, 2024
913
1,833
96
First power measurements I have seen, and also some interesting bits: https://www.servethehome.com/nvidia-dgx-spark-review-the-gb10-machine-is-so-freaking-cool/

There are a few clear challenges working with the GB10. Somewhat surprisingly, video output is one of those areas that you would think any NVIDIA product would nail. The Spark has been challenging, to say the least. The LG OLEDs we have that are 1440p display a garbled mess out of the HDMI port if set to 1440p output in the OS. Likewise, ultra-widescreen monitors were a no-go.

At idle, when we did our power measurements last week, this system was idling in the 40-45W range. Just loading the CPU, we could get 120-130W. Adding the GPU and other components, we could get to just under 200W, but we did not get to 240W. Something to keep in mind is that QSFP56 optics can use a decent amount of power.

Also, in many of the AI inference workloads with LLMs we were using 60-90W, and the system was very quiet. There is a fan running, but if you are 1-1.5m away it is very difficult to hear, and it never hit 40dBA when we were not stress testing the system.
 

The Hardcard

Senior member
Oct 19, 2021
344
434
136
Really? Why?
It won't be competitive. Both AMD and Nvidia missed the target with 256-bit memory. For LLMs, bandwidth is as important as compute, and the Mac Studios have a 512-bit bus at up to 128 GB of RAM capacity and a 1024-bit bus at up to 512 GB of RAM.

The huge knock on Macs is that the compute is way too low, but the Apple10 GPU architecture resolves that with the neural accelerators (tensor cores): comparable compute with double to quadruple the bandwidth.
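To put rough numbers on that claim (back-of-the-envelope only; the bus widths are from this thread, while the per-pin data rates are my assumptions for each part):

```python
# Back-of-the-envelope peak bandwidth: GB/s = (bus width / 8) * MT/s / 1000.
# Bus widths come from the discussion above; the data rates are assumed
# LPDDR5/LPDDR5X speeds for illustration, not confirmed specs.
def bandwidth_gbps(bus_bits: int, mt_per_s: int) -> float:
    return (bus_bits / 8) * mt_per_s / 1000

parts = {
    "256-bit LPDDR5X-8533 (Spark-class)": (256, 8533),
    "512-bit LPDDR5X-8533 (assumed Mac Studio)": (512, 8533),
    "1024-bit LPDDR5-6400 (assumed Mac Studio Ultra)": (1024, 6400),
}

for name, (bits, rate) in parts.items():
    print(f"{name}: ~{bandwidth_gbps(bits, rate):.0f} GB/s")

# Prints roughly 273, 546, and 819 GB/s: the 2x-4x gap described above.
```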

Apple's MLX framework now has a functioning CUDA backend, so model algorithms run as-is on Nvidia hardware. Most people, if not everyone, wanting a DGX Spark for locally prototyping and fine-tuning a planned datacenter deployment can do better with the upcoming Mac Studios as well.
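That portability is the draw: the same MLX code runs on Apple silicon or, with a CUDA-enabled MLX build, on Nvidia hardware unchanged. A minimal sketch (the array sizes are arbitrary):

```python
# Minimal MLX sketch: identical code runs on Apple silicon's GPU or,
# with a CUDA-enabled build of MLX, on an Nvidia GPU; no device branching.
import mlx.core as mx

a = mx.random.normal((2048, 2048))
b = mx.random.normal((2048, 2048))

c = a @ b    # lazily recorded matmul
mx.eval(c)   # forces evaluation on the default device

print(c.shape, c.dtype)
```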