Nintendo Switch is powered by NVIDIA


antihelten

Golden Member
Feb 2, 2012
Smells like early hardware issues more than optimization issues to me

Yeah, it doesn't really look like a general lack of performance is causing these issues, although it should be noted that there has been some speculation that the Switch might be bottlenecked by memory bandwidth in docked mode. This would also explain why Zelda sees such a relatively small jump in resolution (from 720p to 900p), even though the GPU clock is increased by 150% (from 307.2 to 768 MHz).
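
Back-of-the-envelope arithmetic (my own numbers, using the leaked clocks above) shows just how lopsided that trade is:

```python
# Compare the docked GPU clock increase with the pixel-count increase
# Zelda actually shows (720p -> 900p). Clocks are from the leak cited
# above; the arithmetic is just an illustration.
handheld_clock_mhz = 307.2
docked_clock_mhz = 768.0

pixels_720p = 1280 * 720
pixels_900p = 1600 * 900

clock_scale = docked_clock_mhz / handheld_clock_mhz  # 2.5x (+150%)
pixel_scale = pixels_900p / pixels_720p              # ~1.56x (+56%)

print(f"GPU clock scales by {clock_scale:.2f}x")
print(f"Rendered pixels scale by only {pixel_scale:.2f}x")
# If shader throughput were the only limit, resolution could scale much
# closer to the clock; memory bandwidth, which rises far less than the
# GPU clock when docked, is a plausible culprit.
```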
 

Grooveriding

Diamond Member
Dec 25, 2008
Teardown of the Switch: https://imgur.com/a/IVQTT#Cuq9loV

die shot

 

antihelten

Golden Member
Feb 2, 2012
It's also worth noting that there are two revisions of the X1, with the old Shield TV using the older A1 revision and the new Shield TV as well as the Switch using the newer A2 revision. I doubt there's any significant difference between the two though.
 

antihelten

Golden Member
Feb 2, 2012
http://www.shacknews.com/article/99018/rumor-nintendo-switch-graphics-and-hardware-specs-leaked

Looks like it's basically full speed, except devs can only use 3 out of the 4 CPU cores.

Nah, not really. The leak they talk about doesn't actually list the final clocks, only the maximum clocks the chip is capable of. In fact, the leak specifically mentions that the final clocks are TBD.

Also, it's worth noting that the leak is apparently about an older devkit, not the final consumer version.

http://dystify.com/SwitchLeak
 

Krteq

Golden Member
May 22, 2015
What exactly is "custom" on this chip?



Specs are almost the same as the X1, except video output is limited to 60 FPS @ FHD / 30 FPS @ 4K (a limit of HDMI 1.4 and of the screen used on the Nintendo Switch, not of the SoC itself).
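
The HDMI 1.4 explanation checks out with quick arithmetic (a sketch assuming nominal 24-bit RGB and HDMI 1.4's roughly 8.16 Gbps of effective video bandwidth after 8b/10b encoding; blanking overhead is ignored):

```python
# Sanity check of the HDMI 1.4 video output limits (24-bit RGB,
# active pixels only; real links also spend bandwidth on blanking).
EFFECTIVE_HDMI14_GBPS = 8.16  # 10.2 Gbps TMDS minus 8b/10b overhead

def video_gbps(width, height, fps, bpp=24):
    """Raw data rate for the active pixels of a video mode, in Gbps."""
    return width * height * fps * bpp / 1e9

for name, w, h, fps in [("1080p60", 1920, 1080, 60),
                        ("4K30", 3840, 2160, 30),
                        ("4K60", 3840, 2160, 60)]:
    rate = video_gbps(w, h, fps)
    verdict = "fits" if rate < EFFECTIVE_HDMI14_GBPS else "exceeds"
    print(f"{name}: {rate:.2f} Gbps -> {verdict} HDMI 1.4")
# 4K60 needs ~11.9 Gbps even before blanking, so HDMI 1.4 tops out at
# 4K30 -- matching the Switch's advertised output limits.
```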
 

poofyhairguy

Lifer
Nov 20, 2005
What exactly is "custom" on this chip?

The X1 in the Shield has 4x ARM Cortex-A53s for big.LITTLE, but the Switch SoC doesn't seem to have them.

Another key difference between the Switch and the devices that have had X1s previously (like the Shield TV or the Pixel C) is 4GB of RAM instead of 3GB.
 

Krteq

Golden Member
May 22, 2015
Hmm, even NV lists only the 4x A57 for the Tegra X1, so the 4x A53 could still be included in the Switch SoC but simply not listed.

Anyway, the Jetson TX1 was using 4GB too.

I still can't find any differences between the Tegra X1 and the SoC used in the Switch.
 

Face2Face

Diamond Member
Jun 6, 2001
The X1 as it stands is a pretty capable SoC. It will be nice to see what we are looking at spec-wise once it's released and someone does a deep dive.

For anyone who hasn't seen it, here is a Tegra X1 playing Crysis 3. I don't know any specifics, as they aren't given, but it's still impressive even if it's at 720p.

 

antihelten

Golden Member
Feb 2, 2012
What exactly is "custom" on this chip?

Probably nothing (other than a bit of binning). The lack of the A53 cores is probably just because they aren't enabled; they are most likely still physically present on the die. The slight amount of die space that would be saved by excluding the A53s wouldn't come close to making up for the expense of Nvidia designing and validating a new revision of the Tegra X1.

To be fair, though, it does seem kind of silly that Nintendo doesn't make use of the A53 cores. They are apparently dedicating one of the A57 cores to the OS, a job I would imagine is well suited to the A53 cores, which would leave all four A57 cores for gaming. But maybe power usage was a concern here.
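
For what it's worth, the rumored split (one core for the OS, three for games) is the kind of thing normally done with CPU affinity. A minimal sketch on Linux, purely illustrative since the Switch's OS is proprietary and certainly does this differently:

```python
import os

# Hypothetical illustration of the rumored core split: the OS keeps
# core 0 to itself while a game process is pinned to cores 1-3.
# os.sched_setaffinity is Linux-only; the Switch does not run Linux,
# so treat this strictly as a sketch of the concept.
GAME_CORES = {1, 2, 3}  # the three A57 cores exposed to developers

os.sched_setaffinity(0, GAME_CORES)  # pid 0 = the current process
print("Game process restricted to cores:", os.sched_getaffinity(0))
```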
 

SPBHM

Diamond Member
Sep 12, 2012
The X1 as it stands is a pretty capable SoC. It will be nice to see what we are looking at spec-wise once it's released and someone does a deep dive.

For anyone who hasn't seen it, here is a Tegra X1 playing Crysis 3. I don't know any specifics, as they aren't given, but it's still impressive even if it's at 720p.


It kind of is, but at the same time I think it's limited to a demo map. And let's not forget that even the PS3/360 have the full game; the Tegra X1 should beat those old consoles easily...

Now, the problem with the Switch is that the rumored clocks are really low compared to the Shield TV.

The photos make it look really like a regular Tegra X1. Perhaps there is something custom to it, but I doubt they changed it much. Also, in the end the memory bandwidth is a good indicator of performance, since I don't think they would build something too unbalanced. If I'm not mistaken, at best we are looking at 64-bit LPDDR4-3200? That's super low even compared to a 750 Ti/950 and so on, and it's shared with the CPU, so I think more than 256 SPs wouldn't really make a lot of sense, and they really are power constrained (a lot more than the Shield TV)...

Nintendo said their SoC is custom; would different clocks and disabled 'little' cores be enough to call it a custom Tegra X1?
 

antihelten

Golden Member
Feb 2, 2012
Now, the problem with the Switch is that the rumored clocks are really low compared to the Shield TV.

That depends on whether we're looking at handheld mode or docked mode. In handheld mode the clocks are significantly lower, but in docked mode they should be roughly comparable. It's important to remember that while the Shield TV can go up to 1 GHz on the GPU, in actual usage it can potentially throttle down as low as 614 MHz (the Switch is allegedly 768 MHz in docked mode). Of course, this throttling was demonstrated with a Unity demo and 3DMark, so we don't know what amount of throttling (if any) happens with Crysis specifically.

The photos make it look really like a regular Tegra X1. Perhaps there is something custom to it, but I doubt they changed it much. Also, in the end the memory bandwidth is a good indicator of performance, since I don't think they would build something too unbalanced. If I'm not mistaken, at best we are looking at 64-bit LPDDR4-3200? That's super low even compared to a 750 Ti/950 and so on, and it's shared with the CPU, so I think more than 256 SPs wouldn't really make a lot of sense, and they really are power constrained (a lot more than the Shield TV)...

Yes, it appears to use two 32-bit LPDDR4-3200 modules, more specifically these ones: K4F6E304HB-MGCH. This would give the Switch 25.6 GB/s of bandwidth.

That is indeed significantly lower than something like a GeForce GTX 950, but the Switch is also significantly slower than the 950 (roughly 78% slower in docked mode, with 76% lower bandwidth).
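
Those percentages fall out of the published numbers. A rough check, assuming 256 Maxwell cores at the rumored 768 MHz docked clock and a stock GTX 950 (768 cores, ~1188 MHz boost, 105.6 GB/s) for comparison:

```python
# Rough Switch-vs-GTX-950 comparison from published/rumored figures.
def gflops(cores, clock_mhz):
    """Peak FP32 throughput: 2 FLOPs (one FMA) per core per cycle."""
    return cores * clock_mhz * 2 / 1000

def bandwidth_gbs(bus_bits, mt_per_s):
    """Memory bandwidth from bus width and transfer rate."""
    return bus_bits / 8 * mt_per_s / 1000

switch_bw = bandwidth_gbs(64, 3200)  # 2 x 32-bit LPDDR4-3200 = 25.6 GB/s
switch_fl = gflops(256, 768)         # docked clock -> ~393 GFLOPS
gtx950_bw = 105.6
gtx950_fl = gflops(768, 1188)        # ~1825 GFLOPS at boost

print(f"Switch docked: {switch_bw:.1f} GB/s, {switch_fl:.0f} GFLOPS")
print(f"vs GTX 950: {1 - switch_bw / gtx950_bw:.0%} less bandwidth, "
      f"{1 - switch_fl / gtx950_fl:.0%} less compute")
# ~76% less bandwidth alongside ~78% less compute, i.e. the
# bandwidth-to-compute ratio is roughly in line with the 950's.
```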
 

SPBHM

Diamond Member
Sep 12, 2012
That depends on whether we're looking at handheld mode or docked mode. In handheld mode the clocks are significantly lower, but in docked mode they should be roughly comparable. It's important to remember that while the Shield TV can go up to 1 GHz on the GPU, in actual usage it can potentially throttle down as low as 614 MHz (the Switch is allegedly 768 MHz in docked mode). Of course, this throttling was demonstrated with a Unity demo and 3DMark, so we don't know what amount of throttling (if any) happens with Crysis specifically.



Yes, it appears to use two 32-bit LPDDR4-3200 modules, more specifically these ones: K4F6E304HB-MGCH. This would give the Switch 25.6 GB/s of bandwidth.

That is indeed significantly lower than something like a GeForce GTX 950, but the Switch is also significantly slower than the 950 (roughly 78% slower in docked mode, with 76% lower bandwidth).

I guess you are right that the sustained clock on the Shield TV might not be that great. But the Shield TV runs its CPU cores at up to 2 GHz while the Switch runs at 1 GHz even when docked, so that might be very helpful in keeping the GPU locked at 768 MHz and not below. At the same time, a 1 GHz A57 is not going to be that great.

Also, the Switch is based on a newer revision of the Tegra X1, the same one used in the 2017 Shield TV and not in the previous ones, it seems, so the 2017 Shield TV might also not throttle the GPU as badly as the original ones? In any case, the Shield TV seems to have access to more cooling and power, so its hardware/clocks should be superior (well, in practice it's not, because it runs Android). That's what I was trying to say.
 

antihelten

Golden Member
Feb 2, 2012
I guess you are right that the sustained clock on the Shield TV might not be that great. But the Shield TV runs its CPU cores at up to 2 GHz while the Switch runs at 1 GHz even when docked, so that might be very helpful in keeping the GPU locked at 768 MHz and not below. At the same time, a 1 GHz A57 is not going to be that great.

Keeping the CPU at 1 GHz definitely helps sustain a higher clock on the GPU. The guy who did the tests on the Shield TV actually tried locking its CPU to 1 GHz and saw the GPU sustain a solid 1 GHz, so it's probably safe to say that the GPU clock on the Switch is helped by the lower CPU clock. If it were running the CPU at 2 GHz like the Shield TV, the GPU would undoubtedly run significantly slower.
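
A toy model of why that trade-off works (every number below is made up for illustration; the X1's real power curves aren't public): dynamic power grows superlinearly with clock, so halving the CPU frequency frees a disproportionate slice of a fixed SoC power budget for the GPU.

```python
# Toy DVFS model: dynamic power ~ f * V^2 and voltage rises with
# frequency, so power grows roughly cubically with clock. The constant
# and the budget are invented purely to illustrate the shape of the
# trade-off; they are NOT measured Tegra X1 figures.
def core_power_w(freq_ghz, k=0.25):
    return k * freq_ghz ** 3

SOC_BUDGET_W = 10.0  # hypothetical shared CPU+GPU power budget

for cpu_ghz in (2.0, 1.0):
    cpu_w = 4 * core_power_w(cpu_ghz)  # four A57 cores
    gpu_w = SOC_BUDGET_W - cpu_w       # headroom left for the GPU
    print(f"CPU @ {cpu_ghz:.1f} GHz: {cpu_w:.1f} W -> {gpu_w:.1f} W for GPU")
# With these made-up numbers, capping the CPU at 1 GHz frees ~7 W of
# headroom, which is why a lower CPU clock lets the GPU hold 768 MHz
# instead of throttling.
```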

And yeah, three A57 cores at 1 GHz are going to be a limitation, and it's going to be interesting to see how developers handle it.

Also, the Switch is based on a newer revision of the Tegra X1, the same one used in the 2017 Shield TV and not in the previous ones, it seems, so the 2017 Shield TV might also not throttle the GPU as badly as the original ones? In any case, the Shield TV seems to have access to more cooling and power, so its hardware/clocks should be superior (well, in practice it's not, because it runs Android). That's what I was trying to say.

There have been some benchmarks done of the new 2017 Shield TV, and it does indeed look like it performs better than the old version, which could very well be down to less throttling.

Of course this isn't particularly relevant as far as the Crysis demo is concerned, since that was run on the old Shield TV.
 

Krteq

Golden Member
May 22, 2015
So, TechInsights has definite proof that Nintendo is using a standard Tegra X1, not a "custom Tegra processor" like NV said.

TechInsights said:
After subsequent processing of the GPU from the Nintendo Switch, we have determined that the processor is the Nvidia Tegra T210. The T210 CPU features 4 Cortex-A57 and 4 Cortex-A53 processor cores, and the GPU is a GM20B Maxwell core.





TechInsights - Nintendo Switch Teardown
 

HOOfan 1

Platinum Member
Sep 2, 2007
So, they bring out their new hardware with two-year-old technology, and it is running at clock speeds well below the Nvidia Shield TV?
 

Face2Face

Diamond Member
Jun 6, 2001
Keep in mind, guys, the Switch is running BSD, not Android. Also, NVIDIA developed the NVN API for the Switch, so it should have very low-level access to the hardware, like the current-gen consoles do. Looks like Vulkan is supported as well.

I'm also trying to think who else Nintendo could have used when it comes to low-power SoCs. Qualcomm makes decent-performing SoCs, but Nintendo wouldn't get half the support they get from NVIDIA. AMD would be great, but nothing in their product stack could perform as well at 8 W. An Apple A10 with a PowerVR GPU would be great, but that isn't happening. Curious to hear what you guys think: what other choice did they have for their "console" vision?

I'm personally looking forward to a Switch-like product that can run all of my Steam library. I don't see it happening until AMD steps up, or Intel does with their graphics division.
 

Qwertilot

Golden Member
Nov 28, 2013
The obvious potential option was a Tegra running Pascal. More money per chip and some technological risk involved, of course, so you can see why not.
 