[Videocardz] AMD announces Multiple Embedded Form Factors in various Power Options

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The new offerings cover a broad range of needs, from 192 GFLOPS to 3 TFLOPS of single precision performance, and from 20 to less than 95 watts of thermal design power. The products are offered as a Multi-Chip Module (MCM), Mobile PCI Express® Module (MXM) and PCIe® options, with AMD offering the only MCM solutions. All of these products offer extended support and longevity. The new discrete graphics cards offer the right balance of performance, power and graphics memory size, to meet the needs of most customers.

E8950MXM



Type B Mobile PCI-Express Module (MXM)
32 Compute Units; 3 TFLOPS single precision (Peak)
8GB GDDR5 Memory; 256-bit wide
< 95W Thermal Design Power
Support for 4K hardware-accelerated decode and encode
AMD Eyefinity technology for up to 6 display outputs
Support for DirectX® 12, OpenGL 4.5, and OpenCL™ 2.0

AMD-E8950.jpg


E8870MXM and E8870PCIe



12 Compute Units; 1.5 TFLOPS single precision (Peak)
4GB GDDR5 Memory; 128-bit wide
< 75W Thermal Design Power
Dual HD decode of H.264, VC-1, MPEG-4 and MPEG-2
AMD Eyefinity technology for up to 6 display outputs
Support for DirectX 12, OpenGL 4.5, and OpenCL 2.0

AMD-E8870.jpg


E6465MCM, E6465MXM and E6465PCIe



The power-efficient AMD Embedded Radeon E6465MCM GPU, E6465MXM Module and E6465PCIe Board all provide excellent processing performance at low power in a small form factor, making them well-suited for mobile signage, retail and kiosks, factory human-machine interface systems, conventional military/aerospace heads-up displays, and thin client computers.

2 Compute Units; 192 GFLOPS single precision (Peak)
2GB GDDR5 Memory; 64-bit wide
< 20W Thermal Design Power
Dual HD decode of H.264, VC-1, MPEG-4 and MPEG-2
AMD Eyefinity technology for up to 4 display outputs
Support for DirectX® 11.1, OpenGL 4.5, and OpenCL 1.2

All of the new high-performance and power efficient AMD Embedded Radeon graphics options offer an industry-leading five-year longevity supply commitment. Each supports Microsoft® Windows® 7, Windows® 8.1, Windows® 10, and Linux®.

So essentially close to double the performance of an HD 7850 (roughly the PS4's GPU) inside a <95W TDP.

Videocardz

Computerbase

P.S. The 3 TFLOPS E8950 could be a 2048 SP Tonga at ~725 MHz, or a 1792 SP Tonga at ~850 MHz.
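The arithmetic behind that P.S. is easy to sanity-check: peak single-precision throughput for a GCN part is shaders × 2 FLOPs per clock (one FMA) × clock speed. A minimal sketch — the two Tonga configurations are the speculation above, not confirmed specs:

```python
# Peak single-precision throughput for a GCN GPU:
# each shader retires one FMA (2 FLOPs) per clock.
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# The two candidate Tonga configurations speculated above:
full_tonga = peak_tflops(2048, 725)  # ~2.97 TFLOPS
cut_tonga = peak_tflops(1792, 850)   # ~3.05 TFLOPS
```

Both configurations land within a couple percent of the quoted 3 TFLOPS figure, so the spec sheet alone can't distinguish them.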

Given that 8870 with a 75W TDP already appears to be faster than XB1's GPU, I sure hope Nintendo's NX can incorporate an AMD APU at least somewhat close to XB1's graphics capabilities!
 
Last edited:

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
lol. If the console version of NX is any of these, it would be E6465, especially since it needs to be close in performance to the handheld for easy porting.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
lol. If the console version of NX is any of these, it would be E6465, especially since it needs to be close in performance to the handheld for easy porting.

Come on! You are killing it! :D

That would imply worse GPU performance than 2005's Xbox 360, for God's sake.

Xbox_One_vs_PS4_spec_01.jpg


Also, how would an x86 E6465 help with what most likely will be an ARM-based handheld for easy porting?

On the Wii U:

"The Latte graphics chip contains both a "GX2" GPGPU, which runs Wii U applications, and a "GX" GPU, which enables backwards-compatibility with Wii games. The GX2, designed by AMD, is based on the Radeon R600/R700 architecture and is clocked at approximately 550 MHz.[80][82][note 1] It is manufactured at a 40 nm node and contains 32 MB of eDRAM cache memory, which can also act as L3 cache for the CPU.[note 1] The GX, originally designed by ATI Technologies, contains a 1 MB and a 2 MB banks of eSRAM cache memory.[80] The Latte chip also includes a secondary custom ARM9 processor with 96 KB of SRAM memory that handles system tasks in the background during gameplay or while the system is in sleep mode, and dedicated hardware audio DSP module."

https://en.wikipedia.org/wiki/Wii_U

Why wouldn't they make an x86 AMD APU + custom ARM SoC for the NX handheld?

Your GPU suggestion would mean the NX would not even be able to run Xbox 1/PS4/PC ports easily in 2016-2020. That would make the NX home console an automatic failure imo.

I mean, surely if by 2016 Nintendo cannot release a console with 0.75 TFLOPS of SP GPU performance, they should quit the home console business. By mid-2016, the APUs inside the PS4/XB1 can probably be purchased from AMD directly for $70, considering that by that point they will be three years old, and back in Nov 2013 their cost was estimated at $100-110 per IHS.

409628-ihs-xbox-one-teardown.jpg


Since the NX will find itself in the mid-cycle between XB1/PS4 and XB2/PS5, it would be suicidal to use the E6465 with only 0.19 Tflops of GCN performance. ^_^
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Come on! You are killing it! :D

That would imply worse GPU performance than 2005's Xbox 360, for God's sake.

Xbox_One_vs_PS4_spec_01.jpg


Also, how would an x86 E6465 help with what most likely will be an ARM-based handheld for easy porting?

On the Wii U:

"The Latte graphics chip contains both a "GX2" GPGPU, which runs Wii U applications, and a "GX" GPU, which enables backwards-compatibility with Wii games. The GX2, designed by AMD, is based on the Radeon R600/R700 architecture and is clocked at approximately 550 MHz.[80][82][note 1] It is manufactured at a 40 nm node and contains 32 MB of eDRAM cache memory, which can also act as L3 cache for the CPU.[note 1] The GX, originally designed by ATI Technologies, contains a 1 MB and a 2 MB banks of eSRAM cache memory.[80] The Latte chip also includes a secondary custom ARM9 processor with 96 KB of SRAM memory that handles system tasks in the background during gameplay or while the system is in sleep mode, and dedicated hardware audio DSP module."

https://en.wikipedia.org/wiki/Wii_U

Why wouldn't they make an x86 AMD APU + custom ARM SoC for the NX handheld?

Your GPU suggestion would mean the NX would not even be able to run Xbox 1/PS4/PC ports easily in 2016-2020. That would make the NX home console an automatic failure imo.

I mean, surely if by 2016 Nintendo cannot release a console with 0.75 TFLOPS of SP GPU performance, they should quit the home console business. By mid-2016, the APUs inside the PS4/XB1 can probably be purchased from AMD directly for $70, considering that by that point they will be three years old, and back in Nov 2013 their cost was estimated at $100-110 per IHS.

409628-ihs-xbox-one-teardown.jpg


Since the NX will find itself in the mid-cycle between XB1/PS4 and XB2/PS5, it would be suicidal to use the E6465 with only 0.19 Tflops of GCN performance. ^_^

Nintendo doesn't care about ports. They know by now that third-party support is gone for good (or rather, they're too stupid to understand why they lost their third-party support and thus can't fix it). That's why NX is a platform encompassing both a console and a handheld. They're going to make a console cheap enough that people will buy it even though they'll only buy 2-3 games per year tops, and will basically just play games made for the handheld at a higher resolution. You are right about one thing, though: it will be a custom chip, and it probably won't even be based on GCN!

Back on topic, I wonder if this means that we might finally see AMD release the M390X soon.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Do 100% of monitors use DisplayPort nowadays? No HDMI ports on those.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Do 100% of monitors use DisplayPort nowadays? No HDMI ports on those.

I don't think that's a big deal, because if the chip supports DP 1.2, you can make a product with HDMI 2.0 output. It's up to the OEM integrating the embedded solution to determine the output connection. Besides, DP-to-HDMI 1.4 adapters are fairly cheap. None of the solutions outlined above have the horsepower to natively run games at 1440p-4K anyway; 1920x1200 and below is a far better sweet spot for them.

3 TFLOPS isn't even as fast as the 925 MHz HD 7970 from 2011, so I doubt HDMI 2.0 4K support matters.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why didn't AMD sell those as fanless low-wattage cards???

What? Because they just announced them, meaning they probably couldn't achieve this level of perf/watt previously. It's also doubtful whether such a product would make sense in the laptop/desktop space, since 192 GFLOPS is very low performance. This is clearly intended for specialized uses where the customer isn't willing to pay large premiums for faster NV GPUs/Intel APUs. For the existing laptop/desktop market, such a product wouldn't work.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
What? Because they just announced them, meaning they probably couldn't achieve this level of perf/watt previously. It's also doubtful whether such a product would make sense in the laptop/desktop space, since 192 GFLOPS is very low performance. This is clearly intended for specialized uses where the customer isn't willing to pay large premiums for faster NV GPUs/Intel APUs. For the existing laptop/desktop market, such a product wouldn't work.

If only they could get that performance in a mobile (or Nintendo handheld) power budget.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If only they could get that performance in a mobile (or Nintendo handheld) power budget.

This is what they have right now.

R9 M295X (125W TDP) = ~3.5 Tflops

R9 M290X (100W TDP) = 2.3 Tflops

R9 M280X (80W TDP) = 1.79 Tflops

R9 M275 (60W TDP) = 1.18 Tflops (This is already nearly as powerful as the GPU in Xbox One)

R9 M265X (40W TDP) = 0.8 Tflops (notice this is 4X faster than the E6465 but only double the power usage)

With 16nm and HBM2, they promise to increase perf/watt 2X over all of these parts -- of course, we shouldn't take their marketing slides at face value, though. :biggrin:

BTW, AT confirms that the E8950MXM is a full 2048-shader, 256-bit Tonga in a 95W TDP.

"Starting from the top, the first new Embedded Radeon is the E8950MXM. As alluded to in the name, this product is only available as an MXM card, and is based on AMD's Tonga GPU. This is a full shader enabled Tonga part with all 2048 SPs enabled and paired with 8GB of GDDR5 (using 8Gb chips), making it roughly analogous to the Radeon R9 M295X. In terms of power consumption the E8950MXM is rated for a 95W TDP"
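As a rough cross-check of the perf/watt picture, here is a quick sketch dividing the peak TFLOPS figures quoted in this thread by their TDPs. All of these are peak/marketing numbers, so treat the results as ballpark only:

```python
# Peak GFLOPS per watt, computed from the TFLOPS and TDP figures
# quoted in this thread (marketing peak numbers, not measured perf).
parts = {
    "E6465":    (0.192, 20),
    "R9 M265X": (0.8,   40),
    "R9 M275":  (1.18,  60),
    "R9 M280X": (1.79,  80),
    "R9 M290X": (2.3,  100),
    "R9 M295X": (3.5,  125),
    "E8950MXM": (3.0,   95),
}
perf_per_watt = {name: round(tflops * 1000 / tdp, 1)
                 for name, (tflops, tdp) in parts.items()}
best = max(perf_per_watt, key=perf_per_watt.get)  # E8950MXM, ~31.6 GFLOPS/W
```

By these numbers the E8950MXM tops the list at roughly 31.6 GFLOPS/W, ahead of even the M295X at 28, which is what makes the 95W part interesting.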


If only Nintendo was ballsy enough to put the 95W E8950MXM inside the NX :)
 
Last edited:

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
This is what they have right now.

R9 M295X (125W TDP) = ~3.5 Tflops

R9 M290X (100W TDP) = 2.3 Tflops

R9 M280X (80W TDP) = 1.79 Tflops

R9 M275 (60W TDP) = 1.18 Tflops (This is already nearly as powerful as the GPU in Xbox One)

R9 M265X (40W TDP) = 0.8 Tflops (notice this is 4X faster than the E6465 but only double the power usage)

With 16nm and HBM2, they promise to increase perf/watt 2X over all of these parts -- of course, we shouldn't take their marketing slides at face value, though. :biggrin:

BTW, AT confirms that E8950MXM is a full 2048 shader 256-bit Tonga in a 95W TDP.

"Starting from the top, the first new Embedded Radeon is the E8950MXM. As alluded to in the name, this product is only available as an MXM card, and is based on AMD’s Tonga GPU. This is a full shader enabled Tonga part with all 2048 SPs enabled and paired with 8GB of GDDR5 (using 8Gb chips), making it roughly analogous to the Radeon R9 M295X. In terms of power consumption the E8950MXM is rated for a 95W TDP"


If only Nintendo was ballsy enough to put the 95W E8950MXM inside the NX :)
AMD could simply use the E8950MXM as a cheap alternative to nVIDIA's offerings...
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
This is what they have right now.

R9 M295X (125W TDP) = ~3.5 Tflops

R9 M290X (100W TDP) = 2.3 Tflops

R9 M280X (80W TDP) = 1.79 Tflops

R9 M275 (60W TDP) = 1.18 Tflops (This is already nearly as powerful as the GPU in Xbox One)

R9 M265X (40W TDP) = 0.8 Tflops (notice this is 4X faster than the E6465 but only double the power usage)

With 16nm and HBM2, they promise to increase perf/watt 2X over all of these parts -- of course, we shouldn't take their marketing slides at face value, though. :biggrin:

BTW, AT confirms that E8950MXM is a full 2048 shader 256-bit Tonga in a 95W TDP.

"Starting from the top, the first new Embedded Radeon is the E8950MXM. As alluded to in the name, this product is only available as an MXM card, and is based on AMD's Tonga GPU. This is a full shader enabled Tonga part with all 2048 SPs enabled and paired with 8GB of GDDR5 (using 8Gb chips), making it roughly analogous to the Radeon R9 M295X. In terms of power consumption the E8950MXM is rated for a 95W TDP"


If only Nintendo was ballsy enough to put the 95W E8950MXM inside the NX :)

I was talking about the handheld. If you've been paying attention, NX is (almost certainly) the codename for the platform encompassing both Nintendo's next-generation console and handheld (as separate pieces of hardware running the same OS, with compatible but different specifications). Also, it's Nintendo. I'll be shocked if the console targets more than 50W for the entire system. The Wii U uses less than 30W under full load; 95W is completely out of the question. They also won't use HBM, due to cost and the fact that it's not proven technology.

Also, this post just reminded me: Where the heck is the R9 M390X?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was talking about the handheld. If you've been paying attention, NX is (almost certainly) the codename for the platform encompassing both Nintendo's next-generation console and handheld (as separate pieces of hardware running the same OS, with compatible but different specifications). Also, it's Nintendo.

Is it two separate devices, though, under the NX platform? That was my understanding. I still don't understand why everyone is ignoring the possibility of an x86 AMD APU (giving the NX the ability to take PS4/XB1/PC 3rd-party cross-platform ports) + an ARM SoC?

Everywhere I read it's always one OR the other.

But look at this about the Wii U:

"The Latte graphics chip contains both a "GX2" GPGPU, which runs Wii U applications, and a "GX" GPU, which enables backwards-compatibility with Wii games. The GX2, designed by AMD, is based on the Radeon R600/R700 architecture and is clocked at approximately 550 MHz.[80][82][note 1] It is manufactured at a 40 nm node and contains 32 MB of eDRAM cache memory, which can also act as L3 cache for the CPU.[note 1] The GX, originally designed by ATI Technologies, contains a 1 MB and a 2 MB banks of eSRAM cache memory.[80] The Latte chip also includes a secondary custom ARM9 processor with 96 KB of SRAM memory that handles system tasks in the background during gameplay or while the system is in sleep mode, and dedicated hardware audio DSP module."
https://en.wikipedia.org/wiki/Wii_U

Nintendo was able to incorporate the ENTIRE Wii GPU + an ARM9 processor in the Wii U. Why wouldn't they use an x86 AMD APU + ARM SoC for the NX home console?

I'll be shocked if the console targets more than 50W for the entire console. Wii U uses less than 30W under full load.

So let's see: the Wii U failed miserably because it was underpowered and lacked 3rd-party software, with Nintendo proving that they cannot get 1st-party games out on time and in large enough quantities to support the console. How, then, would a 50W NX reverse the Wii U's horrendous strategy?

95W is completely out of the question.

If they want to compete in North America and Europe, it had better not be out of the question. Launching in 2016 is already a failing strategy because you are competing against the well-established XB1/PS4 user base (friends/relatives/colleagues who haven't bought an XB1/PS4 are likely to buy the Slim versions, because the people they know already own an XB1/PS4 and it's better to play online games with them). It also means the NX will overlap with the PS5/XB2 consoles that are bound to launch in 2019-2020. There is no way around this point.

For that reason, if the NX doesn't at least match the Xbox One in CPU+GPU performance, it's an automatic fail, even if they price it at $199. By the time the NX even launches, the Xbox One and PS4 will have dropped in price by another $50.

They also won't use HBM, due to cost and the fact that it's not proven technology.

Who said anything about HBM?

The R9 M275 is a 60W part with 1.18 TFLOPS of performance.
The E8950MXM is a 95W part with 3 TFLOPS of performance.

The E8950MXM actually has the best performance/watt of all the parts listed in this thread, but I realize Nintendo is too stupid and cheap to go with the obvious best part. At the very least, if they don't match the Xbox One's GPU horsepower, the console is another failure.

If it wasn't for Nintendo's perfect timing in catching the wave of casuals with the Wii, both the Wii and the Wii U would have been their worst-selling consoles of all time. In fact, Nintendo historically made powerful consoles: the NES, SNES and N64 were the most powerful consoles of their generations, and the GameCube's overall horsepower was also superior to the PS2's. Again, if Nintendo doesn't even match the Xbox One's CPU+GPU horsepower, they have no chance of even dreaming of 3rd-party support. And if they give up on 3rd-party support completely, their next console is not going to succeed in key markets. You need 3rd-party support -- which is why the Xbox One, PS4 and PC (Steam/Origin) are so popular.

Also, this post just reminded me: Where the heck is the R9 M390X?

The M390X doesn't sound exciting at all. Take a look:

"The AMD Radeon R9 M390X is a mobile high-end graphics card for laptops. It is most likely based on the same chip as the AMD Radeon R9 M295X in the iMac 5K Late 2014. That means it should be based on the third generation of GCN (desktop Tonga chip). The amount of shader cores is identical but the core clock is reduced by 15% (from 850 MHz to 723 MHz) and the memory is also clocked slower (5448 MHz vs 5000 MHz)."
http://www.notebookcheck.net/AMD-Radeon-R9-M390X.144432.0.html
 
Last edited: