
Discussion Speculation: The Rise of RISC-V

guidryp

Senior member
Apr 3, 2006
995
1,069
136
RISC-V gets mentioned more these days, as a contingency if ARM gets unruly, with Nvidia in the process of buying them.

What we seldom see, though, are interesting RISC-V implementations. But there are some extremely impressive perf/watt claims about a new RISC-V design from Micro Magic:





So are we at the beginning of the rise of RISC-V?
 

Hitman928

Diamond Member
Apr 15, 2012
3,339
3,461
136
Maybe, but this test doesn't tell us much. The benchmark they use is tiny, designed to scale down to very basic microcontrollers (like 8-bit controllers), and we have no info on the Micro Magic processor's specs. If AMD/Intel/ARM wanted to make a tiny CPU where the only metric they cared about was efficiency (giving up actual performance and functionality to get it), this chart would look very different and the Micro Magic processor wouldn't look nearly as good. The CPUs they're using for comparison are a completely different class of processor.
 

amrnuke

Golden Member
Apr 24, 2019
1,149
1,694
96
Andrei already mentioned that their Apple estimations are orders of magnitude off (from another article, but both speak of that Micro Magic chip):

I'm not sure how Andy Huang got the M1 to only score 10,000 CoreMarks, when it actually gives 30,000 in multiple tests, and I'm not sure why they'd estimate 15W instead of actually measuring the M1's power consumption. Seems like a very stupid off-the-cuff remark by Huang that is remarkably simple to disprove.

The Ars Technica chart in the first post seems accurate, as we'd expect, with the M1 getting 10974 CoreMarks/Watt - which is similar to Andrei's findings of ~9558.
 

Hitman928

Diamond Member
Apr 15, 2012
3,339
3,461
136
Andrei already mentioned that their Apple estimations are orders of magnitude off (from another article, but both speak of that Micro Magic chip):

Just to be clear, Andrei is referring to Micro Magic's Apple estimations, not what is shown in the Ars article. What the Ars article shows is fairly accurate for the M1, at least when the big cores are engaged.
 

Hitman928

Diamond Member
Apr 15, 2012
3,339
3,461
136
I'm not sure how Andy Huang got the M1 to only score 10,000 CoreMarks, when it actually gives 30,000 in multiple tests, and I'm not sure why they'd estimate 15W instead of actually measuring the M1's power consumption. Seems like a very stupid off-the-cuff remark by Huang that is remarkably simple to disprove.

The Ars Technica chart in the first post seems accurate, as we'd expect, with the M1 getting 10974 CoreMarks/Watt - which is similar to Andrei's findings of ~9558.
The whole thing is a publicity stunt with very little real info or relevance.
 

gdansk

Senior member
Feb 8, 2011
569
238
116
It seems to me that if an easy-to-decode ISA actually matters, then designers should prefer the simple, extensible RISC-V over ARMv8 for ease-of-design and cost reasons. On the other hand, how good are the RISC-V backends in mainstream compilers? There are so many extensions that they must be targeting a specific set, like RV64GC.
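As a rough illustration of that targeting problem, here's a minimal sketch (the simplified parsing is illustrative, not any real compiler's code) of expanding an ISA string like rv64gc into the individual extensions a backend has to target; the 'g' shorthand bundles IMAFD plus Zicsr/Zifencei per the spec:

```python
# Sketch: expanding a RISC-V ISA string such as "rv64gc" into the
# set of extensions a compiler backend would have to target.
# Only handles simple single-letter strings; real ISA strings also
# carry multi-letter "z*" extensions and version numbers.

G_EXPANSION = ["i", "m", "a", "f", "d", "zicsr", "zifencei"]

def expand_isa_string(isa: str) -> list[str]:
    """Expand a simple rv32*/rv64* ISA string into extension names."""
    isa = isa.lower()
    assert isa.startswith(("rv32", "rv64")), "expects rv32*/rv64* strings"
    exts: list[str] = []
    for ch in isa[4:]:
        if ch == "g":
            exts.extend(G_EXPANSION)  # 'g' is shorthand for a bundle
        else:
            exts.append(ch)
    return exts

print(expand_isa_string("rv64gc"))
# ['i', 'm', 'a', 'f', 'd', 'zicsr', 'zifencei', 'c']
```

Code compiled for one such set won't necessarily run on hardware that implements a different one, which is why compilers pin a baseline like RV64GC.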
 

Doug S

Senior member
Feb 8, 2020
469
677
96
I'm not sure how Andy Huang got the M1 to only score 10,000 CoreMarks, when it actually gives 30,000 in multiple tests, and I'm not sure why they'd estimate 15W instead of actually measuring the M1's power consumption. Seems like a very stupid off-the-cuff remark by Huang that is remarkably simple to disprove.

The Ars Technica chart in the first post seems accurate, as we'd expect, with the M1 getting 10974 CoreMarks/Watt - which is similar to Andrei's findings of ~9558.
Personally when I see such wildly unfavorable claims I assume the person making them is selling snake oil. If the product he was selling (if it ever even sees the light of day) was good, he'd be able to make honest comparisons.

This isn't something slightly shady like "well let's use the Macbook Air and run it in a hot room and run the test a bunch of times in a row before taking our measurement so we get some throttling and hope we knock down the M1's numbers by 30-40%". This is outright lying by a couple orders of magnitude. So I will assume everything else he says is a lie equally as big, and will predict this product never sees the light of day. If I had to guess, they're hoping someone will be fooled and buy/bail them out.

Even if this was honest, the fact he's measuring a product that claims to scale to 5 GHz at 3 GHz is disingenuous. Every CPU has much better perf/watt running at a 40% lower clock rate. So measure the M1 at 2 GHz and see how it does then...
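As a rough sketch of that last point, assuming dynamic power scales as f·V² and that voltage scales roughly linearly with frequency along the DVFS curve (both simplifications; real curves and leakage vary per chip):

```python
# Rough illustration of why perf/watt improves at lower clocks.
# Assumes dynamic power ~ f * V^2 with V ~ f along the DVFS curve;
# this is a simplification, not a model of any specific chip.

def relative_perf_per_watt(f_ratio: float) -> float:
    """Perf/watt at f_ratio of max clock, relative to full clock.

    perf ~ f, power ~ f * V^2 with V ~ f  =>  perf/watt ~ 1 / f^2.
    """
    return 1.0 / (f_ratio ** 2)

# Running at 3 GHz instead of 5 GHz (a 40% lower clock):
print(round(relative_perf_per_watt(3 / 5), 2))  # 2.78x better perf/watt
```

So under even this crude model, benchmarking a "5 GHz-capable" design at 3 GHz buys it nearly a 3x perf/watt advantage for free.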
 

Hitman928

Diamond Member
Apr 15, 2012
3,339
3,461
136
Personally when I see such wildly unfavorable claims I assume the person making them is selling snake oil. If the product he was selling (if it ever even sees the light of day) was good, he'd be able to make honest comparisons.

This isn't something slightly shady like "well let's use the Macbook Air and run it in a hot room and run the test a bunch of times in a row before taking our measurement so we get some throttling and hope we knock down the M1's numbers by 30-40%". This is outright lying by a couple orders of magnitude. So I will assume everything else he says is a lie equally as big, and will predict this product never sees the light of day. If I had to guess, they're hoping someone will be fooled and buy/bail them out.

Even if this was honest, the fact he's measuring a product that claims to scale to 5 GHz at 3 GHz is disingenuous. Every CPU has much better perf/watt running at a 40% lower clock rate. So measure the M1 at 2 GHz and see how it does then...
They're not even trying to sell this product from what I can tell; they're trying to license their IP. This was all for show, to get people to believe how good their IP is. I can't fault them too much, I guess; it worked. I am very disappointed that the tech media pretty much just regurgitated it instead of criticizing the lack of real info, the bad data, the irrelevant comparisons... pretty much everything about it.
 

Doug S

Senior member
Feb 8, 2020
469
677
96
They're not even trying to sell this product from what I can tell; they're trying to license their IP. This was all for show, to get people to believe how good their IP is. I can't fault them too much, I guess; it worked. I am very disappointed that the tech media pretty much just regurgitated it instead of criticizing the lack of real info, the bad data, the irrelevant comparisons... pretty much everything about it.
Well you may be disappointed, but I hope you're not surprised...
 

guidryp

Senior member
Apr 3, 2006
995
1,069
136
Personally when I see such wildly unfavorable claims I assume the person making them is selling snake oil. If the product he was selling (if it ever even sees the light of day) was good, he'd be able to make honest comparisons.
Yes, I ignored the one-sided claims from Micro Magic themselves.

The Ars Technica story reports its own scores for the Apple and AMD cores.

Though they are still relying on numbers supplied by Micro Magic for that side of the comparison.

There are a great many question marks, but this still could be something worth keeping an eye on.
 

name99

Senior member
Sep 11, 2010
400
296
136
RISC-V gets mentioned more these days, as a contingency if ARM gets unruly, with Nvidia in the process of buying them.

What we seldom see, though, are interesting RISC-V implementations. But there are some extremely impressive perf/watt claims about a new RISC-V design from Micro Magic:





So are we at the beginning of the rise of RISC-V?
No.
If you rely on a BS benchmark, which you then cannot even calculate close to correctly, you have UTTERLY squandered your credibility. (As has EE Times, not that they had much to begin with.)

Next question.
 

soresu

Golden Member
Dec 19, 2014
1,572
786
136
RISC-V gets mentioned more these days, as a contingency if ARM gets unruly, with Nvidia in the process of buying them.

What we seldom see, though, are interesting RISC-V implementations.
I'm still waiting on the more powerful Madras Shakti cores to show some sign of life.
 

SarahKerrigan

Senior member
Oct 12, 2014
219
232
116
EEtimes said:
“Using the EEMBC benchmark, we get 55,000 CoreMarks per Watt. The M1 chip is roughly the equivalent of 10,000 CoreMarks in EEMBC terms; divide this by eight cores and 15W total, and that is less than 100 CoreMarks per Watt.” Going on to make a comparison with Arm, he added, “The fastest Arm processor under EEMBC benchmarks is the Cortex-A9 (quad-core), with a figure of 22,343 CoreMarks. Divide this by four cores and 5W per core, and you get 1,112 CoreMarks per Watt.”
We're supposed to believe that a quad-core Cortex-A9 is over twice as fast as M1? And, for that matter, that a quad A9 is a 20W chip? Also, how do they get 100 CM/W from 10000 CM @ 15W?

What am I missing? I work in semi and I'm naturally really suspicious of this kind of claim...
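Redoing the arithmetic on the figures in the quote (these are EE Times' quoted numbers, not measurements) shows where the inconsistency comes in:

```python
# Re-running the arithmetic from the EE Times quote above.
# All inputs are the figures as quoted there.

m1_coremarks = 10_000   # EE Times' (disputed) M1 estimate
m1_watts = 15
a9_coremarks = 22_343   # quad Cortex-A9 score as quoted
a9_watts = 4 * 5        # "5W per core", four cores

# Straightforward reading: score divided by total power.
print(m1_coremarks / m1_watts)      # ~667 CoreMarks/W, not "less than 100"

# The quote's M1 figure only appears if the score is *also* divided
# by core count, which double-counts against the M1:
print(m1_coremarks / 8 / m1_watts)  # ~83 CoreMarks/W

# Whereas the A9 figure was divided by total watts only:
print(a9_coremarks / a9_watts)      # ~1117, matching their "1,112"
```

So the M1 gets its score divided by cores and total power while the A9 only gets divided by total power, which is exactly the kind of inconsistency that makes the comparison meaningless.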
 

DrMrLordX

Lifer
Apr 27, 2000
16,787
5,771
136
On the other hand, how good are the RISC-V backends on mainstream compilers? There are so many extensions they must be targeting a specific set of extensions like RV64GC.
Gotta wonder, eh? I am still waiting for someone to actually start selling a RISC-V CPU/SoC that supports most or all of the known extensions (with the exception of, I think, -P, which has been at least partially deprecated in favor of -V?).
 

soresu

Golden Member
Dec 19, 2014
1,572
786
136
Gotta wonder, eh? I am still waiting for someone to actually start selling a RISC-V CPU/SoC that supports most or all of the known extensions (with the exception of, I think, -P, which has been at least partially deprecated in favor of -V?).
Aren't the major extensions still in a pre-v1 state?

This is something that makes me highly dubious of RISC-V at the moment.

It's all well and good if they say that future revisions of the extension standards will be backwards compatible, but that sounds like they are setting themselves up for a world of hurt and bloat.
 

DrMrLordX

Lifer
Apr 27, 2000
16,787
5,771
136
Aren't the major extensions still in a pre-v1 state?

This is something that makes me highly dubious of RISC-V at the moment.

It's all well and good if they say that future revisions of the extension standards will be backwards compatible, but that sounds like they are setting themselves up for a world of hurt and bloat.
I suspect the same. There are several extensions that are not "frozen" yet, and it is possible that the extensions will actually change over time, which means older hardware won't support them and newer hardware may not be backwards-compatible.

From a laptop/desktop/workstation/server point of view, it would make more sense if they finalized groups of extensions and then required people to support all of them (or as many as possible; 128-bit extensions are pretty niche) in order to carry the RISC-V badge. Then, when new extensions are needed, they can hold them together and update the RISC-V standard in groups of extensions. For microcontrollers that approach makes less sense, since it's nice to be able to trim off unneeded extensions to save on implementation costs.
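A minimal sketch of that "groups of extensions" idea (the profile name and its contents here are hypothetical, not an actual RISC-V profile):

```python
# Sketch: a "profile" is a frozen group of extensions, and a piece
# of hardware either covers the whole group or doesn't qualify.
# Profile name and contents are made up for illustration.

HYPOTHETICAL_PROFILE = {"i", "m", "a", "f", "d", "c", "zicsr"}

def supports_profile(hw_extensions: set[str], profile: set[str]) -> bool:
    """True if the hardware implements every extension in the profile."""
    return profile <= hw_extensions  # subset test

desktop_chip = {"i", "m", "a", "f", "d", "c", "zicsr", "zifencei", "v"}
microcontroller = {"i", "m", "c"}   # trimmed-down implementation

print(supports_profile(desktop_chip, HYPOTHETICAL_PROFILE))     # True
print(supports_profile(microcontroller, HYPOTHETICAL_PROFILE))  # False
```

Software could then target the profile as a unit instead of probing individual extensions, which is exactly what you want for the laptop/desktop/server case and don't want for microcontrollers.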
 
