Info Graphcore's new 7nm AI chip

soresu

Platinum Member
Dec 19, 2014
2,657
1,858
136
Graphcore just announced their 250 TeraFLOP FP16 monster AI chip, nVidia beware.....

Link here for more details on it.

Edit: the announcement is very slippery about exactly what a single chip does.

I don't think it's 250 TFLOPS FP32 per chip; I think it's per rack of 4 doing 62.5 TFLOPS each.

FP16 figures are projected at 4x higher than that, so effectively 250 TFLOPS FP16 per chip.
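Spelling out my arithmetic (assumptions: the 250 TFLOPS FP32 figure is for a 4-chip unit, and FP16 throughput is rated at 4x the FP32 rate - my reading of the announcement, not official Graphcore numbers):

```python
# Sanity check of the per-chip throughput reading of the announcement.
rack_fp32_tflops = 250.0   # assumed: the headline FP32 figure is for a 4-chip unit
chips_per_unit = 4
fp16_multiplier = 4        # assumed FP16:FP32 throughput ratio

per_chip_fp32 = rack_fp32_tflops / chips_per_unit
per_chip_fp16 = per_chip_fp32 * fp16_multiplier

print(per_chip_fp32)  # 62.5
print(per_chip_fp16)  # 250.0
```

So the "250" number works out the same per chip in FP16 as it does per rack in FP32, which would explain why the announcement can be slippery about it.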

[Attached image: Graphcore-vs-nvidia.jpg]
 

DisEnchantment

Golden Member
Mar 3, 2017
1,601
5,780
136
AI chip, nVidia beware.....
But strangely no customer mentions so far.
Making silicon is just one part of the process. Having a SW stack is another. Then there's supporting customers on top of that. (AMD learnt that the hard way. Fortunately for them, they now have the backing of the US DoE for ROCm.)
And then there's the tenacity to keep executing despite all those drawbacks so that customers can take you seriously.
And once you get enough exposure, you need enough money to fight lawsuits from NV: there is bound to be some tech they are stepping on that NV has patented many times over. Without a big patent pool to fight back with, that's a real risk for potential customers.
 

soresu

Platinum Member
Dec 19, 2014
2,657
1,858
136
And when you get enough exposure, to have enough money to fight lawsuits from NV, there is bound to be some tech they are stepping on which NV would have patented many times over.
Unlikely. nVidia may have gotten the drop on many, but compared to the GPU market, AI/NPU/IPU/tensor acceleration is still the wild west at the moment*.

If anything it's more likely they would get sued over something software related, since, as you say, that tends to be the linchpin of a strategy.

Though what really seemed glaring by its absence was power consumption figures.

They made a big show of the cost of a Graphcore system vs an nVidia system, but I saw nothing about TDPs at all.

*What makes it easier to patent GPUs is the necessity to adhere to standards like DX and OGL. The whole unified shader methodology of DX10 is a major thing for nVidia patents, I think; I'm pretty sure they sued someone over it.

(Or was that AMD that sued someone, for once?)
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
But strangely no customer mentions so far.
You need to be a big, reliable company to sell such high end kit - no one is going to invest a lot in a company they can't rely on. So I guess this hw is more a sales pitch to sell the company than it is to sell the hardware. Expect some big company trying to get into the AI game to snap them up quickly if they are really any good.
 

soresu

Platinum Member
Dec 19, 2014
2,657
1,858
136
Not sure why this is in the Graphics forum, GraphCore don't make graphics processors.
To be fair it's the closest forum for it, as nVidia has gone the whole nine yards to make AI/ML compute a GPU thing since Volta.

By all means, if someone wants to make a separate AI/ML forum, then I'll move the next such thread there.

Either way it matters, because nVidia, who make GPUs, are in this market, and this is a not insignificant contender against their new Ampere chips, for AI purposes at least.

Also, I'm pretty sure a Graphcore guy was interviewed saying something along the lines of their architecture possibly being suitable for general compute and graphics - though don't quote me on that.

The particularly interesting part of the UE5 demo from a purely technical perspective was that the Epic guys actually said there was a fair amount of software rasterization happening in it.

Obviously this doesn't mean that everybody is suddenly going to start doing the same, but it is a start. Fixed function hardware for that purpose may not always be the thing in the future, and more competitors could be willing to throw their hat in the ring with uArchs closer to pure general compute.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
You need to be a big, reliable company to sell such high end kit - no one is going to invest a lot in a company they can't rely on. So I guess this hw is more a sales pitch to sell the company than it is to sell the hardware. Expect some big company trying to get into the AI game to snap them up quickly if they are really any good.

- My only regret in life is that I have but one like to give to this post.

For all we know this is Graphcore's pitch to NV to buy them...
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
You need to be a big, reliable company to sell such high end kit - no one is going to invest a lot in a company they can't rely on. So I guess this hw is more a sales pitch to sell the company than it is to sell the hardware. Expect some big company trying to get into the AI game to snap them up quickly if they are really any good.

So true. If you invest multiple years building software on Graphcore's proprietary stack, and then Graphcore goes bust... well, there goes years of work, and possibly your job. It's a huge gamble.
 

soresu

Platinum Member
Dec 19, 2014
2,657
1,858
136
For all we know this is Graphcore's pitch to NV to buy them...
Or Intel, AMD, and any number of other interested parties.

nVidia probably isn't going to buy out another compute hardware company like GC at this point. I think PhysX was the last thing even remotely comparable, and that was clearly for the software stack, which is of less and less use now as the likes of Epic develop their own native Chaos physics for Unreal Engine; I would be surprised if Unity are not doing the same under wraps.

Intel definitely could, if it wouldn't look utterly ridiculous for them to do so after at least two other ML focused acquisitions so far with few obvious results to show for them.

AMD would probably be sailing close to the wind to do such a thing at this point, and we still know very little about the ML focused HW in CDNA+.

On the other hand, the oft-overlooked thing is the players that don't necessarily want the tech for competing against compute giants like nVidia so much as for its own sake.

Like Tesla, or some other auto manufacturer looking for uber power-efficient and performant hardware for ADAS and autonomous driving systems to complement their own wares.

Or you could take the longer view of it: Boeing or Lockheed might want to buy it for similar reasons to Tesla, and more besides.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
nVidia probably isn't going to buy out another compute hardware company like GC at this point. I think PhysX was the last thing even remotely comparable, and that was clearly for the software stack, which is of less and less use now as the likes of Epic develop their own native Chaos physics for Unreal Engine; I would be surprised if Unity are not doing the same under wraps.

It's not all going one way. The Lumberyard engine is ditching its proprietary physics engine and moving over to PhysX.

Intel definitely could, if it wouldn't look utterly ridiculous for them to do so after at least 2 other ML focused acquisitions so far with little obvious results to show for it.

What's the best way to fix their lineup of 5 different incompatible AI solutions? Add a 6th one! But don't worry, OneAPI will make all the problems go away *eyeroll*
 

soresu

Platinum Member
Dec 19, 2014
2,657
1,858
136
It's not all going one way. The Lumberyard engine is ditching its proprietary physics engine and moving over to PhysX.
In the grand scheme of things Lumberyard is not much to go by, at least compared to Unreal's and Unity's users/customers/partners.

Sure, it has a handful of big name partners like Star Citizen, but that's drop-in-the-bucket territory compared to what Epic covers alone.

With UE5's extremely impressive demo and Epic's promise of forward compatibility for non custom projects, I would only expect their reach to increase over the next console generation.

The impressive graphics capabilities of CryEngine seemed amazing when Star Citizen got rolling, but UE4 and Unity have really stepped up since then, to say nothing of the monumental mess at Crytek, which has clearly impeded CryEngine's progress.

What's the best way to fix their lineup of 5 different incompatible AI solutions? Add a 6th one! But don't worry, OneAPI will make all the problems go away *eyeroll*
I think you missed the part where I was actually making a dig at Intel about throwing money at the ML market problem through acquisitions.

As for OneAPI, I don't know if it will ever come to anything, or stay open if it does - but I certainly don't expect anything blindingly positive this side of 2025, at least unless they have a pretty huge team working on it 24/7.