Intel Announces On-Package FPGA + Xeon Product

kimmel

Senior member
Mar 28, 2013
248
0
41
http://www.theregister.co.uk/2014/06/18/intel_fpga_custom_chip/
The chip company announced on Wednesday at GigaOm Structure in San Francisco that it has started selling a Xeon E5-FPGA hybrid chip to some of its largest customers.
http://new.livestream.com/gigaom/structure2014/videos/54241911

Would be interesting to play with. Probably will be hard to get hold of even after it does come out, though. The article is written in the present tense, but the interview doesn't give a release date.

Cache coherent over QPI as well.
 

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
This is excellent news for at least 2 reasons. 1) Intel is moving toward accelerators, which I believe is the future for energy-efficient computing. 2) Intel is selling something that has a programmable non-x86 component. Next up, they need to make an ARM core, and use it to take over the world.
 

Ayah

Platinum Member
Jan 1, 2006
2,512
1
81
The major question will be cost. If the FPGA portion has to carry an Intel-sized markup, it may not be price-competitive against just tacking an FPGA onto PCIe.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
This is excellent news for at least 2 reasons. 1) Intel is moving toward accelerators, which I believe is the future for energy-efficient computing. 2) Intel is selling something that has a programmable non-x86 component. Next up, they need to make an ARM core, and use it to take over the world.

I'm excited. I know companies are all trying to get a competitive advantage by differentiating themselves from one another. Software is one way and system configuration is another. Putting FPGAs in allows Microsoft or Google to program in very specific functions that differentiate them even more. So that's good for them... and it's good for Intel.

This does two things. It sort of jumps ahead of the trend by providing a solution for customers, so their eyes don't wander and they don't start buying other parts. On top of that, it prevents ISA bloat by not putting in a lot of hardware for very weird instructions that only one or two customers really want.

The FPGA implementation probably won't beat out common arithmetic operations on the CPU (or GPU) due to its density and frequency disadvantage, but it'll allow customers to start putting in weird bit-manipulation instructions that no one except them wants. :p
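To make that concrete, here's a toy illustration of the kind of bit twiddling I mean (my own sketch, not anything Intel has shown): interleaving the bits of two words into a Morton code. In software it's a loop or a stack of shift/mask tricks (or BMI2 PDEP on the newest cores); in FPGA fabric it's literally just wiring, so a customer-specific variant costs next to nothing.

[CODE]
#include <stdint.h>
#include <stdio.h>

/* Hypothetical "weird bit manipulation" candidate: interleave the
 * bits of x and y into a 64-bit Morton code. A loop of shifts and
 * masks on a CPU; pure routing in FPGA fabric. */
static uint64_t morton_interleave(uint32_t x, uint32_t y)
{
    uint64_t z = 0;
    for (int i = 0; i < 32; i++) {
        z |= ((uint64_t)((x >> i) & 1u)) << (2 * i);
        z |= ((uint64_t)((y >> i) & 1u)) << (2 * i + 1);
    }
    return z;
}

int main(void)
{
    /* Expect 5555555555555555: all x bits land in the even positions. */
    printf("%016llx\n", (unsigned long long)morton_interleave(0xFFFFFFFFu, 0u));
    return 0;
}
[/CODE]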
 
Last edited:

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
The FPGA implementation probably won't beat out common arithmetic operations on the CPU (or GPU) due to its density and frequency disadvantage, but it'll allow customers to start putting in weird bit-manipulation instructions that no one except them wants. :p

Some of my recent research has been into hardware acceleration of sorting functions for big data workloads, and there is a huge amount of performance to be gained even when the accelerator (or FPGA) is running at only a few hundred MHz.

EDIT: here are a couple papers people should look into if they want to know more about what I'm talking about --
http://arcade.cs.columbia.edu/harp-isca13.pdf
http://arcade.cs.columbia.edu/q100-asplos14.pdf
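For a sense of scale (my own back-of-envelope numbers, not figures from the papers above): a pipelined merge sorter that streams one element per cycle doesn't need a high clock to keep up with a core that pays for branch mispredicts and cache misses on every comparison.

[CODE]
#include <stdio.h>

/* Back-of-envelope only; every number here is an assumption, not a
 * result from the HARP/Q100 papers linked above. */
int main(void)
{
    double fpga_clk_hz    = 250e6;          /* "a few hundred MHz"         */
    double fpga_elems_s   = fpga_clk_hz;    /* 1 element/cycle, pipelined  */

    double cpu_clk_hz     = 3.0e9;
    double passes         = 30.0;           /* ~log2(1e9) mergesort passes */
    double cycles_per_cmp = 5.0;            /* mispredicts + cache misses  */
    double cpu_elems_s    = cpu_clk_hz / (passes * cycles_per_cmp);

    printf("FPGA stream sorter: ~%.0f M elements/s\n", fpga_elems_s / 1e6);
    printf("One CPU core:       ~%.0f M elements/s\n", cpu_elems_s / 1e6);
    return 0;
}
[/CODE]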
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,832
7,281
136
Yeah, this is a big deal. Keeping Google/Yahoo/Bing/Amazon on x86 (as opposed to doing similar stuff on ARM) is extremely important since that is a high growth area.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,697
4,658
75
I actually expected FPGAs to be added to CPUs as accelerators as far back as ten years ago. Of course, that was before GPGPU acceleration.

I also expected them to be less expensive than they are. I wonder what keeps the prices up? Patents?
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Some of my recent research has been into hardware acceleration of sorting functions for big data workloads, and there is a huge amount of performance to be gained even when the accelerator (or FPGA) is running at only a few hundred MHz.

EDIT: here are a couple papers people should look into if they want to know more about what I'm talking about --
http://arcade.cs.columbia.edu/harp-isca13.pdf
http://arcade.cs.columbia.edu/q100-asplos14.pdf

For sure. There are definitely applications that allow significant performance gains even at a density and frequency disadvantage. No doubt about it. Just don't indulge in any wishful thinking about an FPGA outperforming a GPU at DGEMM.
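Rough peak-throughput numbers for why (all assumed and rounded, roughly 2014-era parts, not vendor specs): a GPU has thousands of FP32 lanes doing a fused multiply-add every cycle, while a floating-point multiply-add on an FPGA burns several DSP blocks plus fabric at a few hundred MHz.

[CODE]
#include <stdio.h>

/* Crude peak-FLOPS comparison; every number here is an assumption. */
int main(void)
{
    /* GPU: ~2880 FP32 lanes, ~750 MHz, FMA = 2 flops/cycle/lane. */
    double gpu_flops  = 2880.0 * 2.0 * 0.75e9;

    /* FPGA: assume ~1000 effective FP32 multiply-add units after the
     * DSP/fabric cost of floating point, clocked at ~300 MHz. */
    double fpga_flops = 1000.0 * 2.0 * 0.3e9;

    printf("GPU peak:  ~%.1f TFLOPS\n", gpu_flops / 1e12);
    printf("FPGA peak: ~%.1f TFLOPS\n", fpga_flops / 1e12);
    return 0;
}
[/CODE]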
 

jpiniero

Lifer
Oct 1, 2010
16,832
7,281
136
I actually expected FPGAs to be added to CPUs as accelerators as far back as ten years ago. Of course, that was before GPGPU acceleration.

Intel has rather actively worked against accelerators of any kind, wanting to keep the lock-in to x86 and keep the CPU the focus of the computer. They're only doing this now because they fear Google et al. will leave them for ARM.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Intel has rather actively worked against accelerators of any kind, wanting to keep the lock-in to x86 and keep the CPU the focus of the computer. They're only doing this now because they fear Google et al. will leave them for ARM.

Could you provide evidence of this? Intel has even helped FPGA makers produce FPGAs on the latest node for years.

So I would really like to see any documentation of this.
 

jpiniero

Lifer
Oct 1, 2010
16,832
7,281
136
Could you provide evidence of this? Intel has even helped FPGA makers produce FPGAs on the latest node for years.

Well, if you're talking about recent history, PCI Express is such a bottleneck that a lot of programs that could use FPGAs/GPUs/other accelerators either can't, or get only limited benefit. Given Intel's history, that's by design. Intel is only slowly opening things up because it has to.
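As a toy illustration of that bottleneck (assumed numbers, not measurements): if the accelerator has to copy its working set over PCIe in both directions, the bus can easily eat most of the wall-clock time, which is exactly what a coherent on-package attach avoids.

[CODE]
#include <stdio.h>

/* Offload-over-PCIe tax, with made-up but plausible numbers. */
int main(void)
{
    double data_gb   = 1.0;    /* working set copied each way        */
    double pcie_gbps = 12.0;   /* practical PCIe 3.0 x16 throughput  */
    double kernel_ms = 10.0;   /* assumed accelerator compute time   */

    double copy_ms = 2.0 * data_gb / pcie_gbps * 1000.0;  /* to + from card */
    printf("copy: %.0f ms, compute: %.0f ms -> %.0f%% of the time is the bus\n",
           copy_ms, kernel_ms, 100.0 * copy_ms / (copy_ms + kernel_ms));
    return 0;
}
[/CODE]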
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Well, if you're talking about recent history, PCI Express is such a bottleneck that a lot of programs that could use FPGAs/GPUs/other accelerators either can't, or get only limited benefit. Given Intel's history, that's by design. Intel is only slowly opening things up because it has to.

That's your argument for "Intel working actively against" accelerators?

Intel has worked with FPGA designers for quite a few years, you know.

So let's drop the random FUD and nonsense.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Do you think FPGAs are cheap?

There is nothing inherently expensive about FPGAs. The only reason they are expensive is that they are a niche product. For an equal number of transistors, there is no reason they would be any more expensive to manufacture than a processor. Overall, they could be sold much more cheaply, because their design costs are so low thanks to their highly homogeneous design.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,061
570
136
AMD's Torrenza never really took off. That was an FPGA in an Opteron socket, though. I'll be curious to see how well these do.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
As expected, SemiAccurate has a negative spin on it.

http://semiaccurate.com/2014/06/20/intels-fpga-announcement-simply-palpable-desperation/

I love how the dude puts anything that might be remotely interesting behind a paywall, but all his anti-Nvidia and anti-Intel propaganda comes free!

Like the brother of Daniel Nenni.
I can't tell that he doesn't like Intel, not at all; that is totally unbiased coverage.

Wake me up when machines start re-programming FPGAs without human intervention.