Explain to me when to use each type of compute technology

Roland00Address

Platinum Member
Dec 17, 2008
So I know the basics of each technology, but can someone give me some explanation of the various compute technologies such as CPU, GPU, FPGA, ASIC, etc., and when it makes sense to use each one for each type of problem you will see in the non-consumer market? What are each technology's pros and cons, strengths and weaknesses?

For example, you use X technology for database servers, this one for web servers, this one for email servers/microservers, this one for scientific calculations such as mapping oil deposits, this one for high frequency trading, etc.

It would also be awesome if you could give a relative size of each market, like "this is a small niche thing but it is expected to grow rapidly" or "this tech is dying out."


---

This post was inspired by Intel's buyout of Altera, the FPGA maker. Intel has been partnering with Altera for about two years: it was announced in Oct 2013 that Altera would use Intel's fabs for some future products, and in Jun 2014 Intel and Altera announced a partnership on a hybrid x86+FPGA part.
 

sm625

Diamond Member
May 6, 2011
Keep in mind that a CPU can have ASIC style logic on it. Take Intel's quicksync for example. You want to use an ASIC whenever you have a very specific set of operations that you want to perform on a constant stream of data, usually network or video data. You switch to an FPGA if you expect the algorithm(s) to change frequently.
 

Roland00Address

Platinum Member
Dec 17, 2008
Keep in mind that a CPU can have ASIC style logic on it. Take Intel's quicksync for example. You want to use an ASIC whenever you have a very specific set of operations that you want to perform on a constant stream of data, usually network or video data. You switch to an FPGA if you expect the algorithm(s) to change frequently.

I understand. An ASIC is the most efficient, but you can't change it; the program is written into the hardware. An FPGA is kind of a hybrid between an ASIC and a CPU, and it can come very close to the performance of either extreme, or not even close, depending on how well it is designed.

I am just trying to get a sense of how this old, but now "kinda new," technology is going to change the compute market on the non-consumer side.

I am asking with this post to get a sense of the forest, and via understanding the forest I hope to better understand the individual trees. And by understanding both the big and small picture, I hope to get a general idea of the ratios of individual trees inside the forest.

Thank you for responding though.
 

videogames101

Diamond Member
Aug 24, 2005
ASIC: Cheapest per unit for massive production (millions of units). High initial cost.

FPGA: Low initial cost, high cost per unit. Often used for prototyping ASICs, or for applications which need more custom high-speed hardware but don't need millions of units. (The defense industry/military loves FPGAs for high-speed, low-volume applications.)

GPUs: Very specific scientific calculations can be accelerated with GPUs (see the CUDA sketch after this rundown). Apart from that, mostly graphics processing :)

Everyone else uses microcontrollers (or their larger cousins, x86 CPUs) because they're cheap as milk, often fast enough, and programmers are ubiquitous. Turns out the hardware design engineers you need for FPGAs and ASICs are more difficult to come by.
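To make the GPU point above concrete, here's a minimal CUDA sketch (my own illustrative example, with hypothetical names and sizes, not taken from any particular product) of the kind of workload a GPU is good at: the same independent arithmetic applied to every element of a big array.

Code:
#include <cuda_runtime.h>
#include <cstdio>

// SAXPY: y[i] = a * x[i] + y[i]. Every element is independent,
// so thousands of GPU threads can each handle one index.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                     // ~1M elements (illustrative size)
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory, visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // one thread per element
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);               // expect 5.0
    cudaFree(x); cudaFree(y);
    return 0;
}

Note how there is no data-dependent branching and no dependence between elements. Databases, web servers, and most business logic don't look like this, which is a big part of why they stay on CPUs.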

For consumer markets, FPGAs don't make much sense because their per-unit cost is so high compared to ASIC designs. The idea of selling FPGAs alongside an x86 CPU also doesn't make sense because no infrastructure exists for consumers, or even most companies, to write or distribute the HDL code needed to make use of those FPGAs. (Much less make efficient use of the x86/FPGA interface!)

FPGAs are a decently sized market. The people who use FPGAs use them for a good reason, and won't be switching to anything else. That being said, I don't see any reason that market would grow at a rapid pace.

To answer your question specifically...
"...database servers, this one for web servers, this email one for microservers, this one for scientific calculations such as mapping oil deposits, this one for high frequency trading, etc"

Most everyone uses x86 CPUs (excepting those crazy guys using IBM Z mainframe CPUs) to do all of the above, although I know the high frequency trading market has been looking at FPGAs. No idea about the penetration there, though.
 

CP5670

Diamond Member
Jun 24, 2004
GPUs have recently become popular for machine learning, specifically for training deep convolutional networks. I'm using them at work for doing face recognition. This usually only needs single precision and can be done on consumer cards, as opposed to double precision for scientific simulations. Not sure about the market size, but it was the main talking point at Nvidia's technology conference a few weeks ago.
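To illustrate the single vs. double precision point, here is a rough device-side sketch (my own hypothetical kernel, not anything shown at the conference) of the core operation of a convolutional layer, written in single precision. Names and dimensions are made up for illustration.

Code:
#include <cuda_runtime.h>

// Naive single-channel 2D convolution: the core operation of a conv-net layer.
// Single precision (float) is accurate enough for training and inference,
// which is why consumer cards with weak double-precision throughput still work.
__global__ void conv2d(const float *in, const float *k, float *out,
                       int w, int h, int kw, int kh)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    float acc = 0.0f;
    for (int ky = 0; ky < kh; ++ky)
        for (int kx = 0; kx < kw; ++kx) {
            int ix = x + kx - kw / 2;   // centre the filter window on (x, y)
            int iy = y + ky - kh / 2;
            if (ix >= 0 && ix < w && iy >= 0 && iy < h)
                acc += in[iy * w + ix] * k[ky * kw + kx];
        }
    out[y * w + x] = acc;
}

Swap float for double in a kernel like this and it still runs, but on a consumer GeForce-class card the arithmetic throughput typically drops by an order of magnitude or more, which is the practical reason double-precision scientific simulations end up on the much pricier compute cards.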

FPGAs are used for one-off prototypes that are not mass-manufactured, especially for real-time processing. They are common in the defense world for radar/comms systems and to a lesser extent in the HFT space.
 

TechyGeek

Member
Feb 23, 2015
Does Facebook count for face recognition?
How about law enforcement (FBI, CIA, etc.)?

How big is that market?