Why don't they just write a program to design a cpu?
Have you ever written Verilog or VHDL? Have you ever had to take Verilog from someone who thinks #1ps is a synthesizable construct, or who enjoys writing Verilog that looks like if(foo==1'bx)? Have you ever used EDA tools from Synopsys and Cadence? Have you ever had someone look baffled when you tell them they just blew through the entire area budget for their design when all they did was add one line of Verilog (which created a giant array of cells)?
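For the curious, here's a contrived sketch (module and signal names are invented, not taken from any real design) of the kind of RTL being described: a #1ps delay that only exists in simulation, an == comparison against 1'bx that can never do what its author hoped, and a single innocent-looking declaration that infers tens of millions of flip-flops.

```verilog
// Contrived illustration only; all names are made up.
module bad_rtl_example (
  input             clk,
  input             foo,
  input             we,
  input      [19:0] addr,
  input      [31:0] wdata,
  output reg        bar,
  output reg        x_flag,
  output reg [31:0] rdata
);

  // 1) #1ps is a simulation-only (SystemVerilog) delay: synthesis ignores or
  //    rejects it, so whatever "timing" the author thought they were adding
  //    does not exist in the netlist.
  always @(posedge clk)
    bar <= #1ps foo;

  // 2) Comparing against X with == is meaningless: in simulation the result is
  //    X itself (a testbench would need ===), and no real gate can "detect X",
  //    so synthesis cannot implement the intent either.
  always @(posedge clk)
    x_flag <= (foo == 1'bx);

  // 3) The one-line area bomb: with no memory macro, this single declaration
  //    infers roughly 33 million flip-flops plus all the read/write muxing.
  reg [31:0] big_mem [0:1048575];

  always @(posedge clk) begin
    if (we)
      big_mem[addr] <= wdata;
    rdata <= big_mem[addr];
  end

endmodule
```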
All of these are why the design takes a while. We're not even including actual manufacturing, the subsequent debug time, and the fab turnaround times.
People who complain about C and C++ don't realize how good they have it: a relatively sane language is somewhat standard.
Why does cpu design take so long?
Why don't they just write a program to design a cpu?
The answer is the project management triangle.
It takes a long time (in our perception of time) because the companies making the CPUs don't want to spend an insane amount of money developing the chip just to bring it to market faster, and they can't shorten the schedule any other way without sacrificing quality and/or scope (features such as performance, power consumption, instruction sets, iGPU, etc.).
If 90% of the market were willing to pay 3x more for the same product with the same features, provided it arrived on the market in just 2 years instead of 3, then you can bet companies would be throwing even more money at developing the chips on a faster timeline.
Going back to the project management triangle, you can do what you propose, but not without sacrificing scope (features), quality (bugs; validating a computer-developed CPU is difficult), or development budget.
But in the end, since these products are born in a semi-capitalistic/semi-free-market environment, the answer to pretty much any question you can come up with regarding the scope or timeline of the product is that the economics don't work out unless it is done the way they currently do it (i.e., it's the money).
Why don't they just write a program to design a cpu?
because even for Intel's top scientists, reverse-engineering a derelict alien spaceship on the far side of the moon takes a lot of time.
but you didn't hear it from me.
Why don't they just write a program to design a cpu?
What are you guys all talking about with AMD using a program to automatically design a CPU? The synthesis that can be automated is very far from all of CPU design.
The question isn't really that much different from asking why Microsoft doesn't just write a program to design the next version of Windows.
they reverse engineer AMD chips. that's how they write compilers that make AMD worse
I looked at GAs a little when writing a placer based on shape curves + slicing trees a few years back. Perhaps I did not delve deeply enough, but it seemed a little difficult to implement some of its ideas, e.g. what exactly does one swap between two placement solutions, and does it necessarily "evolve" to a global (rather than a local) minimum?

A more interesting option is to do an overall optimization using a GA with microarchitectural simulations, including estimations for power, delays, etc. This is an old idea of mine, but I also found some recent papers describing such possibilities.
I'm astonished at the speed of design. It's amazing to me.
+1

From a simple enthusiast point of view I see processors as some of the most complex and amazing machines mankind is able to make, requiring collaboration from a wide range of disciplines (chemistry, engineering in many of its variants, physics, computing obviously, etc.), and each year that passes the complexity gets even more mind-boggling. I mean, from simple calculators with a few transistors, to a 486 not that long ago with a million transistors doing their thing, we're now into the billions! Breathtaking stuff.
I can easily see how millions of man hours go into these little wonders we buy and use, it's amazing to think all the knowledge that's been put into practice to make these a reality. I love forums for this very reason, the possibility to see people in their respective fields give their take on the matter.
Room-temperature quantum computing.

I can't even begin to think what would happen when a technology that does to the transistor what it did to the vacuum tube gets invented. Not to mention if it has the ability to scale and progress along the years just as much as the transistor did and does...
ALL the damn stages are complex
1) IOS
2) Design (RTL)
3) Functional Verification (UVM, etc.)
4) Frontend (synthesis, STA)
5) Backend (physical design, DFT)
6) Chip finishing
Hell, I work on just synthesis, STA and physical design, and I already see plenty of issues to keep me busy (is your UPF/CPF right? why by Odin's nutsack did you add 500ps of uncertainty AND 10% of timing derates? why do you have dummy 1'b0/1'b1, etc.)
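For illustration, here's one guess (a made-up sketch, names invented, not anyone's actual design) at the sort of RTL that ends up as dummy 1'b0/1'b1 nets and tie cells in the netlist: constants wired in where real control logic was supposed to be.

```verilog
// Made-up illustration only; all names invented.
module tie_off_example (
  input            clk,
  input      [7:0] data_in,
  output reg [7:0] data_out
);

  // "Control" signals nailed to constants instead of being driven by real
  // logic: each one becomes a dummy 1'b0/1'b1 net feeding tie cells, and
  // synthesis constant-folds away whatever was supposed to depend on them.
  wire       enable    = 1'b1;
  wire       test_mode = 1'b0;
  wire [1:0] mode_sel  = 2'b00;

  always @(posedge clk) begin
    if (test_mode)                           // statically false: dead branch
      data_out <= 8'hFF;
    else if (enable)                         // statically true: the mux is gone
      data_out <= data_in ^ {6'b0, mode_sel};
    else
      data_out <= 8'h00;                     // unreachable
  end

endmodule
```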
While it's true that we do have CAD programs to help out (Virtuoso, PrimeTime/ETS, EDI/ICC/AtopTech, DC/RC, etc.), it's a FAAAAAR cry from just typing in "make silicon" and being done with it.
If it were, you can be damned sure management wouldn't pay a bunch of neckbeards $100K+
One more thing - most of these problems are NP-hard (and P ?= NP remains open), which means that all CAD programs have to resort to heuristics. Something as "simple" as placement (especially when you have 100s of macros like in a complex CPU) is already a PITA, let alone CTS, routing, timing closure, power/leakage optimization, etc.
Add the fact that design is generally an over-constrained optimization problem, and it just makes things much more complicated.
PHB : This block must fit in 1um^2, consume only 1nW, run at 10GHz, have 100% test coverage, OCC and implement all customer functionality.
Me : No problem. You want fries with that? :biggrin: