Are there people out there who 100% understand how a computer works?

Page 3 - AnandTech Forums

Revolution 11

Senior member
Jun 2, 2011
952
79
91
I am rather envious of you engineers in this regard, my brother says the same thing about how he understands science and math far better now. Almost makes me want to switch careers out of sheer envy. :)
 

discoeels

Junior Member
Apr 28, 2013
9
0
0
One book many people who ask about this find very beneficial is simply called CODE ( http://www.amazon.co.uk/Code-Langua...1319/ref=sr_1_2?ie=UTF8&qid=1363360821&sr=8-2). It goes over the general encoding used within processors and builds up from the raw physics of the transistor right the way up to putting transistors together to form adders, buses, and such. It is a good book for moving the theory into basic practice without too much starting knowledge.
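As a rough illustration of where that build-up ends, here is a minimal Python sketch (my own, not from the book) of the gate-level ripple-carry adder that CODE works toward, with each one-bit stage wired from AND/XOR/OR the way the gates would be:

```python
def full_adder(a, b, carry_in):
    """One-bit full adder, expressed as the gate logic would be wired."""
    s = a ^ b ^ carry_in                        # sum bit: XOR of all inputs
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

def ripple_add(x, y, bits=8):
    """Add two integers one bit at a time, like chained full adders."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(25, 17))  # 42
```

The point of the exercise is that nothing here is "arithmetic" to the hardware; it is just carries rippling through identical gate stages.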

But to build a machine of today's complexity, and all its software, takes many millions of human hours; it is far more complex than any of us realise.

Edit: Sometimes I wonder how it's possible to explain anything at all.

Sadly, I just left the pastry/bakery business. I felt confident in what I knew and the experience I had, but the learning process was never ending, particularly with bread.
There might be a systematic process to explain what goes into mixing a particular type of dough that, when baked, will yield a certain type of bread, or at least will be recognized as a certain type. But it would be impossible to give an exhaustive explanation.
So...computers=bread :biggrin:
 
Last edited:

fyb3r

Member
Feb 12, 2013
32
0
0
www.anarchyst-it.com
Really it isn't as complicated as it might seem. Granted, I've been doing this my entire life and have had my head buried in a physics or computer-related book every spare minute I can for the past 15 years (I'm 24 currently), but it isn't that hard to fathom.

There are, however, specialists in every field you will go into. While I know how computers work all the way down to the binary level, with the transistors and how the transistors function, I'm also a specialist in network security, enterprise-level administration, and programming. So rather than becoming an engineer and designing chips and circuit boards, I prefer to focus on the software side of things. I stay on the up and up when it comes to new hardware, but I don't really look into schematics.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
fyb3r said:
Really it isn't as complicated as it might seem. Granted, I've been doing this my entire life and have had my head buried in a physics or computer-related book every spare minute I can for the past 15 years (I'm 24 currently), but it isn't that hard to fathom.

There are, however, specialists in every field you will go into. While I know how computers work all the way down to the binary level, with the transistors and how the transistors function, I'm also a specialist in network security, enterprise-level administration, and programming. So rather than becoming an engineer and designing chips and circuit boards, I prefer to focus on the software side of things. I stay on the up and up when it comes to new hardware, but I don't really look into schematics.

There is a difference between knowing how things generally work and knowing how things actually work. I could tell you all about pipelining, out-of-order processing, SIMD, and a whole host of other topics about how a CPU works, yet I couldn't begin to explain to you what exactly happens in an Intel CPU. I couldn't tell you what pipelining stages they use, how their microcode translators work, or exactly how their instruction ordering is done. I can just tell you that it does happen.
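The "generally work" level of pipelining is easy enough to show. Here is a minimal Python sketch (my illustration, using the textbook five-stage pipeline, not any real Intel design) of why overlapping stages speeds things up; hazards and stalls are deliberately ignored:

```python
# Ideal in-order 5-stage pipeline: each instruction enters the pipe one
# cycle after the previous one, so stages overlap across instructions.
STAGES = ["IF", "ID", "EX", "MEM", "WB"]

def pipeline_schedule(n_instructions):
    """Return {cycle: [(instr, stage), ...]} for an ideal pipeline."""
    schedule = {}
    for instr in range(n_instructions):
        for s, stage in enumerate(STAGES):
            cycle = instr + s  # instruction i occupies stage s at cycle i+s
            schedule.setdefault(cycle, []).append((instr, stage))
    return schedule

sched = pipeline_schedule(4)
# n instructions finish in n + stages - 1 cycles instead of n * stages.
print(max(sched) + 1)  # 8 cycles for 4 instructions, vs 20 unpipelined
```

What an actual Intel core does on top of this (deep pipelines, micro-op translation, reordering) is exactly the part the post says outsiders can't reconstruct.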
 

007ELmO

Platinum Member
Dec 29, 2005
2,046
36
101
Bump for people to re-read Max's post and the first reply sentence
 

smakme7757

Golden Member
Nov 20, 2010
1,487
1
81
There would be very few who could say with honesty that they understand every working aspect of a computer system: hardware, software, and the elements that make it all work (electricity, electrons, etc.).

Just learning how caching works in a CPU and why it's important is a massive subject because it involves the "Why". Everything has been created for a reason and everything works together right down to the electrons flowing through the transistors.

BUT

I must admit computing wasn't as magical after I started getting down and learning the theory behind it :p
 

Sheep221

Golden Member
Oct 28, 2012
1,843
27
81
The block diagram of a computer is very simple; a detailed one, not so much :biggrin:.
The problem is that these things are very challenging for one person to design, which is why we have companies that employ thousands of people just to develop and make them, and there really aren't many of those companies. The most difficult components, the CPUs and GPUs, are mass-produced and developed for the desktop market by only three companies worldwide (AMD, Intel, Nvidia).
 

Dude111

Golden Member
Jan 19, 2010
1,495
5
81
Maximilian said:
Are there people out there who 100% understand how a computer works?
I don't understand very much of it but I am quite fascinated by it!
 

Oyeve

Lifer
Oct 18, 1999
22,043
875
126
No, you would go mad if you did. Hmm, perhaps only the craziest people know.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
So is the OP asking if there exists a person who could design, manufacture, assemble, program, and operate a computer entirely of their own design? ... I think that's about as specific as you could get. I think it's easily doable by somebody much smarter than I. Design a basic processor, have a knowledge of lithography, make a PCB to put it on, write an OS and BIOS and assembly code to use with it, and design a peripheral to interact with it. Then, using your basic computer, design the next, more complicated one! :D
 

sandorski

No Lifer
Oct 10, 1999
70,635
6,197
126
Give me a shovel, a bucket, and $30,000. In 30 days I'll have a PC shipped to you.

:sneaky:
 

SecurityTheatre

Senior member
Aug 14, 2011
672
0
0
What an interesting topic.

I have Computer Science, Computer Engineering and Electrical Engineering degrees.

In those three disciplines, I studied most of what you're talking about, but I would NEVER claim to know all of what is in a modern computer.

But for reference, in Electrical engineering, you study electronics and electromagnetics. You learn transistor theory and study transistor response times. The basic chemistry of how a transistor is built and how it is laid onto an IC is something they go over. We did some study in one of the EE/CprE crossover courses on transistor response times and how voltage vs gate size affects performance. I could go dig up the equations, they were pretty dense. We studied quantum tunneling (which has a non-trivial impact on today's tiny gates).

We did several labs on lithography and how it is done and we did some basic experiments on building ICs (they were absolutely huge compared to today's modern process).

In Computer Engineering classes, we focused on taking transistors (both "ideal" and "real") in simulations and building logic circuits: the simple layouts for the basic logic operations (OR, AND, NOT, XOR, etc.) and how they combine to do anything from arithmetic to processor opcode decision trees, and how to combine those units into adders of various lengths and speeds. However, in a modern CPU they go to some extreme measures to make this stuff more efficient, and I wouldn't claim to be well versed in that.
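A small Python sketch of the idea behind those gate-layout exercises (mine, not the coursework itself): one gate type is enough, since NAND is universal and the other basic operations can all be wired from it.

```python
# NAND is functionally complete: NOT, AND, OR, XOR below are each built
# purely from NAND calls, mirroring how they'd be wired from one layout.
def nand(a, b):
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    # the classic four-NAND XOR arrangement
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Exhaustively check every gate against Python's own operators.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
```

This is why a fab only needs to perfect a handful of transistor layouts to build arbitrarily complex logic.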

We designed SDRAM and SRAM chips and ran simulations on them. In the final lab, we built a MIPS-compatible 16-bit microprocessor with 16-bit SDRAM memory using a 4-stage pipeline. In theory, these chips could run some very basic embedded operating systems, and they included 4 hardware interrupt lines. This was all done in simulation, written in Verilog.
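To give a flavour of what such a processor's decode stage does, here is a Python sketch (my illustration, using the standard 32-bit MIPS R-type encoding rather than the 16-bit lab design) of slicing an instruction word into its fields:

```python
# Standard MIPS R-type layout: opcode(6) rs(5) rt(5) rd(5) shamt(5) funct(6)
def decode_rtype(word):
    """Split a 32-bit MIPS R-type instruction word into its fields."""
    return {
        "opcode": (word >> 26) & 0x3F,
        "rs":     (word >> 21) & 0x1F,  # first source register
        "rt":     (word >> 16) & 0x1F,  # second source register
        "rd":     (word >> 11) & 0x1F,  # destination register
        "shamt":  (word >> 6)  & 0x1F,  # shift amount
        "funct":  word & 0x3F,          # selects the ALU operation
    }

# "add $t0, $t1, $t2" assembles to 0x012A4020
fields = decode_rtype(0x012A4020)
print(fields["funct"])  # 0x20 is the "add" function code
```

In hardware this "function" is just wires: each field routes directly to register-file ports and the ALU control, with no computation needed.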

In other courses, we studied modern interrupts and how they work, we studied processor performance optimizations and how to implement caches (set associativity, translation lookaside buffers, etc.), and we implemented several layered cache architectures in simulations.
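The kind of cache simulation described above can be sketched quite compactly. This is a toy Python version of an N-way set-associative cache with LRU replacement (my own sketch; the sizes are arbitrary, not from any coursework):

```python
from collections import OrderedDict

class SetAssociativeCache:
    """Tiny N-way set-associative cache model with LRU replacement."""

    def __init__(self, n_sets=4, ways=2, line_size=16):
        self.n_sets, self.ways, self.line_size = n_sets, ways, line_size
        # One OrderedDict per set; insertion order doubles as LRU order.
        self.sets = [OrderedDict() for _ in range(n_sets)]
        self.hits = self.misses = 0

    def access(self, address):
        line = address // self.line_size       # which cache line
        index = line % self.n_sets             # which set it maps to
        tag = line // self.n_sets              # identifies the line in the set
        s = self.sets[index]
        if tag in s:
            self.hits += 1
            s.move_to_end(tag)                 # refresh LRU position
        else:
            self.misses += 1
            if len(s) >= self.ways:
                s.popitem(last=False)          # evict least recently used way
            s[tag] = True

cache = SetAssociativeCache()
for addr in [0, 4, 8, 64, 0, 128, 64]:
    cache.access(addr)
print(cache.hits, cache.misses)  # 3 4
```

Note the final access to 64 misses: it was evicted when 128 filled the two-way set, which is exactly the conflict behaviour set associativity exists to soften.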

We also put together practical implementations of optimizations like out-of-order execution and lengthened pipelines, along with hardware implementations of speculative execution, SIMD instructions, microcode, and some other things.

Other courses in Computer Engineering involved coding various microprocessors (in MIPS and 68k assembly) to perform basic functions, ranging from simple math to realtime control systems.

In Computer Science, we studied everything from formal logic to algorithms. Things like algorithm time complexity and set theory are a standard part of that curriculum and are important for designing systems, but don't fundamentally help you understand how a computer works. On the other hand, courses heavy in operating system design delved into the modern Linux kernel implementation and practical implementations of a monolithic kernel and a microkernel design. We did some heavy code work in the Linux kernel, implementing an alternative task switcher, implementing some tweaks to how drivers are loaded, etc. However, this area is still where I feel the weakest of the whole stack: the space between the firmware (which I understand pretty well) and the operating system. Another course involved delving into compiler design and compiler functionality.

Of course, I also did the higher level programming work in Java, C++ and did some personal projects in a few other languages. I implemented a medical records system for some researchers on camps in a really slick (for the time) Windows GUI that I think is still in use for managing human trials and drug testing results for that department.

In my aborted attempt to get a Masters in Computer Engineering, I focused on information security, did courses on Radio Frequency communication and dissected 802.11 from software down to firmware down to hardware and then back up to protocol design and security. I did some research on network stack performance and how building multiple TCP streams in parallel and then alternating streams for transmission actually makes modern implementations of TCP window sizing and congestion backoff much more efficient, resilient to data loss, more "fair" from a sharing standpoint, and faster.

But anyway, all I'm saying is that I regard my background as pretty extensive - and I hardly would claim to be able to build a modern PC from scratch. In fact, very far from it, even given nearly unlimited time.

HOWEVER, I think I could build an 80386 processor from scratch. I could probably build a really simple Linux clone. I could build the RAM and a peripheral bus and have it light up some lights. I'm sure there are a few gaps that I would have trouble with, even trying to build a simple 8088-based IBM XT system, though I suspect i could get close.

For example, building a video framebuffer that can switch modes. Exactly how do the sync signals on the wire interact with the electron beam in a CRT monitor? What about graphics modes that use a framebuffer? I *think* I know how it works, but I'm certain there are some areas I don't understand.
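The sync-signal side is at least tractable arithmetic. As a sketch (mine, using the well-known published 640x480@60 VGA totals, not figures from this thread): the beam scans more than the visible area, because the blanking intervals carry the sync pulses, so the pixel clock is total pixels times total lines times refresh rate.

```python
# Standard VGA 640x480@60 timing: the visible area plus blanking.
visible_w, visible_h = 640, 480
total_w, total_h = 800, 525    # totals include horizontal/vertical blanking
refresh_hz = 60

# Dot clock: every pixel slot on every line, visible or not, each frame.
pixel_clock = total_w * total_h * refresh_hz
# Horizontal sync frequency: how many lines the beam sweeps per second.
line_rate = total_h * refresh_hz

print(pixel_clock)  # 25200000, close to the nominal 25.175 MHz dot clock
print(line_rate)    # 31500 lines/s, i.e. the familiar ~31.5 kHz hsync
```

The hard part the post alludes to isn't this arithmetic but generating those signals in silicon and redrawing the framebuffer in lockstep with them.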

Also, a modem... I know the theory behind standard QAM encoding (a phase-plus-amplitude RF modulation) and I've done work analyzing the signals at various levels, but I'm still fairly sure I don't know how to implement a complex integrated circuit to actually generate those QAM signals. The PLLs involved in picking up an analog carrier signal and locking onto it are something I've designed (a long time ago), but how to use that to pick up phase variances on the wire, and implement that in silicon... hmmm, not quite sure. I know I could build an RS-232 serial port with a FIFO buffer; those are pretty simple, but it would be slow as hell at first. The old 300 bps ports aren't too complicated, but jacking the speed up to a modern 230k serial port is pretty difficult (even if it is OLD technology by modern standards).
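Conceptually, the encoder side of QAM is simple to state even though the silicon is not. Here is a Python sketch of the mapping step only (my illustration; the bit-to-level assignment is a simple non-Gray mapping I chose for clarity, and real modems add pulse shaping, carrier recovery PLLs, and error coding on top):

```python
def qam16_symbol(bits4):
    """Map 4 bits to one of 16 points on a 4x4 I/Q grid.

    The real part (I) and imaginary part (Q) together encode the
    amplitude and phase of the transmitted carrier for that symbol.
    """
    levels = [-3, -1, 1, 3]            # amplitude levels per axis
    i = levels[(bits4 >> 2) & 0b11]    # top two bits pick the I level
    q = levels[bits4 & 0b11]           # bottom two bits pick the Q level
    return complex(i, q)

def modulate(data: bytes):
    """Turn a byte stream into 16-QAM symbols, two symbols per byte."""
    symbols = []
    for byte in data:
        symbols.append(qam16_symbol(byte >> 4))
        symbols.append(qam16_symbol(byte & 0x0F))
    return symbols

syms = modulate(b"hi")
print(len(syms))  # 4 symbols for 2 bytes
```

Everything hard that the post mentions, locking onto the carrier and recovering those phase/amplitude points from a noisy wire, happens on the receive side, which this sketch doesn't touch.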

Also, disk drives. I didn't do enough applied physics focused on magnetic storage media. We talked about giant magnetoresistive effects and how they *might* be used to store data on a platter, but the chemistry that goes into the drive platter coatings, the highly sensitive stepper motors that control the heads of the drives, the amplification technology that translates the subtle signals picked up by the drive head... this is all complicated stuff.

I haven't even gotten into the process of laying down the 7-to-15-layer PCB used in modern motherboards, and all the impedance matching that goes into the trace layouts on that kind of PCB.

And then there is the voltage regulation and the transformers. I've certainly built an AC/DC transformer with multiple voltage outputs, but it was really inefficient. Modern switching power supplies are pretty cool engineering in themselves. Beyond that, the voltage regulation on a motherboard is also something you build in a decent Electrical Engineering lab, but the precision of the digital voltage switching on a modern motherboard is pretty damn cool, and I'm not quite sure exactly how it works.

Wow... I think I went on for a while. I'll leave it at that.
 

icanhascpu2

Senior member
Jun 18, 2009
228
0
0
Knowing how it works isn't the hard part.
Knowing how to build one is the hard part.

No one alive has all that information in their head. You would need to go very far back to basics like smelting metals, and have a deep understanding of chemistry, far FAR before you even get to making the first proto-transistor, let alone anything resembling a CPU made in the last 50 years.

Many people hold the knowledge of how to work certain machines that make parts for a computer. But how far back will you take this? This question is way too vague. Does that person know how to build the machines in a fabrication plant? Do they even know how to build the tools to build those machines? The answer is no. They need outside knowledge at some point.

SecurityTheatre said:
HOWEVER, I think I could build an 80386 processor from scratch.


Define "from scratch".

This is like a cook saying they can bake a cake from 'scratch' when in reality what they mean is they are going to use the oven they have no idea how to build, the piping/electricity they have no idea how to get from source to stove, the milk they have no idea how to get from cow to fridge (the store doesn't count), and the eggs from chicken to bowl. Let's not even get into the cake mix itself and the agriculture involved there.

So no. You couldn't build one 'from scratch'. You can put the ready-made pieces together possibly though, and that's no small feat.
 
Last edited:

icanhascpu2

Senior member
Jun 18, 2009
228
0
0
My point is that depending on how far back you go, one person would need to know far, far more than EE to build a chip. If they are simply using ready-made tools to piece it together, then it becomes orders of magnitude easier (but still exceptionally skilled work) to build even a simple imitation.

The question is too open-ended.

By the way, that very video is one of the virtual CPU vids that got me into Minecraft. Still play to this day :)
 

HeXen

Diamond Member
Dec 13, 2009
7,832
37
91
If I knew 100% of computers, I would still have 99.9% useless knowledge regardless of what specific type of job in the industry I took it to.
 

Ayah

Platinum Member
Jan 1, 2006
2,512
1
81
I'd imagine "from scratch" as I drop you on an empty uninhabited planet somewhere with just your clothes and tell you to go forth and build a computer. (you'd need to know how to find and extract the materials yourself, no surveys/charts/maps/equipment, etc.)
 

SecurityTheatre

Senior member
Aug 14, 2011
672
0
0
I'd imagine "from scratch" as I drop you on an empty uninhabited planet somewhere with just your clothes and tell you to go forth and build a computer. (you'd need to know how to find and extract the materials yourself, no surveys/charts/maps/equipment, etc.)

I guess I was taking for granted "using other technology that exists today".

I've done enough metalwork to know I couldn't make a semi-modern saw or an axe. Even a simple knife would be difficult and all of that assumes I had a supply of refined metal to start with.

Technology is way amazing.
 
Last edited:

Nec_V20

Senior member
May 7, 2013
404
0
0
The very simple answer is no.

We techies go where our inclinations take us, and we know each other.

The whole area of computing is so vast that even if you are a big fish in a big pond (not just a long-time poster on a forum, i.e. a big fish in a little pond), the most you could ever envisage knowing is tiny. Spread your arms out as wide as you can, then hold your thumb and forefinger about a quarter of a centimetre apart: that gap, against your full arm span, is the maximum anyone can know of the whole realm of computers.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Fantastic post, SecurityTheatre. I wish I were mathematically inclined and knowledgeable enough to piece together a 386. Perhaps in time (although likely not) :cool:
 

nightspydk

Senior member
Sep 7, 2012
339
19
81
You are forgetting, my friends, that like the OP said, we are talking mechanics, not the hardware stuff, unfortunately. That part is simple for even a noob programmer.
The hardware is the intricate part of a system, where the function of every part depends on every other part. You see that every day with all kinds of stuff you wouldn't otherwise notice, and where is that? :p

..and don't mind me..
 
Last edited: