Has there really been such great development in the last 20 years?


lukatmyshu

Senior member
Aug 22, 2001
483
1
0
Of course compiler optimizations are still practiced to this day. BUT if I am multiplying a number by 2, I'll write it as multiplying by 2. It's a lot clearer for the next guy to understand, and for me to read the next time I'm looking at it asking "what does this do again?" ... and of course the compiler should do the bit-shifting if appropriate (on some architectures it's not that much faster anyway). Still, things like instruction scheduling require programmers to have some knowledge of what they're doing. I agree with Shal's statement that programs were 'optimized' by using type-unsafe languages, and programmers were forced to forego bounds checks in order to save execution time -- that's what's biting us in the a$$ today. Code Red, Nimda ... 50% of those viruses could be eliminated with bounds checking.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Actually, if you want to multiply by two, the absolute quickest method is an addition: X+X is fastest, X shl 1 not-so-good, X * 2 awful. This is particularly true on the Pentium 4, since it has very fast ADD logic and abysmally slow multiplication.

But anyhow, code optimization comes in three stages:
(1) Think of a better algorithm.
(2) Use compiler optimization.
(3) Hand-optimize.

Obviously (1) is where the human brain is needed. (2) is what everybody does, albeit often without understanding which optimizations help which kind of code. (3) still pays off, because optimizing compilers don't see the whole picture of what the algorithm is for.

Most of the time, at least in desktop application development, (1) and (3) are completely neglected. Mediocre code is thrown at good compilers, and if it works, out the door it goes. Profiling for performance bottlenecks, let alone doing something about them? Rarely. Instead,

(4) Make customers buy faster hardware.

And only if that still doesn't make them shut up, then finally go back to (1).
 

trak0rr0kart

Member
May 1, 2003
70
0
0
Originally posted by: HarryAngel
I was discussing with a friend of mine the last 10-15 years of computing. The computing industry has moved forward, but has there really been such great development in the last 20 years? I mean, compare 10 years ago with today... the stuff you did with your PC then, with software like CorelDRAW, versus what you can do today; compare the speed. I don't see a great improvement in computing... sure, everything is in gigahertz and gigabytes, but the software programs are also much bigger. Does your production work get done faster today than, say, 10 years ago? Maybe. But it's not as fantastic as one thinks. I say that the groundbreaking development occurred 10-20 years ago.



If you are just talking about the way you interface with the computer, its purpose, etc., then there haven't been many changes except for the advent of the net. That changed everything about how we use the computer... that was a huge development.

If you talk about the refinements we have made to the computer.. then the list is endless to what we have done to improve the machines.

Just as an example, for graphic design and rendering and such... I would say hell yes there have been improvements, and in anything else that is speed related, like calculating ballistic trajectories. Could you imagine doing that with those old... very old computers so many years ago? No... so things of that nature are where I see the improvements.

Your statement is very open-ended. You can't say that you have read, seen, or even discussed every single improvement out there. If you think a world of six billion people, focused for the last 10 years or so on computing and utilizing it in every way that could benefit us, has produced no improvement, then you need to open up a little bit and read a lot more.

Bill Gates, considered an idiot by a lot of people and a computing revolutionary by a lot of others, is even said to have claimed: "We will never need more than 640K." If a man like that, a computing revolutionary for his time when other dudes were out doing non-computer stuff, can say that and be soooooo wrong, then an awful lot must have improved to make it so.

Just think about it a little more before you make a statement like that. You definitely got attention (I'm not saying that's why you started this thread).
 

tdowning

Member
May 29, 2003
69
0
0
Originally posted by: Peter
Just go back to what you'd get for the same money in 1993, or 1983. The $500 that buys you a perfectly okay PC today wouldn't even have bought you a decent 486 in 1993, and in 1983 you'd have been hard pressed to even find a C64 or Atari 800XL with a floppy drive for that kind of money.
Then go use the software of the respective age.
Just for comparison: 20 MByte (!) HDDs were $1000 in 1986, and 300 KB/s was "fast". Computers ran at 4 to 8 MHz, and 1 MByte of RAM was plenty.

What absolutely didn't make any progress is the form factor of the standard PC. We still get clunky big tin boxes - with essentially nothing anymore inside, but still the same extremely unelegant brick.

Hey, have you seen some of the small form factor PCs from Compaq? I think SFF never really took off for consumers because most of them (or at least the most vocal ones) want more upgradability than you can get from an SFF. The ones I've seen have 3 PCI slots, or 2 PCI and a half-height AGP, and only have room for three drives (floppy, HDD, CD-whatever).

I really think that there really have been some revolutionary paradigm shifts in the world of computers.

Take Plug and Play. Admittedly it first appeared in a usable form in the IBM PS/2 Model 50Z (the oldest machine I've seen with Micro Channel architecture... never mind, he's talking about 20 years). IBM dropped the ball trying to keep that a proprietary standard, and the idea really took off with PCI. And because PCI is so easy to work with (from both a design and an end-user standpoint), why reinvent the wheel? PCI has been made faster (faster clock and wider bus) and hot-swappable (though only in servers), and it's still largely backward compatible.

There are also FireWire and USB 1.1 and 2.0, not to mention that so many people were so sure Moore's law would run smack into the laws of physics at so many different points along the line, usually shortly before the next major breakthrough.

"The computer is the network" is also a major paradigm shift. Many people opt to spend less on the hardware and more on a broadband connection, because the Internet is the killer app for home users (the combination of WWW and e-mail mostly, although voice chat and the like are starting to get up there too). What's on your desk (or lap, as the case may be) is not nearly as important as what it allows you to do.

Somebody help me out: what did the average laptop weigh 20 years ago? Did they even exist? I'm really not sure.

DVDs are more revolutionary than evolutionary compared to CDs. New processes were needed to create laser diodes with a shorter wavelength. The move from red to (can't remember off the top of my head) was a major breakthrough, because a shorter wavelength can "see" smaller features; so was the partially reflective layer that allows for dual-layer DVDs. Now I've heard rumblings that another reduction in laser wavelength will allow putting HDTV-quality video onto CD/DVD-sized media.

And let me close with this: I personally carry more computing horsepower in my pocket (now a Sony Clie) than it took to automate the US's air defense warning system, a series of computers that monitored radar data looking for missiles and other such stuff. Read the book "Project Whirlwind" for more on that.
 

damonpip

Senior member
Mar 11, 2003
635
0
0
My Opinion:

1) The WWW has been the biggest development in the last 20 years and one of the biggest in the last century.

2) Computers have not physically changed a whole lot, except for the speed, but more speed does lead to better software. The Internet certainly wouldn't be where it is today without fast computers. I think some of the largest advances have been made in graphics hardware and software. Who would have guessed that we could make totally realistic scenes for a movie on a computer (The Matrix Reloaded)?

3) Computers are going to change drastically fairly soon, as silicon is reaching its limit. I'd guess that artificial intelligence and quantum computing may become a reality within the next 20 years.
 

jodhas

Senior member
Aug 5, 2001
834
0
0
Originally posted by: pm
Look folks, HarryAngel is talking about the physics involved in building fundamental parts of computers.
A process shift from, say, 0.18um to 0.13um isn't as simple as telling every person working out in the fab, "Ok, guys, this week we are going to make everything smaller than we did last week." A process shift is a hideously complex affair involving literally hundreds of companies and thousands of engineers and technicians. At any given point, most of the technologies required for manufacturing chips two generations ahead haven't been invented yet.

I'm a cleanroom engineer. I won't specifically name our clients, but two of our biggest clients boast state-of-the-art fabs in Austin, Texas and Eugene, Oregon.

Jumping from 0.18um to 0.13um is not easy. Nor is jumping from 200mm wafers to 300mm wafers. In both cases, however, it is not a "hideously complex affair." It does not require hundreds of companies or thousands of engineers.

I can confidently say this for the following reasons.
Case 1
In both cases (0.18um to 0.13um and 200mm to 300mm), many fabs around the world are merely "retrofitting" their current facilities to accommodate the new technology. Also, as we jump from 0.18um to 0.13um, human presence inside a cleanroom must be minimized. A typical Class 1 cleanroom in a fab has a raised-floor system with positive/negative airflow designed to keep the "large" particles out. For the "small" particles, the ceilings are covered with fan filter units fitted with ULPA filters of 99.999999999% efficiency and up, running 365 days a year. That is your average Class 1 cleanroom (very simplified). However, as we jump from 0.18um to 0.13um, we need to minimize human presence further. Even with strict cleanroom gowning protocols and state-of-the-art air showers, humans are the single biggest source of contamination inside a cleanroom. To solve this problem, we minimize human presence by installing what is called an AMHS (Automated Material Handling System). There are only two reliable companies with the capability to build these robotic systems, Muratec and Daifuku; I think Daifuku made Intel's preferred-supplier list. And even this process is not too difficult -- I've seen it first hand at IBM's New York fab as Muratec installed these systems during the retrofit to 300mm technology. Anyhow, "retrofitting" here really means "upgrading." It's not as impossible and difficult as it seems.

Case 2
Our major client in Austin, Texas will be "retrofitting" as well, to accommodate the 0.13um technology this fall. It's a $200 million project. Respectable cleanroom companies (mine included) will bid on it. I assure you, it does not take thousands of engineers or hundreds of companies. :) We have already committed a couple of engineers and salesmen to do cost estimation for this project. A $200 million cleanroom/fab is a good-sized project, but it is not something hundreds of companies can say "let me have a piece of that pie!" about. Any respectable cleanroom contractor should be able to finish something like this without too much trouble.

Case 3
If retrofitting is not an option and a company wants to build a fab from scratch, it still does not take "thousands of engineers" and "hundreds of companies". I know this because MW Zander (a company based in Germany) built our client's fab in Eugene, Oregon for 1.4 billion dollars in 1997, and completed the job with its in-house engineers and several subcontractors and suppliers.

Case 4
As for the actual development of the 0.13um technology, it's much the same procedure as 0.18um, and contrary to the above quotation, yes, the big idea behind 0.13um technology is simply making things "smaller". I've seen first hand that the tools used (photo, etching, CMP, etc.) for the 0.13um process look very, very similar to the ones used at 0.18um. Tool installation and operation were close to identical to the 0.18um process. I don't know the finer details of these tools, but if the installation, operation, maintenance, and physical attributes are almost identical, would the science behind them be all that different? Probably not. I think it's safe to draw an analogy to upgrading your Duron to an Athlon, doubling your RAM to 512MB, and adding a faster hard drive (provided your mobo can take the Athlon) to describe the procedure behind the transition from 0.18um to 0.13um.
My point is that the move from 0.18um to 0.13um can be done rather easily, and most definitely it is not a "hideously complex affair involving thousands of engineers nor hundreds of companies." It's more of an upgrade, based on the same technology.

My point is that anyone who says that there haven't been "great developments" in silicon manufacturing and design in the last 20 years is standing too far away from the action to see how difficult the challenges are and how impressive the solutions have been. And it's not as if silicon remains the basis for most semiconductor electronics because there have been no innovations or developments in finding other materials. The fact is that silicon is the "best" material we have available for manufacturing once economics is taken into consideration.

I haven't been in this area for 20 years, only about 10, so I am humbled as I say what I am about to say. Fundamentally the technology, the idea, the theory is practically the same as it was 10 years ago. There have been tremendous improvements in the process of chip-making and in the environment (the fab) in which chips are produced, but that is only improvement, not fundamental change, so I would have to agree with HarryAngel.

I hope I didn't come across as arrogant; I've had trouble in the past for sounding arrogant. If I've offended anybody (especially the person I quoted), I apologize in advance. But anyhow, as a person who is a little closer to the "field" of chipmaking than the average person, I just wanted to agree with HarryAngel and put in my 2 cents. :)
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: jodhas
*snip*
Great post! Right on the money! :)