Has there really been such great development in the last 20 years?


gururu

Platinum Member
Jul 16, 2002
2,402
0
0
By some standards, we won't see a 'great' development until the use of electrons is vanquished.
 

borealiss

Senior member
Jun 23, 2000
913
0
0
by your definition, it seems that even if something like optical interconnects were developed, that still wouldn't be an advancement... there hasn't been a "major" breakthrough in component manufacturing because tried and true methods work, and that's what material scientists have to work with.

if you want to talk about new processes, you should take a look at how the manufacturing of the substrate on hard disks has changed over the last decade. densities for these things have skyrocketed. every year they manage to fit more bits into a square inch than the previous year. but the fundamental manufacturing process isn't going to have a major revolution if the original one works to begin with. if it's not broken, why fix it? even if you're talking about leaps and bounds in manufacturing processes, i still think you're understating the ingenuity that goes into the semiconductor manufacturing process. if a gallium arsenide mixture works, why change it? if a 9-metal-layer process works, why fix it? what's next on the list? x-ray lithography, i believe, for beyond 90 nm?

you're asking for a major revolution in manufacturing when one isn't really needed. necessity is the mother of invention, and when the time comes that it is necessary to move to something beyond semiconductors, we'll make that transition. as of now, it's not necessary, but that doesn't mean r&d efforts aren't being put forth for the next generation of substrate materials for constructing ICs.

your opinion of what constitutes a "great development" is somewhat boggling as well. certainly the improvements in computers are not limited to the manufacturing processes. certain architectural enhancements are extremely noteworthy, certainly as much as a manufacturing breakthrough. where would modern computers be today if superscalar architecture hadn't been developed? how about improvements in out-of-order execution? the growing number of functional units on cpus? the advanced schedulers required by cpus in the face of ever-lengthening pipelines? simple improvements in algorithms for branch prediction? what about the fact that most cpus are just using the x86 isa as a wrapper? imho, the fact that x86 is just a common front-end is an extremely important achievement, as decoders for modern-day cpus get more modular for the IA-32 isa. if transmeta's technology had taken off a bit more, i think we would've seen a paradigm shift from cpus designed for a particular instruction set to those that adapt to existing instruction sets as a wrapper. but my digression aside, all these add up. i think you're limiting your scope to a very narrow field and not doing your original question justice by focusing only on manufacturing.
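
to make the branch prediction point concrete, here's a minimal sketch of the classic two-bit saturating counter that most textbook predictors build on. it's purely illustrative: the names and the toy outcome pattern are made up, not taken from any real cpu.

/* Minimal sketch of a 2-bit saturating counter branch predictor.
   Purely illustrative: real predictors index large tables by branch
   address and history bits, but the core state machine looks like this. */
#include <stdio.h>
#include <stdbool.h>

typedef enum {
    STRONG_NOT_TAKEN = 0,
    WEAK_NOT_TAKEN   = 1,
    WEAK_TAKEN       = 2,
    STRONG_TAKEN     = 3
} counter_t;

/* Predict "taken" whenever the counter is in one of the two taken states. */
static bool predict(counter_t c) { return c >= WEAK_TAKEN; }

/* Move one step toward the actual outcome, saturating at both ends. */
static counter_t update(counter_t c, bool taken)
{
    if (taken && c < STRONG_TAKEN)      return c + 1;
    if (!taken && c > STRONG_NOT_TAKEN) return c - 1;
    return c;
}

int main(void)
{
    /* A loop-like branch: taken three times, then not taken, repeated. */
    bool outcome[] = { true, true, true, false, true, true, true, false };
    int n = (int)(sizeof outcome / sizeof outcome[0]);
    counter_t c = WEAK_NOT_TAKEN;
    int hits = 0;

    for (int i = 0; i < n; i++) {
        if (predict(c) == outcome[i]) hits++;
        c = update(c, outcome[i]);
    }
    printf("predicted %d of %d branches correctly\n", hits, n);
    return 0;
}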
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
That's like saying there have been no major improvements in industry since 5000 BC, since we are still primarily utilising chemical energy.
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: Shalmanese
That's like saying there have been no major improvements in industry since 5000 BC, since we are still primarily utilising chemical energy.
No it's not, that would be your way of looking at it ;)
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: borealiss
by your definition, it seems that even if something like optical interconnects were developed, that still wouldn't be an advancement... there hasn't been a "major" breakthrough in component manufacturing because tried and true methods work, and that's what material scientists have to work with.

if you want to talk about new processes, you should take a look at how the manufacturing of the substrate on hard disks has changed over the last decade. densities for these things have skyrocketed. every year they manage to fit more bits into a square inch than the previous year. but the fundamental manufacturing process isn't going to have a major revolution if the original one works to begin with. if it's not broken, why fix it? even if you're talking about leaps and bounds in manufacturing processes, i still think you're understating the ingenuity that goes into the semiconductor manufacturing process. if a gallium arsenide mixture works, why change it? if a 9-metal-layer process works, why fix it? what's next on the list? x-ray lithography, i believe, for beyond 90 nm?

you're asking for a major revolution in manufacturing when one isn't really needed. necessity is the mother of invention, and when the time comes that it is necessary to move to something beyond semiconductors, we'll make that transition. as of now, it's not necessary, but that doesn't mean r&d efforts aren't being put forth for the next generation of substrate materials for constructing ICs.

your opinion of what constitutes a "great development" is somewhat boggling as well. certainly the improvements in computers are not limited to the manufacturing processes. certain architectural enhancements are extremely noteworthy, certainly as much as a manufacturing breakthrough. where would modern computers be today if superscalar architecture hadn't been developed? how about improvements in out-of-order execution? the growing number of functional units on cpus? the advanced schedulers required by cpus in the face of ever-lengthening pipelines? simple improvements in algorithms for branch prediction? what about the fact that most cpus are just using the x86 isa as a wrapper? imho, the fact that x86 is just a common front-end is an extremely important achievement, as decoders for modern-day cpus get more modular for the IA-32 isa. if transmeta's technology had taken off a bit more, i think we would've seen a paradigm shift from cpus designed for a particular instruction set to those that adapt to existing instruction sets as a wrapper. but my digression aside, all these add up. i think you're limiting your scope to a very narrow field and not doing your original question justice by focusing only on manufacturing.
Change and the need for change don't always follow each other. By your reasoning, one could ask whether any major change was ever brought about by need. I mean, like you say, "if it's not broken, why fix it?" ;)

Great breakthroughs and innovations haven't always come about because of urgent necessity. If you look back at scientific advances, for example, you can see that many great innovations/breakthroughs were not received as 'useful' at the time. More often than not, the discussions are about how to implement the new findings, where they can be useful, etc.


 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: borealiss
by your definition, it seems that even if something like optical interconnects were developed, that still wouldn't be an advancement... there hasn't been a "major" breakthrough in component manufacturing because tried and true methods work, and that's what material scientists have to work with.

That is my point. Technology hasn't really changed in the last 20 years. Seems like we can agree on that. Now, whether there is a need for change or not, that's arguable.
 

borealiss

Senior member
Jun 23, 2000
913
0
0
see, that's where i disagree with you. something like optical interconnects would be a complete revolution for semiconductors, not to mention they would be fabricated completely differently from something with copper or aluminum interconnects. the fact that you wouldn't consider them a revolution just proves my point about the limited scope of your question. try asking your question to a materials scientist involved with the fabrication process. i'm sure you'd get a much different answer. fabrication processes haven't changed much in the traditional sense because they haven't had to. if standard cmos works, use it. why move to pmos-only or nmos-only when there's a tried and true method? limit your scope of "revolutionary" to just the superficial level of what you see, and that's the type of answer you're going to get. anyways...
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
I agree that if optical interconnects were developed for semiconductors, THAT would be a major breakthrough!

But that hasn't happened. So I think we can agree on that.
 

Coldfusion

Golden Member
Dec 22, 1999
1,014
0
76
I think many of the technological advances have been masked by poor programming.

By far the biggest advances have been made on the materials side. While making something smaller, faster, cheaper may not seem like a huge breakthrough, it truly is. CPUs today are 100x faster than those of 10 years ago, while being smaller and consuming less power. Hard drives are also smaller and faster.

Programs have become bloated. With all the extra horsepower, time has not been spent optimizing code; time has only been spent cranking out more and more of it. Once the technological barriers to ramping up clock speed and disk capacity are hit, that optimization will finally happen: clock speeds and disk capacity will remain the same, yet performance will increase nonetheless.
 

Epimetreus

Member
Apr 20, 2003
72
0
0
I agree that advancements have been made on the fringe: optical cables, CDs instead of floppies, etc., like I said before. It's the core things that are so very similar to their predecessors in terms of underlying concept; and thus the peripherals must emulate them in order to be compatible.
As for Transmeta, they took the fundamental way in which we utilize the power these processors have and changed its application to something more universal, simpler, and faster. That is some of the greatest innovation I've seen in a long time.
And one final thing: for any of you wondering why you should innovate when things "work as they are now", I have only this to say:
What would Tesla say if he heard you say that?
 

Epimetreus

Member
Apr 20, 2003
72
0
0
Originally posted by: Coldfusion
I think many of the technological advances have been masked by poor programming.

By far the biggest advances have been made on the materials side. While making something smaller, faster, cheaper may not seem like a huge breakthrough, it truly is. CPUs today are 100x faster than those of 10 years ago, while being smaller and consuming less power. Hard drives are also smaller and faster.

Programs have become bloated. With all the extra horsepower, time has not been spent optimizing code; time has only been spent cranking out more and more of it. Once the technological barriers to ramping up clock speed and disk capacity are hit, that optimization will finally happen: clock speeds and disk capacity will remain the same, yet performance will increase nonetheless.

My GOD, this is SO true!!
I remember back a few years ago, when things were a great deal more complex; I was still under my technophobic parents' control and thus could not participate, but I would discuss PC-related things with friends who could. It was amazing how ingenious programmers were when they had to optimize code out of necessity. I rather wish hardware had not had the giant explosion of development we've seen lately. I am stolidly of the belief that we would have better-performing and more stable applications if software had become as integrated and widely used as it is now without the overpowered hardware to back it up, since programmers would still be forced to optimize their code far more stringently to make it work even half acceptably on any machine at all.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.

Far from being more stable and integrated, they would be bug ridden and half finished because programmers were forced to do everything by hand.
 

Epimetreus

Member
Apr 20, 2003
72
0
0
Shal, you obviously either do not know what you are talking about or have dealt with incompetent programmers.
I don't know which, but if I were a programmer myself I'd be liable to skin you alive for such an insulting comment.
 

Coldfusion

Golden Member
Dec 22, 1999
1,014
0
76
Originally posted by: Shalmanese
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.


Well, part of the problem is programmers have NO IDEA what is going on at the machine level when they call such and such a command. I firmly believe assembly should be part of every college curriculum, as it gives a greater understanding of what is going on underneath.

Until then, you'll continue to see nested for loops, all database entries stored in one giant table, and unmodularized code. I can't believe the number of "professional" programmers I've met who have no idea how to program.

Garbage collection is not wastage; it's actually doing something useful. Wastage is calling cmul instead of sll when multiplying by two, when sll is 32x more efficient at doing the same job. Anybody could be a programmer given unlimited CPU cycles and system resources.
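
For what it's worth, here is a tiny C sketch of the shift-vs-multiply point. The function names are invented for illustration, and in fairness an optimizing compiler will typically do this strength reduction for you.

/* Strength reduction by hand versus leaving it to the compiler.
   A left shift is at least as cheap as a multiply on most machines,
   but an optimizing compiler emits the cheap form for both anyway. */
#include <stdint.h>

uint32_t double_with_multiply(uint32_t x)
{
    return x * 2;   /* typically lowered to a shift or an add */
}

uint32_t double_with_shift(uint32_t x)
{
    return x << 1;  /* explicit shift left logical, same result */
}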

Far from being more stable and integrated, they would be bug ridden and half finished because programmers were forced to do everything by hand.

And this is different from now, how? Programs are bug-ridden and half finished today. The only difference is you have people that have NO idea what they're doing writing programs in VB and other drag and drop programming environments. Such an advance -- programs that write the bad code for people that don't even know enough to write bad code.
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: Shalmanese
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.

Far from being more stable and integrated, they would be bug ridden and half finished because programmers were forced to do everything by hand.
:rolleyes:
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: Coldfusion
Originally posted by: Shalmanese
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.


Well, part of the problem is programmers have NO IDEA what is going on at the machine level when they call such and such a command. I firmly believe assembly should be part of every college curriculum, as it gives a greater understanding of what is going on underneath.

Until then, you'll continue to see nested for loops, all database entries stored in one giant table, and unmodularized code. I can't believe the number of "professional" programmers I've met who have no idea how to program.

Garbage collection is not wastage; it's actually doing something useful. Wastage is calling cmul instead of sll when multiplying by two, when sll is 32x more efficient at doing the same job. Anybody could be a programmer given unlimited CPU cycles and system resources.

Far from being more stable and integrated, they would be bug ridden and half finished because programmers were forced to do everything by hand.

And this is different from now, how? Programs are bug-ridden and half finished today. The only difference is you have people that have NO idea what they're doing writing programs in VB and other drag and drop programming environments. Such an advance -- programs that write the bad code for people that don't even know enough to write bad code.
I was just discussing that with a programmer friend of mine, and I made a post about this not so long ago in the OS forum... you are so right!
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Hmm... maybe we have a misunderstanding here. I'm not saying that stuff like using quicksort instead of bubble sort is "bad". Of course that is good; ALGORITHMS should be optimised. What I don't like is when you have a task X and only Y CPU cycles to spend on it, so you need to spend endless hours tweaking the code so that it just fits within the constraints. When you can't AFFORD garbage collection or dynamic array allocation because you don't have the CPU cycles. That is BAD.

As for sll vs cmul, that is a compiler optimisation, not a code one. Ideally, coders shouldn't even need to KNOW about sll vs cmul. Things like storing all database entries in a single table aren't "bad" because they waste processor cycles; they're bad because it's a bad design philosophy. Similarly with modularity.

Yes, ANYBODY could be a programmer with unlimited CPU cycles. Isn't this a good thing? Or are you afraid of losing your job? The whole point of all these extra CPU cycles is to make the PROGRAMMER's job easier. The fact that it bloats the code is irrelevant as long as the code runs at an acceptable speed.

As I said before, how many security exploits are from buffer overruns? Dynamic array allocation and bounds checking make buffer overruns impossible at a rather trivial performance cost. I would rather have a program which is guaranteed not to overflow than one which has just been checked by the programmer not to overflow.
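
To make that concrete, here is a minimal C sketch contrasting an unchecked copy with a bounds-checked one. The function names and buffer size are invented for illustration, and the checked version only approximates what a bounds-checked language does for you on every access.

/* Unchecked versus bounds-checked copy into a fixed-size buffer.
   The unchecked copy writes past the end of dst if src is too long
   (the classic buffer overrun); the checked copy refuses instead. */
#include <stdio.h>
#include <string.h>
#include <stdbool.h>

#define BUF_SIZE 16

/* Classic C idiom: no check, undefined behaviour if strlen(src) >= BUF_SIZE. */
void copy_unchecked(char dst[BUF_SIZE], const char *src)
{
    strcpy(dst, src);
}

/* Roughly what a bounds-checked runtime does: verify first, then copy. */
bool copy_checked(char dst[BUF_SIZE], const char *src)
{
    if (strlen(src) >= BUF_SIZE)
        return false;          /* reject instead of corrupting memory */
    strcpy(dst, src);
    return true;
}

int main(void)
{
    char buf[BUF_SIZE];
    const char *input = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";  /* 32 chars */

    if (!copy_checked(buf, input))
        puts("checked copy: input rejected, no overflow");
    /* copy_unchecked(buf, input); would smash the stack right here */
    return 0;
}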
 

anomaly

Senior member
Nov 14, 2002
401
0
0
Yes there have been revolutionary steps in both computer and car worlds.
For computers:
The WWW
Liquid Crystal Displays
.13um processing (yes this is revolutionary rather than evolutionary, it took 4 years of inventing other technology to make this possible)
Chip arrangement (PGA, FPGA, getting rid of riser cards)
etc
Cars:
Use of computers
Fuel injection
ABS
Airbags
Stability Control Systems. As a racer at the regional level in both Formula Vee and Vintage SCCA, and as the driver of a Street Modified E46 M3 in Auto-X, I can say SCS is great, though it is much less intrusive in the M3 than in other cars (an Audi A8, for example). For the general public, who want to be able to steer and brake in an emergency situation without worrying about sliding, it has saved a lot of lives.
By your mentality we SHOULD still be sitting in the 1900s. Most people could make do with what was available at the time.
 

diehlr

Member
Dec 29, 2000
186
0
0
The main thing that has advanced in computing in the past 10-20 years is accessibility, and that's about it. Aside from games, most of the horsepower in today's CPUs goes toward making computers more user-friendly and easier on the eyes. If this weren't a factor, most office users would still be happy using a 386DX running DOS or maybe Windows 3.1.
 

TJN23

Golden Member
May 4, 2002
1,670
0
0
Originally posted by: Epimetreus
but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.


the von neumann architecture has been around for a long time and doesn't seem likely to change... which i believe states that you have memory, secondary storage, a processor, and I/O...
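
to spell out what that amounts to, here's a toy fetch-decode-execute loop in C. the four-instruction machine is invented purely for illustration; the point is just that program and data live in the same memory and one processor loop walks through them.

/* Toy von Neumann machine: one memory array holds both the program and
   its data, and a single loop fetches, decodes and executes instructions.
   The four-instruction ISA is invented purely for illustration. */
#include <stdio.h>

enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

int main(void)
{
    /* Program and data share the same memory: the defining trait. */
    int mem[32] = {
        LOAD, 16,    /* acc = mem[16]   */
        ADD,  17,    /* acc += mem[17]  */
        STORE, 18,   /* mem[18] = acc   */
        HALT, 0,
        [16] = 40, [17] = 2
    };

    int pc = 0, acc = 0, running = 1;
    while (running) {
        int op  = mem[pc];       /* fetch */
        int arg = mem[pc + 1];
        pc += 2;
        switch (op) {            /* decode and execute */
        case LOAD:  acc = mem[arg];   break;
        case ADD:   acc += mem[arg];  break;
        case STORE: mem[arg] = acc;   break;
        case HALT:  running = 0;      break;
        }
    }
    printf("mem[18] = %d\n", mem[18]);   /* prints 42 */
    return 0;
}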
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Originally posted by: Peter
>"what do you need a computer for?" Typical answers were
>for checkbooks, word processing, maybe games (sort of),
>but wow, you don't hear people asking that kind of
>question anymore.

Right ... nowadays when you as a salesman try to figure out the needs of people who came in for a computer, you ask "what will you use the computer for" ... and receive a blank stare more often than an answer.

Today, people go buy computers because everybody has one, not because they need one.


Not really true. A lot of older people want computers to send email, get real-time pics of their grandchildren, check travel advice, and find low-price reservations online. Name a single 70-year-old who would have wanted to do that in the '80s or early '90s.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
I didn't say everyone was like that. And who changed my wording in the quote?

Besides, on optimization: it's a rewarding art, both in the algorithms AND the actual underlying code - yet it's an art that is going under a bit. Today's approach to resolving a "this is too slow" problem is to use a faster machine. Even in markets where this is not possible (like industrial or embedded computing, where you simply can't put in a faster machine), most programmers don't have the faintest clue how to speed up their programs, or even how to identify what's making them slow.
I get that kind of inquiry from our customers quite often - and most of the time, better algorithms and better coding yield speed increases in the 10x range while at the same time making the program simpler and smaller.

Part of the art is to identify the critical portions of the program, to optimize those and leave everything else alone. Hand coding an entire application in assembly language is just as stupid as not optimizing anything at all.
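
As a toy illustration of the kind of rewrite meant here (the task, the numbers, and the function names are all invented): counting duplicates in an array the quadratic way versus sorting a copy first. The faster version is also the simpler one once the hot spot has been identified.

/* The same task done two ways: count how many values in an array have
   appeared earlier in it. The quadratic version is the obvious first
   draft; sorting a copy first turns it into one cheap linear pass. */
#include <stdio.h>
#include <stdlib.h>

/* O(n^2): compare every element against all earlier elements. */
static int count_dupes_naive(const int *a, int n)
{
    int dupes = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)
            if (a[i] == a[j]) { dupes++; break; }
    return dupes;
}

static int cmp_int(const void *p, const void *q)
{
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

/* O(n log n): sort a copy, then count adjacent repeats in one pass. */
static int count_dupes_sorted(const int *a, int n)
{
    int *copy = malloc((size_t)n * sizeof *copy);
    if (!copy) return -1;
    for (int i = 0; i < n; i++) copy[i] = a[i];
    qsort(copy, (size_t)n, sizeof *copy, cmp_int);

    int dupes = 0;
    for (int i = 1; i < n; i++)
        if (copy[i] == copy[i - 1]) dupes++;
    free(copy);
    return dupes;
}

int main(void)
{
    int data[] = { 3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5 };
    int n = (int)(sizeof data / sizeof data[0]);
    printf("naive:  %d\n", count_dupes_naive(data, n));
    printf("sorted: %d\n", count_dupes_sorted(data, n));
    return 0;
}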
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: TJN23
Originally posted by: Epimetreus
but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.


the von neumann architecture has been around for a long time and doesn't seem likely to change... which i believe states that you have memory, secondary storage, a processor, and I/O...