Couldn't agree more!

Originally posted by: gururu
By some standards, we won't see a 'great' development until the use of electrons is vanquished.
No it's not, that would be your way of looking at it.

Originally posted by: Shalmanese
That's like saying there have been no major improvements in industry since 5000 BC because we are still primarily utilising chemical energy.
Change and the need for change don't always follow each other. By your reasoning, one could ask whether any major change was ever brought about by need. I mean, as you say, "if it's not broken, why fix it?"

Originally posted by: borealiss
By your definition, it seems that even if something like optical interconnects were developed, that still wouldn't be an advancement... There hasn't been a "major" breakthrough in component manufacturing because tried-and-true methods work, and that's what materials scientists have to work with. If you want to talk about new processes, you should take a look at how the manufacturing of the substrate on hard disks has changed over the last decade. Densities for these things have skyrocketed; every year they manage to fit more bits into a square inch than the previous year. But the fundamental manufacturing process isn't going to have a major revolution if the original one works to begin with. If it's not broken, why fix it?

Even if you're talking about leaps and bounds in manufacturing processes, I still think you're understating the ingenuity that goes into the semiconductor manufacturing process. If a gallium arsenide mixture works, why change it? If a 9-metal-layer process works, why fix it? What's next on the list? X-ray lithography, I believe, for beyond 90 nm? You're asking for a major revolution in manufacturing when one isn't really needed. Necessity is the mother of invention, and when the time comes that it is necessary to move to something beyond semiconductors, we'll make that transition. As of now, it's not necessary, but that doesn't mean R&D efforts aren't being put forth for the next generation of substrate materials for constructing ICs.
Your opinion of what constitutes a "great development" is somewhat boggling as well. The improvements in computers are certainly not limited to manufacturing processes; certain architectural enhancements are extremely noteworthy, easily as much as a manufacturing breakthrough. Where would modern computers be today if superscalar architecture hadn't been developed? How about improvements in out-of-order execution? The growing number of functional units on CPUs? The advanced schedulers required in the face of ever-lengthening pipelines? Simple improvements in branch-prediction algorithms? What about the fact that most CPUs are just using the x86 ISA as a wrapper?

IMHO, the fact that x86 is just a common front end is an extremely important achievement, as the decoders in modern CPUs get more modular for the IA-32 ISA. If Transmeta's technology had taken off a bit more, I think we would have seen a paradigm shift from CPUs designed for a particular instruction set to CPUs that adapt to existing instruction sets as a wrapper. But my digression aside, all of these add up. I think you're limiting your scope to a very narrow field and not doing your original question justice by focusing only on manufacturing.
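To make the branch-prediction point above concrete, here is a minimal C sketch (my own illustration, not something from the thread; the array size and threshold are arbitrary): the same loop typically runs noticeably faster over sorted data than over random data, because the branch outcome becomes predictable. Aggressive compiler optimization (-O2 and up) may replace the branch with a conditional move and hide the effect, so compile with something like -O1 to see it.

    /* Same instruction count both times; only the predictability of the
     * branch inside sum_above_threshold() changes. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 20)

    static int cmp_int(const void *a, const void *b)
    {
        return *(const int *)a - *(const int *)b;
    }

    static long sum_above_threshold(const int *v, int n)
    {
        long sum = 0;
        for (int i = 0; i < n; i++)
            if (v[i] > 128)          /* this branch is what the predictor sees */
                sum += v[i];
        return sum;
    }

    int main(void)
    {
        int *unsorted = malloc(N * sizeof *unsorted);
        int *sorted   = malloc(N * sizeof *sorted);
        if (!unsorted || !sorted)
            return 1;

        srand(42);
        for (int i = 0; i < N; i++)
            unsorted[i] = sorted[i] = rand() % 256;
        qsort(sorted, N, sizeof *sorted, cmp_int);

        clock_t t0 = clock();
        long a = sum_above_threshold(unsorted, N);   /* branch outcome is random */
        clock_t t1 = clock();
        long b = sum_above_threshold(sorted, N);     /* branch outcome is predictable */
        clock_t t2 = clock();

        printf("unsorted: %ld (%.3f s)\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("sorted:   %ld (%.3f s)\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);

        free(unsorted);
        free(sorted);
        return 0;
    }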
Originally posted by: borealiss
By your definition, it seems that even if something like optical interconnects were developed, that still wouldn't be an advancement... There hasn't been a "major" breakthrough in component manufacturing because tried-and-true methods work, and that's what materials scientists have to work with.
Originally posted by: Coldfusion
I think many of the technological advances have been masked by poor programming.
By far the biggest advances have been made on the materials side. While making something smaller, faster, and cheaper may not seem like a huge breakthrough, it truly is. CPUs today are 100x faster than those of 10 years ago, while being smaller and consuming less power. Hard drives are also smaller and faster.
Programs have become bloated. With all the extra horsepower, time has not been spent optimizing code; time has only been spent cranking out more and more of it. When the technological barriers to ramping up clock speed and disk capacity are finally hit, that optimization will happen. Clock speeds and disk capacity will remain the same, yet performance will increase nonetheless.
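As a rough sketch of the kind of easy win being alluded to (this example is mine, not Coldfusion's): hoisting a repeated strlen() call out of a loop turns an accidentally quadratic scan into a linear one, a speedup that costs no new hardware at all.

    /* A common "bloat" pattern: calling strlen() every iteration makes the
     * loop O(n^2); hoisting it makes the same loop O(n). Same machine,
     * same clock speed, large speedup on long strings. */
    #include <stddef.h>
    #include <string.h>

    /* Accidentally quadratic: strlen() rescans the whole string each pass. */
    size_t count_spaces_slow(const char *s)
    {
        size_t count = 0;
        for (size_t i = 0; i < strlen(s); i++)
            if (s[i] == ' ')
                count++;
        return count;
    }

    /* Same result, but the length is computed once. */
    size_t count_spaces_fast(const char *s)
    {
        size_t count = 0;
        size_t len = strlen(s);
        for (size_t i = 0; i < len; i++)
            if (s[i] == ' ')
                count++;
        return count;
    }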
Originally posted by: Shalmanese
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.
Far from being more stable and integrated, they would be bug-ridden and half-finished, because programmers would be forced to do everything by hand.
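To illustrate the buffer-overrun point in the quote above (my example, not Shalmanese's): classic C string handling trusts the caller to stay inside the buffer, which is exactly the class of bug that bounds checking, or an explicit length check, removes.

    /* The unsafe version writes past 'buf' whenever the input is longer
     * than 15 characters; the checked version refuses oversized input. */
    #include <stdio.h>
    #include <string.h>

    #define BUF_SIZE 16

    void copy_unsafe(const char *input)
    {
        char buf[BUF_SIZE];
        strcpy(buf, input);                /* no bounds check: overruns if input >= 16 chars */
        printf("%s\n", buf);
    }

    int copy_checked(const char *input)
    {
        char buf[BUF_SIZE];
        if (strlen(input) >= sizeof buf)   /* explicit bounds check */
            return -1;                     /* reject oversized input */
        memcpy(buf, input, strlen(input) + 1);
        printf("%s\n", buf);
        return 0;
    }

A language with built-in bounds checking makes the unsafe version impossible to write by accident, which is the trade-off being defended in the quote.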
I was just discussing that with a programmer friend of mine, and I made a post about this not so long ago in the OS forum... you are so right!

Originally posted by: Coldfusion
Originally posted by: Shalmanese
Bleh, optimising code is a waste of good talent. There is "good" processor wastage and "bad" processor wastage. Things like garbage collection and dynamic array allocation are a godsend. Sure, they cost processor time, but it's worth it. Half the security exploits are from buffer overruns, simply because people are still using a language from a time when dynamic bounds checking was too expensive.
Well, part of the problem is programmers have NO IDEA what is going on at the machine level when they call such and such a command. I firmly believe assembly should be part of every college curriculum, as it gives a greater understanding of what is going on underneath.
Until then, you'll continue to see nested for loops, all database entries stored in one giant table, and unmodularized code. I can't believe the number of "professional" programmers I've met who have no idea how to program.
Garbage collection is not wastage; it's actually doing something useful. Wastage is calling cmul instead of sll when multiplying by two, when sll does the same job 32x more efficiently. Anybody could be a programmer given unlimited CPU cycles and system resources.
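A small C sketch of the strength-reduction idea being described (the instruction names in the comments assume a MIPS-like ISA, matching the sll mnemonic above; exact cycle counts vary by processor):

    /* Multiplying by a power of two can be done with a single left shift
     * instead of a full multiply. Modern compilers do this automatically,
     * so the waste mostly shows up in hand-written or unoptimized code. */
    #include <stdint.h>

    uint32_t double_with_multiply(uint32_t x)
    {
        return x * 2;      /* naive form; written literally, a multiply instruction */
    }

    uint32_t double_with_shift(uint32_t x)
    {
        return x << 1;     /* strength-reduced form; on MIPS, a single sll */
    }

    /* Any reasonable compiler at -O1 or above emits the shift (or an add)
     * for both functions, so the difference only matters where the
     * programmer, not the compiler, picks the instruction. */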
Far from being more stable and integrated, they would be bug-ridden and half-finished, because programmers would be forced to do everything by hand.
And this is different from now, how? Programs are bug-ridden and half-finished today. The only difference is that you have people who have NO idea what they're doing writing programs in VB and other drag-and-drop programming environments. Such an advance: programs that write the bad code for people who don't even know enough to write bad code.
Originally posted by: Epimetreus
but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.
Originally posted by: Peter
>"what do you need a computer for?" Typical answers were
>for checkbooks, word processing, maybe games (sort of),
>but wow, you don't hear people asking that kind of
>question anymore.
Right ... nowadays when you as a salesman try to figure out the needs of people who came in for a computer, you ask "what will you use the computer for" ... and receive a blank stare more often than an answer.
Today, people go buy computers because everybody has one, not because they need one.
Originally posted by: TJN23
Originally posted by: Epimetreus
but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.
The von Neumann architecture has been around for a long time and doesn't seem likely to change... which, I believe, states that you have memory, secondary storage, a processor, and IC...
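For what it's worth, the stored-program idea being described can be sketched in a few lines of C: one memory array holds both the instructions and the data, and the processor just loops fetch, decode, execute. The opcode encoding below is invented purely for illustration and is not any real ISA.

    /* Toy von Neumann machine: a single memory array holds both the program
     * and its data; the CPU repeatedly fetches, decodes and executes. */
    #include <stdio.h>

    enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_STORE = 3 };

    int main(void)
    {
        /* Instructions are opcode/operand pairs; data lives at addresses 10+. */
        int mem[16] = {
            OP_LOAD, 10,     /* acc = mem[10]        */
            OP_ADD,  11,     /* acc += mem[11]       */
            OP_STORE, 12,    /* mem[12] = acc        */
            OP_HALT, 0,
            0, 0,
            2, 3, 0,         /* data: mem[10]=2, mem[11]=3, mem[12]=result */
            0, 0, 0
        };

        int pc = 0;          /* program counter */
        int acc = 0;         /* accumulator register */

        for (;;) {
            int op  = mem[pc];       /* fetch */
            int arg = mem[pc + 1];
            pc += 2;

            switch (op) {            /* decode + execute */
            case OP_LOAD:  acc = mem[arg];  break;
            case OP_ADD:   acc += mem[arg]; break;
            case OP_STORE: mem[arg] = acc;  break;
            case OP_HALT:  printf("result = %d\n", mem[12]); return 0;
            }
        }
    }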