Has there really been such a great development the last 20 years?

HarryAngel

Senior member
Mar 4, 2003
511
0
0
I was discussing with a friend of mine the last 10-15 years of computing that we have been doing. The computing industry has moved forward, but has there really been such great development in the last 20 years? I mean, compare 10 years ago with today... stuff that you did with your PC then, with software like CorelDRAW, and stuff you can do today; compare the speed. I don't see a great improvement in computing. Sure, everything is in gigahertz and gigabytes, but the software programs are also much bigger. Does your production work get done faster today than, say, 10 years ago? Maybe. But it's not as fantastic as one thinks. I say that the groundbreaking development occurred 10-20 years ago.
 

neo4s

Member
Dec 21, 2002
83
0
0
I agree, the technology hasn't really changed, just gotten faster, bigger, cheaper. But there hasn't been a need for any fundamental change. As apps get bigger, computers just get faster. However, as we approach the limits of current technology, there will be a need for fundamental change. I guess that's when we'll see quantum computing, bio-organic logic, and other sci-fi concepts become reality. Then again, will that truly be a "breakthrough" or just an extension of current technology? What can we really consider to be the next breakthrough? Will we have to develop technology beyond 1's and 0's, truly artificial intelligence?
 

Oaf357

Senior member
Sep 2, 2001
956
0
0
The big thing these days (IMO) is that there is so much technology out there that the innovations/developments are only going to be integration of those technologies to better suit people.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
To me, the widespread acceptance in society of the Internet and the WWW is a huge development in the last 20 years of computing. Today you can listen to music, make reservations, check status, read or watch the news, download Ebooks and programs and access a huge information (and disinformation) repository. This is huge to me, and in fact, seems to me to have an even larger impact than the development of the PC itself. Without the IBM PC, we'd probably have Apples or Sinclairs or Amigas. But without the Internet and the WWW... we'd just have a bunch of computers connected in small area networks.

And while we are at it, even 10 years ago we barely had Windows. Windows 3.1 was a pain: it was kludgey, it crashed frequently, and it wasn't that easy to use. Let alone 20 years ago when it was 1983: we had $3000 computers that had 256K of RAM, were extremely hard to use, had limited graphics (if any) on a monochrome screen, didn't have any sound beyond a really annoying speaker, required text input and a big manual and/or training to use, and printed out something that may or may not have been what you had on the screen, on impact printers or the new-fangled dot-matrix printers. Now we have a graphical operating system that a 4-year-old child can use, coupled with printers that can spit out a 4x6 photo virtually indistinguishable from a lab photo, WYSIWYG, reasonably high-fidelity sound, and better-than-TV-quality graphics, all on a computer that costs about $1000 or less.
 

borealiss

Senior member
Jun 23, 2000
913
0
0
i think portable devices have come a long way, same with wireless. cell phones that are practically pdas are a reality. you can check email with your phone, take a picture, compose that picture into a new email, and send it to a friend across the country while driving a car (not the safest thing, tho). this was a long way off 10-20 years ago.

the integration of technology into our daily existence compared to what it was 10-20 years ago is evidence enough that there have been great developments. the WWW, a huge one imho, is a good example, like pm pointed out. the fact that we haven't noted what these great developments are in our daily lives shows the level of integration they occupy, which is a good thing i believe.

the advance in computing power has also been a great plus for better and safer software. real preemptive multitasking vs. time-sharing systems that aren't thousands of dollars are now a reality. the birth of the "low cost" computer has made owning a computer and getting connected a lot easier. when was the last time you paid around 500 bucks for a decent system 10-20 years ago? i believe that wasn't a reality. i recall my parents paying $3k for my Dell 486DX2 66mhz system, with a fancy "math coprocessor". that's what they were called back then...

although not a direct factor in our lives, more computing power has meant better r&d efforts at universities and other companies, the effects of which we see in our daily lives: better structural engineering, simulated crash tests, etc., leading to safer and better technologies. the car you drive was probably simulated on hardware using structural analysis software. communication is another major development that computers facilitate. video conferencing with people half a world away can be done now that we have the computing power for better codecs. the list goes on and on...
 

Epimetreus

Member
Apr 20, 2003
72
0
0
I think the question was directed more at the physics of computing than at the integration, sociological repercussions, or compactness.
Beneath all those strata of coding, beneath the object-oriented layers, OS-controlled functions, assembler optimizations, BIOS interfaces, etc., there is a point at which the "computer" as a coded entity ends and the laws of physics begin. It's much easier these days to forget that, to the point that some people don't even realize it consciously, even people who are actually pretty technically apt.
I think the question, at its most fundamental, was meant to ask: in the last twenty years, where has the innovation gone in terms of finding new ways to take advantage of the basic laws of physics? Our computer components are smaller, use different formulae, are further optimized and more complex; but they're still silicon, gold, copper, circuit board, etc. There's not been another quantum shift, such as the change from, say, mechanical computers to electronic, vacuum tubes to transistors, tape to discs, discs to CD-ROMs, etc. There have been such shifts in peripherals -- CRT monitors are going fast and CD-ROMs/DVDs are completely murdering VHS and cassettes, etc. -- but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.
 

AdvancedRobotics

Senior member
Jul 30, 2002
324
0
0
Computing has advanced quite significantly in 20 years. We have gone from no graphics cards... to 2D cards... to 3D cards... and now to GPUs that do both 2D and 3D.

As well, if you look at the development of operating systems and software, it has progressed greatly. There are many programs that can do just about anything. In the past, the computer was just a tool for the rich that could do only a few things. Now it is a tool for everyone, with access to a lot of information.

Technology is always changing and R&D is constantly being done. Currently we are at the dawn of the age of molecular and quantum computing: computers and chips that will use carbon nanotubes and even DNA.

I think your point that the major development was ~10-20 years ago is quite valid. But nanoscale research is now becoming more and more practical. The research that has gone into constantly shrinking computers and making them faster has opened new doors in the biomedical, robotics, and engineering fields.

Computing has become more chemical/biological with the research into carbon nanotubes and DNA. By shrinking circuits and coming up with more efficient ways to store data, one will soon be able to store thousands of files on a chip that fits into one's pocket, as well as build more efficient, lighter-weight robots and space shuttles, and produce more of them.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Originally posted by: HarryAngel
I was discussing with a friend of mine the last 10-15 years of computing that we have been doing. The computing industry has moved forward, but has there really been such great development in the last 20 years? I mean, compare 10 years ago with today... stuff that you did with your PC then, with software like CorelDRAW, and stuff you can do today; compare the speed. I don't see a great improvement in computing. Sure, everything is in gigahertz and gigabytes, but the software programs are also much bigger. Does your production work get done faster today than, say, 10 years ago? Maybe. But it's not as fantastic as one thinks. I say that the groundbreaking development occurred 10-20 years ago.

spoken like a true 15 year old :)
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Just go back to what you'd get for the same money in 1993, or 1983. Like, the $500 that buys you an extremely okay PC today wouldn't even have bought you a decent 486 in 1993, and in 1983 you'd have been hard pressed to even find a C64 or Atari 800XL w/ floppy drive for that kind of money.
Then go use the software of the respective age.
Just for comparison: 20 MByte (!) HDDs were $1000 in 1986 and 300 KB/s was "fast". Computers ran at 4 to 8 MHz, and 1 MByte of RAM was plenty.
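Those numbers make the scale of the change easy to quantify. Here is a quick back-of-the-envelope sketch: the 1986 figures are taken from the post above, while the 2003 drive price and capacity are my own rough assumptions, not quoted prices.

```python
# Cost-per-megabyte comparison, 1986 vs. 2003.
# 1986: $1000 for a 20 MB hard drive (figure from the post above).
# 2003: assumed ~$100 for an 80 GB drive -- a ballpark guess, not a quote.
usd_per_mb_1986 = 1000 / 20          # $50.00 per MB
usd_per_mb_2003 = 100 / (80 * 1024)  # ~$0.0012 per MB

factor = usd_per_mb_1986 / usd_per_mb_2003
print(f"1986: ${usd_per_mb_1986:.2f}/MB")
print(f"2003: ${usd_per_mb_2003:.5f}/MB")
print(f"Cost per megabyte fell by a factor of roughly {factor:,.0f}")
```

Even with generous error bars on the assumed 2003 price, the cost per megabyte dropped by four to five orders of magnitude in under two decades.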

What absolutely hasn't made any progress is the form factor of the standard PC. We still get clunky big tin boxes, with essentially nothing inside anymore, but still the same extremely inelegant brick.
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
WWW

Where I live almost everyone uses the web (we even have our own way of spelling it, "webb") once in a while: tax returns, buying tickets, books, CDs, etc.
And that has happened in 10 years!
 

PrinceXizor

Platinum Member
Oct 4, 2002
2,188
99
91
Well, the computer as a concept has been around for a while (the famous "Yeah, but what is it good for?" quote comes to mind).

Big advancements

1. The personal computer (definitely 20 years ago)
2. The cost of computers (roughly 15-20 years ago)

I remember my grandfather bought a brand spanking new PS/2 that had a screaming math co-processor for $7000!!

3. The WWW

Not really a direct computer advancement, but a benefit of them nonetheless.

Think of it this way: has there really been a "revolutionary" television invention since color? The basic technology is still the same as it was 50 years ago.

It's logical to conclude that computers in general will slowly go the same way as a TV, so common and easy that we'll be using them to find info on the "NEXT" really cool technological idea.

P-X
 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: Shalmanese
Originally posted by: HarryAngel
I was discussing with a friend of mine the last 10-15 years of computing that we have been doing. The computing industry has moved forward, but has there really been such great development in the last 20 years? I mean, compare 10 years ago with today... stuff that you did with your PC then, with software like CorelDRAW, and stuff you can do today; compare the speed. I don't see a great improvement in computing. Sure, everything is in gigahertz and gigabytes, but the software programs are also much bigger. Does your production work get done faster today than, say, 10 years ago? Maybe. But it's not as fantastic as one thinks. I say that the groundbreaking development occurred 10-20 years ago.

spoken like a true 15 year old :)
Did I crap in your home? No I didn't, so please don't crap in my thread.
Sorry mate, but I have to say that if you're going to come in and cr@p in someone's thread, you'd better bring something a little stronger than "spoken like a true 15 year old :)"

 

HarryAngel

Senior member
Mar 4, 2003
511
0
0
Originally posted by: Epimetreus
I think the question was directed more at the physics of computing than at the integration, sociological repercussions, or compactness.
Beneath all those strata of coding, beneath the object-oriented layers, OS-controlled functions, assembler optimizations, BIOS interfaces, etc., there is a point at which the "computer" as a coded entity ends and the laws of physics begin. It's much easier these days to forget that, to the point that some people don't even realize it consciously, even people who are actually pretty technically apt.
I think the question, at its most fundamental, was meant to ask: in the last twenty years, where has the innovation gone in terms of finding new ways to take advantage of the basic laws of physics? Our computer components are smaller, use different formulae, are further optimized and more complex; but they're still silicon, gold, copper, circuit board, etc. There's not been another quantum shift, such as the change from, say, mechanical computers to electronic, vacuum tubes to transistors, tape to discs, discs to CD-ROMs, etc. There have been such shifts in peripherals -- CRT monitors are going fast and CD-ROMs/DVDs are completely murdering VHS and cassettes, etc. -- but when it comes to processors, RAM, hard drives, and other deep internals, there's been little fundamental advancement in a very long time.
Exactly! That's what I meant. Fundamentally, there hasn't been any great development in the physics of computing in the last 20 years.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Originally posted by: HarryAngel
Did I crap in your home? No I didn't, so please don't crap in my thread.
Sorry mate, but I have to say that if you're going to come in and cr@p in someone's thread, you'd better bring something a little stronger than "spoken like a true 15 year old :)"

Sorry if that sounded a bit brusque, it's just that what I wanted to say had been said better by pm anyway. If a person were auto-magically transported from 1983 to today, he would be absolutely mind-boggled.

Anyway, I'll give it a shot:

email: Can you live without email? I know I certainly couldn't live without email as my life is currently structured.

mobile phones: you can find anybody, anytime. Everyone is always on demand.

design & production: Can you imagine the freedom of building virtual anythings and then being able to tweak them to your heart's content without ever being locked in?

internet & multimedia: you can find out anything about anything within minutes. You can make friends with people around the globe.
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Technology has become more adapted to our culture. Our culture has become more centered around technology and will continue to do so. This is definitely the result of lower-cost computers, mobile phones, internet access, etc. 10 years ago, if you told an average Joe you bought airline tickets yourself "on an online computer network", they would look at you like some kind of freak. Not so anymore.

The real question is could we have achieved our cultural dependence on technology if computer systems today weren't as powerful as now? Let's say everyone was still using primarily text based systems because the CPU was too weak for graphics (think Apple //). I would probably argue "no" because it would be too dull. I remember long ago the argument going around was, "what do you need a computer for?" Typical answers were for checkbooks, word processing, maybe games (sort of), but wow, you don't hear people asking that kind of question anymore. Technology has just become so much more accepted by the general audience, rather than just the geek crowd, that the money is there to make it expand even further.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
The last 20 years have BEEN the great development. Computers now are closer to their originally conceived purpose than ever and development will occur until they can no longer surpass their potential. Development will take as long as it takes. No idea manifests itself into a useful entity in a short period of time.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
>"what do you need a computer for?" Typical answers were
>for checkbooks, word processing, maybe games (sort of),
>but wow, you don't hear people asking that kind of
>question anymore.

Right ... nowadays when you as a salesman try to figure out the needs of people who came in for a computer, you ask "what will you use the computer for" ... and receive a blank stare more often than an answer.

Today, people go buy computers because everybody has one, not because they need one.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Exactly! That's what I meant. Fundamentally, there hasn't been any great development in the physics of computing in the last 20 years.
I guess I still don't see what you mean. There have been huge changes in the physics and chemistry of making chips. We have shifted the materials and methods used in lithography; we have switched the materials that make up the gate oxide, the wiring, the interlayer dielectric, and the passivation layer. The silicon of the transistors is shifting towards other materials (SiGe, SOI). There have been huge changes in the physics of designing chips. We have wire widths getting so thin that soon the effects of electron scattering on the few atoms that make up the wire will have profound effects on resistivity.

Yes, most chips are still made on silicon and CMOS, but cars are still made (primarily) out of metal and run on gasoline as they did 90 years ago and no one could say that there has been no great progress in designing and manufacturing cars.

Silicon is cheap and does the job very well. Until there is a reason to switch to something else, no one will. This isn't due to lack of progress or lack of creativity - it boggles my mind how much clever engineering effort goes into each process generation shift that the average consumer takes completely for granted - it's due to practicality and economics.
 

q2261

Senior member
May 20, 2001
304
0
0
Look, folks, HarryAngel is talking about the physics involved in building the fundamental parts of computers, NOT about new things we build with the same components. Keeping this in mind, I think the basis of computing hasn't really changed all that much in the past 20 years. And no, I am NOT a 15-year-old :)
 

LuckyTaxi

Diamond Member
Dec 24, 2000
6,044
23
81
Originally posted by: pm
To me, the widespread acceptance in society of the Internet and the WWW is a huge development in the last 20 years of computing.

AMEN!

 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Look folks, HarryAngel is talking about the physics involved in building fundamental parts of computers.
And I'm saying that much of that has changed as well, and most people simply don't know it. Quantum mechanics has caught up with most aspects of chip design, and in the current generation things like electron tunneling, scattering, and other esoteric non-Newtonian effects will introduce some interesting non-linear aspects to chip design. So design is fundamentally different nowadays from what it was 20 years ago, back when transistors behaved like on-off switches, the gate oxide behaved like an ideal capacitor, and wires behaved as if resistance were linear and inductance didn't exist.

And practically nothing aside from the silicon itself and the fundamentals of lithography and etch is the same in manufacturing a chip. The bulk substrate has shifted from bulk silicon to epitaxially grown silicon or SOI or SiGe. The gate oxide has shifted from silicon oxide to SiN or something even more exotic. The polysilicon gate has shifted from polysilicon to some salicide. The wires have shifted from aluminum to tungsten and copper. The dielectric has shifted from SiO2 to all sorts of various recipes. The passivation has shifted from SiO2 to polyimide or some other flavor. The lasers, the optics and the photoresist have all changed several times over.

A process shift from, say, 0.18um to 0.13um isn't as simple as telling every person working out in the fab, "Ok, guys, this week we are going to make everything smaller than we did last week." A process shift is a hideously complex affair involving literally hundreds of companies and thousands of engineers and technicians. At any given point, most of the technologies required for manufacturing chips two generations ahead haven't been invented yet. Looking at the ITRS (International Technology Roadmap for Semiconductors) roadmap for lithography for 2005, many of the entries are red, which means "solutions are not known". See pages 8-14 of this .PDF document from ITRS

My point is that anyone who says there haven't been "great developments" in silicon manufacturing and design in the last 20 years is standing too far away from the action to see how difficult the challenges are and how impressive the solutions have been. And it's not as if silicon remains the basis for most semiconductor electronics because there have been no innovations or developments in finding other materials. The fact is that silicon is the "best" material we have available for manufacturing when economics is taken into consideration.

Patrick Mahoney
Microprocessor Design Engineer
Enterprise Processor Division
Intel Corporation
 

lobe

Senior member
May 20, 2000
202
0
0
You say "in the last 10-15 years" or "the last 20 years" as if those are close.

5 years difference when it comes to computing is a LONG time.
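To put a rough number on that claim, here is a small sketch using the common rule of thumb that transistor counts double roughly every 18 months. The doubling period is an assumption for illustration, not a physical law.

```python
# Compound growth under a Moore's-law-style doubling period.
# The 18-month doubling time is a rule of thumb, not an exact figure.
def growth_factor(years, doubling_months=18):
    """Factor by which capacity grows over `years` at the given doubling rate."""
    return 2 ** (years * 12 / doubling_months)

print(f"5 years:  ~{growth_factor(5):.0f}x")   # roughly 10x
print(f"20 years: ~{growth_factor(20):.0f}x")  # roughly 10,000x
```

Under that assumption, five years is an order of magnitude, which is why lumping "10-15 years" together with "20 years" blurs the picture considerably.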

If your point is that the developments in computing have been more evolutionary than revolutionary, I can accept that. However, what an evolutionary development it has been!

If anything, I have noticed the RATE of evolution has been slowing for the past year.

Hey, I'm ready for something revolutionary. Sounds like you are, too.
 

Epimetreus

Member
Apr 20, 2003
72
0
0
Yes, most chips are still made on silicon and CMOS, but cars are still made (primarily) out of metal and run on gasoline as they did 90 years ago and no one could say that there has been no great progress in designing and manufacturing cars.

I don't think cars have made a lot of progress either. They have made as many steps back as forward, in reality. Example: we get more miles per gallon because we switched from high-quality steel engine blocks to aluminum-based materials, alloys, etc., but we lost engine longevity. That's why cars that were made in 1950 and *well* maintained are still in good working condition, but well-maintained cars from 1980 are probably in similar or worse working condition. We switched from heavy steel bumpers to the plastic affairs we use now (also to improve fuel efficiency), but an older car will crumple a newer one like the tin can it is.
Computers have come farther and faster, and I do not wish to pretend that the integration, precision, and sociological impacts have not been... drastic, to say the least. But we are still in the same basic paradigm as we were previously.
We do it better, faster, more cleverly, and in new ways... but we still do "it". Now I know great inroads are being made into such things as nanotubes and the like, but they're not "out" yet. Thus they don't affect "we", the consumer.
I am not arguing that progress has not been made; I am saying that the progress we have seen is still within the same, I hate to use the term, paradigm of development.