how can computer technologies keep constantly and predictably advancing?

dpopiz

Diamond Member
Jan 28, 2001
4,454
0
0
CPUs get faster
hard drives get bigger
wireless networks get faster
etc

and people just expect them to keep doing so at about the same rate


how? it seems to me that if you want to make a faster/bigger/whatever thing, you would need a real technological breakthrough. I don't understand how breakthroughs can just keep predictably happening
 

Lynx516

Senior member
Apr 20, 2003
272
0
0
These technologies don't tend to have large breakthroughs; they evolve through lots of smaller ones. The progress is not always predictable, either. Take the problems Intel has had with 90nm: in the last year CPU speeds have hardly increased. Wireless networks are a new technology, so they will evolve pretty quickly. Hard drives for a long time didn't get much bigger; I remember the max HDD size hovering around 8 GB for ages before drive sizes exploded.
 

HVAC

Member
May 27, 2001
100
0
0
That's a big question. It will probably take more than a simple reply to satisfy your curiosity. I will contribute my bit.

Research. The economic pressure of competition results in research into new features, designs, materials, and processes. The rate of change of the semiconductor industry is described by "Moore's Law". This is a target that these companies have accepted and strive to maintain in order to remain economically viable.
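
To put a number on that target: Moore's Law is usually stated as transistor counts doubling roughly every 18 to 24 months. Here's a minimal sketch of the projection in Python; the 24-month doubling period and the 42-million starting count (roughly a Pentium 4 of this era) are illustrative assumptions, not official industry figures.

[code]
# Rough Moore's Law projection: transistor count doubles every ~24 months.
# Starting count and doubling period are illustrative assumptions.
def moores_law(start_transistors, years, doubling_years=2.0):
    """Projected transistor count after a given number of years."""
    return start_transistors * 2 ** (years / doubling_years)

for year in range(0, 11, 2):
    count = moores_law(42e6, year)  # ~42 million: roughly a Pentium 4
    print("Year %2d: ~%d million transistors" % (year, count / 1e6))
[/code]

Run that and you get about a 32x increase over a decade, which is why the industry treats the "law" less as physics and more as a planning schedule.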

The invention and continual progress of computers aids computer research, because computers reduce the time necessary to carry out design, material research, process design, and so on, which results in faster computers... ad nauseam.

I am pretty sure this is only one facet of the explanation, but let someone else continue it.....
 

Pudgygiant

Senior member
May 13, 2003
784
0
0
Lynx, that isn't true though. There have always been holdups in hardware development. In fact, that may be a key factor in keeping the advancement rate constant. So yes, it actually is fairly predictable. AMD called it two years ago that they'd release Clawhammer in 2H03, and they did. (I wish the games industry was like that... HL2's release date has slipped way too much.)
 

ReiAyanami

Diamond Member
Sep 24, 2002
4,466
0
0
Technology goes forward, not backward, and nor is it "constant". Technological advancement is dependent on the discovery of things we do not know, and luckily humans do not know very much, hence we are assured that we will consistently advance as long as humankind keeps learning NEW things.

Will the next 100 years be as revolutionary as the previous hundred?

Suppose the next hundred is indeed just as revolutionary (nanomachines, biotech, quantum computing) and we discover nearly everything there is for a human to discover. At that point technology might halt its advancement or slow to near zero. But we won't need to advance our technology anymore if we can generate matter and traverse the universe instantly...
 

Lynx516

Senior member
Apr 20, 2003
272
0
0
PudgyGiant, your Hammer example doesn't hold. It was initially due for launch a year ago.

But as Rei said, it isn't that they predict very well; it's that they find so many problems that the tech evolves to counter them.

Also, when you have billions spent on R&D, some of that has to deliver a result.
 

sao123

Lifer
May 27, 2002
12,653
205
106
One thing to understand is this:

Computer companies spread out the generations of technology so that they have time to research the next generation.
For example...
When Intel creates the Pentium 6, by the time they finish researching it, it will have a speed of 10 GHz, a 3MB cache, 128-bit extensions, and some other features...
But the first Pentium 6 they sell you will be a 5 GHz, 64-bit CPU with 2MB of cache enabled... etc. Then they just up the speed and turn on features as time goes by... Finally, they give you the full monty deal about three years later, when the next-generation chip has been researched and developed and is nearly ready to manufacture.
 

masul0100

Member
Jun 19, 2001
48
0
0
I think it isn't an issue of technology but economics. "Where there is a will, there is a way," when will = an a.ssload of money to be made.

Just a thought
Masul
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Yep. Where there is a will, there is a way.

Our scientists are always working on new technologies and are improving and developing new techniques to control energy.

Engineers are constantly developing new manufacturing techniques and are improving the machines.

One example I like is the metal bender: a simple device on a metal pedestal with a big metal knob in the middle. It has a handle that goes on it, with a knob on that. You take a metal bar, put it on the machine, pull on the bar hard, and it bends the metal all smooth like.

However, if you need a metal bender, you need to find someone that already has one to build another machine for you. You need a metal bender to build a metal bender. But using some engineering techniques and the laws of leverage and such, you can bend bars a bit heftier than the bars the machine making them is built out of.

So each machine that gets built is a bit hardier and can bend slightly larger and tougher metal bars. Then the machine that gets built from that machine can be made more powerful and bigger.

Then they begin adding motors and developing other metal-shaping devices that continuously build machines bigger, more powerful, and more complex than themselves.

etc etc.

Now we have gigantic machines that we use to build super oil tankers, skyscrapers, monstrous bridges, and stuff like that.

The only thing people aren't sure about is where the first metal bender came from, and how the guy built it without having a metal bender in the first place. :p

And that's how I figure engineering science works. Each new development of technology provides the means to push the boundaries a little farther back. New developments provide the basis for testing new ideas and developing new techniques to handle the problems created by pressing current technology far beyond its current capabilities. You can build bigger, purer silicon wafers by correcting mistakes from the past, thus making them cheaper. The machines that make the traces for CPUs are designed on ever-improving computers that make tighter and tighter tolerances possible. Etc. etc.

And the occasional breakthrough doesn't hurt either. For instance, from a Slashdot story a while back I learned of the race to create a pure, very large diamond. Diamonds are just carbon crystals of a particular type; it's just hard to make big, pure ones, because anything you use to make the diamonds gets trapped in the crystal structure.

Now you've got one guy using a Russian technique, developed years ago, for making large industrial diamonds. It never took off because the amount of precision needed for a very complicated procedure meant that almost every time you tried to make a diamond, something went wrong. Now, using modern robotic technology and computers, that level of repeated exactness is fairly easy. So he makes diamonds that are big and completely indistinguishable from a natural diamond chemically and visually. The only real flaw is that they are too perfect.

The second guy is more interesting. He creates diamonds by growing them in a chemical environment. By slowly growing them up from a diamond "seed", the chemicals bond to the existing diamond and grow more crystals on it, making it bigger. One of the things he is growing is large wafers. So he grows them out, slices them up into thin slices, puts them back in the chemical environment, and grows them thicker and wider.

Eventually he wants to create diamonds that are very wide, deep, and flat. The bigger the seed he creates for the next diamond, the bigger and faster he can grow the next one.

You see, you can dope diamonds just like you can silicon: you add impurities to modify the electrical behavior of the crystal and create transistors. Since you can add the impurities at will when you grow diamonds like that, you will eventually be able to build diamond-based electronics, especially CPUs.

The neat thing is that diamond can withstand intense heat without breaking down the way silicon does. So you could make a CPU core that you can overclock the snot out of. You could have it put out hundreds of degrees worth of heat; the only practical limitation would be the ability of the surrounding electronics to withstand the heat of the CPU.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
It won't... it can't. Electrons are only so small and can only do so many things. Computer technology (electron-based) will probably plateau within the next 20 years or so. Personally I think the next generation of technology will be more photon-based, or something close along those lines.
 

Pudgygiant

Senior member
May 13, 2003
784
0
0
Ah, that again, rimshaker. He didn't say "how WILL", he said "how CAN", implying past and present-tense examples. And right now we're pretty far from hitting the limit of the electron. And the "next generation" will most definitely NOT be photon-based (although one could argue we have the groundwork laid for it with fibre), as Intel has apparently already released roadmaps up to the P6...
 

wacki

Senior member
Oct 30, 2001
881
0
76
Originally posted by: rimshaker
It won't... it can't. Electrons are only so small and can only do so many things. Computer technology (electron-based) will probably plateau within the next 20 years or so. Personally I think the next generation of technology will be more photon-based, or something close along those lines.

Nicely phrased, as far as saying "electrons" and not just computers.

Electron-based CPUs will plateau in the next 20 years or so, but an Israeli company just built a CPU that's photon-based. It has the power of 1,000 P4s. Since photon-based computation isn't affected by capacitance like electron-based computation, Moore's Law will probably continue beyond any of our lifetimes. Then there might be quantum computers, and stuff will get really crazy. Who knows, though. But one thing is certain: thanks to photons, Moore's Law will continue on for us.

Linky to Israel

Also, storage cubes that look like they belong on Star Trek are currently being used by IBM and the U.S. government; they can hold a terabyte, and maybe even a petabyte, in a cubic centimeter of glass. The technology is called high-density holographic storage. It has a current transfer rate of a gigabit per second.

Sorry, no links available for that one. Try Google, though.
 

Pudgygiant

Senior member
May 13, 2003
784
0
0
But like I said a couple of posts up, just because someone has the technology doesn't mean they'll put it out. For now, going with the vanilla electron CPU is a bunch more profitable for Intel and AMD than researching a completely new architecture would be. Hell, especially since they just blew all that money R&D'ing 64-bit desktop CPUs...
 

tinyabs

Member
Mar 8, 2003
158
0
0
How about putting $10 in a fixed deposit today and seeing how much it will grow by the day Moore's Law is invalid.
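
For the curious, here's that arithmetic as a quick Python sketch; the 5% annual rate is an assumed illustrative figure, and the 20-year horizon is rimshaker's plateau estimate from above.

[code]
# Compound interest on $10; the 5% annual rate is an assumed figure.
principal = 10.0
rate = 0.05

for years in (10, 20, 50):
    value = principal * (1 + rate) ** years
    print("After %2d years: $%.2f" % (years, value))
[/code]

At 5%, that's only about $26.53 after 20 years, so the deposit doubles roughly every 14 years while Moore's Law doubles every two. Transistor counts win by a mile.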
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
The new technologies all come from alien spacecraft which Intel bought and reverse engineered. Those technologies are released gradually on a set schedule to milk the market for all it's worth.

jk :)
 

RelaxTheMind

Platinum Member
Oct 15, 2002
2,245
0
76
If you ever catch glimpses of the rare timelines from Intel or AMD (by deep searching through their developer sections), you can clearly see exactly what they plan on releasing, almost to the exact dates. I wouldn't hold my breath for anything new coming out to the general public anytime soon (your 20-year mark). Advancements in technology mean nothing when it comes to marketing strategies. Releasing a 10 GHz processor, which they already have, won't pay for Johnny Intel's college fund.

I think the standard cooling method for processors will soon reach a plateau. The current design of computers is quite flawed, and a newer standard is what will likely bring a change in the next couple of years.

The Pentium-M Centrino is a good example of it all being marketing. Come on, a 1.4 GHz mobile processor faster than a desktop 3 GHz P4? Yet they market it as a "very low power" processor.

Moore's Law = Common Sense Law

Photon-based CPUs... that article almost made me wet my pants...

Quantum processors... the only way I will ever see anything close to that is if I freeze my body and hope they find a gentle way to thaw me out. By then there will probably be stuff so advanced I wouldn't even worry about processors. (i.e. nanotechnology)

RTM
 

monzi

Senior member
Dec 10, 2003
671
0
0
My theory is that Intel and AMD alike both have chips capable of, say, 5 GHz, but
they do not want to release them until the P4 era stops selling.

P4, HT/EE and AMD Barton are still thriving today.


If Intel then releases the P5 in five months, people who bought a P4 five months ago will go and buy
a P5 to be ahead of the game.


It's all marketing, and the way they are able to predict buying patterns.

my -1 cent.