Where does computer development end?

AreaCode707

Lifer
Sep 21, 2001
18,447
133
106
Trying to figure out a good way to phrase these questions. What happens when computer development reaches a plateau? When processors max out their speed, when Moore's Law no longer applies? Where will computers be then? Where to next? Do you think they'll reach some kind of optimal level and stay there, like many past technologies? What will gamers be looking for if graphics are indistinguishable from reality and there's no longer any waiting for the faster machine to come out? Never mind how many years we are away from computers reaching their maximum potential - when that happens, what next? There's been lots of speculation about launching into biotechnology - computers integrated with ourselves. Anyone see that as plausible?
 

Booster

Diamond Member
May 4, 2002
4,380
0
0
I think there'll be more computers in the future... Higher speeds, software more complicated to learn, more time spent sitting around trying to figure these new computers out. I don't expect anything good or positive for myself, frankly.
 

Yossarian

Lifer
Dec 26, 2000
18,010
1
81
Considering how far we've come in only 30 years or so, I can't even imagine what the limits are. We'll probably get some kind of massively parallel nanocomputer that uses individual electrons or photons in place of transistors. Biotech is just a matter of time. If it can be done, sooner or later someone will do it.
 

AreaCode707

Lifer
Sep 21, 2001
18,447
133
106
Originally posted by: Booster
I think there'll be more computers in the future... Higher speeds, software more complicated to learn, more time to sit on the bottom to figure these new computers out. I don't expect anything good or positive for myself, frankly.

Obviously there'll be higher speeds, but there has to be a physical limit at some point. And at some point, the speed of computers will so far outpace the speed of humans that there'll be little use in speeding them up more, right? What new software - what happens when the software we have does everything we need it to and more, and further development becomes superfluous?
 

gopunk

Lifer
Jul 7, 2001
29,239
2
0
i thought i heard something about a limit, but rest assured, we weren't anywhere close to it the last time i checked :)

and even aside from brute force, there is a TON of work that still needs to be done in artificial intelligence, algorithms, and whatnot.
 

m2kewl

Diamond Member
Oct 7, 2001
8,263
0
0
it will end when humans start nuking each other.

you know computers don't just create themselves ;)
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,268
126
It doesn't end. The genie is out of the bottle. There is a built-in evolutionary pressure, if you will, on computing. As long as people are programming (and eventually computers are doing programming as well), they will write software that challenges hardware. Hardware designers will then respond and get a temporary advantage. Then the cycle repeats. I see no limiting factors that physically prevent computers from getting better. If silicon fails, something else will come along; DNA and quantum computing are out on the horizon. What will people hundreds of years in the future conceive of? We can't guess, but as I said, there is no known physical process or limitation.
 

kuk

Platinum Member
Jul 20, 2000
2,925
0
0
When computers learn to "develop" themselves. This will lead to the end of human computer development; there wouldn't need to be any human intervention.
 

AreaCode707

Lifer
Sep 21, 2001
18,447
133
106
At some point though, computer development has to exceed our uses for it, wouldn't you think?
 

Spyro

Diamond Member
Dec 4, 2001
3,366
0
0
Originally posted by: kuk
When computers learn to "develop" themselves. This will lead to the end of human computer development; there wouldn't need to be any human intervention.

And then they'll decide that they don't need us anymore...............
 

Booster

Diamond Member
May 4, 2002
4,380
0
0
Obviously there'll be higher speeds, but there has to be a physical limit at some point. And at some point, the speed of computers will so far outpace the speed of humans that there'll be little use in speeding them up more, right? What new software - what happens when the software we have does everything we need it to and more, and further development becomes superfluous?

I wouldn't put too much emphasis on computers... Yes, they change something in life... But the real changes are in production, industry... Computers can connect you to the 'net, but they won't change your life itself, IMO. Consumer computers are still mostly toys... So when they reach a peak... I don't think Intel will let that happen... They need to sell more CPUs to drive progress, and there are absolutely no limits in the foreseeable future, I think.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,268
126
Originally posted by: HotChic
At some point though, computer development has to exceed our uses for it, wouldn't you think?

Yes, when there is nothing left to learn or know.
 

m2kewl

Diamond Member
Oct 7, 2001
8,263
0
0
Originally posted by: Spyro
Originally posted by: kuk
When computers learn to "develop" themselves. This will lead to the end of human computer development; there wouldn't need to be any human intervention.
And then they'll decide that they don't need us anymore...............

watching Terminator again huh? ;)
 

AreaCode707

Lifer
Sep 21, 2001
18,447
133
106
Originally posted by: Hayabusarider
Originally posted by: HotChic
At some point though, computer development has to exceed our uses for it, wouldn't you think?

Yes, when there is nothing left to learn or know.

I'm not talking about abandoning computers; I'm talking about computers reaching such a peak that their interaction with humans can't practically be improved. Then you have as much access to learning and knowing as you are able to have with a computer's facilitation. The computer, in this case, would be basically developed to its ideal point, for human purposes.
 

Imaginer

Diamond Member
Oct 15, 1999
8,076
1
0
I think computer development will end when we reach that stopping point in applying quantum physics to engineering devices - the point where we absolutely cannot go any further because of the physics involved in producing such tiny devices.

But if you are talking about programming intelligence and AI, we have barely scratched the surface on that one.
 

beer

Lifer
Jun 27, 2000
11,169
1
0
Originally posted by: HotChic
Originally posted by: Hayabusarider
Originally posted by: HotChic
At some point though, computer development has to exceed our uses for it, wouldn't you think?

Yes, when there is nothing left to learn or know.

I'm not talking about abandoning computers, I'm talking about computers reaching such a peak that their interaction with humans can't practically be improved. Then you have as much access to learning and knowing as you are able to have with a computer's facilitation. The computer, in this case, would be basically developed to its ideal point, for human purposes.

There are always going to be needs.

Someone, somewhere, is going to want the power of the most powerful supercomputer today in their desktop. Who knows when you'll need to simulate the Earth?

 

Barnaby W. Füi

Elite Member
Aug 14, 2001
12,343
0
0
It's hard to say, and I think it will be a very long and difficult process. Computers still behave in a way that is very foreign to someone who has not used a computer. Computers are still extremely stupid. Over the entire lifetime of computers, people have tried to make software that is more and more intelligent and anticipates your needs more. I honestly don't know whether they truly do a good job of catering to laymen or not, but to me they seem to fall flat on their face. User interface design and artificial intelligence seem to be the big two. These are what are being alluded to more and more in current software (well, *better* UIs, that is).

One amazing example was the article about the newest Longhorn alpha release. When something crashes, it asks you to debug. Instead of having buttons with verbs on them, they have OK and Cancel. When I see this dialog, I have to read the whole thing just to know what's going on. Information overload (to a layman). The buttons should say "Debug" and "Don't debug", or something similar. This is an ancient, well-known piece of good UI, and the brand spankingest OS falls on its face.

Personally I just say fvck it. I have plenty of time to learn, so I don't waste my time with software that treats me like an idiot. I am in full control of my computer and it works how I want, but that's a totally different situation from, say, my grandparents, who are learning the intricacies of AOL buddy lists and are still pretty confused about it.
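To show what I mean about the verb buttons, here's a rough sketch in Python/tkinter (the app name and wording are made up; this is not the actual Longhorn dialog):

import tkinter as tk

def crash_dialog(app_name="SomeApp"):
    # app_name and all the text here are invented for illustration
    root = tk.Tk()
    root.title(app_name + " has stopped responding")

    tk.Label(root,
             text=app_name + " hit an unhandled error.\n"
                  "You can attach a debugger to inspect it, or just close it.",
             justify="left", padx=20, pady=10).pack()

    buttons = tk.Frame(root)
    buttons.pack(pady=(0, 12))

    # Verb labels: the button text itself says what will happen, so the user
    # doesn't have to re-read the whole message to decode "OK" vs "Cancel".
    tk.Button(buttons, text="Debug", command=root.destroy).pack(side="left", padx=6)
    tk.Button(buttons, text="Don't debug", command=root.destroy).pack(side="left", padx=6)

    root.mainloop()

crash_dialog()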

As far as games go, I think graphics will always be an important thing. Even when they are able to totally mimic reality, they will simply make games have graphics that are beyond reality. Take The Matrix, or many other films, for example. Stories and gameplay are always important too, as well as good AI for single player, and immersive multiplayer modes. One thing that is somewhat worrying is MMORPGs: the depth of their gameplay can cause people to play for tens upon tens of hours to become just an "ok" player, and hundreds of hours to be powerful. The games are goal oriented, with the goal being ridiculously hard to reach, but little tiny goals that you can complete quickly along the way keep things fun. The social aspect and gee-whizness of a real world, with people, houses, shops, forests, etc, outside of our normal world, is very fascinating, and that's what humans love, gee-whizness.

One amusing example I can remember reading was from the book "In the Beginning Was the Command Line" by Neal Stephenson: a man was at Disney World, on the set of a mock-up of a Wild West town. Everything was fake, obviously. The man was videotaping it, and while doing so, watching everything through the LCD on his camera. So you have a guy watching, through a screen, a video of a mock-up Wild West scene; he's sort of two or three steps out of reality. I know that didn't sound all that great the way I remembered and typed it, but it was much better in the book. :) Another example is games. People love to ooh-ahh at scenes of beautifully rendered forests and villages in games, when they could just go to a forest in real life. I dunno, it's all confusing :p
 

AreaCode707

Lifer
Sep 21, 2001
18,447
133
106
Originally posted by: Elemental007

Someone, somewere, is going to want the power of the most powerful supercomputer today in their desktop. Who knows when you need to simulate earth?

:) That's a great line.
I'm just playing devil's advocate here, because many professionals are talking about Moore's Law reaching its maximum and circuits shrinking to atomic distances sometime in the next one to five decades. I was reading an article by Peter Drucker called "Beyond the Information Revolution" (referring to Toffler's three waves of revolution) and it just got me thinking. Well written piece.
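Rough back-of-the-envelope math on that atomic limit (the 90 nm feature size, the half-nanometre atomic spacing, and the two-year halving cadence are just my own loose assumptions, not figures from the Drucker article):

import math

feature_nm = 90.0       # rough feature size of a current process, in nanometres
atom_nm = 0.5           # rough spacing between silicon atoms
years_per_halving = 2   # the classic Moore's Law cadence

halvings = math.log2(feature_nm / atom_nm)
years = halvings * years_per_halving
print("about %.1f more halvings, roughly %.0f years, before features hit atomic scale"
      % (halvings, years))
# prints about 7.5 halvings / ~15 years, i.e. inside that one-to-five-decade window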
 

Barnaby W. Füi

Elite Member
Aug 14, 2001
12,343
0
0
Basically, no matter how powerful the computer is, you need software that is useful and logical. Creating software that is powerful AND logical is hard, because when you get down to it, it's created by a bunch of monkeys at computers typing out things like

if (foo != bar) {
    fun(x, y, 300);
} else {

...and so on.
 

Spyro

Diamond Member
Dec 4, 2001
3,366
0
0
Originally posted by: HotChic
Originally posted by: Elemental007

Someone, somewere, is going to want the power of the most powerful supercomputer today in their desktop. Who knows when you need to simulate earth?

:) That's a great line.
I'm just playing devil's advocate here, because many professionals are talking about Moore's Law reaching it's maximum and computers moving into circuits of atomic distances sometime in the next one to five decades. Reading an article by Peter Drucker called "Beyond the Information Revolution" (referring to Toffler's three waves of revolution) and it just got me thinking. Well written piece.

Hmmm, I've seriously got to read that article :) However, I don't think that computers (i.e. technology) will ever reach any particular type of ceiling. If you're talking about processors, then yes, I think it's fairly obvious that eventually silicon will no longer be able to go any further, speed-wise. But by that time we will have probably discovered an alternate material for creating CPUs.

The PC as we know it has always been limited - it sits on your desk, it's too big to carry around, and other annoying factors - and despite their smaller size, notebooks and laptops aren't exactly the wave of the far future either. Tablet PCs (or another form of easily portable computer) will probably be the "next big thing"; after that (maybe wwaaaayyyy after that) we will probably enter the era of embedded computers (think Borg) - an era where it is difficult to tell where the man ends and the robot begins. Of course, that's just one of the possible future directions of computer technology, but it is one that has a higher probability of becoming a reality than most people think.
 

Spyro

Diamond Member
Dec 4, 2001
3,366
0
0
Originally posted by: BingBongWongFooey
Basically, no matter how powerful the computer is, you need software that is useful and logical. Creating software that is powerful AND logical is hard, because when you get down to it, it's created by a bunch of monkies at computers typing out things like

if(foo != bar) {
fun(x, y, 300);
} else {

...and so on.

Theoretically, it's possible that eventually computers will be able to program themselves, just as humans learn from their mistakes and adapt.
 

Barnaby W. Füi

Elite Member
Aug 14, 2001
12,343
0
0
Originally posted by: Spyro
Originally posted by: BingBongWongFooey
Basically, no matter how powerful the computer is, you need software that is useful and logical. Creating software that is powerful AND logical is hard, because when you get down to it, it's created by a bunch of monkies at computers typing out things like

if(foo != bar) {
fun(x, y, 300);
} else {

...and so on.

Theoretically, its possible that eventually, computers will be able to self program themselves, just as humans learn from their mistakes and adapt.

But you still have to program them to be able to do that (not a simple thing) :)
 

Spyro

Diamond Member
Dec 4, 2001
3,366
0
0
Originally posted by: BingBongWongFooey
Originally posted by: Spyro
Originally posted by: BingBongWongFooey
Basically, no matter how powerful the computer is, you need software that is useful and logical. Creating software that is powerful AND logical is hard, because when you get down to it, it's created by a bunch of monkies at computers typing out things like

if(foo != bar) {
fun(x, y, 300);
} else {

...and so on.

Theoretically, its possible that eventually, computers will be able to self program themselves, just as humans learn from their mistakes and adapt.

But you still have to program them to be able to do that (not a simple thing) :)

Indeed, and with that programming, theoretically, the first batch of self-programming creations would be able to "duplicate" themselves and pass their programming on.......
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,268
126
Originally posted by: HotChic
Originally posted by: Hayabusarider
Originally posted by: HotChic
At some point though, computer development has to exceed our uses for it, wouldn't you think?

Yes, when there is nothing left to learn or know.

I'm not talking about abandoning computers, I'm talking about computers reaching such a peak that their interaction with humans can't practically be improved. Then you have as much access to learning and knowing as you are able to have with a computer's facilitation. The computer, in this case, would be basically developed to its ideal point, for human purposes.

Let's assume a perfect interface, which I think is what you are talking about. Given that to be the case, the computer is still a physical device and therefore limited in what it can achieve. For example, take the weather. Predicting the weather is an intractable problem. Determining the forecast to any arbitrary date in the future is mathematically impossible. Literally. It cannot be done. It is not a matter of computing power. It cannot be done. You can get more accurate forecasts through, say, 2 weeks. Build super duper computers. Now what about 3 weeks, or a month? There are whole classes of problems that cannot be calculated very far out. Now this is interface independent. The theoretical perfect interface could give you the answer only so far out. Then you would have to improve the computational power of the machine, but the interface, being optimized, would not have to be changed.
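To make that concrete, here's a toy sketch in Python using the Lorenz equations as a stand-in for a weather model (nothing a real forecaster would run): two runs that start almost identically still end up nowhere near each other after a while.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # one crude Euler step of the Lorenz system
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)   # the same start, off by one part in a million

for step in range(1, 5001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        print("t = %5.1f   difference in x = %.6f" % (step * 0.01, abs(a[0] - b[0])))
# the gap grows roughly exponentially: faster hardware buys a somewhat longer
# useful forecast, never an arbitrarily long one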

The answer, I think, is that as long as there are problems people want solved, there will never be a machine that can answer them all, but we can get closer and closer. That requires continuous improvement beyond the human interaction with the machine, and as long as physics does not constrain us, better machines we will have.
 

her209

No Lifer
Oct 11, 2000
56,336
11
0
Originally posted by: Spyro
Originally posted by: BingBongWongFooey
Originally posted by: Spyro
Originally posted by: BingBongWongFooey
Basically, no matter how powerful the computer is, you need software that is useful and logical. Creating software that is powerful AND logical is hard, because when you get down to it, it's created by a bunch of monkies at computers typing out things like

if(foo != bar) {
fun(x, y, 300);
} else {

...and so on.

Theoretically, its possible that eventually, computers will be able to self program themselves, just as humans learn from their mistakes and adapt.

But you still have to program them to be able to do that (not a simple thing) :)

Indeed, and with that programming, theoretically, the first batch of self-programming creations, would be able to "duplicate" themselves and pass there programming on.......

And it was then that Skynet was born.