Are we ever going to reach an equilibrium of sorts where CPU speed becomes irrelevant?

Hajpoj

Senior member
Dec 9, 2006
288
0
0
Do you get what I'm saying? Will we ever reach a day where CPUs are no longer a factor? Where we can get a CPU to do anything and everything we want a CPU to do, instantly?


Likewise with vid cards: will we reach the point where graphics are so life-like and realistic that there'll be nowhere else to go?
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
I don't see how that could be physically possible. Elementarily speaking, if you want a calculation to be done, something must change states. This state change will take time, no matter how infinitesimal. So in the future it'd be possible for a "processor" to do anything we ask with almost no waiting time whatsoever, no lag or noticeable computation time, but that'd just be a factor of future speed and efficiency.
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
So this state change cannot exceed the speed of light, correct? So once we do hit that, theoretically there is nowhere else to go.

Some executions do occur at near the speed of light today, do they not? Then theoretically we will reach a day where all executions will be handled with the same ease.

edit: Force and work done per cycle would be different per execution, of course, but that's not the point. The question is how much work can be done in that cycle, and the theory is that the work "carried per cycle" can be near infinite, which would make all work complete at the same time.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
Regardless of how much work a processor can do per cycle, one cannot compute all calculations at once. For example, if I were to multiply X by Y, add Z and divide by A, I can't divide by A until I've done the preceding steps. There will always be sequential processes in our mathematics.
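The dependency chain described here is easy to see in code. A minimal sketch (the function names below are just illustrative, not from any real instruction set):

```python
# The chain described above: each step consumes the previous step's
# result, so the divide cannot begin until the multiply and add finish,
# no matter how much work the hardware can do per cycle.
def dependent_chain(x, y, z, a):
    t1 = x * y        # step 1
    t2 = t1 + z       # step 2: needs t1
    return t2 / a     # step 3: needs t2

# Independent operations, by contrast, have no ordering constraint;
# a superscalar CPU is free to issue them in the same cycle.
def independent_ops(x, y, z, a):
    p = x * y
    q = z / a
    return p, q
```

Only the second kind of work benefits from doing "more per cycle"; the first is bounded by the length of the chain.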
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.
 

Vegitto

Diamond Member
May 3, 2005
5,234
1
0
Originally posted by: Roguestar
Regardless of how much work a processor can do per cycle, one cannot compute all calculations at once. For example, if I were to multiply X by Y, add Z and divide by A, I can't divide by A until I've done the preceding steps. There will always be sequential processes in our mathematics.

Yeah. I'd add a response, but you've already said everything that could be said. Good job.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
Originally posted by: Hajpoj
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.

I think perhaps I didn't make myself clear; what I mean is that on that all-powerful cycle, if you're calculating a sum you're still going to have to wait until the end of that cycle before you can use the result. You can't calculate the result and use it in another sum at the same time; they have to be done in order. I can't do the sum B + C = ? until I've finished working out A + 2 = B.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Hajpoj
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.

Really? How does the former (which is sometimes true, on occasion) make the latter (time inconsequential?) become an unequivocal truth?
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
Then it's the cycles themselves that can be accelerated? So the theoretical endpoint is the speed of light, the point at which no more advancement can be made, technologically speaking?
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
Originally posted by: Idontcare
Originally posted by: Hajpoj
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.

Really? How does the former (which is sometimes true, on occasion) make the latter (time inconsequential?) become an unequivocal truth?

If all calculations were completed on the same cycle (assuming one cycle happens at the speed of light), there would only be need for a single cycle, which renders time meaningless based on our perceptions.

Because of this, one can conclude that everything cannot be completed in one cycle. But given that a cycle = c (c being the speed of light), the multiple cycles needed to complete a task at this speed would still be unnoticeable to our perceptions.


But I do suppose that even bigger calculations can always be queried, such as pi to the 10-septillionth digit, which will incur computation time because of that whole concept of infinity we have in our mathematics.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
Originally posted by: Hajpoj
Originally posted by: Idontcare
Originally posted by: Hajpoj
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.

Really? How does the former (which is sometimes true, on occasion) make the latter (time inconsequential?) become an unequivocal truth?

If all calculations were completed on the same cycle (assuming one cycle happens at the speed of light), there would only be need for a single cycle, which renders time meaningless based on our perceptions.

Because of this, one can conclude that everything cannot be completed in one cycle. But given that a cycle = c (c being the speed of light), the multiple cycles needed to complete a task at this speed would still be unnoticeable to our perceptions.


But I do suppose that even bigger calculations can always be queried, such as pi to the 10-septillionth digit, which will incur computation time because of that whole concept of infinity we have in our mathematics.

Please clarify what you mean by these calculations being performed at the speed of light. The speed of light is a measure of distance travelled in a given time; processor speed is measured either in cycles per second or as the time taken for one cycle (1 divided by cycles per second). The speed of light is a bit misleading here.
 

Cr0nJ0b

Golden Member
Apr 13, 2004
1,141
29
91
I believe that this is a correct assertion.

I also believe that nothing new will ever be invented, so eventually there will be no new patents.

I also believe that we will soon have enough storage in our home PCs to hold everything digital that exists.

I also believe that we will soon evolve into beings of pure light and energy.

This is what I believe.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
What does the speed of light have to do with anything? And how do we know, with our current state of technology (by we I mean the human race) that nothing can exceed the speed of light? Who made that law?

Faster than light.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Hajpoj
Originally posted by: Idontcare
Originally posted by: Hajpoj
But multiple calculations can be completed on the same cycle, which can make the time in between inconsequential.

Really? How does the former (which is sometimes true, on occasion) make the latter (time inconsequential?) become an unequivocal truth?

If all calculations were completed on the same cycle (assuming one cycle happens at the speed of light), there would only be need for a single cycle, which renders time meaningless based on our perceptions.

Because of this, one can conclude that everything cannot be completed in one cycle. But given that a cycle = c (c being the speed of light), the multiple cycles needed to complete a task at this speed would still be unnoticeable to our perceptions.


But I do suppose that even bigger calculations can always be queried, such as pi to the 10-septillionth digit, which will incur computation time because of that whole concept of infinity we have in our mathematics.

Where are you getting this education from? IMO you are in the right to request a full refund.
 

Hulk

Diamond Member
Oct 9, 1999
5,138
3,726
136
Originally posted by: keysplayr2003
What does the speed of light have to do with anything? And how do we know, with our current state of technology (by we I mean the human race) that nothing can exceed the speed of light? Who made that law?

Faster than light.



Light travels at 3.0 x 10^8 m/sec in a vacuum. In various media light travels slower. That is the basis for refraction (the bending of light) and how lenses work.

There are examples of certain subatomic particles moving from place to place in less time than it would take at the speed of light, but these are quantum behaviors and not typical examples of how things behave at larger scales. When you get into the quantum world, classical physics and even relativity begin to break down and a new set of very strange rules applies.

For all intents and purposes, c (the speed of light) is the speed limit for the universe. Get used to it. E=mc^2: it's been tested, it's correct. Time dilation: proven. Einstein's theory of general relativity has been proven correct time and time again.

You can solve Maxwell's equations (four differential equations describing all electromagnetic phenomena) simultaneously and they yield a constant; that constant is c, and this theoretical value agrees exactly with experimental data.

Unless you have a PhD in physics please don't start changing the laws of the universe.


Even in Star Trek they sometimes have to wait on the computer!
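The Maxwell point above can even be checked numerically: solving the equations in vacuum gives a wave speed of 1/sqrt(mu0 * eps0). A quick sketch using the measured vacuum constants:

```python
import math

# Maxwell's equations predict electromagnetic waves travelling at
# c = 1 / sqrt(mu0 * eps0); plugging in the measured vacuum constants
# recovers the speed of light.
mu0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m
eps0 = 8.854187817e-12      # vacuum permittivity, F/m

c = 1 / math.sqrt(mu0 * eps0)
print(f"c = {c:.4e} m/s")   # ~2.9979e8 m/s
```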
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
Data bursts cannot exceed the speed of light; they travel in waves, correct me if I'm wrong. So the theoretical maximum speed could be no greater than the speed of light?
 

BitByBit

Senior member
Jan 2, 2005
474
2
81
The speed at which signal carriers (electrons, photons) propagate through circuits is not the bottleneck; it is the frequency at which the transmission medium operates that matters, since data is transmitted on the rising and falling edges of the clock cycle.

Presently, 4.0GHz seems to be the maximum clockspeed attainable with current hardware, which is well below the frequency of light (~500THz). Even if light were ever to become a bottleneck, there is little reason to suggest that higher-frequency waves such as X-rays and gamma rays couldn't be used. One problem I've heard mentioned about such astronomical frequencies is the distance travelled by the signal each cycle (its wavelength). The wavelength of visible light is around 600 nm, which probably wouldn't even be a problem for today's processors, which are manufactured on 65/90 nm processes.
Of course, making transistors that could operate at such frequencies is but a dream for today's manufacturers.
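The frequencies in the post above translate into per-cycle signal distances like so (a back-of-envelope sketch; wavelength = c / f):

```python
C = 2.998e8  # speed of light in vacuum, m/s

def distance_per_cycle(freq_hz):
    """How far an EM signal travels in one clock cycle (its wavelength)."""
    return C / freq_hz

print(distance_per_cycle(4.0e9))   # 4 GHz clock: ~0.075 m (7.5 cm)
print(distance_per_cycle(500e12))  # 500 THz light: ~6e-7 m (600 nm)
```

So at today's gigahertz clocks a signal covers centimeters per cycle, far larger than a chip; it's only at optical frequencies that the per-cycle distance shrinks toward feature-size scales.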
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Hulk
Originally posted by: keysplayr2003
What does the speed of light have to do with anything? And how do we know, with our current state of technology (by we I mean the human race) that nothing can exceed the speed of light? Who made that law?

Faster than light.



Light travels at 3.0 x 10^8 m/sec in a vacuum. In various media light travels slower. That is the basis for refraction (the bending of light) and how lenses work.

There are examples of certain subatomic particles moving from place to place in less time than it would take at the speed of light, but these are quantum behaviors and not typical examples of how things behave at larger scales. When you get into the quantum world, classical physics and even relativity begin to break down and a new set of very strange rules applies.

For all intents and purposes, c (the speed of light) is the speed limit for the universe. Get used to it. E=mc^2: it's been tested, it's correct. Time dilation: proven. Einstein's theory of general relativity has been proven correct time and time again.

You can solve Maxwell's equations (four differential equations describing all electromagnetic phenomena) simultaneously and they yield a constant; that constant is c, and this theoretical value agrees exactly with experimental data.

Unless you have a PhD in physics please don't start changing the laws of the universe.


Even in Star Trek they sometimes have to wait on the computer!

I never tried nor said I would change the laws of the universe. But anyone, PhD or otherwise, claiming to actually know everything about the universe and its ever-dynamic nature is a fool of fools.
 

VooDooAddict

Golden Member
Jun 4, 2004
1,057
0
0
Depending on your application, CPU speed could already be irrelevant. I know people perfectly happy using Pentium 2s and Win2k to play a few internet card games and do all their email, IM and Myspace type stuff.

 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
I agree that in some instances we are getting to a point where CPUs are starting to get limited....I/O limitations are starting to make CPU speed less relevant. I have many apps where, even with 4 cores, I am limited even in a RAID setup....

If we move to 8 cores we are going to see major limitations unless other areas of technology keep pace...
 

Hulk

Diamond Member
Oct 9, 1999
5,138
3,726
136
Originally posted by: keysplayr2003
Originally posted by: Hulk
Originally posted by: keysplayr2003
What does the speed of light have to do with anything? And how do we know, with our current state of technology (by we I mean the human race) that nothing can exceed the speed of light? Who made that law?

Faster than light.



Light travels at 3.0 x 10^8 m/sec in a vacuum. In various media light travels slower. That is the basis for refraction (the bending of light) and how lenses work.

There are examples of certain subatomic particles moving from place to place in less time than it would take at the speed of light, but these are quantum behaviors and not typical examples of how things behave at larger scales. When you get into the quantum world, classical physics and even relativity begin to break down and a new set of very strange rules applies.

For all intents and purposes, c (the speed of light) is the speed limit for the universe. Get used to it. E=mc^2: it's been tested, it's correct. Time dilation: proven. Einstein's theory of general relativity has been proven correct time and time again.

You can solve Maxwell's equations (four differential equations describing all electromagnetic phenomena) simultaneously and they yield a constant; that constant is c, and this theoretical value agrees exactly with experimental data.

Unless you have a PhD in physics please don't start changing the laws of the universe.


Even in Star Trek they sometimes have to wait on the computer!

I never tried nor said I would change the laws of the universe. But anyone, PhD or otherwise, claiming to actually know everything about the universe and its ever-dynamic nature is a fool of fools.


On that we can agree.


Hmm. If there could be a Core 2 Duo operating using light at, say, a frequency 10,000 times faster than my 3.2, then I think that would satisfy me for a while. I don't know if even Microsoft could write enough bloatware to use up those CPU cycles. Although I wouldn't put it past them if they really put their minds to it!

 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
Originally posted by: Duvie
I agree that in some instances we are getting to a point where CPUs are starting to get limited....I/O limitations are starting to make CPU speed less relevant. I have many apps where, even with 4 cores, I am limited even in a RAID setup....

If we move to 8 cores we are going to see major limitations unless other areas of technology keep pace...

Indeed. We need faster high-capacity storage. Hard drives, no matter how advanced, big, and fast, are yesterday's technology with an a$$-load of band-aids and patches.

Short of that, we would need an entirely different way of operating computers and servers; one that didn't rely on storage devices such as hard drives. Way back when, computers didn't have hard drives. Surely such a scenario can come about again, especially with all the alternative storage methods we have now.

 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
This thread is steadily making less and less sense. We should all go back to sleep and we'll get our light-processors and flying cars and cities in the sky.