So there's a limit to how fast a PC can be?

Coldkilla

Diamond Member
Oct 7, 2004
3,944
0
71
I heard that there are many variables that determine a PC's ultimate speed, and that once the CPU hits that speed, it begins to emit "microwave" radiation. Will we ever pass this barrier? Is there any current technology we know of that would help us pass it?
 

Coldkilla

Diamond Member
Oct 7, 2004
3,944
0
71
I heard it from my professor at college. He said that the charge inside a CPU can only switch so fast before it starts to emit radiation. Unless we put CPUs in microwave-shielding containers, lol.
 

Coldkilla

Diamond Member
Oct 7, 2004
3,944
0
71
Unless he's a douche bag. I asked him about how Unix and Windows compare in speed on dual cores, and he said he didn't know much about dual cores... "Get with the times," I wanted to say, but he's my professor; I didn't pick him.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Well, I've been reading about CPUs for 25 years now, and that's absolutely the first time I've ever heard that argument. He sounds like he might be missing at least a few screws.
 

Coldkilla

Diamond Member
Oct 7, 2004
3,944
0
71
I wouldn't doubt it. Seems like all they do is increase RPMs. Eventually, if disks aren't made of titanium, they'll shatter into bits and pieces.

Say you had a 1-terabyte CPU (not realistic, but whatever); could this work? Say it's not dual-core or a smaller processor core. Could he be talking about what happens if you ramp up the GHz while keeping the core inside the processor the same size?

What advances in technology, exactly, allow us to keep pushing toward faster and faster CPUs without hitting hard limits? I want to fire back at the teacher and make him look dumb in class, lol, and I'm the one who's supposed to be learning from him.
 

Leros

Lifer
Jul 11, 2004
21,867
7
81
Isn't there some kind of 10 GHz limit or something? Anyway, the future is in multiple cores. So if you have 32 cores that each run at 2.0 GHz, that's pretty fast.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Oxaqata
Frequency Spectrum
Is this what he means?
Once they start hitting those kinds of frequencies, they become a different type of energy (infrared, ultraviolet)?
All of that only applies to radio waves. A CPU doesn't use radio waves; never has, never will. BTW, what will eventually limit the frequencies CPUs can attain is the size of the "process". Intel is at a 65 nm process now, but they're saying they don't believe they'll ever be able to go much smaller than 45 nm, because then it will start being too weak.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: holotech
I think as far as the ultimate speed of a PC goes, isn't the hard drive the bottleneck?
It depends totally on what you're trying to do with your PC, and also on how many hard drives you have, how fast they are, and their platter rotational speeds, obviously.
 

Oxaqata

Senior member
Jul 14, 2006
372
0
0
Originally posted by: myocardia
Originally posted by: Oxaqata
Frequency Spectrum
Is this what he means?
Once they start hitting those kinds of frequencies, they become a different type of energy (infrared, ultraviolet)?
All of that only applies to radio waves. A CPU doesn't use radio waves; never has, never will.

Exactly
 

btcomm1

Senior member
Sep 7, 2006
943
0
0
Yes, there is technology being worked on right now that will allow much faster computing. There are different materials that will let computers run at insane speeds; I've read about them before. They're also working on reverse hyperthreading right now. If they can pull that off, then the sky is the limit. They could have an 8-core CPU with all cores running at 3.0 GHz, and with reverse hyperthreading it would be like having a single 24 GHz chip.
 
Jun 14, 2002
505
0
0
Originally posted by: myocardia
Originally posted by: Oxaqata
Frequency Spectrum
Is this what he means?
Once they start hitting those kinds of frequencies, they become a different type of energy (infrared, ultraviolet)?
All of that only applies to radio waves. A CPU doesn't use radio waves; never has, never will. BTW, what will eventually limit the frequencies CPUs can attain is the size of the "process". Intel is at a 65 nm process now, but they're saying they don't believe they'll ever be able to go much smaller than 45 nm, because then it will start being too weak.


Won't that break one of Moore's laws?

 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Anything electronic that switches on and off emits electromagnetic noise. I can't remember the formula, but I think it's a combination of switching speed and amplitude that determines its frequency. I think he's talking rubbish; even if CPUs do start to emit microwaves, you just need to add some shielding.
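
Rough numbers, for anyone curious: the free-space wavelength of a signal is just lambda = c / f, and a quick sketch shows that GHz-range clocks already correspond to centimetre wavelengths, i.e. the microwave region, which is probably all the professor meant. (A sketch, not anything CPU-specific.)

# Free-space wavelength for a given frequency: lambda = c / f.
# GHz-range clocks correspond to centimetre wavelengths, i.e. microwaves.
C = 299_792_458.0  # speed of light, m/s

def wavelength_cm(freq_hz):
    return C / freq_hz * 100.0  # metres -> centimetres

for ghz in (1, 3, 10, 100):
    print(f"{ghz:>4} GHz -> ~{wavelength_cm(ghz * 1e9):.1f} cm")

So a 3 GHz clock sits at about 10 cm; microwave ovens run at 2.45 GHz. It doesn't mean the chip is cooking anything, just that any radiated noise falls in that band.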

Moore's laws aren't really laws; they're observations.

Oh yeah, reverse hyperthreading is not all that great; in most roles it would produce only an incremental improvement in performance.
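
For what it's worth, here's a quick Amdahl's-law sketch of why 8 cores at 3.0 GHz almost never behave like a single 24 GHz chip. The parallel fractions are made-up illustrative numbers, not measurements of any real workload.

# Back-of-envelope Amdahl's law sketch: how close do 8 cores at 3.0 GHz
# get to "a single 24 GHz chip"? The parallel fraction p is an assumed,
# illustrative number.

def amdahl_speedup(p, n):
    """Speedup over one core when a fraction p of the work runs across n cores."""
    return 1.0 / ((1.0 - p) + p / n)

cores, base_ghz = 8, 3.0
for p in (0.5, 0.9, 0.99):
    effective = base_ghz * amdahl_speedup(p, cores)
    print(f"parallel fraction {p:.2f}: ~{effective:.1f} GHz-equivalent "
          f"(ideal would be {base_ghz * cores:.0f} GHz)")

Even with 99% of the work parallelized you only get the equivalent of about 22 GHz, and most real code is nowhere near that parallel.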
 

tallman45

Golden Member
May 27, 2003
1,463
0
0
In a few short years, all PC storage will be on SSDs. I would love to see a northbridge chipset support a PCI bus slot for a 150x 4 GB SD card. That would be a perfect location for a page file (no heat, no cabling, low power draw, and a lifetime warranty).

 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: myocardia
Well, I've been reading about CPUs for 25 years now, and that's absolutely the first time I've ever heard that argument. He sounds like he might be missing at least a few screws.

Most professors fit that category. :p
 

Goi

Diamond Member
Oct 10, 1999
6,771
7
91
He's probably talking about the actual clock speed of the CPU being limited, rather than performance. In any case, I haven't heard of any such problem either. The fastest "clock" is already in the hundreds of GHz, IIRC.
 

socketman1

Junior Member
Sep 9, 2006
5
0
0
This seems more like a "Highly Technical" forum question, but here I go.

A few years ago, Stanford University was hosting video files on all types of technical subjects. One of them was computers: Intel engineers gave a lecture somewhere about the barriers they face. Here is a highly butchered and marginally inaccurate summary of the video (it's been a couple of years since I've seen it).

Quantum tunneling:
As you know, electrons flow through a circuit and cross gates. These gates either allow electrons to pass or they don't. As microprocessor dies have shrunk, so has the width of these gates; the die shrinks, more transistors are added, and gate-width shrinking is just a natural step. Now, quantum physics says electrons can "tunnel", or randomly jump from one place to another over very short distances. Gate lengths are so short now that they do feel the effect of these jumps: voltage leaks, and the circuit shorts itself out.
Engineers do have solutions for this, but the smaller (and hence faster) we get, the more it's going to happen.

Source of this info
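
To get a feel for how sharply this kicks in, here is a tiny sketch using the textbook exponential estimate for tunneling through a thin barrier, T ~ exp(-2*kappa*d) with kappa = sqrt(2m(V - E))/hbar. The 1 eV barrier height is an assumed round number for illustration, not a real gate-oxide figure.

# Order-of-magnitude tunneling estimate: T ~ exp(-2*kappa*d),
# kappa = sqrt(2*m*(V - E)) / hbar. The 1 eV barrier is an assumption
# chosen only to show the trend as the barrier gets thinner.
import math

HBAR = 1.0545718e-34   # J*s
M_E = 9.1093837e-31    # electron mass, kg
EV = 1.6021766e-19     # joules per electron-volt

def tunnel_probability(barrier_ev, width_nm):
    kappa = math.sqrt(2.0 * M_E * barrier_ev * EV) / HBAR  # 1/m
    return math.exp(-2.0 * kappa * width_nm * 1e-9)

for nm in (3.0, 2.0, 1.0):
    print(f"{nm:.1f} nm barrier: T ~ {tunnel_probability(1.0, nm):.1e}")

Every nanometre shaved off the barrier buys you several orders of magnitude more leakage, which is exactly the problem described above.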

Photo masking:

Forgive me here, but I'm paraphrasing a very complicated concept from memory; the principle is the important thing. All circuits are etched onto the wafer by various means, typically with a very short-wavelength laser. As the traces (i.e., the width of the circuit the electrons flow down) get smaller, so must the laser's wavelength. Once that wavelength hits a certain point, you can't guarantee the trace will work. Something about the Heisenberg uncertainty principle and the traces needing to be smaller than the shortest-wavelength laser we have.
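
If it helps when arguing with the professor: the limit people usually cite here is plain diffraction, written as the Rayleigh criterion, minimum printable feature ~ k1 * wavelength / NA. A quick sketch with illustrative k1 and NA values (assumptions, not any particular fab's numbers):

# Diffraction-limited lithography resolution (Rayleigh criterion):
#   minimum printable feature ~ k1 * wavelength / NA
# The k1 and NA values below are illustrative assumptions only.

def min_feature_nm(wavelength_nm, k1, na):
    return k1 * wavelength_nm / na

print(f"{min_feature_nm(193, 0.4, 0.93):.0f} nm")   # 193 nm ArF light, dry lens
print(f"{min_feature_nm(193, 0.3, 1.35):.0f} nm")   # 193 nm ArF light, immersion lens

That's why shrinking features with the same light source means ever more heroic optical tricks.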

So there are two obstacles that need addressing for current and future CPUs.

If any forum members remember those video lectures from 3+ years ago, please chime in here; they were really interesting.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
240
106
Print this out for your professor, and he can start thinking about clockless processing. :)

Clockless

 

alocurto

Platinum Member
Nov 4, 1999
2,174
0
76
Originally posted by: Pabster
Originally posted by: myocardia
Well, I've been reading about CPUs for 25 years now, and that's absolutely the first time I've ever heard that argument. He sounds like he might be missing at least a few screws.

Most professors fit that category. :p

I would ask someone from Intel/AMD. They actually do something useful in the world.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: socketman1
This seems more like a "Highly Technical" forum question, but here I go.

A few years ago, Stanford University was hosting video files on all types of technical subjects. One of them was computers: Intel engineers gave a lecture somewhere about the barriers they face. Here is a highly butchered and marginally inaccurate summary of the video (it's been a couple of years since I've seen it).

Quantum tunneling:
As you know, electrons flow through a circuit and cross gates. These gates either allow electrons to pass or they don't. As microprocessor dies have shrunk, so has the width of these gates; the die shrinks, more transistors are added, and gate-width shrinking is just a natural step. Now, quantum physics says electrons can "tunnel", or randomly jump from one place to another over very short distances. Gate lengths are so short now that they do feel the effect of these jumps: voltage leaks, and the circuit shorts itself out.
Engineers do have solutions for this, but the smaller (and hence faster) we get, the more it's going to happen.

Source of this info

Photo masking:

Forgive me here, but I'm paraphrasing a very complicated concept from memory; the principle is the important thing. All circuits are etched onto the wafer by various means, typically with a very short-wavelength laser. As the traces (i.e., the width of the circuit the electrons flow down) get smaller, so must the laser's wavelength. Once that wavelength hits a certain point, you can't guarantee the trace will work. Something about the Heisenberg uncertainty principle and the traces needing to be smaller than the shortest-wavelength laser we have.

So there are two obstacles that need addressing for current and future CPUs.

If any forum members remember those video lectures from 3+ years ago, please chime in here; they were really interesting.

Smaller gates do mean more leakage, and I'd also imagine there's a limit to how fast the electrons can react to the switching field, and to how long it takes the switching field to get strong enough to pull the electrons. A photonic computer would supposedly achieve infinite speed/zero time, but then you're not dealing with anything physical anymore. The move to multiple cores does indicate that, at least with current MOSFET technology, the biggest percentage gains from gate shrinks are over.
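
Purely to put a number on the "how long the switching field takes to build up" idea: a driven transistor gate looks roughly like an RC circuit, so a ballpark frequency ceiling is f ~ 1 / (2*pi*R*C). The resistance and capacitance below are made-up round figures for illustration, not real device parameters.

# Ballpark RC charging limit on switching speed: f_max ~ 1 / (2*pi*R*C).
# R and C are made-up round numbers for illustration only.
import math

def rc_limit_ghz(r_ohm, c_farad):
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad) / 1e9

# e.g. roughly 1 kilo-ohm of driver resistance into 1 femtofarad of gate capacitance
print(f"~{rc_limit_ghz(1e3, 1e-15):.0f} GHz")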

 

MielkeHBP

Member
Nov 26, 2005
93
0
0
I was reading some computer magazine about a month ago (I'm thinking it was Computer Science, but I'm not sure; I was sick and in the waiting room of the hospital). Anyway, it said that processor cores are going to be made of diamond and other gems, or some crystalline material, and we'll be able to reach extreme speeds with low power and very little heat. I'm actually going to go try to find that magazine now, lol.


Ooh, edit: link clicky

Meh, just search "diamond CPU" on Google (redundant?).