
when do antennae stop working?

bwanaaa

Senior member
consider the following thought experiment:

you have an antenna receiving a radio signal. You gradually increase the frequency of the signal. At what frequency does the antenna fail (i.e., no signal is received), and how does this depend on its conductivity (or is it coercivity?)

Clearly, light (which is EM radiation) cannot be 'received' by antennae, but microwaves can. Somewhere in between, the EM radiation stops interacting with the antenna.

why?
 
It has to do with the signal's wavelength. An antenna is suited to one frequency better than others, determined by the signal's wavelength and the antenna's length; you can easily calculate a signal's wavelength with this formula: λ × f = c, where λ is lambda (the wavelength), f is the frequency of the signal, and c is the speed of light. The wavelength of microwaves is pretty small, in the centimeters; the wavelength of visible light is in the hundreds of nanometers.

If the proportions of the antenna aren't right for the frequency of the signal you want to receive, it'll bounce off and be heavily attenuated.
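The λ × f = c relation above is easy to check numerically. Here is a minimal sketch (the function name is my own, the constant is the standard value of c):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Return the free-space wavelength (in metres) for a given frequency, via lambda = c / f."""
    return C / freq_hz

# A 2.45 GHz microwave signal has a wavelength of about 12 cm:
print(wavelength_m(2.45e9))   # ~0.122 m
# Green visible light (~600 THz) has a wavelength of about 500 nm:
print(wavelength_m(6.0e14))   # ~5.0e-7 m
```

This illustrates the scale gap in the thread: microwaves are centimetre-scale, visible light is sub-micron.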
 
As PsYcHoCoW has already pointed out, antennas need to be tailored to specific frequencies. Most antennas have a single centre frequency where they are very efficient (relatively speaking); at frequencies higher and lower than the centre frequency the efficiency quickly drops.
However, the efficiency is never zero, so all antennas can couple to all frequencies, albeit very badly. I have actually used the same antenna between 10 MHz and 50 GHz; it was extremely bad, but the detector was very sensitive so we did not care about the efficiency.

Btw, light can definitely be received by antennas, at least if it is low-frequency light (otherwise the antenna dimensions become impractical); antennas are used in e.g. receivers for the far-infrared radiation used in astronomy (centre frequencies around 1 THz).


 
Originally posted by: PsYcHoCoW
It has to do with the signal's wavelength. An antenna is suited to one frequency better than others, determined by the signal's wavelength and the antenna's length;

But different materials must be better for antennae than others; you cannot make an antenna of wood. I was wondering which material property correlates best with antenna reception. It is hard for me to believe that metal would make the best receptor for light waves, no matter what the geometry. BTW, f95toli, what do you mean the dimensions become impractical? 🙂
 
Antennas need to be built of materials that are either conductive or reflective at the wavelengths they are designed for. You are probably thinking of a standard antenna like the one on a car, or a dipole antenna: essentially a length of wire or wires. As others have pointed out, these are most efficient at a size appropriate for the design wavelength, and efficiency falls off as you increase or decrease the wavelength (this is roughly the antenna gain). A popular antenna design is the quarter-wave stub; this antenna is 1/4 the size of an antenna built for the full wavelength. It is nearly as efficient as the full-sized version, which gives you a very rough idea of how wavelength determines the amount of signal received.

Dish-style (parabolic) antennas work by reflecting and focusing the signal onto a receiver or feed antenna.

One of the above posters mentioned the size being impractical. I believe he was talking about trying to make an incredibly small antenna.

Hope my ramblings helped you to understand a little more.
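The quarter-wave sizing mentioned above is a one-line calculation. A minimal sketch (function name is my own; note that real elements are typically a few percent shorter than this ideal figure due to end effects):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def quarter_wave_length_m(freq_hz: float) -> float:
    """Physical length (in metres) of an ideal quarter-wave element: (c / f) / 4."""
    return C / freq_hz / 4.0

# FM broadcast at 100 MHz: a ~0.75 m whip, about the size of a car antenna
print(quarter_wave_length_m(100e6))
# 2.4 GHz Wi-Fi: a ~3.1 cm element
print(quarter_wave_length_m(2.4e9))
```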
 
Originally posted by: bwanaaa
Originally posted by: PsYcHoCoW
It has to do with the signal's wavelength. An antenna is suited to one frequency better than others, determined by the signal's wavelength and the antenna's length;

But different materials must be better for antennae than others; you cannot make an antenna of wood. I was wondering which material property correlates best with antenna reception. It is hard for me to believe that metal would make the best receptor for light waves, no matter what the geometry. BTW, f95toli, what do you mean the dimensions become impractical? 🙂


Antennas must be made of a conductive metal.

The way an antenna works is that a wave, of whatever frequency, changes the electromagnetic field around a conductor, which induces a corresponding current in the conductor. This current is then passed to your receiver (radio, wireless card, whatever), where it is translated into a usable signal.

If you tried to make an antenna out of wood, no (useful) current would be induced in that piece of wood, so it doesn't work as an antenna.
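On the earlier question of which material property matters: for a conductor, one relevant quantity is the skin depth, the depth to which the induced currents actually penetrate, which depends on the material's resistivity and the frequency. A minimal sketch using the standard formula δ = sqrt(ρ / (π f μ₀)) for a non-magnetic conductor (function name and the choice of copper are my own):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def skin_depth_m(resistivity_ohm_m: float, freq_hz: float) -> float:
    """Skin depth (m): depth at which induced current density falls to 1/e of its surface value."""
    return math.sqrt(resistivity_ohm_m / (math.pi * freq_hz * MU0))

RHO_COPPER = 1.68e-8  # resistivity of copper at room temperature, ohm·m

# At 100 MHz the current flows only in the outer ~6.5 µm of a copper element:
print(skin_depth_m(RHO_COPPER, 100e6))
# At 1 THz the skin depth shrinks to ~65 nm, so losses grow as frequency rises:
print(skin_depth_m(RHO_COPPER, 1e12))
```

This is one reason higher conductivity (lower resistivity) makes a better antenna material: the currents are confined to an ever thinner surface layer as frequency increases, and a poor conductor dissipates more of the signal there.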

 
Originally posted by: bwanaaa
Originally posted by: PsYcHoCoW
It has to do with the signal's wavelength. An antenna is suited to one frequency better than others, determined by the signal's wavelength and the antenna's length;

BTW, f95toli, what do you mean the dimensions become impractical? 🙂

By "impractical" I meant that, since the size needs to be of the order of the wavelength, an antenna meant for optical wavelengths would need to be very small; furthermore, it becomes very difficult to couple the signal from the antenna to the receiver/emitter.

Using standard lithography we can make very small thin-film antennas that work quite well even at 1 THz and above; they are, however, still quite big compared to optical wavelengths (of the order of 50-100 microns across). It would be fairly easy to make a micron-sized (or even sub-micron) antenna, but I am not sure how to test such a device, and then there is the issue of coupling.
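The scale argument here can be made concrete with a half-wavelength estimate, a rough lower bound on the size of a resonant antenna (function name is my own):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def half_wave_um(freq_hz: float) -> float:
    """Half-wavelength in micrometres, a rough lower bound on resonant antenna size."""
    return C / freq_hz / 2.0 * 1e6

# 1 THz: ~150 µm, the same order as the thin-film antennas mentioned above
print(half_wave_um(1e12))
# Green light (~600 THz): ~0.25 µm, hence "impractical" to fabricate and couple
print(half_wave_um(6e14))
```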




 
Perhaps a carbon nanotube array dipped in copper sulfate to create a conductive copper coating would make a reasonable antenna. Would this then be able to receive light waves? It seems counterintuitive that light could induce a voltage in a wire.

Currently, lasers are sensed using the photoelectric effect or variable resistance (e.g. selenium). Are these methods fundamentally different?
 
You can sense far-infrared (FIR) lasers using antennas; the wavelength of a FIR laser corresponds to about 1-2 THz, so FIR lasers are frequently used to test planar antennas.

I don't really see the point in using carbon nanotubes in this case; you would still have the problem of radiation coupling. It might be possible using some form of dielectric waveguide, but that is just a guess.

Light is just high-frequency electromagnetic radiation; there is no fundamental difference between light and e.g. microwave radiation. Optical wavelengths correspond to high frequencies, and effects like diffraction and interference become important, but the equations stay the same.
 