Microwave Circuit Design vs. Traditional Analog

flyboy84

Golden Member
Jul 21, 2004
1,731
0
76
I am currently considering an offer to work for a large defense company in electronic warfare systems. The department I've been offered the position in has an analog group and a microwave group and my offer is for the microwave side. It sounds like the microwave guys mainly make MMICs while the analog group does more board level analog work (and a custom analog IC now and then). My experience is mainly in the analog world, though I have some coursework in Microwave/RF.

My question is the following. How do microwave and traditional analog stack up in terms of future growth and long term career development? If you had to pick between the two, which direction would you go? I really like analog, but I feel like microwave is more cutting edge. Is there more "prestige" working in one field vs. another? The company has demonstrated a great desire to train a junior engineer such as myself in the microwave arts and I think that is a great opportunity. My manager has said there is some overlap with the analog group as well and I'd have the ability to do some work there if I have the desire.

A little bit of background on my ed/experience: I graduated with my BSEE in 2006, taking mostly electronics classes to fill out my depth electives, and my senior project was to make an amateur radio RF repeater out of an old 1960s era UHF transceiver. I worked for a very large defense company for a little over a year doing systems integration work and then decided to go back for my MSEE full time at a different school. I did a concentration in microelectronic circuits (mainly analog) and I did take one course in Microwave Ckt Design. Most of my focus has been on analog CMOS IC design. I will be done with the MS program in about 2 weeks.
 

bobdole369

Diamond Member
Dec 15, 2004
4,504
2
0
If you had to pick between the two, which direction would you go?

Microwave.

While not a designer, nor an EE - I've worked on analog stuff as well as digital work (mostly as a jr engineer in field testing, troubleshooting, documentation, rework, etc). Analog is dead. I don't see any real growth in the analog field other than at the top of the chain. More and more analog work keeps getting pushed offshore. If you're the designer you are OK, but before long you'll need to learn Mandarin or Hindi to be effective in the analog world.
 

esun

Platinum Member
Nov 12, 2001
2,214
0
0
Analog will always be around, as will RF. They're both absolutely essential for any electronic system (well, RF only in high-frequency systems, obviously, but those are everywhere nowadays), and neither can be automated nearly as easily as digital design. I think RF will inevitably grow more in the near future than analog, simply because of the ubiquity of wireless communications nowadays. I also think it's probably a good idea to learn it now while you have a company that wants to train you in that field, especially given that you already know analog design, since you'll be more versatile in the future.

Of course, this is coming from a MS student in EE, so I can't offer any insights you don't already know.
 

wwswimming

Banned
Jan 21, 2006
3,702
1
0
i started in Microwave in 1980, as a mechanical engineer. took Electromagnetics
classes over the years, etc.

the last R&D team i worked on had microwavers, analog guys, and digital guys,
and me (mechanical, EMI, dFx). seems like the microwavers & analog guys were
able to exchange jobs fairly easily, they were both good at each other's jobs.
not so much with the digital guys, though we shared the same lab, the digital
guys stayed in a group and worked on FPGA's, etc. (collaborating with the analog &
systems guys, the systems guys usually being experienced microwavers, project
leads, etc.)

it sounds like you have more experience with analog.

part of the answer is: what is the "delta" between your capabilities & what the
team you're assigned to will expect from you? if there's a sizeable delta that
they're expecting you to make up with "hustle", that can be stressful.

personally, i like microwave. Smith charts, vector network analyzers, etc.
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
From what I've been told, many companies are finding it difficult to find good analog designers nowadays; most EE graduates specialize in more "modern" topics such as microwaves, DSP, etc. but - as esun has already pointed out - you also need good analog engineers.
I know from experience that the analog front end (noise on the inputs, etc.) is the Achilles heel of most modern measurement equipment. Some of the analog electronics (pre-amplifiers, etc.) we use where I work is 20-30 years old (or was designed 20 years ago but is still being sold), yet it still performs better than any modern design.
The manufacturers are well aware of this, but many of them tell us they simply can't find really good analog designers anymore (and some tell us they don't care, since most of their customers don't notice the problems).
 

flyboy84

Golden Member
Jul 21, 2004
1,731
0
76
Wow, thanks for your responses all. I've decided to accept the offer, mainly because the work environment really sounds like they take care of their junior engineers and they embrace a culture of education. I was also able to negotiate a hefty signing bonus, and $$ always helps ;)

The manager reiterated that I will be able to get some analog experience as well. I hope to gain expertise in both areas!
 

Born2bwire

Diamond Member
Oct 28, 2005
9,840
6
71
There is also a lot of talk about combining the two, digital and analog. The industry wants to progress to a single system on chip setup where the analog and digital components are all on the same chip as a way of reducing size and costs. In this manner, a designer would need to know a bit about both areas to be effective. I am not sure how far along they have progressed on doing this, it's been a few years since I heard an industry seminar on the topic.
 

flyboy84

Golden Member
Jul 21, 2004
1,731
0
76
Originally posted by: Born2bwire
There is also a lot of talk about combining the two, digital and analog. The industry wants to progress to a single system on chip setup where the analog and digital components are all on the same chip as a way of reducing size and costs. In this manner, a designer would need to know a bit about both areas to be effective. I am not sure how far along they have progressed on doing this, it's been a few years since I heard an industry seminar on the topic.

System-on-a-chip is already quite pervasive today. Digital design drives the newer process nodes (i.e. 65 nm, 45 nm and beyond), and skilled analog designers are needed to make circuits that perform well at these smaller feature sizes. The problem, of course, is that these processes are generally geared toward digital, so it is a real challenge to make high-performance analog circuits on the same silicon.

With microwave, on the other hand, you are talking about RF frequencies that are VERY high, and most silicon processes can't operate up there (we are talking 18-60 GHz). Typically you see special III-V compound semiconductors for these chips, like gallium arsenide or indium phosphide. They have been, and will have to continue to be, kept separate for some time until silicon can operate at microwave frequencies, or perhaps until digital moves to a post-silicon medium.
 

wwswimming

Banned
Jan 21, 2006
3,702
1
0
isn't a CPU operating in the 2 to 4 GHz range technically a microwave circuit ?

i can't help but wonder if the Intel photomask layout engineers use guidelines similar
to stripline PCB design in Duroid (Teflon-glass composite). for example, rounded or
beveled corners on a trace instead of a hard right angle.

so that as the digital circuits that interface with the analog & microwave circuitry
crank up in speed, the layout tasks are all RF or microwave.
 

flyboy84

Golden Member
Jul 21, 2004
1,731
0
76
Originally posted by: wwswimming
isn't a CPU operating in the 2 to 4 GHz range technically a microwave circuit ?

i can't help but wonder if the Intel photomask layout engineers use guidelines similar
to stripline PCB design in Duroid (Teflon-glass composite). for example, rounded or
beveled corners on a trace instead of a hard right angle.

so that as the digital circuits that interface with the analog & microwave circuitry
crank up in speed, the layout tasks are all RF or microwave.

My understanding is that frequency alone does not make something "microwave". You have to consider the line length relative to the wavelength at the operating frequency; it all comes down to propagation time.

From wikipedia:

"Apparatus and techniques may be described qualitatively as "microwave" when the wavelengths of signals are roughly the same as the dimensions of the equipment, so that lumped-element circuit theory is inaccurate. "

We had two definitions that we used for sinusoidal or digital signals but I can't remember them off the top of my head.
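One common version of that rule of thumb is to treat a line as "distributed" once it exceeds roughly a tenth of a wavelength. A quick sketch (the λ/10 threshold and the 0.5 velocity factor are just typical assumed values, not hard standards):

```python
# Rough check of when a trace must be treated as a transmission line
# rather than a lumped element. lambda/10 is a common rule of thumb.

C = 3.0e8  # speed of light in vacuum, m/s

def is_distributed(trace_length_m, freq_hz, velocity_factor=0.5):
    """True if the trace is electrically long at this frequency.

    velocity_factor accounts for the dielectric slowing the wave
    (~0.5 is typical for stripline in FR-4; an assumed value here).
    """
    wavelength = C * velocity_factor / freq_hz
    return trace_length_m > wavelength / 10

# A 5 cm board trace at 100 MHz: lambda is ~1.5 m, so it's still lumped.
print(is_distributed(0.05, 100e6))   # False
# The same 5 cm trace at 10 GHz: lambda is ~15 mm, definitely distributed.
print(is_distributed(0.05, 10e9))    # True
```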
 

Loki726

Senior member
Dec 27, 2003
228
0
0
Originally posted by: wwswimming
isn't a CPU operating in the 2 to 4 GHz range technically a microwave circuit ?

No, in the microwave context frequency means the frequency of the electromagnetic waves themselves. Recall that for EM waves, frequency corresponds directly to wavelength because of the fixed speed of light, so the name really refers to the wavelength -- seemingly "micro" as in micrometer (10^-6 m), but actually "micro" just means smaller than radio wavelengths: microwave wavelengths run from roughly 10^-3 to 10^0 m. Frequency in the context of CPU clocks refers to how often the clock toggles, which is limited by the time it takes to charge/discharge a capacitance until the potential changes by a certain amount.

People seem to get confused because frequency is used differently in so many contexts. Just remember that frequency conceptually means that something happens a certain number of times per unit of time. The something could be the number of times a pendulum swings back and forth, a wave cycles from a minimum to a maximum back to a minimum, or the earth revolves around the sun.
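The frequency-to-wavelength correspondence is just lambda = c/f. A minimal sketch, using free-space values (no dielectric):

```python
# Wavelength follows directly from frequency for an EM wave in free
# space: lambda = c / f. "Microwave" loosely spans ~300 MHz to 300 GHz,
# i.e. wavelengths from about 1 m down to 1 mm.

C = 3.0e8  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in meters for a given frequency in Hz."""
    return C / freq_hz

for f in (300e6, 2.4e9, 300e9):
    print(f"{f/1e9:g} GHz -> {wavelength_m(f)*100:.3g} cm")
# 0.3 GHz -> 100 cm, 2.4 GHz -> 12.5 cm, 300 GHz -> 0.1 cm
```

So a 2-4 GHz clock does sit in the microwave frequency range; the question, as noted above, is whether the circuit dimensions are comparable to those ~10 cm wavelengths.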
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
Originally posted by: Loki726
Originally posted by: wwswimming
isn't a CPU operating in the 2 to 4 GHz range technically a microwave circuit ?

No, in the microwave context frequency means the frequency of the electromagnetic waves themselves. Recall that for EM waves, frequency corresponds directly to wavelength because of the fixed speed of light, so the name really refers to the wavelength -- seemingly "micro" as in micrometer (10^-6 m), but actually "micro" just means smaller than radio wavelengths: microwave wavelengths run from roughly 10^-3 to 10^0 m. Frequency in the context of CPU clocks refers to how often the clock toggles, which is limited by the time it takes to charge/discharge a capacitance until the potential changes by a certain amount.

That is sort of correct. But remember that a real digital signal is always transmitted as an analog waveform; if a circuit is switching 1e9 times/s, the signal it sends out will have frequency components in the GHz range. If you want to transmit a signal at 1 gigabit per second along a transmission line, a rule of thumb is that you need a bandwidth of about 3 GHz in order for the signal to look reasonably "square" when it arrives at its destination.
Hence, whether or not something is a "microwave circuit" depends on the size of the circuit compared to the relevant wavelengths, as flyboy84 has already stated; if they are of the same order of magnitude you can't model the circuit using lumped elements anymore.

The interconnects in a CPU are so short that the lumped-element approach presumably works well most of the time.
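That ~3x bandwidth rule of thumb falls out of the Fourier series of a square wave, which contains only odd harmonics; keeping through the 3rd harmonic of the fundamental already gives a recognizably square edge. A minimal sketch (the 3-harmonic cutoff is just an illustration of the rule, not a design spec):

```python
# A unit square wave decomposes into odd harmonics:
#   x(t) = (4/pi) * sum over odd k of sin(2*pi*k*f0*t) / k
# Truncating the sum models a band-limited channel: keeping terms up
# to 3*f0 is the gist of the "~3 GHz for 1 Gb/s" rule of thumb.

import math

def truncated_square(t, f0, max_harmonic):
    """Fourier series of a unit square wave, cut off at max_harmonic*f0."""
    s = 0.0
    k = 1
    while k <= max_harmonic:
        s += math.sin(2 * math.pi * k * f0 * t) / k
        k += 2  # only odd harmonics contribute
    return 4 / math.pi * s

f0 = 1e9  # 1 GHz fundamental
# Sampled at the middle of the "high" half-cycle: with harmonics
# through 3*f0 the sum is already ~0.85 of the ideal value of 1.
print(truncated_square(0.25 / f0, f0, 3))   # ~0.849
```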