
Electrical/Computer Engineering *warning: long*

So I've been studying for a circuits exam lately and, for example, figuring out the voltage drop across a certain capacitor or resistor in an RLC circuit that maybe has two voltage sources. Pretty basic, although some of them can still be a PIA. Of course, I'm in basic circuit analysis (2xxx level).
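The kind of two-source problem described above can be sketched in a few lines of Python. The circuit and values here are made up for illustration: a two-mesh resistive circuit with a voltage source in each mesh reduces, via KVL, to a 2x2 linear system, solved below with Cramer's rule.

```python
# Mesh analysis of a hypothetical two-source, three-resistor circuit.
# Mesh 1: V1 -- R1 -- R2 (shared); Mesh 2: R2 (shared) -- R3 -- V2.
# KVL for clockwise mesh currents i1, i2 (with the assumed polarities):
#   (r1 + r2) * i1 -        r2 * i2 =  v1
#        -r2 * i1 + (r2 + r3) * i2 = -v2

def solve_two_mesh(v1, v2, r1, r2, r3):
    a11, a12, b1 = r1 + r2, -r2, v1
    a21, a22, b2 = -r2, r2 + r3, -v2
    det = a11 * a22 - a12 * a21          # determinant of the 2x2 system
    i1 = (b1 * a22 - a12 * b2) / det     # Cramer's rule, column 1
    i2 = (a11 * b2 - b1 * a21) / det     # Cramer's rule, column 2
    return i1, i2

i1, i2 = solve_two_mesh(v1=10.0, v2=5.0, r1=2.0, r2=4.0, r3=6.0)
v_r2 = 4.0 * (i1 - i2)  # drop across the shared resistor R2
print(f"i1 = {i1:.3f} A, i2 = {i2:.3f} A, V(R2) = {v_r2:.3f} V")
```

Exam problems with more meshes work the same way, just with a bigger matrix.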

Although I'm not that far in my degree (I'll be a 3rd year/junior starting this fall), it feels like everything I've learned thus far is almost meaningless. The electronics world seems SO vast and complicated. I could only dream that any electronic device you use would be that simple to analyze. In reality, any new, breakthrough technology is so far above anything I could imagine right now. For example, in digital logic we might look at a simple adder. Well, if that's what an adder is, wtf does a CPU look like?? I could probably analyze a sequential circuit, or maybe even a basic async circuit, but that seems like 1% of what real electronics are. So I must ask: at what point does one learn enough to do real-world applications?
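For what it's worth, the jump from "simple adder" to CPU is largely composition. A rough sketch in Python (a software model, not real hardware description): a one-bit full adder, chained into a ripple-carry adder. An ALU is, conceptually, many such blocks with control logic wrapped around them.

```python
# A one-bit full adder, and a ripple-carry adder built by chaining them.
# This is the "simple adder" from digital logic class, modeled in software.

def full_adder(a, b, cin):
    """Return (sum, carry_out) for one bit position."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(x, y, width=8):
    """Add two integers by chaining full adders bit by bit (wraps mod 2**width)."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_carry_add(100, 55))  # → 155
```

Real datapaths use faster topologies (carry-lookahead, etc.), but the principle of building big blocks out of small, well-understood ones is the same.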

I have over half of the required credits for my B.S., and have a good idea of what I'll be taking the next couple of semesters. Two semesters from now I'll be 1.5 years from graduating and FINALLY taking my first C++ course!! Granted, I got a bit of a late start on CS, but there are just so many damn prereqs that I'll be a senior before I get past the required "basic" stuff. And yet, that is all the degree requires. Not that I should expect to be able to program games with a simple 4-yr degree, but still, the programs I can write now are nothing compared to what is really useful in the real world. I've been through 3 CS classes and have maybe 8 left. Again, for as much as I've learned in those 3, I don't see how I could do real-world things with what is left.

I was reading in HT and saw posts about x86 being a limiting factor. Ok... now, I have no idea what x86 is relative to PowerPC, MIPS, RISC, etc. I mean, I could say more than the average joe about what it has to do with (it's an instruction set, whoop-dee-do, but what is that? what makes it x86? etc.), but I couldn't really define any of those things. And something else I wanna ask: are the things one learns in college going to be obsolete? Obviously, if x86 is a limiting factor, then they will switch to something different (eventually). Minor changes can probably be picked up easily, but a major switch like that seems like a totally new concept.

Ahhh... so you get my point? It's not that I dread what is ahead of me. In fact, the more I get into EE, the more I like it and the more I see the "big picture". Little by little I see things come together that were just taken for granted in real life. But it still doesn't seem like, in the 2-2.5 years ahead of me, I'll become ready to be in the workforce. Which reminds me, I need to get an internship as well.....

Anyway, anyone have similar thoughts or comments?
 
I graduated with a B.S. in computer engineering. If you want to do processor design, you will have to get a PhD.
 
I know exactly what you mean OP. It's a big world out there and you can only learn so much in the time you have.
 
You have to have a serious amount of zealousness to excel in that sort of market. I'll be driving down a street or walking somewhere and spot some electronic gizmo of one type or another (be it only a stoplight even), and I'll idly start imagining the basic programming logic needed to make it work. I always figured that EE/CE people were the same way, but a bit more biased to hardware over software.
 
Originally posted by: Cattlegod
I graduated with a B.S. in computer engineering. If you want to do processor design, you will have to get a PhD.

Only if you want to make big decisions on architecture and circuit topologies. You don't need a graduate degree to build adders/multipliers/bypass networks/whatever. The company will already have a general topology that they discovered is best (thanks to the PhDs), and you just gotta build it.

But you did get something right after 3 years of studying: when you graduate, if you want to be a great electrical engineer, you have to keep reading about the newest things out there.
 
Patience... you'll learn more in your higher-level courses. I'm a year ahead of you, and in my last year I've covered x86 assembly and interfacing, and MIPS assembly and architecture... both of which gave me a big leap in seeing the big picture.
 
I feel kind of the same way, though I'm a senior in CS.

You can't learn everything... the stuff they teach you now may seem useless, but I would think it forms the basis of concepts you will learn later (on the job or in further schooling). And I think yllus is right about needing a serious amount of zealousness to excel.

I hope a willingness to learn, a good work ethic, and a good (not super expert) background in the field you study will be enough to get a semi-decent job. Or I'm fvcked.
 
Well, I'm a prof in EE, and I can understand your frustration. The simple answer is that a 4 year degree will not get you that far in "understanding electronics" on a device level.

The long answer is that your 4-year degree is like building the foundation of a house. You're at the foundation, looking up at adjacent skyscrapers. Pretty hard to do. By your senior year, you should have a good understanding of what you want to specialize in, if that's your cup of tea. Many engineers go to work, or get an MBA and manage. You've really got to think MS, PhD for really in-depth stuff, or get experience on the job over many years in the discipline that you are interested in.

For instance, your communications courses will spend a few weeks on TV transmission methods, but they are highly unlikely to get into HDTV, high-power transmitting methods, satellite TV, etc. That can only come after your 4-year degree.

So keep your chin up, experiment on your own, and have fun. Make sure your heart is in what you do, and you will thrive.

🙂
 
There are shortcuts to figuring out voltage drops and amperage that I figured out when I was in high school. I could do that stuff in my head and be damn close. Being exact requires a calculator, as my teachers required answers to be correct to 4 decimal places. But there's no need to figure out a huge equation that fills half a page to get the amperage across a single resistor.
 
Originally posted by: Evadman
There are shortcuts to figuring out voltage drops and amperage that I figured out when I was in high school. I could do that stuff in my head and be damn close. Being exact requires a calculator, as my teachers required answers to be correct to 4 decimal places. But there's no need to figure out a huge equation that fills half a page to get the amperage across a single resistor.

it took you half a page to get the amperage of a single resistor?
 
Well, in order to do all this stuff, you have to know the basics. If you get too complex too fast, people will just start memorizing how to get answers instead of understanding them. After your BS, you will definitely not have enough knowledge to become a senior engineer, but that is why we all see experience as the most important thing.

I was reading in HT and saw posts about x86 being a limiting factor. Ok... now, I have no idea what x86 is relative to PowerPC, MIPS, RISC, etc. I mean, I could say more than the average joe about what it has to do with (it's an instruction set, whoop-dee-do, but what is that? what makes it x86? etc.), but I couldn't really define any of those things. And something else I wanna ask: are the things one learns in college going to be obsolete? Obviously, if x86 is a limiting factor, then they will switch to something different (eventually). Minor changes can probably be picked up easily, but a major switch like that seems like a totally new concept.

Well, if you took Computer Architecture and you can't answer that question, you need to take the course again. This is the thing about all science degrees: you learn what is prevalent today knowing that it could be obsolete by tomorrow. But do not fret, because the basics you learn will always be the same unless some new law comes out telling you otherwise.
 
Originally posted by: Gibson486
Originally posted by: Evadman
There are shortcuts to figuring out voltage drops and amperage that I figured out when I was in high school. I could do that stuff in my head and be damn close. Being exact requires a calculator, as my teachers required answers to be correct to 4 decimal places. But there's no need to figure out a huge equation that fills half a page to get the amperage across a single resistor.

it took you half a page to get the amperage of a single resistor?
nah, I was exaggerating. You know the type of question where you are given an input voltage and you need to get the amperage and voltage drops across each resistor in a 10-ish resistor circuit. Those require a decent amount of work if you are doing it entirely by hand, or only using a basic calculator.
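For a pure series chain, those drops fall straight out of Ohm's law; here's a quick Python sketch with made-up values (real homework circuits usually mix series and parallel, which takes more reduction work than this):

```python
# Voltage drops across resistors in series: one current flows through the
# whole loop (I = V / sum(R)), and each drop is I * R_i.
# Resistor values below are made up for illustration.

def series_drops(v_in, resistors):
    i = v_in / sum(resistors)            # single loop current
    return i, [i * r for r in resistors]  # drop across each resistor

resistors = [100, 220, 330, 470]
i, drops = series_drops(12.0, resistors)
print(f"I = {i * 1000:.2f} mA")
for r, v in zip(resistors, drops):
    print(f"{r} ohm: {v:.3f} V")
```

A sanity check: the individual drops always sum back to the source voltage, per KVL.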
 
Originally posted by: Evadman
Originally posted by: Gibson486
Originally posted by: Evadman
There are shortcuts to figuring out voltage drops and amperage that I figured out when I was in high school. I could do that stuff in my head and be damn close. Being exact requires a calculator, as my teachers required answers to be correct to 4 decimal places. But there's no need to figure out a huge equation that fills half a page to get the amperage across a single resistor.

it took you half a page to get the amperage of a single resistor?
nah, I was exaggerating. You know the type of question where you are given an input voltage and you need to get the amperage and voltage drops across each resistor in a 10-ish resistor circuit. Those require a decent amount of work if you are doing it entirely by hand, or only using a basic calculator.


Hmmmm... yeah, I guess, but you should be able to do it like nothing by the end of the course.....
 
Well, I'm currently just entering the 2nd half of 2nd-year EE this fall, and I know what you mean. But when I look back on first year, and more noticeably the beginning of 2nd, we covered A LOT of ground about the world of EE. There's obviously a much larger world encompassing EE, but one thing I realized is that university teaches you to learn, and to learn fast. I'm learning the fundamentals, and without fundamentals you won't be able to learn the advanced concepts to come.

As mentioned before, you're building a foundation with your degree; you'll learn everything else in the working world and/or graduate degrees. My suggestion is to try and find something within EE that you truly have a passion for and just go all out. You obviously can't be a jack of all trades when it comes to EE or any engineering discipline. You need to focus on the core areas that interest you most, and in the end you'll see that you know more than you initially thought and will continue to learn more.

Just hang in there!

--Mark
 
I have more fun with mechanical things, though I do like to play around with electronics. With electric stuff, I prefer the bizarre phenomena that are easy to understand with a grounding in physics, like Tesla coils and rail guns and the like. The more powerful, the better. I have never been much interested in what all goes on inside a CPU.

I am pursuing a mechanical engineering major with a specialization in robotics, though, and I'd like to get higher degrees in the field (I'd like to get a Ph.D., but that is so far away I can't say for sure if I'm going to go for it).
 
I think few people understand every single detail of every part of every project that they work on...

this is why CPUs aren't designed by one guy in a holding cell...lol

but seriously, I'm doing computational fluids this summer & I was in way over my head at the start... but you read, talk to people, and learn. Before you know it, you're making useful contributions to the project.

School will hopefully teach you some useful problem solving skills & the basic tools to understand what's going on (i.e. look at a circuit, have some idea of how it behaves...solving it out ain't so important w/all the computer tools available). But the really important part is to take whatever you learn in the classroom & look past the memorization & computation to the concepts & thinking processes involved...
 