I've been studying for a circuits exam lately, doing things like finding the voltage drop across a capacitor or resistor in an RLC circuit that might have two voltage sources. Pretty basic stuff, though some of the problems can still be a pain. Of course, I'm only in basic circuit analysis (2xxx level).
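Just to make concrete what I mean, here's a minimal sketch in Python of that kind of problem, with made-up component values and a single AC source (with two sources you'd solve one at a time and superpose):

# Capacitor voltage in a series RLC circuit, using complex impedances.
# Component values below are made up, just for illustration.
import cmath, math

R, L, C = 100.0, 10e-3, 1e-6        # ohms, henries, farads
Vs, f = 5.0, 1e3                    # source amplitude (V) and frequency (Hz)
w = 2 * math.pi * f                 # angular frequency (rad/s)

Z_R = R
Z_L = 1j * w * L
Z_C = 1 / (1j * w * C)

# Voltage divider: the capacitor gets its share of the total impedance.
Vc = Vs * Z_C / (Z_R + Z_L + Z_C)
print(abs(Vc), math.degrees(cmath.phase(Vc)))   # magnitude (V) and phase (deg)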
Although I'm not that far into my degree (I'll be a 3rd year/junior starting this fall), it feels like everything I've learned so far is almost meaningless. The electronics world seems SO vast and complicated. I can only dream that the electronic devices people actually use are that simple to analyze; in reality, any new, breakthrough technology is far above anything I can imagine right now. For example, in digital logic we might look at a simple adder. Well, if that's what an adder is, wtf does a CPU look like?? I could probably analyze a sequential circuit, or maybe even a basic async circuit, but that seems like 1% of what real electronics are. So I have to ask: at what point does one learn enough to do real-world work?
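Funny thing is, the adder does scale up, at least in principle. Here's a toy sketch in Python of chaining the 1-bit full adder from class into an 8-bit ripple-carry adder (nothing like a real optimized ALU, just the same idea repeated):

def full_adder(a, b, cin):
    # One-bit full adder: the same truth table we build in digital logic.
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add(x, y, bits=8):
    # Chain `bits` full adders together, LSB first, each carry feeding
    # the next stage. A CPU's adder is (roughly) this, scaled and optimized.
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(23, 42))   # 65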
I have over half of the required credits for my B.S., and a good idea of what I'll be taking the next couple of semesters. Two semesters from now I'll be 1.5 years from graduating and FINALLY taking my first C++ course!! Granted, I got a bit of a late start on CS, but there are just so many damn prereqs that I'll be a senior before I get past the required "basic" stuff. And yet, that's all the degree requires. Not that I should expect to be able to program games with a simple 4-year degree, but still, the programs I can write now are nothing compared to what's actually useful in the real world. I've been through 3 CS classes and have maybe 8 left. For as much as I've learned in those 3, I don't see how the ones that are left could get me to real-world work.
I was reading in HT and saw posts about x86 being a limiting factor. OK... I have no idea what x86 is relative to PowerPC, MIPS, RISC, etc. I mean, I could say more than the average Joe about it (it's an instruction set, whoop-dee-do; but what is that? what makes it x86?), but I couldn't really define any of those things. So something else I want to ask: will the things one learns in college become obsolete? Obviously, if x86 is a limiting factor, then eventually they'll switch to something different. Minor changes can probably be picked up easily, but a major switch like that seems like a totally new concept.
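From what I can gather, an "instruction set" is basically the list of primitive operations a CPU promises to execute, plus how they're encoded. Here's a toy sketch in Python of a completely made-up three-instruction machine (nothing like real x86 or MIPS encodings), just to show the concept:

# Toy "instruction set": three made-up ops and a tiny interpreter.
# Real ISAs (x86, MIPS, PowerPC, ...) differ in which ops exist, how many
# registers there are, and how instructions are encoded as bits.
def run(program):
    regs = [0] * 4                     # four registers, r0..r3
    for op, *args in program:
        if op == "LOADI":              # LOADI rd, imm  -> rd = imm
            regs[args[0]] = args[1]
        elif op == "ADD":              # ADD rd, ra, rb -> rd = ra + rb
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "PRINT":            # PRINT ra       -> show register
            print(regs[args[0]])
    return regs

run([("LOADI", 0, 23), ("LOADI", 1, 42), ("ADD", 2, 0, 1), ("PRINT", 2)])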
Ahhh... so you get my point? It's not that I dread what's ahead of me. In fact, the deeper I get into EE, the more I like it and the more I see the "big picture". Little by little, things I used to take for granted in everyday life are coming together. But it still doesn't seem like the 2-2.5 years ahead of me will be enough to make me ready for the workforce. Which reminds me, I need to get an internship as well.....
Anyway, anyone have similar thoughts or comments?