EE major

oiprocs

Diamond Member
Jun 20, 2001
3,780
2
0
As an EE major, I have the option of taking 6 elective courses in any field I wish. I'd like to take some CS courses to get a good background in programming, and I was wondering if anybody has some advice for me, i.e. subjects that would be quite helpful to have taken/understand. I chose to specialize in communication systems within EE, if that helps.

Thanks!
 

Argo

Lifer
Apr 8, 2000
10,045
0
0
Algorithms is always good. Data Structures might be helpful as well. Other than that you can try taking some intro courses.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
For EE you probably want to take courses on:

1) A basic introduction to C programming. You won't use it at all in your field, but you'll learn what programming is all about at the highest, most abstract level.

2) Machine architecture and an intro to assembly language programming. Learn directly how software interfaces with the hardware, particularly the CPU and memory. Do all the stuff you did in C over again, but in machine language, directly on a specific machine with a specific CPU and architecture.

3) Operating systems. Learn how drivers, interrupts, and all of that work in software, and understand the 'glue' between CompSci and ElecEng: how your high-level CS software eventually reaches physical hardware registers on the EE side of things. Particularly useful to you would be a course focused on embedded systems with small-scale, real-time, signal-driven operating systems like VxWorks, rather than human-centric, manual-input-driven systems like Unix/Windows.

4) Compilers, the final mystery of how the line printf("Hello world!\n"); has anything to do with transistors and gates once you know C, assembly, and operating systems (see the sketch after this list).

That will cover the general case of understanding things from both sides of the hardware vs. software aisle.
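To make point 4 concrete, here's a minimal sketch (just an illustration, not course material) of the same "hello world" at two levels in C: the hosted version that leans on the C library and the OS, and a bare-metal flavor that writes bytes straight to a memory-mapped UART register, which is the kind of thing an embedded-systems course walks you through. The UART address here is made up for illustration; on real hardware it comes from the chip's datasheet.

#include <stdio.h>
#include <stdint.h>

/* Hosted version: libc and the OS hide everything below printf. */
static void hello_hosted(void)
{
    printf("Hello world!\n");
}

/* Bare-metal flavor: push bytes straight into a memory-mapped UART transmit
 * register. 0x10000000 is a made-up address for illustration only. */
#define UART_TX ((volatile uint8_t *)0x10000000u)

static void hello_bare_metal(const char *s)
{
    while (*s)
        *UART_TX = (uint8_t)*s++;   /* every store goes out on the bus */
}

int main(void)
{
    hello_hosted();
    (void)hello_bare_metal;  /* only meaningful on a target board with that register */
    return 0;
}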

For your specialization in comms, a course on algorithms and discrete mathematics, particularly with a focus on DSP, would go a very long way.
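To give a flavor of what that DSP-style code looks like (a generic sketch, nothing specific to any particular course), a finite impulse response filter is essentially a dot product in a loop, and it shows up everywhere in communications work:

#include <stddef.h>

/* Minimal FIR filter: y[n] = sum over k of h[k] * x[n - k].
 * Outputs are only computed where a full window of input exists. */
static void fir_filter(const float *x, float *y, size_t nsamples,
                       const float *h, size_t ntaps)
{
    for (size_t n = ntaps - 1; n < nsamples; n++) {
        float acc = 0.0f;
        for (size_t k = 0; k < ntaps; k++)
            acc += h[k] * x[n - k];
        y[n] = acc;
    }
}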

Don't be intimidated if you lack programming experience; that's what the classes are for (believe me, they were very boring for me as a programmer). If you can handle EE you can handle CS. It's pretty much just a different perspective on the same exact thing, except instead of calling them flip-flops or latches we call them registers or I/O ports.

The benefit of coming from EE first is that a programmer knows nothing about hardware design or FPGAs or VHDL, but if you as an EE design a CPU, then by nature of designing the instruction set you already know how to program it!
 

tidehigh

Senior member
Nov 13, 2006
567
0
0
Another vote for real-time embedded systems. This would be a good course to base a senior project on.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
How about taking up a CS minor if your school offers it?

A data communications/networking class would probably be good for you to take at some point.
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I am an EET major and we are required to take Intro to C++ and Embedded Microsystems, which is a fancy name for assembly language. I already took C++ and it was kinda fun, but I am taking Embedded Microsystems now and it sucks. It's 10 times harder than C++.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,208
537
126
Originally posted by: MBrown
I am an EET major and we are required to take Intro to C++ and Embedded Microsystems, which is a fancy name for assembly language. I already took C++ and it was kinda fun, but I am taking Embedded Microsystems now and it sucks. It's 10 times harder than C++.

Of course it is 10 times harder than C++, it is the base language. This is why we created higher-level languages like C++ in the first place: because we know how bad it sucks to write things in machine code. That is what compilers are for :D

However, you need to take that course and find out how bad it sucks to have to use machine code before you can appreciate all that has happened in compilers and higher-level language theory. This is the core that your programming will always run on, so you should have some understanding of it in order to understand how everything really works.

And for you electrical engineers, the people who design the hardware and create the basic instruction sets that make up the machine language: you need to understand how this affects things. Having single instructions that do common things, like storing the quotient of dividing two registers into one register and the remainder into another, can be very useful to the people who will be using your chip to do something...
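For example (a rough sketch, and compiler behavior varies): in C you write the quotient and remainder as two separate operations, but on an ISA whose divide instruction produces both at once (x86 DIV, MIPS div via HI/LO), a decent compiler can fold them into a single hardware divide.

#include <stdio.h>

/* Two C operations that many compilers collapse into one divide instruction
 * on ISAs where the hardware produces quotient and remainder together. */
static void divmod(int a, int b, int *quot, int *rem)
{
    *quot = a / b;
    *rem  = a % b;
}

int main(void)
{
    int q, r;
    divmod(17, 5, &q, &r);
    printf("17 / 5 = %d remainder %d\n", q, r);  /* 3 remainder 2 */
    return 0;
}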
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
I must be a dying breed. I actually enjoy assembly language. It's far more straightforward and less obfuscated than higher-level languages.

At least on stuff like PPC, ARM, and MIPS... and PS2 VU is really fun... I still remember that the scalar div Q, vf0w, vf1w has a 7-cycle completion time and cannot be overlapped :)

Still waiting for PS3 to have a few must have exclusive titles so I can play with Cell... I can't justify spending that much for a console to home brew and mostly collect dust unless there are a few hit games for it.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Nope, I loved my assembly class, well, it was called Microprocessor Applications. We used a 68HC12 and interfaced all kinds of different hardware to the MCU, driving it with assembly, and I have to say I learned a lot; it has to be one of the most fun EE classes I have taken. We also had to use MIPS assembly in our Computer Architecture class, but in that class we actually designed the CPU and instructions in HDL, so I guess that covers everything :)

exdeath, do you have any links for getting started with a PS2? That sounds pretty cool if it doesn't require me to buy too much.
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: exdeath
I actually enjoy assembly language.

At least on stuff like PPC, ARM, and MIPS...

My university requires us (CS majors) to learn Assembly on an Intel 386. I do not enjoy it.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Originally posted by: Lord Banshee
Nope, I loved my assembly class, well, it was called Microprocessor Applications. We used a 68HC12 and interfaced all kinds of different hardware to the MCU, driving it with assembly, and I have to say I learned a lot; it has to be one of the most fun EE classes I have taken. We also had to use MIPS assembly in our Computer Architecture class, but in that class we actually designed the CPU and instructions in HDL, so I guess that covers everything :)

exdeath, do you have any links for getting started with a PS2? That sounds pretty cool if it doesn't require me to buy too much.

http://ps2dev.org/

I haven't been keeping up to date with the toolchains so I don't know what is current or obsolete, but my old/current setup for raw hardware coding (no OS or libraries) is:

1) Old style PS2 with brick style Sony network adapter (non USB) (or slim PS2)

2) mod chip. I prefer the DMS3/4 chips with the 'devolution' feature and boot from memory card option.

3) pukklink & ps2link, the IP boot loader server that boots the PS2 at power on and the PC client that allows you to send EE (Emotion Engine, R5900 main CPU) and IOP (I/O processor, the PS1 on a chip that also handles sound and peripheral I/O in PS2 mode) programs to the PS2. It also remaps the cdrom device path to a network share on your PC to emulate CD access so you don't need to waste time burning CDs with every build.

4) GNU PS2 SDK and compilers and C runtime libs.

5) The Linux kit. You don't need the whole kit, the hard drive, or even Linux; you just want the register-level hardware documentation and user manuals that cover the instruction sets, bus operation, DMA, etc., which come as something like 6 separate large PDF files. I prefer not to use the Linux SDK itself because it bars you from accessing many things, including graphics. But those PDF files are absolutely essential.

Basically you burn pukklink to a CD, follow the instructions for the DMS mod chip to go into devolution mode and copy the boot image off the CD onto the memory card, and from then on all I have to do when I want to dev is power on the PS2, hold triangle, and select pukklink from the applications menu; the PS2 boots, initializes the network adapter, and waits.

Then on the PC side, you set up your dev environment (I used VS6 rigged to use the PS2 GNU build tools, including the separate vector unit assembler). Then you use ps2link, IIRC, to communicate with pukklink to reboot, send .elf and .irx modules, execute them, etc.

First things first, you have to learn the GS (Graphics Synthesizer), which is a 2D raster engine (it accepts XYZ and STQ). Communication with the GS is done via 128-bit-wide GIF (Graphics InterFace) packets stuffed through a single memory-address I/O port. There are four ways of getting data there (via EE, DMA, VIF2, and VU2), but you can start simply by poking carefully assembled data packets through the GIF FIFO manually, one qword (128 bits) at a time.
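Roughly, "poking qwords through the FIFO" looks like this in C. This is just a sketch, not working PS2 code: the FIFO address is a placeholder, and the packet contents (the GIFtag and register data) still have to be assembled per the GS manual from the Linux kit.

#include <stdint.h>

/* A 128-bit quadword, 16-byte aligned so it can be moved as one unit
 * (on the real EE you'd want each copy to be a single 128-bit sq store). */
typedef struct {
    uint64_t lo;
    uint64_t hi;
} __attribute__((aligned(16))) qword_t;

/* Placeholder for the GIF FIFO address; the real value comes from the
 * manuals. Every aligned 128-bit store pushes one qword into the GS. */
#define GIF_FIFO ((volatile qword_t *)0x10006000u)

static void gif_send(const qword_t *packet, int nqwords)
{
    for (int i = 0; i < nqwords; i++)
        *GIF_FIFO = packet[i];   /* poke the packet in, one qword at a time */
}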

The first thing you'll need to do at power-on is initialize the CRTC and set up a display mode for both the CRTC and the GS. After that you can just send simple commands to draw non-perspective-corrected, non-textured, non-depth-buffered 2D solid quads and stuff like that, just to see something on the TV and verify that you have a running program, etc. Then take it from there.

I believe STDOUT, and thus printf, is mapped to a console in ps2link, so you could just start with the typical printf("Hello World!\n"); and upon executing the .elf on the PS2 you'd see "Hello World!" print to the ps2link console on the PC. It's a very useful debugging feature, as the PS2 itself has no native text mode and you'd otherwise have to implement textured font sprites.

I believe the default behavior of crt0.s is to either go into an infinite loop or halt the EE when your program ends (main returns), and it then gets reset automatically when you run another .elf.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Originally posted by: Schadenfroh
Originally posted by: exdeath
I actually enjoy assembly language.

At least on stuff like PPC, ARM, and MIPS...

My university requires us (CS majors) to learn Assembly on an Intel 386. I do not enjoy it.

There's your problem ;)

The problem for me is that the PC is what I primarily work on, and assembly and debugging in x86 get boring and repetitive.

Working with new platforms and learning new instructions and conventions, especially ones that are far more useful and intuitive, is when it gets enjoyable.

Like on the Gamecube, my first experience with PPC... stuff like 'rlwinm', "rotate left word immediate then AND with mask", and that was a simple one... so much for RISC ;) Clearly a chip that was built for DSP and bit-stream processing.

And ARM7 on the Gameboy Advance... it's been a while, but I recall that a shift/rotate was built into every instruction as an optional last parameter, so to add one register to another shifted left by two (i.e. multiplied by four) you could do something like: add r1, r2, r3, lsl #2

Fun stuff.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Thanks exdeath,

Damn, looks like I would have to spend money after all... :( My PS2 is just an old-school model with no network adapter or mod chip, and I would need to buy the Linux kit. It may not be worth it for me; I think I'll stick to learning this ARM7 dev board I have but haven't messed with yet.

Thanks again for the indepth explanation,
Chris
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Originally posted by: Lord Banshee
Thanks exdeath,

Damn, looks like I would have to spend money after all... :( My PS2 is just an old-school model with no network adapter or mod chip, and I would need to buy the Linux kit. It may not be worth it for me; I think I'll stick to learning this ARM7 dev board I have but haven't messed with yet.

Thanks again for the indepth explanation,
Chris

You can always get into GBA; it's an ARM7TDMI with ARM and Thumb support and has no cache (I think there is just a 3-instruction fetch buffer or something, that's it). The I/O registers and the stuff that controls the graphics planes are well documented in public papers.

www.gbadev.org

All you'd need is a flash cartridge and a USB linker or cartridge flasher, which should be fairly cheap these days for a reasonably sized one. After all, you just want to run homebrew, so the 64-128 megabit carts that aren't desirable to pirates are cheap, and they are more than adequate for homebrew and most demos.

There is also VisualBoyAdvance, which is basically a perfect emulator with all kinds of debugging tools built in that let you view CPU state, all of memory, and I/O port state, and it even has special viewers that show you the actual tiles and sprites at the appropriate locations rather than just raw hex.

But it still feels special to run your code on real hardware... and all sorts of bugs come up that don't show up in the emulator. For example, a bug I ran into when first running on live hardware involved doing successive DMA transfers back to back: initializing the palette, sprite tables, and BG tables one after the other using the same ROM->I/O DMA channel.

In the emulator it was fine, never an issue, and that is how I grew accustomed to doing things... until none of my demos ran on real hardware. I eventually tracked down the problem: it takes about 1 ARM cycle after the store that activates the DMA transfer for the DMAC to latch all your values in the I/O registers (source, dest, count, flags, etc.) into the DMA channel's internal state and begin the transfer... and my successive use of the same channel was basically clobbering the source and dest registers with the next transfer before that happened. I just had to go back through everything and add a NOP or two between successive uses of a DMA channel... but it took a while to figure out why it worked fine on an emulator but left me with a blank screen on real hardware...

What made it hard is that I knew that, with no cache, back-to-back DMA was perfectly OK without polling completion status before starting another: as soon as the write to the "DMA START" register happens, the CPU, having no cache, stalls automatically on the following instruction fetch due to the "bus busy" assertion once the DMAC takes control of the bus, which basically ensures that the next DMA setup can't occur until the previous transfer finishes. So I didn't even think to question putting a delay between starting a DMA and writing the next source/dest on the very next instruction... the one-cycle latch thing was a completely different issue that I was never aware of...
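For anyone curious, here's roughly what that pattern looks like in C with the delay worked in. Register names follow common gbadev header conventions, and the exact workaround I used back then may have differed slightly:

#include <stdint.h>

/* DMA channel 3 registers, as usually defined in gbadev headers. */
#define REG_DMA3SAD (*(volatile uint32_t *)0x040000D4)  /* source address   */
#define REG_DMA3DAD (*(volatile uint32_t *)0x040000D8)  /* dest address     */
#define REG_DMA3CNT (*(volatile uint32_t *)0x040000DC)  /* count + control  */
#define DMA_ENABLE  0x80000000u
#define DMA_32      0x04000000u                          /* 32-bit transfers */

static void dma3_copy(const void *src, void *dst, uint32_t words)
{
    REG_DMA3SAD = (uint32_t)src;
    REG_DMA3DAD = (uint32_t)dst;
    REG_DMA3CNT = DMA_ENABLE | DMA_32 | words;  /* this store kicks it off */

    /* The DMAC needs a moment after the kick-off store to latch SAD/DAD/CNT
     * before it owns the bus; reuse the channel immediately and you can
     * clobber them. A short delay before the next setup avoids that. */
    __asm__ volatile ("nop\n\tnop");
}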

I love this stuff, if you couldn't tell ;)