Viewing the Code behind Java's Predefined Elements

chrstrbrts

Senior member
Aug 12, 2014
522
3
81
Hello,

I'm new to computers and programming, and I've decided to start with Java.

I'm done with syntax (easy enough) and am now just learning about certain special packages like io, lang, etc.

However, I'm beginning to think that I made a mistake with starting with Java.

I don't really understand how to interact with the machine; all I do is use predefined methods to do my dirty work for me.

So, I'm wondering if I can see the code that the developers used to build the predefined elements of the language.

For example, for a simple method call like System.out.print("string goes here"); how did the developers tell the computer to print that string to the command window?

What does that code really look like?

Thanks.
 

purbeast0

No Lifer
Sep 13, 2001
53,639
6,522
126
and just fyi, NOT using libraries that already do what you want to do and choosing to re-write is a terrible idea. you WANT to use these java libraries that work and are maintained and have been written by the guys who make the language. writing your own "print" function would be 100% a waste of time.

in programming there is absolutely no reason to reinvent the wheel, unless for learning purposes.
 

Leros

Lifer
Jul 11, 2004
21,867
7
81
For example, for a simple method call like System.out.print("string goes here"); how did the developers tell the computer to print that string to the command window?

In really old computers, you would write ASCII characters directly to the monitor. You would do this by writing characters to a special memory address that was mapped to the I/O hardware.

Say this address was 0x0300, your code could look something like:

Code:
volatile char *screen = (volatile char *)0x0300;  /* memory-mapped I/O */
*screen = 'h';
*screen = 'e';
*screen = 'l';
*screen = 'l';
*screen = 'o';

Of course, you would write a function to take a string and loop through the characters to write each one to the monitor.

Terminal windows work roughly the same way. If you're writing native code, you can actually write code somewhat similar to what I have above. When you're using libraries, there are going to be several layers of abstraction that make it difficult to follow the code from the print() call to the actual memory write.
 
Last edited:

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
In really old computers, you would write ASCII characters directly to the monitor. You would do this by writing characters to a special memory address that was mapped to the I/O hardware.

Say this address was 0x0300, your code could look something like:

Code:
volatile char *screen = (volatile char *)0x0300;  /* memory-mapped I/O */
*screen = 'h';
*screen = 'e';
*screen = 'l';
*screen = 'l';
*screen = 'o';
Of course, you would write a function to take a string and loop through the characters to write each one to the monitor.

Terminal windows work roughly the same way. If you're writing native code, you can actually write code somewhat similar to what I have above. When you're using libraries, there are going to be several layers of abstraction that make it difficult to follow the code from the print() call to the actual memory write.

Wouldn't you increment each address by one byte?
 

Leros

Lifer
Jul 11, 2004
21,867
7
81
Wouldn't you increment each address by one byte?

Nope. You can think of writing to that address like an API. Writing to that address means "append this character to the screen".

Every time you write to that address, the IO hardware kicks in, and sends the character to the monitor.
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
Nope. You can think of writing to that address like an API. Writing to that address means "append this character to the screen".

Every time you write to that address, the IO hardware kicks in, and sends the character to the monitor.

Oh I see. Man that is primitive lol.
 

chrstrbrts

Senior member
Aug 12, 2014
522
3
81
In really old computers, you would write ASCII characters directly to the monitor. You would do this by writing characters to a special memory address that was mapped to the I/O hardware.

Say this address was 0x0300, your code could look something like:

Code:
volatile char *screen = (volatile char *)0x0300;  /* memory-mapped I/O */
*screen = 'h';
*screen = 'e';
*screen = 'l';
*screen = 'l';
*screen = 'o';

Of course, you would write a function to take a string and loop through the characters to write each one to the monitor.

Terminal windows work roughly the same way. If you're writing native code, you can actually write code somewhat similar to what I have above. When you're using libraries, there are going to be several layers of abstraction that make it difficult to follow the code from the print() call to the actual memory write.

OK. You're digging deeper here, but there's still a level of abstraction in your example.

Let's go all the way.

As we all know, the screen emits frequencies of light from light emitting elements called pixels.

When all the pixels are put together, we get a picture or a string, etc.

All the screen knows is what light to emit from each pixel.

So, if I wanted to print a string to the screen, at the bottom level, I would have to tell each pixel in some area of the screen to display a color that stands out from the color emitted by the pixels around it, in such a way that a pattern emerges that matches letters from our alphabet that form a sentence.

How would you code that?

I mean, you could encode each color digitally, store the bits in a memory location for each pixel, then send the whole shebang to the screen so that it can redraw itself.
 
Last edited:

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,698
4,658
75
Some of what you're talking about is, at least sometimes, implemented in hardware. I can only speak from memory about Commodore computers, but there was a predefined section of RAM that mapped its bits directly to the screen. You could PEEK and POKE pixels to your heart's content, and the hardware would push those pixels onto the "monitor" (really a CRT television). There were two modes, monochrome 320x200 and 4-color 160x200, though each 8x8 character cell could use its own pair of colors, and there were also hardware sprites.

If you want to know about font rasterization, though, these days that's software.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
OK. You're digging deeper here, but there's still a level of abstraction in your example.

Let's go all the way.

As we all know, the screen emits frequencies of light from light emitting elements called pixels.

When all the pixels are put together, we get a picture or a string, etc.

All the screen knows is what light to emit from each pixel.

So, if I wanted to print a string to the screen, at the bottom level, I would have to tell each pixel in some area of the screen to display a color that stands out from the color emitted by the pixels around it, in such a way that a pattern emerges that matches letters from our alphabet that form a sentence.

How would you code that?

I mean, you could encode each color digitally, store the bits in a memory location for each pixel, then send the whole shebang to the screen so that it can redraw itself.

Heh, now you are talking about driver development. :)

The first thing to realize is that code doesn't do everything. You can only program what is made to be programmed (we will ignore HDLs for now). Most of the signal-timing and color-generation stuff is built right into the chip rather than exposed through any sort of programmable interface (well, at least it was in the VGA days; I don't know enough about modern video stuff to comment on the programmability of the DVI and HDMI chips).

Generally, there is a chunk of memory with just enough bytes to represent the number of pixels on the screen. The chip that communicates with the monitor goes through that array of memory and simply spits out whatever value is stored in it. When talking about VGA communication, it does this in the most bizarre way possible, by varying the analog voltage levels on the connection between the video output and the monitor. For modern digital standards it is literally just spitting out the contents of the memory and letting the monitor on the other end handle turning it into something that can be displayed. For LCDs this is great and much more straightforward than making a VGA reader; for CRTs it is more unnatural (VGA was made the way it is because of how CRTs operate mechanically).

Now how do you write code which changes the memory to change the color of the pixel at a given point on the screen? Well, you have to write the driver for the video card to do that, each device is going to have a vastly different setup. The operating system defines a standard interface for all video card driver writers to conform to (In fact, there is even a standard Driver interface the firmware is supposed to deal with... but we will ignore that for now) and that is what programmers eventually program against.

In some/many operating systems, it isn't possible for a userland application to modify the colors at any point on the screen. Rather, the user's application is only allowed to modify the colors of the screen real estate given to it by the operating system. It modifies what it wants on an x/y axis, and the OS in turn puts that color where it needs to be on the screen relative to where the window exists.

Now, because operating systems differ vastly in the way they allow userland apps to interact with their windows, we have yet another layer (or two) that is usually added to help programmers write software that can run in multiple places. Those come in the form of SDL for managing windows, input devices, and the OS event system, or OpenGL for standard 3D rendering interactions.

Now where does Java fit in? It is generally on top of all of that. Java has a standard interaction with the windowing system (Swing, JavaFX) that it allows programmers to program against; it is up to the people who implement the Java platform for the various OSes to make sure those features reliably translate into window creation, pixel mutation, etc. for the programmer.

In short, it is a big complex ball of wax that gets ever more complex as time goes on. Today, it is a non-trivial task to write a program which can change the color of any pixel on the screen (without having an OS, drivers, etc, which allow you to do just that).
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
Some of what you're talking about is, at least sometimes, implemented in hardware. I can only speak from memory about Commodore computers, but there was a predefined section of RAM that mapped its bits directly to the screen. You could PEEK and POKE pixels to your heart's content, and the hardware would push those pixels onto the "monitor" (really a CRT television). There were two modes, monochrome 320x200 and 4-color 160x200, though each 8x8 character cell could use its own pair of colors, and there were also hardware sprites.

If you want to know about font rasterization, though, these days that's software.

Eh? I was under the impression that font rasterization was quickly moving into the hardware domain. In fact, that was supposed to be a major selling point for 64-bit Chrome and IE, that they do the rasterization in GPU hardware rather than in software (though it might be some sort of complex shader, which is still software, just running on hardware optimized for it).
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,698
4,658
75
I might be behind the times. :$ I was discussing Commodore hardware, after all. ;)
 

disappoint

Lifer
Dec 7, 2009
10,132
382
126
Some of what you're talking about is, at least sometimes, implemented in hardware. I can only speak from memory about Commodore computers, but there was a predefined section of RAM that mapped its bits directly to the screen. You could PEEK and POKE pixels to your heart's content, and the hardware would push those pixels onto the "monitor" (really a CRT television). There were two modes, monochrome 320x200 and 4-color 160x200, though each 8x8 character cell could use its own pair of colors, and there were also hardware sprites.

If you want to know about font rasterization, though, these days that's software.

Wow those resolutions are a lot lower than I thought they were.