Natural Language Programming


Natural Language Programming: Good Idea?

  • Yes

  • Maybe

  • No



Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
I don't think it's a good idea, if only because of the word order. Commanding a child or an animal or another person is not the same as programming.

But we, at least, wish it were. We want to talk to our computers the way Dave talks to HAL in "2001: A Space Odyssey". And we see our work here as a possible step in that direction.

If you want to program a door opening, it makes a difference to a compiler whether you write "open door", "door open", or "open the door"; the same goes for "shut door", "shut the door", or "close door".

But why should it be different, if the intent of all those statements is the same?

It would be possible, but then the programmer would have to remember all sorts of exact commands, which would be hard to sustain given the way we use our natural language; so it's easier for both compiler and programmer to remember door_open=1 or door_open=0 and use the command accordingly.

On the contrary. If the compiler accepts "Open the door" and "Open door" and "Open the freaking door" as equivalent thoughts, the programmer doesn't have to remember which one to use -- any such expression will do.
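
To make that concrete, here's a toy sketch (in Python, purely illustrative -- this is not how our compiler is actually built) of one way a parser can collapse such variants into a single request: drop the articles and any unrecognized modifiers, then match what remains against the known actions.

# Hypothetical sketch: reduce a command to its known content words, so
# "Open the door", "Open door", and "Open the freaking door" all
# dispatch to the same routine. The vocabulary here is invented.

KNOWN_WORDS = {"open", "shut", "door"}

def normalize(command):
    words = command.lower().rstrip(".!").split()
    return tuple(w for w in words if w in KNOWN_WORDS)

ACTIONS = {
    ("open", "door"): lambda: print("door opened"),
    ("shut", "door"): lambda: print("door shut"),
}

for phrase in ("Open the door", "Open door", "Open the freaking door"):
    ACTIONS[normalize(phrase)]()    # all three print "door opened"

The point isn't these few lines; it's that the programmer never has to remember which variant the library author happened to choose.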

Anyway, with natural language virtually anyone could be a programmer without hard effort, and I don't agree with this: if I have to work hard to become a doctor, lawyer, or firefighter, why not a programmer?

It sounds like you're promoting hard work for the mere sake of hard work. Curious. I teach my kids, as a general rule, "If it's hard, you're doing it the wrong way." But that aside, natural programming languages won't magically make everyone a good or even a competent programmer. After all, everyone speaks a natural language, but relatively few become good novelists, poets, etc.
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
Forgive my ignorance, but I thought that was what I did. I provided a link to a zip file, on our site, that contains the instructions, the source code, and the executable.

You provided a direct link to a zip file. If you want to edit the OP to contain a link _to a web page_ on your own site, and have the file link hosted _on that page_ then I will allow it.
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
But we, at least, wish it were. We want to talk to our computers the way Dave talks to HAL in "2001: A Space Odyssey". And we see our work here as a possible step in that direction.

The fundamental issue with this dream is that it is misguided. Programs are not complex and difficult to write because of syntax, or the inability to parse natural language. If you make it possible for computers to be programmed using natural language then you will simply have invented a new syntax, even if that syntax is the entire language. Programs are difficult because it is intellectually hard to: a) understand all the myriad states and rules for transitions between states in a real, non-trivial system; and b) decompose the necessary processes into discrete and unambiguous steps that can be followed by a machine. I don't care what language you use. If you want me to believe you've enabled natural language programming then post some actual non-trivial examples of working code.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
You provided a direct link to a zip file. If you want to edit the OP to contain a link _to a web page_ on your own site, and have the file link hosted _on that page_ then I will allow it.

I still don't see the difference, except that you're making work for me and forcing me to put a link on my site that (for reasons explained earlier) I don't want there. How about people just write me directly if they want the link: gerry.rzeppa@pobox.com . That okay with you?
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
I still don't see the difference, except that you're making work for me and forcing me to put a link on my site that (for reasons explained earlier) I don't want there. How about people just write me directly if they want the link: gerry.rzeppa@pobox.com . That okay with you?

Whether you see the difference or not is immaterial. It's not my function, nor the function of this forum, to make more or less work for you, or to assist you in promoting whatever it is you want to promote. What you do with your email address is your business.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
The fundamental issue with this dream is that it is misguided. Programs are not complex and difficult to write because of syntax, or the inability to parse natural language.

Syntax is not the whole problem, I agree. But obscure and unintuitive syntax can make learning to program more difficult than it needs to be. Why should a child, aspiring to one day become a professional programmer, have to learn a syntax like "someLibrary.ClrScr();" when, in accord with his other reading and writing classes, he could simply say, "Clear the screen"? And why should a full-grown professional, who converses and writes in plain English every day, have to do otherwise?

If you make it possible for computers to be programmed using natural language then you will simply have invented a new syntax, even if that syntax is the entire language.

But that's significant. You can think of it as the last programming language because once computers can speak and understand as we do, all the others will quickly become obsolete.

Programs are difficult because it is intellectually hard to: a) understand all the myriad states and rules for transitions between states in a real, non-trivial system; and b) decompose the necessary processes into discrete and unambiguous steps that can be followed by a machine.

Boy, I'm glad I didn't run into you when I was learning to program! Nothing is hard if you (1) have the necessary prerequisites, and (2) learn the new stuff one tiny step at a time. I know this from experience, both my own, and in teaching literally thousands of others, young and old alike.

I don't care what language you use. If you want me to believe you've enabled natural language programming then post some actual non-trivial examples of working code.

Okay, here's my email address; write me for a link to the whole shebang, including instructions and source code for the complete development system (desktop, file manager, editor, dumper, native-code-generating compiler/linker, and wysiwyg page layout facility):

gerry.rzeppa@pobox.com

I really can't imagine a better example of non-trivial working Plain English code.
 

MagnusTheBrewer

IN MEMORIAM
Jun 19, 2004
And keep in mind that every type you define and every routine you code increases the compiler's vocabulary and skill set. So the compiler's ability to understand what you say increases more and more as you "get to know one another".


This is inherently false. The user is merely writing another library. There is no getting to know one another. In the previous example I referred to a font name but did not specify it was a font. A human would infer it, but no computer or software would. You're simply creating another language, which is misleading because of a superficial resemblance to English.
 

DaveSimmons

Elite Member
Aug 12, 2001
Most disciplines use jargon because it works better for those that understand it. You can communicate concepts in single words instead of sentences, and the definition of those words is more precise.

This approach might possibly help those unwilling to learn the jargon for this discipline to get their point across, more or less, but I believe it will only make work harder for trained professionals.

x += 3 ;

vs.

take x and add 3 to itself

or

if ((x % 9) == 0)
{
    function1 ( x ) ;
    function2 ( x + y, puffin, otter ) ;
}
x++ ;

vs.

if x modulo 9 is now zero then
begin
run function one with x
run function two with the sum of x and y, also puffin also otter
end
add 1 to x
 

Markbnj

Elite Member, Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
Boy, I'm glad I didn't run into you when I was learning to program! Nothing is hard if you (1) have the necessary prerequisites, and (2) learn the new stuff one tiny step at a time. I know this from experience, both my own, and in teaching literally thousands of others, young and old alike.

For someone who claims an insight into natural language processing you seem to have a remarkably difficult time parsing it yourself. This whole thread reads like one long snake-oil pitch and I'm a hair away from just locking it and moving on.

You've posted nothing credible, and seem to be primarily interested in getting people to download your .zip file, by any means. I've said this at least twice now, in different ways, but I'll give it one more shot: if you want to be taken seriously then publish a white paper, release some source code, discuss your methods, do something to indicate that there is some substance behind your claims.

If you continue to pursue your current approach of trying to distribute the binaries without providing any detailed insight into what is in them then this thread is going to be locked.
 

Sheep221

Golden Member
Oct 28, 2012
But we, at least, wish it were. We want to talk to our computers the way Dave talks to HAL in "2001: A Space Odyssey". And we see our work here as a possible step in that direction.

But why should it be different, if the intent of all those statements is the same?

On the contrary. If the compiler accepts "Open the door" and "Open door" and "Open the freaking door" as equivalent thoughts, the programmer doesn't have to remember which one to use -- any such expression will do.

It sounds like you're promoting hard work for the mere sake of hard work. Curious. I teach my kids, as a general rule, "If it's hard, you're doing it the wrong way." But that aside, natural programming languages won't magically make everyone a good or even a competent programmer. After all, everyone speaks a natural language, but relatively few become good novelists, poets, etc.
How do you know the compiler knows that all of these have the same intent? That's the problem: computers don't think.
It would be objectively much better to achieve what you want via voice or text recognition, with the spoken words translated to regular syntax before compiling the program. But it's still not really a good idea to promote this: natural language is mostly used for ordinary communication, and programming any math, database, or other complex structure would take so long to describe in natural language that it's not reasonable to program in it even if the machine could do it. It's quite different when you're doing something and focused on a task versus when you're talking about something. The thoughts that generate communication are different from the ones that generate code.
When working on something you don't think of it as speech; you do the task automatically. You are doing it, not talking about it.

In retrospect, technical design has since its earliest days relied on various shorthand and customized languages, be it drawings, schematics, or program code; it has always been like this and will remain so for a while.

One more thing: don't teach your kids that perfectionist "if it's hard you are doing it wrong" stuff. Quite a lot of things are hard to work on even if you understand them well. Such is life; your kids will soon start to live in the illusion that they can do everything with little or no effort.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
This is inherently false. The user is merely writing another library. There is no getting to know one another.

Ah, but there is. Case in point. An earlier version of the compiler had no "composite" string writing capabilities; that is, to write a string on the screen you had to first set the font, then write the string, like so:

Select the Osmosian font. Draw the string.

My son wrote those routines. So when I tried to say,

Draw the string with the Osmosian font.

it didn't work. Now at this point I had two choices: I could either "get to know the compiler" by learning its way of saying what I wanted to say, or I could let the compiler "get to know me" by teaching it how I say things. I chose the latter, and coded up a routine starting:

To draw a string with a font:

And now the compiler -- knowing me a little better -- properly processes my requests. But it still works my son's way, as well. So we could say that the compiler has gotten to know my son (a little), and me (a little), and we've both gotten to know the compiler (a lot). And when my son, without forethought, attempts to draw a string my way, he'll be getting to know the compiler even more. And be pleasantly surprised at how accommodating it has become!

Note the paradigm change here. In traditional languages, the programmer would be encouraged to code up one and only one way to perform a certain task; in our world, programmers are encouraged to teach the compiler anything and everything they can, so that whatever the next programmer tries to do, there's a greater probability that the compiler will understand his way of expressing himself as well.

Of course, some of this is automatic, and more can be made automatic (but we haven't had time to implement those changes yet). For example, the compiler should automatically accept a reversal of equivalent clauses without additional code being added: both of the statements below should call the same routine:

Draw the string with the Osmosian font.
With the Osmosian font, draw the string.
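
For the technically curious, here's a rough sketch in Python of the mechanism as described (illustrative only -- the names are invented, and our real compiler is written in Plain English itself): every routine header the programmer writes becomes another sentence pattern the compiler can match, and normalizations like clause reversal can be applied before matching.

# Hypothetical sketch: each routine header teaches the compiler a new
# sentence pattern; a leading "With ..." clause is moved to the end so
# reversed clauses match the same routine.

routines = {}

def teach(header, body):
    routines[header] = body    # e.g. "draw a string with a font"

def call(sentence):
    s = sentence.lower().rstrip(".")
    if s.startswith("with ") and "," in s:    # reversed clause order
        clause, rest = s.split(",", 1)
        s = rest.strip() + " " + clause
    # treat "a"/"an"/"the" as interchangeable when matching
    key = " ".join("a" if w in ("an", "the") else w for w in s.split())
    routines[key]()

teach("select a font", lambda: print("font selected"))    # my son's way,
teach("draw a string", lambda: print("string drawn"))     # two steps
teach("draw a string with a font",
      lambda: print("string drawn with font"))            # my way

call("Draw the string with the font.")     # matches my routine
call("With the font, draw the string.")    # reversed; same routine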


It's merely a prototype at this stage, but it shows great promise for the future.

In the previous example I referred to a font name but did not specify it was a font. A human would infer it, but no computer or software would.

At the moment our compiler is a little too strongly "typed" to do what you're asking it to do. But to say "no computer or software would" is not true. It will be relatively easy for us to modify the thing to satisfy your request, and it's on our list to do so. And we already do it, with quite a bit of finesse, in other similar areas: any routine that accepts numeric parameters will be passed values in the proper unit of measure, even if what the programmer specifies is different from the routine definition. For example, the routine:

To wait for some milliseconds:

Can be called in any of the following ways, and many more:

Wait for 300. [milliseconds assumed]
Wait for 1 second.
Wait for 3 hours.


And any variable can be referred to by its various "nicknames", for example "the left anchor point" can also be referred to as "the left anchor" or simply "the left".
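
For the curious, here's a little sketch in Python of the unit handling just described (purely illustrative, with invented names -- not our actual code): the caller's unit is converted to the unit the routine header asks for, and a bare number simply gets the header's unit.

# Hypothetical sketch: callers may say "Wait for 1 second" or "Wait
# for 3 hours"; the value is converted to the milliseconds that
# "To wait for some milliseconds:" expects.

MS_PER = {"millisecond": 1, "second": 1000,
          "minute": 60_000, "hour": 3_600_000}

def wait_for(amount, unit="millisecond"):    # bare number: ms assumed
    ms = amount * MS_PER[unit.rstrip("s")]   # "hours" -> "hour"
    print("waiting", ms, "milliseconds")

wait_for(300)             # Wait for 300.
wait_for(1, "second")     # Wait for 1 second.
wait_for(3, "hours")      # Wait for 3 hours.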

You're simply creating another language, which is misleading because of a superficial resemblance to English.

No, we've created a prototype in the interest of answering the three questions at the top of this thread; and we're suggesting that, based on our experience with the prototype, the subject deserves both further study and further development.

And it doesn't "superficially resemble" English; it is English. You can spell-check our code, and even run it through a grammar checker or a Flesch-Kincaid reading level evaluator. It's a subset of the full language, to be sure, but it's real English, interpreted by the compiler to mean what humans would naturally think it means.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
Most disciplines use jargon because it works better for those that understand it. You can communicate concepts in single words instead of sentences, and the definition of those words is more precise.

This approach might possibly help those unwilling to learn the jargon for this discipline to get their point across, more or less, but I believe it will only make work harder for trained professionals.
Allow me to quote from my original post:

At the lowest level, things look like this:

To add a number to another number:
Intel $8B85080000008B008B9D0C0000000103.

Note that in this case we have both the highest and lowest of languages -- English and machine code (in hexadecimal) -- in a single routine. The insight here is that (like a typical math book) a program should be written primarily in a natural language, with appropriate snippets in more convenient syntaxes as (and only as) required.
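
To make the layering concrete, here's a sketch in Python (illustrative only; our compiler doesn't work this way internally) of a routine table in which most bodies are further sentences and only the leaves are native code. The x86 decoding in the comments is my own reading of those bytes.

# Hypothetical sketch: a routine table where high-level bodies are more
# English sentences, and only the lowest-level leaves are machine code.

routines = {
    # Lowest level: the body is native code (the hex from above).
    # Decoded as 32-bit x86 it appears to read:
    #   mov eax, [ebp+8]     ; pointer to the first number
    #   mov eax, [eax]       ; fetch its value
    #   mov ebx, [ebp+12]    ; pointer to the second number
    #   add [ebx], eax       ; add the first into the second
    "add a number to another number":
        bytes.fromhex("8B85080000008B008B9D0C0000000103"),
    # Higher level: the body is simply more sentences.
    "double a number": ["add the number to the number"],
}

for name, body in routines.items():
    kind = "native code" if isinstance(body, bytes) else "sentences"
    print(name, "->", kind)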

It's remarkable how little of most programs involves math, and yet most programming languages are based on mathematical syntaxes. (Less than 15% of our entire development system is arithmetic; the rest is "do this, do that" and "move this over there" and "write (or draw) this on the screen".) We've found that natural language is more appropriate for the bulk of the statements in most programs.

But let's compare four ways of saying even a mathematical something:

1. Add 1 to the counter.
2. counter=counter+1;
3. counter++;
4. Bump the counter.


The "C" method, #3, is obviously the most concise, even with the lengthy variable name, and I would be foolish to argue otherwise. But what is the thought in your mind just before you write that line of code? Is it not something like, "Bump the counter"? And when you're asked to explain this code to someone else, don't you say something like, "This is where we bump the counter"? And when someone else is reading your code, don't they mumble to themselves something like, "Okay, here's where he bumps the counter"? So the trade off, even in a mathematical situation like this is not just between number of keystrokes, but between #3 (two modes of thought, a translation, and the possible need for a comment), versus #4 (a single mode of thought, no translation, and no need for a comment).

Some, I know, will prefer the one to the other; but our compiler will eventually support them all fully, as it does in part already, as described above. The question is whether the rest of the world's compilers will ever support the rest of us -- the vast majority of humans who simply want to talk to their machines.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
For someone who claims an insight into natural language processing you seem to have a remarkably difficult time parsing it yourself. This whole thread reads like one long snake-oil pitch and I'm a hair away from just locking it and moving on.
It's your forum, do as you think will best serve your members.

You've posted nothing credible...
I have to take exception to that; the thing works.

...and seem to be primarily interested in getting people to download your .zip file, by any means.
Only because that's where the theory is proved.

I've said this at least twice now, in different ways, but I'll give it one more shot: if you want to be taken seriously then publish a white paper, release some source code, discuss your methods, do something to indicate that there is some substance behind your claims.
The kind of stuff I'd write in a white paper is included in the instructions that are packaged with the software, as is the source code, and quite a bit of discussion regarding our methods (a lot of which I'm repeating here in answer to other members' questions). The proof that there's substance to our claims is the program itself and the associated source code and documentation.

If you continue to pursue your current approach of trying to distribute the binaries without providing any detailed insight into what is in them then this thread is going to be locked.
Exactly what kind of "detailed insight" would you like? Let me know and I'll put it in my next reply.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
How do you know the compiler knows that all of these have the same intent? That's the problem: computers don't think.
It would be objectively much better to achieve what you want via voice or text recognition, with the spoken words translated to regular syntax before compiling the program.
Actually the compiler does pretty much what you're asking. If you hook up that "Dragon Naturally Speaking" product to your computer, you can speak Plain English into our editor and the compiler will convert that to a regular programming language syntax (specifically, Intel assembly language).

But it's still not really a good idea to promote this: natural language is mostly used for ordinary communication, and programming any math, database, or other complex structure would take so long to describe in natural language that it's not reasonable to program in it even if the machine could do it.
And yet we were able to conveniently and precisely describe an entire integrated development environment in English.

It's quite different when you're doing something and focused on a task versus when you're talking about something. The thoughts that generate communication are different from the ones that generate code.
Spoken like someone who doesn't have a lot of experience programming in a natural language.

When working on something you don't think of it as speech; you do the task automatically. You are doing it, not talking about it.
Perhaps that's true for you; I can't get into your head and see what's happening. But I know what's going on in my head, and I do think in English about what I'm going to type, whatever the target language happens to be.

In retrospect, technical design has since its earliest days relied on various shorthand and customized languages, be it drawings, schematics, or program code; it has always been like this and will remain so for a while.
We're not arguing that English is the best way to express every possible thought. A picture, sometimes, is worth a thousand words; and an equation is sometimes worth a lot of words, as well. What we're proposing is a natural language framework that "drops down" to specific, specialized syntaxes as necessary. Like a typical math book that is full of plain English and interspersed formulas. Or like the routine in my initial post, English and machine code:

To add a number to another number:
Intel $8B85080000008B008B9D0C0000000103.


One more thing: don't teach your kids that perfectionist "if it's hard you are doing it wrong" stuff. Quite a lot of things are hard to work on even if you understand them well. Such is life; your kids will soon start to live in the illusion that they can do everything with little or no effort.
Thanks for the advice. I have three kids; a son, 36, who is a very successful programmer with his own company; a daughter, 34, who manages a warehouse for Amazon; and a son of our old age, 8, who is now learning what I taught the others (improved version). None of us have ever done anything hard except when we were doing it at the wrong time, or in the wrong way, or for the wrong reasons. As the Big Guy once said, "My yoke is easy, and my burden light..."
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
He meant the source code INSTEAD of the executable. You did not post the source code of the executable. You posted a bunch of random text files that your executable presumably parses and then does something with.

By "random text files" do you mean the six text files that are in the zipped file? Specifically:

the desktop
the finder
the editor
the compiler
the writer
the noodle

Those are the source code for the project. The executable allows you to view, edit, and compile those files. In fact, the executable was created from those very files, compiled in the previous version of the compiler.
 

Cerb

Elite Member
Aug 26, 2000
Specifically, we wanted to know:

1. Is it easier to program when you don’t have to translate your natural-language thoughts into an alternate syntax?
Whatever these "natural language thoughts" are, they are not, to my knowledge, expressible in any form that may be typed in any characters of any known language which we might use to communicate with each other, so I find the question rather vague, which is fitting with the overall discussion. Frankly, I find converting my thoughts into English more difficult than converting those thoughts into the syntax of a decent programming language; I simply have much more practice at the former, and its vagueness allows for greater brevity.

2. Can natural languages be parsed in a relatively “sloppy” manner (as humans apparently parse them) and still provide a stable enough environment for productive programming?
Maybe. But, how do you prove that the program is performing the expected actions, without making it less sloppy, and rather COBOL-like? At that point, the exercise becomes rather silly, because the code gets harder to read due to its verbosity.

3. Can low-level programs (like compilers) be conveniently and efficiently written in high level languages (like English)?
Doubtful. What attempts to mimic natural language have not had problems due specifically to doing so? The sample code from the website reads very much like COBOL, or procedural SQL.

A machine code example for adding a number is a poor argument as well, as even assembly programmers don't use machine code, but assembly language. Most of the time, it's simply, "a + b," in any language.
 

Crusty

Lifer
Sep 30, 2001
Ok, I decided to give your stuff a shot. So I went to your homepage, and started to read the manifesto.

I closed it after the title page, for one reason. The font.

If you want to be taken seriously you need to present yourself in a professional manner, instead of your current approach.

More specifically, people want to see proof before they just blindly go and run your program.

Put some screenshots on your website or maybe a couple of screencasts demonstrating the product in action. How about some example source code that I can view directly on your webpage? What about an interactive demo? All of those are tangible evidence of your claims, a zip file with unknown binaries and text files is not.
 

Sheep221

Golden Member
Oct 28, 2012
On internal speech: when you drive, do you talk to yourself ("hmm, now I press the clutch and shift into 1st gear, give it some gas, now 2nd gear, release the clutch," etc.), or do you just hop in the car, thinking about your friends or wife or whatever, and simply start it and drive?
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
Whatever these "natural language thoughts" are, they are not, to my knowledge, expressible in any form that may be typed in any characters of any known language which we might use to communicate with each other, so I find the question rather vague, which is fitting with the overall discussion.
Examples of "natural language thoughts" that I find in my head when I'm programming are things like "bump the counter" and "refresh the screen" and "track the mouse" and "show the menu bar". I can use such expressions as code with our compiler; with any other language, further translation is required.

Frankly, I find converting my thoughts into English more difficult than converting those thoughts into the syntax of a decent programming language; I simply have much more practice at the former, and its vagueness allows for greater brevity.
I'm not quite sure what you're saying there -- you say you "have much more practice at the former [presumably 'converting your thoughts into English'] and that its [English's] vagueness allows for greater brevity", and yet you claim that it is more difficult to convert your thoughts into English than into a "decent programming syntax." I submit that you convert your thoughts into English, most of the time, with very little conscious effort at all. That you may have become multilingual enough to also "think" in alternative syntaxes is not surprising; any person who is fluent in more than one natural language can do the same. But they typically prefer one language over another. If you're content thinking in C++ or Ruby or Python or whatever, good for you. Those languages don't appeal to us.

how do you prove that the program is performing the expected actions, without making it less sloppy, and rather COBOL-like? At that point, the exercise becomes rather silly, because the code gets harder to read due to its verbosity.
I don't personally find it harder to read, but then I'm a good reader in general. I imagine there are people who are both mathematically inclined and not necessarily good readers who would naturally prefer a more mathematical syntax.

As I've mentioned elsewhere in this thread, we believe the ideal compiler will allow programmers to write programs like mathematicians write math books: snippets of specialized syntax (and even diagrams) embedded in a natural language narrative. See, for example, Einstein's original (translated) paper on relativity:

http://www.gutenberg.org/files/30155/30155-pdf.pdf

Our position is that it's a trivial matter to extend our compiler to support such snippets, and even graphics of various kinds, in any number of syntaxes; but it's a very big job to extend any other compiler to support the natural language framework that our compiler already supports. In short, computer science's mathematical roots have flipped the whole thing upside down; if the first programmers had been linguists rather than mathematicians, we'd be farther along the road today.

What attempts to mimic natural language have not had problems due specifically to doing so? The sample code from the website reads very much like COBOL, or procedural SQL.
Yes it does read like COBOL, because COBOL was the first successful attempt in this direction. Which is why COBOL is one of the world's most stable and widely used languages to this day.

A machine code example for adding a number is a poor argument as well, as even assembly programmers don't use machine code, but assembly language. Most of the time, it's simply, "a + b," in any language.
The example was intended to show the limits at either end of the spectrum; after all, if you can handle those, handling the intermediates is obviously feasible.

Here are a couple of excellent articles regarding natural language programming:

http://blog.wolfram.com/2010/11/16/programming-with-natural-language-is-actually-going-to-work/

http://inform7.com/learn/documents/WhitePaper.pdf
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
Ok, I decided to give your stuff a shot. So I went to your homepage, and started to read the manifesto. I closed it after the title page, for one reason. The font.
The font is actually a digitization of my son's printing. We wanted a non-copyrighted font that we could embed in our program so that we'd be sure it was available on the user's machine. And since a great deal of our work is educating children, we thought it put a "friendly face" on the system. But surely you shouldn't let the mere look of a thing stop you so short! Does Einstein's haircut make you discard his physics?

If you want to be taken seriously you need to present yourself in a professional manner, instead of your current approach.
Oh, goodness! Did Steve Jobs look professional when he went to work barefoot? The photo you see under my name at the right is really me. I gave up wearing suits and ties (and shaving!) and attempting to impress people with my "professionalism" decades ago. See www.era-sql.com for my first company, founded in 1980. Made millions with that course, and licensed it to every major database vendor you can think of: Microsoft, Oracle, Teradata, etc. Those folks took me seriously. And I wasn't wearing a tie. And they still take me seriously, when they call to have me consult. And I still don't wear a tie.

More specifically, people want to see proof before they just blindly go and run your program.
No, actually most people simply download the thing, read the instructions, and -- undeterred by the font! -- start coding. Hundreds of them. You're the exception in this case, not the rule.

Put some screenshots on your website or maybe a couple of screencasts demonstrating the product in action. How about some example source code that I can view directly on your webpage? What about an interactive demo? All of those are tangible evidence of your claims, a zip file with unknown binaries and text files is not.
I'm sorry, but I really don't see the problem here. Download the thing. Check it with your virus detector. Read the instructions (a PDF). Look at the source in notepad (they're plain text). Then ask yourself -- if a smart, rich guy like Gerry wanted to spread a virus, is this the way he would go about doing it?
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
On internal speech: when you drive, do you talk to yourself ("hmm, now I press the clutch and shift into 1st gear, give it some gas, now 2nd gear, release the clutch," etc.), or do you just hop in the car, thinking about your friends or wife or whatever, and simply start it and drive?
I don't talk to myself about things that have been "compiled" to the back of my brain (like chewing gum and walking). But when I'm describing a routine -- to myself, or to someone else, or to my computer -- you know, "Okay, first we have to do this, then we can do that..." -- I do talk to myself. In English.

That's why pseudo-code is so often recommended for both beginning programmers, and experienced programmers attempting to solve big problems. Because it's easier for most people to think in their native language.
 

Cerb

Elite Member
Aug 26, 2000
Examples of "natural language thoughts" that I find in my head when I'm programming are things like "bump the counter" and "refresh the screen" and "track the mouse" and "show the menu bar". I can use such expressions as code with our compiler; with any other language, further translation is required.
For my own part, I think of moving the mouse, or refreshing the screen. I imagine the complete action. Imagining seeing, hearing, doing, etc., does not require language. It is a direct matter.

Now, even here, you've come to a problem of language: how do you programmatically even describe bumping the counter? The first thing you'd need to do is define what the counter is. Then, define what action bumping is. Then, define what state begins a bump of it, and what state ends the bump of it. Then, what of the thing that is bumping it? There is still a great deal of context to be made before you can get there.

Meanwhile, if you have i = i + 1, i += 1, ++i, incr i, next i, etc., you don't have that kind of ambiguity. For one thing, a reader would know right off the bat that by counter, you meant the state of a loop, not something you might have in your garage or kitchen :). Thinking of a counter being incremented does not itself need to involve any known language, until it must be communicated.

I'm not quite sure what you're saying there -- you say you "have much more practice at the former [presumably 'converting your thoughts into English'] and that its [English's] vagueness allows for greater brevity", and yet you claim that it is more difficult to convert your thoughts into English than into a "decent programming syntax." I submit that you convert your thoughts into English, most of the time, with very little conscious effort at all. That you may have become multilingual enough to also "think" in alternative syntaxes is not surprising; any person who is fluent in more than one natural language can do the same. But they typically prefer one language over another. If you're content thinking in C++ or Ruby or Python or whatever, good for you. Those languages don't appeal to us.
I am not fluent in other natural languages. With sufficient practice, it is easier to type out a task in a programming language than to describe it in English prose. It is not uncommon to spend many minutes, or an hour or more, of wall time (not staring at it the whole time doing nothing else), composing a forum post or email, due to the difficulty of saying what I am intending to, as I am intending to state it. Programming is very algebraic. It is based on repeating patterns, relations, and dimensional relationships. Not having multiple ways to interpret a thing, nor potentially-difficult-to-pin-down context, makes it easier to think about.

I don't personally find it harder to read, but then I'm a good reader in general. I imagine there are people who are both mathematically inclined and not necessarily good readers who would naturally prefer a more mathematical syntax.
I can read English just fine. Being a good reader, however, does not make it any easier to convert prose into math. The sample program on the website, for instance, is not remotely like any prose I have ever read (Inform at least can pull that off). A programming language must be precise at what it describes. If you do that in English, it will not be plain English, but a restricted subset of English, with a limited subset of English grammar, which ends up just like COBOL, or SQL, or Visual BASIC. They have the advantage of not scaring off people that were made afraid of maths in school, to begin learning, but are otherwise annoying and tedious.

As I've mentioned elsewhere in this thread, we believe the ideal compiler will allow programmers to write programs like mathematicians write math books: snippets of specialized syntax (and even diagrams) embedded in a natural language narrative. See, for example, Einstein's original (translated) paper on relativity:
As long as there is a definition for calling code written in another language, that is done. In fact, it would be easy to argue that the ability to do just that is a strength of web apps: each part can be allowed to do what it does best, and programmer(s) involved are able to divide the work as best suits their needs.

Yes it does read like COBOL, because COBOL was the first successful attempt in this direction. Which is why COBOL is one of the world's most stable and widely used languages to this day.
Any evidence of that? Hard to believe that it's used more than C, C++, Java, C#, or Javascript. It was also so successful that hardly any programmers want to use it. COBOL, however, is not natural, merely verbose.

Inform is what came to mind when I saw the OP. For query-based systems, I think it may hold some promise. It is also far closer to prose than the example file on the Osmosian website, which is far from prose-like. However, Mr. Wolfram has yet to show any actual natural language (plenty of ego, though). All his examples use standard declarative idioms. Separating terms by spaces doesn't make it natural language any more than LISP.
 

Gerry Rzeppa

Member
Dec 13, 2013
www.osmosian.com
For my own part, I think of moving the mouse, or refreshing the screen. I imagine the complete action. Imagining seeing, hearing, doing, etc., does not require language. It is a direct matter.
Right. Until you want to communicate your thoughts to someone else (or the computer). Then, at least in my mind, the thoughts are converted into English so rapidly and so automatically that there's no chance that they'd be converted to anything else first. I can imagine, however, people who speak, say, Javascript more than they speak English, working with it day and night, and those folks might experience something different.

Now, even here, you've come to a problem of language: how do you programmatically even describe bumping the counter? The first thing you'd need to do is define what the counter is. Then, define what action bumping is. Then, define what state begins a bump of it, and what state ends the bump of it. Then, what of the thing that is bumping it? There is still a great deal of context to be made before you can get there.
In Plain English you simply say:

Bump a counter.

The compiler creates a local variable of the appropriate type when it parses the indefinite article ("a") preceding the name ("counter"). Elsewhere in the routine you refer to the variable, naturally, as "the counter". If you want another counter, you simply say:

Bump another counter.

Which causes the allocation of another local variable of the appropriate type, and which can be referenced as "the other counter". Ditto for "a third counter", etc.
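
Here's a small sketch in Python of that article-driven allocation, for anyone who wants it spelled out (illustrative only -- the real compiler presumably does this while parsing):

# Hypothetical sketch: "a counter" allocates a local variable, "the
# counter" refers back to it, and "another counter" allocates a second
# one, referable thereafter as "the other counter".

variables = {}

def bump(sentence):
    words = sentence.lower().rstrip(".").split()
    name = words[-1]
    if "another" in words or "other" in words:
        name = "other " + name
    if "a" in words or "another" in words:    # indefinite: allocate
        variables[name] = 0
    variables[name] += 1                      # "bump" adds one

bump("Bump a counter.")          # allocates the counter; now 1
bump("Bump the counter.")        # same variable; now 2
bump("Bump another counter.")    # allocates the other counter; now 1
print(variables)                 # {'counter': 2, 'other counter': 1}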

Meanwhile, if you have i = i + 1, i += 1, ++i, incr i, next i, etc., you don't have that kind of ambiguity. For one thing, a reader would know right off the bat that by counter, you meant the state of a loop, not something you might have in your garage or kitchen :). Thinking of a counter being incremented does not itself need to involve any known language, until it must be communicated.
Nice pun. We would define the latter as "a kitchen counter" to remove the ambiguity.

I am not fluent in other natural languages. With sufficient practice, it is easier to type out a task in a programming language than to describe it in English prose.
I think the key phrase there is "with sufficient practice."

It is not uncommon to spend many minutes, or an hour or more, of wall time (not staring at it the whole time doing nothing else), composing a forum post or email, due to the difficulty of saying what I am intending to, as I am intending to state it. Programming is very algebraic. It is based on repeating patterns, relations, and dimensional relationships. Not having multiple ways to interpret a thing, nor potentially-difficult-to-pin-down context, makes it easier to think about.
Perhaps natural language programming is simply not your cup of tea. Or perhaps you would like it once you've actually tried it for a while (allowing time for old habits to be overcome). Personally, I've programmed in both kinds of languages and prefer the natural by far for most tasks; and where I prefer some other kind of syntax, I put it on my list for one of our compiler's supported "sublanguages" -- thus, in the end, I'm hoping to have the best of both worlds at hand.

I can read English just fine. Being a good reader, however, does not make it any easier to convert prose into math. The sample program on the website, for instance, is not remotely like any prose I have ever read (Inform at least can pull that off). A programming language must be precise at what it describes. If you do that in English, it will not be plain English, but a restricted subset of English, with a limited subset of English grammar, which ends up just like COBOL, or SQL, or Visual BASIC. They have the advantage of not scaring off people that were made afraid of maths in school, to begin learning, but are otherwise annoying and tedious.
"Annoying and tedious", I agree, to some people, and in certain situations. Now, let's attempt a small meeting of the minds: If I agree that "x+2" is a more convenient expression than "add 2 to x", will you agree that "Clear the screen" is a better way of saying something like "graphics.ClrScr();"?

As long as there is a definition for calling code written in another language, that is done. In fact, it would be easy to argue that the ability to do just that is a strength of web apps: each part can be allowed to do what it does best, and programmer(s) involved are able to divide the work as best suits their needs.
You can call code in any language from our compiler, once you understand the calling conventions of both languages.

Any evidence of that? Hard to believe that it's used more than C, C++, Java, C#, or Javascript. It was also so successful that hardly any programmers want to use it. COBOL, however, is not natural, merely verbose.

Try here: http://blog.microfocus.com/news/2232/2232/ They claim that "200 times more transactions are processed daily by COBOL business applications than there are Google and YouTube searches made".

Inform is what came to mind when I saw the OP. For query-based systems, I think it may hold some promise. It is also far closer to prose than the example file on the Osmosian website, which is far from prose-like. However, Mr. Wolfram has yet to show any actual natural language (plenty of ego, though). All his examples use standard declarative idioms. Separating terms by spaces doesn't make it natural language any more than LISP.
Well, at least you can see that if I'm a crank, there are other cranks out there as well, each attacking the problem from different angles. Which suggests that it's an avenue worth exploring.

There are lots of people who have coded interactive fiction using Inform and other (more "primitive") languages, and by and large they prefer Inform.

We have written a non-trivial program in English-language sentences, and we personally think it a great improvement over the many other programming languages we've used. (The reason our code looks less like normal prose is that it's almost entirely imperative sentences. Normal prose consists of a variety of sentence types (declarative, exclamatory, and so on) that are simply not needed when commanding a machine to do things. You may as well object that a drill sergeant doesn't sound like Shakespeare.)

And even Wolfram -- a true "math head" if ever there was one -- has glimpsed the future; he's seen the limits of the mathematical approach, and is now taking a new tack.
 

Cerb

Elite Member
Aug 26, 2000
Now, not going through criticizing posted text, I think something like this would be excellent for teaching young children through adolescents. How many people do you know who were taught to have a mental block against math or stats, or a block against exploring any advanced piece of technology? Yet many of those same people can be given tools that have programming extensions, such as AutoCAD, for instance, and become novice programmers with nothing but the need for dynamic macros. Or they can learn to be quite adept at VBA, which is a horrendous language, IMO, to do what they need to get done in Excel or Access. Or they have Maple, Matlab, Mathematica, Sage, etc., as tools to use for R&D. But, oh, programming? No, can't do that stuff, it's too complex/hard/abstract/etc. (despite being able to write practical programs in an M-expression or S-expression language)!

If programming is taught in primary schools as an extension of math or science, people quite capable of grasping its main concepts will avoid it, or fail due to those parts of it; doubly so if they try to use C or a clear derivative of C, and even more so with a language using Simula-like OOP (pretty much anyone can grasp Smalltalk's or Ruby's OOP implementations, for instance, since they are primarily based on what the code is, does, or is for, rather than an abstract hierarchy). Dijkstra may have had bad things to say about BASIC, but that was because he predated Java! :)
 

MagnusTheBrewer

IN MEMORIAM
Jun 19, 2004
You seem to like slapping a new label on a programming language and calling it English, but it is not. In your example you showed two different approaches your son and you took to define a term/action. Computers do not "learn." If your example were followed, thousands, if not millions, of English users would have to create their own usage. The definitions might be similar, but computers do not make assumptions. Every time a new user tried to use a term, the computer would have to run through all the definitions it has listed to try to find a match. If there is no match, the user has to redefine it yet again. You imply that if your software takes enough baby steps, it will be able to parse English from any user. Hogwash. The computers and software we have now and in the foreseeable future are incapable of doing so. You can try to hide the logical fallacy behind calling it a "prototype," but that won't make it true.
 
Status
Not open for further replies.