Now read the behind-the-scenes interview with Stroustrup: http://developers.slashdot.org/comments.pl?sid=594771&cid=23934189
Lol, I thought it was funny when I read it back in 2008. It couldn't be further from the truth, though.
This reeks of "disgruntled C programmer who hates C++ and OOP".
Seriously, there is a reason Java, C#, PHP, and pretty much every language since C++ has been object oriented: it works. All this crap about "Durr, Oh, C++ uses 4x the memory" and "Oh, C NEVER had a memory leak" is just plain retarded.
Back when I was teaching C++ in the early nineties there were always two or three guys in the back who were disgruntled at the whole world for challenging their favorite language and paradigm. "I could do all these things in C!" was their favorite line, and my favorite response was "Yes, and if you had I might not be here, and you might not be there, but you didn't and so I am, and you are."
And for anything with a GUI, it has to be LabVIEW.
We aren't talking about hard and fast scientific rules here. We are talking about languages used to get the job done. There is a good reason that C++, Java, and C# constantly top the list of popular programming languages. Point out what is wrong with OOP if you think it is so bad.

I don't think it's acceptable to argue that something is good because it is widely used. If you gave me homework that argued that 2+2=5 was true because it was a popular answer, I would fail you.
I've never seen a logically sound argument for ANY programming paradigm's superiority. We aren't talking about hard and fast scientific facts here, we are talking about preferences.

OOP is widely used, and I have used OOP languages extensively, but I have yet to see a proper formal definition, nor have I seen a convincing, logically sound argument for its superiority.
It isn't a strawman. Go look up what a strawman argument actually is. The topic of discussion was a fake interview of Stroustrup talking about the inferiority of C++ and the superiority of C. Also, why would C have polymorphism? Are you just using random terms now? Seriously, a procedural language has no use for an entirely OOP concept.

There are lots of flaws with lots of languages, and picking on C programmers who hate C++ is a strawman, since C has many well-known problems, such as lack of polymorphism.
The x86 is FAR from perfect, and as you pointed out, is bloated beyond belief. That being said, like most architectures it is stable. Unlike most architectures, it has become commonplace. For a guy to write a billing program only in RPG and refuse to use a newer language, operating system, or architecture because it is "unstable" is retarded. Say what you will about x86 or C++, but they are NOT unstable in and of themselves.

Also, are you seriously defending the x86 architecture? Have you ever written an operating system? I'm glad that such powerful hardware is widely and easily available, but the amount of time you spend fiddling with legacy code is quite astounding. Real mode, segmentation, hardware task management structures, the 8259A PIC, ACPI vs MP spec, the 8042 keyboard controller, BIOS, the 8254 PIT... the list goes on. These are all pieces of the overall Intel system architecture which you have to contend with at some point, in some way, and none of them belong in a modern computer.
Also, are you seriously defending the x86 architecture? Have you ever written an operating system? I'm glad that such powerful hardware is widely and easily available, but the amount of time you spend fiddling with legacy code is quite astounding. Real mode, segmentation, hardware task management structures, the 8259A PIC, ACPI vs MP spec, the 8042 keyboard controller, BIOS, the 8254 PIT... the list goes on. These are all pieces of the overall Intel system architecture which you have to contend with at some point, in some way, and none of them belong in a modern computer.
We aren't talking about hard and fast scientific rules here. We are talking about languages used to get the job done. There is a good reason that C++, Java, and C# constantly top the list of popular programming languages. Point out what is wrong with OOP if you think it is so bad.

I've never seen a logically sound argument for ANY programming paradigm's superiority. We aren't talking about hard and fast scientific facts here, we are talking about preferences.
As for a definition of OOP: look it up. There are SEVERAL out there. Seriously, you must walk around with your eyes closed if you haven't seen a formal definition of OOP. Or are your feelings hurt because there is no standard definition for OOP? Because, honestly, there is no standard definition for ANY programming paradigm.
It isn't a strawman. Go look up what a strawman argument actually is.
Considering "Oh, C never had a memory leak" to be an argument is attacking a strawman.

All this crap about "Durr, Oh, C++ uses 4x the memory" and "Oh, C NEVER had a memory leak" is just plain retarded.
I'm afraid you've been misinformed about polymorphism. It's a more general concept than what OO programmers tend to refer to as "polymorphism". To make matters worse, what Java calls "Generics" is actually another form of polymorphism. Wikipedia seems to have an article discussing the matter, so I'll leave it to them: http://en.wikipedia.org/wiki/Type_polymorphism

Also, why would C have polymorphism? Are you just using random terms now? Seriously, a procedural language has no use for an entirely OOP concept.
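For what it's worth, the distinction can be sketched in a few lines of Python (the names `total_area` and `reverse_pairs` are made up purely for illustration): subtype-style polymorphism depends on what the values can do, while parametric polymorphism, which is what Java's "Generics" provide, works uniformly for any element type.

```python
# Subtype-style polymorphism: total_area accepts anything with an .area()
# method; each object supplies its own implementation.
class Circle:
    def area(self):
        return 3.0   # toy value, kept simple

class Square:
    def area(self):
        return 4.0   # toy value

def total_area(shapes):
    return sum(s.area() for s in shapes)

# Parametric polymorphism (what Java calls "Generics"): reverse_pairs works
# for ANY element types, because it never inspects the elements themselves.
def reverse_pairs(pairs):
    return [(b, a) for (a, b) in pairs]

print(total_area([Circle(), Square()]))     # 7.0
print(reverse_pairs([(1, "a"), (2, "b")]))  # [('a', 1), ('b', 2)]
```

C can express the parametric flavor too (think of `qsort` taking a `void *` comparator), which is part of why "polymorphism" is broader than the OOP usage.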
A fair number of old-time programmers, however, insist on trying to save face (and consequently lose it) by arguing how terrible the "new" stuff is, stuff which they can't effectively use because they really don't understand it. That was my ultimate point.
Of course, that's what I meant when I said "legacy."

Markbnj said:
Such systems aren't designed in their entirety by elite thinkers operating in abstract environments. They evolve in the real world.
I don't fault them for it. I don't expect people to be able to see the future. It's our fault for continuing to be burdened by the decisions of the past. Are there good sound economic reasons for doing this? Of course. Does that mean the x86 architecture is good? Hardly. Pragmatic choices are not necessarily ideal choices.

Back when these components were spec'd and designed, and the interfaces set down, nobody had your post-evolution knowledge of what a "modern computer" would be.
I'm more interested these days in verification and predictability of systems. The layers of legacy, microcode, and trade secrets that are folded into the design of the x86 architecture these days make these goals quite nearly impossible. You can argue reasonably that the x86 is simply not a suitable platform for this kind of work.
I don't think it's acceptable to argue that something is good because it is widely used. If you gave me homework that argued that 2+2=5 was true because it was a popular answer, I would fail you.
I believe that is really the only definition that is needed (though, I know several language developers like to throw in their own "True OOP is this" sorts of things.)

Maybe you are talking about preferences. My field is programming language theory, so I am interested in scientific and mathematical facts. Some languages are explicitly designed with goals in mind, such as ease of static analysis, or modularity. Some languages congeal together from various sources in a more "organic" fashion. Modern OOP languages tend to be of this sort. There's nothing innately wrong with either approach, but from a program analysis point of view, the latter are going to be much more difficult to reason about. The claim to fame of OOP languages is usually something along the lines of "they compartmentalize state" but this is more a feature of a module system, and not all OOP languages have that.
The only principle I've seen to be held in common amongst all the OOP languages is the notion of "object identity". Two objects can be generated in exactly the same way with exactly the same structure but have different identity. An object can be mutated in some way but retain its identity.
While this principle can be put to good use in modeling certain situations, it becomes somewhat awkward when dealing with derived relationships of multiple objects. An example might be the representation of the intersection of two sets.
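A minimal Python sketch of the identity principle described above: equality of structure versus identity, and identity surviving mutation.

```python
# Two objects built in exactly the same way: equal structure, distinct identity.
a = [1, 2, 3]
b = [1, 2, 3]
print(a == b)   # True  -- same structure
print(a is b)   # False -- different identity

# Mutation changes the structure, but the object keeps its identity.
before = id(a)
a.append(4)
print(id(a) == before)  # True
```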
Here's a link which has several articles discussing the pitfalls of OOP: http://okmij.org/ftp/Computation/Subtyping/
As there are for most paradigms. Where is it? I can point out Pierce's Types and Programming Languages chapter on OOP, but I don't think most OO programmers would like that language. I can point to the C++ or perhaps instead the Common Lisp standard and say "That's it", but then someone from the Smalltalk or maybe the Eiffel camp is going to disagree with me. The problem with OOP is that there are indeed so many definitions to choose from.
You sort of proved my point. You've not seen a standard definition because one really doesn't exist. You can cook one up, but that wouldn't really be a standard definition, because some person is going to say "Nuh uhh, procedural programming also involves feature X" where X is a feature of his favorite programming language.

It's quite a leap to say there is no standard definition for any programming paradigm. While I haven't seen a formal definition for procedural programming personally, it is quite simple to cook one up. Functional programming languages are all extensions of the lambda calculus, which has a formal semantics. The extensions are typically defined using typing judgements and operational semantics. You can even go so far as what was done for Standard ML and formally describe an entire programming language, though most do not go to that extent, and stick to the core language.
I really wasn't considering it to be an argument. I was commenting on a fake interview on Slashdot where essentially that exact proposition was put forward. I was not trying to argue against anyone, just pointing out that the author sounded like a disgruntled C programmer. Read the article posted; in it the author specifically says:

No, the strawman part referred to:
Considering "Oh, C never had a memory leak" to be an argument is attacking a strawman.
I don't see how I could have strawmanned, as that is the exact position that the author of this comment took.

Stroustrup: does it? Have you ever noticed the difference between a 'C' project plan, and a C++ project plan? The planning stage for a C++ project is three times as long. Precisely to make sure that everything which should be inherited is, and what shouldn't isn't. Then, they still get it wrong. Whoever heard of memory leaks in a 'C' program? Now finding them is a major industry. Most companies give up, and send the product out, knowing it leaks like a sieve, simply to avoid the expense of tracking them all down.
That I am. I guess I just always thought of it as strictly being "Objects inheriting attributes from other objects."

I'm afraid you've been misinformed about polymorphism. It's a more general concept than what OO programmers tend to refer to as "polymorphism". To make matters worse, what Java calls "Generics" is actually another form of polymorphism. Wikipedia seems to have an article discussing the matter, so I'll leave it to them: http://en.wikipedia.org/wiki/Type_polymorphism
Of course, that's what I meant when I said "legacy."
I don't fault them for it. I don't expect people to be able to see the future. It's our fault for continuing to be burdened by the decisions of the past. Are there good sound economic reasons for doing this? Of course. Does that mean the x86 architecture is good? Hardly. Pragmatic choices are not necessarily ideal choices.
I realized after I wrote that post that I focused way too much on the legacy devices which, as you rightfully point out, are typically emulated by the modern chipsets.
I'm more interested these days in verification and predictability of systems. The layers of legacy, microcode, and trade secrets that are folded into the design of the x86 architecture these days make these goals quite nearly impossible. You can argue reasonably that the x86 is simply not a suitable platform for this kind of work.
But of course, who wants to pass up the relatively cheap performance per dollar? Pragmatically, it's what I've got to use. But it isn't the end-all of computer architecture, and I hope that someday there is something better, or else we have really failed as a field.
Cogman said:
You sort of proved my point. You've not seen a standard definition because one really doesn't exist. You can cook one up, but that wouldn't really be a standard definition, because some person is going to say "Nuh uhh, procedural programming also involves feature X" where X is a feature of his favorite programming language.

dinkumthinkum said:
It's quite a leap to say there is no standard definition for any programming paradigm. While I haven't seen a formal definition for procedural programming personally, it is quite simple to cook one up. Functional programming languages are all extensions of the lambda calculus, which has a formal semantics. The extensions are typically defined using typing judgements and operational semantics. You can even go so far as what was done for Standard ML and formally describe an entire programming language, though most do not go to that extent, and stick to the core language.
I think I wrote this paragraph in a confusing way. There are two points, the first being about procedural programming---there isn't a standard definition, but nobody really cares about it.
The second point was about functional programming, which is defined to be languages based on the lambda calculus---which itself is just a system for creating and applying functions (and nothing else). FP is rigorously defined by the semantics of the lambda calculus, and by the semantics given for each of the extensions that have been created. These exist because people care about doing proofs and verification of these programs. So standard definitions can exist, it's just about putting in the effort to do it. Of course you can have variations which relax certain rules, but when I say "pure FP" people know what I mean. That's not the case for OOP.
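"Creating and applying functions (and nothing else)" can be made concrete with Church numerals, sketched here in Python: the number n is encoded as "apply f, n times", and arithmetic falls out of function composition alone.

```python
# Church numerals: numbers encoded purely as functions.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(church):
    # Escape hatch for display only: count the applications of f.
    return church(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Everything above the `to_int` helper is straight lambda calculus, which is exactly what gives FP extensions a formal semantics to build on.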
I usually hear it the other way around: that certain algorithms are harder to express in a functional programming language, but that large systems built like this are easier to understand because you don't have to worry about side effects and state.
dinkumthinkum said:
I think I wrote this paragraph in a confusing way. There are two points, the first being about procedural programming---there isn't a standard definition, but nobody really cares about it.
The second point was about functional programming, which is defined to be languages based on the lambda calculus---which itself is just a system for creating and applying functions (and nothing else). FP is rigorously defined by the semantics of the lambda calculus, and by the semantics given for each of the extensions that have been created. These exist because people care about doing proofs and verification of these programs. So standard definitions can exist, it's just about putting in the effort to do it. Of course you can have variations which relax certain rules, but when I say "pure FP" people know what I mean. That's not the case for OOP.
Well then, technically I think functional programming is about the only paradigm that has a rigorous definition. OOP is very loose in its definition, but that doesn't necessarily make it a bad paradigm IMO. Even then, the definition of a language doesn't really in itself lead to a good logical argument for the superiority of the language. (It does help, though; since so few paradigms are truly formally defined, it makes it hard to argue one way or the other about the goodness of a paradigm.)
I take more of the empirical approach. Since most programmers use OOP or an OOP-like paradigm when programming, it supports the claim that it is one of the better paradigms. (Ignoring the whole correlation != causation thing 😛)
That being said, I really don't believe OOP to be the end-all, be-all paradigm; I just view it as "the best we've got so far for general purpose work". That isn't to say there aren't tasks that FP or procedural programming are ideal for, just that most complex problems are simplified by an OOP-like paradigm. The argument that "OOP is terrible and costs so much etc." is just as invalid as the argument that "OOP is superior to all other paradigms."
BTW, I really need to learn more FP 😀