
Interview with creator of C++

Thank you for the read. The interview was 'odd' (didn't really focus on what I would have liked) but the guy seemed a bit reserved and modest. A true engineer. Not as much ego in him as I thought from the tone.
 
I met him at a conference years ago. Just a typical, quiet, very studious sort of guy. Prototypical geek engineer.
 
Now read the behind-the-scenes interview with Stroustrup: http://developers.slashdot.org/comments.pl?sid=594771&cid=23934189

lol, I thought it was funny when I read it back in 2008. It couldn't be farther from the truth though.

This reeks of "Disgruntled C programmer who hates C++ and OOP".

Seriously, there is a reason Java, C#, PHP, and pretty much every language since C++ has been object oriented: it works. All this crap about "Durr, C++ uses 4x the memory" and "Oh, C NEVER had a memory leak" is just plain nonsense.
 
This reeks of "Disgruntled C programmer who hates C++ and OOP".

Seriously, there is a reason Java, C#, PHP, and pretty much every language since C++ has been object oriented: it works. All this crap about "Durr, C++ uses 4x the memory" and "Oh, C NEVER had a memory leak" is just plain nonsense.

Back when I was teaching C++ in the early nineties there were always two or three guys in the back who were disgruntled at the whole world for challenging their favorite language and paradigm. "I could do all these things in C!" was their favorite line, and my favorite response was "Yes, and if you had I might not be here, and you might not be there, but you didn't and so I am, and you are."
 
Back when I was teaching C++ in the early nineties there were always two or three guys in the back who were disgruntled at the whole world for challenging their favorite language and paradigm. "I could do all these things in C!" was their favorite line, and my favorite response was "Yes, and if you had I might not be here, and you might not be there, but you didn't and so I am, and you are."

🙂 They still exist. They, along with functional programmers, can be some of the most annoying programmers to talk to (not saying functional or procedural programming is bad; they have their place just like OOP). It's just sort of funny to see them twist and squeal about now de facto standards. Kind of like an old programmer I heard of who tried to argue that "the x86 architecture is extremely buggy, and so is C++. RPG is the only way to go."

Though I guess I understand them. You like to program in what you are used to. I personally like C++ for most things and resisted learning C# for a while. But it sort of grows on you.
 
Back when I was teaching C++ in the early nineties there were always two or three guys in the back who were disgruntled at the whole world for challenging their favorite language and paradigm. "I could do all these things in C!" was their favorite line, and my favorite response was "Yes, and if you had I might not be here, and you might not be there, but you didn't and so I am, and you are."

Heh, I see a lot of this at work from the old timers. If it's not C or Ada, it's that newfangled bullshit that all the kids are using. 😀

And anything with a GUI has to be LabVIEW.
 
I don't think it's acceptable to argue that something is good because it is widely used. If you gave me a homework assignment that argued that 2+2=5 was true because it was a popular answer, I would fail you.

OOP is widely used, and I have used OOP languages extensively, but I have yet to see a proper formal definition nor have I seen a convincing, logically sound, argument for its superiority.

There are lots of flaws with lots of languages, and picking on C programmers who hate C++ is a strawman, since C has many well-known problems, such as lack of polymorphism.

Also, are you seriously defending the x86 architecture? Have you ever written an operating system? I'm glad that such powerful hardware is widely and easily available, but the amount of time you spend fiddling with legacy code is quite astounding. Real mode, segmentation, hardware task management structures, the 8259A PIC, ACPI vs MP spec, the 8042 keyboard controller, BIOS, the 8254 PIT... the list goes on. These are all pieces of the overall Intel system architecture which you have to contend with at some point, in some way, and none of them belong in a modern computer.
 
I sort of thought the fake C++ interview was funny.

I read a comment on slashdot that said there were other object oriented languages that came out around the same time as C++ that were arguably better but they either had undesirable licensing restrictions or expensive development tools so people learned C++ instead. That was before my time, but I could certainly see that happening if true.
 
I don't think it's acceptable to argue that something is good because it is widely used. If you gave me a homework assignment that argued that 2+2=5 was true because it was a popular answer, I would fail you.
We aren't talking about hard and fast scientific rules here. We are talking about languages used to get the job done. There is a good reason that C++, Java, and C# constantly top the list of popular programming languages. Point out what is wrong with OOP if you think it is so bad.

OOP is widely used, and I have used OOP languages extensively, but I have yet to see a proper formal definition nor have I seen a convincing, logically sound, argument for its superiority.
I've never seen a logically sound argument for ANY programming paradigm's superiority. We aren't talking about hard and fast scientific facts here, we are talking about preferences.

As for a definition of OOP: look it up. There are SEVERAL out there. Seriously, you must walk around with your eyes closed if you haven't seen a formal definition of OOP. Or are your feelings hurt because there is no standard definition for OOP? Because, honestly, there is no standard definition for ANY programming paradigm.

There are lots of flaws with lots of languages, and picking on C programmers who hate C++ is a strawman, since C has many well-known problems, such as lack of polymorphism.
It isn't a strawman. Go look up what a strawman argument actually is. The topic of discussion was a fake interview of Stroustrup talking about the inferiority of C++ and the superiority of C. Also, why would C have polymorphism? Are you just using random terms now? Seriously, a procedural language has no use for an entirely OOP concept.

Also, are you seriously defending the x86 architecture? Have you ever written an operating system? I'm glad that such powerful hardware is widely and easily available, but the amount of time you spend fiddling with legacy code is quite astounding. Real mode, segmentation, hardware task management structures, the 8259A PIC, ACPI vs MP spec, the 8042 keyboard controller, BIOS, the 8254 PIT... the list goes on. These are all pieces of the overall Intel system architecture which you have to contend with at some point, in some way, and none of them belong in a modern computer.
The x86 is FAR from perfect and, as you pointed out, is bloated beyond belief. That being said, like most architectures it is stable. Unlike most architectures, it has become commonplace. For a guy to write a billing program only in RPG and refuse to use a newer language, operating system, or architecture because it is "unstable" is absurd. Say what you will about x86 or C++, but they are NOT unstable in and of themselves.

I've never written an OS. Why would I have? It isn't exactly an everyday task that needs to be completed. I would NEVER recommend the x86 architecture for an embedded system (MAYBE an 8088 processor, but even then, a MIPS, ARM, or PIC controller would likely be 100x better). The point is, the old timer was arguing against industry standards because he had no experience with them. So instead of just saying to a customer "I don't know how to program in C++/C#/Pascal/etc." he took the route which most old programmers commonly take and said "C++/C#/Pascal/etc. is stupid and unstable."

Hell, he could have even said "It would be too hard to port all the code to a new language and architecture" and that would save face. A fair number of old time programmers, however, insist on trying to save face (and consequently lose it) by arguing how terrible the "new" stuff is, which they can't do effectively because they really don't understand it. That was my ultimate point.
 
Also, are you seriously defending the x86 architecture? Have you ever written an operating system? I'm glad that such powerful hardware is widely and easily available, but the amount of time you spend fiddling with legacy code is quite astounding. Real mode, segmentation, hardware task management structures, the 8259A PIC, ACPI vs MP spec, the 8042 keyboard controller, BIOS, the 8254 PIT... the list goes on. These are all pieces of the overall Intel system architecture which you have to contend with at some point, in some way, and none of them belong in a modern computer.

It's amazing to me, after twenty years, how many people continue to get this just completely wrong. Such systems aren't designed in their entirety by elite thinkers operating in abstract environments. They evolve in the real world. Is QWERTY the absolute best layout for a keyboard? Is there really no better way to control a car than with 2-3 pedals, a wheel, and a transmission lever? Do elevator buttons have to be arrayed in a grid next to the door?

The PC architecture evolved over 30+ years of use by hundreds of millions of people. Over those three decades it has maintained truly impressive backward compatibility (I still have assembler demos that run against the VGA hardware, written in the late '80s, and as recently as a couple of years ago they still ran on XP), and it has proven able to evolve to meet every new challenge that has come along. To say that the components (many of which, like the 8042 and 8253, I am intimately familiar with) should not be in a modern computer is really sort of a cheap shot. Back when these components were spec'd and designed, and the interfaces set down, nobody had your post-evolution knowledge of what a "modern computer" would be.

And of course, none of these components actually are in a modern computer anymore, at least in discrete form. They've become integrated functions in larger multi-function chipsets. So all you're really complaining about is that the components continue to adhere to the interfaces that were designed in the late 70's. Why is that a bad thing? It obviously works well enough.
 
We aren't talking about hard and fast scientific rules here. We are talking about languages used to get the job done. There is a good reason that C++, Java, and C# constantly top the list of popular programming languages. Point out what is wrong with OOP if you think it is so bad. I've never seen a logically sound argument for ANY programming paradigm's superiority. We aren't talking about hard and fast scientific facts here, we are talking about preferences.

Maybe you are talking about preferences. My field is programming language theory, so I am interested in scientific and mathematical facts. Some languages are explicitly designed with goals in mind, such as ease of static analysis, or modularity. Some languages congeal together from various sources in a more "organic" fashion. Modern OOP languages tend to be of this sort. There's nothing innately wrong with either approach, but from a program analysis point of view, the latter are going to be much more difficult to reason about. The claim to fame of OOP languages is usually something along the lines of "they compartmentalize state" but this is more a feature of a module system, and not all OOP languages have that.

The only principle I've seen to be held in common amongst all the OOP languages is the notion of "object identity". Two objects can be generated in exactly the same way with exactly the same structure but have different identity. An object can be mutated in some way but retain its identity.

While this principle can be put to good use in modeling certain situations, it becomes somewhat awkward when dealing with derived relationships of multiple objects. An example might be the representation of the intersection of two sets.

Here's a link which has several articles discussing the pitfalls of OOP: http://okmij.org/ftp/Computation/Subtyping/

As for a definition of OOP: look it up. There are SEVERAL out there. Seriously, you must walk around with your eyes closed if you haven't seen a formal definition of OOP. Or are your feelings hurt because there is no standard definition for OOP? Because, honestly, there is no standard definition for ANY programming paradigm.

Where is it? I can point out Pierce's Types and Programming Languages chapter on OOP, but I don't think most OO programmers would like that language. I can point to the C++ or perhaps instead the Common Lisp standard and say "That's it" but then someone from the Smalltalk or maybe the Eiffel camp is going to disagree with me. The problem with OOP is that there are indeed so many definitions to choose from.

It's quite a leap to say there is no standard definition for any programming paradigm. While I haven't seen a formal definition for procedural programming personally, it is quite simple to cook one up. Functional programming languages are all extensions of the lambda-calculus which has a formal semantics. The extensions are typically defined using typing judgements and operational semantics. You can even go so far as what was done for Standard ML and formally describe an entire programming language, though most do not go to that extent, and stick to the core language.
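For a concrete reference point (standard textbook material, not something from the post itself), the core of such a formal semantics is the beta-reduction rule of the lambda calculus, and extensions are specified with typing judgements like the one for function application:

```latex
% Beta reduction: applying an abstraction substitutes the argument for
% the bound variable in the body.
(\lambda x.\, e)\; v \;\longrightarrow\; e[x := v]

% Typing judgement for application: if e1 is a function from tau1 to
% tau2 and e2 has type tau1, the application has type tau2.
\frac{\Gamma \vdash e_1 : \tau_1 \to \tau_2
      \qquad \Gamma \vdash e_2 : \tau_1}
     {\Gamma \vdash e_1\, e_2 : \tau_2}
```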

It isn't a strawman. Go look up what a strawman argument actually is.

No, the strawman part referred to:
All this crap about "Durr, C++ uses 4x the memory" and "Oh, C NEVER had a memory leak" is just plain nonsense.
Considering "Oh, C never had a memory leak" to be an argument is attacking a strawman.

Also, why would C have polymorphism? Are you just using random terms now? Seriously, a procedural language has no use for an entirely OOP concept.
I'm afraid you've been misinformed about polymorphism. It's a more general concept than what OO programmers tend to refer to as "polymorphism". To make matters worse, what Java calls "Generics" is actually another form of polymorphism. Wikipedia seems to have an article discussing the matter so I'll leave it to them http://en.wikipedia.org/wiki/Type_polymorphism.

C++ templates are a form of parametric polymorphism but with the ability to throw in ad-hoc bits (though doing this would break any Abstraction Theorem, if C++ had such a thing). C with templates would be polymorphic in this fashion.

A fair number of old time programmers, however, insist on trying to save face (and consequently lose it) by arguing how terrible the "new" stuff is, which they can't do effectively because they really don't understand it. That was my ultimate point.

Fair enough.

Markbnj said:
Such systems aren't designed in their entirety by elite thinkers operating in abstract environments. They evolve in the real world.
Of course, that's what I meant when I said "legacy."
Back when these components were spec'd and designed, and the interfaces set down, nobody had your post-evolution knowledge of what a "modern computer" would be.
I don't fault them for it. I don't expect people to be able to see the future. It's our fault for continuing to be burdened by the decisions of the past. Are there good sound economic reasons for doing this? Of course. Does that mean the x86 architecture is good? Hardly. Pragmatic choices are not necessarily ideal choices.

I realized after I wrote that post that I focused way too much on the legacy devices which, as you rightfully point out, are typically emulated by the modern chipsets.

I'm more interested these days in verification and predictability of systems. The layers of legacy, microcode, and trade secrets folded into the design of the x86 architecture these days make these goals nearly impossible. You can argue reasonably that the x86 is simply not a suitable platform for this kind of work.

But of course, who wants to pass up on the relatively cheap performance per dollar ratio? Pragmatically, it's what I've got to use. But it isn't the end-all of computer architecture, and I hope that someday there is better, or else we have really failed as a field.
 
I'm more interested these days in verification and predictability of systems. The layers of legacy, microcode, and trade secrets folded into the design of the x86 architecture these days make these goals nearly impossible. You can argue reasonably that the x86 is simply not a suitable platform for this kind of work.

Fair enough, that's not a domain I have much experience in. But the x86 platform is what it is: a mass-market general purpose computing system. As you note, the performance per dollar is compelling, and as a general purpose machine it is suitable for a wide variety of tasks. I guess when I look back at the last thirty years I see the PC as an amazing success story, and I often feel that criticism of the platform, and of MS, comes from ivory tower types who really don't comprehend the scale of the platform's adoption, and the complexity of managing evolution across three decades.
 
I don't think it's acceptable to argue that something is good because it is widely used. If you gave me a homework assignment that argued that 2+2=5 was true because it was a popular answer, I would fail you.

Maybe you should first ask yourself why so many people think 2+2=5.

I think everyone agrees that x86 isn't ideal; however, the advantages of backwards compatibility far outweigh any advantages of trying to transition to a more modern ISA. Perhaps someday in the future, when most code is contained within some virtual runtime environment (or moved to the cloud), we can begin to argue about practically transitioning to a different ISA, but that's not anywhere in the near future.

You seem to argue from the academic perspective, which involves striving for the ideal, practicality be damned, while Mark and Cog seem to be arguing for the economical and the "good enough". Neither side is more correct than the other; we'll always need the one side to push improvements on the industry, while the other needs to ensure that any changes are made in a practical and sustainable way.
 
Maybe you are talking about preferences. My field is programming language theory, so I am interested in scientific and mathematical facts. Some languages are explicitly designed with goals in mind, such as ease of static analysis, or modularity. Some languages congeal together from various sources in a more "organic" fashion. Modern OOP languages tend to be of this sort. There's nothing innately wrong with either approach, but from a program analysis point of view, the latter are going to be much more difficult to reason about. The claim to fame of OOP languages is usually something along the lines of "they compartmentalize state" but this is more a feature of a module system, and not all OOP languages have that.

The only principle I've seen to be held in common amongst all the OOP languages is the notion of "object identity". Two objects can be generated in exactly the same way with exactly the same structure but have different identity. An object can be mutated in some way but retain its identity.

While this principle can be put to good use in modeling certain situations, it becomes somewhat awkward when dealing with derived relationships of multiple objects. An example might be the representation of the intersection of two sets.

Here's a link which has several articles discussing the pitfalls of OOP: http://okmij.org/ftp/Computation/Subtyping/
I believe that is really the only definition that is needed (though I know several language developers like to throw in their own "true OOP is this" sorts of things).

OOP is programming that involves the use of "objects" which have state and behavior. Everything else is fluffy goodness.

The author of the Wikipedia definition took a similar approach to defining OOP.


Where is it? I can point out Pierce's Types and Programming Languages chapter on OOP, but I don't think most OO programmers would like that language. I can point to the C++ or perhaps instead the Common Lisp standard and say "That's it" but then someone from the Smalltalk or maybe the Eiffel camp is going to disagree with me. The problem with OOP is that there are indeed so many definitions to choose from.
As there are for most paradigms.

It's quite a leap to say there is no standard definition for any programming paradigm. While I haven't seen a formal definition for procedural programming personally, it is quite simple to cook one up. Functional programming languages are all extensions of the lambda-calculus which has a formal semantics. The extensions are typically defined using typing judgements and operational semantics. You can even go so far as what was done for Standard ML and formally describe an entire programming language, though most do not go to that extent, and stick to the core language.
You sort of proved my point. You've not seen a standard definition because one really doesn't exist. You can cook one up, but that wouldn't really be a standard definition, because some person is going to say "Nuh uh, procedural programming also involves feature X," where X is a feature of his favorite programming language.


No the strawman part referred to

Considering "Oh, C never had a memory leak" to be an argument is attacking a strawman.
I really wasn't considering it to be an argument. I was commenting on a fake interview on Slashdot where essentially that exact proposition was put forward. I was not trying to argue against anyone, just pointing out that the author sounded like a disgruntled C programmer. Read the article posted; in it the author specifically says:
Stroustrup: does it? Have you ever noticed the difference between a 'C' project plan, and a C++ project plan? The planning stage for a C++ project is three times as long. Precisely to make sure that everything which should be inherited is, and what shouldn't isn't. Then, they still get it wrong. Whoever heard of memory leaks in a 'C' program? Now finding them is a major industry. Most companies give up, and send the product out, knowing it leaks like a sieve, simply to avoid the expense of tracking them all down.
I don't see how I could have strawmanned as that is the exact position that the author of this comment took.

I'm afraid you've been misinformed about polymorphism. It's a more general concept than what OO programmers tend to refer to as "polymorphism". To make matters worse, what Java calls "Generics" is actually another form of polymorphism. Wikipedia seems to have an article discussing the matter so I'll leave it to them http://en.wikipedia.org/wiki/Type_polymorphism.
That I am. I guess I just always thought of it as strictly being "Objects inheriting attributes from other objects."


Of course, that's what I meant when I said "legacy."

I don't fault them for it. I don't expect people to be able to see the future. It's our fault for continuing to be burdened by the decisions of the past. Are there good sound economic reasons for doing this? Of course. Does that mean the x86 architecture is good? Hardly. Pragmatic choices are not necessarily ideal choices.

I realized after I wrote that post that I focused way too much on the legacy devices which, as you rightfully point out, are typically emulated by the modern chipsets.

I'm more interested these days in verification and predictability of systems. The layers of legacy, microcode, and trade secrets folded into the design of the x86 architecture these days make these goals nearly impossible. You can argue reasonably that the x86 is simply not a suitable platform for this kind of work.

But of course, who wants to pass up on the relatively cheap performance per dollar ratio? Pragmatically, it's what I've got to use. But it isn't the end-all of computer architecture, and I hope that someday there is better, or else we have really failed as a field.

Yep, don't really disagree at all there. Like I said, the x86 architecture is not ideal, but it is cheap and stable.
 
Cogman said:
dinkumthinkum said:
It's quite a leap to say there is no standard definition for any programming paradigm. While I haven't seen a formal definition for procedural programming personally, it is quite simple to cook one up. Functional programming languages are all extensions of the lambda-calculus which has a formal semantics. The extensions are typically defined using typing judgements and operational semantics. You can even go so far as what was done for Standard ML and formally describe an entire programming language, though most do not go to that extent, and stick to the core language.
You sort of proved my point. You've not seen a standard definition because one really doesn't exist. You can cook one up, but that wouldn't really be a standard definition, because some person is going to say "Nuh uh, procedural programming also involves feature X," where X is a feature of his favorite programming language.

I think I wrote this paragraph in a confusing way. There are two points, the first being about procedural programming---there isn't a standard definition, but nobody really cares about it.

The second point was about functional programming, which is defined to be languages based on the lambda calculus---which itself is just a system for creating and applying functions (and nothing else). FP is rigorously defined by the semantics of the lambda calculus, and by the semantics given for each of the extensions that have been created. These exist because people care about doing proofs and verification of these programs. So standard definitions can exist, it's just about putting in the effort to do it. Of course you can have variations which relax certain rules, but when I say "pure FP" people know what I mean. That's not the case for OOP.
 
I think I wrote this paragraph in a confusing way. There are two points, the first being about procedural programming---there isn't a standard definition, but nobody really cares about it.

The second point was about functional programming, which is defined to be languages based on the lambda calculus---which itself is just a system for creating and applying functions (and nothing else). FP is rigorously defined by the semantics of the lambda calculus, and by the semantics given for each of the extensions that have been created. These exist because people care about doing proofs and verification of these programs. So standard definitions can exist, it's just about putting in the effort to do it. Of course you can have variations which relax certain rules, but when I say "pure FP" people know what I mean. That's not the case for OOP.

I think that's because OOP was created mostly to make software easier to write, but it's still an imperative computing model. When you move into the functional realm, the idea is a new way of execution, reducing or eliminating side effects. Certain algorithms may be simpler to express in a functional form, but most large programs are still more easily built with an imperative language. So in this sense I think a formal definition for FP is a self-fulfilling prophecy: FP and the lambda calculus are rooted in mathematical theory, so formal definitions come easily, compared to some imperative language for which a formal definition is neither easy nor necessary to provide.
 
I usually hear it the other way around: that certain algorithms are harder to express in a functional programming language, but that large systems built like this are easier to understand because you don't have to worry about side effects and state.
 
I usually hear it the other way around: that certain algorithms are harder to express in a functional programming language, but that large systems built like this are easier to understand because you don't have to worry about side effects and state.

Perhaps... I'm in the midst of learning it myself for my compilers class, so I'm no expert by any means. I do find that FP may take a bit longer to get working (to typecheck and compile), but usually once it does there aren't nearly as many coding bugs... the bugs are usually just flaws in the high-level algorithm being implemented.
 
I think I wrote this paragraph in a confusing way. There are two points, the first being about procedural programming---there isn't a standard definition, but nobody really cares about it.

The second point was about functional programming, which is defined to be languages based on the lambda calculus---which itself is just a system for creating and applying functions (and nothing else). FP is rigorously defined by the semantics of the lambda calculus, and by the semantics given for each of the extensions that have been created. These exist because people care about doing proofs and verification of these programs. So standard definitions can exist, it's just about putting in the effort to do it. Of course you can have variations which relax certain rules, but when I say "pure FP" people know what I mean. That's not the case for OOP.

Well then, technically I think functional programming is about the only paradigm that has a rigorous definition. OOP is very loose in its definition, but that doesn't necessarily make it a bad paradigm IMO. Even then, the definition of a language doesn't in itself lead to a good logical argument for the superiority of the language. (It does help, though; since so few paradigms are truly formally defined, it is hard to argue one way or the other about the goodness of a paradigm.)

I take more of the empirical approach. Since most programmers use OOP or an OOP-like paradigm when programming, that supports the claim that it is one of the better paradigms. (Ignoring the whole correlation != causation thing 😛)

That being said, I really don't believe OOP to be the end-all-be-all paradigm; I just view it as "the best we've got so far for general purpose work". That isn't to say there aren't tasks that FP or procedural programming are ideal for, just that most complex problems are simplified by an OOP-like paradigm. The argument that "OOP is terrible and costs so much, etc." is just as invalid as the argument that "OOP is superior to all other paradigms."


BTW, I really need to learn more FP 😀
 
Well then, technically I think functional programming is about the only paradigm that has a rigorous definition. OOP is very loose in its definition, but that doesn't necessarily make it a bad paradigm IMO. Even then, the definition of a language doesn't in itself lead to a good logical argument for the superiority of the language. (It does help, though; since so few paradigms are truly formally defined, it is hard to argue one way or the other about the goodness of a paradigm.)

I take more of the empirical approach. Since most programmers use OOP or an OOP-like paradigm when programming, that supports the claim that it is one of the better paradigms. (Ignoring the whole correlation != causation thing 😛)

That being said, I really don't believe OOP to be the end-all-be-all paradigm; I just view it as "the best we've got so far for general purpose work". That isn't to say there aren't tasks that FP or procedural programming are ideal for, just that most complex problems are simplified by an OOP-like paradigm. The argument that "OOP is terrible and costs so much, etc." is just as invalid as the argument that "OOP is superior to all other paradigms."


BTW, I really need to learn more FP 😀

I like my programming like my women: loose and type-unsafe. D:
 
Something about the phrase 'debugging your pubes' makes me a little sick to my stomach.
 