
Bjarne Stroustrup and Herb Sutter on the Future of C++: Part 2: Video Podcast Transcript

In this transcription of an OnSoftware session, Bjarne Stroustrup, Herb Sutter, and Ted Neward discuss templates in C++0x, how they relate to functional programming, and Stroustrup's chagrin over 20 years of 'very old news.'

Welcome to OnSoftware—conversations with thought leaders in software development. In this session, Bjarne Stroustrup, Herb Sutter, and Ted Neward continue discussing forthcoming features and core language changes in C++0x.

Ted Neward: So you mentioned concepts, you mentioned Lambdas—these are full closures, capturing variables off the stack as well, and so forth.

Bjarne Stroustrup: Yes. Yes.

Herb Sutter: Functions or function objects on the fly in the line, without having to go off somewhere else and define a class.

Ted: Okay.

Herb: Every popular language is doing it these days. It’s an indoor sport.

Ted: That tells you—you know, if Java is talking about closures, it must be past due for all the languages that are really hip. And that raises another question....

Bjarne: Actually, before we get into the really hip—one of the major things about the next standard is that it doesn’t break the old code.

Herb: Mm-hm.

Ted: Which is good.

Bjarne: I mean, we need to be stable. So we get all the new stuff, it is cool, but we’ve maintained the performance, we maintain the hardware access, and we don’t break your code if we can at all help it.

Ted: In C++—I’m trying to remember the foreword from The Design and Evolution of C++—C++ sort of hit the streets, so to speak, in, what, the mid-’80s?

Bjarne: First commercial release was in the fall of ’85.

Ted: Yeah. And so we’re looking at, what is that—23 years now?

Bjarne: Something like that.

Ted: Wow, your jubilee is coming up.

Bjarne: It’s a long time—and, you know, some of the early code still runs.

Ted: Yeah. Yeah. Let’s see where other not-to-be-named languages are in 25 years. [Laughs.] So, one of the things that certainly emerged, particularly in the last year, it seems, has been a rise of interest in the functional language space. And somebody—I wish I could remember who said this so I could attribute it properly—somebody suggested that C++ templates were functional in nature. That really threw me for a loop, and I wasn’t quite sure what to make of that, so I figured [gestures to Stroustrup and Sutter].

Herb: You said that before we started, and that surprised me, too. Bjarne said he’d heard that and agrees, so I’m curious, too.

Bjarne: I do agree. The inspiration for a lot of the STL is functional programming, and if you look at it—you go through data structures, you apply operations to them, you combine algorithms, you work generically over different data types. All of that is functional in inspiration. Most of the type work is done at compile time in C++, but what runs at runtime is a lot like that. Function objects can emulate certain forms of higher-order functions, and some of the early work with the STL was done in Scheme and Ada. Now, one thing that is different, and that’s the one thing you latched onto [gesturing to Sutter], is that functional programming usually goes with lack of state, or—

Herb: Mutable state.

Bjarne: —mutable state and such, and that’s not what we’re dealing with. We’re not trying to turn C++ into a functional language; we’re trying to see what parts of functional programming techniques can fit in with object-oriented programming, and things like that.

Herb: Now I grok what you were saying about the "functional." Let me try to restate the same thing, maybe from a different perspective. There are really two different things going on. Usually, when you say, "A language is like this," you’re talking about the kind of code you write every day, the classes and functions and everything. C++ templates are nothing like functional in that sense; however, to the extent that templates get specialized and use other templates, there’s this whole system that gets evaluated at compile time, which is very functional in nature. Templates and template metaprogramming—the people who’ve gone down that road are doing something that is much closer to LISP than it is to C, for example.

Ted: Well, certainly a number of the functional languages support this notion of type inferencing, which some of the template expansion—is that the right term?

Bjarne: Template instantiation in C++ is Turing-complete, and in some ways equivalent to ML type inference—different in what you can do conveniently, but with the same expressive power. Very different, but with the same kind of deductive mechanisms. And some of the new features—variadic templates, the ability to create tuples, manipulate tuples—will be more familiar to functional programmers than to C programmers, say. So we’re trying, sort of cautiously, to see where the things can work together, where we have things that actually work, perform, fit into the type system. We’re not going whole-hog functional, and we’re not trying to graft something alien in that doesn’t work. A language is more than the sum of its individual parts; it’s in the combinations that you get the strength.

Ted: See, now, this has me all curious. Could you do—and maybe you’ve already thought about this—could you do currying in C++ using templates?

Bjarne: Yes, we can; it’s one of the new features. It wasn’t in C++98. You can do currying of type arguments, yes.

Herb: But I really want to come back to where this is.

Bjarne: Yeah, people might not recognize the syntax and such; they’ll say, "Oh, but we can do all of these things. We can do everything." In C++, you can do some of those things—what can be done type-safely, statically.

Herb: But I really want to make sure that people who are listening to this understand that this is not the usual question about, "Will languages for concurrency and multicore, for example, become more functional?" This is a very different style of question, because you’re talking about the C++ template and type system, which is all at compile time. There are direct parallels there.

But when people talk about what a functional language is, and multicore and concurrency, they’re talking mostly about the idea of, "When I write my code (among other things), everything is an expression, so I can evaluate subexpressions in parallel, so I get automatic concurrency just in the nature of the way I’ve expressed the program." That’s the one thing they would like. The drawback is that, today, it’s too fine-grained. Because of the overhead of doing something concurrently—shipping it off to be done somewhere else—you want it to be of a certain size: "You must be this big to be worth the overhead of shipping somewhere." We’re driving that overhead down. But today, that’s too fine-grained for most applications to exploit.

But the other area that functional languages (the pure ones) are great at is immutable state. You modify one element of a vector, and you get a whole new vector with that one element changed, which is great. And with immutable data, you need no locking; it’s wonderful—except that there are pure functional languages like that, and then there are functional languages that people actually use for commercial code. Because you just aren’t going to copy every—

Ted: [Laughs.] You just earned some hate mail from somebody with that line.

Herb: I know! But the reality is that everybody has to make exceptions, because, for efficiency, you’re just not going to copy a million-element vector every time you change an element, so there’s always a compromise. Immutability is great, but there needs to be that balance. The one thing you see coming from functional languages into all the other languages is lambdas and closures. C# has anonymous delegates; Java is working on it; and, as you know, C++0x has them.

Bjarne: I’d just like to add to this: The functional programming community has owned a huge chunk of the educational establishment for 30 years. They have explained to generations of programmers why it was wonderful, and it has never gotten traction in industry. So, from my perspective, there’s lots I like, but I’m not going to bet everything on being able to go functional, because lots of people (probably smarter than me) have done that and failed.

Ted: Right.

Bjarne: We’re trying to adopt what can fit with other things. This is why I sort of talk about multi-paradigm programming, trying to figure out how to combine traditional C styles with object-oriented programming and generic programming; finding a sort of (if you’ll excuse me) "sweet spot" in the space, where you can do a lot of things—where you can do it with greater safety, easier writing, and still good performance. It’s tricky.

Herb: One thing about multi-paradigm development, C++ has always been big on that.

Ted: Jim Coplien wrote a book about it, if I remember.

Herb: He has, yes. And there he makes the point well and frequently, because you have your C-style structured programming, you have object-oriented programming in the classic virtual-function dynamic-polymorphism sense, and you have type genericity at compile time with C++ templates—those are three big ones. But to make them orthogonal and composable—as soon as you say "multi-paradigm," I can apply whichever of those make sense to this part of the problem and combine them together. I can have a templated class with virtual functions and inherit from it. And people do this and it works. That is very hard. Because any time you want to make two things orthogonal, you want to make them at right angles so they don’t cast a shadow. If they’re wobbly, if you build a house that’s not square, then the roof will start having problems and cracks will appear. So working on the interactions has always been a big thing that I think C++ has done very well, so that you can in fact use these things together and actually have them interoperate, not just have three different silos that you can’t talk across.

Bjarne: I mean, people use generic algorithms on data types that turned out to be class hierarchies with virtual functions. They can fit together, and it’s not brain surgery.

Ted: Let me ask you this question real quick—and we’ll wrap up with this because, wow, the time just went like mad. Recently, at the Lang.NET symposium in Redmond, Anders [Hejlsberg] was in front of a crowd of mixed Microsoft and a large number of non-Microsoft folks (including Sun, some independents, and so forth). Anders made what I thought was a very, very interesting statement. He said that he believes we are quickly approaching a point where the traditional taxonomies of languages are breaking down. You don’t have a language that is just object-oriented any more, or just functional, or just procedural; we’re very quickly approaching a point where languages are just languages, and borrow from each of these historically separate categories. Talking about C++ as a multi-paradigm language, does this kind of resonate?

[Stroustrup sighs.]

Herb: You probably saw him [points to Stroustrup], because C++ has been there for 20 years.

Bjarne: I wrote a paper in 1980 explaining why you needed object-oriented techniques, inheritance and encapsulation, and how you needed to have parameterization to deal with containers and algorithms on containers. I got the mechanism for that parameterization seriously wrong, but the point is that within a year of starting C++, I was talking about inheritance, parameterization (that’s parametric polymorphism and generic programming), and combining them with traditional systems programming. This is very old news.

Herb: But, having said that, Anders is making a very good point—that this is becoming more and more broadly recognized, more and more mainstream. And that’s why, for example, you see exactly the thing that I mentioned earlier, about many languages adopting lambda functions and closures. Because you look at that, and you think, "Oh, that’s a really super-functional thing." Well, it is a part of functional languages, but you don’t need to be in a functional language where everything is an expression for it to make sense. It makes sense in imperative languages. For concurrency, it makes great sense to talk about a piece of code that you pass around as a first-class thing. And, really, although we talk about it in such an abstract way, it’s a shorthand for writing a function object. It’s a shorthand for writing a function, conveniently, locally—but, boy, what a difference it makes. The Java community has seen that in its use of those things, and the C# community as well.

Ted: That’s one of the things a lot of Java programmers remark on when they pick up a language like Python or Ruby that has that: "Oh, my gosh, this makes so much sense, to go collection.each { block of code }. Oh, this is amazing! We should have that."

Herb: But let me point out, when you say exactly that—

Bjarne: Smalltalk had it in the dark ages! Again, a lot of this stuff goes way back. We’ve gone through periods where people have had all kinds of what I thought silly wars, and denouncing languages that didn’t look exactly like they expect. There’s hybrids, and—

Ted: Right. You didn’t have semicolons and curly braces; therefore, you suck.

Bjarne: Oh, it’s so silly.

Herb: What he said. [Laughs.]

Ted: Gentlemen—wow, as always, the time flew. I enjoyed myself most immensely. I hope you enjoyed yourselves.

Bjarne: Time flew.

Herb: Good to see you each year.

Ted: Yeah. We’ll have to do this again, same time, next year. I think they should do a movie or a play about this.

Bjarne: Something like that. Good idea, good idea.

Ted: Take care, and enjoy the rest of the conference.

Herb: Thank you. You, too.

For more information, visit onpodcastweekly.com and subscribe to all our podcasts. Brought to you by the publishing imprints and information portal of Pearson Education.
