
The Convergence of Modern C++ on the Lisp Programming Style - effdee
https://chriskohlhepp.wordpress.com/convergence-of-modern-cplusplus-and-lisp/
======
nly
Saying it's converging on Lisp is taking a little artistic license.
It's fair to say C++ is becoming increasingly accommodating to whatever you
want to make of it _if_ you're willing to compromise and roll with it. For
sure, it's one of the very best and most practical languages for writing
extremely efficient implementations of algorithms.

I highly recommend watching Alex Stepanov's A9 lecture series, "Efficient
Programming with Components" [0], where he covers the humble min() function,
and variants thereof, for at least ~20 hours. During this time he also takes a
minor digression into writing a memory pool for linked-list nodes, while
pointing out similarities between the design and a Lisp. It's extremely
'instructive', as he would say, in understanding how he writes C++ code. The
sheer amount of code will horrify you, and it may even seem unproductive, but
I think his real objective was to get everyone thinking about the beauty of
the algorithms and the details rather than the end result. His concern for
performance, correctness of code, and composition of algorithms is interesting
and something I haven't seen presented in real time, in C++, in such a manner
before. For instance, on occasion he insists on writing functions backwards,
from the return statement up.
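
To give a flavor of that style (my sketch, not Stepanov's actual code): a
generic min that takes an explicit ordering and is careful to return the
first argument on equivalence -- a detail he dwells on at length in the
series:

```cpp
#include <cassert>
#include <functional>

// A generic min in the Stepanov spirit: takes an explicit
// StrictWeakOrdering and, for equivalent arguments, returns the
// first -- so that min and max together can be stable.
template <typename T, typename Compare>
const T& my_min(const T& a, const T& b, Compare cmp) {
    return cmp(b, a) ? b : a;  // b only wins when strictly smaller
}

// Convenience overload defaulting to operator< via std::less.
template <typename T>
const T& my_min(const T& a, const T& b) {
    return my_min(a, b, std::less<T>());
}
```

The `cmp(b, a)` rather than `cmp(a, b)` is the whole point: on a tie the
first argument is returned, which matters when sorting networks and
min/max pairs are composed from it.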

If you want a higher-level overview of his opinions on C++ and programming in
general, the less intense series "Programming Conversations" [1] is worth a
more casual watch.

[0] [https://www.youtube.com/playlist?list=PLHxtyCq_WDLXryyw91lah...](https://www.youtube.com/playlist?list=PLHxtyCq_WDLXryyw91lahwdtpZsmo4BGD)

[1] [https://www.youtube.com/playlist?list=PLHxtyCq_WDLXFAEA-lYoR...](https://www.youtube.com/playlist?list=PLHxtyCq_WDLXFAEA-lYoRNQIezL_vaSX-)

~~~
userbinator
> It's fair to say C++ is becoming increasing accommodating to whatever you
> want to make of it _if_ you're willing to compromise and roll with it.

It's also one of the few languages in common use that allows using a wide
range of abstraction levels, whatever is desired - I'm not aware of any other
language that lets you write inline Asm, procedural, OOP, and functional-style
code, all conceivably in the same source file.
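
A contrived sketch of what that looks like in one file (the inline asm uses
GCC/Clang extended-asm syntax, which is compiler- and architecture-specific,
hence the guard; all names here are mine):

```cpp
#include <algorithm>
#include <vector>

// Procedural + inline asm (GCC/Clang extended-asm, x86 only).
int add_asm(int a, int b) {
#if defined(__GNUC__) && (defined(__x86_64__) || defined(__i386__))
    asm("addl %1, %0" : "+r"(a) : "r"(b));  // a += b, in assembly
    return a;
#else
    return a + b;  // portable fallback
#endif
}

// OOP: a small class encapsulating state.
class Accumulator {
    int total_ = 0;
public:
    void add(int x) { total_ = add_asm(total_, x); }
    int total() const { return total_; }
};

// Functional style: a higher-order traversal with a lambda.
int sum(const std::vector<int>& v) {
    Accumulator acc;
    std::for_each(v.begin(), v.end(), [&](int x) { acc.add(x); });
    return acc.total();
}
```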

~~~
betterunix
Now you are aware of another, Common Lisp (as implemented by SBCL):

[http://www.sbcl.org/sbcl-internals/index.html](http://www.sbcl.org/sbcl-internals/index.html)

~~~
reikonomusha
You should underline that the ASM part is _extremely_ implementation
dependent. There is absolutely no language feature, or even a convention, for
inline ASM.

Also, I'd posit that inline ASM is a lot easier to reason about in C++ than it
is in Lisp, given C++'s rigid type structure. You're bound to shoot yourself
in the foot in Lisp, even more than in C++.

~~~
dbaupp
Inline ASM is implementation dependent in C++ too, isn't it?

~~~
reikonomusha
Yes, but the two big names in C++ compiler technology—GCC and Clang—are
largely compatible [1]. This is not the case with Lisp.

[1] [http://clang.llvm.org/compatibility.html#inline-asm](http://clang.llvm.org/compatibility.html#inline-asm)

------
reikonomusha
This article seems to put Lisp on a bit of a pedestal when it comes to types,
but languages like Haskell blow Lisp out of the water with respect to
expressiveness. There's also a lot in this article that plainly implies the
wrong things about what Lisp is capable of doing.

If we are to talk about types, I'd hope people would look at languages like
Haskell or Standard ML—or maybe even ATS, Agda, or Coq—for future direction,
_not_ Lisp.

Unlike Lisp, C++ lets one write efficient, generic algorithms that can operate
on several types of data. Lisp cannot, and falls back to dynamic type testing,
which makes for slower code[1]. Basically your only options in Lisp are to
specialize everything manually, or to inline everything. Both approaches are
extremely poor. As I hinted, Haskell and Standard ML do an even better job
than both C++ and Lisp. This is talked about a good bit in this article [2].

(Also, for the record, Lisp has no concept of information hiding, interfaces,
or APIs. And no, Lisp's packages don't count; they are just convenient
structures for grouping somehow-related symbols together.)

The article talked about declaring function types in Lisp. Many
implementations plainly _ignore_ those and they provide _no_ value. It doesn't
help that idiomatic Lisp code usually doesn't include such declarations.

Lastly, a lot of the things in this article were SBCL specific, and had little
to do with the Lisp language itself.

Like others have said, I don't think any of these points demonstrate that C++
is converging on Lisp. And if it were, that's probably a bad direction anyway,
given the author's initial statements about heading toward an algorithmic
language.

[1] No implementation of Lisp has true, flexible compile-time polymorphism or
any type-algebraic structure; all inferred types must be concretely realized
during the inference.

[2] [http://symbo1ics.com/blog/?p=1495](http://symbo1ics.com/blog/?p=1495)

~~~
jonnybgood
What is up with Haskellers starting these irrelevant Lisp vs Haskell
arguments? OP made no mention or implication that Lisp is better than Haskell.
It's petty and counterproductive.

~~~
reikonomusha
This isn't a Lisp versus Haskell argument. It's a matter of fact that
Haskell's (and others') type systems are more expressive than Lisp's,
practically (and almost theoretically).

Also, I'm not a "Haskeller". I do program in Haskell occasionally, though. I'm
actually, at this time, a professional Common Lisp programmer.

~~~
jonnybgood
And how is that relevant to the OP? The OP made no counterclaim.

~~~
reikonomusha
My bringing up Haskell and friends wasn't to counter anything, but merely to
observe that there are better approaches to type systems than what is provided
by even Lisp. And I wished to add that _if_ one is to look for inspiration on
building expressive type systems, then Haskell should be the place to look,
not Lisp.

I also wanted to point to Haskell as a kind of system which provides an
inference engine that is much stronger than Lisp's. I don't want anyone to get
the idea that Lisp implementations' inference engines are even comparable to
what exists elsewhere.

------
pbsd
The latest revision to the C++ standard, including polymorphic lambdas, makes
the xplusone definition much terser:

    
    
        auto xplusone = [](auto x){ return x + 1; };
    

While at it, here's how Paul Graham's accumulator generator looks in C++14:

    
    
        auto foo = [](auto n){ return [=](auto i) mutable { return n += i; }; };
    

This does not affect the point of the article in any major way; just a
curiosity.
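
For the record, the closure behaves just like Graham's original -- each
accumulator carries its own mutable state (renamed `make_acc` here for
clarity):

```cpp
// Paul Graham's accumulator generator in C++14: make_acc(n) returns a
// closure holding a running total seeded with n. "mutable" lets the
// copy-captured n be modified across calls.
auto make_acc = [](auto n) { return [=](auto i) mutable { return n += i; }; };
```

Calling `make_acc(100)` yields a closure where successive calls `acc(10)`,
`acc(10)` return 110 and then 120; a second accumulator shares no state with
the first.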

------
Daishiman
At the risk of getting downvoted for what could be a humongous display of
ignorance: although the theoretical grounding of the article seems
interesting, I fail to see how any of this can be applied to "real-life"
applications, when it takes basically ten pages of content to apply this to
the addition operation and to other problems which are, for most day-to-day
practice, already solved quite succinctly.

Perhaps it's just me, but it seems like an awful lot of conceptual baggage to
do things that can be expressed with much greater simplicity, and without
resorting to concepts that need multiple years of expert knowledge of the
language to achieve this "elegance". And I understand the theoretical
elegance; it's just that I have to ask myself if this truly makes an actual
difference in code style, simplicity, and clarity of language.

~~~
lisper
No, you are exactly right. The C community and its progeny have, over the
years, dug themselves into a deep, deep syntactic and semantic hole. They are
now coming to realize that the Lisp folks had some good ideas after all, but
because of the sunk cost fallacy the C folks are unwilling to just back out of
the hole. Instead, they keep digging it deeper and deeper in the hope that
some day it will emerge into the light. C++11 is interesting in the same way
that Brainfuck is interesting: it's remarkable that you can take such a
horrible mess and make it do cool things. But it's just a parlor trick, kind
of like escaping from a straightjacket while handcuffed. It's challenging, and
it requires skill, but at the end of the day you're in the exact same place as
if you'd never put the handcuffs and straightjacket on to begin with.

~~~
hdevalence
> at the end of the day you're in the exact same place as if you'd never put
> the handcuffs and straightjacket on to begin with.

Do you have any suggestions on what tools the people doing 'parlor tricks'
with C++11 should be using to accomplish them, then? That is, what would you
suggest to get to that place without the straightjacket?

Note that at a bare minimum, these supposed tools should give the ability for
fine-grained manual resource control, zero-(runtime)-cost abstractions, and
performance roughly on-par with C++. As evidence that these hypothetical tools
work well for the purpose, we could look for some complex _and_ high-
performance software written in them: say, a browser engine, a 3d engine, a
kernel, etc., but wait a minute -- these are all things that tend to be
written in C++ or plain C.

Instead of glibly dismissing the language that's used to implement, say, every
major browser engine, wouldn't it be more productive to ask questions about
_why_ people use it? Bonus points if the answer is something more realistic
than "They don't know Lisp".

That way, you end up trying to figure out how to make a replacement for it
that is an actual replacement -- Rust is a fantastically exciting example of
this.

It'll be great when we can all move away from C++, since it's a colossal
clusterfuck of counterproductive complexity, but "C++ is a bad language"
misses the point in a really uninteresting kind of way.

~~~
lisper
> Do you have any suggestions on what tools the people doing 'parlor tricks'
> with C++11 should be using to accomplish them, then? That is, what would you
> suggest to get to that place without the straightjacket?

Duh, Lisp of course. (Or Haskell.)

> Note that at a bare minimum, these supposed tools should give the ability
> for fine-grained manual resource control

Check. Lisp provides garbage collection, but you don't have to use it. It's
perfectly possible to write Lisp programs that do manual memory management. It
isn't often done because it's hardly ever a win, but if you really want to you
can.

> zero-(runtime)-cost abstractions

a.k.a. macros

> and performance roughly on-par with C++.

The SBCL compiler is pretty good. But one of the reasons that Lisp code is not
generally as fast as C/C++ is that Lisp code is safe by default whereas C/C++
code is not. You can make Lisp code unsafe (and hence faster) but you have to
work at it, just as you can make C/C++ code safe, but you have to work at it.
I submit that in today's world, being safe and a bit slower by default might
not be such a bad place to be in the design space.

> these are all things that tend to be written in C++ or plain C.

There is a world of difference between C++ and plain C. C is actually not a
bad language if you want to write fast code with relatively little effort and
don't care about reliability or security. The value add of C++ over C is far
from clear. (I don't know of any OS written in C++. Linus wrote a famous rant
about why Linux is written in C and not C++. There are, however, examples of
operating systems written in Lisp.)

> wouldn't it be more productive to ask questions about why people use it?

I know why people use it: it's fast, there is a huge installed base, and it's
an excellent platform for studly programmers to display their studliness. That
doesn't change the fact that C++ has deep design flaws which result in its
being incredibly hard to use and extend. And the existence of coders studly
enough to be productive in C++ does not change the fact that it imposes an
extremely high cognitive load on its users.

~~~
hdevalence
> a.k.a. macros

No, macros are not what I'm talking about here: I mean that C++ provides
abstractions that only impose runtime costs if you use them. For instance, the
cost of vtable lookup is only paid if you are using virtual functions;
otherwise, you don't have any overhead for function calls beyond what's
imposed by the hardware.

As noted elsewhere in the thread:

> Unlike Lisp, C++ lets one write efficient, generic algorithms that can
> operate on several types of data. Lisp cannot, and falls back to dynamic
> type testing, which makes for slower code[1]. Basically your only option in
> Lisp is to specialize everything manually, or inline everything. Both
> approaches are extremely poor. As I hinted, Haskell and Standard ML do an
> even better job than both C++ or Lisp. This is talked about a good bit in
> this article [2].

That's what I mean when I say 'zero-cost abstraction'.
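
To make the contrast concrete (a toy sketch of mine, not from the article):
the virtual version pays an indirect call per element, while the template
version is stamped out per type at compile time and the call can be inlined
away entirely.

```cpp
#include <vector>

// Runtime polymorphism: every call goes through the vtable.
struct Op {
    virtual int apply(int x) const = 0;
    virtual ~Op() = default;
};
struct Double : Op {
    int apply(int x) const override { return 2 * x; }
};

int sum_virtual(const std::vector<int>& v, const Op& op) {
    int s = 0;
    for (int x : v) s += op.apply(x);  // indirect call each iteration
    return s;
}

// Compile-time polymorphism: the compiler generates one sum_generic
// per operation type, and op(x) is a direct, inlinable call.
template <typename F>
int sum_generic(const std::vector<int>& v, F op) {
    int s = 0;
    for (int x : v) s += op(x);
    return s;
}
```

Both compute the same thing; only `sum_virtual` carries per-call overhead,
and only if you opted into the `virtual` abstraction -- which is the
"you pay only for what you use" point.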

~~~
lisper
Lisp is exactly the same. You only pay the run-time cost of generic functions
and dynamic type dispatch if you use them. What many people get hung up on is
that in Lisp you get generic type dispatch by default, so to not pay that cost
you have to do some work (declare types).

------
piokuc
As a Lisp enthusiast who earns a living coding in C++ I can only say one
thing: there's still a long way to go for C++. Very long, indeed.

~~~
dang
You're preaching to the choir in my (personal) case, but what specifically do
you have in mind?

~~~
wglb
I'll chime in. Macros and simple syntax.

------
laichzeit0
I last worked with C++ 10 years ago and haven't looked at it since; it seems
like some beast going through various stages of evolution. Perhaps in another
5 or 10 years it will emerge as something beautiful, without any vestigial
appendages.

What's interesting from a language perspective is that the C++ folks seem to
take the approach of "ugly is better" by essentially not breaking backwards
compatibility like Python 3 and Perl 6 did. The "clean break" approach seems
to create a new species and hope that the old one goes extinct (didn't happen
with Perl, probably won't). But C++ just keeps evolving.

------
username223
This is just embarrassing. Modern C++ is moving toward intricate types like
Haskell, not parser and compiler mods like Lisp.

~~~
AnimalMuppet
That's because, despite the article, C++ is not actually trying to become
Lisp.

And, for all the smug superiority of the Lisp crowd, the niche that C++ is
trying to occupy does seem to be significantly bigger than the Lisp niche...

------
vinkelhake
The terminology appears to be somewhat mixed up. xplusone<int> is not a
specialization, just an instantiation of the template.

Specialization is when you tell the compiler to use a separate implementation
of a template when some or all of the template arguments match something.
Template specialization is used in the type traits section of the post.
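
A minimal illustration of the distinction (`plus_one` standing in for the
article's xplusone; the trait is a made-up example):

```cpp
// Primary class template.
template <typename T>
struct is_pointer_like {
    static const bool value = false;
};

// Partial *specialization*: a separate implementation the compiler
// picks whenever T matches the pattern T* -- the mechanism the type
// traits in the post are built on.
template <typename T>
struct is_pointer_like<T*> {
    static const bool value = true;
};

// Using plus_one<int> merely *instantiates* this template for int;
// no separate implementation is involved.
template <typename T>
T plus_one(T x) { return x + 1; }
```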

------
adw
For someone who wants to learn "modern C++", particularly for number-
crunching, without a strong C background (say... someone who develops
predictive models who mostly writes Python and Java, not that I resemble that
or anything...) – where would be a good place to start?

~~~
matt_d
0\. Books--one of these: [http://isocpp.org/get-started](http://isocpp.org/get-started)

Personally, if you aren't new to programming per se, I'd go with "C++ Primer"
by Lippman/Lajoie/Moo since it smoothly integrates modern C++11 throughout the
entire text (instead of sticking it into a separate section, as some of the
other books do).

After that, "C++ Concurrency in Action: Practical Multithreading" by Anthony
Williams: [http://www.manning.com/williams/](http://www.manning.com/williams/)

...and then the rest of the books from the isocpp list (e.g., Josuttis).

1\. Libraries:

The rich ecosystem of available libraries is one of my primary reasons for
using C++ for numerics :-)

In fact, it's rich enough that it may be best if you were to specify what kind
of number crunching you're interested in -- right now I can only try to give
you a very broad/big-picture list of some that I've found useful.

The Standard Library supports (P)RNG with a variety of statistical
distributions:
[http://en.cppreference.com/w/cpp/numeric/random](http://en.cppreference.com/w/cpp/numeric/random)

\- Boost.Math Toolkit: [http://boost.org/libs/math](http://boost.org/libs/math) // and more broadly: [http://boost.org/doc/libs/?view=category_Math](http://boost.org/doc/libs/?view=category_Math) // and even more broadly ;-): [http://www.boost.org/doc/libs/?view=categorized](http://www.boost.org/doc/libs/?view=categorized)

\- Eigen: [http://eigen.tuxfamily.org/](http://eigen.tuxfamily.org/)

\- GPGPU: [http://www.soa-world.de/echelon/2014/04/c-accelerator-librar...](http://www.soa-world.de/echelon/2014/04/c-accelerator-libraries.html)

\- MLPACK: [http://mlpack.org/](http://mlpack.org/)

\- NLopt: [http://ab-initio.mit.edu/wiki/index.php/NLopt_C-plus-plus_Re...](http://ab-initio.mit.edu/wiki/index.php/NLopt_C-plus-plus_Reference)

\- OpenCV: [http://opencv.org/](http://opencv.org/)

\- Odeint: [http://www.odeint.com/](http://www.odeint.com/)

\- POCO: [http://pocoproject.org/](http://pocoproject.org/) // note: not numerics, but when you need to exchange data over the net/web, it's pretty good for that :-)

\- QuantLib: [http://quantlib.org/](http://quantlib.org/) // note: QuantLib is primarily for quantitative finance, but also has math components: [http://quantlib.org/reference/group__math.html](http://quantlib.org/reference/group__math.html)

\- SOCI: [http://soci.sourceforge.net/](http://soci.sourceforge.net/) // note: not numerics, but for when you need database access; it has a pretty clean API and is easy to use :-)

2\. Talks:

* C9 Going Native: [http://channel9.msdn.com/Shows/C9-GoingNative](http://channel9.msdn.com/Shows/C9-GoingNative)

In particular:

\+ "Bjarne Stroustrup - The Essence of C++: With Examples in C++84, C++98, C++11, and C++14" \- [http://channel9.msdn.com/Events/GoingNative/2013](http://channel9.msdn.com/Events/GoingNative/2013)

\+ "Sean Parent - C++ Seasoning" \- [http://channel9.msdn.com/Events/GoingNative/2013/Cpp-Seasoni...](http://channel9.msdn.com/Events/GoingNative/2013/Cpp-Seasoning)

* BoostCon / C++Now!: [https://github.com/boostcon/](https://github.com/boostcon/)

There's _lots_ of interesting talks, so explore yourself :-)

For instance, 2013 Keynote: "Dan Quinlan: C++ Use in High Performance
Computing Within DOE: Past and Future" //
[http://2013.cppnow.org/session/keynote/](http://2013.cppnow.org/session/keynote/)

// IMHO, it's worth watching these for staying up to date with the broader
developments in the field -- e.g., according to the speaker (given who he is
I'd assume credibility) most national labs, including Lawrence Livermore
National Laboratory in particular, are quite actively adopting C++ (not C) and
have been turning away from Fortran for some time now.

HTH! :-)

~~~
adw
Thank you so much!

(To answer your question: linear algebra and optimization. I do a lot of Numpy
right now, and I see Eigen in my future...)

------
frozenport
Feels more like Python because now we have lambdas and I can write auto
everywhere.

