
C++: Is It Really a Cruel Joke? (2003) - rhapsodic
http://webhome.phy.duke.edu/~rgb/Beowulf/c++_interview/c++_interview.html
======
nf05papsjfVbc
Dr. Stroustrup took great pains to ensure the language was useful and did
not break backward compatibility. He also always maintained that if you don't
use a feature you shouldn't pay for it (in terms of performance). Eventually
the language became even more popular, and there is now a whole organisation
behind its international standard. The book "The Design and Evolution of C++"
helps one understand why some things are the way they are, and why some things
exist in the language in the first place.

The language takes a lot of flak, but I think it is quite good given its
constraints, and the modern version is an amazing leap forward in ease of use
as well.

~~~
ttul
I read that book around 1998 and was an instant convert. C++ has a steep
learning curve, and does not stop you from getting yourself into a lot of
trouble, but it does stand up to Bjarne’s original promise about performance.

IMHO the STL is one of mankind’s greatest accomplishments.

~~~
jzwinck
The STL doesn't always live up to the standard of "zero-overhead
abstractions." std::list is not high performance, many implementations of hash
tables never shrink (and used to have O(n) erase!), and std::deque is a bad
joke (no chunk size control, plus a laughable fixed chunk size on some common
platforms).

Many high performance projects use vector but few other containers.

If the STL had a lesson to teach us, it should have been "iterators everywhere,
including for your own algorithms and container types." Instead most people
learned "C++ has all the containers you need built in; throw away your
performance tricks and let it call malloc a million times."
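
Roughly what that lesson looks like in practice (a hedged sketch; the container
is made up): even a tiny hand-rolled container can hand out plain pointers as
its iterators, and every standard algorithm then works on it for free.

        #include <algorithm>
        #include <cstddef>
        #include <iostream>

        // A small fixed-capacity buffer exposing pointers as iterators,
        // so <algorithm> and range-for work on it with no further code.
        struct SmallBuf {
            int data[8];
            std::size_t len = 0;

            int* begin() { return data; }
            int* end() { return data + len; }
            void push(int v) { if (len < 8) data[len++] = v; }
        };

        int main() {
            SmallBuf b;
            b.push(3); b.push(1); b.push(2);
            std::sort(b.begin(), b.end());          // standard algorithms...
            for (int v : b) std::cout << v << ' ';  // ...and range-for, for free
            return 0;
        }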

~~~
jstimpfle
> Many high performance projects use vector but few other containers.

Right. And then it is actually a lot simpler to use pointer + size pairs [1]
instead of std::vector. Changing to explicit allocation was the best decision
I've made. I no longer find myself longing for any C++ features at all. I
haven't needed anything besides a little allocation wrapper [2] and maybe a
string-to-hash map since.

[1] Or n pointers + 1 size for parallel arrays, which suggests it's a bad idea
to glue pointer and size together in the first place.

[2]
[https://gist.github.com/jstimpfle/562b2c3e9fe537e378351bb9d5...](https://gist.github.com/jstimpfle/562b2c3e9fe537e378351bb9d5be8cdb)
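
A minimal sketch of the style, just to illustrate (this is not the wrapper
linked above, and error handling is elided):

        #include <stdlib.h>

        /* Grow-by-doubling push onto a plain pointer + length (+ capacity).
           Hypothetical helper; check realloc's return in real code. */
        void push_int(int **items, size_t *len, size_t *cap, int value) {
            if (*len == *cap) {
                *cap = *cap ? *cap * 2 : 16;
                *items = (int *) realloc(*items, *cap * sizeof **items);
            }
            (*items)[(*len)++] = value;
        }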

~~~
jstimpfle
> string-to-hash map

string-to-int hash map

------
leajkinUnk
> There was this Oregon company - Mentor Graphics, I think they were called -
> really caught a cold trying to rewrite everything in C++ in about '90 or
> '91. I felt sorry for them really, but I thought people would learn from
> their mistakes.

I've talked to some of the people at Mentor Graphics who were there during
that period. The company basically went from #1 in the industry to #3 or so
during the course of the C++ refactor (and the EDA industry isn't exactly
big). Bjarne Stroustrup showed up at the company now and then because the
company was such a major early adopter. Inheritance chains were 5 or 10
classes deep. A full build took a week. The company hosted barbecues on
weekends and invited the employees' families so they could see each other.

I only worked there somewhat later, so I just heard the stories from people
who were there at the time, and only after I had been there a while. Take some
old-timers out to lunch now and then; you'll learn a lot. I ended up leaving:
I was more than a bit frustrated by the organizational culture, and the build
system my team used was by far the worst I have ever seen in my entire life.

But Oregon is an awesome place to live, the salary was good, and the hours
were normal.

~~~
bunderbunder
> Inheritance chains were 5 or 10 classes deep.

30 years later, I still run into the same problem regularly. I'm not sure why,
but this seems to be an anti-pattern that everyone needs to learn about the
hard way.

~~~
DerDangDerDang
I blame Java.

~~~
bunderbunder
Don't blame Java. Blame programming instruction.

If it's anything like when I was in school, as soon as the curriculum trots
out its first object-oriented language, you get a lecture about how is-a
relationships are the greatest invention since the compiler, and deep
inheritance hierarchies are both the most practical and the most morally
righteous way to organize your abstractions.

(Meanwhile, ironically enough, I'm not sure I've _ever_ heard a CS instructor
even mention the Liskov Substitution Principle.)
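
(For anyone who hasn't seen it, here is the textbook LSP counterexample,
sketched in C++ with the usual toy types: taxonomically a Square "is-a"
Rectangle, yet it can't honor Rectangle's behavioral contract.)

        #include <cassert>

        struct Rectangle {
            virtual void set_width(int w)  { width = w; }
            virtual void set_height(int h) { height = h; }
            int area() const { return width * height; }
            virtual ~Rectangle() = default;
            int width = 0, height = 0;
        };

        // Deep is-a modeling says a square is just a constrained rectangle...
        struct Square : Rectangle {
            void set_width(int w) override  { width = height = w; }
            void set_height(int h) override { width = height = h; }
        };

        void use(Rectangle& r) {
            r.set_width(4);
            r.set_height(5);
            assert(r.area() == 20);  // holds for Rectangle, fires for Square
        }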

------
a_bonobo
I feel compelled to note that this page is from 2003-05-13, so I have no clue
how many of its gripes are still valid in 'modern' C++ :)

~~~
colanderman
Ya this is a whiny garbage post that looks like something I would have written
when I was 18 had I been forced to use C++ for a school project (and probably
could have been, given its year of publication). I'm not sure it adds anything
to human discourse. It certainly isn't applicable to modern C++, or even the
modern zeitgeist of pre-modern C++.

~~~
xfer
You can still do all those things in modern C++, in addition to all the stuff
that gets added every three years. The only thing that has improved
dramatically is the tooling: compilers, static/dynamic analyzers.

> And, as I said before, every C++ programmer feels bound by some mystic
> promise to use every damn element of the language on every project.

------
martincmartin
"There are only two kinds of languages: the ones people complain about and the
ones nobody uses."

-- Bjarne Stroustrup [1]

[1] [http://www.stroustrup.com/bs_faq.html#really-say-that](http://www.stroustrup.com/bs_faq.html#really-say-that)

~~~
kibwen
While the OP article is indeed unhelpful and unconstructive, it's a pet peeve
of mine to see people reference this quote (in the context of any language,
not just C++) as a way of deflecting criticism. Not all languages that are
used get complaints in equal proportion, and finding the constructive
complaints is how we work towards making all languages gradually better.

~~~
Jyaif
> Not all languages that are used get complaints in equal proportion

It's because they are not used in equal proportion.

~~~
recursive
Presumably "equal proportion to their use" is implied.

------
thestoicattack
From the "interview" with "Stroustrup":

> ... You know, when we had our first C++ compiler, at AT&T, I compiled 'Hello
> World', and couldn't believe the size of the executable. 2.1MB

> Interviewer: What? Well, compilers have come a long way, since then.

> Stroustrup: They have? Try it on the latest version of g++ - you won't get
> much change out of half a megabyte.

So, for grins, I did, with the gcc7 port from macports, which is GCC 7.3.0.

-rwxr-xr-x 1 ssta staff 8968 Jul 5 10:40 hello
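
(The C++ source being the usual hello world:)

        #include <iostream>

        int main() {
            std::cout << "Hello World" << std::endl;
            return 0;
        }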

The C version, using printf instead of (gasp) std::cout, clocked in at 8432
bytes.

~~~
abhishekjha

        #include <stdio.h>
        int main() {
                printf("Hello World");
                return 0;
        }
    

>5649591 -rwxr-xr-x 1 user user 8288 Jul 5 21:49 a.out

What am I doing wrong?

~~~
fesoliveira
Did you use -O3 or some other optimizations?

~~~
abhishekjha
No, just `gcc a.c`.

------
jcelerier
> It is sufficiently fascist that it more or less "forces" students to program
> with a certain discipline. Never mind that no real coder EVER writes
> programs with that particular discipline once they get out of diapers...

I wonder what the author's opinion of Rust would be.

~~~
kibwen
_> > Never mind that no real coder EVER writes programs with that particular
discipline once they get out of diapers..._

This line appears to be a snide remark about class-based OOP, a regular
criticism of which is that real-world projects rarely slot neatly into the
sort of "Cow is-a Mammal is-a Animal" taxonomical hierarchy used to teach
class-based OOP in school. Rust doesn't have classes or taxonomical
hierarchies; it encourages struct-first POD design (similar to C), augmented
by traits, which provide shallow has-a relationships (composition) rather than
deep is-a relationships (inheritance).

------
pjc50
2001ish? Besides, it's a really tired debate/joke. Yes, it's hard to learn and
has a bunch of misfeatures. But, like JavaScript, it's widely adopted and in
use on so many real projects that you're not going to get to rewrite them.

------
Raphmedia
If you only want to read the joke interview, here's a website with a better
layout:
[http://harmful.cat-v.org/software/c++/I_did_it_for_you_all](http://harmful.cat-v.org/software/c++/I_did_it_for_you_all)

------
DonHopkins
No joke! No joke! You're the joke!

[http://www.stroustrup.com/whitespace98.pdf](http://www.stroustrup.com/whitespace98.pdf)

Generalizing Overloading for C++2000. Bjarne Stroustrup. AT&T Labs, Florham
Park, NJ, USA.

------
cogs
I know C++ isn't really dying off, but what _is_ the best platform-agnostic
compiled object-oriented language these days?

I liked Borland Pascal, and I know Delphi is kind of ticking along, but I'd
rather invest in a language that is growing.

Swift looks nice but still seems too Apple-focused.

Don't want to start a war, just open to some tips on the ecosystem.

~~~
AltVanilla
People often think of C++ as an object-oriented language. That's
understandable, as OOP was the key feature back in the earliest days. OOP got
hyped and overused. Now OOP is (almost) considered harmful in the C++
community.

If you look at modern C++ libraries (like Boost), you will see a lot of
templates and free functions, and not a lot of inheritance or dynamic
polymorphism.
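
A rough illustration of that style (hypothetical names): a single free
function template does what an inheritance design would solve with a base
class and virtual dispatch.

        #include <cstddef>
        #include <iostream>
        #include <string>
        #include <vector>

        // Works for any element type with a size() member; resolved at
        // compile time, with no base class or virtual call involved.
        template <typename T>
        std::size_t total_size(const std::vector<T>& items) {
            std::size_t n = 0;
            for (const auto& item : items)
                n += item.size();
            return n;
        }

        int main() {
            std::vector<std::string> words{"hello", "world"};
            std::cout << total_size(words) << '\n';  // prints 10
            return 0;
        }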

~~~
jcelerier
> If you look at modern C++ libraries (like Boost), you will see a lot of
> templates and free functions,

I mean, people were already calling for more templates and free functions in
1997. It's not modern by any stretch of the imagination; it's just normal C++.

------
_d9hm
That (fake) interview at the end is hilarious.

~~~
jcelerier
I think it's from cat-v :
[http://harmful.cat-v.org/software/c++/I_did_it_for_you_all](http://harmful.cat-v.org/software/c++/I_did_it_for_you_all)

------
jacob019
Having spent the past decade working with Python, I've always considered C to
be a mystical black box for hardcore systems guys. I recently needed to write
some high performance networking code, so I decided to give it a shot. At
first I found it painfully verbose and confusing, with the simplest operations
requiring multiple lines of code. After a couple of days I found it to be
rather refreshing: it encourages you to look at memory from a more basic
perspective, and efficiency just falls into place. Pointers are confusing and
strict typing takes some getting used to, but they are awesome. It's great to
be able to use the same memory with different types and structures, so much
better than copying things around. I still love Python, but I will be using C
again when performance matters.
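
The kind of thing meant, sketched with a made-up wire format: the same buffer
bytes viewed through a structure, no copying of the payload (byte-order
handling omitted for brevity):

        #include <stdint.h>
        #include <string.h>

        /* Hypothetical fixed-layout message header. */
        struct msg_header {
            uint16_t type;
            uint16_t length;
            uint32_t seq;
        };

        /* Read the sequence number straight out of a raw receive buffer;
           memcpy of just the header sidesteps alignment/aliasing pitfalls. */
        uint32_t peek_seq(const unsigned char *buf) {
            struct msg_header h;
            memcpy(&h, buf, sizeof h);
            return h.seq;
        }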

~~~
gtycomb
I stumbled into Nim recently and this thing "just works". Python-like syntax,
performance like C. This is what C should have been, I'd say.

[https://nim-lang.org/](https://nim-lang.org/)

~~~
mlthoughts2018
I would encourage you to use Cython instead of Nim if you're looking for
Python-like syntax combined with C performance.

~~~
gtycomb
This is just my situation, but I gave up on Cython when I started moving
forward with it (distributed graph algorithms); I found that I had more things
to learn. With Nim, it took me half a day (after seeing the language for the
first time) to get a basic numerical process going, and another few days for
distributed data communication involving messaging and postgres/mongo. (Nim
has a modern JavaScript (ES6) promise/async/await-like concurrency model that
is powerful and succinct, like the language itself.)

~~~
mlthoughts2018
I find that strange, but I concede you know your use case better.

I work on a very similar type of application that manages async workers that
process large distributed NLP tasks. Writing it in Cython was extremely easy,
because for the modules that have zero need for static typing, such as the
part using async/await in Python 3, or when we supplement with gevent, I can
just write those parts in plain Python. That's quite a bit easier than Nim or
Cython or whatever else, while still getting great performance from those
tools' low-level implementation.

Then for the parts that do possibly benefit from static typing and compilation
(unlike the async layer), I can have precise module-level control over what
has a C-level implementation and if or how it interacts with anything in
Python.

The inability to separate the two situations in Nim (as with many statically
typed languages) just doesn’t work out well enough for my use cases.

In fact, I’d even go as far as to advocate that in today’s language landscape,
if you want to write a new greenfield project in C or C++ for performance
reasons, it’s unequivocally your best option to write the whole thing in
Cython, and avoid what you might call “premature static typing optimization”
by profiling and leaving the things with no bottleneck in Python.

~~~
gtycomb
Hi - thanks for outlining your case. My Cython know-how is too limited to
qualify me to make any kind of comparison with Cython. It's just that my
attempt with Nim went surprisingly smoothly for me. Coming from some of the
older languages I use and love, the reliability of the newer Nim is pleasing
and coding is fun. I ended up eliminating all Python code from my back-end.
Because Nim is statically compiled I can deploy pieces of it anywhere just
like a compiled C program, without dependencies, and that means a lot in my
particular situation.

~~~
mlthoughts2018
> "the reliability of the newer Nim is pleasing and coding is fun."

Can you elaborate on the reliability part? I've spent a lot of time grokking
Nim specifically to be able to make good judgments about whether there are use
cases in which it would be a better choice than Cython, and from a reliability
point of view I have not noticed anything that would distinguish Nim from any
other language. I can agree that Nim's syntax is nicer than many other
statically typed languages, though the language design has some warts with
`result` and `discard`, etc. But I can't see any reason to believe it is 'more
reliable.'

> "I ended up eliminating all Python code from my back-end."

While I can't know the reason for this in your exact case, generally this
seems like a very suboptimal thing to do. Python has a much richer set of
libraries, testing utilities, etc. It is a language with a huge community of
users and developers, and much more likely to be a known language for someone
new who joins the project. If a system was working well and someone proposed
to refactor away a solid base language like Python, that would almost always
be a crazy choice, regardless of any positive aspects of the targeted new
language. It's similar to why you should rarely throw away old code that has
meaningful tests. You can slowly refactor it little by little, but wholesale
switching to something else is usually evidence of wrong engineering
priorities, _especially_ when the something else is a 'latest and greatest'
kind of new language or tool, like Nim is.

> "Because Nim is statically compiled I can deploy pieces of it anywhere just
> like a compiled C program, without dependencies, and it means a lot in my
> particular situation."

This can also be done with Cython, using the options to embed an
interpreter... and there are various other third party tools that allow you to
create thick binaries for combined Python programs as executables, including
runtimes and dependencies. To boot, you definitely should be managing the
deployment of some binaries with proper dependency management practices. So
really, if you're already using dependency management techniques for the
binaries, the minor extra work to maintain Python environments and
dependencies would almost always be pretty trivial, with a huge family of
tools (pip, conda, pipenv, virtualenv, etc.) and endless tutorials on the
community-developed and mature best practices for packaging Python programs.

I would be curious to know more details about a project where it was truly
advantageous from a productivity and deliverability point of view to rewrite
the backend to move from a stable and mature ecosystem like Python to a
relatively younger and less mature system with Nim specifically to gain a
benefit somehow related to ease of deploying pieces of the code to different
locations. The details just don't sound like they could possibly be in favor
of using Nim in a case like that.

~~~
dom96
I won't comment on Cython as I haven't personally used it much, but:

> though the language design has some warts with `result` and `discard`, etc.

Why do you consider these to be warts?

~~~
mlthoughts2018
For `result`, the basic description from Nim’s example pages highlights why
it’s a severe problem.

< [https://nim-by-example.github.io/variables/result/](https://nim-by-example.github.io/variables/result/) >

Even just needing to perform the mental gymnastics around the implicitly
declared result variable is, I think, not forgivable.

The bigger issue though is that initialization of the return variable is
implicit, which is inherently problematic. This is especially troublesome
because type constructors in Nim are essentially always separate factory
functions. So you have to remember to manually call a particular constructor
or else `result` might be just an improperly initialized skeleton of your data
type.

For example, I might have some type called MyType and a proc with return type
of MyType. I explicitly _don’t_ want the proc to initialize `result` to an
empty MyType behind the scenes, for whatever implementation reasons about
MyType (a common example is a type that ought to be initialized with the
acquisition of a resource and should _never_ exist in a partially initialized
state in which the resource acquisition hasn’t been attempted yet, and could
possibly fail later).

If I _only_ want it to be initialized from a special constructor like
mkMyType(), then in Nim, I have to code around this limitation by making it a
void proc, and passing in an appropriately mutable reference.

In other words, to avoid possibly inappropriate return type initialization, I
am forced to revert to poor C-style void functions all over that mutate
placeholder inputs by convention, which undermines a lot of things Nim tries
to do to improve clarity about pure vs impure procs.

I don’t have time to go into why discard is a bad design idea right now, but
hope to come back and add more later.

~~~
dom96
> For `result`, the basic description from Nim’s example pages highlights why
> it’s a severe problem.

This is simply an explanation for newcomers. It's really not something that's
a "severe problem", just something to be aware of.

> The bigger issue though is that initialization of the return variable is
> implicit, which is inherently problematic. This is especially troublesome
> because type constructors in Nim are essentially always separate factory
> functions. So you have to remember to manually call a particular constructor
> or else `result` might be just an improperly initialized skeleton of your
> data type.

This really isn't an issue when you can do this:

    
    
        import options
    
        type
          MyFile = Option[int]
    
        proc getFile(): MyFile =
          # Oh no, I didn't initialise it...
          discard
    
        echo(getFile()) # -> none[int]
    

You can also use `ref T` and achieve a similar effect: an explicit "empty"
state. So there is no weird semi-empty state problem here.

I would really like to hear why you think `discard` is a bad design idea. I
honestly cannot even imagine a reason as I consider this to be one of the best
features of Nim.

~~~
mlthoughts2018
This is not a reasonable answer regarding initialization, because you may not
want to wrap everything in an Option type, especially not for an obscure
side-effect reason like initialization, and then need to litter Option
handling all over, which destroys a lot of information in your types. It would
be like misusing Maybe in Haskell as if it were for exception handling: you
could never write pure functions that get lifted to utilize Maybe; instead
you'd be forcing people to manually use Maybe everywhere, for all signatures.
I had also already pointed out the `ref T` option in my original comment, as
an example of exactly the type of anti-pattern you constantly have to code
around in Nim.

It’s not reasonable to suggest you have to code past this intrinsic limitation
_everywhere_ by muddying all your function signatures to take Option types and
adding extra logic to pack or unpack values from Option types all over... to
solve an _initialization_ problem!

~~~
Araq
That's true and I wouldn't use the Option type for that either. There is a
switch that warns about variables that are not initialized explicitly
(including the 'result' variable) and an RFC to make this switch non-optional.
[https://github.com/nim-lang/Nim/issues/7917](https://github.com/nim-lang/Nim/issues/7917)

------
ben509
> C++ is more insidious -- because it IS a superset of C, it sucks you in, and
> (like Pascal) it has been embraced by computer science departments
> everywhere

I've heard of this, but does it happen?

I did my degree back in the '90s at Drexel, and I don't recall there being any
prescribed language... I did some systems stuff in C, some AI courses in LISP,
a concurrency course in Java, some stuff I don't recall in Perl; some math
courses used Maple, and there was a bit of shell and some familiarization with
the Solaris boxes, etc.

I don't think I ever actually took a course on a specific language, you were
just supposed to RTFM and figure it out.

------
abhishekjha
I think there should be a separate course on memory management with respect to
any language, so that people can get a better perspective on what they are
doing. A few books explaining modern memory management principles would help
as well.

------
olly67
I used to do a lot of C++ programming, but over time I found that using either
C or a real high-level language (Python, Ruby, etc.) works better for most
tasks.

------
htor
yes, c++ is the cruelest, most insane joke, on the same level as embedding
malware inside a compiler and distributing it to developers. no really, it was
a nice try making an object-oriented language on top of c, but then why not
build on top of the superior oo features of smalltalk or lisp?

------
ythn
Modern C++ is WAY better than C, IMO. I can never go back to C, having tasted
C++14, and I work on embedded systems.

~~~
shrimp_emoji
Seconded. I'd even go so far as to say that C++ is my favorite lang; it's C
with the lavish furnishings of Java.

C is a spaceship, Java is a plane, and C++ is an amphibious SSTO. It's
messier in design and trickier to pilot, but you can do more with it!

------
ebbv
Why do we as developers feel a need to constantly trash the things we don't
like? If you don't like something, don't use it. Talk about the things you do
like instead.

~~~
nocman
While I agree that it is good to not always dwell on the negative, there can
be a lot of value in pointing out the deficiencies of programming languages.

Some folks will just use a language without thinking about its downsides. If
you use language X for a long period of time, you can become blind to areas
where it is wasting your time.

Again, I do agree that the tone could often be much more civil, but I would
hate for experienced programmers to stop pointing out things that bother them
about their programming languages. Much of the time it is _not_ just about
personal taste -- it is about real problems with the language that have a real
cost.

~~~
ebbv
I wasn't saying there's no place for constructive criticism or critiques.
Those are definitely useful and important.

But that's not what this article is, and I think we all know the tendency I'm
talking about for people to spend more energy and feel more comfortable
trashing things than talking about what they like.

------
jasonmaydie
The only joke here is that this author somehow thinks the success of your
project depends on the language you pick.

~~~
macintux
Funny how many people believe that language choice influences success. I guess
they're all wrong?

Sure, there are an uncountable number of other factors, but picking the wrong
platform/language can be fatal.

~~~
ythn
Anytime I see someone bashing Java/PHP/JavaScript on /r/programmerhumor
(reddit) I challenge them to a coding contest. I use the language they just
bashed, they use their ideal language. No takers yet.

I think, more often than not, language choice is like golf club choice.
Different pros have their preferences, but a pro can play a good game with any
set of clubs. A novice will blame the clubs for a poor game.

~~~
swebs
And you can perform carpentry using a big rock instead of a hammer. That
doesn't negate any downsides of choosing the rock.

~~~
ythn
I would say it's more like people are pooh-poohing certain _brands_ of hammers
as being inferior to TrendTech Hammers (R) which are "obviously" superior.

