
“C++ is a horrible language” – Linus Torvalds - setra
http://article.gmane.org/gmane.comp.version-control.git/57918/
======
overgard
It's worth noting that C++ in 2007 was a lot different than C++ in 2016. There
were a lot of valid complaints back then which have since been addressed. (I
know personally in 2007 I _hated_ C++, but I find it pleasant to use now.) The
compilers are massively improved, the STL is a lot more stable, tools and
IDE's are vastly better, some of the new language features are a huge help
(lambdas, simple type inference, range based loops, override keyword, etc.)
I'd still use C for an operating system, but things have definitely improved
massively.

~~~
copperx
If I understand correctly, Linus was not only dismissing C++, but also the
object-oriented programming paradigm. His main concern was abstractions that
result in inefficiencies, which most of us would consider acceptable when
writing userland software.

What's interesting is Stroustrup proudly states that C++ provides "cost-free
abstractions". I'm not too familiar with C++ implementations; is it fair to
say that?

~~~
ktRolster
_What 's interesting is Stroustrup proudly states that C++ provides "cost-free
abstractions". I'm not too familiar with C++ implementations; is it fair to
say that?_

C++ was invented when efficiency was measured in instruction counts and
processor cycles, so he optimized his language in those terms.

Processors are much more complicated now, and cache alignment and branch
prediction and pipelining are much more important for efficiency than counting
the number of instructions something takes.

~~~
Scea91
But even then, C++ is basically cost-free if used correctly. Of course I am
not comparing it with some hypothetical hand-optimized assembly, but
specifically on cache alignment C++ has a massive advantage compared to
languages like Java. And compared to C, there is nothing in C++ that would
make cache alignment more difficult by design.

------
Ace17
This seems like a dangerous appeal to authority to me. Just because Linus
said so doesn't make it true.

Especially when the context of this quote is almost 10 years old, mostly
irrelevant to the language itself (crappy C++ programmers and rigid
architectures) or obsolete (bugs in the STL and boost).

~~~
dstein64
> "This seems like an dangerous appeal to authority to me. Just because Linus
> said so doesn't make it true."

What is "this" that you're referring to? (e.g., posting the link to Hacker
News, the post making the HN frontpage, a specific comment on this discussion,
something in reply to Linus, etc.)

------
jpt4
To expand: "Why wasn't the Linux kernel written in C++?"

[https://news.ycombinator.com/item?id=2405387](https://news.ycombinator.com/item?id=2405387)

------
kbart
Both of the key arguments Linus provided have pretty much been solved. No
wonder, as this article is nearly 10 years old. Also, Linus is a low-level
programmer, a domain where there are still not many choices besides C and
where fewer layers of abstraction are usually better.

------
aksx
Looks like it's that time of the year again.

Yes, Linus said that, and he was probably right [in 2007]. C++11 and C++14
have come a long way.

Also, since no one has posted this till now: [2014] "Gtk to Qt - a strange
journey":
[https://www.youtube.com/watch?v=ON0A1dsQOV0](https://www.youtube.com/watch?v=ON0A1dsQOV0)

------
piyush_soni
_" Quite frankly, even if the choice of C were to do _nothing* but keep the
C++ programmers out, that in itself would be a huge reason to use C"*

His problem is assuming that there are no stupid programmers in C and that it
can't be abused. I haven't had a career as long as Linus's, but I've seen
people abuse C more than C++, writing their own inefficient implementations
of extremely basic things.

------
weq
This reminds me of my first mentor, thinking he is the target of the system,
not the coders who need to keep their source in order. Fast forward a few
years: the low-level unmaintainable code is no longer fast, and you still have
to refactor everything even though you avoided the object model.

yeah... sure... kernels don't talk back...

------
Kinnard
Seems like Linus Torvalds could use some training in nonviolent communication:
[http://firstround.com/review/power-up-your-team-with-
nonviol...](http://firstround.com/review/power-up-your-team-with-nonviolent-
communication-principles/)

~~~
oldmanjay
Hey thanks I was looking for a way to power up my rhetoric with ridiculous
redefinitions

~~~
Kinnard
While I could see this misapplied as a technique in that way, I don't think
that's the spirit of non-violent communication.

------
zxv
Anybody recall the InterViews[0] native C++ toolkit for X Windows, circa 1993?

At the time, it felt like a huge leap forward in ease of writing native GUI
apps for X. The point being, C++ was a medium for lots of great ideas,
including one of the early Design Patterns books [1].

[0]
[https://en.wikipedia.org/wiki/InterViews](https://en.wikipedia.org/wiki/InterViews)
[1]
[https://en.wikipedia.org/wiki/Design_Patterns](https://en.wikipedia.org/wiki/Design_Patterns)

------
zeckalpha
(2007)

------
pjmlp
Yet,
[https://github.com/torvalds/subsurface](https://github.com/torvalds/subsurface)

~~~
ryanobjc
Well, specifically he was trashing C++ for performance-sensitive systems
applications. Fixing the speed of DVCS was a major goal of git. Plus, you
know, kernels.

His better argument for C is that it's easier to look at a diff and know
what's going on. This is because C has minimal non-local effects, whereas with
C++ you have the ultimate in spaghetti programming: inheritance and overrides.

~~~
pjmlp
If he were consistent, given his rant, he would never touch C++ and keep
using C instead.

> His better argument about C being good is that it's easier to look at a diff
> and know what's going on.

Yep, [https://www.cve.mitre.org/cgi-
bin/cvekey.cgi?keyword=memory+...](https://www.cve.mitre.org/cgi-
bin/cvekey.cgi?keyword=memory+corruption)

~~~
taneq
EasiER, not always easy. With C++ you have to know about all the overloaded
operators in scope in order to correctly read code, for starters.

~~~
Scea91
Because every class has overloaded operators, right? You are basically
comparing bad C++ code with great C code. Of course the C wins then.

~~~
taneq
No, of course not. But if any overloaded operators do exist in scope then you
need to know about them in order to correctly interpret an expression. Or are
you saying that overloading operators is bad practice in general and should
never be used?

C has fewer "magic" features than C++, and so it's more often clear just by
looking at some code what it does. I like C++, but only because I'm happy to
only use the subset of the language which suits me, and I tend to avoid
features which increase the cognitive load of reading my code.

~~~
pjmlp
Besides being unsafe, C is one of the few mainstream languages that doesn't
have any form of overloading or non-alphanumeric identifiers.

I never understood why it is so hard to look at a + b and see it as a.+(b).

I also need to look at the documentation or source code to see if sum(a,b)
does what I think it does.

Python, Ruby, Smalltalk, Ada, Lisp, Scheme, C#, C++, Swift, Rust, D, Scala,
Clojure, Kotlin, JavaScript (ES7), ML... developers seem to be able to cope
with it; it's just C, Go, and Java devs who can't.

------
SakiWatanabe
Without the STL, how do you do basic things like vector/map/set? Roll your own
containers and implement red-black trees, etc.?

------
digler999
> In other words, the only way to do good, efficient, and _system-level_ ...

He didn't say it was horrible for everything; he's talking about system-level
programming. I have every confidence in the STL that _std::sort_ will always
work, but when you're writing code at the bit and byte level it's no surprise
that it's not the correct tool for the job.

~~~
copperx
Git falls into application, not system, software. Yes, it can be a crucial
part of one's workflow, so we want it to be as fast as possible, but how much
less efficient would it have been had it used the object-oriented paradigm in
C++?

------
iofj
Maybe we should just put the conclusion of these sorts of discussions
somewhere in here as well:

This criticism applies essentially to any higher abstraction, not just to C++.
Higher abstractions make it easier to do complex things, and thus they scale
to bigger total software systems. The maximum complexity you can control using
C++ is higher than the maximum complexity you can build into a C program
before things spiral out of control.

However, higher abstractions work by figuring things out for you. Code in the
libraries or compiler will decide when things happen, in what order, and how
they relate to one another.

This has two consequences of note. First, they make it very hard to predict
when things will happen. This is very good: you don't have to know or care in
the "vast majority of cases". But yes, God help you when you do need to know.
I find numpy code is similar: you almost never have to care about the
internals, but when you do, the rabbit hole is miles deep. In a related
effect, abstractions make it hard to predict the exact sequence in which
things will happen, and mostly this does not matter. When it does matter, it
is really, really hard to find.

Second, those abstractions mean that tiny changes in the source code
frequently result in large changes in the sequence in which things happen. So
tiny, seemingly unrelated changes in the source code can result in very
different runtime behavior.

These things are very good, and very bad, depending on your viewpoint. You
want to get complex new things working? Higher abstractions are your friend.
You want to get complex new things working fast? C++ is your friend.

If, on the other hand, you want a stable, productionized, bug-free
implementation of simple processes that you have to maintain and keep running,
stable, and predictable for long periods of time? Abstraction is the enemy. It
will bite you in the ass time and time again. Don't use anything high-level.

There's a related problem. You need to know higher abstractions. When working
with C++ math libraries or numpy you quickly find this out. If you don't have
a very good math education, those abstractions will make very little sense
indeed. This means that a large cohort of programmers without the old-style
"math first, programming second" education quickly run into insurmountable
issues. Not because the issues are necessarily hard for a math PhD, but
because they don't have a math background. Hating abstractions is far easier
than fixing your math understanding.

The right tool for the job.

As for the programmers, there are many incredibly good C++ programmers. As a
rule of thumb, any successful long-lived language will have tons and tons of
bad programmers using it. This has to do with managers trying to cut costs and
the resulting effects on the marketplace for developers. It's (sadly)
beginning to be true for python these days. Doesn't have anything to do with
the language and it won't affect you if you don't screw up your hiring.

~~~
benjcooley
Don't use abstractions... unless they're high enough that the compiler or
runtime can optimize their implementation better and faster than you could,
and the compiler/runtime environment actually does this (which C++, naturally,
doesn't and can't).

Higher level abstractions aren't _necessarily_ the enemy of optimal code.
Modern tracing jitted JS runtimes are great examples of how higher level code
can be transformed under the hood by an intelligent compiler into more
efficient concrete code at compile/run time. The problem with C++ isn't that
it exposes abstractions; it's that there is no way to make the abstractions
responsive to where/how they are used, no simple automated way to optimize by
use case, and no reliable tooling that exposes the cost of abstractions during
development, which conceals their potential costs.

That being said... abstractions are probably the greatest single untapped
source of massive performance gains, as they allow performance optimizations
to be automated by machine learning and runtime performance analysis.

