
Bjarne Stroustrup on Dennis Ritchie: They said it couldn't be done, & he did it - ColinWright
http://herbsutter.com/2011/10/12/dennis-ritchie/
======
toyg
"C is a poster child for why it’s essential to keep those people who know a
thing can’t be done from bothering the people who are doing it."

Being a "can't be done" person is easy, being a "I'll do it" person is hard...
but it's so much more fun and liberating. True story.

~~~
steve8918
How many times throughout history have we heard this? I think that this is
probably the most important part of any huge discovery throughout history:
people refusing to believe, or simply not knowing, that it "can't be done".

(Of course the trick is knowing when something really can't be done, ex.
alchemy)

~~~
T-hawk
Alchemy can be done. Gold has been synthesized on the nuclear level by way of
neutron bombardment of other elements. It's just not economically practical
since the process costs more in equipment and energy than the value of the
gold.

<http://en.wikipedia.org/wiki/Synthesis_of_noble_metals#Gold>

~~~
derleth
> Alchemy can be done.

Only by dishonestly redefining your terms.

Alchemy was always focused on _chemical_ methods. Converting lead to gold
_that way_ is impossible.

~~~
T-hawk
Right, and finding another way is how the "can do" person wins over the
"can't" naysayers. It's like saying Jules Verne's cannon to the moon is
impossible... which is true, but not really the point, since rocket travel to
the moon has been proven.

~~~
derleth
My point is that you can make _any_ prediction 'come true' if you redefine
enough words in it. This is a dumb parlor trick and tells us nothing about how
people of the past actually thought.

------
acqq
ALGOL had almost everything that C had including portable types at least 10
years earlier:

<http://en.wikipedia.org/wiki/ALGOL>

and really innovative guys, Burroughs computers had their operating system
written in an ALGOL dialect:

<http://en.wikipedia.org/wiki/Burroughs_Corporation>

all before C. Personally I appreciate the terseness of C and its closeness to
assembly a lot, and I believe it all reflects the good taste of Ritchie, but
still he didn't do anything "impossible" from my perspective.

To compare, you can read again:

"The Summer Of 1960 (Time Spent with don knuth)"

<http://news.ycombinator.com/item?id=2856567>

where Knuth writes an ALGOL compiler for Burroughs in 1960 working 40 hours a
week in violation of Cal Tech's policy that limits the number of hours that a
Ph.D. candidate can work.

~~~
kragen
You are mistaken.

From the article:

> There was no such thing as a general-purpose program that was both
> portable across a variety of hardware and also efficient enough to compete
> with custom code written for just that hardware.

And that is true. You couldn't run the Burroughs 5000 Master Control Program
on a PDP-11, as far as I know, _at all_, and certainly not efficiently.

Ten years before the birth of C was about 1963. ALGOL-60 lacked a lot of
things to compete with C, which is why people in the 1970s switched to C.
ALGOL-68 required garbage collection, which wouldn't be reasonably efficient
until the invention of generational GC in 1983, and still poses problems for
real-time software like device drivers, which is why people usually use C or
C++ instead of ALGOL-68 or Java for them.

For some notes on what ALGOL-60 lacked in comparison with C, I recommend
reading "Why Pascal is Not My Favorite Programming Language", from 1981:
<http://www.lysator.liu.se/c/bwk-on-pascal.html>. Most of the objections to
Pascal also apply to its ancestor ALGOL-60, although ALGOL was intended for
real work, not just teaching. ALGOL-60 did have a sort of equivalent of
pointer parameters, namely Jensen's Device, but it was dramatically less
efficient than pointers. I _think_ (I wasn't around in the 1960s) that ALGOL
implementations from different vendors had different nonportable extensions to
work around the lack of pointers, and of course the same thing happened with
Pascal later.

Burroughs was an immensely innovative company, and all of today's popular
programming languages except C, C++, and PHP (JS, Java, C#, Python, Perl,
Ruby, Objective-C) owe an enormous debt to Smalltalk, which draws much of its
inspiration from the B5000. I do not deprecate the importance of Burroughs!
They did indeed write the first OS in a high-level language. But C achieved
what they could not.

The measure of C's achievement is that in 2011, on a multiprocessor 32-bit
machine with the x86 instruction set and a gigabyte of RAM, I still
occasionally run programs originally written in 1981 for a single-processor
16-bit PDP-11 with 64k of RAM per program, and they're still efficient; and I
still constantly run software written in the late 1980s, such as parts of the
X server and much of GNU coreutils, on single-processor 32-bit 68000s, and
much of the optimization done then is still valid. (Though not all of it!)

It was C that first made it practical for people on different architectures to
share code on a large scale, for code to outlive the architectures it was
written for without suffering a dramatic slowdown, and for people to switch
architectures, as Sun did from the 68000-family Sun3 to the SPARC, as everyone
eventually did to the i386 we use today, and as we now are to AMD64.

The SNOBOL Implementation Language, SIL, achieved the same thing in 1966 — but
only for one program, the SNOBOL interpreter! TeX was another early endeavor
in this direction; concurrent with the early evolution of C, Knuth wrote WEB,
a literate programming language which compiled into Pascal, in order to get
Pascal's portability without suffering from its drawbacks. Among other things,
WEB used a single humongous Pascal packed array of char for all of its
strings; and that was what Knuth wrote TeX and Metafont in. C's twin sibling
Ratfor was a third similar approach, compiling C-like constructs into Fortran
rather than assembly. I don't know of anything else that predated the
popularity of C.

In effect, C enabled both the birth of the SPARC and its death.

Today we stand at or near the end of the C era, for three reasons.

First, a vast amount of software is being written for which efficiency is of
minimal concern. So C's drawbacks — its proneness to subtle bugs, its
difficulty of debugging, its limited facilities for abstraction — drive people
to more modern languages.

Second, dynamic recompilation has reached a level of performance where it's
feasible to use it to emulate another processor architecture at acceptable
speeds. Transmeta was one of the most interesting explorations of this
concept, but Apple's transition strategy from the PowerPC to the i386 probably
had more field-deployed units, and I think some of the currently popular OS
virtualization approaches work this way as well. (And of course there's
Valgrind, although calling its speed "acceptable" is a bit of a stretch.) So
now there are viable alternatives to the recompilation approach.

Third, both computer architectures and compiler optimizations have changed so
much in the 40 or so years since C was invented that C is stretched rather
thin. Compilers exploiting undefined behavior make it increasingly difficult
to write working code in C, or to recompile old C programs. And, although
people largely program GPUs in C, you cannot simply recompile Emacs for your
Fermi or your Spartan-II to get it to run faster or use less energy. The C
abstract model of computation is an increasingly poor fit to modern hardware,
despite the pressure it has exerted on hardware designs for the last 25 years.

๛

~~~
acqq
I'm not mistaken; it's just that you appear not to have experience with
FORTRAN and Pascal.

> There was no such thing as a general-purpose program that was both portable
> across a variety of hardware and also efficient enough to compete with
> custom code written for just that hardware.

Of course ALGOL easily matches this claim. Proof: Knuth's TeX was written in
Pascal.

Not to mention everything that was written in FORTRAN.

> I still occasionally run programs originally written in 1981 for a single-
> processor 16-bit PDP-11 with 64k of RAM per program, and they're still
> efficient

Try to check when most of the FORTRAN libraries still used today were written.
Some -- decades before 1980.

See also how long FORTRAN compilers generated faster scientific code than C --
for decades after C appeared. You can also find why:

<http://en.wikipedia.org/wiki/Pointer_aliasing>

> "Why Pascal is Not My Favorite Programming Language"

The paper, as far as I remember, doesn't claim it's impossible to write
useful software; it mainly laments the lack of return, break, and continue
constructs, which are convenient to have but not real show-stoppers. Note
also that standard Pascal was only meant to be a "learner's language," not a
"system language" -- and for the latter there were working compilers and uses
before C.

> You couldn't run the Burroughs 5000 Master Control Program on a PDP-11, as
> far as I know, at all, and certainly not efficiently.

You'd also have to port Unix the same way you had to port MCP -- you can think
of MCP as Unix in which drivers are part of the kernel -- but "integer" in
ALGOL is certainly as portable as "int" in Unix -- they both fit the "hardware
word."

Nicer pointer arithmetic in C is a good point. Though note that Wirth also
had a "Pascal lite" with pointers, closer to assembly, for OS-level stuff,
just as Burroughs had an ALGOL lite for OS-level stuff. The concept of a
higher-level language that is still "closer to assembly" certainly existed
well before C. The idea of separating the kernel and the drivers is something
else, and I don't know who achieved that first, or when.

Finally, see Google's go -- it's more or less acceptance of Wirth's
directions, mixed with terser notation of C. (Now, finally, why are you still
on my lawn?)

> The C abstract model of computation is an increasingly poor fit to modern
> hardware

No, you can view C (as well as more or less all ALGOL descendants) as a
higher-level representation that is acceptably close to assembly. So as long
as there are CPUs that execute machine code and we need to care about the
details (see <http://news.ycombinator.com/item?id=3068513>), we'll need
something like that: an efficient but not too low-level representation.

~~~
kragen
(Note: I continued writing my original comment after you posted yours.)

I don't have _much_ experience with Fortran and Pascal. But the original
article also talks about how C was different from Fortran: "Fortran did okay
for array-oriented number-crunching code, but nobody could do it for general-
purpose code such as what you’d use to build just about anything down to, oh,
say, an operating system."

And that covers most of the FORTRAN libraries still used today.

Both Fortran and Pascal came close — ADVENTURE may be one of Fortran's
greatest triumphs — but they were not really usable by themselves. _Software
Tools_ was written in Ratfor (as I said, C's sibling) and TeX was written in
WEB, rather than Pascal. (Writing a text-processing program in a strongly-
typed language without a string type is an exercise in frustration.)

> you can think of MCP as Unix in which drivers are part of the kernel

I'm not sure what you're trying to say here. Drivers are part of the kernel in
Unix too.

> but "integer" in ALGOL is certainly as portable as "int" in Unix -- they
> both fit the "hardware word."

But ALGOL-60 doesn't have bitwise operations, bit shifts, unsigned arithmetic,
char, or more than one precision of floating-point. Instead you get ↑,
exponentiation. So many algorithms that can be expressed efficiently in C
cannot be expressed efficiently in ALGOL-60.

> Wirth also had "Pascal lite" with pointers

Where can I learn more? Googling [Pascal lite] is not helpful.

> The concepts of "closer to assembly" but higher-level languages existed
> certainly well before C.

Certainly true — BCPL could be described as one of them. But BCPL didn't quite
get it right.

> Finally, see Google's go -- it's more or less acceptance of Wirth's
> directions

What does Golang have in common with Oberon (?) that it doesn't have with C?

๛

~~~
acqq
> What does Golang have in common with Oberon (?) that it doesn't have with C?

<http://en.wikipedia.org/wiki/Oberon_(programming_language)>

A lot! Garbage collection, no unsafe pointer arithmetic, type specification
different from variable use (varname: type)

> ALGOL-60 doesn't have bitwise operations

Ah... That's like saying that Unix IV didn't have a driver for your network
card. It wasn't standardized only because CPU instruction sets were not
standard enough. Real-life compilers had it:

<http://rosettacode.org/wiki/Bitwise_operations#ALGOL_68>

> more than one precision of floating-point

FORTRAN had it, C didn't care initially. And FORTRAN remained faster for long.

> Where can I learn more?

Actually, my bad, sorry: the "lite," "closer to the system" language came
even before Pascal and obviously before C. It was a bootstrap language for
ALGOL W in the sixties; see the sources:

[http://bitsavers.org/pdf/stanford/listing/Algol_W_Listing_No...](http://bitsavers.org/pdf/stanford/listing/Algol_W_Listing_Nov69.pdf)

And then Pascal was also written in Pascal after it was bootstrapped once.

~~~
kragen
> A lot! Garbage collection, no unsafe pointer arithmetic, type specification
> different from variable use (varname: type)

Hmm, I'll give you the last one, although they left out the colon. The others
are common to basically all high-level languages, so I don't really think of
them as due to Wirth's influence. To my eye, the interesting aspects of Golang
are interfaces, slices, and goroutines, none of which are present or even
hinted at in Oberon. Interfaces were kind of anticipated in OCaml, slices in
D, and goroutines in a family of CSP-derived languages going back to 1980.

> > ALGOL-60 doesn't have bitwise operations

> It was just not standard only because CPU instruction sets were not standard
> enough. [ALGOL-68]

Well, on one hand, it wouldn't be very useful to try to do bitwise AND on a
decimal machine. But the original claim is that, prior to C, general-purpose
(i.e. not purely numerical!) programs gained so much speed by being written
nonportably that portable versions could not compete, and C enabled high-
performance programs to be written portably. Your original rebuttal, as I read
it, was that 10 years prior to C (i.e. in 1963) ALGOL had already achieved
this.

We can stipulate, I hope, that bitwise operations are crucial for the inner
loops of a lot of important algorithms.

Now, it appears that you're saying that not only had ALGOL not achieved this
in 1963, but that it was impossible for any language to achieve it in 1963
because CPUs were too disparate, but that ALGOL-68, whose first usable
implementations were concurrent with the first usable implementations of C,
_still_ didn't standardize those operations, so you _still_ couldn't write
portable programs that used them! (Although you could write programs for one
or another ALGOL compiler that used them.)

I think you have proved the point of the original article rather than
rebutting it.

> FORTRAN had [more than one precision of floating-point], C didn't care
> initially. And FORTRAN remained faster for long.

For numerical code, yes. But I was talking about the inadequacies of ALGOL-60,
not Fortran (which is _still_ faster, as you alluded to earlier). C's limited
support for single-precision floating point was a sore point for decades, but
not supporting it _at all_, as standard ALGOL-60 didn't, is much worse. It
doubles the size of all your arrays! That's much worse than simply doubling or
quadrupling your run-time, as C could; you can almost always run the program
for twice as long, but you can only rarely double the core at its disposal.

> sorry, "lite" "closer to the system" language was even before Pascal and
> obviously before C, it was a bootstrap language for ALGOL W in sixties, see
> sources:

>
> [http://bitsavers.org/pdf/stanford/listing/Algol_W_Listing_No...](http://bitsavers.org/pdf/stanford/listing/Algol_W_Listing_Nov69.pdf)

That code is written in PL360, and although, yes, it has bitwise operations in
it, nearly every line of it contains assumptions that it's running on a 32-bit
computer (such as the 360) and about which CPU instructions set which CPU
flags, with gems like "R0 := FLAGS(I) AND #80000000; IF ¬= THEN ...". It's
pretty strong evidence that, in 1966, even Niklaus Wirth thought he had to
write nonportable code — essentially assembly language with ALGOL syntax — in
order to get acceptable performance.

He explicitly rejected FORTRAN, and he claims he didn't have an ALGOL-60
compiler available.

> And then Pascal was also written in Pascal after it was bootstrapped once.

I've certainly seen Pascal compilers written in Pascal, but the ones I've seen
were concurrent with the development of C or later. I don't suppose you have
one in mind?

๛

~~~
acqq
See also:

<http://news.ycombinator.com/item?id=1624481>

> It's pretty strong evidence that, in 1966, even Niklaus Wirth thought he had
> to write nonportable code — essentially assembly language with ALGOL syntax
> — in order to get acceptable performance.

I'd however say that that 1966 code is not at all so far away from C. Today I
can also use registers in my big C compiler; heck, I have to for really
serious optimization, just like then. C is also not automatically portable
unless active care is taken to test it on other platforms and rewrite parts
of it. If you claim the opposite, I have for you some 2 million lines of code
that I maintain, after at least 40 people worked on it -- it's not an
exception, more a typical example.

~~~
kragen
Yeah, you're right, PL360 _is_ pretty similar to C, but it failed to achieve
what C achieved: providing just enough of an abstraction from the concrete
machine to make efficient portable software possible, and in fact even
practical.

As far as register optimization in modern C, I think there's a world of
difference between saying "register long a, b, c, d, e;" and saying "WHILE W
¬= #1 DO BEGIN R1 := MEM(W); R0 := R1 SHRL 24 SHLL 12 + T; MEM(W) := R0; W :=
R1 and #FFFFFF; END". But maybe you were talking about inline assembly.

My experience with C portability agrees with yours, although it sounds like
yours is deeper.

------
beza1e1

      we now have the new ISO C11 standard. C11 includes a number of new features that parallel those in C++11
    

Did I miss something?

~~~
pavlov
ISO C11 was known as "C1x" until its standardization:

<http://en.wikipedia.org/wiki/C1X>

Nothing earth-shaking. The _Generic keyword is probably the most exotic
addition (at least among those listed on the Wikipedia page) -- it's
basically _switch(typeof(X))_ for macros.

~~~
derleth
Is there a reason C11 doesn't include typeof? It's been a gcc extension for
decades now, and I see no reason it can't be part of the official standard.

~~~
nitrogen
My guess as someone who is not familiar with compiler internals is that the
design of many compilers prevents them from implementing a typeof() operator
without major changes to the way they analyze and store the parsed source
code, so the standard writers decided not to include a feature that would not
be widely implemented.

But, using _Generic(), one can do many of the things one would do with
typeof(). For example, the max() example in [0] could probably be implemented
using at most _n_^2 separate macros plus a _Generic() macro for each type,
where _n_ is the number of arithmetic types (though I think you'd still need
GCC's statements within expressions in order to evaluate A and B only once).

I'm thinking something like this that combines the cbrt(X) example from the
C1x spec [1] (§6.5.1.1 paragraph 5) with the gcc max() example, though I
haven't tested it, and it could almost certainly be simplified by exploiting
type promotion rules:

    
    
      #define max_ld_ld(A, B) ({long double _a = (A); \
          long double _b = (B); _a > _b ? _a : _b; })
      
      #define maxld(A, B) _Generic((B), \
          long double: max_ld_ld, \
          int: max_ld_i, \
          /* etc. */ \
          )((A), (B))
      
      #define maxi(A, B) _Generic((B), \
          long double: max_i_ld, \
          int: max_i_i, \
          /* etc. */ \
          )((A), (B))
      
      #define max(A, B) _Generic((A), \
          long double: maxld, \
          int: maxi, \
          /* etc. */ \
          )((A), (B))
    

[0] <http://gcc.gnu.org/onlinedocs/gcc/Typeof.html>

[1] <http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf>

OT P.S. It is very annoying that Google has broken right-click+copy URL.

------
mhartl
The title is a bit misleading: there's a quote from Stroustrup in the post,
but the author of the article is Herb Sutter.

------
groby_b
"C is a poster child for why it’s essential to keep those people who know a
thing can’t be done from bothering the people who are doing it."

C++, on the other hand, is a poster child that just because it can be done,
you shouldn't necessarily do it.

------
evincarofautumn
The opening sentence bothers me. “Rob Pike reports that Dennis Ritchie _also_
has passed away.” (Emphasis mine.) As though he’s just some kind of footnote
in light of the death of Steve Jobs! Both Jobs and Ritchie were “I don’t care
if it’s impossible, I’m doing it” types, but I feel that Ritchie contributed
more to computing as a whole, while Jobs’s innovations were mainly in user
experience.

~~~
eropple
You're being overly sensitive. Jobs passed away first, and was remarked on
first.

~~~
evincarofautumn
I just think individuals should be treated individually.

------
andrewflnr
As a young whipper-snapper upstart with some big ideas, this is inspiring.
Maybe I'm not quite so crazy after all to think I can pull them off.

------
pnathan
What a wonderful tribute to a pioneer in our field.

------
dextorious
And one by me for Bjarne Stroustrup:

"They said it shouldn't be done, & he did it".

(yeah, a joke. Humor's not only for Reddit).

~~~
to3m
Sadly, Bjarne Stroustrup was the only person not to listen to "them" (whoever
they are), and so C++ is all we have.

Which sort of justifies his point, really...

~~~
dextorious
Well, I'll take Objective-C any day.

And if OO is your thing, well, we have tons of other options.

~~~
vorg
Learn Objective-C++ and you can command a salary premium from knowing twice
the complexity!

