
Most(ly dead) Influential Programming Languages - luu
https://hillelwayne.com/post/influential-dead-languages/
======
linguae
Another highly influential programming language not listed in the article is
Self ([http://www.selflanguage.org](http://www.selflanguage.org)). Although
I've taken two graduate-level courses in programming language theory, I didn't
learn about Self until just a few years ago when I was bitten by the Smalltalk
and Lisp bugs and started reading about the history of these languages and
their environments. One of Self's most significant contributions is the idea
of prototype-based programming, which is the influence behind JavaScript's
traditional lack of classes (though newer versions of JavaScript have support
for classes).
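
The prototype idea survives directly in JavaScript: objects delegate to other objects, no classes required. A minimal sketch (the names here are made up for illustration, and this is JavaScript, not Self syntax):

```javascript
// A minimal sketch of Self-style prototype delegation in JavaScript.
// There are no classes: new objects are made by cloning an existing
// object and overriding slots.
const point = {
  x: 0,
  y: 0,
  describe() { return `(${this.x}, ${this.y})`; },
};

// "Clone" the prototype; slots not set here delegate back to `point`.
const p = Object.create(point);
p.x = 3;
p.y = 4;

console.log(p.describe());                       // "(3, 4)", found via delegation
console.log(Object.getPrototypeOf(p) === point); // true: a live parent link, not a class
```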

I don't know if Self was ever commercialized; I do know that once Java was
released, Sun focused much of its attention on promoting Java to the exclusion
of other object-oriented environments it had invested in (such as the
OpenStep Objective-C API, which was actually jointly developed by NeXT and
Sun). But Self is probably one of the most influential programming languages;
it's just a shame that this language was never brought up in any of the
computer science courses I've taken.

~~~
Scarbutt
I'm surprised Clojure isn't in the list either.

Edit: To clarify, Clojure is a mostly dead language that didn't have any
innovations of its own, but it did influence many programmers (the creator is
good at marketing). It helped push the FP mindset onto users of other
mainstream languages (JS, Python, Java).

~~~
pansa2
Is Clojure really “mostly dead”?

~~~
Scarbutt
It has less market share than Perl or Delphi (make your own judgement). It was
really hyped (rightfully so; a practical Lisp for production use? Fun! Sign me
up), then declined fast.

Like most(ly dead) languages, it still has its followers; in the case of
Clojure, mostly a cultish group (my impression from r/clojure and other
forums).

~~~
michaelmrose
Can you explain and support the idea that the number of users using the
language productively has declined? The State of Clojure survey seems to have
held steady at around 2,500 respondents from 2015 to 2020, with 60% saying
they used it for work in 2015 vs 69% saying they used it for work in 2020.

Going back to 2010 we see less than 500 respondents and only 27% using it for
work. A charitable assumption is that it grew substantially between 2010 and
2015 and held steady between 2015 and 2020.

[https://clojure.org/news/2020/02/20/state-of-clojure-2020](https://clojure.org/news/2020/02/20/state-of-clojure-2020)

------
amyjess
I'm surprised by how hard this article is on Algol 68. It was pretty
influential:

\- Influenced C's type system. I'm just going to quote Dennis Ritchie on this:
"The scheme of type composition adopted by C owes considerable debt to Algol
68, although it did not, perhaps, emerge in a form that Algol's adherents
would approve of. The central notion I captured from Algol was a type
structure based on atomic types (including structures), composed into arrays,
pointers (references), and functions (procedures). Algol 68's concept of
unions and casts also had an influence that appeared later."

\- Influenced bash's syntax (fi, esac)

\- I'm not sure if this counts as an influence or not, but objections to Algol
68's design led Niklaus Wirth to revive his earlier proposal for a new
version of Algol, called Algol W, and ultimately evolve it into Pascal.

~~~
kick
That ALGOL-68 was hated by almost everyone who liked previous ALGOL versions is
not to be discounted.

ALGOL-68 was loved by people who would have never used ALGOL to begin with
(and they didn't start doing so with ALGOL-68).

It was difficult to write compilers for, slow, and far too complex.

~~~
ThomasBHickey
Looking back at ALGOL-68, it seems a comparatively small language next to many
of our current ones, e.g. Java, C++ and Python. I loved the definition of it,
but never got to use it.

~~~
kick
That modern languages are even worse does not make ALGOL-68 good, in my
opinion. It's understandable why people would like it, though, compared to
modern languages.

------
tyingq
Expecting to see Perl in an article like this in 5 to 10 years. Wasn't the
first language I learned, but there's a fond place in my heart for it. At the
time, it was the (only) less painful way to get at sockets, libc calls like
getpwnam(), etc. I know the TIOBE index is flawed, but...ouch:
[https://www.tiobe.com/tiobe-index/perl/](https://www.tiobe.com/tiobe-index/perl/)

~~~
goto11
But how influential was it? It seems most of the unique ideas in Perl have
not been adopted by other languages.

But if Perl can be credited with kick-starting the dynamic language boom
(Python, Ruby), then it has been massively influential.

~~~
tyingq
I'd say it was pretty influential for Ruby, CGI-BIN, pcre, and other things
that will live on in more refined forms. PHP has some obvious Perl influence
as well.

Edit: Maybe also Perl's Configure (crazy wide cross-platform portability) and
CPAN. They were pretty ahead of their time.

~~~
LeonidasXIV
While I agree that Perl influenced Ruby which I guess in turn influenced
CoffeeScript and Elixir to some degree, the lineage of PHP seems to be dead. I
don't really know any languages that are inspired by PHP itself, other than
maybe Hack which one could argue is very closely related to PHP.

~~~
chipotle_coyote
I still write and actually still (sorta-kinda) like PHP, but there's no major
concept/feature I can think of that's intrinsic to PHP rather than inherited
from other languages. The Perl (and general "C-like language") influence is
notable, and as PHP has matured it's started to feel ever more like Java. ("To
write a simple PHP server app, first just initialize your PSR-11 compatible
dependency injection container and add your PSR-7 compatible HTTP
request/response handlers to it...")

~~~
tyingq
The mixed code/template capability seems to have led to JSX.

------
btilly
I am sad that Forth did not make the list.

It showed how to make a minimal programming language that runs in a very small
amount of space with just a stack. And did a lot to popularize RPN notation.

Even if you do not write in Forth, you can still benefit from knowing the
ideas. For example I could not have written my answer at
[https://stackoverflow.com/a/60817908/585411](https://stackoverflow.com/a/60817908/585411)
if I did not know the ideas of Forth.
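For a feel of the idea, here is a toy RPN evaluator; a sketch in JavaScript rather than Forth, with the word set reduced to arithmetic:

```javascript
// A toy sketch of Forth's core idea: a program is a sequence of words
// that manipulate one shared data stack, so "3 4 + 2 *" means
// push 3, push 4, add, push 2, multiply.
function rpn(source) {
  const stack = [];
  for (const word of source.trim().split(/\s+/)) {
    if (/^-?\d+(\.\d+)?$/.test(word)) {
      stack.push(Number(word));           // literals are pushed
      continue;
    }
    const b = stack.pop();                // operators pop their operands...
    const a = stack.pop();
    switch (word) {
      case "+": stack.push(a + b); break; // ...and push the result back
      case "-": stack.push(a - b); break;
      case "*": stack.push(a * b); break;
      case "/": stack.push(a / b); break;
      default: throw new Error(`unknown word: ${word}`);
    }
  }
  return stack.pop();
}

console.log(rpn("3 4 + 2 *")); // 14
```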

~~~
vanderZwan
It was probably considered too "alive" in the embedded world to be on the
list.

------
gtk40
I'm a younger millennial (born in early 90s) who got his start with BASIC. The
first programming I did was on Q-BASIC on our aging desktop machine (386 with
Windows 3.1) that we didn't use for much, especially with very limited
Internet use.

When I took programming in high school, we started with TI-BASIC on TI-83
calculators for about a month, as my programming teacher felt like this best
replicated his experience learning programming on a TRS-80. I tend to agree,
and this is a great use of BASIC. It's the default programming interface on a
widely used computer to this day.

We then moved to VB6 for our "serious" programming, although we also did
JavaScript and Java.

My first programming professionally was done in an office setting at a temp
job using VBA to help with some Excel work. And then my first job as a
software engineer, even though I wasn't writing it, did have some Visual
Basic.NET floating around (most of my work was in C#).

~~~
ORioN63
I started to learn programming by writing programs in TI-BASIC on the TI-84
(compatible with the 83) calculator. I only got it because a graphing
calculator was required for math class.

It was just enough of a push. Small programs and games were fine and I
actually got really used to the keyboard (I can still type on it pretty
quickly these days). For longer programs I def. remember wanting to program
with more monitor real estate and not having to rely on GOTOs. That's how I
started to learn Python, which I still use daily.

~~~
nemo1618
Same. My TI-BASIC magnum opus was 527 lines of spaghetti code, implementing
dozens of nested menus for solving any trigonometry problem I was assigned. I
honestly think that this is one of the best ways to introduce programming to a
kid: "Hey, that math homework looks pretty tedious...wouldn't it be nice if
your calculator could do all the work for you?"

------
ldeangelis
Great article, I love learning more about language influence. I was wondering
something:

> _Smalltalk wasn’t the only casualty of the “Javapocalypse”: Java also
> marginalized Eiffel, Ada95, and pretty much everything else in the OOP
> world. The interesting question isn’t “Why did Smalltalk die”, it’s “Why did
> C++ survive”. I think it’s because C++ had better C interop so was easier to
> extend into legacy systems._

Maybe it's linked to performance? Even today some applications are rewritten
from Java to C++ (or clones are made) to gain performance (as with Cassandra
and Scylla).

~~~
linguae
I have another theory about why Java beat other object-oriented programming
languages except C++: cost. I can't speak for Eiffel and Ada since I'm
unfamiliar with those environments, but in the mid-1990s Java caused a
revolution of sorts by providing free-as-in-beer runtimes and development
tools that were available for download. I don't know how good GNU G++ was in
1995, but I know that Borland's Turbo C++ and Microsoft's Visual C++ were
affordably priced. By comparison, commercial Smalltalk implementations had
expensive licensing fees, and in 1995 Squeak and Pharo didn't exist. There was
also OpenStep and Objective-C, but that was also very expensive; this 1996
article from CNet ([https://www.cnet.com/news/next-gets-to-nt/](https://www.cnet.com/news/next-gets-to-nt/)) says that the Windows NT
version of OpenStep cost $5,000 per developer and $25,000 for a version that
allowed deployment.

With the high prices of Smalltalk and Objective-C environments, Java attracted
a lot of companies and developers who wanted an object-oriented programming
language that provided some of Smalltalk's benefits (e.g., garbage collection,
memory safety, a rich standard library) without having to shell out the cash
for a Smalltalk implementation.

~~~
jhbadger
But there was an affordable Smalltalk system in the early 1990s -- Digitalk's
Smalltalk/V. It cost $99 and came with a huge manual that had a great
tutorial. It introduced me (and lots of others) to the whole idea of object-
orientation.

~~~
bhaak
I've seen this several times. You need to get the stuff to the students who
usually have no money to spare.

Emphasis on "NO". Affordable doesn't cut it.

If you can't download it from somewhere for free, something else will be used
by students that will later determine what they'll use at their startups or
companies.

Even better if it's legal to download for free.

~~~
jhbadger
I'm not denying that the modern world of open source tooling and online
documentation is better, but in the 1980s-early-1990s that's just not how it
was. Things like Turbo Pascal and Smalltalk/V cost money, but not that much,
and were worth it because of the large printed manuals they included, which
were needed because you couldn't just Google things.

~~~
bhaak
I don't say that it wasn't worth it.

I say that it was an additional barrier to entry, which got more significant
the later we got in the '90s. That it was free was a significant boost for the
popularity of Java (the free JVM from Microsoft was probably also a
significant contributor).

In the early '80s you always got a programming language for free with your
computer, and often those manuals were not bad either, as they were seen as an
additional selling point for the hardware; that's why the hardware producers
included them.

------
7thaccount
The APL section seems a little off. I downloaded the latest Dyalog version and
the RIDE IDE and it was all very simple to use. Despite having zero APL
experience I wrote a pretty nifty program to simulate something in my domain
in like 3 lines of code. The keyboard thing is a non-issue as the IDE let's
you enter symbols and you quickly start to memorize that you can enter
command-key r to enter rho and so forth. I've used the ASCII J before and find
the APL glyphs help me learn what's going on better. It's right that it isn't
a popular language, but that is sad. Dyalog has a lot of tools too from web
server, database libraries, R & Python Bridges, GUI...and so forth.

~~~
mlochbaum
I think it's fair to say that font issues were a significant reason for its
decline in the late 80s and 90s (well before good unicode support). Other
major factors were spreadsheets, which did many of the things APL was best at
with an intuitive graphical interface, and OOP. APL didn't have OOP, so it was
for dinosaurs. Structured programming could have had the same
effect—mainstream APLs picked up ifs and while loops 10-20 years later than
the rest of the world—but I think the usability gap between APL and anything
else for arrays was just too large at the time for that to hurt APL too much.

~~~
7thaccount
Thanks Marshall. I'll yield to you in anything APL as I'm a total noob.

For those that don't recognize his username, I think he works for Dyalog in
APL implementation. Is that right?

~~~
mlochbaum
That's right.

------
yiyus
I'm sorry Forth is not on that list. It could be argued that it was not so
influential for mainstream languages, but there is a whole range of
concatenative languages that can be considered direct descendants.

~~~
carapace
I think Forth's influence is more subterranean. I.e. PostScript and coreboot,
etc. Forth is there, but rarely in your face.

As far as the underlying concepts go, speaking as someone who has experimented
recently with Joy (one of the concatenative languages you mention; it wasn't
directly influenced by Forth, I think, but rather a case of convergent
design), I think it's a shame they haven't been more influential.

The time may come: check out Conal Elliott's "Compiling to Categories".

~~~
yiyus
Joy is a great language! I also wish it had been more influential.

Maybe you already knew this, but there are lots of great articles about Joy on
nsl.com, and there is also Thun:
[http://joypy.osdn.io/](http://joypy.osdn.io/) (which someone suggested to me
around here).

~~~
carapace
:D yeah, dude, that was _me_ (Thun, not nsl.) Cheers!

------
jbotz
Both ML and Smalltalk are definitely not dead, not even mostly dead, they’ve
just evolved into language families whose members have different names. The ML
family has Standard ML, OCaml, and Haskell, and the Smalltalk family has
Pharo, GemStone, GNU Smalltalk, and others. These all may not have hugely wide
adoption, but they are actively used in both academia and industry, and
continue to grow and evolve, and their continued evolution is still
influencing other languages.

Ok, you can argue about the definition of "mostly dead", but whatever you
decide these two just aren't in the same category as the others on this list.

~~~
danharaj
Haskell isn't in the ML family. It is a descendant of Miranda and some other
lazy functional languages.

~~~
jbotz
Granted. 8-)

------
dfan
I took two software engineering courses at MIT around 1990 that used CLU as
the implementation language (one was a general medium-scale software
engineering course, one was on writing compilers). I found it to be a very
pleasant language, although the main thing I had to compare it to was C (C++
was just getting off the ground). It's always nice to see it mentioned in
lists like these.

------
dleslie
> The interesting question isn’t “Why did Smalltalk die”, it’s “Why did C++
> survive”. I think it’s because C++ had better C interop so was easier to
> extend into legacy systems.

Video games. From around 1999 until recently, C++ was the language of game
development. It has only recently come under serious threat, from C#.

~~~
pjmlp
Very bad example.

Video games community is very luddite in what concerns adoption of
technologies, they usually only move forward when the platform owners force
them to do so.

Many moons ago, C, Pascal, Modula-2 and Basic were seen the way Unity is seen
nowadays; naturally, real game devs had to use Assembly, especially to take
care of the special-purpose graphics and sprite engines.

The PlayStation 1 was probably the first console that forced them to start
using C instead of the Assembly of the 16-bit era.

So slowly everyone accepted that doing games in C, Pascal and what have you
wasn't that bad.

C++ only became kind of accepted years later, and even then it was more "C
compiled with a C++ compiler" than anything else.

The major boost for C++ came from the OS vendors for home and office
computers, Apple, IBM and Microsoft, alongside Zortech, Borland and Watcom,
with their full-stack C++ frameworks, something that ironically C++ has since
lost (OS C++ SDKs where it has the leading role).

~~~
boomlinde
_> Video games community is very luddite in what concerns adoption of
technologies, they usually only move forward when the platform owners force
them to do so._

I think this is an unfair assessment. "Real devs" had to use assembly language
because when targeting consoles and low-end home computers, this was the only
really performant option for a long time. For a lot of types of games this
doesn't matter so much, but there's a long ongoing trend in big titles really
trying to cram as much visual fidelity as possible into the target machine.

Same deal with C++ today. You can use Unity of course, no one is any less of a
"real game dev" because of it, but it's simply not an option if you want to
work on cutting edge tech for AAA titles. There are tons of other things you
may want to work on, other boundaries to push, which I think is why Unity is
such a popular option. But I think it's unfair to simply chalk it up to
luddism. Present the available alternatives that compare favorably in terms of
development speed and performance for non-critical software instead.

~~~
pjmlp
While it might feel unfair, I wasn't attacking anyone on purpose; it is based
on experience, from my demoscene days, to the friends I got to meet in the
industry, to my past as an IGDA member, and the ways I have kept in touch
with the industry, even though I decided that the boring corporate world, with
graphics programming as a hobby, was something I would rather spend my time on
than the typical studio life.

While the demands of the gaming industry have always driven hardware evolution
in mainstream computing, most studios only move to newer programming languages
when the platform owners force them to do so.

The gaming industry is not known for being an early adopter of new software
stacks, and many studios would to this day use pure C instead of C++ if the
console vendors gave them C-based SDKs.

~~~
dleslie
I know several studios local to Vancouver that are making heavy use of Go,
Swift or Rust. And of course, a few that are deep into HTML5. There are games
being shipped written in mruby.

I'm not in agreement with your assessment that the industry is conservative;
it is largely responsible for pushing graphics into programmable pipelines,
for instance, and the vendor-preferred language lock-in for consoles hasn't
really been a factor since the indie revolution took the industry by storm.

~~~
pjmlp
I don't take indies into consideration in my remark; consoles and mobile OSes
are where the money is, and none of those languages has a place there
currently, with the exception of Swift on iOS.

~~~
dleslie
Kotlin, Swift and Javascript are in plenty of mobile games. There are loads of
HTML5 games on Steam. There are Switch games written in Ruby.

I get the impression you're an outsider looking in, relying on decades-out-of-
date personal experience to understand what you're seeing.

~~~
pjmlp
Those are indie games.

Naturally Kotlin and Swift are in mobile games; they are part of the official
user-space SDK languages.

~~~
dleslie
I know for a fact that teams at EA are using such tech, as are teams at
Microsoft and others.

Also, thumbing your nose at Indie games is odd, considering the sales they've
enjoyed and the extent to which the industry has adjusted to adapt to their
surge in popularity.

I can't imagine Nintendo in the 90s treating indie devs the way it treats them
today.

~~~
pjmlp
Everyone uses Swift and Kotlin when targeting iOS and Android; it is either
them or Objective-C and Java.

C++ doesn't have full access to the OS APIs, so some integration is always
required.

~~~
dleslie
What's your point? That you agree that game developers are using new
technology?

~~~
pjmlp
Swift, Kotlin, Objective-C and Java are unavoidable when doing iOS and Android
development; the OS features that are exposed to C and C++ aren't enough for
doing a game.

Adoption is driven by platform owners.

------
azhenley
> _CLU might be the most influential language that nobody’s ever heard of.
> Iterators? CLU. Abstract data types? CLU. Generics? CLU. Checked exceptions?
> CLU._

Well, I learned something new and I spend a lot of time reading about PL (and
even teach undergrad PL!). Had never heard of CLU before.
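For a sense of how directly CLU's ideas map onto modern practice: its iterators suspended mid-loop and yielded one element at a time to the calling for-loop, and JavaScript's generator functions are a close analogue (a sketch in JS, not CLU syntax):

```javascript
// CLU-style iterator as a JavaScript generator: the producer suspends
// at each yield and hands one value to the consuming loop.
function* upto(n) {
  for (let i = 1; i <= n; i++) {
    yield i; // suspend here; resume when the caller asks for the next value
  }
}

const squares = [];
for (const i of upto(4)) squares.push(i * i);
console.log(squares); // [ 1, 4, 9, 16 ]
```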

~~~
restalis
I'm still trying to reconcile "most influential" with "nobody's ever heard
of". It may be that CLU was first to implement some concepts, but if those
concepts could later just as well have been _rediscovered_ (like the iterator
design pattern) as the obvious thing to do, then where does that leave us?

~~~
carapace
> reconcile "most influential" with "nobody's ever heard of"

We ignore the past. I've worked with people who hadn't heard of Alan Kay. I
worked with a guy tasked to revamp an expert system who had never heard of
Prolog.

~~~
7thaccount
I work with a lot of people who write software and a few professional
developers (~10-20 years experience in .NET & Java) and only one of them (has
a computer science degree) has ever heard of Prolog and neither had ever heard
of Smalltalk, APL, Forth, or Lisp when I brought them up around lunch. They're
great at what they do and are much more experienced/talented than I at
creating software, but it always makes me wonder why more professional
software developers aren't more curious about the past and other alternative
solutions. I can tell you how engineers did my job each decade going about 100
years back and how all the tools evolved over that time.

------
DylanSp
Glad to see a mention of CLU; it pioneered a _lot_ of ideas. Reading Joe
Duffy's blogs about the Midori project (greenfield language/system design at
Microsoft), he mentions CLU as an inspiration a number of times.

~~~
pjmlp
One can still play with CLU today:

[http://www.pmg.lcs.mit.edu/CLU.html](http://www.pmg.lcs.mit.edu/CLU.html)

I would already be more than happy if Go 2.0 generics were CLU-like; no need
for anything fancier.

------
dionys
Interesting to see Pascal on the list - it was the language of choice in my
high school and my first programming language, back in 2010. I haven't touched
it since, but I do remember it was pretty easy for me to pick up the syntax,
especially compared to languages I would learn later in uni, like, say, C.

~~~
gwbas1c
Wait, Pascal was in use in 2010? I thought it was a dead language when I used
it in high school in 1997-1998.

~~~
kick
Pascal has been in an incredibly strange state of being both dead and alive
simultaneously for the past thirty years.

...but yeah, it was definitely doing pretty good throughout the 1990s.

~~~
sigzero
"That is not dead which can eternal lie, / And with strange aeons even
languages may die."

------
thrownaway954
Very surprised that ColdFusion isn't on this list. Say what you want about it,
but remember this... You know how all you React, VueJS and frontend developers
just LOOOOOVE components... Yeah... ColdFusion had that first, in what was
called "Custom Tags". Heck... it had a lot of features that you find today in
almost every other language, but it is dying a SLOOOOOOOOW death.

~~~
WorldMaker
ColdFusion had much more limited exposure to the general world than, say, PHP.
Also, arguably and IMNSHO, ColdFusion was quite _bad_ at a lot of its own
features. Custom Tags specifically, as the given example, were barely more
than duct tape over SSI (Server-Side Includes), and ColdFusion never really
had a "proper" component model or DOM for most (if not all, again IMNSHO) of
its history.

(CF is a language on my list of hopes to never see again.)

------
pjc50
There are a few things present in COBOL that I would like to see lifted into
more modern languages. One is the use of the full stop as an expression
terminator, as in English, instead of the semicolon. Another is the whole
"picture" mechanism for number formats.

More languages should also steal Verilog's use of _ as an optional digit
separator. It just arrived in C# 7.
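Both features have rough modern analogues; a sketch in JavaScript (the underscore separator is real ES2021 syntax, and the PICTURE comparison is a loose analogy, not a COBOL implementation):

```javascript
// The digit separator is in JavaScript too (ES2021), not just C# 7:
const budget = 1_250_000;
console.log(budget === 1250000); // true: _ is purely visual

// A loose analogue of COBOL's PICTURE clause (declaring the *shape* of
// formatted output, e.g. PIC $Z,ZZZ,ZZ9.99) is a declarative formatter:
const money = new Intl.NumberFormat("en-US", {
  style: "currency",
  currency: "USD",
});
console.log(money.format(budget)); // "$1,250,000.00"
```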

~~~
onychomys
If we start using periods to end expressions, would we then have to use
semicolons to chain things together? Or would you let the compiler know that
foo.size(). was okay and that it shouldn't be waiting around for another
function call there?

~~~
lifthrasiir
Juxtaposition (`foo size()`) is a serious contender, I think. In fact, I
consider it one of the most undervalued syntactic choices.

~~~
WorldMaker
In HS I built a toy language where it was `foo's size()`, if you want to play
even further down the English-like rabbit hole, while trying to stick to
BNF-ish CFGs. `'s` is actually a surprisingly unambiguous token in a language
without a "char type" that sticks to double quotes only for strings.

------
bryanrasmussen
I linked to something on Erights.org a few days ago
[http://www.erights.org/](http://www.erights.org/)

I think E was more influential than its distribution would suggest.

~~~
evgen
As cool as E was, it was almost entirely based on core ideas lifted from Joule
and KeyKOS. I can't really see any influence beyond a "wouldn't it be great if
language X had feature Y like you find in E".

~~~
bryanrasmussen
I was also thinking of the community: as they moved out into other languages,
they carried with them some ideas that they had been exposed to in the E
community.

~~~
QuesnayJr
Are there active languages influenced by E? I came across that erights site
years ago, and I always meant to learn more about it.

~~~
phpnode
the concept of promises as used in JavaScript comes from E
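A minimal sketch of the chaining style that E's eventual sends pioneered, in modern JavaScript (fetchUser/fetchScore are made-up stand-ins for remote calls, not any real API):

```javascript
// Each call on a not-yet-resolved result just queues more work, the
// core idea behind E's promise pipelining.
const fetchUser = (id) => Promise.resolve({ id, name: "ada" });
const fetchScore = (user) => Promise.resolve(user.name.length * 10);

fetchUser(7)
  .then(fetchScore)                     // chain on a value that doesn't exist yet
  .then((score) => console.log(score)); // 30
```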

~~~
evgen
Promises for async results have been around for decades, and E got the idea
from Joule.

------
fanf2
The ML section misses some significant successors: there was a fork between
Caml and SML. OCaml is still widely used, and has its own spawn such as
ReasonML and BuckleScript.

~~~
LeonidasXIV
Neither ReasonML nor BuckleScript is really an OCaml descendant. ReasonML is a
different syntax for OCaml, an even closer match than CoffeeScript 1.x was to
JS: compiled ReasonML code cannot be distinguished from code that was written
in OCaml. BuckleScript, OTOH, is a set of forks of older compiler versions
with a JS backend; it works with both Reason and OCaml syntax.

ReasonML is therefore much closer to the revised OCaml syntax (a failed
earlier alternate syntax) than it is to a language of its own.

------
andybak
Lisp?

(ducks for cover)

I know someone writing Smalltalk on a daily basis for this system:
[https://kyma.symbolicsound.com/](https://kyma.symbolicsound.com/)

~~~
diggan
The article is about "mostly dead" languages. Lisp lives on in many forms,
Common Lisp, Clojure and Racket being some of the more popular ones.

If we're only talking about influential languages, I'd agree with you and put
lisp on the top.

~~~
andybak
I was being slightly facetious knowing the crowd round these parts.

------
gbrown_
> At a time when adding two lists of numbers meant a map or a loop, APL
> introduced the idea of operating on the entire array at once.

Am I missing something, how is this different from a map?

~~~
icen
It keeps going!

    
    
          3 3 3 ⍴ ⍳10
    
         1  2 3
         4  5 6
         7  8 9
           
        10  1 2
         3  4 5
         6  7 8
    
         9 10 1
         2  3 4
         5  6 7
    
          M ← 3 3 3 ⍴ ⍳10
    
          M*M
    
        1.00000E0  4.00000E0  2.70000E1
        2.56000E2  3.12500E3  4.66560E4
        8.23543E5  1.67772E7  3.87420E8
    
        1.00000E10 1.00000E0  4.00000E0
        2.70000E1  2.56000E2  3.12500E3
        4.66560E4  8.23543E5  1.67772E7
    
        3.87420E8  1.00000E10 1.00000E0
        4.00000E0  2.70000E1  2.56000E2
        3.12500E3  4.66560E4  8.23543E5
    

That's 3 nested 3x3 matrices, and we take the pointwise product of each of
them with just the usual multiplication operator.

How many maps did you want?
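Spelled out with explicit maps in JavaScript, the same pointwise operation over a 3×3×3 array needs one `.map()` per axis (a sketch; the array is built by hand to mirror ⍳10 counting 1..10 and cycling):

```javascript
// Rebuild the 3x3x3 array from the example, then apply the pointwise
// operation with one nested map per dimension, versus APL's single
// operator applied to the whole array at once.
const M = Array.from({ length: 3 }, (_, i) =>
  Array.from({ length: 3 }, (_, j) =>
    Array.from({ length: 3 }, (_, k) => ((i * 9 + j * 3 + k) % 10) + 1)));

// Pointwise x ** x, matching the printed values above:
const P = M.map((plane) => plane.map((row) => row.map((x) => x ** x)));

console.log(P[0][0]); // [ 1, 4, 27 ]  (1^1, 2^2, 3^3)
```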

~~~
bonzini
Isn't that a pointwise exponentiation (1^1, 2^2, 3^3, ...)?

~~~
jodrellblank
Yes it is; APL uses × for multiplication.

------
rst
For all the mentions of FORTRAN in the description of other languages, it's a
real surprise that FORTRAN doesn't rate a mention of its own -- though there
are corners of the world where it's still in use.

It's notable not only for being the Lingua Franca of scientific computing
through at least the '80s, but also for breaking ground in programming
language technology, starting right at the beginning -- the Fortran I
compiler, for the IBM 704 in 1957, was the first to have an optimizer (and the
technical papers on that compiler introduced terminology which has since
become standard to the field -- e.g. "basic block").

~~~
kjs3
If you leave the HN/SV bubble, you find huge amounts of Fortran. It still
plays a significant role in scientific computing. If you lost track of it in
the '80s, you might be interested to know that the language had major updates
in '90, '95, '03, '08 and 2018. It's just not part of the world of
cat-pix-sharing websites and innumerable "Foo of Bar" startups.

Same with COBOL.

~~~
couchridr
One thing I credit COBOL with is the use of descriptive variable names. In
FORTRAN you see variables named i, j, x, etc. (just look at Numerical
Recipes), but in COBOL you see TOTAL_MONTH_SALES and the like. It is
considered good style and "self-documenting" to do that today.

~~~
kjs3
COBOL was intended from the start to be "self-documenting" and English-like
with the thinking being that it could be used by 'non-programmers' to create
business logic. Not sure that I buy it achieved that goal, but it was the
intent.

------
QuesnayJr
That was an incredibly interesting article. I never quite grasped the
historical significance of CLU.

Something I always wonder about COBOL, since it has had so little influence on
other languages, is whether there are good ideas that we missed out on.

~~~
rebataur
ABAP, which runs in all the SAP ERP systems at Fortune 500 companies, is
derived from COBOL. The core product has 400 million lines; extensions would
be 4x that.

------
padseeker
No one is doing anything new in COBOL but there is so much old stuff that
could only be replaced by a massive investment in rebuilding infrastructure.
COBOL isn't dying anytime soon.

At one point, like 10-15 years ago, I knew experienced, well-paid COBOL
programmers being laid off and replaced by kids fresh out of school. And then
CS programs, if they didn't stop outright, at least greatly reduced their
COBOL course offerings. And no one coming out of school learning Java and
Python and Node wants to write COBOL.

And then the old-timer COBOL programmers started retiring, but companies
(especially banks) were not replacing their existing mainframe infrastructure.
So now you have a gap between supply (COBOL programmers) and demand (mostly
banks), to the point that retired programmers are doing part-time work for
$200 an hour. Here's an article from 4 years ago:

[https://www.reuters.com/article/us-usa-banks-cobol/banks-scramble-to-fix-old-systems-as-it-cowboys-ride-into-sunset-idUSKBN17C0D8](https://www.reuters.com/article/us-usa-banks-cobol/banks-scramble-to-fix-old-systems-as-it-cowboys-ride-into-sunset-idUSKBN17C0D8)

~~~
kjs3
I know of several dozen companies in my state alone that are still "doing
something new" in COBOL, including the company I currently work for. No,
that's not just maintenance; it's new apps and projects. Most of these are on
a 'Fortune' list in terms of size.

Startups and small companies can ignore anything that isn't new and shiny; companies processing billions of dollars in transactions know better.

------
ngcc_hk
Pascal had two moments where it really peaked: Turbo Pascal and the Apple Mac. After that, I'm not sure what its use was. UCSD Pascal was always slow.

~~~
avian
Does anyone know whether the Turbo Pascal DOS/Windows compilers produced slower code than C compilers at the time? For a long time I stuck with Pascal and resisted switching to C, even when everyone around me was doing so. One thing I do seem to remember from that time is that the things people were coding up in C seemed to run faster (not sure if it was Borland's Turbo C or something else). It was one of the arguments I remember for leaving Pascal.

Am I misremembering, and it was just an urban legend/marketing from C compiler vendors? Thinking back, it doesn't seem logical that C would offer any significant performance benefit: both Pascal and C were compiled down to machine code and had approximately the same level of abstraction.

~~~
pjmlp
Urban legend. As a Borland fanboy using their Pascal and C++ products, I found the generated binaries were pretty much the same, even if you might have had to play with compiler pragmas, like disabling bounds checking.

In the MS-DOS days, if you actually cared about performance, a macro assembler was the only way to achieve it.

~~~
FpUser
Borland Pascal had built-in assembly, which was very easy to use and integrated with the rest of the language. This let me build a DOS program with its own high-performance graphics for the GUI (I basically stole the Motif visual design, with some adaptations) and proprietary preemptive multithreading. C programs venturing into the same realm were not really any faster and had to use assembly anyway for the performance-critical parts.

~~~
pjmlp
Yes, that is how I used it as well.

To share a similar anecdote: I was so proud of myself for having created my own mouse-support unit, which I then plugged into a couple of BGI-based applications.

------
pjdemers
Why I think Pascal is gone from commercial use: back in the mid-'80s I was working on a new project on PCs under DOS. Our first-choice language was Pascal. However, sometimes we needed to write at a low level for performance, to access hardware directly, or to call Fortran code. For all of these reasons, we switched to C.

~~~
WorldMaker
At least one simplified view of the death of Pascal is the rise and fall of Borland. Borland's compilers were the fastest, and Borland invested more in FFI work and C interop than its competitors. Eventually Pascal was Borland (and vice versa), because no commercial entity (in the '90s-ish) was using a non-Borland Pascal compiler. Borland making bad business decisions, and Microsoft making good ones by poaching key Borland staff at the right moments, seem like some obvious nails in Pascal's coffin. (Obviously that's still a very simplified narrative of the history, but it is an easy and common one.)

------
urxvtcd
Interesting stuff!

Is there a kind of web archive of programming languages? Can we make sure
their compilers, interpreters, etc. are not lost? Sure, there are issues of
them being proprietary, or needing some virtualization to run; I'm just
wondering.

~~~
pmcjones
[http://www.softwarepreservation.org/projects](http://www.softwarepreservation.org/projects)
is a start.

------
EamonnMR
If you're at all interested in this topic and have access to the ACM library,
the 'History Of Programming Languages' (HOPL) conference includes papers on
these and many other languages and is fascinating reading.

~~~
EamonnMR
And they just opened this up today!
[https://dl.acm.org/action/doSearch?AllField=HOPL&startPage=&...](https://dl.acm.org/action/doSearch?AllField=HOPL&startPage=&LimitedContentGroupKey=10.1145%2F154766)

------
lucas_membrane
Nice article. It missed one thing about PL/I that I think it got first and that you see nowadays in most languages (other than low-level ones like Go and C) and most large programs: a hierarchy of error types, blocks of code to catch errors (exceptions) according to type, and the ability to temporarily replace one error handler with another within a narrower scope. Not to say that this is good, but it is what we've got.
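For anyone who hasn't seen it, those PL/I-era ideas map fairly directly onto modern languages; here's a rough Python sketch (class and function names invented for illustration, not PL/I syntax):

```python
# An exception hierarchy: handlers can match a specific type or any
# ancestor of it.
class AppError(Exception): pass        # root of the hierarchy
class StorageError(AppError): pass     # more specific subtype

def risky():
    raise StorageError("disk trouble")

def with_default_handler():
    try:
        risky()
    except StorageError:               # matched by exact type first
        return "storage handler"
    except AppError:                   # would catch any other AppError
        return "generic handler"

def with_narrower_handler():
    # Temporarily "replace" the handler within an inner, narrower scope;
    # the outer handler is back in effect once the inner block exits.
    try:
        try:
            risky()
        except StorageError:           # inner handler shadows the outer one
            return "inner handler"
    except AppError:
        return "outer handler"

print(with_default_handler())   # storage handler
print(with_narrower_handler())  # inner handler
```

The nesting of `try` blocks is the modern stand-in for PL/I's ability to swap handlers for a narrower scope.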

------
celeritascelery
I wonder which of these languages are completely inaccessible today? As in you
can’t run them on any modern machine or even find an emulator for them.

~~~
therealcamino
That's an interesting question. I was going to guess CLU, but I was very
wrong: there are native VAX and SPARC implementations, a Portable CLU
implementation, and even a CLU-to-C compiler. Impressive for a language I
hadn't heard of before today!

[http://www.pmg.lcs.mit.edu/CLU.html](http://www.pmg.lcs.mit.edu/CLU.html)

------
mbo
This article echoes a lot of the themes that I saw in "Family spaghetti of programming languages" ([https://erkin.party/blog/190208/spaghetti/](https://erkin.party/blog/190208/spaghetti/)), which seems to be the best attempt at a family tree of programming languages I've seen thus far.

~~~
amyjess
Ooh, I really like the idea of this, but I kinda wish he had broken Fortran out into a whole family, with separate entries for each major version. There was a lot of cross-influence between Algol and Fortran: while the original Fortran inspired Algol, Algol ended up as a massive influence on how Fortran developed later.

------
V-2
_" COBOL was also enormously complex, even for today’s languages"_

Is that true? What was the complexity about? I've never done COBOL, but I was always under the impression that it was quite straightforward in its infamous clunkiness, lacking lots of constructs that have been obvious for decades now.

~~~
agumonkey
I don't know, but there's ALTER (and maybe a COMEFROM)?
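For readers who never met it: ALTER let a COBOL program rewrite a GO TO's destination at runtime. A rough Python analogy (names invented for the sketch), with a mutable reference standing in for the alterable jump target:

```python
# Two paragraphs the program can "jump" to.
def step_a(): return "A"
def step_b(): return "B"

# GO TO initially proceeds to step_a.
target = step_a
print(target())  # A

# ALTER ... TO PROCEED TO step_b: the jump target itself is rewritten,
# which is exactly what made ALTER-heavy code so hard to follow.
target = step_b
print(target())  # B
```

Self-modifying control flow like this is the main reason ALTER shows up in discussions of COBOL's complexity.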

------
mkchoi212
_" almost all of logic programming actually stems from Prolog"_

Why isn't Prolog on the list then?? :O

~~~
detaro
Is Prolog deader than the logic programming systems stemming from it? The field seems fairly niche overall, but from the limited insight I have, Prolog still seems to be around?

~~~
mkchoi212
My thought was that Prolog is now mostly used in colleges around the world to teach students basic concepts in AI. Correct me if I'm wrong, though. I'd love to see examples where Prolog is used in real life!

~~~
detaro
Academic work certainly seems to still use it, and I've seen it once or twice for "business rule" type stuff. That's not much, but I can't remember much else in this direction either.

I guess constraint solvers like Z3 are kind of in the same category(?), but I don't know if or how their syntaxes are related.
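For anyone curious what the Prolog side of "business rule" stuff looks like, here's a toy Python sketch (facts and names invented) of the rule Prolog would write as `grandparent(X, Z) :- parent(X, Y), parent(Y, Z).`:

```python
# Facts, as a Prolog database would hold them: parent(alice, bob), etc.
facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def parents():
    """All (parent, child) pairs in the fact base."""
    return [(p, c) for (rel, p, c) in facts if rel == "parent"]

def grandparents():
    # The rule: join two parent facts on the shared middle person Y.
    return {(x, z)
            for (x, y1) in parents()
            for (y2, z) in parents()
            if y1 == y2}

print(grandparents())  # {('alice', 'carol')}
```

A real Prolog system does this with unification and backtracking rather than an explicit join, but the declarative flavor (state the rule, let the engine find the answers) is the same.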

------
TomMasz
I did my Master's (1994) work with Smalltalk (specifically Little Smalltalk)
and I still play around with Squeak when I have some free time. Not exactly
producing production code, obviously, but you can build things for yourself
and there's nothing wrong with that.

------
peter303
Reads like the graveyard of my high school and college languages

BASIC 1970 high school computer hacking

LISP 1972 MIT A.I. course

APL 1973 MIT circuits course

PL/I 1973 MIT systems programming course

OS360 Assembly MIT systems programming course

FORTRAN 1974 MIT physics research

C 1976 Stanford research

PASCAL, ObjectPASCAL 1984 Macintosh developer

ObjectiveC 1987 NeXT Developer

C++ 1986 scientific programming

Java 1994 scientific programming

------
paulsutter
I’m surprised that Ada isn’t listed

~~~
ThomasBHickey
Mentioned only in passing as one of the languages that Java squashed.

------
tyingq
360 Assembler and Flash/ActionScript seem like good additions to the list.

~~~
muizelaar
What did ActionScript influence?

~~~
greencore
TypeScript, I'd say.

~~~
WorldMaker
Also TC39's efforts across JavaScript just about ever since. ActionScript's ES4 efforts left a lot of scars, and that "lost version" of JavaScript is influential at the very least in how much ES2015+ _avoids_ ES4/ActionScript's ideas, problems, and concerns, even as it learns from them and eventually reincorporates some of them.

------
dnautics
I love both languages (well, C99 anyway), but I feel like Pascal/C is like Betamax/VHS.

I think it didn't help that free Pascal compilers were harder to come by, IIRC.

~~~
throwaway17_17
I’ve seen multiple statements from people that Pascal was a better language than C, but I absolutely love the Betamax/VHS analogy (it contains a lot of context that usually takes a good deal longer to write out in the Pascal-vs-C discussions). My education and experience effectively skipped Pascal, other than mentions that it existed. So, do you have any off-the-cuff reading or materials that demonstrate where Pascal shines? It’d be interesting to see if any of the qualities that made Pascal ‘better than C’ were ripe for plucking out and putting into another language.

~~~
dnautics
I haven't used Pascal in 20 years. I learned it in CS/AP CS (literally the last year it was on the AP CS exam) and used it for my high school science fair project; in my senior year I taught myself C++, rewrote the project in it, and never went back.

Looked at Eiffel for a hot sec in college, but it never stuck, because there was no toolchain for BeOS, which is what I used back then.

These days I program in elixir, and am learning Zig.

------
ChrisMarshallNY
I started off with BASIC, FORTRAN, and Pascal (as higher-level languages). I also used Perl and PL/I.

I don’t miss them too much, but they were important stepping stones.

------
dboreham
Anyone else here written BCPL? I must be one of very few people to have used
it at _two_ different jobs.

------
hiddenhandle
No mention of SNOBOL? A precursor of Perl, though I only ever saw it in texts.

------
JoeAltmaier
RPG? Or is that encompassed by COBOL...

