
Time safety is more important than memory safety - panic
https://halestrom.net/darksleep/blog/036_timesafety/
======
safety-third
I actually have had the pleasure of porting a program from a defunct 16-bit
language called Actor to C++. It wasn't a huge deal even at around 800 kLOC.
All mainstream languages in the future are going to have some combination of
structured, functional, and object-oriented programming. Converting is mostly
going to be about syntax and libraries.

As someone who has worked with C and C++ for a living for over 20 years, I
wouldn't think twice about picking either Go or Rust if I were to start again.
Go gives you the fast edit/compile/test loop of an interpreted language with
the runtime speed of a compiled language. Rust is the language that the C++
Committee would make if they could start over.

That being said, Go will never take over the C, C++, and Rust niche. Going to
and from Go-land and C-land is too expensive. Google has no interest in
standing behind libraries that aren't internet servers. Go will live a long
life as a great environment to port your Python, Ruby, and other bloated
server languages. It just will never be the next language to write a web
browser.

Rust is amazing though. I see this as the programming language of the future
until the U.S. Government slams down the hammer and forces everyone to use
DOD-approved Ada.

~~~
downerending
Rust is interesting and clever, but it will not see widespread use, for pretty
much the same reason that Haskell won't: It's simply too complex for Joe Coder
to deal with.

C++ is likewise quite complex, but it has the huge advantage that a team of
mediocre programmers can largely just keep to a simple subset of the language
and bumble along. With Rust, you _must_ learn and deal with the memory model,
and it's not an easy thing.

~~~
wolf550e
I think the conclusion is that C++ is too dangerous in the hands of non-
experts.

~~~
downerending
As far as it goes, I agree, and I rarely recommend C++ as the right language
for a project.

Nonetheless, on real-world projects, a team of average programmers can make
reasonable headway in C++. They'll probably write a lot of dubious and buggy
code and _eventually_ slam into the complexity wall. In Rust, they'd slam into
a wall almost immediately and very noticeably. Managers won't put up with
that.

~~~
pasabagi
I've never really dug into C++, but I've written a little bit of C, and I feel
it's generally easier to write a given program in Rust than it is in C.

I can see why Rust has its reputation for difficulty - the borrow checker is
a new idea, and the unfamiliar is inherently difficult. However, I think once
you've spent a modicum of time with it, it's a fairly straightforward
language, simply because the compiler almost always tells you exactly what
you're doing wrong.

I don't feel like it's a particularly ergonomic language, and it's certainly
hard to read (very symbol dense), but I think it's easier to learn to appease
the borrow checker than it is to debug your average C program, especially if
you're the sort of person (I certainly am) that produces dubious and buggy
code more days than not.
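
To make the comparison concrete, here is a minimal Rust sketch (invented for
illustration) of the kind of mistake the borrow checker turns into a compile
error rather than a debugging session:

```rust
// A value has a single owner: moving it invalidates the old binding at
// compile time, so "use after move" never reaches runtime.
fn consume(s: String) -> usize {
    s.len()
} // `s` is dropped here; the caller can no longer use it

fn main() {
    let greeting = String::from("hello");
    let n = consume(greeting);
    // println!("{}", greeting); // rejected: "borrow of moved value: `greeting`"
    // The equivalent C bug (reading a freed buffer) compiles silently.
    assert_eq!(n, 5);
    println!("length = {}", n);
}
```

The error message points at the exact line and names the moved value, which is
the "tells you exactly what you're doing wrong" experience described above.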

~~~
downerending
That may be, but debugging "looks like work" in the eyes of mgmt.

The borrow checker, on the other hand, looks more like "I can't find any
applicants in Little Rock who can do this stuff.".

~~~
pasabagi
On the other hand, I think the most important thing in making developers
replaceable is the degree to which the codebase explicitly contains all the
information relevant to it.

Lifetimes are an additional, important piece of information that would have
previously been something a programmer would have to have an intuition for, or
that would need to be put in documentation.

Now, that work is delegated to the compiler, and so, your individual developer
is somewhat more fungible.

I don't know if Rust is specifically the future in this regard, but I think
if I was a Machiavellian manager, I'd be interested in replacing instances of
human intuition and group knowledge with tooling, as much as possible. The
borrow checker is one such tool - even if automatic garbage collection is
probably a more straightforward one.
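
A small sketch of that delegation: in Rust, the lifetime relationship a C
codebase would record in a comment (or in a developer's head) is part of the
checked signature. The `first_word` helper here is invented for illustration:

```rust
// The signature itself records what would otherwise be tribal knowledge:
// the returned slice borrows from `text`, so `text` must outlive it.
fn first_word<'a>(text: &'a str) -> &'a str {
    text.split_whitespace().next().unwrap_or("")
}

fn main() {
    let text = String::from("time safety matters");
    let word = first_word(&text);
    assert_eq!(word, "time");
    // drop(text); // rejected: cannot move out of `text` while it is borrowed
    println!("{}", word);
}
```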

------
wtracy
If "will this be usable in ten years?" is a concern, then you shouldn't be
worried about your language choice. You should be worried about the services
your program depends on.

Does your mobile app do anything useful if Google shuts down their
authentication servers? What happens to your users if the App Store/Play Store
decides to purge applications that don't actively support recent APIs? What
happens when the Maven repositories your build depends on get shut down?

I would much rather figure out how to build a Rust program in ten years than
try to recreate a web service that simply no longer exists.

While we're at it: All the "obscure" languages discussed here are FOSS.
Finding an implementation will be relatively easy, and having access to the
source means that things like binary compatibility issues can be worked
around.

This is nothing like trying to resurrect a program written in an obscure
proprietary dialect of Pascal that was only made available on a run of 200
floppy disks.

~~~
zzo38computer
Programs I write are generally designed not to depend on any particular
internet services (if they need some, they are configurable), and this is
part of the reason I do so.

And yes, in the case of these FOSS languages it is probably easy enough to
find an implementation and to work around the binary compatibility issues. In
the case of a program written in an obscure proprietary dialect of Pascal
that you do not have an implementation of, well, sometimes you might
reimplement a subset which is enough to run these programs (I have done
similar things in the past), although of course it is probably going to take
longer and be more difficult than the FOSS case.

------
pcwalton
Counterpoint: NES games were handwritten in assembler for a long-dead
architecture. You couldn't pick a worse development environment for "time
safety". The source code for most of the games is lost, too. Yet they remain
some of the most portable programs in existence, because of emulation.

The PC architecture is extremely well-documented in practice (as are
alternatives like WebAssembly), and there are going to be emulators for them
around for the foreseeable future. Emulators for the PC architecture are no
more likely to die in the future than C compilers are. You will be able to run
any program written for the PC, in any language, for a very long time.

~~~
bitwize
> The PC architecture is extremely well-documented in practice (as are
> alternatives like WebAssembly), and there are going to be emulators for them
> around for the foreseeable future.

Intel has patents on the x86 ISA. Though the basics of the instruction set
(386, 486, Pentium) are no longer covered by patents, it's impossible to
write a complete emulator for a chip made within the past 20 years without
infringing on Intel's IP.

~~~
rcxdude
Nintendo claims the same about emulation of their consoles. It hasn't appeared
to be much of a deterrent.

~~~
fwsgonzo
Anyone can claim anything that helps their cause. All you have to do is not
implement the emulator using the official developer documentation for the
system.

For example on the original Gameboy the thing you have to avoid is using the
copyrighted BIOS ROM. But you can avoid that by just initializing the CPU
registers and instead starting at the cartridge start address.
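
A hedged sketch of that trick: start the emulated CPU directly in the state
the boot ROM leaves behind. The register values below are the widely
documented post-boot state of the original DMG Game Boy (taken as an
assumption here, not from any official Nintendo source); cartridge execution
begins at 0x0100:

```rust
// Instead of executing the copyrighted BIOS, construct the CPU in the
// state the BIOS would leave it in, and jump straight to the cartridge.
struct Cpu {
    a: u8, f: u8,
    bc: u16, de: u16, hl: u16,
    sp: u16, pc: u16,
}

impl Cpu {
    fn post_boot() -> Cpu {
        Cpu {
            a: 0x01, f: 0xB0,                       // assumed DMG boot values
            bc: 0x0013, de: 0x00D8, hl: 0x014D,
            sp: 0xFFFE,
            pc: 0x0100, // cartridge entry point, skipping the BIOS entirely
        }
    }
}

fn main() {
    let cpu = Cpu::post_boot();
    assert_eq!((cpu.a, cpu.f), (0x01, 0xB0));
    assert_eq!((cpu.bc, cpu.de, cpu.hl), (0x0013, 0x00D8, 0x014D));
    assert_eq!((cpu.sp, cpu.pc), (0xFFFE, 0x0100));
    println!("starting execution at {:#06x}", cpu.pc);
}
```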

~~~
comex
FWIW, independent reimplementation protects you against copyright infringement
but not patent infringement.

~~~
littlestymaar
True, but patent infringement is enforceable in only a few places in the
world, while copyright infringement is enforceable pretty much everywhere.

------
kelnos
> _I don't think it's responsible to ask ordinary programmers to start their
> projects in new languages._

Rust is 14 years old[0], its compiler has been self-hosting for 9 years, and
its 1.0 release was nearly 5 years ago. Sure, that's not as old as C or C++,
but I wouldn't call it "new" either.

[0]
[https://en.wikipedia.org/wiki/Rust_(programming_language)#Hi...](https://en.wikipedia.org/wiki/Rust_\(programming_language\)#History)

~~~
ribs
I’d call it maybe still a bit too new for applications where enterprise-level
funds or harm to human life is in scope if the software fails. But it’s
getting there.

~~~
heavenlyblue
“too new”. Would you say the same about the new Boeing plane, then, too?

~~~
cshenton
The 737 max? Yeah probably.

~~~
heavenlyblue
However, if the 737 had continued flying we would have gotten more deaths
rather than more fixes, wouldn't we?

Age has got nothing to do with stability.

------
pornel
Rust isn’t going anywhere in the foreseeable future. Even _if_ it died today,
it could stay around for a decade or two (Python 2 did. C99 is both a dead
language and still considered new).

When it becomes so obsolete you won’t be able to use it, it’s likely that
whatever you wrote in it also won’t have any value beyond being a historical
artifact.

And if you want to preserve programs and compilers for museums or the far
future, how about building the compiler for WASM+WASI and archiving that?

These technologies are still new, but their spec is much smaller and simpler
than C and native OS APIs. You’ll be able to recreate a WASM interpreter even
centuries later, and from there revive the compiler and rebuild the programs.
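
As a toy illustration of why a small stack-machine spec is easy to resurrect,
the whole evaluator for a three-instruction subset fits in a page (this
instruction set is invented for the sketch, not real WASM):

```rust
// A minimal stack machine: the entire "spec" is three instructions, so a
// future archivist could reimplement the interpreter from scratch quickly.
enum Op {
    Push(i64),
    Add,
    Mul,
}

fn eval(program: &[Op]) -> Option<i64> {
    let mut stack = Vec::new();
    for op in program {
        match op {
            Op::Push(v) => stack.push(*v),
            Op::Add => {
                let (b, a) = (stack.pop()?, stack.pop()?);
                stack.push(a + b);
            }
            Op::Mul => {
                let (b, a) = (stack.pop()?, stack.pop()?);
                stack.push(a * b);
            }
        }
    }
    stack.pop()
}

fn main() {
    // (2 + 3) * 4
    let program = [Op::Push(2), Op::Push(3), Op::Add, Op::Push(4), Op::Mul];
    assert_eq!(eval(&program), Some(20));
    println!("{:?}", eval(&program));
}
```

Real WASM's spec is of course larger than this, but the point stands: a
closed, well-specified virtual machine is a far smaller revival target than C
plus its native OS APIs.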

~~~
0xcde4c3db
> C99 is both a dead language and still considered new

Lest anyone doubt this, I recently worked on a (proprietary) library that the
company kept C89-clean because some (similarly proprietary) embedded targets
are basically frozen in the mid-90s.

~~~
raxxorrax
That is pretty common in the embedded world. Here, "ANSI C" is still treated
as equivalent to C89, even though that equivalence isn't really true anymore.

------
tptacek
This argument only makes sense if you believe the longevity of your project is
more important than the personal safety of its users. For a lot of projects
that might be the case. For most commercial and major open source projects, it
is not.

------
ScottBurson
There's always Common Lisp. Still portable 35 years on.

I recall once, not quite 20 years ago, porting a large CL program from a
32-bit implementation to one of the then-new 64-bit Lisps. Someone who didn't
know CL thought that would be a hellish job, but it turned out to be quite
straightforward.

~~~
justinjlynn
With the new research being done in macro-level type systems there are a lot
of great reasons to check out Lisp again (as a person who loves strongly
typed languages)!

Type Systems as Macros -
[https://www.ccs.neu.edu/home/stchang/pubs/ckg-popl2017.pdf](https://www.ccs.neu.edu/home/stchang/pubs/ckg-popl2017.pdf)

Dependent Type Systems as Macros -
[https://www.ccs.neu.edu/home/stchang/pubs/cbtb-popl2020.pdf](https://www.ccs.neu.edu/home/stchang/pubs/cbtb-popl2020.pdf)

It's so cool - especially the bits where they reconstruct each part of the
type system and show how each feature they add contributes to the
power/capability of it. :D

~~~
ScottBurson
I used macro expansion for type checking in Zeta-C [0], the first C compiler
for Lisp Machines, in 1983. Didn't write a paper about it, though.

Dependent types are certainly interesting -- I'll give this a read. Thanks!

[0]
[http://bitsavers.trailing-edge.com/bits/TI/Explorer/zeta-c/](http://bitsavers.trailing-edge.com/bits/TI/Explorer/zeta-c/)

~~~
justinjlynn
Awesome! I'm really enjoying reading your source code. Cool stuff; would
definitely be interested in hearing your opinion of the papers.

------
pdimitar
> _In contrast my old C projects from 5-8 years ago still compile and run_

Sigh. Again this circular logic. And again, this is a fact simply because many
people chose to muscle through a lot of problems that C has. It's NOT because
C is amazing or anything. Many people chose to muscle through COBOL's problems
as well. Is COBOL amazing? Is COBOL giving you time safety?

This is like saying that the Amish have the superior philosophy because there
are still Amish communities. They have... a philosophy. They hold it dear.
They insist on living by it. That's it. There's nothing more to it. No deeper
revelation.

I am not going to engage in the "C vs. Go/Rust" debate apart from saying that
_clearly_ many people dislike C and C++ and are looking for alternatives.
That by itself should immediately hint to the author that his case is not as
universally accepted as he seems to want to make it.

~~~
kazinator
"My old C projects from 5-8 years ago still compile and run ... in my trusty
Ubuntu 12 VirtualBox inside Windows 10!"

:)

------
platz
It's actually the assumptions around longevity that need to be examined.

Software is not like etching perfect museum-pieces.

It is the surrounding world that defines software's context, and since that
will always change, we'll have to continue to write and re-write software to
ensure the match stays appropriate.

~~~
ryl00
It depends on what you're working on. I'm in the simulation area at an
engineering company, and we've got plenty of Fortran code that dates back to
the '70s that's still just fine. A little maintenance to work on removing
deprecated syntax, a little work to replace common blocks with a more
decoupled design, and we're probably good for decades more use. And if we
can't get to that maintenance work right now, the strong backwards-
compatibility of Fortran will have our back until the next opportunity. :)

~~~
safety-third
Same here. We are stuck on Watcom F77 right now due to using some old
extensions. We are in the middle of eliminating control characters from
format statements so we can finally upgrade to a new compiler.

------
schoen
I thought this was going to be about concurrency, but it's instead about the
risk that programming languages will become obsolete quickly.

~~~
giornogiovanna
I thought this was going to be about bounding the time complexity of your
algorithms; I don't know if any language does that yet. I suppose this title
has a lot of interpretations.

~~~
bvinc
My understanding is that automatically determining time complexity is
impossible in the general case due to the halting problem.

And getting a compiler to truly understand the time complexities of your data
structures would involve either extremely difficult theorem proving or just
forcing it.

I do think it would still be interesting if a language had support for
determining time complexity. It seems like it's often possible even if it
isn't in the general case.
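
One pragmatic stand-in, short of theorem proving, is instrumentation: count
operations instead of proving bounds. As a sketch, bubble sort on reversed
input performs exactly n*(n-1)/2 comparisons, matching its O(n^2) worst case:

```rust
// Measuring (rather than proving) complexity: instrument the inner loop
// and count comparisons. On reversed input, bubble sort hits its worst
// case of exactly n*(n-1)/2 comparisons.
fn bubble_sort_counting(v: &mut Vec<u32>) -> u64 {
    let mut comparisons = 0;
    let n = v.len();
    for i in 0..n {
        for j in 0..n - i - 1 {
            comparisons += 1;
            if v[j] > v[j + 1] {
                v.swap(j, j + 1);
            }
        }
    }
    comparisons
}

fn main() {
    let n = 100u64;
    let mut v: Vec<u32> = (0..n as u32).rev().collect();
    let count = bubble_sort_counting(&mut v);
    assert_eq!(count, n * (n - 1) / 2); // 4950 for n = 100
    assert!(v.windows(2).all(|w| w[0] <= w[1])); // and it actually sorted
    println!("{} comparisons for n = {}", count, n);
}
```

This only demonstrates behaviour on the inputs you feed it, which is exactly
the gap between empirical measurement and the compiler-checked bounds the
comment wishes for.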

~~~
Quekid5
Another issue is that algorithmic complexity only really 'composes' in a
trivial and uninteresting way: Ok, you have a for loop over N items, multiply
the complexity of the loop body by N.

Well, no, not quite... it may be the case that one particular iteration of
the loop makes all the subsequent steps completely trivial, so O(1)... and
even further it may be that such a step is guaranteed to be reached in log(M)
time, etc. etc. You see this type of thing a lot in graph algorithms where
they look like they might be O(n^3) or whatever, but leaving markers on nodes
can avoid a lot of repeated work (by skipping marked nodes), so they end up
as O(n + k) or whatever.
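
A sketch of that marker trick: a reachability walk whose code shape looks
like nested loops but which, thanks to a visited set, expands each node once
and so runs in O(n + e):

```rust
use std::collections::HashSet;

// Naive recursive reachability can revisit nodes (and loop forever on
// cycles); marking visited nodes caps the work at one expansion per node.
fn reachable(adj: &[Vec<usize>], start: usize) -> HashSet<usize> {
    let mut visited = HashSet::new();
    let mut stack = vec![start];
    while let Some(node) = stack.pop() {
        if visited.insert(node) { // false if already marked: skip repeat work
            for &next in &adj[node] {
                stack.push(next);
            }
        }
    }
    visited
}

fn main() {
    // 0 -> 1 -> 2 -> 0 (a cycle), node 3 isolated
    let adj = vec![vec![1], vec![2], vec![0], vec![]];
    let r = reachable(&adj, 0);
    assert_eq!(r.len(), 3);
    assert!(!r.contains(&3));
    println!("{} nodes reachable from 0", r.len());
}
```

The multiply-the-loop-bounds analysis would call this quadratic; the marker
argument is what tightens it, which is why such bounds don't compose
mechanically.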

Upper bounds for worst case complexity are relatively easy... just assume the
worst, but the interesting thing is proving _TIGHT_ bounds for worst case
complexity.

------
say_it_as_it_is
Spoken in the voice of David Attenborough: and here we have the lifelong C
programmer fighting for status among his tribe. He proclaims authority over
time, and his tribe agrees, as they wilfully ignore the fiery meteor that
descends upon their position. Time, it seems, has almost run out.

------
pron
I think this is yet another post that falls into the trap of mistaking
HN/Reddit -- fashion publications, really -- for something else. They're GQ or
Vogue, not the New York Times.
[https://news.ycombinator.com/item?id=22106559](https://news.ycombinator.com/item?id=22106559)

The vast majority of developers do and always will choose well-established
languages. If Go or Rust or Zig or Kotlin or Clojure or Elixir ever become
truly popular, by that time they will have become established and "safe"
(unless they were Microsoft technologies, but they're not).

~~~
cbhl
If you agree with this principle, I think the end result is that you choose
the most popular language for some window.

For banks and airlines, you conservatively pick a large window. At one point
that ended up being COBOL. That is slowly migrating to Java now. I suspect it
will move to TypeScript in about a decade or so.

For new code today, I'd probably pick TypeScript.

A few years ago, I'd have picked Ruby. Before that, Python. Before that,
Perl/CGI. Before that, Java. Before that, C++.

I wonder if at some point the tides will turn back to writing pure-personal-
computer-and-no-cloud C/C++ Windows applications.

~~~
pjmlp
You just need to go into life-sciences lab robots, factory automation,
medical devices, ticketing machines... to enjoy doing Windows WPF or C++
applications.

------
titzer
It's tempting to just drop this one because it's borderline flamebait, but
there are so many pernicious misconceptions here that it's worthwhile calling
them out explicitly. I'm generally not into author-chiding, but TBH the post
was written in such an inflammatory and ignorant fashion that I'd encourage
the author to be more thoughtful in the future so that we can all get out of
the muck.

1\. False choice between "unsafe" and "new" languages. No, just no. Safe
languages have been around since the 1960s (e.g. LISP). While it's not really
"C versus the world", C/C++ are in the vast minority when it comes to safety
across all programming languages. Throw a stick, hit a safe programming
language. Java, C#, Rust, Go, Python, Ruby, R, Lisp, Clojure, Scala, Haskell,
ML, Modula 3, TCL, JavaScript: the list goes on and on and on.

2\. C programs don't break. Just plain _false_. C and C++ are underspecified
languages, meaning they have undefined behavior in some situations--actually,
most situations. Undefined behavior is silent and there are poor diagnostics.
It's almost always a program bug. Undefined behavior and even nonportable
behavior is absolutely rife in the C and C++ world. Programs aren't
necessarily portable across platforms, compilers, language versions, and even
compiler bugs. Well-specified languages give programs a much better chance of
being portable. Which leads to:

3\. Portability is mostly a language issue. This is only partially true. The
fact is that portability and forward compatibility (time safety as this person
calls it) is a function of dependencies--on the language version, compiler,
external libraries, and general environment around the program too. Fewer
dependencies generally means better forward compatibility. It's also good if
those dependencies are maintained by teams who care a lot about backward
compatibility. Some languages do more than others.

4\. "Time safety" is a term. I have never heard of this term before, which
just speaks to the uninformed nature of this post. I think the author means
forward compatibility.

Anyway, if you feel like the core message of this post is that unsafe=future-
proof, feel free to choose another unsafe language to do your next program.
Hint: there aren't many.

PS: Java 1.0 is now 25 years old. Java 1.0 programs still compile _today_ and
the binaries that compiler generated back then _still run on today's JVMs_.

~~~
whateveracct
> PS: Java 1.0 is now 25 years old. Java 1.0 programs still compile today and
> the binaries that compiler generated back then still run on today's JVMs.

If you want to write a random program that you don't plan to support but want
to remain usable, Java is your best bet.

I've on multiple occasions recently used really random Java GUIs that are from
the 2000s (Melee related). It sometimes took some finagling (iirc I had to use
Java 8 or something .. luckily it was just a nix-shell away!)

------
pjmlp
What a pile of garbage.

First of all, try to compile C code written for the 8-bit and 16-bit micros
of the '80s to see how well it has stood the test of time without making use
of hardware emulators. Spoiler alert: it won't even compile.

Meanwhile, plenty of programming languages, some of them older than C, are
doing great.

Anyone with enough money can enjoy NEWP on Unisys ClearPath, a systems
programming language almost 10 years older than C.

Or for free one can enjoy Algol 68 programs in 2020,
[http://algol68.sourceforge.net/](http://algol68.sourceforge.net/)

Guess what, those languages took engineering seriously, unlike C.

"The first principle was security: The principle that every syntactically
incorrect program should be rejected by the compiler and that every
syntactically correct program should give a result or an error message that
was predictable and comprehensible in terms of the source language program
itself. Thus no core dumps should ever be necessary. It was logically
impossible for any source language program to cause the computer to run wild,
either at compile time or at run time. A consequence of this principle is that
every occurrence of every subscript of every subscripted variable was on every
occasion checked at run time against both the upper and the lower declared
bounds of the array. Many years later we asked our customers whether they
wished us to provide an option to switch off these checks in the interests of
efficiency on production runs. Unanimously, they urged us not to--they already
knew how frequently subscript errors occur on production runs where failure to
detect them could be disastrous. I note with fear and horror that even in
1980, language designers and users have not learned this lesson. In any
respectable branch of engineering, failure to observe such elementary pre-
cautions would have long been against the law."

"Hoare's The 1980 ACM Turing Award Lecture",
[https://www.cs.fsu.edu/~engelen/courses/COP4610/hoare.pdf](https://www.cs.fsu.edu/~engelen/courses/COP4610/hoare.pdf)

Still disappointed at how Azure Sphere can sell its security story while
offering only C as a programming language, so...

Even C++, with its copy-paste C89 compatibility, is already better than
writing plain old C. And guess what, the language is 40 years old now.

~~~
sitkack
I think of C as the Volkswagen Beetle: an easy-to-produce, inexpensive thing
that hit a niche that didn't previously exist. Worse is better, the low end
eats the high end, plus Dijkstra's quote on BASIC.

The real message in Trusting Trust is that the language has been backdoored
at the design level; it doesn't need a malicious compiler. A message always
has multiple parts, and most people miss the important ones, assuming that
the literal interpretation is the intended one.

------
pansa2
> C is much more likely to be the "safe" option if time x usefulness of your
> project is a goal.

What about using Lua? Sure, different versions of the language are
incompatible, but you could target a particular version. Anyone in the future
who would be able to compile your C program would also be able to compile the
correct version of the Lua interpreter, because it’s written in pure ANSI C.

Using Lua instead of C would gain memory safety, at the expense of
performance.

------
mntmoss
In theory, we are asymptotically approaching a stable set of programming
languages - a constellation of local optima. When something comes along with
a different name and new syntax and cleans up some semantics, it's called a
new language, but that's missing the point - the intent of the programming
environment it provides remains similar. And then writing code in it is no
real issue.

What ties us down to old languages is the whole environment - it is Joe
Armstrong's "to give a gorilla a banana you must pick up the entire jungle"
run rife through our production systems. Because our code increasingly depends
on globally-sourced libraries, we tend towards copying around the entire
planet and so there are layers of accretion all over.

And yet emulators do exist and run software successfully. The trick there is
that they have a terrarium of sorts: a boundary that limits the ecosystem. The
explosions that obsolete software tend to occur where a terrarium was not
planned for by the original authors, rather continuous dependence on an
evolving ecosystem was taken for granted.

------
benibela
I use Pascal for all my projects because it has memory safe strings and
arrays.

Almost all buffer overflows and security bugs could be solved by rewriting all
software in Pascal.

Every time a piece of software crashes, you should say: it crashed because it
was not written in Pascal.

I just spent two hours modifying my XML parser to load files that have a
doctype with inline declarations. I never needed to load an XML file with a
doctype before, and I only wrote the parser for the files I have. I also have
plans for a JSON parser. There is a surprising lack of Pascal libraries.

~~~
dana321
You would probably like go, its much like pascal. And it has libraries for
everything!

~~~
flavio81
Last time I checked, Pascal has generics and exception handling.

~~~
ratboy666
Pascal has neither generics nor exception handling. At least, with reference
to ISO Pascal.

~~~
benibela
Nobody refers to ISO Pascal anymore. It is either Delphi or FreePascal.

------
safercplusplus
If you're planning that far ahead it may not be an either-or situation. That
is, in the future C/C++ may also have an enforced memory safe subset. In which
case the issue becomes, how do you write your code today so that it will
conform to the safe subset? There are conformance tools in the works that can
already give you a sense of the restrictions that will be imposed [1].

[1]
[https://github.com/duneroadrunner/scpptool](https://github.com/duneroadrunner/scpptool)

~~~
dang
Please don't use promotional accounts on HN. We're here for curious
conversation, not promotion. I appreciate that you've been posting these links
in threads where they're mostly relevant, as opposed to blanket-spamming the
site, but it's still not in the spirit of HN to have single-purpose accounts
or to use this place primarily for promotion. It's ok to post your own work
occasionally, as long as it's part of a diverse mix of posts that are
motivated primarily by intellectual curiosity. That's the value we're
optimizing for here.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
sly010
What kills code more than anything else is not the language of choice, but
operating system vendors "improving" user experience by deprecating APIs and
frameworks.

That said, there are language communities that definitely value change more
than stability (*cough* JavaScript *cough*).

But Go? IMHO it's unfair to implicate Go. It has no backward-incompatible
change that I know of, and it literally depends on nothing but the kernel
interface, which is known to be very stable.

------
Buge
From the title you might think it's talking about safety from pathologically
slow inputs. Especially considering there's a different post also on the front
page right now about that exact thing.

> PerfFuzz: Automatically Generating Pathological Inputs (2018) [pdf]

[https://news.ycombinator.com/item?id=22315542](https://news.ycombinator.com/item?id=22315542)

------
GlitchMr
I kinda disagree. With C, you risk old autoconf scripts breaking, depending on
library versions that operating systems don't include anymore, compilers
learning how to exploit undefined behaviour they didn't exploit before,
programs breaking due to making incorrect assumptions (`char` is signed,
right?). C++ also adds exciting issues in terms of backwards compatibility
breaks (for instance, programs using the `register` keyword won't compile
since C++17) - refer to Annex C in the C++ specification for a long list of
BC breaks.

None of this is an issue in Rust. Cargo is a proper build system, not a weird
combination of shell script and M4 using programs that are prone to breaking
changes (like awk). Cargo requires declaring program dependencies and
prevents accidentally using a dependency that merely happens to exist on the
operating system. Safe Rust code avoids undefined behaviour, notably removing
the risk of accidental undefined behaviour. Rust has much less
implementation-defined behaviour than C (`i8` is a well-defined type). Rust's
edition system means that breaking changes avoid affecting old code (even if
Rust were to, say, remove `static mut` in edition 2024, this change wouldn't
affect programs that didn't explicitly upgrade to edition 2024).

Rust is serious about backwards compatibility. Breaking changes are mostly
unacceptable (other than in cases of soundness bugs, but even then there is
typically a warning cycle before fixing a bug). In fact, you may take a look
at "Compatibility notes" sections in
[https://github.com/rust-lang/rust/blob/master/RELEASES.md](https://github.com/rust-lang/rust/blob/master/RELEASES.md).
The compatibility breaks tend to be very
minor, and unlikely to break stuff. For instance, in Rust 1.39, `asinh(-0.0)`
was changed to return `-0.0` instead of `0.0`. Strictly speaking a breaking
change, but a program depending on that is very very unlikely. Rust also has
crater which prevents accidentally breaking backwards compatibility by testing
whether every public GitHub repository and crates.io library still works the
same way. Additionally, even in case program compilation breaks anyway, it's
possible to use an older version of a compiler with rustup.
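
That asinh change is small enough to check directly on any toolchain from
1.39 onward (a sketch, assuming current std behaviour):

```rust
// The compatibility note mentioned above, verified: on Rust 1.39 and
// later, asinh preserves the sign of negative zero.
fn main() {
    let r = (-0.0f64).asinh();
    assert_eq!(r, 0.0);            // -0.0 compares equal to 0.0...
    assert!(r.is_sign_negative()); // ...but carries the negative sign bit
    println!("asinh(-0.0) = {:?}", r);
}
```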

Rust is here to stay. Even if Mozilla stopped supporting the project, some
other company that uses Rust internally would continue to support it. It's
free software after all. And even if that didn't happen, the old rustc
releases would still work.

------
gameswithgo
usually the argument to keep using a bad language is “but we have all this
stuff we built and we don’t have that in newer things so we can’t easily
change”. but this is a new take! we can’t use new languages because we expect
only these flawed ones to live FOREVER

------
loopz
New programming languages should address this concern. C by itself doesn't,
really.

------
cestith
TL;DR: It's better to have buffer overflows and use-after-free bugs that still
work in 50 years than to have secure code you may need to migrate.

~~~
LeifCarrotson
I think you're being sarcastic? But I also think the article genuinely
believes that's true.

In 50 years there will be C programmers who know how to fix a buffer overflow,
compile, and run your program. Conversely, a program written in a language and
dependent on a package repository that only existed from 2017-2024 will likely
never find an academic willing to invest the thousands of hours in learning
that legacy code, digging through archives, and making that program run again.

~~~
kelnos
> _In 50 years there will be C programmers who know how to fix a buffer
> overflow, compile, and run your program._

I don't think that's a given. In 2070, C might be today's COBOL. There are a
few people who know it, and are paid obscene amounts of money to maintain
aging systems to keep them from falling apart.

Now, there's certainly more C code now than there was COBOL code at COBOL's
peak, but that's no way of predicting the future. What I'd call "modern" computing
and software development has only been around for 30-40 years or so. In 50
years I expect things to be radically different.

------
jstewartmobile
The infinity-plus-oneth HN pigpile-on-the-C-luddite thread. Ugh...

------
tpmx
The knowledge needed to compile a C program into executable code will never
die, as long as there is a human race.

~~~
kryptiskt
Lots of C code has already bitrotted because APIs it depends on have been
abandoned. Resurrecting it isn't just a matter of compiling the code, but of
recreating the environment it ran in.

~~~
tpmx
That was before storage became abundant and cheap, from a source-code-storage
perspective. It will not become more expensive.

Nowadays storage capacity is developed to store video - this is really the
only thing that needs a lot of new storage capacity.

Every line of sourcecode written by every programmer ever is a fart in the
wind compared to what's being uploaded to Youtube in the next 60 seconds.

~~~
joshuamorton
Which does nothing to address "it works on my machine" build related issues.
And more modern languages have much more robust solutions to hermetic build
and package management.

~~~
rthille
That's what docker images are for :-)

------
shmerl
I don't get the argument. Is the author complaining about the complexity of
languages like Rust? I.e. that only simple languages have longevity? C++ is
complex, yet it has already been around for quite a long time too.

Just because C happens to be a long used language doesn't mean you shouldn't
be using newer and better languages and that those languages can't be used for
a long time too.

C is used not due to big benefits, but mostly for legacy reasons today.
Basically a lot of projects are already stuck with it. But for new ones,
surely use Rust, not C.

~~~
giornogiovanna
Honestly, if you care about longevity, Rust simply isn't the language yet. I'd
at least wait until there's a specification and a GCC frontend.

Anyway, C does have _some_ benefits as a programming language.

• "Legacy" projects like Linux and CPython will keep it alive for decades to
come.

• It's extremely portable. Rust is less portable, and even when Rust supports
a niche platform, it's relatively clunky to get things started.

• C libraries are always very easy to call from other languages. Rust isn't
quite there yet, and translating things like traits can be a bit of a pain.

