
C will never stop you from making mistakes - ingve
https://thephd.github.io/your-c-compiler-and-standard-library-will-not-help-you
======
kevincox
Trying to avoid adding warnings is a very silly form of twisted logic.

1. People enable warnings and -Werror because they want high-quality code.
2. The standard can't add warnings because people use -Werror.

This means that not adding warnings is directly against the original reason to
use -Werror in the first place! We are now avoiding warning people about
dangerous things because they requested to be warned about dangerous things!

~~~
flohofwoe
The argument also doesn't make much sense because all 3 big compilers are
_already_ adding tons of warnings with each new release. Upgrading to a new
compiler version and seeing screens full of warnings scroll by when compiling
code that was warning-free in the previous compiler version is quite normal.
Why does this affect the C committee's decision making, and why is it suddenly
a problem?

~~~
sramsay
If I'm following the author's argument correctly, the very influential
companies who maintain very large, very old C code bases don't like what you
just described -- new warning messages in code that used to not have any. They
worry, then, that if the STANDARD actually mandates a warning, that is even
more likely to happen, and that makes the very influential companies very sad.

Which sounds like bullshit to me.

~~~
adrianN
It actually makes a little sense: Many companies are required to ship code
without warnings (for example in safety-critical systems). Fixing warnings is
very expensive and can introduce new bugs in code that has been running fine
for decades. If you force new warnings into the compiler the result would be
that these companies simply stop using newer versions of the compiler.

~~~
flohofwoe
Part of the normal "warning hygiene" process is deciding when a new warning in
old code should be fixed and when it's better to suppress it.

One of my favourite warnings in this regard is gcc's misleading-indentation
warning. The warning makes sense for new code written by a human, but if the
code is machine generated, or decades old without showing any signs of
problems caused by a "misleading indentation", then it is indeed much less
risky to simply suppress that particular warning in that particular source
file or library.
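
For illustration, this is the kind of code -Wmisleading-indentation is meant
to catch. It's a made-up minimal example of mine, not code from the article or
from any real project:

      #include <stdio.h>
      
      int main(void)
      {
          int error = 0;
      
          if (error)
              printf("handling error\n");
              printf("cleaning up\n");  /* indented as if guarded by the if,
                                           but it always runs */
      
          return 0;
      }

With a reasonably recent gcc, -Wall (which enables -Wmisleading-indentation)
flags the second printf. The program is well-defined either way, which is
exactly why suppressing the warning in old or machine-generated code can be
defensible.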

~~~
jschwartzi
The problem here is that if someone dies while using your medical device and
you have a "warning hygeine" process then there's a sound legal argument that
you knew there were problems in the device that you chose to instead paper
over and ignore. It doesn't matter that it makes sense to software engineers.
What you have to consider is how a "jury of your peers" will react to you
calmly and rationally explaining that a newer version of the compiler added
some new warnings, but you decided that because the code has been running just
fine for decades that it's totally okay to not address those warnings. It
raises questions about how seriously you're actually taking software quality.

Take your "misleading indentation" warning. If you choose to ignore that
warning, you're setting yourself up because I can make a great argument that
you don't care that the indentation is misleading. And in fact that you're
ignoring the hazards of following misleading indentation which is that another
person reading your code could misread it and introduce a defect. And that in
fact your policy is to allow some defects, including a possible defect which
has killed the plaintiff.

~~~
michaelt
Any organisation developing safety-critical code will already be following
rules strict enough that they have an established deviation approval and
documentation procedure.

And frankly, the idea that _people writing software where defects could kill
people_ would prefer not to be shown new defects because _fixing them is an
inconvenience_ is a pretty insulting view of that industry's professionalism
and ethics.

~~~
jschwartzi
And then I might ask you "so why did you apply for deviation on this warning?
Warnings are bad, right? So shouldn't you fix a compiler warning if at all
possible?"

And then you might reply "Well no because this particular warning would
require us to change some code that's really hard to change correctly so
instead of spending the time and expense eliminating a potential defect we
just left it in."

------
hvdijk

      struct Meow* p_cat = (struct Meow*)malloc(sizeof(struct Meow));
      struct Bark* p_dog = p_cat;
    

> Most compilers warn, but this is standards-conforming ISO C code that is
> required to not be rejected

Bollocks. That is a constraint violation, ISO C requires a diagnostic for it,
and ISO C allows that diagnostic to be an error. The constraint is in the
section "Simple assignment", which contains "One of the following shall hold:"
followed by a list detailing when assignments are valid. Pointers to different
structure types on the LHS vs the RHS are nowhere in that list.
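
To make the constraint concrete, here's a minimal sketch (the struct members
and the surrounding scaffolding are my own, only the two pointer declarations
mirror the article): the uncast assignment is the constraint violation and
must draw a diagnostic, which the implementation is allowed to make a hard
error; spelling the conversion out with a cast removes the constraint
violation, whatever else one thinks of the result.

      #include <stdlib.h>
      
      struct Meow { int m; };
      struct Bark { int b; };
      
      int main(void)
      {
          struct Meow* p_cat = (struct Meow*)malloc(sizeof(struct Meow));
      
          struct Bark* p_dog = p_cat;                /* constraint violation:
                                                        diagnostic required */
          struct Bark* p_dog2 = (struct Bark*)p_cat; /* explicit cast: no
                                                        constraint violation */
          (void)p_dog;
          (void)p_dog2;
          free(p_cat);
          return 0;
      }

Whether the cast version then does anything sensible is a separate question;
the point is only which form the standard obliges the compiler to complain
about.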

~~~
gwd
What's even stupider is that I'd be willing to bet it's also UB. If it is, it
means an ISO-C-compliant compiler is allowed to generate _code_ that does
_absolutely anything_; which makes the worries over adding a warning kind of
ridiculous.

~~~
wtallis
I wonder if this might be a case where the assignment isn't undefined
behavior, but an attempt to dereference the pointer and access the wrong
struct type's members would be.

------
noelwelsh
And this is how backwards compatibility comes to kill innovation. It's a
reasonable stance to keep supporting your long established users but it comes
at the cost of ceding the future to the competition ( _cough_ Rust _cough_ )

~~~
Oote3eep
Well, precisely: if people want modern/innovative/fast-evolving languages,
they can use Rust, Go, Elixir, etc.

I actually started using C for my side projects two years ago precisely
because I want very long-term backward compatibility (that is, being able to
leave a program for years without maintaining it, then make a small edit to it
and build it with minimum pain). C is perfect for that, and I agree with the
sentiment that backward compatibility is its most important feature.

~~~
coder543
> I actually started using C for my side projects two years ago precisely
> because I want very long-term backward compatibility (that is, being able to
> leave a program for years without maintaining it, then make a small edit to
> it and build it with minimum pain).

Rust actually takes a very strong stance on maintaining backwards
compatibility, and Go's stance is arguably even stronger in most cases. You're
implying incorrectly that these languages are just breaking things left and
right for no reason other than "innovation!", which isn't true.

The overwhelming majority of Rust and Go code from years ago will compile
without problems today. Any code from post-1.0 that doesn't compile today was
(inadvertently) relying on buggy, incorrect behaviors that have since been
fixed... and even then, not all incorrect behaviors get fixed because
compatibility is considered so important.

C is fine for certain applications, but no one should choose it for _side
projects_ based on some notion of backwards compatibility, in my opinion. If
some company is building a business application that needs "backwards
compatibility" in the sense that it can run on all sorts of arcane
microarchitectures and operating systems, then sure... C is still a really
painful* choice, but it might be the right choice then, or if there's an
existing C code base, then it probably doesn't make business sense to rewrite
it any time soon.

* yes, having no protection from footguns, no real standard library, no built-in concept of asynchronous code, and very little of anything useful is definitely painful. If C is the only valid choice for a project, then it's the only valid choice, and that's what you have to do. The number of projects where you simply _can't_ use something other than C is diminishing by the day.

~~~
jki275
Rust and Go haven't been around long enough to make those statements about
them.

~~~
virtue3
That, and while "backwards compatibility" has been held up as important in
these languages, I can attest to Rust having issues with what is "idiomatic"
Rust:

[https://timidger.github.io/posts/i-cant-keep-up-with-idiomatic-rust/](https://timidger.github.io/posts/i-cant-keep-up-with-idiomatic-rust/)

There are other similar complaints around the net.

Meanwhile, the C and C++ code I wrote over 2 decades ago only became "out of
date" as of 5 years ago. It took a while. Not 2-3 years.

Which is absolutely attributable to the language being new, not a design
fault.

C is already "settled".

~~~
jki275
I'm learning Rust now, I don't have anything particularly against it other
than the obvious, big static binaries and such, and maybe those will be fixed
eventually.

I've been able to compile and run C and C++ code from 20 years ago a truly
amazing number of times. It's really surprising how easy it is to work with
well written code even if it's decades old.

Will Rust and Go age that way? Maybe. Too soon to tell.

~~~
rational_indian
> I've been able to compile and run C and C++ code from 20 years ago a truly
> amazing number of times. It's really surprising how easy it is to work with
> well written code even if it's decades old.

Not if you have used any third party libraries. Dependency management is a
nightmare in these languages.

~~~
jki275
Not if the project is written well. Yes, dependency hell is a thing, but there
are ways to deal with it and make good code. Autotools will straight up tell
you version x.x.x of library y is required, and as long as that's available,
the problem is solved. Dependency hell is a thing in other languages too --
try to compile a really old Java project sometime.

I spend most of my time working deep in the internals of some things that are
10+ years old running even older versions of some highly (and often badly)
modified linux kernels. The well written C/C++ projects definitely stand out.

------
csours
C is a knife. You expect knives to cut you, so you handle them carefully.

Except that C is sometimes a knife with another knife hidden in the grip and
if you don't handle it just right, the hidden knife will also cut you.
(Thinking of libraries/other people's code)

~~~
craftinator
Lol I love this analogy. It's pretty much like Darth Maul's lightsaber. Yeah,
it'll cut through anything; even stuff you aren't looking at.

~~~
virtue3
Just like with C code, you better be force(memory) sensitive to even think of
wielding the space wizard laser sword.

~~~
csours
Ah, pointers, not as clumsy as objects; an elegant weapon for a more civilized
age.

(I know objects are actually just fancy pointers)

------
bigdict

      int main (int argc, char* argv[]) {
      
          (void)argc;
          (void)argv;
    
          struct Meow* p_cat = (struct Meow*)malloc(sizeof(struct Meow));
          struct Bark* p_dog = p_cat;
          // :3
     
          return 0;
      }
    

Why declare main that way if you are going to discard the arguments?

Why cast the malloc? This isn't C++.

~~~
rootbear
This is what you get when a person who is mostly a C++ coder (as per his bio)
writes C. I cringe when I see things like

    
    
      struct Bark* p_dog = p_cat;
    

instead of

    
    
      struct Bark *p_dog = p_cat;
    

That weird affectation of C++ programmers putting the asterisk on the type and
not on the declarator, where it belongs, makes my eyes bleed. I somewhat
understand the reasoning, but I think it's a gross violation of the Law of
Least Astonishment.

~~~
bigdict
I never understood the reasoning behind this, is it that "the asterisk is part
of the type, so we group it that way"?

That misses the point of the C declaration syntax: you write an expression
that when used on its own will recover the basic type. So the asterisk goes
with the symbol name, because that's how you dereference a pointer.

Further, it doesn't work if you want to declare more than one pointer like so:

    
    
      int* a, b;  /* wrong */
      int *a, *b; /* correct */
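
A small illustration of that reading (my own example): each declarator is
written the way the name will later be used, and the use-expression then has
the plain specifier type.

      int *p;       /* the expression *p   is an int */
      int a[10];    /* the expression a[i] is an int */
      int f(void);  /* the expression f()  is an int */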

~~~
rootbear
I think the issue is that programmers want a type "pointer to int", for
example, but C doesn't directly provide that type. It has a type, int, with a
modifier (asterisk) that can be applied _to a declarator_ to make it a pointer
to that type.

One way to create a pointer type in C would be to declare it using typedef:

    
    
        typedef int * int_p;
    

then one can write:

    
    
        int_p p, q, r;
    

and declare three pointers to int with perfect clarity, whereas using the C++
style, we'd get:

    
    
        int* p, *q, *r;
    

which is very confusing, or,

    
    
        int* p;
        int* q;
        int* r;
    

which is very verbose. I honestly don't know how C++ programmers typically
handle this situation.

I have seen some code that strikes a middle ground:

    
    
        int * p;
    

which is a little more clear, but doesn't address the multiple declarator
situation.

So why don't C++ programmers use typedef? I don't know, other than I
understand Stroustrup doesn't like it (not without reason).

(Edited for formatting and minor clarity corrections.)

~~~
kazinator
This is not "C++ style"; it's just a poorly considered style perpetrated by
coders who are not familiar with the grammar.

The C++ syntax is the same as C in this regard: a declaration has specifiers,
and then one or more declarators.

The exception is function parameters, where you have (at most) one
declarator.
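
For example (my own illustration): in the declaration below, "unsigned long"
is the specifier part, and "x", "*p" and "a[3]" are three separate
declarators, each shaping its own name differently.

      unsigned long x, *p, a[3];  /* one set of specifiers, three declarators */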

> _why don't C++ programmers use typedef?_

C++ programmers do use typedef. For instance:

    
    
      typedef std::map<from_this_type, to_this_type> from_to_map;
    

C++ programmers probably use typedef a bit less than they used to, because of
features like _auto_.

When a C++ class/struct is declared, its name is introduced into the scope as
a type name. Therefore, this C idiom is not required in C++:

    
    
      typedef struct foo { int x; } foo;
    

that cuts down on some typedefs. If you use a typedef for a C++ class that isn't
just a "POD", you have issues, because the typedef name doesn't serve as an
alias in all circumstances.

    
    
      typedef class x { x(); } y;
    
      y::y() // cannot write x constructor this way
      {
      }

~~~
Y_Y
It's worth noting that this nice C logic falls apart when you do something
like

    
    
        f(int &a);
    

to mean "by reference" instead of what it should be, which is "get the address
of a, and that will be an int" which is , of course, nonsense.

~~~
kazinator
I don't follow. The above is not C. It's a C++ extension over C declaration
syntax in such a way that the & is part of the declarator, just like *.

    
    
      // Inexcusable trompe l'oeil:
      int& a, b;
    
      // OK:
      int &a, &b;
    

Here, the mistake may be harder to catch, because the expressions _a_ and _b_
are both of type _int_, either way.

    
    
      // Intent: b is an alias of a.
      // Reality: b is a new variable, holding a copy of x.
    
      int& a = x, b = a;
    

I think what you mean is that the "declaration follows use" principle falls
apart for C++ references.

That is necessarily true because no operator is required at all to use a C++
reference, whereas the explicit & type construction operator is required in
the declarator syntax to denote it.

However, it has little to do with the issue that & is part of the declarator
and not of the type specifiers.

Declaration follows use also falls apart for function pointers in C, because
while int (*pf)(int) _can_ be used as result = (*pf)(arg), it is usually
just used as result = pf(arg).

Declaration follows use also falls apart for the -> notation. A pointer ptr is
always being used as ptr->memb, but declared as struct foo *ptr which looks
nothing like it.

And of course, arrays can be used via pointer syntax, and pointers via array
syntax, also breaking declaration follows use.
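
A small self-contained sketch of the function-pointer and -> cases (the names
are mine): the function pointer is declared as int (*pf)(int) but normally
called as pf(arg), and the struct pointer is declared as struct foo *ptr but
used as ptr->memb.

      #include <stdio.h>
      
      struct foo { int memb; };
      
      static int add_one(int x) { return x + 1; }
      
      int main(void)
      {
          int (*pf)(int) = add_one;
          printf("%d\n", pf(41));     /* usual call syntax, not (*pf)(41) */
      
          struct foo f = { 7 };
          struct foo *ptr = &f;
          printf("%d\n", ptr->memb);  /* use looks nothing like the declaration */
      
          return 0;
      }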

Declaration follows use is only a weak principle used to help newbies get over
some hurdles in C declaration syntax.

------
ho_schi
Backwards compatibility for code is important, as is progress in language
evolution. I have a question regarding "C has no ABI that could be affected by
this, C doesn't even respect qualifiers, how are we breaking things?!"

We have language-standard options for changes like this, such as '-std=c2x' or
'-std=c89' with GNU's GCC. I understand and accept the concern about avoiding
breakage. Furthermore, C is inherently weakly typed, contrary to C++, which is
strongly typed. That is something you probably should not change, because
those are basic language features. But the option to set the language standard
does exist for exactly this situation, to allow changes which will affect
users. So why can't it be used here?

That is not a criticism. I'm sure they have their rationale for it and know
more than me.

PS: Some changes will break the ABI; in those cases we will likely see a
preprocessor variable or something like that, which is more complicated. The
GCC people used one for some changes to std::string, if I remember correctly.

~~~
godshatter
I was wondering this myself. Those million-line codebases where they are
worried about new warnings breaking things should already be using a compiler
that doesn't know about the new standard, or one that lets them set which
standard their code follows. I guess I don't understand the problem. Does
something like MISRA assume you are using the latest standard? I understand
there are regulations involved.

------
loriverkutya
"We will not make it easier for new programmers to write better C code." \-
well, that escalated quickly

------
Upvoter33
one should never get in the way of a good rant.

more seriously, what I'd like to see in C (as a long-time programmer in C) is
less freedom around undefined behavior. I used to feel like the biggest
mistakes made in C were around pointer bugs, but you can be careful and get
things like that (mostly) right. Undefined code is a lot harder to see and
avoid without a very deep understanding of lots of small details.

~~~
wtetzner
It would be very nice if there was a special C compiler that could just warn
about all occurrences of undefined behavior in a program. It doesn't even need
to be able to generate code, it could just be a front-end that points out the
places where undefined behavior is either being invoked, or could be invoked
depending on the input to the program.

~~~
MauranKilom
Does this piece of code have UB? Could it invoke UB depending on the input to
the program?

    
    
        void count(int x)
        {
          for (int i = 0; i < x; ++i)
            printf("%d ", i);
        }
    

The answer to the latter question is of course "yes" - signed integer
overflow is UB, so you invoke UB by passing a negative x.

Would you like every loop to be flagged as potential UB? I don't think you'd
last a single day programming in that C dialect.

~~~
wtetzner
You wouldn't use the compiler to build your programs, you'd just use it to
detect UB.

It would be especially useful if you could specify your entry point(s), and
let it find all cases where user inputs could cause UB.

I think it would be especially helpful for checking the output of compile-to-c
languages.

------
jedisct1
Take a look at Zig. [https://ziglang.org/](https://ziglang.org/)

It fixes most of the C mistakes, while still giving programmers tight control
over the system.

------
kps
The real problem with n2526 is that it proposes fixing the return types of
locale functions, rather than deprecating that hot mess entirely.

------
da39a3ee
Please tell me if I misunderstood but this is what I thought I was reading
here:

- The author is someone quite young (undergrad age) who is serving on a C
language committee (that I assume is mostly made up of people who are over 40,
probably mostly over 50).

- The author is not only donating his own time to C language committee work,
but also clearly knows what he's talking about regarding C.

The article came across to me as thinly-disguised frustration/anger that the
committee had no interest in making C "safer".

My takeaway was that the article very much fitted in with all the articles
one sees being positive about Rust not just being of academic / hobbyist
interest but being a serious contender for a replacement in many industry
contexts.

------
app4soft
> _C will never stop you from making mistakes_

As making mistakes is human nature, and _C_ will never stop humans from
making mistakes, _C_ will keep humans natural forever!

------
torh
Well, I liked the part about (* vague gesturing towards outside *)

------
staticassertion
What does it mean to be the 'Project Editor' of C?

~~~
steveklabnik
My understanding of the role is that it's their job to literally edit the
standard, that is, they take the papers that have been accepted, and apply
them to the standard's text to produce the next draft of the standard.

------
panpanna
Actually, "C will never stop you from doing <things>".

------
choeger
Sounds like a bad case of the tail wagging the dog to me.

------
userbinator
...and that's a good thing.

Console homebrew. iOS jailbreaking. Android rooting. Those are only some of
the freedom-enabling things this and other "insecurity" allows. It's not all
bad --- and IMHO it's necessary to have these "small cracks", as it keeps the
balance of power from going too far in the direction of the increasingly
authoritarian corporations.

I always keep this quote in mind: "Freedom is not worth having if it does not
include the freedom to make mistakes."

~~~
ameliaquining
This has nothing to do with any of that. Absolutely nobody is proposing that
it shouldn't be possible to write code that reads from and writes to arbitrary
registers and memory addresses, even though this obviously makes complete
memory safety impossible. (I mean, there are legitimate use cases for
sandboxing and VMs and what have you without escape hatches, but there are
also legitimate use cases for not-that.)

This is about providing better compiler diagnostics. Such diagnostics can't
catch every mistake, as long as we require the aforementioned ability to
perform arbitrary operations, but they can catch a lot more mistakes than
they're catching now.

------
brazzy
Oh boy. The conclusion sounds pretty scathing.

------
pm24601
Yet more worshipping of the fear of change. Simply speaking: if the code is
suspicious, it should be treated as such.

I am not tied to some definition of perfection but rather to the practical.
Developer tools help write safer code. If that code is running dangerous
equipment, this is even more important.

The expectations and quality need to be raised, ESPECIALLY in operating
systems, device drivers and, yes, code that has been running "just fine" for
years.

------
ddevault
Be advised that every new C project written by experienced developers begins
with `-Wall -Wextra -Wpedantic -Werror`

~~~
account42
-Werror is a bad idea for open source projects or anything that will be compiled by people who do not know how to fix things when their shiny new compiler added a fancy warning.

-Werror=... for specific warnings might be OK in some cases.

~~~
ddevault
They can just remove -Werror when compiling. It's useful enough to keep in
place by default if you consider warnings bugs (and you ought to).

~~~
saagarjha
So would you agree that it might be a useful thing to use for local
development, but not for code you want other people to compile?

------
dooglius
This is a dumb argument. The proposal was to _add a new kind of undefined
behavior_ that breaks a bunch of existing code, and this is spun as _not_
helping?!

~~~
kevincox
That's not what the article claims. It claims that it was only adding a
warning.

------
bionhoward
sounds like a stealth argument for rust?

------
quelsolaar
If the C standard org was serious about code safety the FIRST thing they would
do is PUBLISH THE STANDARD!!!!

Complaining that people don't follow the finer details of the standard while
at the same time keeping the standard unavailable to the vast majority of C
programmers is a travesty. When compiler writers think that it's OK to put
"optimizations" into compilers that remove vital NULL checks, because the spec
says that something which makes no sense may be UB, I think to myself: what
would they say if they downloaded the latest version of their favorite text
editor, only to find that it would format their system disk whenever the user
pressed Control? When they then reached out to the maker of the text editor,
the developers would answer: "Oh, on page 204 of the documentation that costs
money to access, it says that pressing Control is undefined, so we are within
our rights to format your system disk." Would they be OK with that and think
it was fair? That's how the C standard body is behaving!

NOBODY learns C from the C standard, and that is your fault, so don't complain
about people not following it. The fact that you are also fucking it up
doesn't help:
[https://news.quelsolaar.com/2020/03/16/how-one-word-broke-c/](https://news.quelsolaar.com/2020/03/16/how-one-word-broke-c/)

~~~
steveklabnik
Isn't this situation due to ISO rules, and don't they publish a "draft" that
is identical to the actual standard before hitting publish, to get around
those rules?

Or maybe you're talking about something I don't understand.

~~~
quelsolaar
Yes, but this arcane way of working isn't OK when the entire open-source
world depends on the language to produce functioning code.

~~~
steveklabnik
While I also prefer other processes, for many people, ISO standardization is
really important, and so to me it seems like they're doing the best they can,
given that constraint.

------
eerimoq
Of course the language can help the programmer to write better code. High
level constructs and easily available libraries that are well tested and
widely used help a lot. However, the biggest problem is not the language,
it's that the programmer simply writes faulty code.

~~~
speedgoose
So you have two main solutions to this problem: have all programmers never
ever write faulty code, or design and use programming languages that are safer
and less error-prone.

One is impossible, while the other is already applied.

~~~
eerimoq
In my experience most bugs are due to misunderstanding requirements and simply
writing faulty logic. Just a few bugs are related to the language itself.

~~~
hazz99
In higher-level languages you can make a lot of faulty logic inexpressible in
your code, which eliminates a ton of bugs.

~~~
eerimoq
That's pretty much what I wrote in the first post.

------
me_me_me
A short summary.

"I tried punching myself and C let me do it. Now I am ranting how C gave me a
nose bleed."

If you are using C for projects that are big and abstract then you are THE
problem for picking wrong tools for your task. I killed a mosquito with a
hammer but it left a hole in my wall => hammers are terrible tools.

~~~
59nadir
This isn't even remotely what the post is about. It's about how the C standard
is resistant even to things like warnings for provably bad code in the name of
backwards compatibility ("more warnings for (maybe) working code = bad"). Did
you even read the post?

------
vandal_at_your
When you demonstrate that your ECNL (equally crappy new language) merits an
investment of time, guaranteed not to break in perpetuity against future
programmers and changing standards, I'll make an effort to learn it. Mind, it
must be as free (to use, modify, permute, and as low-level) as C. I must be
able to do what I need to do without being second-guessed because it's
dangerous. There are some people who know when to be dangerous, and slowing us
all down to the speed of the slowest bootcamp grad or the most avaricious
exploiter of minimal talent isn't progress. Otherwise I'll stick with the
libraries I know, trust, and have written in C. This article is great
mudslinging, btw.

------
mlthoughts2018
Neither will Haskell, Rust, etc. The extra classes of mistakes they can
prevent at compile time just aren’t a meaningful volume of mistakes to make a
practical difference in the lives of any developers apart from a few niche
system engineering use cases.

If you think languages should facilitate type system design patterns that
render large classes of application level mistakes impossible, you are just an
architectural astronaut falling prey to premature abstraction and unaware that
these languages aren’t making your application code safer or more reliable,
only more brittle to the inevitable needs to break its core abstractions to
solve expanding use cases.

~~~
ameliaquining
This is a very strong claim. Do you have evidence for it?

(Yes, I'm aware that there've been a bunch of studies that didn't find
decreased "bug density" in open source repos using statically typed languages,
but there've also been studies that found the opposite, and in any event the
methodology behind all of these is dubious. Example saga:
[https://hillelwayne.com/post/this-is-how-science-happens/](https://hillelwayne.com/post/this-is-how-science-happens/))

~~~
mlthoughts2018
I disagree it’s a strong claim. I think the claim that strict functional
programming or type system enforcement of safety has no data supporting it
significantly improves anything (defect rate, speed of development, security,
etc.).

The strong claims come from evangelists of those extreme programming
paradigms. You should be asking _them_ for proof that consists of more than
anecdata.

It’s backwards to say that essentially what is a historically validated null
hypothesis with 50 years of development history on its side is “a strong
claim” that requires special evidence, while giving a free pass to all the
people using little more than blog posts and slick syntax to claim these
extreme design patterns are demonstrably better.

If they are so much better, where are all the companies getting free lunches
just by switching to these tools? How is it that the entire industry is so
irrational that so few companies are willing to switch?

Superior ways of working catch on very fast, just consider the radical
adoption of GitHub and no-sql data systems. Why is strict functional
programming not seeing that? What mental gymnastics does it require to take as
a premise that strict functional programming is “better” yet adoption rates
are super low and successes are not proved with data, only anecdotes?

~~~
wtetzner
> It’s backwards to say that essentially what is a historically validated null
> hypothesis with 50 years of development history on its side is “a strong
> claim” that requires special evidence

I really don't see your point here. Are you suggesting that because people
have been able to write software in C for 50 years that type systems aren't
useful? Because it's not like those programs are bug free. It also doesn't
take into account how much effort is needed to build and maintain that software.

Also not really sure what you're getting at about strict functional
programming. This post was just about type checking.

> Neither will Haskell, Rust, etc.

Rust is really in no way a functional language.

~~~
mlthoughts2018
I don’t see _your_ point. What does the bugfreeness of historically used
approaches have to do with it? The new kids on the block have to _prove_ they
are better in some tangible way. The onus is not on established programming
languages to prove anything.

As far as evidence is concerned, you’re not going to solve problems faster,
safer, cheaper, more reliably or more extensibly with Rust or Haskell than you
are with C or Python, apart from some very niche exceptions.

That's just an empirical observation, not an opinion.

~~~
wtetzner
> The onus is not on established programming languages to prove anything.

Why is there no onus on "established" programming languages (whatever that
means)? It's not like production software hasn't been shipped in Rust,
Haskell, OCaml, etc. Just because C is older it gets a free pass?

> As far as evidence is concerned

I would argue there's plenty of evidence that languages with better type
systems are valuable. What we don't have is _proof_, but that's because nobody
has figured out how to do the experiment(s).

~~~
oalae5niMiel7qu
No, it's because _either_ nobody has figured out how to do the experiments,
_OR_ because their hypothesis is invalid and they merely haven't _dis_proven
it.

~~~
wtetzner
Sorry, I worded that poorly. What I should have said is that we don't have
proof/disproof, because nobody has figured out how to do the experiment(s)
needed to get there.

