Hacker News

Your rationale in this and your followups is exactly what I'm talking about.

1. You're actually right if the entire program is less than about 20 lines. But programs always grow, and implicit declaration will inevitably lead you to a bug that is really hard to find.

2. The trouble comes from programmer typos that turn out to be valid syntax, so the compiler doesn't complain, and people tend to be blind to such mistakes, so they don't see them. My favorite real-life C example:

    for (i = 0; i < 10; ++i);
    {
        do_something();
    }
My friend who coded this is an excellent, experienced programmer. He lost a day trying to debug this, and came to me sure it was a compiler bug. I pointed to the spurious ; and he just laughed.

(I incorporated this lesson into D's design: a spurious ; produces a compiler error.)

3. I used to work for Boeing on flight critical systems, so I can speak to how these things are really designed. Critical systems always have a backup. An assert failure means the system is in an unknown, unanticipated state, and cannot be relied on. It is shut down and the backup is engaged. The proof of this working is how incredibly safe air travel is.




> 3. I used to work for Boeing on flight critical systems, so I speak about how these things are really designed. Critical systems always have a backup. An assert fail means the system is in an unknown, unanticipated state, and cannot be relied on. It is shut down and the backup is engaged.

I ask you to reconsider your assumptions. How did this play out in the 737 MAX crashes? Was there a backup AoA sensor? Did MCAS properly shut down, and was a backup engaged? Was manually overriding the system not vital knowledge for the crew?

You don’t have to answer. I probably wouldn’t get it anyway.

But rest assured that I won't try to program flight control, and I strongly appreciate your striving for better software.


> How did this play out in the 737 MAX crashes?

They didn't follow the rule in the MCAS design that a single point of failure cannot lead to a crash.

> Was manual overriding the system not vital knowledge to the crew?

It was, and if the crew followed the procedure they wouldn't have crashed.



