
Fun to consider as both a computer scientist and a CFI.

Instrument training in FAA-land requires learners to understand the five hazardous attitudes: anti-authority ("the rules don't apply to me"), impulsivity ("gotta do something now!"), invulnerability ("I can get away with it"), macho ("watch this!"), and resignation ("I can't do anything to stop the inevitable"). Although the stakes are different, they apply to software development too. Before a situation gets out of hand, the pilot has to recognize and label a particular thought and then think of the antidote, e.g., "the rules are there to keep me safe" for anti-authority.

Part 121 or scheduled airline travel owes its safety record to many layers of redundancy. Two highly trained and experienced pilots are in the cockpit talking to a dispatcher on the ground, for example. They're looking outside and also have Air Traffic Control watching out for them. The author mentioned automation. This is an area where DevSecOps pipelines can add lots of redundancy in a way that leaves machines doing tedious tasks that machines are good at. As in the cockpit, it's important to understand and manage the automation rather than following the magenta line right into cumulogranite.
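The pipeline-as-redundancy idea can be sketched as a merge gate of independent checks, each catching a class of mistakes the others (and the humans) may miss. This is a minimal sketch; the stage commands here are placeholders, not real tools:

```shell
#!/usr/bin/env sh
# Sketch of a merge gate as layered redundancy: independent stages,
# each one a separate layer watching for a different class of error.
# The `true` commands are placeholders for your real tools.
set -eu

stage() {
  name="$1"; shift
  echo "stage: $name"
  "$@" || { echo "BLOCKED at $name"; exit 1; }
}

stage "lint"  true   # e.g. your static analyzer
stage "tests" true   # e.g. your test runner
stage "audit" true   # e.g. your dependency scanner

echo "all stages passed: clear to merge"
```

As in the cockpit, the point is that each layer is independent: a failure in one does not silently pass through the others.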




Former airline pilot checking in!

Remember the importance of checklists in the "grand scheme of things". They help maintain proper "authority" during operation and make sure you don't forget things. If you don't write it down and check it, someone, at some point, will forget something.
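The "write it down and check it" idea can be made literal: a checklist where every item is verified by a command rather than trusted to memory. A minimal sketch with hypothetical items; swap in your own verifications:

```shell
#!/usr/bin/env sh
# Minimal checklist runner (hypothetical items): each item is written
# down and verified by a command, instead of being left to memory.
set -u

pass=0; fail=0

check() {
  item="$1"; shift
  if "$@" >/dev/null 2>&1; then
    echo "[x] $item"; pass=$((pass + 1))
  else
    echo "[ ] $item"; fail=$((fail + 1))
  fi
}

# Placeholder commands; replace with your real checks
# (e.g. "changelog updated", "migrations applied").
check "checklist file is named"  test -n "CHECKLIST"
check "shell is available"       command -v sh

echo "$pass passed, $fail failed"
[ "$fail" -eq 0 ]
```

The exit status makes the checklist enforceable: wire it into CI and a skipped item blocks the release instead of being quietly forgotten.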

Also, the "Aviate, navigate, communicate" axiom (as the author mentions) is really helpful if you're trying to set up incident/crisis response structures. You basically get your guiding principles for free from an industry that has 100+ years of experience in dealing with crises. It's something I teach during every incident/crisis response workshop.

edit: Although it's not aviation specific, and a little light on the science, "The Checklist Manifesto" by A. Gawande is a nice introduction to using (and making) checklists.


And the value of good documentation, and of actually reading that documentation, as well as making sure it's indexed and quick to peruse in a situation where you don't have time to waste.


Great observation. You can easily and routinely see all five hazardous attitudes in software development, especially in small companies and startups where there is sometimes no formal process in place. I wonder if you could measurably improve your software by focusing on those attitudes during interviews…?


IIRC the five hazardous attitudes are required material for all pilots not just IFR.


Another good one: Warnings, cautions, rules etc are often written in blood.


I feel like when applied to software the "invulnerability" point needs to be tweaked a little; the others are good. Perhaps something more towards apathy: "it doesn't matter / it isn't worth fixing". It's the same end result (the consequences won't track back to me), but it's much more likely to be true in software development, and yet it is still a hazardous attitude.


> anti-authority ("the rules don't apply to me")

Of course, in aviation the 'authorities' are usually rational and fair. In many other areas of life they are neither, and are incompetent to boot. Being anti-authority is justified in such cases; there is even a moral responsibility to disobey unjust laws.


Authority is relative and is more nuanced. Only in recent human history have we seen a deliberate separation of church and state, for example. Prior to this, they were intertwined to a degree that would be incomprehensible to us now.

As a new pilot myself, I can say with confidence that the FAA has some major flaws, and the US Congress has been able to get its dirty hands into aviation policy, enacting rules that did not originate from NTSB recommendations.


The software industry prides itself on a low barrier to entry, and ageism is absolutely a factor. Then, in the workplace, the authorities are things like the "Airbnb code guidelines", blog posts (or even tweets!) by evangelists sponsored by Google/Amazon/Meta, or design mockups by a designer who hates checkboxes.


The attitudes are interesting.

WRT development, I wonder if there are attitudes that can be applied to software and hardware design that combat bad systems.

For example, cars with touchscreens instead of individual controls.



