Can Boeing Trust Pilots? (airfactsjournal.com)
81 points by skybrian on Mar 14, 2019 | 55 comments

Assuming the author is correct, and the correct reaction to an MCAS misfire is a simple procedure that every pilot should know by memory: is it really acceptable that once every 3 months a 737 MAX will attempt a nose dive and require a vigilant pilot who can identify and correct the issue before the plane crashes into the ground?

And this likely happened at least twice while there were 300 MAXs in service. If there were 3,000 MAXs in service, MCAS misfires would presumably be happening about 3 times a month worldwide, each misfire requiring a proper pilot reaction. How can you defend Boeing in that case?

I feel like this is a strong argument for Airbus alpha protection. And I think Boeing agrees, given the way the 777 fly-by-wire system is designed.

There are plenty of people (on this site and on reddit) who jump in to criticize Airbus for the way it puts the aircraft's computer systems ahead of the pilots, and who claim that the 737's flight controls are preferable because the pilots have full control. I don't want to spark an Airbus vs. Boeing debate, but I want to point out that these accidents seem to discredit that line of thinking.

Before we get excited about computer protections, let’s acknowledge that the MCAS system under debate on the 737 Max is itself a computerized system designed to protect pilots from themselves—in this case, from pitching too high and creating a stall.

In the Lion Air accident, a malfunctioning sensor caused this system to react incorrectly, surprising the pilots.

Let’s also remember that a malfunctioning sensor was a contributing factor in the crash of Air France 447... a problem which was exacerbated by human interface choices in Airbus’s fly-by-wire system. In particular, stick inputs are not physically synchronized, and conflicting inputs were averaged; neither is true on a Boeing aircraft. Even on a 787 the yokes move together.

Human factors matter a lot if the computers hand control back to the humans unexpectedly. That can happen even in fly by wire systems on Airbus.

Edit: fixed 737 reference up top

Unclear why you're trying to conflate the MCAS issue with AF447, or why you're bringing up AF447 at all.

On the Max 8, a bad sensor is causing an automated system to misbehave, and may have resulted in crashes.

On AF447, ice blocking the pitot-static system caused automated systems to go offline (entering Alternate Law), and loss of situational awareness caused the pilots to crash their aircraft under manual control.

If you read the final report[0] you'll see that pilot error and loss of basic airmanship caused the accident. There are always improvements to the airframe that can mitigate the human component, but acting like these two accidents are somehow two sides of the same coin is disingenuous.

I'm not actually sure why you brought it up. Comes across as "look they're both just as bad" while ignoring key details about AF447.

[0] https://en.wikipedia.org/wiki/Air_France_Flight_447#Final_re...

> the system rejected the data as invalid and temporarily stopped the stall warnings. However, "this led to a perverse reversal that lasted nearly to the impact: each time Bonin happened to lower the nose, rendering the angle of attack marginally less severe, the stall warning sounded again—a negative reinforcement that may have locked him into his pattern of pitching up", which increased the angle of attack and thus prevented the aircraft from getting out of its stall

From your link. In my mind that is the critical piece. The pilot did the right thing, pushed the nose down, heard the stall warning again, and then did the wrong thing, pulled the nose up, because that made the stall warning go away. Having been in a plane upside down at night, pointed toward the ground, I can imagine the stress on the crew. It’s easy to say dump the nose when the stall warning goes off, but what if they had been upside down? Then he would have been doing the right thing by listening to the warning. The fact that the stall warning cut off was the final critical link in the accident, not pilot error.

AF 447 and Lion Air 610 are related in a way that does not have anything to do with Airbus vs. Boeing. In both cases, the crew experienced a situation that differed in some way from those they had explicitly trained or been prepared for (in the first case, because it was considered implausible, and in the second, because Boeing had not told pilots about it.) Nevertheless, in both cases, it is somewhat surprising that they were unable to recover from it (the Southwest pilots' union was upset at Boeing for not disclosing MCAS, but the American pilots' union did not think it was a big deal, and neither group seemed to think an MCAS failure would present a serious problem for a prepared pilot.)

After Boeing has made MCAS triply-redundant (or whatever else they have to do to get these airplanes flying again), questions will remain about why nominally well-trained people sometimes perform disastrously below expectations when faced with something unusual, and what can be done about it. Is it possible, for example, that highly-prescribed training adversely affects flexibility and resilience in problem-solving?

This is exactly the type of crowd I was referring to earlier. There seems to be a tendency on this site, and on reddit, to accuse Airbus of having an inferior philosophy because the sidestick doesn't provide any feedback. And while some of those criticisms can be fair, there is a tendency to bring this argument into any discussion about the differences in cockpit design philosophy between Boeing and Airbus. They always bring up AF447 and attribute that accident to the Airbus FBW system. Even with the Sully landing, questions were raised about how the aircraft limited Sully's ability to control it, completely missing the fact that on dual engine failure the FBW system degrades to alternate law.

The fact that Boeing itself chose to implement flight envelope protections and FBW systems on its newer planes shows that the industry in general is in favour of them; pretty much the only difference now is that the Boeing yoke provides feedback.

I don't think Airbus has an inferior philosophy compared to Boeing, but I'm sure Airbus has a different philosophy from Boeing, and each philosophy comes with its own set of tradeoffs. And neither is a panacea. For me, that's the lesson of AF447.

Adding Airbus envelope protection to the 737 Max 8 would not have prevented the Lion Air crash because the root problem was bad sensor data. Sending bad sensor data into a flight computer will give you bad output, even in an FBW system.

Conversely, if the AOA sensors had been operating correctly, or triply redundant to mitigate failure, then MCAS would have worked correctly and safely, even without Airbus-style "normal law" FBW.

How to safely integrate imperfect automation with human control is a major developing issue, and not just in aviation. The same concerns are coming to light in cars and trucks with features like smart cruise control, lane-keeping assist, highway "autopilot," etc.

On the surface it seems like greater and better automation will make everything safer, and maybe it does in the aggregate, by covering a lot of low hanging fruit. Antilock braking and traction control certainly fit into this category in cars.

But as automation gets smarter, it means that the situations where humans are required to take over from the automation will become less frequent and more unusual. So systems need to be designed to not only protect humans with automation, but also to alert and orient humans as quickly and accurately as possible when they need to step in and take over. This is why I happen to think that physically synchronizing stick/yoke movement is a good idea; it's one less thing to have to talk about in an emergency.

I don't blame Airbus FBW for the AF447 crash, but I don't blame the pilots either. I think it was the complex interaction of the two that caused the crash. That's a very complex problem to solve, and it's one that we will have to keep addressing until automation is so good that it never hands off to a human for any reason.

Did you read the comment I’m replying to?

The common element between Lion Air and Air France crashes is bad sensor data: AOA and airspeed, respectively. A computerized system, no matter how sophisticated, cannot protect pilots from themselves in the absence of accurate sensor data.

> The common element between Lion Air and Air France crashes is bad sensor data

Agreed, but the context is completely different. The Airbus flight computers rely on data from three sensors and thus are triply redundant. AF447 happened in severe icing conditions, with water having ingressed into the pitot tubes on the ground. All three pitot tubes froze, causing all three sensors to report bad data. The aircraft identified the discrepancy and degraded from normal law to alternate law, as expected, so it would not make any decisions based on the faulty data. The crash ultimately was caused by the pilots being disoriented in very low visibility. The sidestick issue is a different discussion, but it's not related to the 737 MAX accidents.

The MCAS system, on the other hand, is not triply redundant, and that's what people take issue with. Boeing itself has more redundant envelope protections on newer models like the 777 and 787.

It is perfectly possible to fly an aircraft with the loss of certain instruments. And even if the plane is in a stall, there are established procedures for getting out of it that don't involve flying the plane into the ground. But the computer would need to be prepared to disregard some data in favour of other data.
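The triple-redundancy-with-degradation idea described above can be sketched in a few lines. This is a hypothetical illustration, not how any real flight computer works: the function name, tolerance, and readings are all invented, and real voting logic is far more involved.

```python
# Hypothetical 2-out-of-3 sensor vote: use the median reading, and flag a
# degradation (e.g. dropping out of normal law) if any sensor strays too
# far from the median. The tolerance value is made up for illustration.
def vote(readings, tolerance=2.0):
    assert len(readings) == 3
    median = sorted(readings)[1]
    degraded = any(abs(r - median) > tolerance for r in readings)
    return median, degraded

# All three sensors agree: usable value, no degradation.
print(vote([10.0, 10.1, 9.9]))   # (10.0, False)

# One sensor wildly off: the median is still usable, but the
# disagreement is detected and flagged.
print(vote([10.0, 45.0, 9.9]))   # (10.0, True)
```

The point of the sketch is that three sensors let the computer both mask a single failure and know that something failed; with the single-sensor input described upthread, neither is possible.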

For those of us non-experts, I am glad that it was brought up because that accident immediately came to mind for me as well, and I was curious about the differences between them. Thank you for the comparison!

No, MCAS is fundamentally different from the envelope protection on Airbus aircraft, and AFAIK from the protections on the 777. The Airbus aircraft sensors have triple redundancy, and if even one of the sensors disagrees with the others, the FBW system degrades from Normal law to Direct law.

In the case of Air France 447, all three pitot tubes malfunctioned due to icing conditions well outside the expected range required for type certification of that aircraft. The aircraft also did degrade down from normal law, as was expected.

Admittedly, the sidesticks on the Airbus do not provide feedback, but both pilots ignored the aural "dual input" warning, as well as the warning indication on the panel.

Oops, did you mean 737?


Just another case of humans being unable to accept the fact that computers are better than them.


Curious if you're really suggesting Boeing paid this guy to write the post, or just frustrated with people saying there's not an aircraft issue.

I don't agree with the article either, fwiw, but I suspect it's just his unpaid opinion.

I’ve learned you don’t need to pay people tit for tat. People naturally reward kindness with kindness. Five years back, Boeing may have given this writer a number of unnoticeable freebies every other pilot has access to, and this pilot feels like part of the tribe.

Here’s what I don’t get about this whole situation:

AIUI the 737 MAX has an instability such that, in near-stall conditions, some attempts to recover can make the stall worse. To mitigate this, Boeing added MCAS, and MCAS can malfunction after a single sensor failure. Imagine that this failure occurs and the pilot successfully turns off MCAS but ends up in a dive, too close to the ground, or otherwise in a bad situation. Now the pilot has to recover, but they are facing a faulty AoA indicator (if they have one at all) as well as a plane that, because MCAS is off, is unstable in near-stall conditions. And the pilot has never been trained to handle the 737 MAX under these conditions.

Am I wrong for some reason, or is this a potentially rather dangerous situation that could be caused by a single instrument failure?

You're not wrong by my estimation. I'm not a pilot, but if a function meant to prevent a stall activated at low altitude (say, below 5,000 ft), then depending on how aggressive the MCAS input was, I can't imagine recovery would be easy.

I don’t believe this author understands the issue here. His description of the force feedback on the wheel is not correct. The automatic trim adjustment from the MCAS would not be detected if the pilot is trimmed in and at neutral stick. They may detect the pitch change and pull up/retrim for level flight, which disables the MCAS temporarily and gives the impression that trim is under control.

The author understands the issue fine and their description is correct. MCAS is a trim-and-feel system that exists to satisfy the stick force gradient requirements of 14 CFR 25.143 [1] within a relatively normal flight envelope. A stick pusher is a stall warning/prevention device that exists to satisfy 14 CFR 25.207 and 25.103, and which doesn't activate in a normal flight envelope.

[1] https://www.law.cornell.edu/cfr/text/14/25.143

The author does not say that MCAS works as a stick pusher, and does not describe it as if it did. He writes: "When the AOA reaches a point without enough stall margin, the system adds some nose-down pitch trim. That pitches the nose down and gives the pilot the stick force to know that he is pulling too close to the stall margin" [my emphasis]. Given that the trim operates by changing the angle of incidence of the stabilizer, and apparently only provides feedback at the yoke through the aerodynamic consequences of the trim change, is that inaccurate in some way? Maybe the threshold is as much one of stability margin as stall margin, but that's not significant with respect to the issue here.

> What’s critical to the current, mostly uninformed discussion is that the 737 MAX system is not triply redundant. In other words, it can be expected to fail more frequently than one in a billion flights, which is the certification standard for flight critical systems and structures.

How was the plane delivered if it didn't pass certification?

It's grandfathered under the original 737 type certificate from the 1960s.

The aircraft would not meet today's certification requirements as a new type.

The FAA reviewed the 737 Max 8 following the Lion Air crash and decided to allow it to keep flying.

Now obviously they could have made the wrong decision there, but the idea that this plane is somehow skating by with no oversight via a technicality just does not pass the smell test.

Edit to add: The 737 Max Wikipedia page details the testing that the Max variant underwent, and links to the 737 type certificate—which devotes 38 pages to the 737-8.


> Boeing announced on February 16, 2018 that the 737 MAX 9 has received an amended type certificate (ATC) from the U.S. Federal Aviation Administration (FAA), officially certifying the airplane for commercial service.

> This certification marks the culmination of a successful flight test program that began in March 2017 with two Boeing flight test airplanes. The FAA certification affirms that the airplane’s handling, systems and overall performance all comply with required aviation regulations.


As someone with limited knowledge into airplane engineering/certifications, on the surface that seems wrong.

I understand (I think) that they made changes to make it essentially handle and act like the 737 therefore it can apply to the same certificate, but surely there should be some kind of time restriction on a certificate applying to new models - even ones that are "similar".

Can someone enlighten me as to why this does or does not make sense?

Boils down to money. A brand new type certificate for a big passenger jet is in the hundreds of millions of dollars in time and money. Therefore you can justify spending quite a lot on lobbying (actual and metaphorical) to get away with reusing one.

Edit: And the airlines will lobby on your behalf too. They don't want to have to certify pilots on a new type, or have yet another split of who can fly what.

This has been mentioned in the discussions, e.g. the "red flag" comment: "The fact that this airplane requires such jury rigging to fly is a red flag."[1] What happened, I think, is deceptive naming: a fairly different model was given the name of a safe plane. I think this simplified and lessened the certification requirements for the new model. The other models of the 737 are among the safest in the world.

[1] https://www.independent.co.uk/travel/news-and-advice/boeing-...

I think this is a misunderstanding. He's not saying the plane didn't pass certification.

The standard he's referencing is for "flight critical" things that the pilot can't compensate for or do without.

Does it have any choice? The official FAA incidents log has US pilots stating that the MAX is different to fly from normal 737s...

...and using their own words, is "unsafe!" to fly without additional training. It is unsafe to fly a plane with different layout and handling.

Did you read TFA?

It seems like FBW is the way to go, and this particular subsystem was tacked on, not adequately tested, and missing redundancy. The main problem is greedy executives trying to pass it off as a normal 737 when it was completely different. So some executives at Boeing and the FAA should go to prison.

MCAS needs another layer of redundancy, and the whole thing needs to be entirely retested and certified as a new type.

I found this article informative on several points, and the conclusion is correct ("I am sure the future belongs to FBW"), but it sets up a test that has already been failed. Can Boeing trust pilots? Well, if that's how he wants to put it, then no, evidently they cannot. We're talking about two crashes in a short period, in which hundreds of people died. There's a kind of tautology here.

Of course we can’t trust pilots; that’s why airliners have at least two of them, and checklists for everything, and all sorts of warnings, etc.

We just had a truly historic near-miss where a crew lined up to land on a crowded taxiway at SFO instead of the runway. If they had not gone around at the last second, we would be talking about it as the deadliest accident in aviation history. Fly-by-wire was not going to prevent that; the descending aircraft was an Airbus A320.

Whether fly-by-wire makes pilots more trustworthy—that’s a good question.

But I think the author is addressing a subtly different question, which is: what seems more trustworthy to people (the public, politicians)? Is it still acceptable to say “the human, despite its flaws, still belongs in the critical path of safety”? Or is the reputation of computers now so strong that all planes will have to work that way to satisfy modern notions of what is safe?

The sad thing is that more than anything else, this is starting to veer into the realm of a philosophical question:

Who is more reliable, the instrument, or the instrument user?

A machine can only be guaranteed to function on information it has. Same with the human being behind the yoke.

The human by far has the greatest capacity as the only General Intelligence in the room, but technically, that GI's performance is limited by its access to accurate information regarding the functionality of the tool he is using.

Something about this piece doesn't smell right. The author claims to be a pilot with 45 years of experience, writing in a publication that bills itself as "by pilots, for pilots", but I'm a pilot (with 23 years of experience) and I've never before heard a pilot refer to the yoke as "the wheel". ("If you don’t believe me, sit in a Bonanza A36 or Cessna 210, or just about any other high-performance piston airplane, and pull the wheel back while sitting still on the ramp.")

I'm also very skeptical of his thesis that these accidents were pilot error, that there is nothing different about the Max 8 than any previous generation of the 737. The 737 in general has an excellent safety record. The Max 8 has had two crashes within two years of its introduction. The odds of that just being a coincidence are pretty damn low.

So I can't help but wonder if this guy is a shill. Maybe the question we should be asking is: can pilots trust Boeing?

From [this source], "McClellan has logged more than 10,000 hours as pilot-in-command, flying everything from a 1946 Cessna 140, his first airplane, to the Cessna 162 SkyCatcher and virtually all general aviation airplanes that have been in production over the past 30 years. He holds an ATP certificate for multi engine airplanes with type ratings in several business jets, has a commercial certificate for helicopters, and is a CFI-I."

[this source]: https://www.aviationpros.com/home/press-release/10393553/mac...

And the Cessna 140 has a yoke that looks like a (oblong) wheel. https://upload.wikimedia.org/wikipedia/commons/thumb/a/a9/19...

Then you simply haven’t talked to many older pilots. Yoke, control wheel, wheel for short: they’re all the same fucking thing. This seems like a dumb way to shill, no?

Look at the cockpit of a DC-3, or pretty much any pre-war design that doesn’t use a stick. It will educate you.

Also, Boeing is a multi-billion-dollar company; if they’re going to shill, I’m sure they’d have no trouble finding an actual pilot to do it for them. We’re not that rare.

And they were wheels in the days before power-boosted controls because you had to apply a fair amount of force in using them. Flying a WWII bomber in close formation for hours at a time was a physical job.

Pilots have reported to the FAA database several incidents of unexpected nose dives after engaging the autopilot. Even if pilots are trusted to fix things in very rare cases, relying on that as a routine matter seems wrong.

Boeing already had a fix in the pipeline (a software fix, different instructions for pilots, and an updated manual), but the government shutdown seems to have delayed the FAA approval process.

> The odds of that just being a coincidence are pretty damn low.

To the statisticians of HN: I am genuinely curious. What are the odds of that? Do we model it as Poisson?

In another thread, someone said that one accident per 10 million flights is normal, and that this new model 737 has had 150k flights.

So, assuming an accident probability of 1/10000000 for each flight, and assuming that each flight is an independent event, then the probability of getting 2 or more accidents in 150k flights of normally functioning planes is 1/8978.23, if I've done the math correctly.

The probability of 1 or more accidents in 150k flights is 1/67.1679.

I have no idea how to interpret this. One of the challenges in statistics is asking the right questions, and I'm not sure that the questions I've answered are the right questions.
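For what it's worth, the figures above can be reproduced with a short binomial calculation. Both inputs are the assumptions quoted in this thread (150k flights, 1-in-10-million per-flight accident probability), not established numbers:

```python
from math import comb

n = 150_000   # flights flown by the new model (figure quoted upthread)
p = 1e-7      # assumed per-flight accident probability (1 in 10 million)

def binom_pmf(k):
    # P(exactly k accidents) under a binomial model with independent flights
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_ge1 = 1 - binom_pmf(0)                 # at least one accident
p_ge2 = 1 - binom_pmf(0) - binom_pmf(1)  # at least two accidents

print(f"P(>=1 accident):  1 in {1/p_ge1:.1f}")   # roughly 1 in 67
print(f"P(>=2 accidents): 1 in {1/p_ge2:.1f}")   # roughly 1 in 8978
```

With n*p = 0.015 expected accidents, a Poisson approximation gives nearly identical answers, which is why it makes little difference whether you model it as binomial or Poisson here.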

Same answers on the math. Using a binomial I got 1/9009 for 2+, mostly same answer for 1+ in 150k. Probably rounding differences. Also agreed on asking the right questions. Could be a 1/9000 event given a 1/10million safety expectation, or it could simply be a much more probable event given that our assumption of 1/10million is incorrect. Ironically it could also be that the new plane is even more safe, but we've just experienced an e.g. 1 in 50,000 event. There's lies, damned lies, and statistics.

A "one in 10k" coincidence would qualify as mildly interesting at most. We run into coincidences all the time, and I think it needs to be about 1 in 1000 to even register as a coincidence.

Would it be worthwhile to compare to a daily risk that people take? e.g. what are the odds of getting hit by a car when crossing the road?

Is he talking about the trim wheel? I don't know much about airplanes but that seems possible to me. Would the trim wheel get the spring force feedback like the yoke?

Edit: here is a video of the trim wheel on a 737-300 https://vimeo.com/34501723

'Control Wheel' is enshrined in Boeing terminology to the point that one of the autopilot modes on older 737s is 'CWS', for 'Control Wheel Steering'.

If you're surrounded by Boeing equipment for most of your career, then you're likely to use the terminology they use.

I suspect he knows what he is talking about but there is also a flaw in the MCAS that interferes with or prevents pilot correction.

The fact that it's called "Air Facts Journal" is suspicious.

It's well-known in the field. The name was chosen in 1938.


Good to know, thank you. Without context the name sounds very odd.

I signed up to get text messages about airplanes sent to my cell phone, and now they won't stop!

Am I the only one who immediately stops reading when a writer says, "let me explain"?

I was totally going to let you explain. After all, I'm reading your article and I intended to read more than just the first paragraph. But your sudden insistence that I just give you a few more minutes to explain your provocative argument presumes too much about what I'm thinking. And it's over-used. And it's an annoying tone.
