And this likely happened at least twice, while there were 300 MAXs in service. If there were 3,000 MAXs in service, MCAS misfires would presumably be happening 3x a month worldwide - each misfire requiring a proper pilot reaction. How can you defend Boeing in that case?
There are plenty of people (on this site and on Reddit) who jump in to criticize Airbus for the way it puts the aircraft's computer systems ahead of the pilots, and who claim the 737's flight controls are preferable because the pilots retain full control. I don't want to spark an Airbus vs. Boeing debate, but I want to point out that these accidents seem to discredit that line of thinking.
In the Lion Air accident, a malfunctioning sensor caused this system to react incorrectly, surprising the pilots.
Let’s also remember that a malfunctioning sensor was a contributing factor in the crash of Air France 447... a problem which was exacerbated by human interface choices in Airbus’s fly by wire system. In particular, stick input is not physically synchronized and conflicting inputs were averaged; neither of which is true on a Boeing aircraft. Even on a 787 the yokes move together.
Human factors matter a lot if the computers hand control back to the humans unexpectedly. That can happen even in fly by wire systems on Airbus.
Edit: fixed 737 reference up top
On the Max 8, a bad sensor caused an automated system to misbehave, and may have contributed to the crashes.
On AF447, ice blocking the pitot-static system caused the automated systems to go offline (the aircraft entered Alternate Law), and loss of situational awareness led the pilots to crash their aircraft under manual control.
If you read the final report you'll see that pilot error and a loss of basic airmanship caused the accident. There are always improvements to the airframe that can mitigate the human component, but acting like these two accidents are somehow two sides of the same coin is disingenuous.
I'm not actually sure why you brought it up. Comes across as "look they're both just as bad" while ignoring key details about AF447.
From your link. In my mind, that is the critical piece. The pilot did the right thing (pushed the nose down), heard the stall warning again, and then did the wrong thing (pulled the nose up), because that made the stall warning go away. Having been in a plane upside down at night, pointed toward the ground, I can imagine the stress on the crew. It's easy to say to push the nose down when the stall warning goes off, but what if they had been upside down? Then he would have been doing the right thing by listening to the warning. The fact that the stall warning cut out was the final critical link in the accident, not pilot error.
After Boeing has made MCAS triply-redundant (or whatever else they have to do to get these airplanes flying again), questions will remain about why nominally well-trained people sometimes perform disastrously below expectations when faced with something unusual, and what can be done about it. Is it possible, for example, that highly-prescribed training adversely affects flexibility and resilience in problem-solving?
The fact that Boeing itself chose to implement flight envelope protections and FBW systems on its newer planes shows that the industry in general is in favour of them; pretty much the only difference now is that the Boeing yoke provides feedback.
Adding Airbus envelope protection to the 737 Max 8 would not have prevented the Lion Air crash because the root problem was bad sensor data. Sending bad sensor data into a flight computer will give you bad output, even in a FBW system.
Conversely, if the AOA sensors had been operating correctly, or triply redundant to mitigate failure, then MCAS would have worked correctly and safely, even without Airbus-style "normal law" FBW.
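To make the redundancy point concrete, here is a minimal, purely illustrative sketch of 2-of-3 sensor voting. This is not Boeing's or Airbus's actual logic; the function name, the tolerance value, and the interface are all hypothetical, and real flight control voters are far more involved.

```python
def vote_aoa(readings, tolerance=2.0):
    """Hypothetical 2-of-3 angle-of-attack voter.

    Returns (value, valid). If at least two of the three readings
    agree within `tolerance` degrees, the median is taken as the
    voted value, so one runaway sensor gets outvoted. Otherwise the
    data is flagged invalid so a downstream automated function
    (an MCAS-like system, say) can disengage instead of acting on it.
    """
    a, b, c = sorted(readings)
    # The median is b; it is trustworthy if its nearest
    # neighbour agrees with it within the tolerance.
    if min(b - a, c - b) <= tolerance:
        return b, True
    return None, False


# One sensor reads an absurd 74.5 degrees; the other two agree,
# so the bad sensor is outvoted rather than driving the automation.
print(vote_aoa([4.8, 5.1, 74.5]))   # (5.1, True)
# No two sensors agree: the voter declares the data invalid.
print(vote_aoa([4.8, 40.0, 74.5]))  # (None, False)
```

With only a single AOA sensor feeding MCAS, there is nothing to vote against, which is exactly the failure mode people object to.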
How to safely integrate imperfect automation with human control is a major developing issue, and not just in aviation. The same concerns are coming to light in cars and trucks with features like smart cruise control, lane-keeping assist, highway "autopilot," etc.
On the surface it seems like greater and better automation will make everything safer, and maybe it does in the aggregate, by covering a lot of low hanging fruit. Antilock braking and traction control certainly fit into this category in cars.
But as automation gets smarter, it means that the situations where humans are required to take over from the automation will become less frequent and more unusual. So systems need to be designed to not only protect humans with automation, but also to alert and orient humans as quickly and accurately as possible when they need to step in and take over. This is why I happen to think that physically synchronizing stick/yoke movement is a good idea; it's one less thing to have to talk about in an emergency.
I don't blame Airbus FBW for the AF447 crash, but I don't blame the pilots either. I thing it was the complex interaction of the two that caused the crash. That's a very complex problem to solve, and it's one that we will have to keep addressing until automation is so good that it never hands off to a human for any reason.
The common element between Lion Air and Air France crashes is bad sensor data: AOA and airspeed, respectively. A computerized system, no matter how sophisticated, cannot protect pilots from themselves in the absence of accurate sensor data.
Agreed, but the context is completely different. The Airbus flight computers rely on data from three sensors and are thus triply redundant. AF447 happened in severe icing conditions: all three pitot tubes froze, causing all three sensors to report bad data. The aircraft identified the discrepancy and degraded from normal law to alternate law, as expected, so it would not make any decisions based on the faulty data. The crash was ultimately caused by the pilots becoming disoriented in the very low visibility. The sidestick issue is a different discussion, but it's not related to the 737 MAX accidents.
The MCAS system on the other hand is not triply redundant, and that's what people have issue with. Boeing itself has more redundant envelope protections on the newer models like the 777 and 787.
In the case of Air France 447, all three pitot tubes malfunctioned due to icing conditions well outside the expected range required for type certification of that aircraft. The aircraft also degraded down from normal law, as was expected.
Admittedly, the sidesticks on the airbus do not provide feedback but both pilots ignored the aural "dual input" warning, as well as the warning indication on the panel.
I don't agree with the article either, fwiw, but I suspect it's just his unpaid opinion.
AIUI 737 MAX has an instability such that, in near stall conditions, some attempts to recover can make the stall worse. To mitigate this, Boeing added MCAS, and MCAS can malfunction with a single sensor failure. Imagine that this failure occurs and the pilot successfully turns off MCAS but ends up in a dive, too close to the ground, or otherwise in a bad situation. Now the pilot has to recover, but they are facing a faulty AoA indicator (if they have one at all) as well as a plane that, because MCAS is off, is unstable in near-stall conditions. And the pilot has never been trained in the handling of type 737 MAX under these conditions.
Am I wrong for some reason, or is this a potentially rather dangerous situation that could be caused by a single instrument failure?
How was the plane delivered if it didn't pass certification?
The aircraft would not meet today's certification requirements as a new type
Now obviously they could have made the wrong decision there, but the idea that this plane is somehow skating by with no oversight via a technicality just does not pass the smell test.
Edit to add:
The 737 Max Wikipedia page details the testing that the Max variant underwent, and links to the 737 type certificate—which devotes 38 pages to the 737-8.
> Boeing announced on February 16, 2018 that the 737 MAX 9 has received an amended type certificate (ATC) from the U.S. Federal Aviation Administration (FAA), officially certifying the airplane for commercial service.
> This certification marks the culmination of a successful flight test program that began in March 2017 with two Boeing flight test airplanes. The FAA certification affirms that the airplane’s handling, systems and overall performance all comply with required aviation regulations.
I understand (I think) that they made changes so it essentially handles and acts like the 737, and therefore it can fall under the same certificate, but surely there should be some kind of time restriction on a certificate applying to new models, even ones that are "similar".
Can someone enlighten me as to why this does or does not make sense?
Edit: And the airlines will lobby on your behalf too. They don't want to have to certify pilots on a new type, or have yet another split of who can fly what.
The standard he's referencing is for "flight critical" things that the pilot can't compensate for or do without.
...and using their own words, is "unsafe!" to fly without additional training. It is unsafe to fly a plane with different layout and handling.
MCAS needs another layer of redundancy, and the whole thing needs to be entirely retested and certified under a new type.
We just had a truly historic near miss where a crew lined up to land on a crowded taxiway at SFO instead of the runway. If they had not gone around at the last second we would be talking about it as the deadliest accident in aviation history. Fly by wire was not going to prevent that; the descending aircraft was an Airbus A320.
Whether fly by wire makes pilots more trustworthy—that’s a good question.
But I think the author is addressing a subtly different question, which is: what seems more trustworthy to people? People like the public and politicians. Is it still acceptable to say "the human, despite its flaws, still belongs in the critical path of safety"? Or is the reputation of computers now so strong that all planes will have to work that way to satisfy modern notions of what is safe?
Who is more reliable, the instrument, or the instrument user?
A machine can only be guaranteed to function on information it has. Same with the human being behind the yoke.
The human by far has the greatest capacity, as the only general intelligence in the room, but technically that GI's performance is limited by its access to accurate information about the functionality of the tool being used.
I'm also very skeptical of his thesis that these accidents were pilot error, that there is nothing different about the Max 8 than any previous generation of the 737. The 737 in general has an excellent safety record. The Max 8 has had two crashes within two years of its introduction. The odds of that just being a coincidence are pretty damn low.
So I can't help but wonder if this guy is a shill. Maybe the question we should be asking is: can pilots trust Boeing?
[this source]: https://www.aviationpros.com/home/press-release/10393553/mac...
Look at the cockpit of a DC-3 or pretty much anything pre-war design that doesn’t use a stick.. it will educate you.
Also, Boeing is a multi-billion-dollar company; if they're going to shill, I'm sure they'd have no trouble finding an actual pilot to do it for them. We're not that rare.
Boeing already had a fix in the pipeline (a software update and different instructions for pilots, updating the manual), but the government shutdown seems to have delayed the FAA approval process.
To the statisticians of HN: I am genuinely curious. What are the odds of that? Do we model it as Poisson?
So, assuming an accident probability of 1/10000000 for each flight, and assuming that each flight is an independent event, then the probability of getting 2 or more accidents in 150k flights of normally functioning planes is 1/8978.23, if I've done the math correctly.
The probability of 1 or more accidents in 150k flights is 1/67.1679.
I have no idea how to interpret this. One of the challenges in statistics is asking the right questions, and I'm not sure that the questions I've answered are the right questions.
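For the Poisson question above: a minimal sketch that reproduces the parent's figures, assuming (as the parent does) a per-flight accident probability of 1 in 10 million and 150k flights, and using the Poisson approximation to the binomial. The inputs are the parent's assumptions, not established accident-rate data.

```python
import math

# Assumed figures from the parent comment: baseline accident
# probability of 1e-7 per flight, independence between flights,
# and 150,000 total MAX flights.
p = 1e-7
n = 150_000
lam = n * p  # Poisson rate: 0.015 expected accidents

# P(k >= 1) and P(k >= 2) under Poisson(lam)
p_at_least_1 = 1 - math.exp(-lam)
p_at_least_2 = 1 - math.exp(-lam) * (1 + lam)

print(f"P(>=1 accident): 1 in {1 / p_at_least_1:.1f}")  # ~1 in 67.2
print(f"P(>=2 accidents): 1 in {1 / p_at_least_2:.0f}")  # ~1 in 8978
```

The Poisson numbers agree with the parent's exact binomial results (1/67.17 and 1/8978.23) to within rounding, which is expected since n is large and p is tiny.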
Edit: here is a video of the trim wheel on a 737-300
If you're surrounded mostly by Boeing equipment for most of your career, then you're likely to use the terminology they use.
I was totally going to let you explain. After all, I'm reading your article and I intended to read more than just the first paragraph. But your sudden insistence that I just give you a few more minutes to explain your provocative argument presumes too much about what I'm thinking. And it's over-used. And it's an annoying tone.