

The Boeing 737 MAX’s Achilles Heel

January 7, 2020 By Nagesh Belludi

Two thousand nineteen was one of the most turbulent years in Boeing’s history. Its 737 MAX mess (pardon the pun) went from bad to worse to staggering when aviation regulators around the world grounded the aircraft and a steady trickle of disclosures exposed software problems and corner-cutting.

The aircraft’s flaw, an anti-stall mechanism that relied on data from a single sensor, offers a particularly instructive case study of the notion of a single point of failure.

One Fault Could Cause an Entire System to Stop Operating

A single point of failure of a system is an element whose failure can result in the failure of the entire system. (A system may have multiple single points of failure.)

Single points of failure are eliminated by adding redundancy: duplicating the critical components or backing them up, so that the failure of any one element does not bring down the entire system.
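To make the idea concrete, here is a minimal sketch in Python (the sensor objects and values are invented for illustration) of how a backup removes a single point of failure: the system keeps working when the primary source fails, and only a double failure stops it.

```python
# Minimal illustration of redundancy, with made-up sensor dictionaries.
# A reading of None models a failed sensor.

def read_from(source):
    """Return a reading from `source`, or None if that source has failed."""
    return source.get("value")

def read_altitude(primary, backup):
    """Prefer the primary source, but fall back to the backup, so that
    neither sensor alone is a single point of failure."""
    reading = read_from(primary)
    if reading is None:
        reading = read_from(backup)
    if reading is None:
        # Only a double failure stops the whole system.
        raise RuntimeError("Both altitude sources have failed")
    return reading

# The primary has failed, yet the system still operates on the backup.
print(read_altitude({"value": None}, {"value": 10_000}))  # -> 10000
```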

Boeing Mischaracterized Its Anti-Stall System as Less-than-Catastrophic in Its Safety Analysis

The two 737 MAX crashes (with Lion Air and Ethiopian Airlines) originated from a late change that Boeing made to a trim system called the Maneuvering Characteristics Augmentation System (MCAS).

Without pilot input, the MCAS could automatically nudge the aircraft’s nose downwards if it detected that the aircraft was pitched up at a dangerous angle, for instance, at high thrust during take-off.

Reliance on One Sensor Is Anathema in Aviation

The MCAS had previously been “approved” by the Federal Aviation Administration (FAA). Nevertheless, Boeing made some design changes after the FAA approval without checking with the FAA again. These late changes were meant to improve MCAS’s response during low-speed aerodynamic stalls.

The MCAS relied on data from just one Angle-of-Attack (AoA) sensor. With no backup, if this single sensor malfunctioned, its erroneous input would trigger a corrective nosedive just after take-off. This catastrophe is precisely what happened in the two crashes.

The AoA sensor thus became a single point of failure. Although the aircraft carries two angle-of-attack sensors on its nose, the MCAS drew data from only one of them at a time and did not require the two sensors to agree before inferring that the aircraft was stalling. Further, Lion Air had not paid extra to equip its aircraft with an optional warning light that could have alerted the crew to a disagreement between the AoA sensors.
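A deliberately simplified, hypothetical sketch of that flaw (not Boeing’s actual flight software; the threshold and readings below are invented) shows how keying the nose-down trigger to a single sensor makes that sensor a single point of failure:

```python
# Hypothetical, simplified sketch of the flaw described above; not Boeing's code.
STALL_AOA_DEGREES = 15.0  # illustrative threshold, not a real MCAS value

def should_push_nose_down(single_aoa_reading):
    """Command nose-down trim on one sensor's word alone."""
    return single_aoa_reading > STALL_AOA_DEGREES

# A faulty sensor reporting a wildly high angle commands a dive
# even though the aircraft is climbing normally.
faulty_reading = 74.5
print(should_push_nose_down(faulty_reading))  # -> True: an erroneous nosedive
```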

Boeing Missed Safety Risks in the Design of the MAX’s Flight-Control System

Reliance on one sensor’s data is an egregious violation of the long-standing engineering principle of eliminating single points of failure. Some aircraft use triple-redundant flight-control systems: if one of the three malfunctions, that is, if two systems agree and the third does not, the flight-control software ignores the odd one out.
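Here is a minimal sketch of that two-out-of-three voting idea; the readings and tolerance are illustrative assumptions, not real flight data.

```python
def vote(readings, tolerance=2.0):
    """Return a value that at least two of the three sensors agree on
    (within `tolerance`), or None if no two sensors agree."""
    a, b, c = readings
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2  # the agreeing pair outvotes the outlier
    return None  # total disagreement: flag the channel as unreliable

# The faulty 42.0 reading is ignored; the two agreeing sensors prevail.
print(vote([5.0, 5.5, 42.0]))  # -> 5.25
```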

As if the dependence on one sensor were not enough, Boeing, blinded by time and cost pressure to stay competitive with its European rival Airbus, intentionally omitted any reference to MCAS from the pilot manuals to spare its airline customers additional pilot training. Indeed, Boeing did not even disclose the existence of the MCAS on the aircraft.

Boeing allows pilots to switch the trim system off to override the automated anti-stall system, but the pilots of the ill-fated Lion Air and Ethiopian Airlines flights failed to do so.

Idea for Impact: Redundancy is the Sine Qua Non of Reliable Systems

In preparation for the 737 MAX’s airworthiness recertification, Boeing has corrected the MCAS blunder by having its trim software compare inputs from both AoA sensors, alert the pilots if the readings disagree, and limit MCAS’s authority.
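An illustrative sketch of those three safeguards, with invented thresholds and units rather than Boeing’s certified values, might look like this:

```python
DISAGREE_LIMIT = 5.5      # assumed allowable AoA disagreement, in degrees
MAX_TRIM_AUTHORITY = 2.5  # assumed cap on automated nose-down trim

def mcas_command(aoa_left, aoa_right, requested_trim):
    """Return (trim_to_apply, disagree_alert)."""
    if abs(aoa_left - aoa_right) > DISAGREE_LIMIT:
        # The sensors disagree: apply nothing automatically and alert the crew.
        return 0.0, True
    # The sensors agree: apply the trim, but never beyond the authority limit.
    return min(requested_trim, MAX_TRIM_AUTHORITY), False

print(mcas_command(6.0, 40.0, 4.0))  # -> (0.0, True): the faulty sensor is caught
print(mcas_command(6.0, 6.4, 4.0))   # -> (2.5, False): the trim demand is capped
```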

One key takeaway from the MCAS disaster is this: when you design a highly reliable system, identify all single points of failure and investigate how those risks and failure modes can be mitigated. Examine whether each component of a product or service you work on is a single point of failure by asking, “If this component fails, does the rest of the system still work, and, more importantly, does it still perform the function it is supposed to?”




