
Right Attitudes

Ideas for Impact

Risk

What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

February 27, 2018 By Nagesh Belludi

Airline disasters often make great case studies of how a series of seemingly insignificant errors can build into a catastrophe.

As the following two case studies illustrate, unanticipated pressures can push your mind into a panic-like state. As it searches frantically for a way out of a problem, it can lose the ability to weigh all the available evidence and attend rationally to the situation as a whole.

Stress Can Blind You and Limit Your Ability to See the Bigger Picture: A Case Study on Eastern Airlines Flight 401

Eastern Airlines Flight 401 crashed on December 29, 1972, killing 101 people.

As Flight 401 began its approach into Miami International Airport, first officer Albert Stockstill lowered the landing gear. But the landing gear indicator, a green light that confirms the nose gear is locked in the “down” position, did not illuminate. (The cause was later traced to a burned-out light bulb. Regardless of the indicator, the landing gear could have been lowered and verified manually.)

The flight deck was thrown into disarray. The flight’s captain, Bob Loft, sent flight engineer Don Repo into the avionics bay beneath the flight deck to verify through a small porthole whether the landing gear was actually down. Loft simultaneously directed Stockstill to put the aircraft on autopilot. Then, when Loft inadvertently leaned against the aircraft’s yoke while speaking to Repo, the autopilot switched to a mode that did not hold the aircraft’s altitude.

The aircraft began to descend so gradually that the crew did not perceive it. With the flight engineer down in the avionics bay, the captain and the first officer were so preoccupied with the malfunctioning landing gear indicator that they failed to notice the altitude-warning signal from the engineer’s instrument panel.

Additionally, because the aircraft was flying at night over the dark terrain of the Everglades, no ground lights or other visual cues signaled that it was gradually descending. By the time Stockstill became aware of the aircraft’s altitude, it was too late to recover.

In summary, Flight 401 crashed not because of the nose landing gear, but because a false alarm drew the crew’s attention away from a far bigger problem.

Stress Can Blind You into Focusing Just on What You Think is Happening: A Case Study on United Airlines Flight 173

United Airlines Flight 173 crashed on December 28, 1978, in comparable circumstances.

When Flight 173’s pilots lowered the landing gear upon approach to the Portland International Airport, the aircraft experienced an abnormal vibration and yaw motion. In addition, the pilots observed that an indicator light did not show that the landing gear was lowered successfully. In reality, the landing gear was down and locked in position.

With the intention of troubleshooting the landing gear problem, the pilots entered a holding pattern. For the next hour, they tried to diagnose the landing gear glitch and prepare for a probable emergency landing. During this time, however, none of the pilots monitored the fuel levels.

When the landing gear problem was first suspected, the aircraft had ample reserve fuel, even for a diversion or other contingencies. But throughout the hour-long holding procedure, the landing gear remained down and the flaps were set at 15 degrees in anticipation of a landing, which significantly increased the aircraft’s fuel burn. All four engines flamed out from fuel exhaustion, and the aircraft crashed.

To sum up, Flight 173’s crew became preoccupied with the landing gear malfunction and with harried preparations for an emergency landing. Distracted, the pilots failed to keep tabs on the fuel state until the aircraft ran out of fuel and crashed.

Stress Can Derail Your Train of Thought

Under pressure, your mind departs from its rational mode of thinking.

The emotional arousal that comes with fear, anxiety, time pressure, and stress can lead to a phenomenon known as “narrowing of the cognitive map.” This tunnel vision restricts your field of mindful attention and impairs your judgment.

This situational closed-mindedness constricts your overall awareness of the situation and forces you to overlook alternative lines of thought.

Idea for Impact: To combat cognitive impairment under stress, use checklists and standard operating procedures, and invest in training on situational awareness, crisis communication, and emergency management, as the aviation industry did in response to these accidents.

Wondering what to read next?

  1. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  2. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  3. Lessons from the Princeton Seminary Experiment: People in a Rush are Less Likely to Help Others (and Themselves)
  4. “Fly the Aircraft First”
  5. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

Filed Under: Business Stories, Mental Models, Sharpening Your Skills Tagged With: Anxiety, Aviation, Decision-Making, Emotions, Mindfulness, Problem Solving, Risk, Stress, Thinking Tools, Thought Process, Worry

How to Guard Against Anything You May Inadvertently Overlook

October 23, 2017 By Nagesh Belludi

The World is More Inundated with Uncertainties and Errors Than Ever Before

Checklists can help you learn about prospective oversights and mistakes, recognize them in context, and sharpen your decisions.

I am a big fan of Harvard surgeon and columnist Atul Gawande’s The Checklist Manifesto (2009). His bestseller is an engaging reminder of how complex the world has become.

The humble checklist can help you manage the myriad complexities that underlie most contemporary professional (and personal) undertakings—where what you must do is too complex to carry out reliably from memory alone. Checklists “provide a kind of a cognitive net. They catch mental flaws inherent in all of us—flaws of memory and attention and thoroughness.”

Gawande begins The Checklist Manifesto with an examination of two kinds of failure: errors of ignorance (mistakes you make because you don’t know enough—“much of the world and universe is—and will remain—outside our understanding and control”) and errors of ineptitude (mistakes you make because you don’t apply correctly what you know). Most human and organizational failures involve the latter.

The philosophy is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.

The surgery room, Gawande’s own profession, is the principal setting for many of the book’s illustrative examples of how the introduction of checklists dramatically reduced the rate of complications from surgery. He also provides handy stories from other realms of human endeavor—aviation, structural engineering, and Wall Street-investing.

Getting Things Right, Every Time

Checklists are particularly valuable in situations where the stakes are high but your impulsive thought process could lead to suboptimal decisions.

The benefits of checklists also feature prominently in the thought-provoking Think Twice: Harnessing the Power of Counterintuition (2012). The author, Credit Suisse investment analyst and polymath Michael J. Mauboussin, argues that checklists are more effective in some domains than in others:

A checklist’s applicability is largely a function of a domain’s stability. In stable environments, where cause and effect is pretty clear and things don’t change much, checklists are great. But in rapidly changing environments that are heavily circumstantial, creating a checklist is a lot more difficult. In those environments, checklists can help with certain aspects of the decision. For instance, an investor evaluating a stock may use a checklist to make sure that she builds her financial model properly.

A good checklist balances two opposing objectives. It should be general enough to allow for varying conditions, yet specific enough to guide action. Finding this balance means a checklist should not be too long; ideally, you should be able to fit it on one or two pages.

If you have yet to create a checklist, try it and see which issues surface. Concentrate on steps or procedures, and ask where decisions have gone off track before. And recognize that errors often result from neglecting a step, not from executing the other steps poorly.
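As a purely illustrative sketch, here is what such a pre-decision checklist might look like in Python. The step names are hypothetical examples of my own, not drawn from Gawande or Mauboussin; the point is simply that every step must be explicitly confirmed so that a neglected one gets flagged.

```python
# A minimal pre-decision checklist: the value is not in the code but in
# forcing every step to be explicitly confirmed before you proceed.
# The step names below are invented for illustration.

CHECKLIST = [
    "Stated the decision and the desired outcome",
    "Listed the alternatives actually available",
    "Checked the base rates for similar decisions",
    "Asked where this kind of decision has gone wrong before",
    "Named who will review the decision afterward",
]

def run_checklist(completed):
    """Return the steps that were skipped; an empty list means proceed."""
    return [step for step in CHECKLIST if step not in completed]

if __name__ == "__main__":
    done = {
        "Stated the decision and the desired outcome",
        "Listed the alternatives actually available",
    }
    for missing in run_checklist(done):
        print("Not yet done:", missing)
```

The sketch deliberately stays short: a checklist that fits on a page (or a screen) is the one that actually gets used.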

In addition to creating checklists that are specific enough to guide action but general enough to handle changing circumstances, Mauboussin recommends keeping a journal to gather feedback from past decisions and performing “premortems”: envisioning that an imminent decision has already proven wrong, and then identifying the probable reasons for the failure.

No Matter How Proficient You May Be, Well-designed Checklists Can Immeasurably Improve the Outcomes

The notion of making and using checklists is so plainly obvious that it seems implausible that they could have so vast an effect.

Investor Charlie Munger, the well-respected beacon of wisdom and multi-disciplinary thinking, has said, “No wise pilot, no matter how great his talent and experience, fails to use his checklist.” And, “I’m a great believer in solving hard problems by using a checklist. You need to get all the likely and unlikely answers before you; otherwise it’s easy to miss something important.”

Idea for Impact: Checklists can prevent many of the things that go wrong in human hands, given our many well-documented biases and foibles. Well-designed checklists not only ensure that all the essential elements of a complex decision are in place, but also leave room for flexibility and ad hoc judgment.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. Question the Now, Imagine the Next
  3. Maximize Your Chance Possibilities & Get Lucky
  4. Defect Seeding: Strengthen Systems, Boost Confidence
  5. Be Smart by Not Being Stupid

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Books for Impact, Creativity, Decision-Making, Problem Solving, Risk, Thinking Tools

Smart Folks are Most Susceptible to Overanalyzing and Overthinking

August 30, 2017 By Nagesh Belludi

Many High-IQ People Tend to Be Overthinkers: They Incessantly Overanalyze Everything

There’s this old Zen parable that relates how over-analysis is a common attribute of intelligent people.

A Zen master was resting with his quick-witted disciple. At one point, the master took a melon out of his bag and cut it in half for the two of them to eat.

In the middle of the meal, the enthusiastic disciple said, “My wise teacher, I know everything you do has a meaning. Sharing this melon with me may be a sign that you have something to teach me.”

The master continued eating in silence.

“I understand the mysterious question in your silence,” insisted the student. “I think it is this: the excellent taste of this melon that I am experiencing … is the taste on the melon or on my tongue …”

The master still said nothing. The disciple got a bit frustrated at his master’s apparent indifference.

The disciple continued, ” … and like everything in life, this too has meaning. I think I’m closer to the answer; the pleasure of the taste is an act of love and interdependence between the two, because without the melon there wouldn’t be an object of pleasure and without pleasure …”

“Enough!” exclaimed the master. “The biggest fools are those who consider themselves the most intelligent and seek an interpretation for everything! The melon is good; please let this be enough. Let me eat it in peace!”

Intelligence Can Sometimes Be a Curse

The tendency to reason and analyze is a part of human nature. It is a useful trait for discerning the many complexities of life. It’s only natural that you sometimes go overboard and over-analyze a point or an issue to such a degree that the objective becomes all but moot.

Don’t get me wrong. Intelligence is indeed a gift. But intelligence can trick you into thinking you should be overthinking and calculating everything you do. The more intelligent you are, the more investigative you will be. The more your brain analyzes people and events, the more time it will spend on finding flaws in everything.

Intelligent People Overanalyze Everything, Even When it Doesn’t Matter

Many intelligent people tend to be perfectionists. Their overanalysis often cripples their productivity, especially by leading them to undesirable, frustrating, and low-probability conclusions that can limit their ability to understand reality and take meaningful risks.

Intelligent people are too hard on themselves and others—family, friends, and co-workers. They can’t settle for anything less than perfect. They tend to be less satisfied with their achievements, their relationships, and practically everything else in their lives. What is more, many people with speculative minds hold idealistic views of the world and lack the practical acumen to cope with it.

Idea for Impact: Don’t Make Everything Seem Worse Than it Actually is!

Thinking too much about things isn’t just a nuisance for you and others around you; it can take a toll on your well-being and on your relationships.

Check your tendency to overthink and overanalyze everything. Don’t twist and turn every issue in your head until you’ve envisaged the issue from all perspectives.

Sometimes it does help to think things over carefully and be cautious about potential risks and downsides. But most of the time, excessive rumination is unnecessary. Don’t make everything seem worse than it actually is. Set limits and prioritize. Learn to let go and manage your expectations.

To avoid overthinking, use my 5-5-5 technique: ask yourself whether your decision will matter 5 weeks, 5 months, and 5 years from now. If your answer is ‘no,’ stop stressing yourself out!
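If it helps to see the rule laid out, here is a toy sketch in Python; the structure and prompts are my own illustration of the 5-5-5 idea, not a prescribed implementation.

```python
# Toy sketch of the 5-5-5 technique: if the decision won't matter at any
# of the three horizons, stop agonizing over it.

HORIZONS = ["5 weeks", "5 months", "5 years"]

def worth_worrying_about(matters_at):
    """matters_at: dict mapping each horizon to True/False."""
    return any(matters_at.get(h, False) for h in HORIZONS)

if __name__ == "__main__":
    answers = {"5 weeks": False, "5 months": False, "5 years": False}
    if worth_worrying_about(answers):
        print("Think it through carefully.")
    else:
        print("Let it go; stop stressing yourself out.")
```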

Wondering what to read next?

  1. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  2. The Waterline Principle: How Much Risk Can You Tolerate?
  3. More Data Isn’t Always Better
  4. Protect the Downside with Pre-mortems
  5. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Confidence, Critical Thinking, Decision-Making, Mindfulness, Perfectionism, Problem Solving, Risk, Thinking Tools, Wisdom

You Can’t Know Everything

November 4, 2016 By Nagesh Belludi

“Have intellectual humility. Acknowledging what you don’t know is the dawning of wisdom.”
— Charlie Munger

Some of the most dangerous moments in life come when you think you’re the smartest person in the room. Smarts without humility can get you into trouble, because hubris leads to intellectual arrogance and a blatant disregard for opinions and judgments contrary to the ones you already hold.

Recognizing that you can’t know everything and that you will never know everything must not prevent you from acting. Rather, you must embrace uncertainty and take into account the possibility that you could be wrong.

Embrace Uncertainty

Risk is what is left over after you think you’ve thought of everything you currently can. It embraces all those matters that remain unaccounted for—everything you need to protect yourself from.

Intelligence transforms into wisdom only when you recognize that, despite your confidence in the present circumstances, you cannot predict how things will play out in the future. You will not be able to make an optimal decision every time.

The conduct of life is not a perfect science. Rather, it is an art that necessitates acknowledging and dealing with imperfect information. Be willing to act on imperfect information and uncertainty. Set a clear course today and tackle problems that arise tomorrow. Learn to adapt more flexibly to developing situations.

Idea for Impact: The wisest people I know are the ones who acknowledge that they don’t know everything and put strategies in place to shield themselves from their own ignorance. Make risk analysis and risk reduction one of the primary goals of your intellectual processes.

Wondering what to read next?

  1. It’s Probably Not as Bad as You Think
  2. A Bit of Insecurity Can Help You Be Your Best Self
  3. How To … Be More Confident in Your Choices
  4. Ever Wonder If The Other Side May Be Right?
  5. Could Limiting Social Media Reduce Your Anxiety About Work?

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Confidence, Conviction, Perfectionism, Risk, Wisdom

Finding Potential Problems & Risk Analysis: A Case Study on ‘The Three Faces of Eve’

June 24, 2016 By Nagesh Belludi

The Three Faces of Eve (1957)

Risk Analysis is a Forerunner to Risk Reduction

My previous article stressed the importance of problem finding as an intellectual skill and as a definitive forerunner to any creative process. In this article, I will draw attention to another facet of problem finding: thinking through potential problems.

Sometimes people are unaware of the harmful, unintended side effects of their actions. They fail to realize that a current state of affairs may lead to problems later on. Their actions and decisions could result in outcomes that are different from those planned. Risk analysis reduces the chance of non-optimal results.

The Three Contracts of Eve

A particularly instructive example of finding potential problems and mitigating risk concerns the Hollywood classic The Three Faces of Eve (1957). This psychological drama tells the true story of Chris Sizemore, who suffered from dissociative identity disorder (also called multiple personality disorder). Based on the book The Three Faces of Eve by her psychiatrists Corbett Thigpen and Hervey Cleckley, the movie portrays Sizemore’s three personalities through three characters: Eve White, Eve Black, and Jane.

Before filming started on The Three Faces of Eve, the legal department of the 20th Century Fox studio insisted that Sizemore sign three separate contracts—one for each of her personalities—to protect the studio from any possible legal action. Sizemore was therefore asked to evoke “Eve White,” “Eve Black,” and “Jane,” and to sign an agreement while manifesting each of these personalities. According to Aubrey Solomon’s The Films of 20th Century-Fox and the commentary on the movie’s DVD, the three signatures were all different, each the product of a distinct personality that Sizemore had invoked.

Idea for Impact: Risk analysis and risk reduction should be one of the primary goals of any intellectual process.

Postscript Notes

  • I recommend the movie The Three Faces of Eve for its captivating glimpse into the mind of a person afflicted with dissociative identity disorder. Actress Joanne Woodward won the 1958 Academy Award (Oscar) for best actress for her portrayal of the three Eves.
  • The automotive, aerospace, and other engineering disciplines use a formal risk analysis procedure called “failure mode and effects analysis” (FMEA). FMEA examines the ways in which a project, system, design, or process might fail, the potential effects of those failures, and the seriousness of those effects; a simplified sketch follows.
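For a concrete, simplified illustration (my own, not an excerpt from any FMEA standard), each failure mode is commonly scored for severity, occurrence, and detection, and the product of the three scores gives a risk priority number (RPN) used to rank which risks to tackle first:

```python
# Simplified FMEA sketch: rank failure modes by risk priority number
# (RPN = severity x occurrence x detection, each scored 1-10).
# The failure modes and scores below are invented for illustration.

failure_modes = [
    {"mode": "Indicator bulb burns out", "severity": 7, "occurrence": 4, "detection": 6},
    {"mode": "Fuel gauge misread",       "severity": 9, "occurrence": 2, "detection": 5},
    {"mode": "Checklist step skipped",   "severity": 8, "occurrence": 5, "detection": 7},
]

def rpn(fm):
    """Risk priority number: higher means address it sooner."""
    return fm["severity"] * fm["occurrence"] * fm["detection"]

for fm in sorted(failure_modes, key=rpn, reverse=True):
    print(f"RPN {rpn(fm):3d}  {fm['mode']}")
```

Ranking by RPN is what turns vague worry about “what could go wrong” into an ordered to-do list for risk reduction.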

Wondering what to read next?

  1. Overcoming Personal Constraints is a Key to Success
  2. How to Stimulate Group Creativity // Book Summary of Edward de Bono’s ‘Six Thinking Hats’
  3. You Can’t Develop Solutions Unless You Realize You Got Problems: Problem Finding is an Undervalued Skill
  4. This is Yoga for the Brain: Multidisciplinary Learning
  5. Four Ideas for Business Improvement Ideas

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Creativity, Critical Thinking, Innovation, Mental Models, Personality, Risk, Thinking Tools, Thought Process, Winning on the Job

