
Right Attitudes

Ideas for Impact

Risk

How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

October 1, 2019 By Nagesh Belludi

As I’ve examined previously, airline disasters are particularly instructive on the subjects of cognitive impairment and decision-making under stress.

Consider the case of TransAsia Airways Flight 235 that crashed in 2015 soon after takeoff from an airport in Taipei, Taiwan. Accident investigations revealed that the pilots of the ATR 72-600 turboprop erroneously switched off the plane’s working engine after the other lost power. Here’s a rundown of what happened:

  1. About one minute after takeoff, at 1,300 feet, engine #2 suffered an uncommanded autofeather and stopped producing thrust. This is a manageable failure—the aircraft is designed to be flown on one engine.
  2. The Pilot Flying misdiagnosed the problem, and assumed that the still-functional engine #1 had failed. He retarded power on engine #1 and it promptly shut down.
  3. With power lost on both the engines, the pilots did not react to the stall warnings in a timely and effective manner. The Pilot Flying acknowledged his error, “wow, pulled back the wrong side throttle.”
  4. The aircraft continued its descent. The pilots rushed to restart engine #1, but the remaining altitude was not sufficient to recover the aircraft.
  5. In a state of panic, the Pilot Flying clasped the flight controls and banked the aircraft perilously to avoid apartment blocks and commercial buildings before clipping a bridge and crashing into a river.

A High Level of Stress Can Diminish Your Problem-solving Capabilities

Thrown into disarray after a routine engine failure, the pilots of TransAsia flight 235 did not perform their airline’s abnormal and emergency procedures to identify the failure and implement the required corrective actions. Their ineffective coordination, communication, and error management compromised the safety of the flight.

The combination of sudden threat and extreme time pressure to avert a danger fosters a state of panic, in which decision-makers are inclined to commit themselves impulsively to courses of action that they will soon come to regret.

Idea for Impact: To combat cognitive impairment under stress, use checklists and standard operating procedures, as well as increased training on situational awareness, crisis communication, and emergency management.

Wondering what to read next?

  1. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  2. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  3. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  4. “Fly the Aircraft First”
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Business Stories, Leadership, Sharpening Your Skills Tagged With: Anxiety, Aviation, Biases, Decision-Making, Emotions, Mental Models, Mindfulness, Problem Solving, Risk, Stress, Thought Process, Worry

Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

September 3, 2019 By Nagesh Belludi

In the context of decision-making and risk-taking, the “overconfidence effect” is a judgmental bias that can affect your subjective estimate of the likelihood of future events. This can cause you to misjudge the odds of positive/desirable events as well as negative/undesirable events.

As the following Zen story illustrates, experience breeds complacency. When confidence gives way to overconfidence, a strength becomes a liability.

A master gardener, famous for his skill in climbing and pruning the highest trees, examined his disciple by letting him climb a very high tree. Many people had come to watch. The master gardener stood quietly, carefully following every move but not interfering with one word.

Having pruned the top, the disciple climbed down and was only about ten feet from the ground when the master suddenly yelled: “Take care, take care!”

When the disciple was safely down, an old man asked the master gardener: “You did not let out one word when he was aloft in the most dangerous place. Why did you caution him when he was nearly down? Even if he had slipped then, he could not have greatly hurt himself.”

“But isn’t it obvious?” replied the master gardener. “Right up at the top he is conscious of the danger, and of himself takes care. But near the end when one begins to feel safe, this is when accidents occur.”

Reference: Irmgard Schlögl’s The Wisdom of the Zen Masters (1976). Dr. Schlögl (1921–2007) became Ven. Myokyo-ni in 1984, served as a Rinzai Zen Buddhist nun, and headed the Zen Centre in London.

Wondering what to read next?

  1. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  2. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  3. Be Smart by Not Being Stupid
  4. Smart Folks are Most Susceptible to Overanalyzing and Overthinking
  5. Increase Paranoia When Things Are Going Well

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Confidence, Critical Thinking, Decision-Making, Mindfulness, Parables, Risk, Thinking Tools, Thought Process, Wisdom

Beware of Key-Person Dependency Risk

September 7, 2018 By Nagesh Belludi

Key-Person Dependency Risk is the threat posed by an organization or a team’s over-reliance on one or a few individuals.

The key-person has sole custody of some critical institutional knowledge, creativity, reputation, or experience that makes them indispensable to the organization’s business continuity and its future performance. If they leave, the organization loses that valued standing and expertise.

Small businesses and start-ups are especially exposed to key-person dependency risk. Tesla, for example, faces a colossal key-man risk—its fate is linked closely to the actions of founder-CEO Elon Musk, who has come under scrutiny lately.

Much of Berkshire Hathaway’s performance over the decades has been based on CEO Warren Buffett’s reputation and his ability to wring remarkable deals from companies in distress. There’s a great deal of prestige in selling one’s business to Buffett. He is irreplaceable; given his remarkable long-term record of accomplishment, it is important that much of what he has built over the years remains intact once he is gone. Buffett has built a strong culture that is likely to endure.

Key Employees are Not Only Assets, but also Large Contingent Liabilities

The most famous “key man” of all time was Apple’s Steve Jobs. Not only was he closely linked to his company’s identity, but he also played a singular role in building Apple into the global consumer-technology powerhouse that it is. Jobs had steered Apple’s culture in a desired direction and groomed his handpicked management team to sustain Apple’s inventive culture after he was gone. Tim Cook, the operations genius who became Apple’s CEO after Jobs died in 2011, has led the company to new heights.

The basic solution to key-person dependency risk is to identify and document critical knowledge of the organization. (Capturing tacit knowledge is not easy when it resides “in the key-person’s head.”) Organizations must also focus on cross-training and succession planning to identify and enable others to develop and perform the same tasks as the key-person.

Idea for Impact: No employee should be indispensable. A well-managed company is never dependent upon the performance of one or a few individuals. Likewise, no employee should be allowed to hoard knowledge, relationships, or resources to achieve job security.

Wondering what to read next?

  1. You Need to Stop Turning Warren Buffett Into a Prophet
  2. What Virgin’s Richard Branson Teaches: The Entrepreneur as Savior, Stuntman, Spectacle
  3. Creativity by Imitation: How to Steal Others’ Ideas and Innovate
  4. Risk Homeostasis and Peltzman Effect: Why Risk Mitigation and Safety Measures Become Ineffective
  5. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’

Filed Under: Business Stories, Managing People, MBA in a Nutshell, Mental Models Tagged With: Biases, Career Planning, Entrepreneurs, Human Resources, Icons, Leadership Lessons, Mental Models, Personality, Risk, Role Models

Risk Homeostasis and Peltzman Effect: Why Risk Mitigation and Safety Measures Become Ineffective

May 17, 2018 By Nagesh Belludi

Risk Homeostasis and Peltzman Effect are two concepts relating to how humans react to risks.

Risk Homeostasis is the notion that our personal psychological frameworks comprise a target level of risk towards which we direct our efforts.

We measure risk on our own “risk thermostat.” Because the risk in our environment changes continuously, we are incessantly forced away from our target risk level, but revert toward it by counteracting those external influences.

If the perceived risk of a situation exceeds our target level, we undertake defensive actions to reduce the risk. And if the perceived risk is lower than our target level, we attempt to increase our risk back to our target level by exposing ourselves to dangerous actions.
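This feedback loop can be sketched as a toy model in code. Everything here is an illustrative assumption (the 0-to-1 risk scale, the target level, and the adjustment gain are invented for demonstration), not an empirical claim:

```python
def adjust_behavior(perceived_risk: float, target_risk: float,
                    gain: float = 0.5) -> float:
    """Return the behavioral adjustment that nudges perceived risk
    back toward the personal target level (the 'risk thermostat')."""
    return gain * (target_risk - perceived_risk)

# A person whose environment suddenly feels safer (perceived risk 0.3)
# than their hypothetical target level (0.6) on an invented 0-1 scale.
perceived, target = 0.3, 0.6
history = []
for _ in range(10):
    perceived += adjust_behavior(perceived, target)  # compensate by taking more risk
    history.append(round(perceived, 3))

print(history[-1])  # drifts back toward the 0.6 target
```

Run with a perceived risk above the target, the same loop adjusts downward, which is the defensive half of the homeostasis described here.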

Consequently, people take more risks when they’re forced to act more carefully. For instance, requiring motorcyclists to wear helmets may lead them to ride more aggressively—to maintain their level of thrill, not because they want accidents.

The Peltzman Effect is the notion that people respond to increased safety by taking on new risks. The namesake, economist Sam Peltzman, argued in 1975 that when automobile safety rules were introduced, at least some of their benefits were counterbalanced by changes in driver behavior. Peltzman posited that making seatbelts mandatory reduced occupant fatalities but increased pedestrian casualties and collision-related property damage.

Peltzman made a case that even though seatbelts reduced the risk of being severely injured in an accident, drivers compensated by driving aggressively and carelessly—driving closer to the car ahead of them, for instance—so as to save time or maintain their level of thrill, even at the risk of causing damage beyond themselves and their cars.

Risk Homeostasis and the Peltzman Effect remain controversial theories. Despite their apparent relevance, the evidence on whether people actually behave less cautiously when they feel more protected (and vice versa) remains inadequate and inconclusive.

Further, Risk Homeostasis and Peltzman Effect challenge the foundations of safety and injury-prevention policies. They assert that the only effective safety measures are those that alter individuals’ desired risk level. Anything that barely modifies the environment or regulates individuals’ behavior without affecting their target risk levels is useless.

Wondering what to read next?

  1. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  2. Knowing When to Give Up: Establish ‘Kill Criteria’
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. Be Smart by Not Being Stupid
  5. Availability Heuristic: Our Preference for the Familiar

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Decision-Making, Discipline, Mental Models, Personality, Risk, Thought Process

What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

February 27, 2018 By Nagesh Belludi

Airline disasters often make great case studies on how a series of insignificant errors can build up into catastrophes.

As the following two case studies will illuminate, unanticipated pressures can force your mind to quickly shift to a panic-like state. As it searches frenetically for a way out of a problem, your mind can disrupt your ability to take account of all accessible evidence and attend rationally to the situation in its entirety.

Stress Can Blind You and Limit Your Ability to See the Bigger Picture: A Case Study on Eastern Airlines Flight 401

Eastern Airlines Flight 401 crashed on December 29, 1972, killing 101 people.

As Flight 401 began its approach into the Miami International Airport, first officer Albert Stockstill lowered the landing gear. But the landing gear indicator, a green light to verify that the nose gear was correctly locked in the “down” position, did not switch on. (This was later verified to be caused by a burned-out light bulb. Regardless of the indicator, the landing gear could have been manually lowered and verified.)

The flight deck was thrown into disarray. The flight’s captain, Bob Loft, sent flight engineer Don Repo to the avionics bay underneath the flight deck to verify through a small porthole whether the landing gear was actually down. Loft simultaneously directed Stockstill to put the aircraft on autopilot. Then, when Loft unintentionally leaned against the aircraft’s yoke while speaking to Repo, the autopilot inadvertently switched to a mode that did not hold the aircraft’s altitude.

The aircraft began to descend so gradually that it could not be perceived by the crew. With the flight engineer down in the avionics bay, the captain and the first officer were so preoccupied with the malfunction of the landing gear indicator that they failed to pay attention to the altitude-warning signal from the engineer’s instrument panel.

Additionally, because the aircraft was flying over the dark terrain of the Everglades at night, no ground lights or other visual cues signaled that it was gradually descending. When Stockstill eventually became aware of the aircraft’s altitude, it was too late to avert the crash.

In summary, the cause of Flight 401’s crash was not the nose landing gear, but the crew’s negligence and inattention to a bigger problem while they were distracted by a false alarm.

Stress Can Blind You into Focusing Just on What You Think is Happening: A Case Study on United Airlines Flight 173

United Airlines Flight 173 crashed on December 28, 1978, in comparable circumstances.

When Flight 173’s pilots lowered the landing gear upon approach to the Portland International Airport, the aircraft experienced an abnormal vibration and yaw motion. In addition, the pilots observed that an indicator light did not show that the landing gear was lowered successfully. In reality, the landing gear was down and locked in position.

With the intention of troubleshooting the landing gear problem, the pilots entered a holding pattern. For the next hour, they tried to diagnose the landing gear glitch and prepare for a probable emergency landing. During this time, however, none of the pilots monitored the fuel levels.

When the landing gear problem was first suspected, the aircraft had abundant reserve fuel—even for a diversion or other contingencies. But all through the hour-long holding procedure, the landing gear was down and the flaps were set to 15 degrees in anticipation of a landing. This significantly increased the aircraft’s fuel burn rate. When all four engines flamed out from fuel exhaustion, the aircraft crashed.

To sum up, Flight 173’s crew became preoccupied with the landing gear malfunction and harried preparations for an emergency landing. As a result of their inattention, the pilots failed to keep tabs on the fuel state and crashed the aircraft.

Stress Can Derail Your Train of Thought

Under pressure, your mind will digress from its rational model of thinking.

The emotional excitement from fear, anxiety, time-pressure, and stress can lead to a phenomenon known as “narrowing of the cognitive map.” This tunnel vision can restrict your field of mindful attention and impair your ability for adequate discernment.

Situational close-mindedness can constrict your across-the-board awareness of the situation and force you to overlook alternative lines of thought.

Idea for Impact: To combat cognitive impairment under stress, use checklists and standard operating procedures, as well as increased training on situational awareness, crisis communication, and emergency management, as the aviation industry did in response to the aforementioned incidents.

Wondering what to read next?

  1. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  2. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  3. Lessons from the Princeton Seminary Experiment: People in a Rush are Less Likely to Help Others (and Themselves)
  4. “Fly the Aircraft First”
  5. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

Filed Under: Business Stories, Mental Models, Sharpening Your Skills Tagged With: Anxiety, Aviation, Decision-Making, Emotions, Mindfulness, Problem Solving, Risk, Stress, Thinking Tools, Thought Process, Worry

How to Guard Against Anything You May Inadvertently Overlook

October 23, 2017 By Nagesh Belludi

The World is More Inundated with Uncertainties and Errors Than Ever Before

Checklists can help you learn about prospective oversights and mistakes, recognize them in context, and sharpen your decisions.

I am a big fan of Harvard surgeon and columnist Atul Gawande’s The Checklist Manifesto (2009). His bestseller is an engaging reminder of how complex the world has become.

The use of the humble checklist can help you manage the myriad of complexities that underlie most contemporary professional (and personal) undertakings—where what you must do is too complex to carry out reliably from memory alone. Checklists “provide a kind of a cognitive net. They catch mental flaws inherent in all of us—flaws of memory and attention and thoroughness.”

Gawande begins The Checklist Manifesto with an examination of the characteristics of errors of ignorance (mistakes you make because you don’t know enough—“much of the world and universe is—and will remain—outside our understanding and control”) and errors of ineptitude (mistakes you make because you don’t apply correctly what you know). Most human and organizational failures involve the latter.

The philosophy is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.

The surgery room, Gawande’s own profession, is the principal setting for many of the book’s illustrative examples of how the introduction of checklists dramatically reduced the rate of complications from surgery. He also provides handy stories from other realms of human endeavor—aviation, structural engineering, and Wall Street investing.

Getting Things Right, Every Time

Checklists are particularly valuable in situations where the stakes are high but your impulsive thought process could lead to suboptimal decisions.

The benefits of checklists also feature prominently in the thought-provoking Think Twice: Harnessing the Power of Counterintuition (2012). The author, Credit Suisse investment analyst and polymath Michael J. Mauboussin, argues that checklists are more effective in certain domains than in others:

A checklist’s applicability is largely a function of a domain’s stability. In stable environments, where cause and effect is pretty clear and things don’t change much, checklists are great. But in rapidly changing environments that are heavily circumstantial, creating a checklist is a lot more difficult. In those environments, checklists can help with certain aspects of the decision. For instance, an investor evaluating a stock may use a checklist to make sure that she builds her financial model properly.

A good checklist balances two opposing objectives. It should be general enough to allow for varying conditions, yet specific enough to guide action. Finding this balance means a checklist should not be too long; ideally, you should be able to fit it on one or two pages.

If you have yet to create a checklist, try it and see which issues surface. Concentrate on steps or procedures, and ask where decisions have gone off track before. And recognize that errors are often the result of neglecting a step, not of executing the other steps poorly.
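The point that errors usually come from neglected steps, rather than from steps done badly, can be made concrete with a minimal sketch. The checklist items below are hypothetical examples, not drawn from either book:

```python
def neglected_steps(checklist: list[str], completed: list[str]) -> list[str]:
    """Return checklist items that were skipped: the usual source of
    error is a neglected step, not a botched one."""
    done = set(completed)
    return [step for step in checklist if step not in done]

# Hypothetical pre-decision checklist for an investor's financial model.
checklist = ["check revenue assumptions", "stress-test margins",
             "verify share count", "compare with base rates"]
completed = ["check revenue assumptions", "stress-test margins",
             "compare with base rates"]

print(neglected_steps(checklist, completed))  # → ['verify share count']
```

The skipped step is surfaced mechanically, which is exactly the “cognitive net” role Gawande ascribes to checklists.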

In addition to creating checklists that are specific enough to guide action but general enough to handle changing circumstances, Mauboussin recommends keeping a journal to gather feedback from past decisions and performing “premortems”: envisioning that an imminent decision has already proven wrong, and then identifying the probable reasons for its failure.

No Matter How Proficient You May Be, Well-designed Checklists Can Immeasurably Improve the Outcomes

The notion of making and using checklists is so plainly obvious that it seems improbable that they could have so vast an effect.

Investor Charlie Munger, the well-respected beacon of wisdom and multi-disciplinary thinking, has said, “No wise pilot, no matter how great his talent and experience, fails to use his checklist.” And, “I’m a great believer in solving hard problems by using a checklist. You need to get all the likely and unlikely answers before you; otherwise it’s easy to miss something important.”

Idea for Impact: Checklists can prevent many things that could go wrong in the hands of human beings, given our many well-documented biases and foibles. Well-designed checklists not only ensure that all the essential elements of a complex decision are reliably in place, but also leave room for flexibility and ad hoc judgment.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. Defect Seeding: Strengthen Systems, Boost Confidence
  3. Be Smart by Not Being Stupid
  4. Availability Heuristic: Our Preference for the Familiar
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Books for Impact, Creativity, Decision-Making, Problem Solving, Risk, Thinking Tools

Smart Folks are Most Susceptible to Overanalyzing and Overthinking

August 30, 2017 By Nagesh Belludi

Many High-IQ People Tend to Be Overthinkers: They Incessantly Overanalyze Everything

There’s this old Zen parable that relates how over-analysis is a common attribute of intelligent people.

A Zen master was resting with his quick-witted disciple. At one point, the master took a melon out of his bag and cut it in half for the two of them to eat.

In the middle of the meal, the enthusiastic disciple said, “My wise teacher, I know everything you do has a meaning. Sharing this melon with me may be a sign that you have something to teach me.”

The master continued eating in silence.

“I understand the mysterious question in your silence,” insisted the student. “I think it is this: the excellent taste of this melon that I am experiencing … is the taste on the melon or on my tongue …”

The master still said nothing. The disciple got a bit frustrated at his master’s apparent indifference.

The disciple continued, ” … and like everything in life, this too has meaning. I think I’m closer to the answer; the pleasure of the taste is an act of love and interdependence between the two, because without the melon there wouldn’t be an object of pleasure and without pleasure …”

“Enough!” exclaimed the master. “The biggest fools are those who consider themselves the most intelligent and seek an interpretation for everything! The melon is good; please let this be enough. Let me eat it in peace!”

Intelligence Can Sometimes Be a Curse

The tendency to reason and analyze is a part of human nature, and a useful trait for discerning the many complexities of life. It’s only natural that you sometimes go overboard and over-analyze a point or an issue to such a degree that the objective becomes all but moot.

Don’t get me wrong. Intelligence is indeed a gift. But intelligence can trick you into overthinking and calculating everything you do. The more intelligent you are, the more investigative you will be. And the more your brain analyzes people and events, the more time it will spend finding flaws in everything.

Intelligent People Overanalyze Everything, Even When it Doesn’t Matter

Many intelligent people tend to be perfectionists. Their overanalysis often cripples their productivity, especially by leading them to undesirable, frustrating, and low-probability conclusions that can limit their ability to understand reality and take meaningful risks.

Intelligent people are often too hard on themselves and others—family, friends, and co-workers. They can’t settle for anything less than perfect. They tend to be less satisfied with their achievements, their relationships, and practically everything that has a place in their life. What is more, many people with speculative minds hold idealistic views of the world and lack sound acumen for coping with the practical world.

Idea for Impact: Don’t Make Everything Seem Worse Than it Actually is!

Thinking too much about things isn’t just a nuisance for you and others around you; it can take a toll on your well-being and on your relationships.

Check your tendency to overthink and overanalyze everything. Don’t twist and turn every issue in your head until you’ve envisaged the issue from all perspectives.

Sometimes it does help to overthink and be cautious about potential risks and downfalls. But most times, it’s unnecessary to ruminate excessively. Don’t make everything seem worse than it actually is. Set limits and prioritize. Learn to let go and manage your expectations.

To avoid overthinking, use my 5-5-5 technique. Ask yourself if your decision will matter 5 weeks, 5 months, and 5 years in the future. If your answer is ‘no,’ stop stressing yourself out!
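The 5-5-5 test is simple enough to express as a tiny function; this is a playful sketch of the heuristic as stated above, nothing more:

```python
def five_five_five(matters_in_5_weeks: bool, matters_in_5_months: bool,
                   matters_in_5_years: bool) -> str:
    """Apply the 5-5-5 test: if a decision won't matter at any of the
    three horizons, it doesn't deserve further rumination."""
    if matters_in_5_weeks or matters_in_5_months or matters_in_5_years:
        return "worth deliberate thought"
    return "stop stressing yourself out"

# A trivial worry: which font to use in tomorrow's status email.
print(five_five_five(False, False, False))  # → stop stressing yourself out
```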

Wondering what to read next?

  1. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  2. A Bit of Insecurity Can Help You Be Your Best Self
  3. Protect the Downside with Pre-mortems
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Confidence, Critical Thinking, Decision-Making, Mindfulness, Perfectionism, Problem Solving, Risk, Thinking Tools, Wisdom

You Can’t Know Everything

November 4, 2016 By Nagesh Belludi

“Have intellectual humility. Acknowledging what you don’t know is the dawning of wisdom.”
— Charlie Munger

In the course of life, some of the most dangerous circumstances to be in are when you think you’re the smartest person in the room. Smarts without humility can get you into trouble because hubris leads to intellectual arrogance and a blatant disregard for opinions and judgments that are contrary to the ones you already hold.

Recognizing that you can’t know everything and that you will never know everything must not prevent you from acting. Rather, you must embrace uncertainty and take into account the possibility that you could be wrong.

Embrace Uncertainty

Risk is what remains after you think you’ve thought of everything you currently can. Risk embraces all those matters that are unaccounted for—everything that you need to protect yourself from.

Intelligence transforms into wisdom only when you recognize that, despite your confidence in the present circumstances, you cannot predict how things will play out in the future. You will not be able to make an optimal decision every time.

The conduct of life is not a perfect science. Rather, it is an art that necessitates acknowledging and dealing with imperfect information. Be willing to act on imperfect information and uncertainty. Set a clear course today and tackle problems that arise tomorrow. Learn to adapt more flexibly to developing situations.

Idea for Impact: The wisest people I know are the ones who acknowledge that they don’t know everything and put strategies in place to shield themselves from their own ignorance. Make risk analysis and risk reduction one of the primary goals of your intellectual processes.

Wondering what to read next?

  1. It’s Probably Not as Bad as You Think
  2. A Bit of Insecurity Can Help You Be Your Best Self
  3. How to Embrace Uncertainty and Leave Room for Doubt
  4. No One Has a Monopoly on Truth
  5. Nothing Deserves Certainty

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Confidence, Conviction, Perfectionism, Risk, Wisdom

Finding Potential Problems & Risk Analysis: A Case Study on ‘The Three Faces of Eve’

June 24, 2016 By Nagesh Belludi


Risk Analysis is a Forerunner to Risk Reduction

My previous article stressed the importance of problem finding as an intellectual skill and as a definitive forerunner to any creative process. In this article, I will draw attention to another facet of problem finding: thinking through potential problems.

Sometimes people are unaware of the harmful, unintended side effects of their actions. They fail to realize that a current state of affairs may lead to problems later on. Their actions and decisions could result in outcomes that are different from those planned. Risk analysis reduces the chance of non-optimal results.

The Three Contracts of Eve

A particularly instructive example of finding potential problems and mitigating risk concerns the Hollywood classic The Three Faces of Eve (1957). This psychological drama features the true story of Chris Sizemore, who suffered from dissociative identity disorder (also called multiple personality disorder). Based on The Three Faces of Eve by her psychiatrists Corbett Thigpen and Hervey Cleckley, the movie portrays Sizemore’s three personalities, which manifest in three characters: Eve White, Eve Black, and Jane.

Before filming started on The Three Faces of Eve, the legal department of the 20th Century Fox studio insisted that Sizemore sign three separate contracts—one for each of her personalities—to protect the studio from any possible legal action. Sizemore was therefore asked to evoke “Eve White,” “Eve Black,” and “Jane,” and to sign an agreement while manifesting each respective personality. According to Aubrey Solomon’s The Films of 20th Century-Fox and the commentary on the movie’s DVD, the three signatures on the three contracts were all different, each the product of a distinct personality that Sizemore had invoked.

Idea for Impact: Risk analysis and risk reduction should be one of the primary goals of any intellectual process.

Postscript Notes

  • I recommend the movie The Three Faces of Eve for its captivating glimpse into the mind of a person afflicted with dissociative identity disorder. Actress Joanne Woodward won the 1958 Academy Award (Oscar) for best actress for her portrayal of the three Eves.
  • The automotive, aerospace, and other engineering disciplines use a formal risk-analysis procedure called “failure mode and effects analysis” (FMEA). FMEA examines the ways in which a project, system, design, or process might fail, the potential effects of those failures, and the seriousness of those effects.
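A common way to operationalize failure mode and effects analysis is the risk priority number (RPN): each failure mode is scored 1–10 for severity, occurrence, and detectability, and the product ranks which risks to tackle first. The failure modes and scores below are invented for illustration:

```python
def risk_priority_number(severity: int, occurrence: int, detection: int) -> int:
    """RPN = severity x occurrence x detection, each scored 1-10;
    higher products flag failure modes to address first."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("each FMEA score must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical failure modes: (name, severity, occurrence, detection).
modes = [("brake-line corrosion", 9, 3, 4),  # RPN 108
         ("dashboard rattle", 2, 6, 2)]      # RPN 24
ranked = sorted(modes, key=lambda m: risk_priority_number(*m[1:]), reverse=True)
print([name for name, *_ in ranked])  # highest-risk mode first
```

Ranking by RPN forces exactly the exercise this article recommends: thinking through potential problems before they surface.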

Wondering what to read next?

  1. Overcoming Personal Constraints is a Key to Success
  2. Defect Seeding: Strengthen Systems, Boost Confidence
  3. How to Stimulate Group Creativity // Book Summary of Edward de Bono’s ‘Six Thinking Hats’
  4. Turning a Minus Into a Plus … Constraints are Catalysts for Innovation
  5. Creativity by Imitation: How to Steal Others’ Ideas and Innovate

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Creativity, Critical Thinking, Innovation, Mental Models, Personality, Risk, Thinking Tools, Thought Process, Winning on the Job
