
Right Attitudes

Ideas for Impact

Risk

The Enron Scandal: A Lesson on Motivated Blindness

July 19, 2023 By Nagesh Belludi

The fallout from the Enron fiasco had far-reaching effects on the economy and the public’s trust in corporations. It serves as a powerful lesson in the dangers of motivated blindness—when individuals have a personal stake in unethical actions, they often look the other way or find ways to rationalize their behavior.

The folks at Arthur Andersen, serving as Enron’s external auditor, found themselves in a precarious situation. On the one hand, they were supposed to ensure financial integrity; on the other, they acted as consultants, helping Enron structure financial transactions to deceive investors and inflate earnings. Enron generously poured hefty fees their way, with auditing fees exceeding $25 million and consulting fees reaching $27 million in 2001. So, why would they want to put an end to this lucrative gravy train? To complicate matters further, many Andersen auditors were vying for coveted positions at Enron, hoping to follow colleagues who had already made the move.

To combat motivated blindness, it’s crucial to reflect on our biases, hold ourselves accountable, and actively seek out diverse perspectives to gain a broader understanding of any given issue. Max Bazerman, a professor at Harvard Business School and author of The Power of Noticing: What the Best Leaders See (2014), asserts that individuals can overcome their inclination to overlook vital clues by fostering a “noticing mindset.” This involves consistently asking oneself and others, both within and outside the organization: “Which critical threats and challenges might we be neglecting?”

Wondering what to read next?

  1. Power Inspires Hypocrisy
  2. Why Groups Cheat: Complicity and Collusion
  3. The Poolguard Effect: A Little Power, A Big Ego!
  4. Power Corrupts, and Power Attracts the Corruptible
  5. Why New Managers Fail to Stop Unethical Behavior Among Subordinates

Filed Under: Business Stories, Leadership, Mental Models, Sharpening Your Skills Tagged With: Discipline, Ethics, Getting Along, Integrity, Leadership, Motivation, Psychology, Risk

Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

July 10, 2023 By Nagesh Belludi

Picture this: You’re parking your car when you catch sight of the bus you desperately need to catch pulling into the station. Acting on instinct, you swing into a vacant spot, gather your bags, and launch yourself toward the bus stop, determined to avoid a tedious fifteen-minute wait for the next one. In the whirlwind of your sprint, you hastily tuck your cherished cell phone into your back pocket, oblivious that it slips out during the adrenaline-fueled pursuit. Only five minutes later do you become aware of your phone’s absence, and the weight of its loss gradually descends upon you.

Isn’t it fascinating how our minds tend to close off under time pressure? This cognitive phenomenon is known as the “narrowing of the cognitive map.” It’s as if our attention becomes laser-focused, but that very focus can lead us into errors of judgment.

When we find ourselves in the clutches of tunnel vision, our thinking becomes constrained, and we unknowingly fall into the trap of limited perspective. Not only do we become so fixated on a specific course of action that we overlook crucial details in our environment, but we also become oblivious to the subtle signals whispering, “Something’s amiss.”

Inattentional blindness, indeed. It’s a common problem in high-stress situations, and it can have serious consequences, as in the following case study of the Singapore Airlines Flight 6 crash.

Speed Stress Causes Serious Breakdowns in the Reliability of Judgment

Flight 6’s tragic accident occurred on October 31, 2000, at Taipei’s Chiang Kai-shek International Airport. Various factors contributed to the crash, including severe weather conditions, limited visibility, inadequate airport markings, and insufficient actions taken by both the pilots and air traffic controllers.

During a scheduled stop in Taipei on its journey from Singapore to Los Angeles, Flight 6’s flight crew became aware of an approaching storm. They realized that if they delayed the takeoff, they would have to wait for the storm to pass, resulting in a lengthy 12-hour delay. This interruption would have entailed making overnight arrangements for the passengers, disrupting the crew’s schedule, and potentially impacting future flight schedules involving the aircraft and company personnel. Consequently, the crew made the decision to expedite the departure and take off before the typhoon made landfall on the island.

The Rushed Pilots Missed Clues That They Were Taking Off on a Closed Runway

Under immense time pressure, the flight crew became singularly focused on expediting their takeoff in rainy and windy conditions before the weather conditions deteriorated further. Despite being instructed to taxi to Runway 05 Left, they deviated from the assigned route and instead positioned themselves on Runway 05 Right, which was closed for takeoff due to ongoing pavement repairs.

Complicating matters, a section of Runway 05 Right was still being used as a taxiway during the construction period. The signage at the entrance of the runway did not adequately indicate the presence of a stop sign and construction equipment along the converted taxiway.

Moreover, the local air traffic controller failed to provide progressive taxi or ground movement instructions, which would have been appropriate considering the low visibility during the taxi. However, due to the crew’s heightened sense of urgency, they neglected to request step-by-step instructions for their taxi route.

Misleading Airport Markings Contributed to Pilots’ Mistaken Belief of Correct Runway Selection

In the midst of low visibility and feeling rushed, the pilots neglected crucial resources that could have guided them to the correct runway, such as runway and taxiway charts, signage, markings, and cockpit instruments. This lapse in judgment resulted in a loss of situational awareness, leading them to initiate takeoff from a runway closed for construction.

The Harsh Reality of Rushing: Examining the Aftermath of Singapore Airlines Flight 6’s Closed Runway Mishap

Approximately 3,300 feet down the runway, around 11:17 PM that night, the Boeing 747 collided with concrete barriers and construction equipment, breaking apart and bursting into flames.

Tragically, 83 out of the 179 people on board lost their lives.

The crew’s loss of awareness was further compounded by the airport’s negligence in terms of maintenance and safety precautions. By failing to place mandatory construction warnings at the entrance of Runway 05 Right, they disregarded the potential risk of aircraft mistakenly attempting to take off from a partially closed runway.

The air traffic controllers also neglected to verify the aircraft’s position before granting takeoff clearances, despite the aircraft having turned onto Runway 05 Right. The airport lacked the necessary Airport Surface Detection Equipment, which could have been crucial in detecting and mitigating risks, especially given the heavy precipitation that could have hampered radar presentation at the time. In their defense, the pilots had assumed that the air traffic controllers could visually observe the aircraft, and the fact that takeoff clearance was issued just as the aircraft turned onto the taxiway gave them the impression that everything was in order.

Anxiety Leads to Attentional Tunneling and Narrowed Field of Focus

The tragedy of Singapore Airlines Flight 6 serves as a poignant case study highlighting the dangers of tunnel vision and its ability to hinder our perspective and decision-making.

Often, seemingly minor errors, when combined with time constraints and cognitive biases, can intertwine and escalate, leading to catastrophic outcomes. Even a highly advanced cockpit and a complex system with numerous safeguards can be transformed into a deadly trap by a chain of minor errors.

The human brain is naturally inclined to seek confirmation and convince itself that it completely understands the situation at hand. When faced with contradictory information, we tend to ignore it and focus solely on our preconceived notions. Anxiety further impairs our ability to perceive the entire situation, leaving us prone to impulsive actions rather than rational responses.

It is vital to be aware of the perils of tunnel vision. It can close our eyes to the broader context and limit our capacity to consider peripheral information. This narrowed perception can have severe consequences, emphasizing the importance of maintaining a broader perspective in decision-making.

Wondering what to read next?

  1. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  2. “Fly the Aircraft First”
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Aviation, Biases, Conflict, Decision-Making, Mindfulness, Problem Solving, Risk, Stress, Worry

After Action Reviews: The Heartbeat of Every Learning Organization

June 15, 2023 By Nagesh Belludi

The After Action Review (AAR) is a formal group reflection process used by the military and other organizations to analyze critical decisions or moves.

At its core, the AAR seeks to answer four questions: What was planned, what actually happened, why did it happen, and how can we do better next time?

The focus isn’t on grading success or failure but on identifying weaknesses that need improvement and strengths that should be sustained. The knowledge gained from the AAR can then be shared with others who are planning, developing, implementing, and evaluating similar efforts.

Conducted in an open and honest climate, the AAR involves candid discussions of actual performance results compared to objectives. It requires input and perspectives from all stakeholders involved in the project or activity. The goal is to ensure everybody feels part of the solution, not the problem.

AARs are a powerful tool for continuous improvement. They enable organizational learning by reinforcing personal and organizational accountability and by prompting continuous assessment of performance successes and failures. They’re an excellent way to identify best practices (what should be spread) and errors (what shouldn’t be repeated).
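For teams that log their reviews, the four AAR questions map naturally onto a simple structured record. Here’s a minimal Python sketch; the class name, fields, and the example entries are illustrative, not from the article:

```python
# A minimal, hypothetical template for capturing an After Action Review's
# four questions: what was planned, what happened, why, and how to improve.
from dataclasses import dataclass, field

@dataclass
class AfterActionReview:
    planned: str                                      # What was planned?
    actual: str                                       # What actually happened?
    why: str                                          # Why did it happen?
    improvements: list = field(default_factory=list)  # How can we do better?
    sustain: list = field(default_factory=list)       # Strengths to keep doing

    def summary(self) -> str:
        """Render the review as a short plain-text report."""
        return (f"Planned: {self.planned}\n"
                f"Actual: {self.actual}\n"
                f"Why: {self.why}\n"
                f"Improve: {', '.join(self.improvements) or 'none noted'}\n"
                f"Sustain: {', '.join(self.sustain) or 'none noted'}")

# Illustrative example entry
aar = AfterActionReview(
    planned="Ship the release by Friday",
    actual="Shipped the following Monday",
    why="Underestimated integration-testing effort",
    improvements=["Start integration tests a week earlier"],
    sustain=["Daily stand-up flagged the slip early"],
)
print(aar.summary())
```

The point of the structure is the same as the article’s: the record grades neither success nor failure; it captures what to spread and what not to repeat, in a form that can be shared with other teams.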

The wisest people and businesses make a habit of reflecting after the fact. As the saying goes, “He that will not reflect is a ruined man.”

Wondering what to read next?

  1. Defect Seeding: Strengthen Systems, Boost Confidence
  2. Making Tough Decisions with Scant Data
  3. Be Smart by Not Being Stupid
  4. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  5. How to Solve a Problem By Standing It on Its Head

Filed Under: MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Creativity, Critical Thinking, Decision-Making, Meetings, Problem Solving, Risk, Teams, Thought Process

Availability Heuristic: Our Preference for the Familiar

May 27, 2023 By Nagesh Belludi

The availability heuristic is a cognitive bias that can lead people to rely on readily available information or emotionally charged and inherently interesting examples when making decisions or judgments. Essentially, individuals tend to overestimate the probability of events that are easy to recall or that they’ve personally experienced, while underestimating the likelihood of less memorable or less frequent events.

In other words, the ease of retrieval of a misleading cue may make people rely on evidence not because it is dependable but because it is memorable or striking and thus psychologically available to them. They may do so even if the evidence is not logically acceptable or does not logically support their decision.

Doctors often depend on recalling their past dramatic cases and mistakenly apply them to the current situation. People may overestimate the crime rate in their community based on news coverage, even though crime rates may be relatively low. People may dismiss the reality of climate change if they’ve recently experienced a cold winter or heard of a cold snap in a particular region, even though global warming is a long-term trend. Individuals are more likely to purchase insurance after experiencing a natural disaster than before it occurs. In each of these scenarios, the vivid and emotional evidence feels more persuasive, even though it isn’t the most accurate or reliable information.

The availability heuristic can also shape people’s perceptions of air travel safety and lead them to believe that flying is more dangerous than it really is. Airplane accidents are often sensationalized and highly publicized by the media, making them more memorable and more prominent in people’s minds. This can cause individuals to perceive the risk of flying as much higher than it actually is, leading them to avoid air travel even though it is statistically one of the safest forms of transportation. In reality, many less vivid and less memorable (i.e., psychologically unavailable) things are much more dangerous than air travel, such as falling down stairs, drowning, choking, and accidental poisoning.

Avoid falling prey to the availability heuristic and making serious misjudgments about the risks associated with different situations. Acknowledge that personal experiences and recent events may not accurately reflect the overall reality of the situation.

Wondering what to read next?

  1. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  2. The Boeing 737 MAX’s Achilles Heel
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  5. How to Guard Against Anything You May Inadvertently Overlook

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Mental Models, Problem Solving, Psychology, Risk, Thinking Tools

Maximize Your Chance Possibilities & Get Lucky

April 27, 2023 By Nagesh Belludi

British psychologist Richard Wiseman’s Luck Factor (2003) explores what makes some people lucky and others unlucky.

Being lucky is a mindset to bring to life. Lucky people maximize their chances of creating and noticing a lucky opportunity. They listen to their intuition when they get an opportunistic hunch.

The book’s core premise is that whether you’re generally lucky or unlucky depends on your attitude: an optimistic mindset is a self-fulfilling prophecy. Lucky people expect good fortune; they expect good things to happen in their life. When they do have a run of bad luck, they adopt a resilient attitude and somehow turn it into good luck.

Idea for Impact: Lucky people aren’t lucky by sheer accident. To maximize your chances of getting lucky, expose yourself to more opportunities. Get out there more often, produce more work, and talk to more people. Be open to the world and ready for new opportunities.

Wondering what to read next?

  1. Luck Doesn’t Just Happen
  2. How to Guard Against Anything You May Inadvertently Overlook
  3. Question Success More Than Failure
  4. Gambler’s Fallacy is the Failure to Realize How Randomness Rules Our World
  5. Overcoming Personal Constraints is a Key to Success

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Biases, Books for Impact, Creativity, Luck, Risk, Thinking Tools

How to … Stop That Inner Worrywart

February 22, 2023 By Nagesh Belludi

I’m one of those incessant worrywarts. Risk mitigation is a significant facet of my work. Thus, I worry about the prospect of non-optimal results; I worry about the unintended side effects of my decisions; and I worry about what people aren’t telling me. I even worry that I worry too much (now, that worry is entirely unfounded).

If, like many people, you’d like to worry less, you may find the following approaches helpful. Most of my over-worrying comes from thinking ahead, so after a reasonable effort to understand the risks and make plans to adapt flexibly to developing situations, I’ll let up. I’ll self-talk as though I’m addressing a team: “Not everything is within our control. We’ll cross that bridge when we come to it. Let’s deal with it as it appears and course-correct.” Beyond that, I’ll get busy with something else that keeps me too occupied to fret about whatever worried me.

Wondering what to read next?

  1. Everything in Life Has an Opportunity Cost
  2. How to … Combat Those Pesky Distractions That Keep You From Living Fully
  3. Dear Hoarder, Learn to Let Go
  4. Thinking Straight in the Age of Overload // Book Summary of Daniel Levitin’s ‘The Organized Mind’
  5. Take this Quiz and Find Out if You’re a Perfectionist

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Balance, Clutter, Decision-Making, Perfectionism, Procrastination, Risk

How to … Plan in a Time of Uncertainty

January 25, 2023 By Nagesh Belludi

In periods of uncertainty and ambiguity, move away from annual plans and focus on the next three months. Reflect on the unpredictability of the future and stay on your toes by forging plans for unexpected scenarios so you won’t be caught flat-footed when that time comes.

Establish “trigger points” and accelerate, maintain, or terminate criteria in advance, and keep an eye on key indicators so you know whether to “wait and see” or “stay the course” should one of your planned-for scenarios materialize.
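Pre-committed trigger points boil down to a simple rule: decide the thresholds now, then let the indicator, not the mood of the moment, pick the action. A hypothetical sketch in Python; the indicator, thresholds, and action names are all made up for illustration:

```python
# Hypothetical "trigger point" evaluation: thresholds are agreed in advance,
# then a watched indicator is mapped to a pre-committed action.
def decide(indicator: float, accelerate_at: float, terminate_at: float) -> str:
    """Map a key indicator reading to accelerate/maintain/terminate."""
    if indicator >= accelerate_at:
        return "accelerate"      # scenario improved past the trigger point
    if indicator <= terminate_at:
        return "terminate"       # scenario deteriorated past the trigger point
    return "maintain"            # i.e., "stay the course" / "wait and see"

# Example: a made-up monthly demand index with pre-set trigger points
assert decide(indicator=1.3, accelerate_at=1.2, terminate_at=0.8) == "accelerate"
assert decide(indicator=1.0, accelerate_at=1.2, terminate_at=0.8) == "maintain"
assert decide(indicator=0.7, accelerate_at=1.2, terminate_at=0.8) == "terminate"
```

The discipline is in setting `accelerate_at` and `terminate_at` before the quarter begins; revisiting thresholds mid-stream defeats the purpose of planning for uncertainty.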

Idea for Impact: When the horizon is much shorter, operate with agility and allocate your resources in real time.

Wondering what to read next?

  1. Making Tough Decisions with Scant Data
  2. Gut Instinct as Compressed Reason—Why Disney Walked Away from Twitter in 2016
  3. When Bean Counters Turn Risk Managers: Lessons from the Ford Pinto Scandal
  4. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Leadership, MBA in a Nutshell, Mental Models Tagged With: Adversity, Conflict, Decision-Making, Persuasion, Problem Solving, Risk

Be Smart by Not Being Stupid

December 12, 2022 By Nagesh Belludi

No superhuman ability is required to dodge the many foolish choices to which we’re prone. A few basic rules are all that’s needed to shield you, if not from all errors, at least from the silly ones.

Charlie Munger often emphasizes that minimizing mistakes may be one of the least appreciated tricks in successful investing. He has reputedly credited much of Berkshire Hathaway’s success to consistently avoiding stupidity. “It is remarkable how much long-term advantage we have gotten by trying to be consistently not stupid instead of trying to be very intelligent.” And, “I think part of the popularity of Berkshire Hathaway is that we look like people who have found a trick. It’s not brilliance. It’s just avoiding stupidity.” They’ve avoided investing in situations they don’t understand or lack experience with.

As a policy, avoiding stupidity in investing shouldn’t mean avoiding risk wholly; instead, it’s taking on risk only when there’s a fair chance that you’ll be adequately rewarded for assuming that risk.

Idea for Impact: Tune out stupidity. Becoming successful in life isn’t always about what you do but what you don’t do. In other words, improving decision quality is often more about decreasing your chances of failure than increasing your chances of success.

Wondering what to read next?

  1. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  2. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  3. Making Tough Decisions with Scant Data
  4. Question the Now, Imagine the Next
  5. Protect the Downside with Pre-mortems

Filed Under: MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Problem Solving, Risk, Thinking Tools, Thought Process, Wisdom

Do Your Employees Feel Safe Enough to Tell You the Truth?

August 15, 2022 By Nagesh Belludi

Take any corporate scandal or the Challenger and Columbia disasters, and you’ll find lower-ranking voices that tried to be heard within these organizations to prevent or minimize the consequences of the excesses or the accidents.

Some leaders are too isolated from reality and establish an “all’s-good” guise whereby anything other than an affirmative becomes an undesirable, even unwelcome, answer to a performance-related question. Such leaders foster a “good-news culture,” where any truth-teller or devil’s advocate is quickly dismissed. Queries such as the cursory “Is everything okay?” elicit information-free non-answers like “yes” and “great!”

When leaders are disconnected from reality, they become incontestably “right.” Employees know the rule of the game is to say what’s safe to say. To not tell the truth. To tell the leader just what she wants to hear. Employees would rather go with the flow than speak truth to power.

Consequently, business pressures often lead to shortcuts that go overlooked. Risk is normalized. Leaders who cannot tap into the truth get blindsided when the problems blow up because they didn’t nip the problems in the bud. Leaders have only themselves to blame when things go wrong.

Idea for Impact: Insightful leadership isn’t about the privilege of position but the privilege of information flowing upwards. Wise leaders dare to seek information they don’t want to hear. They know how to ask the right questions, look for revealing details, and set up a culture of openness that makes it easy for employees to tell the truth.

Wondering what to read next?

  1. Talk to Your Key Stakeholders Every Week
  2. Making Tough Decisions with Scant Data
  3. No Boss Likes a Surprise—Good or Bad
  4. A Superb Example of Crisis Leadership in Action
  5. You Can’t Serve Two Masters

Filed Under: Effective Communication, Leading Teams, Managing People, MBA in a Nutshell Tagged With: Critical Thinking, Delegation, Great Manager, Leadership, Managing the Boss, Problem Solving, Relationships, Risk

Sometimes a Conflict is All About the Process

July 27, 2022 By Nagesh Belludi

There’s a considerable difference between a “decision conflict” and a “process conflict,” and it’s necessary to disentangle the two.

A decision conflict is about which choice to make. A process conflict is about the approach, e.g., where a choice has lacked rigorous deliberation (haste, a lack of participation from essential stakeholders, contempt for shared priorities, inattention to the tradeoffs, and so forth). A sound decision is one that ensues from a sufficiently meticulous thought process, even if the decision proves defective in the fullness of time.

Idea for Impact: Worry about bad decision processes. Make the “how” the anchor for your decision-making. Improving the quality of decisions means developing better frameworks for making those decisions.

Wondering what to read next?

  1. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  2. Making Tough Decisions with Scant Data
  3. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  4. How To … Be More Confident in Your Choices
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Confidence, Conflict, Decision-Making, Risk, Thought Process


Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!