
Right Attitudes

Ideas for Impact

Risk

Why We’re So Bad At Defining Problems

July 25, 2024 By Nagesh Belludi

You can’t solve a problem unless you fully understand it. The quality of your solution is usually tied to how well you define the problem. As the often-misattributed quote goes, “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and five minutes thinking about solutions.”

Unfortunately, many organizations still haven’t embraced this crucial lesson. Problem definition is challenging because they focus on quick fixes rather than on thoroughly understanding the issues at hand.

A solution-focused culture obscures true problem identification.

In such a culture, managers feel pressured to find immediate fixes and achieve short-term goals. They also tend to fall in love with solutions too quickly, even if these solutions don’t address the real issues. Deep, evidence-based inquiry into dormant problems and potential points of failure that may have long-term impacts is often neglected. Discussing problems, especially when the organization itself might be part of the problem, is seen as taboo or a sign of weakness.

Idea for Impact: Resist Solutionist Behaviors

Develop a greater appreciation for identifying problems.

Problem identification should be an ongoing activity, helping your boss, team, and customers identify and solve the right problems while resisting inherent solutionist behaviors.

By encouraging a culture that truly falls in love with problems, not just solutions, you not only improve your chances of solving them but also pave the way for a better, less complicated organization.

Wondering what to read next?

  1. Protect the Downside with Pre-mortems
  2. Steering the Course: Leadership’s Flight with the Instrument Scan Mental Model
  3. Five Where Only One is Needed: How Airbus Avoids Single Points of Failure
  4. Empower Your Problem-Solving with the Initial Hypothesis Method
  5. Availability Heuristic: Our Preference for the Familiar

Filed Under: Leading Teams, Mental Models, Sharpening Your Skills Tagged With: Critical Thinking, Decision-Making, Learning, Mental Models, Problem Solving, Risk

Defect Seeding: Strengthen Systems, Boost Confidence

April 15, 2024 By Nagesh Belludi

Ever wondered how industries where safety and quality are paramount conduct vulnerability assessments to ensure their systems are always up to the task in critical situations? “Defect Seeding” is a method that intentionally plants faults to test the integrity of systems and the reliability of protocols, technology, and personnel.

Planting defects isn’t about causing trouble; rather, it’s a proactive assessment to ensure readiness under real-world conditions and guarantee reliable detection and rejection of faulty items. For instance, aviation security agencies conduct covert testing by planting security scenarios to assess personnel, procedures, and equipment effectiveness in spotting and handling threats.
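In software testing, defect seeding also has a quantitative use: if you know how many planted defects your testing caught, you can estimate how many genuine defects it missed (a capture-recapture style estimate). A minimal sketch, with hypothetical numbers for illustration:

```python
def estimate_remaining_defects(seeded, seeded_found, real_found):
    """Capture-recapture estimate: if testing catches seeded_found of
    `seeded` planted defects, assume it catches genuine defects at the
    same rate, so total genuine defects ~= real_found / detection_rate.
    Returns the estimated number of genuine defects still undetected."""
    detection_rate = seeded_found / seeded
    estimated_total_real = real_found / detection_rate
    return estimated_total_real - real_found

# Hypothetical run: 20 defects planted, 16 of them found,
# alongside 40 genuine defects found during the same testing pass.
print(estimate_remaining_defects(seeded=20, seeded_found=16, real_found=40))
# -> 10.0 (estimated genuine defects still lurking)
```

The estimate assumes seeded defects are as hard to find as real ones, which is the trickiest part of the technique in practice.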

Idea for Impact: Try Defect Seeding to furtively spot vulnerabilities, ensure everything’s up to par, and inform adjustments to protocols. It’s a great way to boost confidence in your systems.

Wondering what to read next?

  1. Question the Now, Imagine the Next
  2. The Solution to a Problem Often Depends on How You State It
  3. Be Smart by Not Being Stupid
  4. Finding Potential Problems & Risk Analysis: A Case Study on ‘The Three Faces of Eve’
  5. What the Rise of AI Demands: Teaching the Thinking That Thinks About Thinking

Filed Under: Mental Models, Sharpening Your Skills, The Great Innovators Tagged With: Creativity, Critical Thinking, Decision-Making, Innovation, Problem Solving, Quality, Risk, Thinking Tools, Thought Process

Ask For What You Want

February 22, 2024 By Nagesh Belludi

Don’t just sit around twiddling your thumbs, waiting for the good stuff to fall in your lap. Open your mouth, and you might just catch what you’re aiming for.

There’s no shame in reaching out for a hand. If it’s all above board, and there’s something to gain without risking much, why not give it a shot?

Fear’s gonna sneak up on you, but don’t let it hold you back. Sure, you might face a few ‘no’s or some pushback, but that’s just par for the course. It’s those rejections that pave the road to that one big ‘yes’ that could change the whole game.

Winners ask for what they want. Sure, they might face a heap of rejections, but they’re also the ones more likely to snag the big wins.

Idea for Impact: As long as your ask is ethical, ask for what you want. People who hesitate to ask usually settle for far less success than they could otherwise achieve.

Don’t settle for crumbs when you could be dining at the feast.

Wondering what to read next?

  1. A Mental Hack to Overcome Fear of Rejection
  2. Are These 3 Key Fears Blocking Your Path to Growth?
  3. How to … Strengthen The ‘Asking Muscle’
  4. How to Turn Your Fears into Fuel
  5. Resilience Through Rejection

Filed Under: Effective Communication, Mental Models, Sharpening Your Skills Tagged With: Assertiveness, Confidence, Fear, Negotiation, Personal Growth, Persuasion, Procrastination, Risk

When Bean Counters Turn Risk Managers: Lessons from the Ford Pinto Scandal

December 4, 2023 By Nagesh Belludi

During the 1970s, the Ford Pinto scandal became a notorious and impactful episode within the automotive industry. This scandal revolved around significant safety concerns and ethical dilemmas associated with the Ford Pinto, a subcompact car. At the center of this controversy was the Pinto’s design flaw, which rendered it susceptible to fuel tank fires in the event of rear-end collisions.

The Pinto’s fuel tank was located in a highly vulnerable spot just behind the rear axle. This design flaw meant that, in the unfortunate event of a rear-end collision, the fuel tank could rupture, resulting in fuel leakage and, tragically, sometimes even fatal fires. Concerns regarding the safety of the Pinto were raised both internally within Ford and externally by safety advocates and engineers.

After at least fifty-nine lives had been lost, the scandal escalated dramatically when it came to light that Ford had conducted an internal cost-benefit analysis, which demonstrated that rectifying the design flaw and enhancing the Pinto’s safety would be more expensive than potentially settling legal claims for injuries and fatalities stemming from accidents. Ford had, with unwavering determination and, at times, dubious tactics, lobbied against a crucial safety standard that would have compelled them to address the risk and redesign the Pinto’s fire-prone gas tank.

This episode served as a stark lesson for the nation in the principles of cost-benefit analysis. The cost of implementing rubber liners to fix the problem was estimated at $137 million, while a meticulous calculation of all the costs associated with those who suffered and perished amounted to only $49.5 million.
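The arithmetic behind those two figures can be reconstructed from the widely reported numbers in Ford’s internal memo; the sketch below uses those commonly cited figures for illustration, not as an endorsement of the method:

```python
# Widely reported figures from Ford's internal cost-benefit memo
# (illustrative reconstruction of the arithmetic).
deaths   = 180 * 200_000   # 180 burn deaths valued at $200,000 each
injuries = 180 * 67_000    # 180 serious burn injuries at $67,000 each
vehicles = 2_100 * 700     # 2,100 burned vehicles at $700 each

benefit_of_fix = deaths + injuries + vehicles   # ~$49.5 million
cost_of_fix    = 12_500_000 * 11                # $11 per vehicle, ~$137.5 million

print(benefit_of_fix, cost_of_fix)  # 49530000 137500000
```

The comparison looked decisive on paper precisely because human costs were priced so low, which is the moral failure the scandal exposed.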

Overall, society has made significant progress since the Ford Pinto scandal. Across various industries, from construction to healthcare, aviation to retail, automotive to hospitality, the principle of “safety first” is not merely a hollow industry slogan. Projects and endeavors now prioritize the well-being and protection of individuals, employees, and the general public.

While some may resent our increasingly litigious society and the abundance of frivolous lawsuits that burden the legal system and public resources, it is important to acknowledge that this litigious nature has played a crucial role in holding companies and regulators accountable.

Wondering what to read next?

  1. Making Tough Decisions with Scant Data
  2. Protect the Downside with Pre-mortems
  3. Knowing When to Give Up: Establish ‘Kill Criteria’
  4. Of Course Mask Mandates Didn’t ‘Work’—At Least Not for Definitive Proof
  5. Charlie Munger’s Iron Prescription

Filed Under: Business Stories, Mental Models Tagged With: Conflict, Critical Thinking, Decision-Making, Goals, Mental Models, Persuasion, Risk, Thinking Tools

Steering the Course: Leadership’s Flight with the Instrument Scan Mental Model

November 6, 2023 By Nagesh Belludi

Embarking on flight training comes with a nifty habit that instructors eagerly instill from the get-go: the art of instrument scanning.

Whispers from your instructor echo in your mind, urging you with the mantra, “Scan, scan, scan!”

Keep a Good Scan of Your Instruments, Never Be Stumped

A vital cautionary command follows closely, “Don’t stare!” You learn to effortlessly let your gaze flit from one instrument to another. The attitude indicator, heading indicator, airspeed indicator, and vertical speed indicator each hold a crucial piece of the intricate airborne puzzle.

There’s a natural instinct to fixate on a single instrument, yet doing so can lead pilots astray. Gazing at the altimeter may cause heading drift, while focusing solely on heading may compromise airspeed control.

Pilots are trained to maintain a cohesive scan of all instruments, constantly cross-checking the streams of data. By doing so, they can swiftly identify any inconsistencies, such as the attitude indicator showing a descent while the altimeter shows level flight.

With instrument scanning, pilots can promptly isolate the problematic instrument or data stream, and if necessary devise alternative plans to obtain the necessary information and ensure the aircraft’s safe and steady flight.

Just as Pilots Use Instruments in the Air, Leaders Scan Their Realm

The concept of an instrument scan mindset serves as a potent analogy for effectively managing critical information within the realm of business. Much like pilots, leaders must engage in ongoing monitoring, analysis, and cross-referencing of pertinent data. To achieve success, it’s imperative to proactively pay attention to emerging trends, maintain a steadfast focus on the larger picture, and cultivate a curious mindset.

It is of utmost importance to avoid fixating on a single metric to the detriment of considering other vital factors that could impact the business. Leaders should routinely revisit their goals, objectives, and key performance indicators (KPIs), and conduct a thorough analysis of data to discern trends, patterns, and areas of concern, all while embracing a proactive and inquisitive approach. They should be unafraid to ask tough questions, challenge assumptions, and maintain comprehensive situational awareness.

Sadly, in the world of business, this mindset is frequently overlooked. Reports are often generated, and actions taken without the rigorous cross-checking or sense-checking of the underlying data. Stakeholders become overly fixated on a single “instrument,” and in doing so, they fail to maintain a broader scan of the business landscape.
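The cross-checking discipline the analogy describes can be made concrete: rather than trusting any single metric, compare related data streams and flag readings that disagree. A toy sketch (the metric names and consistency rule are hypothetical):

```python
def cross_check(metrics, rules):
    """Flag metric pairs whose readings disagree. Each rule is
    (name_a, name_b, consistent) where consistent(a, b) returns True
    when the two readings corroborate each other."""
    return [(a, b) for a, b, consistent in rules
            if not consistent(metrics[a], metrics[b])]

# Hypothetical readings: revenue climbing while order volume falls
# should trigger a closer look, not a celebration.
metrics = {"revenue_growth": 0.12, "order_growth": -0.08}
alerts = cross_check(metrics, [
    ("revenue_growth", "order_growth",
     lambda r, o: (r >= 0) == (o >= 0)),  # expect same direction of movement
])
print(alerts)  # [('revenue_growth', 'order_growth')]
```

The point, as with the pilot’s scan, is that no single “instrument” is trusted in isolation: the alert comes from the disagreement between streams, not from any one reading.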

It is crucial to refrain from accepting data at face value, as maintaining a vigilant scan and a more extensive situational awareness is of paramount importance. Embrace the wisdom of instrument scanning to chart a course toward success, steering clear of perilous assumptions and acquiring a comprehensive understanding of your business’s performance.

Leaders Must Employ Their ‘Instruments’ for Guided Insight

Within the symbolic framework of leadership, as in flying an aircraft, the concept of instrument scanning encompasses the continual practice of gathering and interpreting information. This process is vital for making well-informed decisions, safeguarding the welfare of the organization or team, and steering a precise path toward the envisioned goals.

Much like how pilots depend on their instruments to navigate their flights safely and on the correct course, effective leadership through instrument scanning is essential. It serves as the linchpin for steering an organization or team toward triumph and preserving their vitality and stability.

Wondering what to read next?

  1. Five Where Only One is Needed: How Airbus Avoids Single Points of Failure
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. Why We’re So Bad At Defining Problems
  4. This Hack Will Help You Think Opportunity Costs
  5. Master the Middle: Where Success Sets Sail

Filed Under: Leading Teams, Mental Models, Project Management Tagged With: Aviation, Critical Thinking, Decision-Making, Discipline, Mental Models, Mindfulness, Performance Management, Problem Solving, Risk, Targets

Protect the Downside with Pre-mortems

November 2, 2023 By Nagesh Belludi

American self-help author Ryan Holiday’s The Obstacle Is the Way (2014) draws inspiration from Stoic philosophy to demonstrate how obstacles and challenges can be transformed into opportunities for personal growth and success. One recommended mindset is the pre-mortem: envisioning potential difficulties aligns with Stoic principles of accepting what one cannot control and focusing on one’s responses to external events:

In a postmortem, doctors convene to examine the causes of a patient’s unexpected death so they can learn and improve for the next time a similar circumstance arises. Outside of the medical world, we call this a number of things—a debriefing, an exit interview, a wrap-up meeting, a review—but whatever it’s called, the idea is the same: We’re examining the project in hindsight, after it happened.

A pre-mortem is different. In it, we look to envision what could go wrong, what will go wrong, in advance, before we start. Far too many ambitious undertakings fail for preventable reasons. Far too many people don’t have a backup plan because they refuse to consider that something might not go exactly as they wish. Your plan and the way things turn out rarely resemble each other. What you think you deserve is also rarely what you’ll get. Yet we constantly deny this fact and are repeatedly shocked by the events of the world as they unfold.

Idea for Impact: By embracing anticipation, you equip yourself with the tools to fortify your defenses, and in some cases, sidestep challenges altogether. You’re ready with a safety net ready to catch you if you stumble. With anticipation, you can endure.

P.S. Many industries—engineering, manufacturing, and healthcare, to name a few—use a formal, structured, systematic approach called Failure Mode and Effects Analysis (FMEA) to identify and prioritize potential failures, their causes, and their consequences. As with a pre-mortem, the primary purpose of FMEA is to proactively assess and mitigate risks by understanding how a process or system might fail and the impact of those failures.

Wondering what to read next?

  1. More Data Isn’t Always Better
  2. Be Smart by Not Being Stupid
  3. How to Solve a Problem By Standing It on Its Head
  4. Smart Folks are Most Susceptible to Overanalyzing and Overthinking
  5. Empower Your Problem-Solving with the Initial Hypothesis Method

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Critical Thinking, Decision-Making, Mental Models, Problem Solving, Risk, Thinking Tools, Wisdom

The Enron Scandal: A Lesson on Motivated Blindness

July 19, 2023 By Nagesh Belludi

The fallout from the Enron fiasco had far-reaching effects on the economy and the public’s trust in corporations. It serves as a powerful lesson in the dangers of motivated blindness—when individuals have a personal stake in unethical actions, they often look the other way or find ways to rationalize their behavior.

The folks at Arthur Andersen, serving as Enron’s external auditor, found themselves in a precarious situation. On the one hand, they were supposed to ensure financial integrity, but on the other hand, they acted as consultants, aiding Enron in manipulating financial transactions to deceive investors and manipulate earnings. Enron generously poured hefty fees their way, with auditing fees exceeding $25 million and consulting fees reaching $27 million in 2001. So, why would they want to put an end to this lucrative gravy train? To complicate matters further, many auditors from Andersen were eagerly vying for coveted positions at Enron, just like their fortunate colleagues.

To combat motivated blindness, it’s crucial to reflect on our biases, hold ourselves accountable, and actively seek out diverse perspectives to gain a broader understanding of any given issue. Max Bazerman, a professor at Harvard Business School and author of The Power of Noticing: What the Best Leaders See (2014), asserts that individuals can overcome their inclination to overlook vital clues by fostering a “noticing mindset.” This involves consistently asking oneself and others, both within and outside the organization, the question: “Which critical threats and challenges might we be neglecting?”

Wondering what to read next?

  1. Power Inspires Hypocrisy
  2. Why Groups Cheat: Complicity and Collusion
  3. The Poolguard Effect: A Little Power, A Big Ego!
  4. Power Corrupts, and Power Attracts the Corruptible
  5. Virtue Deferred: Marcial Maciel, The Catholic Church, and How Institutions Learn to Look Away

Filed Under: Business Stories, Leadership, Mental Models, Sharpening Your Skills Tagged With: Discipline, Ethics, Getting Along, Integrity, Leadership, Motivation, Psychology, Risk

Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

July 10, 2023 By Nagesh Belludi

Picture this: You’re parking your car when, suddenly, you catch sight of the bus you desperately need to catch pulling into the station. Acting on instinct, you swiftly navigate your car into a vacant spot, deftly gather your bags, and launch yourself towards the bus stop, driven by an unwavering determination to evade a tedious fifteen-minute wait for the next one. In the whirlwind of your frantic sprint, you absentmindedly and hastily tuck your cherished cell phone into your back pocket, oblivious that it slips out during the adrenaline-fueled pursuit of catching the bus. It’s only after another five minutes that you become aware of your cell phone’s absence, and the weight of its loss gradually descends upon you.

Isn’t it fascinating how our minds tend to close off under time pressure? This cognitive phenomenon is known as the “narrowing of the cognitive map.” It’s as if our attention becomes laser-focused, but unfortunately, that very focus can lead us into serious errors of judgment.

When we find ourselves in the clutches of tunnel vision, our thinking becomes constrained, and we unknowingly fall into the trap of limited perspective. Not only do we become so fixated on a specific course of action that we overlook crucial details in our environment, but we also become oblivious to the subtle signals whispering, “Something’s amiss.”

Inattentional blindness, indeed. It’s a common problem in high-stress situations, and it can have serious consequences, as in the following case study of the Singapore Airlines Flight 6 crash.

Speed Stress Causes Serious Breakdowns in the Reliability of Judgment

Flight 6’s tragic accident occurred on October 31, 2000, at Taipei’s Chiang Kai-shek International Airport. Various factors contributed to the crash, including severe weather conditions, limited visibility, inadequate airport markings, and insufficient actions taken by both the pilots and air traffic controllers.

During a scheduled stop in Taipei on its journey from Singapore to Los Angeles, Flight 6’s flight crew became aware of an approaching storm. They realized that if they delayed the takeoff, they would have to wait for the storm to pass, resulting in a lengthy 12-hour delay. This interruption would have entailed making overnight arrangements for the passengers, disrupting the crew’s schedule, and potentially impacting future flight schedules involving the aircraft and company personnel. Consequently, the crew made the decision to expedite the departure and take off before the typhoon made landfall on the island.

The Rushed Pilots Missed Clues That They Were Taking Off on a Closed Runway

Under immense time pressure, the flight crew became singularly focused on expediting their takeoff in rainy and windy conditions before the weather conditions deteriorated further. Despite being instructed to taxi to Runway 05 Left, they deviated from the assigned route and instead positioned themselves on Runway 05 Right, which was closed for takeoff due to ongoing pavement repairs.

Complicating matters, a section of Runway 05 Right was still being used as a taxiway during the construction period. The signage at the entrance of the runway did not adequately indicate the presence of a stop sign and construction equipment along the converted taxiway.

Moreover, the local air traffic controller failed to provide progressive taxi or ground movement instructions, which would have been appropriate considering the low visibility during the taxi. However, due to the crew’s heightened sense of urgency, they neglected to request step-by-step instructions for their taxi route.

Misleading Airport Markings Contributed to Pilots’ Mistaken Belief of Correct Runway Selection

In the midst of low visibility and feeling rushed, the pilots neglected crucial resources that could have guided them to the correct runway, such as runway and taxiway charts, signage, markings, and cockpit instruments. This lapse in judgment resulted in a loss of situational awareness, leading them to initiate takeoff from a runway closed for construction.

Approximately 3,300 feet down the runway, around 11:17 PM that night, the Boeing 747 collided with concrete barriers and construction equipment, resulting in the aircraft breaking apart and bursting into flames.

Tragically, 83 out of the 179 people on board lost their lives.

The crew’s loss of awareness was further compounded by the airport’s negligence in terms of maintenance and safety precautions. By failing to place mandatory construction warnings at the entrance of Runway 05 Right, they disregarded the potential risk of aircraft mistakenly attempting to take off from a partially closed runway.

The air traffic controllers also neglected to verify the aircraft’s position before granting takeoff clearances, despite the aircraft having turned onto Runway 05 Right. The airport lacked the necessary Airport Surface Detection Equipment, which could have been crucial in detecting and mitigating risks, especially given the heavy precipitation that could have hampered radar presentation at the time. In their defense, the pilots had assumed that the air traffic controllers could visually observe the aircraft, and the fact that takeoff clearance was issued just as the aircraft turned onto the taxiway gave them the impression that everything was in order.

Anxiety Leads to Attentional Tunneling and Narrowed Field of Focus

The tragedy of Singapore Airlines Flight 6 serves as a poignant case study highlighting the dangers of tunnel vision and its ability to hinder our perspective and decision-making.

Often, seemingly minor errors, when combined with time constraints and cognitive biases, can intertwine and escalate, leading to catastrophic outcomes. A chain of minor errors can turn even a highly advanced cockpit, a complex system with numerous safeguards, into a deadly trap.

The human brain is naturally inclined to seek confirmation and convince itself that it completely understands the situation at hand. When faced with contradictory information, we tend to ignore it and focus solely on our preconceived notions. Furthermore, anxiety further impairs our ability to perceive the entire situation, leaving us prone to impulsive actions rather than rational responses.

It is vital to be aware of the perils of tunnel vision. It can close our eyes to the broader context and limit our capacity to consider peripheral information. This narrowed perception can have severe consequences, emphasizing the importance of maintaining a broader perspective in decision-making.

Wondering what to read next?

  1. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  2. “Fly the Aircraft First”
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Aviation, Biases, Conflict, Decision-Making, Mindfulness, Problem Solving, Risk, Stress, Worry

After Action Reviews: The Heartbeat of Every Learning Organization

June 15, 2023 By Nagesh Belludi

The After Action Review (AAR) is a formal group reflection process used by the military and other organizations to analyze critical decisions or moves.

At its core, the AAR seeks to answer four questions: What was planned, what actually happened, why did it happen, and how can we do better next time?

The focus isn’t on grading success or failure but on identifying weaknesses that need improvement and strengths that should be sustained. The knowledge gained from the AAR can then be shared with others who are planning, developing, implementing, and evaluating similar efforts.

Conducted in an open and honest climate, the AAR involves candid discussions of actual performance results compared to objectives. It requires input and perspectives from all stakeholders involved in the project or activity. The goal is to ensure everybody feels part of the solution, not the problem.

AARs are a powerful tool for continuous improvement, enabling organizational learning by reinforcing personal and organizational accountability and continuously assessing performance successes and failures. They’re an excellent way to identify best practices (what should be spread) and errors (what shouldn’t be repeated).

The wisest people and businesses make a habit of reflecting ex post facto. As the saying goes, “He that will not reflect is a ruined man.”

Wondering what to read next?

  1. Defect Seeding: Strengthen Systems, Boost Confidence
  2. Making Tough Decisions with Scant Data
  3. Be Smart by Not Being Stupid
  4. Question the Now, Imagine the Next
  5. The Solution to a Problem Often Depends on How You State It

Filed Under: MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Creativity, Critical Thinking, Decision-Making, Meetings, Problem Solving, Risk, Teams, Thought Process

Availability Heuristic: Our Preference for the Familiar

May 27, 2023 By Nagesh Belludi

The availability heuristic is a cognitive bias that can lead people to rely on readily available information or emotionally charged and inherently interesting examples when making decisions or judgments. Essentially, individuals tend to overestimate the probability of events that are easy to recall or that they’ve personally experienced, while underestimating the likelihood of less memorable or less frequent events.

In other words, the ease of retrieval of a misleading cue may make people rely on evidence not because it is dependable but because it is memorable or striking and thus psychologically available to them. They may do so even if the evidence is not logically acceptable or does not logically support their decision.

Doctors often depend on recalling their past dramatic cases and mistakenly apply them to the current situation. People may overestimate the crime rate in their community based on news coverage, even though crime rates may be relatively low. People may dismiss the reality of climate change if they’ve recently experienced a cold winter or heard of a cold snap in a particular region, even though global warming is a long-term trend. Individuals are more likely to purchase insurance after experiencing a natural disaster than before it occurs. In each of these scenarios, the vivid and emotional evidence feels more persuasive, even though it is not necessarily the most accurate or reliable information.

The availability heuristic can also shape people’s perceptions of air travel safety and lead them to believe that flying is more dangerous than it really is. Airplane accidents are often sensationalized and highly publicized by the media, making them more memorable and more prominent in people’s minds. This can cause individuals to perceive the risk of flying as much higher than it actually is, leading them to avoid air travel even though it is statistically one of the safest forms of transportation. In reality, many less vivid and less memorable (i.e., psychologically unavailable) things are much more dangerous than air travel, such as falling down stairs, drowning, choking, and accidental poisoning.

Avoid falling prey to the availability heuristic and making serious misjudgments about the risks associated with different situations. Acknowledge that personal experiences and recent events may not accurately reflect the overall reality of the situation.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  4. Be Smart by Not Being Stupid
  5. How to Guard Against Anything You May Inadvertently Overlook

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Mental Models, Problem Solving, Psychology, Risk, Thinking Tools

