
Risk

The Power of Negative Thinking

May 21, 2020 By Nagesh Belludi

Stoic philosophy recommends a practice called premeditatio malorum (“the premeditation of evils”), i.e., intentionally visualizing the worst-case scenario in your mind’s eye.

The first point is to acknowledge that misfortunes and difficulties could, rather than certainly will, come about. The second is to envisage your most constructive response should the worst-case scenario transpire. For instance, if you were to lose your job due to the coronavirus, what resources could you rely on, and how could you handle the consequences?

The direct benefit of premeditatio malorum is in taming your anxiety: when you soberly conjure up how bad things could go, you typically reckon that you could indeed cope. You won’t dwell on the negative thoughts. Even the worst possible scenario wouldn’t be so terrible after all.

Another surprising benefit of negative visualization is in raising your awareness that you could lose your relationships, possessions, routines, blessings, and everything else that you currently enjoy—but perhaps take for granted. This increases your gratitude for having them now.

This Stoic exercise has an equivalent in Buddhist meditation-based mindfulness practices that encourage nonjudgmental awareness of unpleasant sensations (the vedanā).

Your emotions, sensations, and events are in flux. They arise and pass. You’re merely to regard yourself as the observer of these thoughts and feelings, but you’re not to identify with them. You are not your thoughts … you are not your feelings. The Buddhist teacher Jack Kornfield writes in The Wise Heart: A Guide to the Universal Teachings of Buddhist Psychology (2015):

Thoughts and opinions arise but they think themselves and disappear, “like bubbles on the Ganges,” says the Buddha. When we do not cling to them, they lose their hold on us. In the light of awareness, the constructed self of our identification relaxes. And what is seen is just the process of life, not self nor other, but life unfolding as part of the whole.

Idea for Impact: Could you benefit from reflecting on how you think of potential negative events?

An awareness of the possible—and the self-determining attitude—can be quite liberating. Premeditatio malorum is a surprisingly useful technique, albeit with a scary name.

“What then should each of us say as each hardship befalls us? It was for this that I was exercising, it was for this that I was training,” as Epictetus philosophized in Discourses (3.10.7–8).

Wondering what to read next?

  1. Cope with Anxiety and Stop Obsessive Worrying by Creating a Worry Box
  2. Expressive Writing Can Help You Heal
  3. Get Everything Out of Your Head
  4. This May Be the Most Potent Cure for Melancholy
  5. How Thought-Stopping Can Help You Overcome Negative Thinking and Get Unstuck

Filed Under: Living the Good Life, Sharpening Your Skills Tagged With: Adversity, Anxiety, Conversations, Emotions, Introspection, Mindfulness, Resilience, Risk, Stress, Suffering, Worry

The Biggest Disaster and Its Aftermath // Book Summary of Serhii Plokhy’s ‘Chernobyl: History of a Tragedy’

May 11, 2020 By Nagesh Belludi

I visited the Chernobyl Exclusion Zone last year. This 2,600 sq km (1,000 sq mi) region spanning Ukraine and Belarus is the ghastly site of the greatest peacetime nuclear disaster in history. Yes, it’s safe enough to visit—with precautions, of course. [Read travel writer Cameron Hewitt’s worthwhile trip-report.]

Chernobyl is a gripping testimony to the perils of hubris and a poignant monument to the untold misery it imposed upon swathes of people.

To round out my learning from the trip, I recently read Chernobyl: History of a Tragedy (2019), Harvard historian Serhii Plokhy’s haunting account of the nuclear disaster.

An Accident That Was Waiting to Happen

At 1:23 A.M. on 26 April 1986, an experimental safety test at Unit 4 of the Vladimir Ilyich Lenin Nuclear Power Plant complex in Chernobyl went dreadfully wrong. The test instigated a power surge. The reactor exploded and burst open, spewing a plume of radioactive elements into the atmosphere.

The discharge amounted to some 400 times more radioactive material than from the Hiroshima atomic bomb. Deputy Chief Engineer Anatoly Dyatlov, who was in charge of the calamitous test, called the ensuing meltdown “a picture worthy of the pen of the great Dante.” Sixty percent of the radioactive fallout came to settle in Belarus. Winds carried radioactive elements all the way to Scandinavia.

Right away, hundreds of firefighters and security forces committed themselves to stabilizing the reactor and stopping the fires from spreading to the other reactors. In so doing, they exposed themselves to dangerous, often fatal, doses of radiation; the survivors spent the rest of their lives grappling with serious health problems.

The world first learned of the accident when abnormal radiation levels were detected at one of Sweden’s nuclear facilities some 52 hours after the accident. It took the Soviet regime three days to acknowledge the meltdown publicly, “There has been an accident at the Chernobyl atomic-electricity station.” Soviet leader Mikhail Gorbachev addressed the nation 18 days after the accident, “The first time we have encountered in reality such a sinister force of nuclear energy that has escaped control.”

A Soviet Dream Town Then, a Graveyard of Dreams Now

The ghost town of Pripyat, a purpose-built workers’ settlement a mile from the nuclear plant, seized my mind’s eye. It was one of the Soviet Union’s most desirable communities, and 50,000 people lived there when the accident happened. Today, it’s a post-apocalyptic time warp—full of all kinds of dilapidated civic structures that once showcased the ideal Soviet lifestyle.

Pripyat was evacuated entirely on the afternoon of the day after the disaster. Left to rot, the town has been completely overtaken by nature. A Ferris wheel—completed two weeks before the explosion, but never used—has become an enduring symbol of the tragedy. So have unforgettable images of deserted houses engulfed by forest, lovable stray dogs in dire need of medical attention, and a day-nursery strewn with workbooks and playthings.

A Human Tragedy: Disaster, Response, Fallout

Chernobyl: History of a Tragedy (2019) is a masterful retelling of the episode and its aftermath. Author Serhii Plokhy, who leads the Harvard Ukrainian Research Institute, grew up 500 kilometers south of Chernobyl. He later discovered that his thyroid had been inflamed by radiation.

Plokhy offers deeply sympathetic portrayals of the plant’s managers and engineers, the first-responders who risked their lives to contain the damage, and the civilians in the affected areas of Ukraine and Belarus.

Drawing upon the victims’ first-hand accounts as well as official records made available only after Ukraine’s 2013–14 Euromaidan revolution, Plokhy meticulously reconstructs the making of the tragedy—from the plant’s hasty construction to the assembly of the “New Safe Confinement” structure completed in 2019.

The cleanup of the radioactive fallout could continue for decades. Robotic cranes will work in intense radiation and dismantle the internal structures and dispose of radioactive remnants from the reactors. The damage from the disaster may last for centuries—the half-life of the plutonium-239 isotope, one constituent of the explosion, is 24,000 years.

Design Flaws, Not Just Operator Errors

Plokhy shows how Chernobyl personified the Soviet system’s socio-economic failings. Chernobyl was a disaster waiting to happen—a perfect storm of design flaws and human error.

The Chernobyl nuclear plant was hailed as a jewel in the crown of the Soviet Union’s technological achievement and the linchpin of an ambitious nuclear power program. The RBMK (high-power channel-type reactor) was touted as more powerful and cheaper than other prevalent nuclear power plant designs.

Anatoliy Alexandrov, the principal designer of the RBMK reactor and head of the Soviet Academy of Sciences, reportedly claimed that the RBMK was reliable enough to be installed on Red Square. In their great hurry to commission the Chernobyl reactors, the communist czars skimped on protective containment structures.

Commissars at the Ministry of Medium Machine Building, the secretive agency in charge of the Soviet nuclear program, knew all too well of the fatal flaws in the design and the construction of the RBMK reactors. Viktor Briukhanov, the Chernobyl plant’s director, had complained, “God forbid that we suffer any serious mishap—I’m afraid that not only Ukraine but the Union as a whole would not be able to deal with such a disaster.”

Yet, the powers-that-were assumed that clever-enough reactor operators could make up for the design’s shortcomings. Little wonder, then, that the Soviets ultimately attributed the accident to “awkward and silly” mistakes by operators who failed to activate the emergency systems during the safety test.

The Failings of the Soviet System’s Internal Workings

Chernobyl: History of a Tragedy dwells on the Soviet leadership and on the ubiquitous disconnects and vast dysfunctions in the Soviet state’s affairs.

Chernobyl is a metaphor for the failing Soviet system and its reflexive secrecy, central decision-making, and disregard for candor. The KGB worked systematically to minimize news of the disaster’s impact. KGB operatives censored the news of the lethal radioactive dust (calling it “just a harmless steam discharge”), shepherded the tribunal hearings, and downplayed the political outcomes of the disaster.

Even the hundreds of thousands of Ukrainians, Belarusians, and Russians evacuated from the thirty-kilometer zone weren’t given full details of the tragedy for weeks. In the days following the accident, the Communist Party’s apparatus, well aware of the risks of radiation, did not curtail children’s participation in Kyiv’s May Day celebrations and parades.

Plokhy’s most insightful chapters discuss the historic political fallout of the disaster. Moscow downplayed the design flaws in the reactor and made scapegoats of a handful of the plant’s engineers and operators—just three men received 10-year prison sentences in 1987. One of the three, Deputy Chief Engineer Anatoly Dyatlov (whom I quoted above referring to Dante), was granted amnesty after only three years. He died five years later of heart failure caused by radiation sickness.

Chernobyl’s outstanding narrative feature is the interpretation of the disaster in the framework of the fate of the Soviet Union. Plokhy explains how Chernobyl was a decisive trigger in the unraveling of the Soviet Union. Chernobyl served as an unqualified catalyst for Gorbachev’s policy of glasnost (“openness”). It also fanned the flames of the nationalist movements in the soon-to-break-away republics of Ukraine, Belarus, and the Baltics.

Recommendation: Read This Captivating Account of a Great Human Tragedy

Serhii Plokhy’s Chernobyl: History of a Tragedy (2019) is a must-read record of human fallibility and hubris. It’s a poignant narrative of the courage and helplessness of the thousands of firefighters, police officers, doctors, nurses, military personnel, and the communities who risked their lives to mitigate the aftermath of the disaster, investigate it, and “liquidate” the site. On top of that, Chernobyl is an edifying thesis on how the disaster accelerated the decline and downfall of the Soviet Union.

Wondering what to read next?

  1. Tylenol Made a Hero of Johnson & Johnson: A Timeless Crisis Management Case Study
  2. Why Major Projects Fail: Summary of Bent Flyvbjerg’s Book ‘How Big Things Get Done’
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  5. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

Filed Under: Business Stories, Leadership Tagged With: Biases, Decision-Making, Governance, Leadership Lessons, Parables, Problem Solving, Risk

It’s Probably Not as Bad as You Think

May 5, 2020 By Nagesh Belludi

The 20-40-60 Rule, believed to have been written by humorist Will Rogers for his movie Life Begins at 40 (1935), states,

When you are 20, you care about what everybody thinks of you.
When you are 40, you don’t care about what people think of you,
and when you are 60, you actually understand that people were too busy thinking about themselves.

In essence, don’t agonize about what other people are thinking about you. They’re perhaps busy worrying over what you’re thinking about them.

The 20-40-60 Rule became popular when venture capitalist Heidi Roizen cited it (incorrectly attributing it to the actress Shirley MacLaine) at a 2014 lecture at Stanford. First Round Capital’s Review has noted,

People have enormous capacity to beat themselves up over the smallest foibles—saying the wrong thing in a meeting, introducing someone using the wrong name. Weeks can be lost, important relationships avoided, productivity wasted, all because we’re afraid others are judging us. “If you find this happening to you, remember, no one is thinking about you as hard as you are thinking about yourself. So don’t let it all worry you so much.”

Idea for Impact: Don’t Beat Yourself Up Over Your Mistakes

Chances are, people around you aren’t nearly as critical of you as you are of yourself. No one’s going to remember or care about your mistakes, and neither should you.

Wondering what to read next?

  1. Care Less for What Other People Think
  2. The More You Believe in Yourself, the Less You Need Others to Do It for You
  3. How To … Be More Confident in Your Choices
  4. Ever Wonder If The Other Side May Be Right?
  5. Could Limiting Social Media Reduce Your Anxiety About Work?

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Confidence, Conviction, Decision-Making, Getting Along, Philosophy, Resilience, Risk, Wisdom

A Superb Example of Crisis Leadership in Action

May 4, 2020 By Nagesh Belludi

It is in a crisis that leaders show their mettle. The New York Times notes,

The master class on how to respond [to a crisis] belongs to Jacinda Ardern, the 39-year-old prime minister of New Zealand. On March 21, when New Zealand still had only 52 confirmed cases, she told her fellow citizens what guidelines the government would follow in ramping up its response. Her message was clear: “These decisions will place the most significant restrictions on New Zealanders’ movements in modern history. But it is our best chance to slow the virus and to save lives.” And it was compassionate: “Please be strong, be kind and united against Covid-19.”

Our political leaders’ responses to the current COVID-19 crisis are particularly instructive about how leaders should act in a crisis:

  • Lead from the front. Initiate quick, bold, and responsible action, even when it carries political risk. Don’t be overcome by panic.
  • Think the crisis through. Weigh your options carefully, and then make the call confidently. Stay focused. Don’t let stress impede your problem-solving capabilities.
  • Avert an information vacuum. Any gap in the available information will be filled by guesswork and speculation.
  • Provide an accurate picture of what’s going on. Be transparent and honest right from the beginning. Acknowledging the gravity of the situation and being clear about how you’re going to collectively address the crisis leaves your constituencies with a sense of confidence in your message.
  • Choose your words carefully. Don’t create a false sense of security. Avoid making throwaway comments that might be misconstrued.
  • Communicate often. Fine-tune your message. Update your analysis and reaffirm your assurance of support. Keeping everyone in the loop defuses fears and uncertainties.
  • Empower employees to be part of the solution. Invite and respond to employees’ feedback and concerns. They’ll need to know they’re being heard.

Idea for Impact: When a crisis hits, constituencies fall back on their leaders for information, answers, confidence, and direction. Set the appropriate tone for the organizational response by being supportive, factual, transparent, open-minded, calm, and decisive.

Wondering what to read next?

  1. Leadership is Being Visible at Times of Crises
  2. How to … Declutter Your Organizational Ship
  3. Making Tough Decisions with Scant Data
  4. Tylenol Made a Hero of Johnson & Johnson: A Timeless Crisis Management Case Study
  5. Do Your Employees Feel Safe Enough to Tell You the Truth?

Filed Under: Effective Communication, Leadership Tagged With: Anxiety, Critical Thinking, Decision-Making, Leadership, Problem Solving, Risk, Winning on the Job

Five Where Only One is Needed: How Airbus Avoids Single Points of Failure

April 6, 2020 By Nagesh Belludi

In my case study of the Boeing 737 MAX aircraft’s anti-stall mechanism, I examined how relying on data from only one Angle-of-Attack (AoA) sensor caused two accidents and the aircraft’s consequent grounding.

A single point of failure is a system component that, upon failure, renders the entire system unavailable, dysfunctional, or unreliable. In other words, if many things rely on one component within your system and that component breaks, you are counting down to a catastrophe.

Case Study: How Airbus Builds Multiple Redundancies to Minimize Single Points of Failure

As the Boeing 737 MAX disaster has emphasized, single points of failure in products, services, and processes may spell disaster for organizations that have not adequately identified and mitigated these critical risks. Reducing single points of failure requires a thorough knowledge of the vital systems and processes that an organization relies on to be successful.

Since the dawn of flying, reliance on one sensor has been anathema.

The Airbus A380 aircraft, for example, features 100,000 different wires—that’s 470 km of cables weighing some 5,700 kg. Airbus’s wiring includes double or triple redundancy to mitigate the risk of single points of failure caused by defective wiring (e.g., corrosion, chafing of insulation, or loose contacts) or cut wires (e.g., from debris penetrating the aircraft structure, as in the case of an engine burst).

The Airbus fly-by-wire flight control system has quadruplex redundancy, i.e., it has five flight control computers where only one is needed to fly the aircraft. Consequently, an Airbus aircraft can afford to lose four of these computers and still be flyable. Of the five flight control computers, three are primary computers and two are secondary (backup) computers. The primary and the secondary flight control computers use different processors, are designed and supplied by different vendors, feature different chips from different manufacturers, and have different software systems developed by different teams using different programming languages. All this redundancy reduces the probability of common hardware and software errors that could lead to system failure.

Redundancy is Expensive but Indispensable

The multiple redundant flight control computers continuously keep track of each other’s output. If one computer produces deviant results for some reason, the flight control system as a whole excludes the results from that aberrant computer in determining the appropriate actions for the flight controls.
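To make the cross-checking idea concrete, here is a minimal sketch of my own (not Airbus’s software; the tolerance value and the channel readings are invented) of redundant channels being consolidated while an aberrant channel is excluded:

```python
# Illustrative sketch only: not Airbus's actual flight-control logic.
from statistics import median

TOLERANCE = 2.0  # hypothetical allowable deviation between channels

def consolidate(channel_outputs):
    """Return a consensus command, ignoring any channel that deviates
    too far from the median of all redundant channels."""
    consensus = median(channel_outputs)
    healthy = [x for x in channel_outputs if abs(x - consensus) <= TOLERANCE]
    if not healthy:
        raise RuntimeError("no agreeing channels; degrade gracefully")
    return sum(healthy) / len(healthy)

# Five redundant computers; the fourth has gone aberrant and is excluded.
print(round(consolidate([3.1, 3.0, 3.2, 9.7, 3.1]), 2))  # -> 3.1
```

The real system is vastly more involved (dissimilar hardware, multi-stage voting, reversion to degraded control laws), but the principle is the same: no single deviant channel gets to decide the outcome.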

By replicating critical sensors, computers, and actuators, Airbus provides for a “graceful degradation” state, where essential facilities remain available, allowing the pilot to fly and land the plane. If an Airbus loses all engine power, a ram air turbine can power the aircraft’s most critical systems, allowing the pilot to glide and land the plane (as happened with Air Transat Flight 236).

Idea for Impact: Build redundancy to prevent system failure from the breakdown of a single component

When you devise a highly reliable system, identify potential single points of failure, and investigate how these risks and failure modes can be mitigated.

For every component of a product or a service you work on, identify single points of failure by asking, “If this component fails, does the rest of the system still work, and, more importantly, does it still do the function it is supposed to do?”
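As a toy illustration of asking that question systematically, the sketch below (the component names and the dependency graph are made up, not drawn from any real system) removes each component in turn and checks whether the system’s critical function still works:

```python
from collections import deque

# A made-up component graph: each component lists what it feeds into.
system = {
    "sensor_a": ["computer_1", "computer_2"],
    "sensor_b": ["computer_1", "computer_2"],
    "computer_1": ["actuator"],
    "computer_2": ["actuator"],
    "actuator": [],
}
SOURCES = {"sensor_a", "sensor_b"}  # where inputs enter the system
SINK = "actuator"                   # the function the system must still perform

def still_works(failed):
    """True if the sink is still reachable from some source after `failed` breaks."""
    queue = deque(s for s in SOURCES if s != failed)
    seen = set(queue)
    while queue:
        node = queue.popleft()
        if node == SINK:
            return True
        for nxt in system.get(node, []):
            if nxt != failed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

single_points_of_failure = [c for c in system if not still_works(c)]
print(single_points_of_failure)  # -> ['actuator']; everything else is redundant
```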

Add redundancy to the system so that failure of any component does not mean failure of the entire system.

If you can’t build redundancy into a system due to some physical or operational complexity, establish frequent inspections and maintenance to keep the system reliable.

Postscript: In people-management, make sure that no one person has sole custody of some critical institutional knowledge, creativity, reputation, or experience that makes them indispensable to the organization’s business continuity and its future performance. Should they leave, the organization suffers the loss of that valued standing and expertise. See my article about this notion of key-person dependency risk, the threat posed by an organization’s or a team’s over-reliance on one or a few individuals.

Wondering what to read next?

  1. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  2. Defect Seeding: Strengthen Systems, Boost Confidence
  3. Steering the Course: Leadership’s Flight with the Instrument Scan Mental Model
  4. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  5. Why We’re So Bad At Defining Problems

Filed Under: Business Stories, Sharpening Your Skills Tagged With: Aviation, Critical Thinking, Decision-Making, Innovation, Mental Models, Problem Solving, Risk, Thought Process

The Boeing 737 MAX’s Achilles Heel

January 7, 2020 By Nagesh Belludi

Two thousand nineteen was one of the most turbulent years in Boeing’s history. Its 737 MACS (pardon the pun) troubles went from bad to worse to staggering when aviation regulators around the world grounded the aircraft and a steady trickle of disclosures increasingly exposed software problems and corners being cut.

The flaw in this aircraft, an anti-stall mechanism that relied on data from a single sensor, offers a particularly instructive case study of the notion of a single point of failure.

One Fault Could Cause an Entire System to Stop Operating

A single point of failure of a system is an element whose failure can result in the failure of the entire system. (A system may have multiple single points of failure.)

Single points of failure are eliminated by adding redundancy—by doubling the critical components or simply backing them up, so that failure of any such element does not initiate a failure of the entire system.

Boeing Mischaracterized Its Anti-Stall System as Less-than-Catastrophic in Its Safety Analysis

The two 737 MAX crashes (with Lion Air and Ethiopian Airlines) originated from a late change that Boeing made to a trim system called the Maneuvering Characteristics Augmentation System (MCAS).

Without pilot input, MCAS could automatically nudge the aircraft’s nose downward if it detected that the aircraft was pointing up at a dangerous angle, for instance, at high thrust during take-off.

Reliance on One Sensor Is Anathema in Aviation

The MCAS had previously been “approved” by the Federal Aviation Administration (FAA). Nevertheless, Boeing made some design changes after the FAA approval without checking with the FAA again. The late changes were made to improve MCAS’s response during low-speed aerodynamic stalls.

The MCAS system relied on data from just one Angle-of-Attack (AoA) sensor. With no backup, if this single sensor were to malfunction, erroneous input from that sensor would trigger a corrective nosedive just after take-off. This catastrophe is precisely what happened during the two aircraft crashes.

The AoA sensor thus became a single point of failure. Despite the existence of two angle-of-attack sensors on the nose of the aircraft, MCAS used data from only one of the sensors at a time and did not require concurrence between the two sensors before inferring that the aircraft was stalling. Further, Lion Air had not paid extra to equip its aircraft with a warning light that could have alerted the crew to a disagreement between the AoA sensors.

Boeing Missed Safety Risks in the Design of the MAX’s Flight-Control System

Reliance on one sensor’s data is an egregious violation of a long-standing engineering principle about eliminating single points of failure. Some aircraft use three redundant systems for flight control: if two of the systems agree and the third does not, the flight control software ignores the odd one out.
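A bare-bones sketch of that two-out-of-three logic (my own illustration with invented readings and tolerance, not any manufacturer’s code) might look like this:

```python
def vote_two_out_of_three(a, b, c, tol=1.0):
    """If two readings agree within `tol` and the third does not,
    ignore the odd one out; otherwise fall back to a plain average."""
    readings = [a, b, c]
    for i in range(3):
        others = [r for j, r in enumerate(readings) if j != i]
        if abs(others[0] - others[1]) <= tol and abs(readings[i] - others[0]) > tol:
            return sum(others) / 2  # the i-th reading is the outlier; ignore it
    return sum(readings) / 3  # all agree (or no clear majority; a real system would flag a fault)

print(round(vote_two_out_of_three(5.1, 5.0, 12.4), 2))  # -> 5.05; the deviant reading is ignored
```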

As if the dependence on one sensor were not enough, Boeing, blinded by time and price pressure to stay competitive with its European rival Airbus, intentionally chose to do away with any reference to MCAS in the pilot manuals to spare its airline customers additional pilot training. Indeed, Boeing did not even disclose the existence of the MCAS on the aircraft.

Boeing allows pilots to switch the trim system off to override the automated anti-stall system, but the pilots of the ill-fated Lion Air and Ethiopian Airlines flights failed to do so.

Idea for Impact: Redundancy is the Sine Qua Non of Reliable Systems

In preparation for airworthiness recertification for the 737 MAX, Boeing has corrected the MCAS blunder by having its trim software compare inputs from two AoA sensors, alerting the pilots if the sensors’ readings disagree, and limiting MCAS’s authority.
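Here is a hypothetical sketch of that corrected behavior; the thresholds, function name, and return convention are my own inventions for illustration, not Boeing’s actual software:

```python
# Hypothetical illustration of a two-sensor cross-check, not Boeing's code.
DISAGREE_THRESHOLD_DEG = 5.5    # assumed disagreement limit, for illustration
STALL_AOA_THRESHOLD_DEG = 14.0  # assumed activation threshold, for illustration

def mcas_command(aoa_left, aoa_right):
    """Return (activate_nose_down_trim, aoa_disagree_alert)."""
    if abs(aoa_left - aoa_right) > DISAGREE_THRESHOLD_DEG:
        return False, True   # sensors disagree: inhibit automatic trim, alert the crew
    aoa = (aoa_left + aoa_right) / 2
    return aoa > STALL_AOA_THRESHOLD_DEG, False

print(mcas_command(22.0, 2.5))   # -> (False, True): one faulty sensor no longer triggers a dive
print(mcas_command(16.0, 15.2))  # -> (True, False): both sensors agree the angle is too high
```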

One key takeaway from the MCAS disaster is this: when you devise a highly reliable system, identify all single points of failure, and investigate how these risks and failure modes can be mitigated. Examine if every component of a product or a service you work on is a single point of failure by asking, “If this component fails, does the rest of the system still work, and, more importantly, does it still do the function it is supposed to do?”

Wondering what to read next?

  1. Availability Heuristic: Our Preference for the Familiar
  2. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  3. Be Smart by Not Being Stupid
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Problem Solving, Risk, Thinking Tools

How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

October 1, 2019 By Nagesh Belludi

As I’ve examined previously, airline disasters are particularly instructive on the subjects of cognitive impairment and decision-making under stress.

Consider the case of TransAsia Airways Flight 235 that crashed in 2015 soon after takeoff from an airport in Taipei, Taiwan. Accident investigations revealed that the pilots of the ATR 72-600 turboprop erroneously switched off the plane’s working engine after the other lost power. Here’s a rundown of what happened:

  1. About one minute after takeoff, at 1,300 feet, engine #2 had an uncommanded autofeather failure. This is a routine engine failure—the aircraft is designed to be flown on one engine.
  2. The Pilot Flying misdiagnosed the problem, and assumed that the still-functional engine #1 had failed. He retarded power on engine #1 and it promptly shut down.
  3. With power lost on both engines, the pilots did not react to the stall warnings in a timely and effective manner. The Pilot Flying acknowledged his error, “wow, pulled back the wrong side throttle.”
  4. The aircraft continued its descent. The pilots rushed to restart engine #1, but the remaining altitude was not sufficient to recover the aircraft.
  5. In a state of panic, the Pilot Flying clasped the flight controls and steered the aircraft perilously to avoid apartment blocks and commercial buildings before clipping a bridge and crashing into a river.

A High Level of Stress Can Diminish Your Problem-solving Capabilities

Thrown into disarray after a routine engine failure, the pilots of TransAsia flight 235 did not perform their airline’s abnormal and emergency procedures to identify the failure and implement the required corrective actions. Their ineffective coordination, communication, and error management compromised the safety of the flight.

The combination of sudden threat and extreme time pressure to avert a danger fosters a state of panic, in which decision-makers are inclined to commit themselves impulsively to courses of action that they will soon come to regret.

Idea for Impact: To combat cognitive impairment under stress, use checklists and standard operating procedures, as well as increased training on situational awareness, crisis communication, and emergency management.

Wondering what to read next?

  1. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  2. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  3. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  4. “Fly the Aircraft First”
  5. Jeju Air Flight 2216—The Alleged Failure to Think Clearly Under Fire

Filed Under: Business Stories, Leadership, Sharpening Your Skills Tagged With: Anxiety, Aviation, Biases, Decision-Making, Emotions, Mental Models, Mindfulness, Problem Solving, Risk, Stress, Thought Process, Worry

Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

September 3, 2019 By Nagesh Belludi

In the context of decision-making and risk-taking, the “overconfidence effect” is a judgmental bias that can affect your subjective estimate of the likelihood of future events. This can cause you to misjudge the odds of positive/desirable events as well as negative/undesirable events.

As the following Zen story illustrates, experience breeds complacency. When confidence gives way to overconfidence, it can transform from a strength to a liability.

A master gardener, famous for his skill in climbing and pruning the highest trees, examined his disciple by letting him climb a very high tree. Many people had come to watch. The master gardener stood quietly, carefully following every move but not interfering with one word.

Having pruned the top, the disciple climbed down and was only about ten feet from the ground when the master suddenly yelled: “Take care, take care!”

When the disciple was safely down, an old man asked the master gardener: “You did not let out one word when he was aloft in the most dangerous place. Why did you caution him when he was nearly down? Even if he had slipped then, he could not have greatly hurt himself.”

“But isn’t it obvious?” replied the master gardener. “Right up at the top he is conscious of the danger, and of himself takes care. But near the end when one begins to feel safe, this is when accidents occur.”

Reference: Irmgard Schlögl’s The Wisdom of the Zen Masters (1976). Dr. Schlögl (1921–2007) became Ven. Myokyo-ni in 1984 and served as a Rinzai Zen Buddhist nun, heading the Zen Centre in London.

Wondering what to read next?

  1. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  2. Be Smart by Not Being Stupid
  3. Smart Folks are Most Susceptible to Overanalyzing and Overthinking
  4. Increase Paranoia When Things Are Going Well
  5. How to … Escape the Overthinking Trap

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Confidence, Critical Thinking, Decision-Making, Mindfulness, Parables, Risk, Thinking Tools, Thought Process, Wisdom

Beware of Key-Person Dependency Risk

September 7, 2018 By Nagesh Belludi

Key-Person Dependency Risk is the threat posed by an organization’s or a team’s over-reliance on one or a few individuals.

The key person has sole custody of some critical institutional knowledge, creativity, reputation, or experience that makes them indispensable to the organization’s business continuity and its future performance. Should they leave, the organization suffers the loss of that valued standing and expertise.

Small businesses and start-ups are especially exposed to key-person dependency risk. Tesla, for example, faces a colossal key-man risk—its fate is linked closely to the actions of founder-CEO Elon Musk, who has come under scrutiny lately.

Much of Berkshire Hathaway’s performance over the decades has been based on CEO Warren Buffett’s reputation and his ability to wring remarkable deals from companies in distress. There’s a great deal of prestige in selling one’s business to Buffett. He is irreplaceable; given his remarkable long-term record of accomplishment, it is important that much of what he has built over the years remains intact once he is gone. Buffett has built a strong culture that is likely to endure.

Key Employees are Not Only Assets, but also Large Contingent Liabilities

The most famous “key man” of all time was Apple’s Steve Jobs. Not only was he closely linked to his company’s identity, but he also played a singular role in building Apple into the global consumer-technology powerhouse that it is. Jobs had steered Apple’s culture in a desired direction and groomed his handpicked management team to sustain Apple’s inventive culture after he was gone. Tim Cook, the operations genius who became Apple’s CEO after Jobs died in 2011, has led the company to new heights.

The basic solution to key-person dependency risk is to identify and document critical knowledge of the organization. (Capturing tacit knowledge is not easy when it resides “in the key-person’s head.”) Organizations must also focus on cross-training and succession planning to identify and enable others to develop and perform the same tasks as the key-person.

Idea for Impact: No employee should be indispensable. A well-managed company is never dependent upon the performance of one or a few individuals. As well, no employee should be allowed to hoard knowledge, relationships, or resources to achieve job security.

Wondering what to read next?

  1. What Virgin’s Richard Branson Teaches: The Entrepreneur as Savior, Stuntman, Spectacle
  2. Creativity by Imitation: How to Steal Others’ Ideas and Innovate
  3. Risk Homeostasis and Peltzman Effect: Why Risk Mitigation and Safety Measures Become Ineffective
  4. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’
  5. Innovation Without Borders: Shatter the ‘Not Invented Here’ Mindset

Filed Under: Business Stories, Managing People, MBA in a Nutshell, Mental Models Tagged With: Biases, Career Planning, Entrepreneurs, Human Resources, Icons, Leadership Lessons, Mental Models, Personality, Risk, Role Models

Risk Homeostasis and Peltzman Effect: Why Risk Mitigation and Safety Measures Become Ineffective

May 17, 2018 By Nagesh Belludi

Risk Homeostasis and the Peltzman Effect are two concepts relating to how humans react to risk.

Risk Homeostasis is the notion that our personal psychological frameworks include a target level of risk toward which we direct our efforts.

We measure risk on our own “risk thermostat.” Because the risk in our environment changes continuously, we are incessantly forced away from our target risk level, but revert toward it by counteracting those external influences.

If the perceived risk of a situation exceeds our target level, we undertake defensive actions to reduce the risk. And if the perceived risk is lower than our target level, we attempt to increase our risk back to our target level by exposing ourselves to dangerous actions.

Consequently, people take more risks when they’re forced to act more carefully. For instance, requiring motorcyclists to wear helmets may lead them to take more risks—to maintain their level of thrill, not because they want to get into accidents.

The Peltzman Effect is the notion that people respond to increased safety by taking on new risks. Its namesake, economist Sam Peltzman, argued in 1975 that when automobile safety rules were introduced, at least some of the benefits of the new rules were counterbalanced by changes in the behavior of drivers. Peltzman posited that making seatbelts mandatory for cars reduced the number of occupant fatalities but increased pedestrian casualties and collision-related property damage.

Peltzman made a case that even though seatbelts reduced the risk of being severely injured in an accident, drivers compensated by driving aggressively and carelessly—driving closer to the car ahead of them, for instance—so as to save time or maintain their level of thrill, even at the risk of causing damage beyond themselves and their cars.

Risk Homeostasis and the Peltzman Effect remain controversial theories. Despite their apparent relevance, the prevailing evidence is inadequate and inconclusive as to whether people indeed behave less cautiously when they feel more protected, and vice versa.

Further, Risk Homeostasis and the Peltzman Effect challenge the foundations of safety and injury-prevention policies. They assert that the only effective safety measures are those that alter individuals’ desired risk level. Anything that merely modifies the environment or regulates individuals’ behavior without affecting their target risk levels is useless.

Wondering what to read next?

  1. Knowing When to Give Up: Establish ‘Kill Criteria’
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. Question the Now, Imagine the Next
  4. Hofstadter’s Law: Why Everything Takes Longer Than Anticipated
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Decision-Making, Discipline, Mental Models, Personality, Risk, Thought Process

