
Right Attitudes

Ideas for Impact


We Live in a Lookist Society

July 2, 2020 By Nagesh Belludi

From Irmgard Schlögl’s The Wisdom of the Zen Masters (1976):

Wealthy donors invited Master Ikkyū to a banquet.

The Master arrived there dressed in beggar’s robes. His host, not recognizing him in this garb, hustled him away: “We cannot have you here at the doorstep. We are expecting the famous Master Ikkyū any moment.”

The Master went home, there changed into his ceremonial robe of purple brocade, and again presented himself at his host’s doorstep.

He was received with due respect, and ushered into the banquet room. There, he put his stiff robe on the cushion, saying, “I expect you invited the robe since you showed me away a little while ago,” and left.

That what you wear affects how others perceive you is well known empirically and has been established in the scientific literature. People dressed conservatively, for example, are seen as more composed and trustworthy, whereas those dressed boldly and suavely are viewed as more attractive and self-assured. Women who wear menswear-inspired dress suits are more likely to be perceived well in job interviews. Men are shown to misperceive women’s friendliness as sexual intent, particularly when the women are dressed suggestively.

In the Second Quarto (1604) of Hamlet, Shakespeare, in the voice of Polonius, declares, “For the apparrell oft proclaimes the man.” Mark Twain reportedly quipped, “Clothes make the man. Naked people have little or no influence on society.”

Several maxims capture the notion that an individual’s clothing affirms his or her personal, professional, and social identity:

  • In Egypt: “لبس البوصة، تبقى عروسة” or “Dressing up a stick turns it into a doll”
  • In China: “我们在外面判断这件衣服, 在家里我们判断这个人” or “Abroad we judge the dress; at home we judge the man”
  • In Japan: “馬子にも衣装” or “Even a packhorse driver would look great in fine clothes”
  • In Korea: “옷이 날개다” or “Clothes are wings”

Idea for Impact: We live in a lookist society. Always dress the part. Ignore this at your own peril.

Wondering what to read next?

  1. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  2. How to Turn Your Fears into Fuel
  3. Increase Paranoia When Things Are Going Well
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. The Deceptive Power of False Authority: A Case Study of Linus Pauling’s Vitamin C Promotion

Filed Under: Sharpening Your Skills Tagged With: Biases, Confidence, Etiquette, Getting Ahead, Mindfulness, Parables, Workplace

Pulling Off the Impossible Under Immense Pressure: Leadership Lessons from Captain Sully

May 25, 2020 By Nagesh Belludi

I recently watched Sully (2016), the overrated Clint Eastwood-directed drama about the US Airways Flight 1549 incident, aka the “Miracle on the Hudson.”

In summary, on 15-Jan-2009, Captain Chesley “Sully” Sullenberger (played by Tom Hanks) heroically dead-sticked his Airbus A320 aircraft into New York City’s Hudson River after both of the aircraft’s engines failed from a bird strike. He then helped get passengers and crew off uninjured.

Sully centers on Sullenberger’s post-decision dissonance. To spin the real-life six-minute flight and the swift 24-minute rescue into a 96-minute Hollywood extravaganza, the filmmakers devised an antagonist in the form of National Transportation Safety Board (NTSB) investigators who try hard to blame Sullenberger for the mishap.

Overdramatized Portrayal of the NTSB Investigators

On the screen, the smirking NTSB investigators use flight simulators and computer analysis to second-guess Sullenberger’s lightning-quick decisions. They would rather he had made it to a nearby airport—a possibility he had instantly judged not viable, given his 40 years of flying experience.

Contrary to Sully‘s portrayal, the NTSB was unequivocal that landing the aircraft on the Hudson was the right call. In his memoir, Highest Duty: My Search for What Really Matters (2009), Sullenberger mentions that he was “buoyed by the fact that investigators determined that [first officer] Jeff and I made appropriate choices at every step.”

In the course of the real-life 18-month investigation of Flight 1549, the NTSB did investigate the odds of landing the aircraft in a nearby airport. Exploring all possible flaws that contribute to a crash is part of the NTSB’s charter. The NTSB, like other accident-investigation agencies, concerns itself principally with preventing future accidents. It rarely seeks to assign blame, nor does it make the pilots justify their actions.

The Complex Leadership Requirements of Flying

Apart from the sensationalized portrayal of the NTSB investigators, Sully misses the opportunity to call attention to the complex leadership requirements of aviation. Flying a civil aircraft is characterized by a high level of standardization and automation, while still placing a strong emphasis on formal qualification and experience.

Today, highly trained pilots have to work with ever more complicated and autonomous technology. This routinization must be weighed against deliberate action. On Flight 1549, the A320’s much-studied fly-by-wire system allowed the pilots to concentrate on trying to restart the engines, starting the auxiliary power unit (APU), and setting a flight path toward the Hudson. Airbus’s legendary computer controls will not let pilots override the computer-imposed limits even in an urgent situation. Sullenberger and others have commented that less restrictive human-machine interaction might have allowed him a more favorable landing flare and helped temper the aircraft’s impact with the water.

Aircrews now consist of ad hoc teams working together typically only for a few flights. They build their team quickly and rely on the crew’s collective knowledge and experience to round out the high levels of standardization.

Due to the complex demands for leadership in aircrews, specialized training programs such as Crew Resource Management (CRM) are in place to improve crew communication, situational awareness, and impromptu decision-making. These systems were established to help crews when technical failures and unexpected events disrupt highly proceduralized normal operations.

Furthermore, individual and organizational learning from accidents was institutionalized through mandatory reporting of incidents—not only within the airline involved but also across the aviation community.

Leadership Lessons on Acting Under Immense Pressure: The Context of Success

Owing to intuition, experience, and quick coordination, Sullenberger was able to “land” the aircraft on the Hudson within four minutes following the bird strike and have his passengers and crew quickly evacuated onto the aircraft’s wings and onto rafts.

The rapid and highly complex coordination required for this extraordinary achievement was only achievable because of exceptional leadership, exemplary decision-making under stress, and the technical skills of both the cockpit and cabin crews.

The pilots were highly experienced—Sullenberger even had experience as a glider pilot. Further contextual factors—the calm weather on that afternoon and the proximity of NY Waterway ferries—helped bring this accident to a good end. All this facilitated the almost immediate rescue of passengers and crew from the rapidly sinking aircraft and the frigid water.

On another note, Sullenberger’s memoir, Highest Duty (2009), is passable. The most interesting part of the book is the last fourth, where he discusses Flight 1549 and what went through his mind. Interestingly, Sullenberger writes that even after he realized that the plane was in one piece after hitting the water, he worried about the difficulties that still lay ahead. The aircraft was sinking: everyone had to be evacuated quickly. The passengers could survive only for a few minutes in the frigid waters of the Hudson.

Wondering what to read next?

  1. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  2. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  3. “Fly the Aircraft First”
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

Filed Under: Leadership, Project Management, Sharpening Your Skills Tagged With: Assertiveness, Aviation, Biases, Conflict, Decision-Making, Mindfulness, Problem Solving, Stress, Teams

The Biggest Disaster and Its Aftermath // Book Summary of Serhii Plokhy’s ‘Chernobyl: History of a Tragedy’

May 11, 2020 By Nagesh Belludi

I visited the Chernobyl Exclusion Zone last year. This 2,600 sq km (1,000 sq mi) region spanning Ukraine and Belarus is the ghastly site of the greatest peacetime nuclear disaster in history. Yes, it’s safe enough to visit—with precautions, of course. [Read travel writer Cameron Hewitt’s worthwhile trip-report.]

Chernobyl is a gripping testimony to the perils of hubris and a poignant monument to the untold misery it imposed upon swathes of people.

To round out my learning from the trip, I recently read Chernobyl: History of a Tragedy (2019), Harvard historian Serhii Plokhy’s haunting account of the nuclear disaster.

An Accident That Was Waiting to Happen

At 1:23 A.M. on 26-April-1986, an experimental safety test at Unit 4 of the Vladimir Ilyich Lenin Nuclear Power Plant complex in Chernobyl went dreadfully wrong. The test instigated a power surge. The reactor exploded and burst open, spewing a plume of radioactive elements into the atmosphere.

The discharge amounted to some 400 times more radioactive material than from the Hiroshima atomic bomb. Deputy Chief Engineer Anatoly Dyatlov, who was in charge of the calamitous test, called the ensuing meltdown “a picture worthy of the pen of the great Dante.” Sixty percent of the radioactive fallout came to settle in Belarus. Winds carried radioactive elements all the way to Scandinavia.

Right away, hundreds of firefighters and security forces consigned themselves to stabilize the reactor and stop the fires from spreading to the other reactors. In so doing, they exposed themselves to fatal doses of radiation, spending the rest of their lives grappling with serious health problems.

The world first learned of the accident when abnormal radiation levels were detected at one of Sweden’s nuclear facilities some 52 hours after the accident. It took the Soviet regime three days to acknowledge the meltdown publicly: “There has been an accident at the Chernobyl atomic-electricity station.” Soviet leader Mikhail Gorbachev addressed the nation 18 days after the accident: “The first time we have encountered in reality such a sinister force of nuclear energy that has escaped control.”

A Soviet Dream Town Then, a Graveyard of Dreams Now

The ghost town of Pripyat, a purpose-built workers’ settlement a mile from the nuclear plant, seized my mind’s eye. It was one of the Soviet Union’s most desirable communities, and 50,000 people lived there when the accident happened. Today, it’s a post-apocalyptic time warp—full of all kinds of dilapidated civic structures that once showcased the ideal Soviet lifestyle.

Pripyat was evacuated entirely on the afternoon of the disaster. Left to rot, the town has been completely overtaken by nature. A Ferris wheel—completed two weeks before the explosion, but never used—has become an enduring symbol of the calamity. So have unforgettable images of deserted houses engulfed by forest, loveable stray dogs in dire need of medical attention, and a day-nursery strewn with workbooks and playthings.

A Human Tragedy: Disaster, Response, Fallout

Chernobyl: History of a Tragedy (2019) is a masterful retelling of the episode and its aftermath. Author Serhii Plokhy, who leads the Harvard Ukrainian Research Institute, grew up 500 kilometers south of Chernobyl. He later discovered that his thyroid had been inflamed by radiation.

Plokhy offers deeply sympathetic portrayals of the plant’s managers and engineers, the first-responders who risked their lives to contain the damage, and the civilians in the affected areas of Ukraine and Belarus.

Drawing upon the victims’ first-hand accounts as well as official records made available only after Ukraine’s 2013–14 Euromaidan revolution, Plokhy meticulously reconstructs the making of the tragedy—from the plant’s hasty construction to the assembly of the “New Safe Containment” structure installed in 2019.

The cleanup of the radioactive fallout could continue for decades. Robotic cranes will work in intense radiation and dismantle the internal structures and dispose of radioactive remnants from the reactors. The damage from the disaster may last for centuries—the half-life of the plutonium-239 isotope, one constituent of the explosion, is 24,000 years.

Design Flaws, Not Just Operator Error

Plokhy shows how Chernobyl personified the Soviet system’s socio-economic failings. Chernobyl was a disaster waiting to happen—a perfect storm of design flaws and human error.

The Chernobyl nuclear plant was hailed as a jewel in the crown of the Soviet Union’s technological achievement and the lynchpin of an ambitious nuclear power program. The RBMK (high power channel-type reactor) was flaunted as more powerful and cheaper than other prevalent nuclear power plant designs.

Anatoliy Alexandrov, the principal designer of the RBMK reactor and head of the Soviet Academy of Sciences, reportedly claimed that the RBMK was reliable enough to be installed on Red Square. The communist czars, in a great hurry to commission the Chernobyl reactors, skimped on protective containment structures.

Commissars at the Ministry of Medium Machine Building, the secretive agency in charge of the Soviet nuclear program, knew all too well of the fatal flaws in the design and the construction of the RBMK reactors. Viktor Briukhanov, the Chernobyl plant’s director, had complained, “God forbid that we suffer any serious mishap—I’m afraid that not only Ukraine but the Union as a whole would not be able to deal with such a disaster.”

Yet, the powers-that-were assumed that clever-enough reactor operators could make up for the design’s shortcomings. Little wonder, then, that the Soviets ultimately attributed the accident to “awkward and silly” mistakes by operators who failed to activate the emergency systems during the safety test.

The Failings of the Soviet System’s Internal Workings

Chernobyl: History of a Tragedy dwells on Soviet leadership and the ubiquitous disconnects and the vast dysfunctions in the Soviet state’s affairs.

Chernobyl is a metaphor for the failing Soviet system and its reflexive secrecy, central decision-making, and disregard for candor. The KGB worked systematically to minimize news of the disaster’s impact. KGB operatives censored news of the lethal radioactive dust (calling it “just a harmless steam discharge”), shepherded the tribunal hearings, and downplayed the political consequences of the disaster.

Even the hundreds of thousands of Ukrainians, Belarusians, and Russians evacuated from the thirty-kilometer zone weren’t given full details of the tragedy for weeks. In the days following the accident, the Communist Party’s apparatus, well aware of the risks of radiation, did not curtail children’s participation in Kyiv’s May Day celebrations and parades.

Author Plokhy’s most insightful chapters discuss the historic political fallout of the disaster. Moscow downplayed the design flaws in the reactor and made scapegoats of a handful of the plant’s engineers and operators—just three men received 10-year prison sentences in 1987. One of the three, Deputy Chief Engineer Anatoly Dyatlov (whom I quoted above referring to Dante), was granted amnesty after only three years. He died five years later of heart failure caused by radiation sickness.

Chernobyl’s outstanding narrative feature is the interpretation of the disaster in the framework of the fate of the Soviet Union. Plokhy explains how Chernobyl was a decisive trigger to the unraveling of the Soviet Union. Chernobyl served as an unqualified catalyst for Gorbachev’s policy of glasnost (“openness”). It also fanned the flames of the nationalist movements in the soon-to-break-away republics of Ukraine, Belarus, and the Baltics.

Recommendation: Read This Captivating Account of a Great Human Tragedy

Serhii Plokhy’s Chernobyl: History of a Tragedy (2019) is a must-read record of human fallibility and hubris. It’s a poignant narrative of the courage and helplessness of the thousands of firefighters, police officers, doctors, nurses, military personnel, and the communities who risked their lives to mitigate the aftermath of the disaster, investigate it, and “liquidate” the site. On top of that, Chernobyl is an edifying thesis on how the disaster accelerated the decline and downfall of the Soviet Union.

Wondering what to read next?

  1. Tylenol Made a Hero of Johnson & Johnson: A Timeless Crisis Management Case Study
  2. Why Major Projects Fail: Summary of Bent Flyvbjerg’s Book ‘How Big Things Get Done’
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  5. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

Filed Under: Business Stories, Leadership Tagged With: Biases, Decision-Making, Governance, Leadership Lessons, Parables, Problem Solving, Risk

When One Person is More Interested in a Relationship

May 9, 2020 By Nagesh Belludi

The American sociologist Willard Waller coined the term “Principle of Least Interest” to describe how differences of commitment in a relationship can have a major effect on the relationship’s dynamics.

In The Family: A Dynamic Interpretation (1938), Waller noted that, in any relationship (romantic, familial, business, buyer-seller, and so on) where one partner is far more emotionally invested than the other, the less-involved partner has more power in the relationship. In a one-sided romantic relationship, for example, the partner who loves less has more power.

Moreover, appearing indifferent or uninterested is a common way by which people try to raise their own standing in a relationship. Recall the well-known “walk away” negotiation tactic—tell a used car salesman, “this just isn’t the deal that I’m looking for,” and he may call you the next day with a better offer.

An imbalanced relationship can only last for a while.

A nourishing relationship shouldn’t involve a constant struggle for power.

Idea for Impact: Watch out for relationships where the other seems to care less about the relationship than you do. Such relationships can drain you dry.

Wondering what to read next?

  1. The High Cost of Winning a Small Argument
  2. The Likeability Factor: Whose “Do Not Pair” List Includes You?
  3. How to … Deal with Less Intelligent People
  4. Managerial Lessons from the Show Business: Summary of Leadership from the Director’s Chair
  5. Why Your Partner May Be Lying

Filed Under: Managing People, Mental Models Tagged With: Biases, Conflict, Getting Along, Likeability, Mindfulness, Negotiation, Persuasion, Relationships

Question Success More Than Failure

March 5, 2020 By Nagesh Belludi

Katrina “Kat” Cole, formerly CEO of the American baked-goods chain Cinnabon, in an interview for Adam Bryant’s “Corner Office” column in the New York Times:

I’ve learned to question success a lot more than failure. I’ll ask more questions when sales are up than I do when they’re down. I ask more questions when things seem to be moving smoothly, because I’m thinking: “There’s got to be something I don’t know. There’s always something.” This approach means that people don’t feel beat up for failing, but they should feel very concerned if they don’t understand why they’re successful. I made mistakes over the years that taught me to ask those questions.

People tend to attribute failure to external factors and success to their own abilities and performance (see the self-serving bias and the Dunning-Kruger effect). The human brain is indeed riddled with cognitive and memory biases that make people feel good and capable, regardless of reality.

Idea for Impact: Luck is so much more important than we acknowledge. Most successes and failures in life combine both skill and luck. Understanding the relative contributions of skill and luck in failure—and success, as Cole suggests above—can help you judge past and present results and, more significantly, prepare for future results.

Wondering what to read next?

  1. How to Avoid Magical Thinking
  2. Admit When You Don’t Have All the Answers
  3. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  4. In Praise of Inner Voices: A Powerful Tool for Smarter Decisions
  5. Gambler’s Fallacy is the Failure to Realize How Randomness Rules Our World

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Biases, Critical Thinking, Humility, Introspection, Luck, Mindfulness, Questioning, Thinking Tools, Wisdom

The Boeing 737 MAX’s Achilles Heel

January 7, 2020 By Nagesh Belludi

Two thousand nineteen was one of the most turbulent years in Boeing’s history. Its 737 MACS (pardon the pun) troubles went from bad to worse to staggering when aviation regulators around the world grounded the aircraft and a steady trickle of disclosures increasingly exposed software problems and corners being cut.

The flaw in this aircraft, its anti-stall mechanism that relied on data from a single sensor, offers a particularly instructive case study of the notion of single point of failure.

One Fault Could Cause an Entire System to Stop Operating

A single point of failure of a system is an element whose failure can result in the failure of the entire system. (A system may have multiple single points of failure.)

Single points of failure are eliminated by adding redundancy—duplicating the critical components or simply backing them up—so that the failure of any one element does not bring down the entire system.
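To make the idea concrete, here is a minimal Python sketch of removing a single point of failure by backing up a critical component. The sensor callables and the `SensorError` type are hypothetical, for illustration only:

```python
class SensorError(Exception):
    """Raised when a sensor cannot produce a valid reading."""

def read_with_backup(primary, backup):
    """Read from `primary`; fall back to `backup` if it fails.

    With only `primary`, one fault stops the whole system -- a
    single point of failure. The backup removes it: the system
    now tolerates any one sensor failing.
    """
    try:
        return primary()
    except SensorError:
        return backup()
```

A real system would also monitor the backup and flag the degraded state, so that running on the backup doesn’t go unnoticed.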

Boeing Mischaracterized Its Anti-Stall System as Less-than-Catastrophic in Its Safety Analysis

The two 737 MAX crashes (with Lion Air and Ethiopian Airlines) originated in a late change Boeing made to a trim system called the Maneuvering Characteristics Augmentation System (MCAS).

Without pilot input, the MCAS automatically nudges the aircraft’s nose downward if it detects that the aircraft is pitched up at a dangerous angle, for instance, at high thrust during takeoff.

Reliance on One Sensor is Anathema in Aviation

The MCAS had previously been “approved” by the Federal Aviation Administration (FAA). Nevertheless, Boeing made some design changes after the FAA approval without checking with the FAA again. The late changes were made to improve MCAS’s response during low-speed aerodynamic stalls.

The MCAS system relied on data from just one Angle-of-Attack (AoA) sensor. With no backup, if this single sensor were to malfunction, erroneous input from that sensor would trigger a corrective nosedive just after take-off. This catastrophe is precisely what happened during the two aircraft crashes.

The AoA sensor thus became a single point of failure. Although the aircraft has two angle-of-attack sensors on its nose, the MCAS took data from only one of them and did not require agreement between the two before inferring that the aircraft was stalling. Further, Lion Air had not paid to equip its aircraft with a warning light that could have alerted the crew to a disagreement between the AoA sensors.

Boeing Missed Safety Risks in the Design of the MAX’s Flight-Control System

Reliance on one sensor’s data is an egregious violation of a long-standing engineering principle about eliminating single points of failure. Some aircraft use three redundant systems for flight control: if two systems agree and the third does not, the flight-control software ignores the odd one out.
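The two-out-of-three voting just described can be sketched in a few lines of Python. This is a toy illustration of the voting idea, not actual flight-control code; the tolerance value is an arbitrary assumption:

```python
def vote_two_of_three(a, b, c, tolerance=1.0):
    """Return a reading confirmed by at least two of three
    redundant sensors, ignoring the odd one out.

    Returns None when no two readings agree within `tolerance`,
    signaling that the system cannot trust any single value.
    """
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2  # average the two agreeing readings
    return None
```

Note the design choice: when all three sensors disagree, the function refuses to pick a value at all, forcing the caller to handle the fault explicitly rather than act on bad data.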

If the dependence on one sensor was not enough, Boeing, blinded by time and price pressure to stay competitive with its European rival Airbus, intentionally chose to do away with any reference to MCAS in pilot manuals to spare its airline customers pilot training. Indeed, Boeing did not even disclose the existence of the MCAS on the aircraft.

Boeing allows pilots to switch the trim system off to override the automated anti-stall system, but the pilots of the ill-fated Lion Air and Ethiopian Airlines flights failed to do so.

Idea for Impact: Redundancy is the Sine Qua Non of Reliable Systems

In preparation for airworthiness recertification for the 737 MAX, Boeing has corrected the MCAS blunder by having its trim software compare inputs from two AoA sensors, alerting the pilots if the sensors’ readings disagree, and limiting MCAS’s authority.
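In the same spirit, here is a loose Python sketch of that corrected cross-check logic. The disagreement threshold, activation angle, and trim limit below are illustrative assumptions, not Boeing’s actual values or implementation:

```python
def mcas_command(aoa_a_deg, aoa_b_deg,
                 disagree_limit_deg=5.5,   # assumed threshold
                 activation_aoa_deg=14.0,  # assumed stall-onset angle
                 max_trim_deg=2.5):        # assumed authority limit
    """Compare both AoA sensors before commanding nose-down trim."""
    # Disagreeing sensors: deactivate and alert instead of trusting one.
    if abs(aoa_a_deg - aoa_b_deg) > disagree_limit_deg:
        return {"active": False, "alert": "AOA DISAGREE", "trim_deg": 0.0}
    aoa = (aoa_a_deg + aoa_b_deg) / 2
    # Bounded nose-down command, capped at max_trim_deg.
    trim = min(max(aoa - activation_aoa_deg, 0.0), max_trim_deg)
    return {"active": trim > 0, "alert": None, "trim_deg": trim}
```

The contrast with the original design is the point: no single sensor can now trigger a nosedive, and the command’s authority is bounded instead of repeating without limit.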

One key takeaway from the MCAS disaster is this: when you devise a highly reliable system, identify all single points of failure, and investigate how these risks and failure modes can be mitigated. Examine if every component of a product or a service you work on is a single point of failure by asking, “If this component fails, does the rest of the system still work, and, more importantly, does it still do the function it is supposed to do?”

Wondering what to read next?

  1. Availability Heuristic: Our Preference for the Familiar
  2. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  3. Be Smart by Not Being Stupid
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Problem Solving, Risk, Thinking Tools

Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’

November 5, 2019 By Nagesh Belludi

Jon Ziomek’s nonfiction history book Collision on Tenerife (2018) is the result of years of analysis of the world’s worst aviation disaster, on Tenerife Island in the Canary Islands of Spain.

Distinct Small Errors Can Become Linked and Amplified into a Big Tragedy

On 27-March-1977, two fully loaded Boeing 747 passenger jets operated by Pan American World Airways (Pan Am) and KLM Royal Dutch Airlines collided on the runway, killing 583 passengers and crew on the two airplanes. Only 61 survived—all from the Pan Am jet, including its pilot.

These two flights, and a few others, had been diverted to Tenerife after a bomb went off at the Gran Canaria Airport in Las Palmas, their original destination. Tenerife was not a major airport—it had a single runway, and taxi and parking space were limited. After the Las Palmas airport reopened, flights were cleared for takeoff from Tenerife, but fog rolled in, reducing visibility to less than 300 feet. Several airplanes that had been diverted to Tenerife had blocked the taxiway and the parking ramp. Therefore, the KLM and Pan Am jets taxied down the single runway in preparation for takeoff, the Pan Am behind the KLM.

At one end of the runway, the KLM jet turned 180 degrees into position for takeoff. In the meantime, the Pan Am jet was still taxiing on the runway, having missed its taxiway turnoff in the fog. The KLM pilot jumped the gun and started his take-off roll before he got clearance from traffic control.

When the pilots of the two jets caught sight of each other’s airplanes through the fog, it was too late for the Pan Am jet to clear out of the runway onto the grass and for the KLM jet to abort the takeoff. The KLM pilot lifted his airplane off the runway prematurely but could not avoid barreling into the Pan Am’s fuselage at 240 km/h. Both jets exploded into flames.

The accident was blamed on miscommunication—a breakdown of coordinated action, vague language from the control tower, the KLM pilot’s impatience to take off without clearance, and the distorted cross-talk of the KLM and Pan Am pilots and the controllers on a common radio channel.

Breakdown of Coordination Under Stress

Sweeping changes were made to international airline regulations following the accident: cockpit procedures were changed, standard phrases were introduced, and English was emphasized as a common working language.

In Collision on Tenerife, Jon Ziomek, a journalism professor at Northwestern University, gives a well-written, detailed account of all the mistakes leading up to the crash and its aftermath.

The surviving passengers’ first- and second-hand accounts recall the horror of those passengers on the right side of the Pan Am jet who saw the lights of the speeding KLM 747, just as the Pan Am pilot was hastily turning his airplane onto the grass to avoid the collision.

Ziomek describes how passengers escaped. Some had to make the difficult choice of leaving loved ones or friends and strangers behind.

Dorothy Kelly … then spotted Captain Grubbs lying near the fuselage. Badly burned and shaken by his jump from the plane, he could not move. “What have I done to these people?” he yelled, pounding the ground in anguish. Kelly grabbed him under his shoulders and urged “Crawl, Captain, crawl!”

Recommendation: Read Jon Ziomek’s Collision on Tenerife

Some of the bewildering details make for difficult reading—especially the psychological effects (post-traumatic stress disorder) on the surviving passengers. But Jon Ziomek’s Collision on Tenerife is important reading, providing a comprehensive picture of the extensive coordination required in aviation, the importance of safety and protocols, and how some humans can freeze in shock while others spring into action.

The key takeaway is the recognition of how small errors and problems (an “error chain”) can quickly become linked and amplified into disastrous outcomes.

Wondering what to read next?

  1. “Fly the Aircraft First”
  2. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  3. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  4. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  5. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Anxiety, Assertiveness, Aviation, Biases, Books for Impact, Conflict, Decision-Making, Mindfulness, Problem Solving, Stress, Thinking Tools, Worry

How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

October 1, 2019 By Nagesh Belludi

As I’ve examined previously, airline disasters are particularly instructive on the subjects of cognitive impairment and decision-making under stress.

Consider the case of TransAsia Airways Flight 235 that crashed in 2015 soon after takeoff from an airport in Taipei, Taiwan. Accident investigations revealed that the pilots of the ATR 72-600 turboprop erroneously switched off the plane’s working engine after the other lost power. Here’s a rundown of what happened:

  1. About one minute after takeoff, at 1,300 feet, engine #2 had an uncommanded autofeather failure. This is a routine engine failure—the aircraft is designed to be flyable on one engine.
  2. The Pilot Flying misdiagnosed the problem, and assumed that the still-functional engine #1 had failed. He retarded power on engine #1 and it promptly shut down.
  3. With power lost on both the engines, the pilots did not react to the stall warnings in a timely and effective manner. The Pilot Flying acknowledged his error, “wow, pulled back the wrong side throttle.”
  4. The aircraft continued its descent. The pilots rushed to restart engine #1, but the remaining altitude was not adequate enough to recover the aircraft.
  5. In a state of panic, the Pilot Flying clasped the flight controls and steered (see this video) the aircraft perilously to avoid apartment blocks and commercial buildings before clipping a bridge and crashing into a river.

A High Level of Stress Can Diminish Your Problem-solving Capabilities

Thrown into disarray after a routine engine failure, the pilots of TransAsia flight 235 did not perform their airline’s abnormal and emergency procedures to identify the failure and implement the required corrective actions. Their ineffective coordination, communication, and error management compromised the safety of the flight.

The combination of sudden threat and extreme time pressure to avert a danger fosters a state of panic, in which decision-makers are inclined to commit themselves impulsively to courses of action that they will soon come to regret.

Idea for Impact: To combat cognitive impairment under stress, rely on checklists and standard operating procedures, and invest in training on situational awareness, crisis communication, and emergency management.

Wondering what to read next?

  1. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  2. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  3. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  4. “Fly the Aircraft First”
  5. Jeju Air Flight 2216—The Alleged Failure to Think Clearly Under Fire

Filed Under: Business Stories, Leadership, Sharpening Your Skills Tagged With: Anxiety, Aviation, Biases, Decision-Making, Emotions, Mental Models, Mindfulness, Problem Solving, Risk, Stress, Thought Process, Worry

Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

September 3, 2019 By Nagesh Belludi Leave a Comment

In the context of decision-making and risk-taking, the “overconfidence effect” is a judgmental bias that can affect your subjective estimate of the likelihood of future events. This can cause you to misjudge the odds of positive/desirable events as well as negative/undesirable events.

As the following Zen story illustrates, experience breeds complacency. When confidence gives way to overconfidence, it can transform from a strength to a liability.

A master gardener, famous for his skill in climbing and pruning the highest trees, examined his disciple by letting him climb a very high tree. Many people had come to watch. The master gardener stood quietly, carefully following every move but not interfering with one word.

Having pruned the top, the disciple climbed down and was only about ten feet from the ground when the master suddenly yelled: “Take care, take care!”

When the disciple was safely down an old man asked the master gardener: “You did not let out one word when he was aloft in the most dangerous place. Why did you caution him when he was nearly down? Even if he had slipped then, he could not have greatly hurt himself.”

“But isn’t it obvious?” replied the master gardener. “Right up at the top he is conscious of the danger, and of himself takes care. But near the end when one begins to feel safe, this is when accidents occur.”

Reference: Irmgard Schlögl’s The Wisdom of the Zen Masters (1976). Dr. Schlögl (1921–2007) was ordained as Ven. Myokyo-ni in 1984; she served as a Rinzai Zen Buddhist nun and headed the Zen Centre in London.

Wondering what to read next?

  1. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  2. Be Smart by Not Being Stupid
  3. Smart Folks are Most Susceptible to Overanalyzing and Overthinking
  4. Increase Paranoia When Things Are Going Well
  5. How to … Escape the Overthinking Trap

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Confidence, Critical Thinking, Decision-Making, Mindfulness, Parables, Risk, Thinking Tools, Thought Process, Wisdom

Your Product May Be Excellent, But Is There A Market For It?

July 24, 2019 By Nagesh Belludi 1 Comment

Akio Morita, the visionary co-founder of Sony, liked to tell a story about recognizing opportunities and shaping them into business concepts.

Two shoe salesmen … find themselves in a rustic backward part of Africa. The first salesman wires back to his head office: “There is no prospect of sales. Natives do not wear shoes!” The other salesman wires: “No one wears shoes here. We can dominate the market. Send all possible stock.”

Morita, along with his co-founder Masaru Ibuka, was a genius at creating consumer products for which no obvious demand existed, and then generating demand for them. Sony’s hits included such iconic products as the hand-held transistor radio, the Walkman portable cassette player, the Discman portable compact disc player, and the Betamax videocassette recorder.

Products Lost in Translation

As the following case studies will illustrate, many companies haven’t had Sony’s luck in launching products that can stir up demand.

In each case, deeply ingrained cultural attitudes explain why consumers failed to embrace the products introduced into their markets.

Case Study #1: Nestlé’s Paloma Iced Tea in India

When the Swiss packaged-food multinational Nestlé introduced Paloma iced tea in India in the 1980s, its market assessment was that the Indian beverage market was ready for an iced-tea variety.

To be sure, folks in India love tea. They consume it multiple times a day. However, they must have it hot—even in the heat of summer. Street-side tea vendors are a familiar sight in India. Huddled around the chaiwalas are patrons sipping hot tea and relishing a savory samosa or a saccharine jalebi.

It’s no wonder, then, that, despite all the marketing efforts, Paloma turned out to be a debacle. Nestlé withdrew the product within a year.

Case Study #2: Kellogg’s Cornflakes in India

The American packaged-foods multinational Kellogg’s failed in its initial introduction of cornflakes into the Indian market in the mid-’90s. Kellogg’s quickly realized that its products were alien to Indian consumption habits: accustomed to traditional hot, spicy, and heavy fare, Indian consumers felt hungry again soon after eating a bowl of sweet cornflakes for breakfast. In addition, they poured hot milk over the cornflakes, rendering them soggy and less appetizing.

Case Study #3: Oreo Cookies in China

When Kraft Foods launched Oreo in China in 1996, America’s best-loved sandwich cookie didn’t fare well. Executives at Kraft’s Chicago headquarters had expected to simply drop the American cookie into the Chinese market and watch it fly off the shelves.

Chinese consumers found Oreos too sweet. And the ritual of twisting open the cookies, licking the cream inside, and then dunking them in milk was considered a “strangely American habit.”

Not until Kraft’s local Chinese leaders developed a local concept—a wafer format in subtler flavors such as green-tea ice cream—did Oreo become popular.

Idea for Impact: Your expertise may not translate to unfamiliar foreign markets

If success in marketing is all about understanding consumers, then you must be grounded in the reality of their lives to understand their priorities.

  • Don’t assume that what makes a product successful in one market will be a winning formula in other markets as well.
  • Make products resonate with local cultures by contextualizing the products and tailoring them for local preferences.
  • Use small-scale testing to make sure your product can sway buyers.

Wondering what to read next?

  1. The Loss Aversion Mental Model: A Case Study on Why People Think Spirit is a Horrible Airline
  2. Starbucks’ Oily Brew: Lessons on Innovation Missing the Mark
  3. What Taco Bell Can Teach You About Staying Relevant
  4. Find out What Your Customers Want and Give it to Them
  5. The Mere Exposure Effect: Why We Fall for the Most Persistent

Filed Under: Business Stories, Leadership, Managing Business Functions, MBA in a Nutshell, Mental Models, Sharpening Your Skills, The Great Innovators Tagged With: Biases, Creativity, Customer Service, Entrepreneurs, Feedback, Innovation, Leadership Lessons, Parables, Persuasion, Thought Process

Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!