
Right Attitudes

Ideas for Impact


The Boeing 737 MAX’s Achilles Heel

January 7, 2020 By Nagesh Belludi

Two thousand nineteen was one of the most turbulent years in Boeing’s history. Its 737 MAX troubles went from bad to worse to staggering as aviation regulators around the world grounded the aircraft and a steady trickle of disclosures exposed software problems and corners being cut.

The aircraft’s central flaw, an anti-stall mechanism that relied on data from a single sensor, offers a particularly instructive case study of the single point of failure.

One Fault Could Cause an Entire System to Stop Operating

A single point of failure of a system is an element whose failure can result in the failure of the entire system. (A system may have multiple single points of failure.)

Single points of failure are eliminated by adding redundancy, either duplicating the critical components or backing them up, so that the failure of any one element does not bring down the entire system.

Boeing Mischaracterized Its Anti-Stall System as Less-than-Catastrophic in Its Safety Analysis

The two 737 MAX crashes (Lion Air and Ethiopian Airlines) originated from a late change that Boeing made to a trim system called the Maneuvering Characteristics Augmentation System (MCAS).

Without pilot input, the MCAS could automatically nudge the aircraft’s nose downwards if it detected that the aircraft was pointing up at a dangerous angle, for instance, at high thrust during takeoff.

Reliance on One Sensor is Anathema in Aviation

The MCAS had previously been approved by the Federal Aviation Administration (FAA). However, Boeing made some design changes after that approval without checking with the FAA again. The late changes were intended to improve the MCAS’s response during low-speed aerodynamic stalls.

The MCAS relied on data from just one angle-of-attack (AoA) sensor. With no backup, a malfunction of this single sensor could feed erroneous input that would trigger a corrective nosedive just after takeoff. That is precisely what happened in both crashes.

The AoA sensor thus became a single point of failure. Although the aircraft has two angle-of-attack sensors on its nose, the MCAS took data from only one of them at a time and did not require agreement between the two before inferring that the aircraft was stalling. Further, Lion Air had not paid for the optional warning light that could have alerted the crew to a disagreement between the AoA sensors.

Boeing Missed Safety Risks in the Design of the MAX’s Flight-Control System

Reliance on one sensor’s data is an egregious violation of a long-standing engineering principle: eliminate single points of failure. Some aircraft use three redundant systems for flight control: if one of the three malfunctions, so that two systems agree and the third does not, the flight-control software ignores the odd one out.
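The two-out-of-three voting idea can be sketched in a few lines. This is a minimal illustration only, not avionics code; the function name `vote_2oo3`, the tolerance value, and the fail-safe behavior are assumptions made for the example.

```python
def vote_2oo3(a: float, b: float, c: float, tolerance: float = 1.0) -> float:
    """Two-out-of-three voting: return a value backed by at least two
    of three redundant sensors, ignoring the odd one out."""
    # Check each pair; two readings "agree" if they are within `tolerance`.
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2  # average the agreeing pair
    # No two channels agree: fail safe rather than trust any one sensor.
    raise RuntimeError("no two sensors agree; failing safe")

# The third sensor (15.0) has drifted badly; the voter ignores it.
print(vote_2oo3(4.5, 5.5, 15.0))  # → 5.0
```

The contrast with the MCAS design is the point: with only a single sensor there is nothing to vote against, so one bad reading becomes the system’s truth.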

As if dependence on a single sensor were not enough, Boeing, blinded by time and cost pressure to stay competitive with its European rival Airbus, deliberately omitted any reference to the MCAS from pilot manuals to spare its airline customers pilot training. Indeed, Boeing did not even disclose the existence of the MCAS on the aircraft.

Pilots can switch the trim system off to override the automated anti-stall behavior, but the pilots of the ill-fated Lion Air and Ethiopian Airlines flights failed to do so.

Idea for Impact: Redundancy is the Sine Qua Non of Reliable Systems

In preparation for the 737 MAX’s airworthiness recertification, Boeing corrected the MCAS blunder by having its trim software compare inputs from both AoA sensors, alert the pilots if the sensors’ readings disagree, and limit the MCAS’s authority.
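That corrected logic amounts to a cross-check between the two channels plus a clamp on the command the software may issue. The sketch below is hypothetical throughout (the function name, the disagreement threshold, the toy control law, and the authority limit are all invented for illustration); it is not Boeing’s actual software.

```python
DISAGREE_THRESHOLD_DEG = 5.5  # assumed disagreement threshold, degrees
MAX_NOSE_DOWN_CMD = 1.0       # authority limit, arbitrary units

def mcas_command(aoa_left: float, aoa_right: float) -> tuple[float, bool]:
    """Return (trim_command, disagree_alert) from two AoA sensor readings."""
    if abs(aoa_left - aoa_right) > DISAGREE_THRESHOLD_DEG:
        return 0.0, True  # sensors disagree: inhibit the system, alert the crew
    aoa = (aoa_left + aoa_right) / 2
    raw_cmd = max(0.0, aoa - 10.0) * 0.5           # toy control law
    return min(raw_cmd, MAX_NOSE_DOWN_CMD), False  # clamp the authority

print(mcas_command(12.0, 30.0))  # disagreement → (0.0, True), system inhibited
print(mcas_command(14.0, 14.0))  # agreement → (1.0, False), command clamped
```

In the original single-sensor design, by contrast, one bad reading fed directly into the trim command with no cross-check and far greater authority.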

One key takeaway from the MCAS disaster is this: when you devise a highly reliable system, identify all single points of failure, and investigate how these risks and failure modes can be mitigated. Examine if every component of a product or a service you work on is a single point of failure by asking, “If this component fails, does the rest of the system still work, and, more importantly, does it still do the function it is supposed to do?”


Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Problem Solving, Risk, Thinking Tools

Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’

November 5, 2019 By Nagesh Belludi

Jon Ziomek’s nonfiction history book Collision on Tenerife (2018) is the result of years of analysis of the world’s worst aviation disaster, on Tenerife Island in Spain’s Canary Islands.

Distinct Small Errors Can Become Linked and Amplified into a Big Tragedy

On March 27, 1977, two fully loaded Boeing 747 passenger jets operated by Pan American World Airways (Pan Am) and KLM Royal Dutch Airlines collided on the runway, killing 583 passengers and crew on the two airplanes. Only 61 people survived, all from the Pan Am jet, including its pilot.

These two flights, among others, were diverted to Tenerife after a bomb went off at Gran Canaria Airport in Las Palmas, their original destination. Tenerife was not a major airport: it had a single runway, and taxi and parking space were limited. After the Las Palmas airport reopened, flights were cleared for takeoff from Tenerife, but fog rolled in, reducing visibility to less than 300 feet. Because the airplanes diverted to Tenerife had blocked the taxiway and the parking ramp, the KLM and Pan Am jets taxied down the single runway in preparation for takeoff, the Pan Am behind the KLM.

At one end of the runway, the KLM jet turned 180 degrees into position for takeoff. In the meantime, the Pan Am jet was still taxiing on the runway, having missed its taxiway turnoff in the fog. The KLM pilot jumped the gun and started his take-off roll before he got clearance from traffic control.

When the pilots of the two jets caught sight of each other’s airplanes through the fog, it was too late for the Pan Am jet to clear the runway and for the KLM jet to abort the takeoff. The KLM pilot lifted his airplane off the runway prematurely but could not avoid barreling into the Pan Am’s fuselage at 240 km/h. Both jets exploded into flames.

The accident was blamed on miscommunication: a breakdown of coordinated action, vague language from the control tower, the KLM pilot’s impatience to take off without clearance, and the garbled cross-talk of the KLM and Pan Am pilots and the controllers on a common radio channel.

Breakdown of Coordination Under Stress

Sweeping changes were made to international airline regulations following the accident: cockpit procedures were changed, standard phrases were introduced, and English was emphasized as a common working language.

In Collision on Tenerife, Jon Ziomek, a journalism professor at Northwestern University, gives a well-written, detailed account of all the mistakes leading up to the crash and its aftermath.

The surviving passengers’ first- and second-hand accounts recall the horror of those on the right side of the Pan Am jet who saw the lights of the speeding KLM 747 just as the Pan Am pilot was hastily turning his airplane onto the grass to avoid the collision.

Ziomek describes how passengers escaped. Some had to make the difficult choice of leaving loved ones or friends and strangers behind.

Dorothy Kelly … then spotted Captain Grubbs lying near the fuselage. Badly burned and shaken by his jump from the plane, he could not move. “What have I done to these people?” he yelled, pounding the ground in anguish. Kelly grabbed him under his shoulders and urged “Crawl, Captain, crawl!”

Recommendation: Read Jon Ziomek’s Collision on Tenerife

Some of the bewildering details make for difficult reading—especially the psychological effects (post-traumatic stress disorder) on the surviving passengers. But Jon Ziomek’s Collision on Tenerife is important reading, providing a comprehensive picture of the extensive coordination required in aviation, the importance of safety and protocols, and how some humans can freeze in shock while others spring into action.

The key takeaway is the recognition of how small errors and problems (an “error chain”) can quickly become linked and amplified into disastrous outcomes.


Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Anxiety, Assertiveness, Aviation, Biases, Books for Impact, Conflict, Decision-Making, Mindfulness, Problem Solving, Stress, Thinking Tools, Worry

How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235

October 1, 2019 By Nagesh Belludi

As I’ve examined previously, airline disasters are particularly instructive on the subjects of cognitive impairment and decision-making under stress.

Consider the case of TransAsia Airways Flight 235 that crashed in 2015 soon after takeoff from an airport in Taipei, Taiwan. Accident investigations revealed that the pilots of the ATR 72-600 turboprop erroneously switched off the plane’s working engine after the other lost power. Here’s a rundown of what happened:

  1. About one minute after takeoff, at 1,300 feet, engine #2 suffered an uncommanded autofeather failure. This is a routine engine failure: the aircraft is designed to fly on one engine.
  2. The Pilot Flying misdiagnosed the problem and assumed that the still-functional engine #1 had failed. He retarded power on engine #1, and it promptly shut down.
  3. With power lost on both engines, the pilots did not react to the stall warnings in a timely and effective manner. The Pilot Flying acknowledged his error: “wow, pulled back the wrong side throttle.”
  4. The aircraft continued its descent. The pilots rushed to restart engine #1, but the remaining altitude was not enough to recover the aircraft.
  5. In a state of panic, the Pilot Flying clasped the flight controls and steered the aircraft perilously to avoid apartment blocks and commercial buildings before clipping a bridge and crashing into a river.

A High Level of Stress Can Diminish Your Problem-solving Capabilities

Thrown into disarray after a routine engine failure, the pilots of TransAsia flight 235 did not perform their airline’s abnormal and emergency procedures to identify the failure and implement the required corrective actions. Their ineffective coordination, communication, and error management compromised the safety of the flight.

The combination of sudden threat and extreme time pressure to avert a danger fosters a state of panic, in which decision-makers are inclined to commit themselves impulsively to courses of action that they will soon come to regret.

Idea for Impact: To combat cognitive impairment under stress, use checklists and standard operating procedures, and increase training in situational awareness, crisis communication, and emergency management.


Filed Under: Business Stories, Leadership, Sharpening Your Skills Tagged With: Anxiety, Aviation, Biases, Decision-Making, Emotions, Mental Models, Mindfulness, Problem Solving, Risk, Stress, Thought Process, Worry

Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

September 3, 2019 By Nagesh Belludi

In the context of decision-making and risk-taking, the “overconfidence effect” is a judgmental bias that can affect your subjective estimate of the likelihood of future events. This can cause you to misjudge the odds of positive/desirable events as well as negative/undesirable events.

As the following Zen story illustrates, experience breeds complacency. When confidence gives way to overconfidence, a strength becomes a liability.

A master gardener, famous for his skill in climbing and pruning the highest trees, examined his disciple by letting him climb a very high tree. Many people had come to watch. The master gardener stood quietly, carefully following every move but not interfering with one word.

Having pruned the top, the disciple climbed down and was only about ten feet from the ground when the master suddenly yelled: “Take care, take care!”

When the disciple was safely down an old man asked the master gardener: “You did not let out one word when he was aloft in the most dangerous place. Why did you caution him when he was nearly down? Even if he had slipped then, he could not have greatly hurt himself.”

“But isn’t it obvious?” replied the master gardener. “Right up at the top he is conscious of the danger, and of himself takes care. But near the end when one begins to feel safe, this is when accidents occur.”

Reference: Irmgard Schlögl’s The Wisdom of the Zen Masters (1976). Dr. Schlögl (1921–2007) was ordained as Ven. Myokyo-ni in 1984; she served as a Rinzai Zen Buddhist nun and headed the Zen Centre in London.


Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Confidence, Critical Thinking, Decision-Making, Mindfulness, Parables, Risk, Thinking Tools, Thought Process, Wisdom

Your Product May Be Excellent, But Is There A Market For It?

July 24, 2019 By Nagesh Belludi

Akio Morita, the visionary co-founder of Sony, liked to tell a story about recognizing opportunities and shaping them into business concepts.

Two shoe salesmen … find themselves in a rustic backward part of Africa. The first salesman wires back to his head office: “There is no prospect of sales. Natives do not wear shoes!” The other salesman wires: “No one wears shoes here. We can dominate the market. Send all possible stock.”

Morita, along with his co-founder Masaru Ibuka, was a genius at creating consumer products for which no obvious demand existed, and then generating demand for them. Sony’s hits included such iconic products as the hand-held transistor radio, the Walkman portable audio cassette player, the Discman portable compact disc player, and the Betamax videocassette recorder.

Products Lost in Translation

As the following case studies will illustrate, many companies haven’t had Sony’s luck in launching products that can stir up demand.

In each case, deeply ingrained cultural attitudes explain why consumers failed to embrace products introduced into their respective markets.

Case Study #1: Nestlé’s Paloma Iced Tea in India

When the Swiss packaged-food multinational Nestlé introduced Paloma iced tea in India in the ’80s, its market assessment was that the Indian beverage market was ready for an iced tea variety.

Sure, folks in India love tea; they consume it multiple times a day. However, they must have it hot, even in the heat of summer. Street-side tea vendors are a familiar sight in India. Huddled around the chaiwalas are patrons sipping hot tea and relishing a savory samosa or a syrupy jalebi.

It’s no wonder, then, that despite all the marketing efforts, Paloma turned out to be a debacle. Nestlé withdrew the product within a year.

Case Study #2: Kellogg’s Cornflakes in India

The American packaged-foods multinational Kellogg’s failed in its initial introduction of cornflakes into the Indian market in the mid-’90s. Kellogg’s quickly realized that its products were alien to Indian consumption habits: accustomed to traditional hot, spicy, and heavy grub, Indians felt hungry after a bowl of sweet cornflakes for breakfast. In addition, they poured hot milk over the cornflakes, rendering them soggy and less appetizing.

Case Study #3: Oreo Cookies in China

When Kraft Foods launched Oreo in China in 1996, America’s best-loved sandwich cookie didn’t fare very well. Executives in Kraft’s Chicago headquarters expected to just drop the American cookie into the Chinese market and watch it fly off shelves.

Chinese consumers found Oreos too sweet. The ritual of twisting open the cookies, licking the cream inside, and then dunking them in milk was considered a “strangely American habit.”

Not until Kraft’s local Chinese leaders developed a local concept—a wafer format in subtler flavors such as green-tea ice cream—did Oreo become popular.

Idea for Impact: Your expertise may not translate to unfamiliar foreign markets

In marketing, success is all about understanding consumers: you must be grounded in the reality of their lives to understand their priorities.

  • Don’t assume that what makes a product successful in one market will be a winning formula in other markets as well.
  • Make products resonate with local cultures by contextualizing the products and tailoring them for local preferences.
  • Use small-scale testing to make sure your product can sway buyers.


Filed Under: Business Stories, Leadership, Managing Business Functions, MBA in a Nutshell, Mental Models, Sharpening Your Skills, The Great Innovators Tagged With: Biases, Creativity, Customer Service, Entrepreneurs, Feedback, Innovation, Leadership Lessons, Parables, Persuasion, Thought Process

How to Make Others Feel They Owe You One: Reciprocity and Social Influence

September 18, 2018 By Nagesh Belludi

Reciprocity, as described below, is a manipulative technique. My aim in this article is twofold: first, to sensitize you to one of the many things people can do to get you to do their bidding; second, to show that reciprocity is a handy technique for those circumstances where certain ends can justify certain means.

Reciprocity is treating other people as they treat you, or for the purpose of this article, as you wish to be treated—specifically with the expectation that they will reciprocate your favor in the future.

In other words, reciprocity is a sneaky trick that permits deliberate interpersonal influence. Do something for other people and they will be willing to do something for you, partly because they’ll be uncomfortable feeling indebted to you.

The concept of reciprocity is ingrained in human nature. As part of our upbringing, we are taught to give something back to people who give us something. Reciprocity and cooperation are the underpinnings of a civilized society—they allow us to help people who need it and to hope that they will help us when we need it. Research suggests that the desire to repay goodwill is hard-wired in the human brain.

Jack Schafer’s The Like Switch: An Ex-FBI Agent’s Guide to Influencing, Attracting, and Winning People Over (2015) offers a clever technique to put reciprocity into action:

The next time someone thanks you for something, don’t say, “You’re welcome.” Instead, say, “I know you’d do the same thing for me.” This response invokes reciprocity. The other person is now predisposed to help you when you ask them for a favor.

The effects of goodwill are short-lived. A long-forgotten reputation for helpfulness gets you nothing. You have to renew your reputation by helping others regularly.

To learn more about reciprocity, read social psychologist Robert Cialdini’s Influence: The Psychology of Persuasion (1984). He identified reciprocity as one of six principles for gaining others’ compliance with your requests.


Filed Under: Mental Models, Sharpening Your Skills Tagged With: Assertiveness, Biases, Ethics, Likeability, Negotiation, Persuasion, Psychology, Relationships, Social Life, Social Skills

The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’

September 10, 2018 By Nagesh Belludi

Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018) is Wall Street Journal investigative reporter John Carreyrou’s remarkable exposé on Theranos, the former high-flying Silicon Valley tech startup founded by Elizabeth Holmes.

Theranos formally dissolved last week after a high-profile scandal revealed that the company not only deceived investors, but also risked the health of thousands of patients.

A Gripping Narrative, A Charismatic CEO, and A Big Fraud

In 2015, Theranos was one of Silicon Valley’s superstars. Valued at some $9 billion, Theranos claimed an out-and-out disruption of the $73-billion-a-year blood testing industry. Elizabeth Holmes pitched a revolutionary technology that could perform multiple tests on a few drops of capillary blood drawn by a minimally invasive finger prick, instead of the conventional—and much dreaded—venipuncture needle method.

Theranos has its origins in 2004, when the brilliant Holmes, then a 19-year-old Stanford sophomore, dropped out of college to start the company. Her missionary narrative swayed just about everyone to believe in the potential she touted.

Over the years, Theranos attracted $1 billion in investment, an illustrious board of directors, influential business partners (Walgreens, Safeway, Cleveland Clinic), and much media adulation, all of which lent credence to Holmes’s undertaking. She was celebrated as the world’s youngest self-made female billionaire.

Nobody Asked the Hard Questions

Theranos’s castle in the air started to crumble in October 2015, when Carreyrou’s first Wall Street Journal article reported that the company was exaggerating the potential of its technology. Based on past employees’ disclosures, the article also cast serious doubts on the reliability of Theranos’s science. Behind the scenes, Theranos performed a majority of its blood tests with commercial analyzers purchased from other companies.

The persistent question in Carreyrou’s Bad Blood is why the many smart people who funded, endorsed, defended, and wrote about this company never set aside their confidence in Holmes’s persuasiveness and looked beyond her claim of “30 tests from one drop of blood.”

Without much independent due diligence, Theranos’s supporters possibly assumed that everyone else had checked out the company, its founders, and its science. Theranos got away with its actions for as long as it did because no one could conceive of the idea that the business would simply lie as much as it did.

The Story of Theranos and Elizabeth Holmes Appeared so Promising That Everybody Wanted it to Be True

Bad Blood also draws attention to Silicon Valley’s many failings, including the cult of the celebrity founder. Holmes’s smoke and mirrors was enabled by the notion of a “stealth mode” in which many Silicon Valley startups operate to protect their intellectual property. Theranos never proved that its testing technology really worked. It was performing tests on patients without having published peer-reviewed studies, getting FDA certification, or carrying out external evaluation by medical experts.

Carreyrou acknowledges that Holmes’s initial intentions were honorable, even if naïve. What triggered her downfall was the characteristic entrepreneurial “fake it till you make it” ethos: it inhibited her from conceding early on that her ambitions were simply not viable.

When things didn’t go as intended, Holmes exploited the power of storytelling to get everyone to buy into her tales. She continued to believe that the reality of the technology would catch up with her vision in the future. Trapped in a web of hyperbole and overpromises, Holmes and her associate (as well as then-lover) Sunny Balwani operated a culture of fear and intimidation at Theranos. They went as far as hiring superstar lawyers to threaten and silence employees and anyone else who dared to challenge the company or expose its deficiencies.

Book Recommendation: Bad Blood is a Must-Read

Every inventor, entrepreneur, investor, and businessperson should read Bad Blood. It’s a fascinating and meticulously researched report of personal and corporate ambition unraveled by dishonesty. This page-turner is a New York Times bestseller and is expected to be made into a movie.


Filed Under: Business Stories, Leadership Reading Tagged With: Biases, Entrepreneurs, Ethics, Icons, Leadership Lessons, Likeability, Psychology

Beware of Key-Person Dependency Risk

September 7, 2018 By Nagesh Belludi

Key-Person Dependency Risk is the threat posed by an organization or a team’s over-reliance on one or a few individuals.

The key person has sole custody of critical institutional knowledge, creativity, reputation, or experience that makes them indispensable to the organization’s business continuity and future performance. If they leave, the organization loses that valued standing and expertise.

Small businesses and start-ups are especially exposed to key-person dependency risk. Tesla, for example, faces a colossal key-man risk—its fate is linked closely to the actions of founder-CEO Elon Musk, who has come under scrutiny lately.

Much of Berkshire Hathaway’s performance over the decades has been based on CEO Warren Buffett’s reputation and his ability to wring remarkable deals from companies in distress. There’s a great deal of prestige in selling one’s business to Buffett. He is irreplaceable; given his remarkable long-term record of accomplishment, it is important that much of what he has built over the years remains intact once he is gone. Buffett has built a strong culture that is likely to endure.

Key Employees are Not Only Assets, but also Large Contingent Liabilities

The most famous “key man” of all time was Apple’s Steve Jobs. Not only was he closely linked to his company’s identity, but he also played a singular role in building Apple into the global consumer-technology powerhouse that it is. Jobs had steered Apple’s culture in a desired direction and groomed his handpicked management team to sustain Apple’s inventive culture after he was gone. Tim Cook, the operations genius who became Apple’s CEO after Jobs died in 2011, has led the company to new heights.

The basic solution to key-person dependency risk is to identify and document critical knowledge of the organization. (Capturing tacit knowledge is not easy when it resides “in the key-person’s head.”) Organizations must also focus on cross-training and succession planning to identify and enable others to develop and perform the same tasks as the key-person.

Idea for Impact: No employee should be indispensable. A well-managed company is never dependent upon the performance of one or a few individuals. As well, no employee should be allowed to hoard knowledge, relationships, or resources to achieve job security.


Filed Under: Business Stories, Managing People, MBA in a Nutshell, Mental Models Tagged With: Biases, Career Planning, Entrepreneurs, Human Resources, Icons, Leadership Lessons, Mental Models, Personality, Risk, Role Models

How Far You’ve Come

August 2, 2018 By Nagesh Belludi

While browsing advertising genius David Ogilvy’s The Unpublished David Ogilvy, I stumbled across a mention of a 1964 letter of introduction that Ogilvy received from a gifted job applicant.

Ogilvy calls this “the best job application letter I have ever received.” The first paragraph announces,

My father was in charge of the men’s lavatory at the Ritz Hotel. My mother was a chambermaid at the same hotel. I was educated at the London School of Economics.

Ray Taylor, that aspirant, became an Ogilvy & Mather copywriter.

It reminded me of a quotation from the American priest Henry Ward Beecher: “We should not judge people by their peak of excellence; but by the distance they have traveled from the point where they started.”

Idea for Impact: Appreciate how far you (and others) have come.


Filed Under: Managing People, Sharpening Your Skills Tagged With: Biases, Great Manager, Hiring & Firing, Life Plan

The Historian’s Fallacy: People of the Past Had No Knowledge of the Future

June 7, 2018 By Nagesh Belludi

The practice of picking a thesis and then setting out to establish it is a widespread intellectual pursuit. But biographers and historians sometimes portray their subjects as if the historical participants could recognize what lay ahead of them.

Assuming that people of the past pondered over the events of their day from the same perspective as we do in the present is committing The Historian’s Fallacy.

The notion of the historian’s fallacy was first presented by the British literary critic Matthew Arnold (1822–88) in The Study of Poetry (1880.) In questioning how historical backgrounds were portrayed in the development of literary styles, Arnold called attention to the frequent logical error of using hindsight to assign a sense of causality and foresight of significant historical events to the people who lived through them. In reality, those historical participants may not have had the wide-ranging perspective that we assume in interpreting the context, conventions, and limitations of their time. Arnold wrote,

The course of development of a nation’s language, thought, and poetry, is profoundly interesting; and by regarding a poet’s work as a stage in this course of development we may easily bring ourselves to make it of more importance as poetry than in itself it really is, we may come to use a language of quite exaggerated praise in criticising it; in short, to overrate it. So arises in our poetic judgments the fallacy caused by the estimate which we may call historic. … Our personal affinities, likings and circumstances, have great power to sway our estimate of this or that poet’s work, and to make us attach more importance to it as poetry than in itself it really possesses, because to us it is, or has been, of high importance.

The American historian David Hackett Fischer, who coined the phrase “historian’s fallacy,” cited the claim that the United States should have anticipated Japan’s surprise attack on Pearl Harbor because of the many warning signs that an attack was in the cards. Fischer argues those signs seem obvious only in hindsight—to the World War II military leaders, many of those signs suggested possible attacks on many positions other than Pearl Harbor.

A good historian strives for objectivity by setting aside his own knowledge of subsequent events and employing only what the historical participants would have known in the context of their own time.

A related fallacy is Presentism—a manner of historical analysis wherein the past is interpreted by means of present-day attitudes. Presentism often fosters moral self-righteousness. Reflections on the Founding Fathers’ ownership of slaves, David Hume’s racism, or Gandhi’s opposition to modernity and technology should not be tainted by such temporal condescension.

Wondering what to read next?

  1. Increase Paranoia When Things Are Going Well
  2. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  3. The Upsides of Slowing Down
  4. How to Solve a Problem By Standing It on Its Head
  5. Four Ideas for Business Improvement

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Governance, Mental Models, Thinking Tools, Thought Process

Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!