
Right Attitudes

Ideas for Impact

Biases

Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

July 10, 2023 By Nagesh Belludi

Picture this: You’re parking your car when you catch sight of the bus you desperately need to catch pulling into the station. Acting on instinct, you swing into a vacant spot, gather your bags, and sprint toward the bus stop, determined to avoid a tedious fifteen-minute wait for the next one. In the whirlwind of your frantic dash, you hastily tuck your cherished cell phone into your back pocket, oblivious that it slips out along the way. Only five minutes later do you notice your phone is missing, and the weight of its loss descends upon you.

Isn’t it fascinating how our minds tend to close off under time pressure? This cognitive phenomenon is known as the “narrowing of the cognitive map.” It’s as if our attention becomes laser-focused, yet that very focus can lead us into errors of judgment.

When we find ourselves in the clutches of tunnel vision, our thinking becomes constrained, and we unknowingly fall into the trap of limited perspective. Not only do we become so fixated on a specific course of action that we overlook crucial details in our environment, but we also become oblivious to the subtle signals whispering, “Something’s amiss.”

Inattentional blindness, indeed. It’s a common problem in high-stress situations, and it can have serious consequences, as in the following case study of the Singapore Airlines Flight 6 crash.

Speed Stress Causes Serious Breakdowns in the Reliability of Judgment

Flight 6’s tragic accident occurred on October 31, 2000, at Taipei’s Chiang Kai-shek International Airport. Various factors contributed to the crash, including severe weather conditions, limited visibility, inadequate airport markings, and insufficient actions by both the pilots and air traffic controllers.

During a scheduled stop in Taipei on its journey from Singapore to Los Angeles, Flight 6’s flight crew became aware of an approaching storm. They realized that if they delayed the takeoff, they would have to wait for the storm to pass, resulting in a lengthy 12-hour delay. This interruption would have entailed making overnight arrangements for the passengers, disrupting the crew’s schedule, and potentially impacting future flight schedules involving the aircraft and company personnel. Consequently, the crew made the decision to expedite the departure and take off before the typhoon made landfall on the island.

The Rushed Pilots Missed Clues That They Were Taking Off on a Closed Runway

Under immense time pressure, the flight crew became singularly focused on expediting their takeoff in rainy and windy conditions before the weather conditions deteriorated further. Despite being instructed to taxi to Runway 05 Left, they deviated from the assigned route and instead positioned themselves on Runway 05 Right, which was closed for takeoff due to ongoing pavement repairs.

Complicating matters, a section of Runway 05 Right was still being used as a taxiway during the construction period. The signage at the entrance of the runway did not adequately warn of the closure; there was no stop marker, and construction equipment sat along the converted taxiway.

Moreover, the local air traffic controller failed to provide progressive taxi or ground movement instructions, which would have been appropriate considering the low visibility during the taxi. However, due to the crew’s heightened sense of urgency, they neglected to request step-by-step instructions for their taxi route.

Misleading Airport Markings Contributed to Pilots’ Mistaken Belief of Correct Runway Selection

In the midst of low visibility and feeling rushed, the pilots neglected crucial resources that could have guided them to the correct runway, such as runway and taxiway charts, signage, markings, and cockpit instruments. This lapse in judgment resulted in a loss of situational awareness, leading them to initiate takeoff from a runway closed for construction.

Approximately 3,300 feet down the runway, at around 11:17 PM that night, the Boeing 747 collided with concrete barriers and construction equipment, breaking apart and bursting into flames.

Tragically, 83 out of the 179 people on board lost their lives.

The crew’s loss of awareness was further compounded by the airport’s negligence in terms of maintenance and safety precautions. By failing to place mandatory construction warnings at the entrance of Runway 05 Right, they disregarded the potential risk of aircraft mistakenly attempting to take off from a partially closed runway.

The air traffic controllers also neglected to verify the aircraft’s position before granting takeoff clearances, despite the aircraft having turned onto Runway 05 Right. The airport lacked the necessary Airport Surface Detection Equipment, which could have been crucial in detecting and mitigating risks, especially given the heavy precipitation that could have hampered radar presentation at the time. In their defense, the pilots had assumed that the air traffic controllers could visually observe the aircraft, and the fact that takeoff clearance was issued just as the aircraft turned onto the taxiway gave them the impression that everything was in order.

Anxiety Leads to Attentional Tunneling and Narrowed Field of Focus

The tragedy of Singapore Airlines Flight 6 serves as a poignant case study highlighting the dangers of tunnel vision and its ability to hinder our perspective and decision-making.

Often, seemingly minor errors, when combined with time constraints and cognitive biases, can intertwine and escalate, leading to catastrophic outcomes. Even a highly advanced cockpit within a complex system with numerous safeguards can, through a chain of minor errors, turn into a deadly trap.

The human brain is naturally inclined to seek confirmation and convince itself that it completely understands the situation at hand. When faced with contradictory information, we tend to ignore it and focus solely on our preconceived notions. Anxiety further impairs our ability to perceive the entire situation, leaving us prone to impulsive actions rather than rational responses.

It is vital to be aware of the perils of tunnel vision. It can close our eyes to the broader context and limit our capacity to consider peripheral information. This narrowed perception can have severe consequences, emphasizing the importance of maintaining a broader perspective in decision-making.

Wondering what to read next?

  1. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  2. “Fly the Aircraft First”
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. How Contributing Factors Stack Up and Accidents Unfold: A Case Study of the 2024 Delta A350 & CRJ-900 Collision
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Aviation, Biases, Conflict, Decision-Making, Mindfulness, Problem Solving, Risk, Stress, Worry

Why Incentives Backfire and How to Make Them Work: Summary of Uri Gneezy’s Mixed Signals

June 20, 2023 By Nagesh Belludi


Misguided Motivations: The Folly of Incentives in the Great Hanoi Rat Massacre

At the turn of the 20th century, Governor-General Paul Doumer of French Indochina had a vision to modernize Hanoi. His plan included the introduction of toilets, which unfortunately attracted disease-spreading rats. As time passed, the rat population became a growing concern. In a desperate attempt to control the vermin invasion, the government launched a program that rewarded citizens for every rat tail they brought in, hoping to reduce the rat numbers. However, this seemingly brilliant solution turned into a catastrophe.

Unbeknownst to the government, the citizens of Hanoi discovered a loophole in the system. Instead of exterminating the rats, they simply amputated the tails and released the animals alive, letting the rats go on breeding and guaranteeing a future supply of tails to redeem.

The situation quickly descended into utter madness. Driven by insatiable greed, some individuals established rat-breeding farms to maximize their rewards, while others resorted to importing rat tails from distant regions. The unintended consequence of this perverse incentive scheme was a massive explosion in the rat population, exacerbating the very problem it was meant to solve.

This ill-fated event, known as the “Great Hanoi Rat Massacre,” is a notorious example of the dangers of perverse incentives.

The Unintended Consequences of Incentive-driven Actions

'Mixed Signals' by Uri Gneezy (ISBN 0300255535) In his insightful book Mixed Signals: How Incentives Really Work (2023), Uri Gneezy, a distinguished behavioral economist at the University of California, San Diego, presents compelling examples that highlight the profound disparity between the behaviors incentives are intended to promote and the behaviors they unintentionally trigger. His astute analysis illuminates why these gaps arise, offering invaluable insight into how incentive systems actually work. Another example of this point is the situation of many doctors operating under fee-for-service (FFS) payment models. Under these models, doctors are incentivized to perform additional tests and procedures to increase their own payment, so their focus may shift from promoting overall health to simply recommending more procedures.

To avoid sending confusing messages through incentives, Gneezy emphasizes the importance of carefully thinking through an initiative’s potential outcomes and unintended effects. He strongly advocates prototyping incentive programs, that is, testing them on a small scale before rolling them out.

Consider the case of the Wells Fargo cross-selling scandal, which grew out of aggressive sales practices. To increase the number of accounts held by existing customers, the company motivated bank employees to promote additional services, like credit cards and savings accounts, to customers with checking accounts. However, owing to a lack of proper oversight, employees resorted to fraud, creating over three million unauthorized accounts without customers’ knowledge or consent. These unethical practices saddled customers with unwanted and unnecessary accounts, violated their trust, and exposed them to fees and penalties. To have prevented such a scandal, Wells Fargo could have prototyped the incentive program and established an auditing system to randomly verify the legitimacy of new accounts.

The Irony of Fines as Deterrents in Action

Gneezy brilliantly dissects the flawed notion that imposing fines is a universal remedy. He highlights how fines, often intended as deterrents, can backfire by shifting people’s focus from the behavior itself to merely avoiding punishment. For instance, when drivers are warned about the perils of texting while driving, they may genuinely reflect on the risks involved and the value of their own lives. However, the introduction of a $500 fine shifts their mindset: their attention moves from personal safety to the likelihood of encountering law enforcement. If they perceive a lack of police presence, the thought process becomes, “No police around, no risk of getting caught—time to text!” In this way, fines skew people’s attention from contemplating potential hazards to calculating the probability of facing the consequences.

Recommendation: Fast-read Mixed Signals: How Incentives Really Work (2023). Gneezy’s work serves as a resounding reminder that designing an incentive system that encourages desired behavior while minimizing unintended consequences is no easy feat. His advice to balance multiple metrics, rather than fixate on a single metric at the expense of others, and to regularly review and update the system while keeping a vigilant eye on unintended consequences, is undeniably sound.

Wondering what to read next?

  1. When Work Becomes a Metric, Metrics Risk Becoming the Work: A Case Study of the Stakhanovite Movement
  2. Numbers Games: Summary of The Tyranny of Metrics by Jerry Muller
  3. Ethics Lessons From Akira Kurosawa’s ‘High and Low’
  4. The Barnum Effect and the Appeal of Vagueness
  5. Rewards and Incentives Can Backfire

Filed Under: Leading Teams, Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Discipline, Ethics, Goals, Motivation, Performance Management, Persuasion, Psychology

Availability Heuristic: Our Preference for the Familiar

May 27, 2023 By Nagesh Belludi

The availability heuristic is a cognitive bias that can lead people to rely on readily available information or emotionally charged and inherently interesting examples when making decisions or judgments. Essentially, individuals tend to overestimate the probability of events that are easy to recall or that they’ve personally experienced, while underestimating the likelihood of less memorable or less frequent events.

In other words, the ease of retrieval of a misleading cue may make people rely on evidence not because it is dependable but because it is memorable or striking and thus psychologically available to them. They may do so even if the evidence is not logically acceptable or does not logically support their decision.

Doctors often depend on recalling their past dramatic cases and mistakenly apply them to the current situation. People may overestimate the crime rate in their community based on news coverage, even though crime rates may be relatively low. People may dismiss the reality of climate change if they’ve recently experienced a cold winter or heard of a cold snap in a particular region, even though global warming is a long-term trend. Individuals are more likely to purchase insurance after experiencing a natural disaster than before it occurs. In each of these scenarios, the vivid and emotional evidence feels more persuasive even though it is not the most accurate or reliable information.

The availability heuristic can also shape people’s perceptions of air travel safety and lead them to believe that flying is more dangerous than it really is. Airplane accidents are often sensationalized and highly publicized by the media, making them more memorable and more prominent in people’s minds. This can cause individuals to perceive the risk of flying as much higher than it actually is, leading them to avoid air travel even though it is statistically one of the safest forms of transportation. In reality, many less vivid and less memorable (i.e., psychologically unavailable) things are much more dangerous than air travel, such as falling down stairs, drowning, choking, and accidental poisoning.

Avoid falling prey to the availability heuristic and making serious misjudgments about the risks associated with different situations. Acknowledge that personal experiences and recent events may not accurately reflect the overall reality of the situation.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  4. Be Smart by Not Being Stupid
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Mental Models, Problem Solving, Psychology, Risk, Thinking Tools

The Bikeshedding Fallacy: Why Trivial Matters Eclipse the Important Ones

May 26, 2023 By Nagesh Belludi

Parkinson’s Law of Triviality, also known as the Bikeshedding Effect, is a mental model that underscores the inclination to place undue emphasis on a simple or easily comprehensible matter while ignoring more significant ones.

The term “bikeshedding” derives from an example in a book by C. Northcote Parkinson (who gave us Parkinson’s Law). To illustrate the idea, Parkinson evokes a situation where a cross-disciplinary committee discusses the design of a nuclear power plant. Most of the members have a limited understanding of nuclear reactor design. Consequently, they will likely defer to the experts’ opinions on these critical matters.

However, when the discussion turns to a relatively simple topic like a humble bike storage shed for employees, everyone feels the need to contribute. This is attributable to the people’s desire to be recognized as valuable contributors and showcase their competence by providing their thoughts on something everyone can understand. As a result, the committee spends a disproportionate amount of time deliberating on trivial matters like the shed’s building material or paint color while turning its back on critical issues such as how to foolproof the fuel control system.

In essence, Parkinson’s Law of Triviality highlights the human tendency to focus on easy-to-understand matters, even if they are less important, because individuals feel more confident and productive doing them.

Wondering what to read next?

  1. Zeigarnik Effect: How Incomplete Tasks Trigger Stress
  2. Hofstadter’s Law: Why Everything Takes Longer Than Anticipated
  3. Let Go of Sunk Costs
  4. Warren Buffett’s Advice on How to Focus on Priorities and Subdue Distractions
  5. Why Your Judgment Sucks

Filed Under: Leading Teams, Mental Models, Sharpening Your Skills Tagged With: Biases, Decision-Making, Meetings, Procrastination, Psychology, Teams, Thought Process, Time Management

The Streisand Effect: When Trying to Hide Only Makes it Shine

May 25, 2023 By Nagesh Belludi

In a famous episode of the beloved British sitcom Father Ted, the main character and his fellow priests embark on a protest against the airing of a film titled “The Passion of Saint Tibulus.” The movie portrays a Catholic saint disrespectfully, causing outrage among the Vatican and local bishops. However, despite the priests’ efforts, their parishioners do not heed the boycott. To their dismay, media coverage of the priests’ pickets only amplifies the controversy, inadvertently making the film even more popular.

This comical scenario perfectly exemplifies the Streisand Effect, a phenomenon wherein attempts to suppress something end up drawing more attention to it.

The term “Streisand Effect” originated in 2003 when singer and actress Barbra Streisand sued a photographer for including an aerial photo of her Malibu home in a collection of images documenting coastal erosion. The lawsuit drew significant attention to the photo, which had been downloaded only six times before the legal action. Suddenly, the photo went viral, accumulating millions of views and symbolizing the Streisand Effect.

A more recent example of this phenomenon occurred in 2017 when then-White House press secretary Sean Spicer attempted to quash a story about his meeting with reporters. Spicer had requested that the reporters keep the meeting private, hoping to prevent it from being reported. However, his efforts backfired spectacularly when the journalists went ahead and wrote about the meeting. During a press briefing, Spicer scolded the journalists for disregarding his wishes, inadvertently bringing even more attention to the original story. Had Spicer ignored the reporting, the story might have fizzled out quietly. Instead, it became a viral sensation, sparking numerous memes and jokes.

These examples serve as a powerful reminder to carefully consider the potential consequences before attempting to suppress or control information.

Wondering what to read next?

  1. Ethics Lessons From Akira Kurosawa’s ‘High and Low’
  2. Presenting Facts Can Sometimes Backfire
  3. Fight Ignorance, Not Each Other
  4. The Data Never “Says”
  5. Beyond the Illusion: The Barnum Effect and Personality Tests

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Celebrities, Confidence, Conflict, Conviction, Critical Thinking, Persuasion, Psychology

Mise En Place Your Life: How This Culinary Concept Can Boost Your Productivity

May 24, 2023 By Nagesh Belludi

“Mise en place” may sound like a highfalutin term, but it is a French phrase that means “set in place.” In the culinary world, it refers to the practice of preparing all ingredients and equipment in advance of cooking. This means tasks such as chopping vegetables, measuring ingredients, preheating ovens, and organizing equipment are taken care of before cooking begins. The benefit of this preparation is that cooks can concentrate entirely on cooking during service, free from the need to stop and gather or prepare ingredients. Mise en place is an essential aspect of professional cooking and symbolizes a well-organized and efficient kitchen.

When it comes to exceptional cooking, chefs take their craft seriously. Mise en place isn’t just a time-saving technique; it’s a way of life. Messing with it is like kicking a hornet’s nest, as Anthony Bourdain, the culinary world’s travel documentarian, underscored in his bestselling book, Kitchen Confidential (2000): “Mise en place is the religion of all good line cooks.” Everything from their station to their tools, supplies, and backups should be arranged with military precision, and disturbing this sacred set-up is like throwing the universe off balance. Things can quickly spiral out of control, and anyone in the restaurant is advised not to mess with a line cook’s “meez” unless they want to face their wrath!

The same concept can be applied to any project or task. Pre-planning and careful preparation reduce the risk of interruptions and distractions. Take time to plan ahead, gather the necessary resources, and know your goal before starting. Don’t let mundane concerns keep you from focusing on the job you’re there to do.

Think of it as a personal mise en place. Sit down and plan out what you need to succeed, including the necessary skills, resources, and people. Doing so allows you to channel your full attention to the task at hand, avoiding distractions and increasing your overall effectiveness.

Wondering what to read next?

  1. The 5 Habits of Highly Organized People
  2. In Imperfection, the True Magic of the Holidays Shines
  3. Everything in Life Has an Opportunity Cost
  4. Decisions, Decisions: Are You a Maximizing Maniac or a Satisficing Superstar?
  5. Thinking Straight in the Age of Overload // Book Summary of Daniel Levitin’s ‘The Organized Mind’

Filed Under: Business Stories, Managing People, Mental Models, Sharpening Your Skills Tagged With: Assertiveness, Biases, Clutter, Discipline, Mindfulness, Perfectionism, Procrastination, Psychology, Tardiness

Decoy Effect: The Sneaky Sales Trick That Turns Shoppers into Spenders

May 23, 2023 By Nagesh Belludi

Imagine yourself at the movie theater, deciding whether to buy a small popcorn for $5 or a large popcorn for $8. You’re wondering if the extra popcorn is worth the extra money, so you lean toward the small size. Then the cashier mentions a medium popcorn for $7.50, and suddenly the large looks like a bargain, so you spring for it instead.

However, the medium popcorn is a lure—a true distraction. By introducing it, the theater has made the large popcorn seem like a better value and the small popcorn seem less attractive. This is a classic marketing strategy known as the Decoy Effect, which aims to influence your decision-making.

In essence, the Decoy Effect presents you with two options and then adds a third option designed to make one of the original options more appealing. This can sway your decision-making and lead you to choose the more expensive option.

Studies suggest that framing can influence our decisions; a well-designed decoy can shift choices by as much as 40 percent. One well-known example of the decoy effect in action comes from The Economist, the influential weekly international news and business publication. Behavioral economist Dan Ariely’s book Predictably Irrational (2008) describes how the magazine offered a digital subscription for $59, a print subscription for $125, and a combined print-and-digital subscription for the same $125. The print-only subscription was clearly a decoy, designed to make the combined subscription seem like a better value, and it worked: the presence of the decoy significantly increased the uptake of the combined subscription.

While psychologists are still debating the exact reasons for this cognitive bias, one theory suggests that the decoy provides a straightforward justification for a decision that might otherwise seem arbitrary.

Idea for Impact: If you run a business, you too can use the decoy effect to steer consumers towards certain purchasing decisions that benefit your bottom line. By strategically adding a decoy product to your offerings, you can provide perceived value for your customers while boosting your profits.

Wondering what to read next?

  1. Clever Marketing Exploits the Anchoring Bias
  2. Your Product May Be Excellent, But Is There A Market For It?
  3. Airline Safety Videos: From Dull Briefings to Dynamic Ad Platforms
  4. The Longest Holdout: The Shoichi Yokoi Fallacy
  5. The Loss Aversion Mental Model: A Case Study on Why People Think Spirit is a Horrible Airline

Filed Under: Business Stories, MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Biases, Creativity, Marketing, Persuasion, Psychology, Thought Process

The Longest Holdout: The Shoichi Yokoi Fallacy

May 22, 2023 By Nagesh Belludi

In 1972, two cousins from the village of Talofofo, hunting near the Talofofo River in Guam, were startled by rustling sounds emanating from the tall reeds. Initially, they assumed it was an animal or a hidden child, but to their surprise, they came face to face with an elderly, disheveled man clutching a shrimp trap. The unexpected encounter took the hunters aback, and after some initial confusion, they captured the man and escorted him back to their makeshift jungle home, about an hour’s walk away. The old man pleaded with the cousins to end his life.

That fugitive turned out to be Shoichi Yokoi, a Japanese soldier. During the latter stages of World War II, Yokoi served in the supply corps of the Japanese army stationed on the island of Guam. In 1944, when American forces invaded and reclaimed control of the island, Yokoi retreated into the dense jungle. There, he sought refuge in an underground cave and remained hidden for 28 years, living as a determined survivor under harsh conditions.

Yokoi sustained himself by inhabiting a tunnel-like cave he had carved amidst the thick foliage, relying on a diet of nuts, fruits, shrimp, frogs, and rats. He fashioned his clothing by skillfully weaving tree bark strips and using the moon’s phases to track time. In 1952, he chanced upon a leaflet announcing the war’s end, but he and his fellow soldiers dismissed it as enemy propaganda, choosing not to surrender. Over time, all of Yokoi’s comrades perished due to starvation or illness, or were captured.

Loyalty Without a Glance Can Shroud the Mind in Ignorance

Yokoi remained firmly convinced that his fellow soldiers would eventually come to rescue him, and he clung tenaciously to this belief. Surrender was out of the question, as he later explained, “We Japanese soldiers were taught to choose death over the shame of being taken alive.” (Additionally, stragglers like him believed that returning to Japan was impossible, fearing they would be branded as deserters and face the death penalty.)

In 1972, Yokoi finally returned to Japan, where he was hailed as a national hero. Upon his arrival in Tokyo, he famously declared, “It is with much embarrassment that I have returned alive,” echoing the indoctrination he had received before the war. For the older generation, he symbolized greatness, embodying the prewar values of diligence. However, for the younger generation, he represented an awkward reminder of outdated ideals. Being captured and surviving was deemed cowardly, as the ideal soldier made the ultimate sacrifice for the divine emperor, even at the cost of his own life.

Yokoi’s remarkable story of surviving in the jungle captured the imagination of the Japanese people. The country was undergoing an industrial boom, and many were fascinated by his ability to endure on a meager diet and his resourcefulness in creating clothing from tree bark. Yokoi even returned his army-issued rifle to “the honorable emperor,” expressing his embarrassment at having returned alive rather than dying in service to the emperor. He regretted not having served his majesty to the fullest.

However, Yokoi never quite felt at home in modern society. Before his conscription in 1941, he had been an apprentice tailor, and now he found himself overwhelmed by the changes that had occurred during his absence. He subsequently led a quiet, frugal life, though he became a popular television personality and an advocate for a simple way of life. He traveled across the country, delivering public lectures criticizing Japan’s “wasteful modern lifestyle” and championing values of thrift and self-reliance. He was deeply admired for his determination, his spirit of ganbaru (“enduring adversity without giving in”), and his unwavering commitment to traditional values.

Embrace the Gifts That Doubt Can Bring. Let Enlightenment Take Flight.

Overall, Yokoi spent 28 years in isolation in the jungles of Guam, stubbornly holding onto his identity as a Japanese soldier long after the war had ended. In doing so, he squandered his life by adhering to ideals that held no significance for anyone else, sacrificing his relationships, career, and personal happiness in pursuit of the Japanese principle of ganbaru, or unwavering perseverance.

There comes a point where virtue, taken to the extreme, transforms into a vice. Shoichi Yokoi personified this fallacy. We often admire unwavering commitment, but the blinding effects of rigid adherence make us lose sight of the reasons behind it.

Beware of blind devotion to any ideology that promotes rigid and restrictive beliefs. Do not overestimate the value of your morals beyond their practical utility, and be receptive to changing your perspective when circumstances demand it. This requires reevaluating your priorities and recognizing that what you once cherished may no longer align with your desires or aspirations. When faced with new information or situations, consider the possibility of altering your stance. There is a difference between sticking to your principles and being imprudent.

Wondering what to read next?

  1. The Problem with People Who Don’t Think They Can Change
  2. Decoy Effect: The Sneaky Sales Trick That Turns Shoppers into Spenders
  3. Ethics Lessons From Akira Kurosawa’s ‘High and Low’
  4. The Power of Counterintuitive Thinking
  5. Charlie Munger’s Iron Prescription

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Biases, Mental Models, Persistence, Persuasion, Philosophy, Psychology, Thought Process, Wisdom

Maximize Your Chance Possibilities & Get Lucky

April 27, 2023 By Nagesh Belludi

'The Luck Factor' by Richard Wiseman (ISBN 0786869143) British psychologist Richard Wiseman’s The Luck Factor (2003) explores what makes some people lucky and others unlucky.

Being lucky is largely a mindset. Lucky people maximize their chances of creating and noticing lucky opportunities, and they listen to their intuition when they get an opportunistic hunch.

The book’s core premise is that whether you’re generally lucky or unlucky depends on your attitude—an optimistic mindset is a self-fulfilling prophecy, indeed. Lucky people expect good fortune; they expect good things to happen in their lives. When they do have a run of bad luck, they adopt a resilient attitude and somehow turn it into good luck.

Idea for Impact: Lucky people aren’t lucky by sheer accident. To maximize your chances of getting lucky, expose yourself to more opportunities. Get out there more often, produce more work, and talk to more people. Be open to the world and ready for new opportunities.

Wondering what to read next?

  1. Luck Doesn’t Just Happen
  2. How to Guard Against Anything You May Inadvertently Overlook
  3. Question Success More Than Failure
  4. Gambler’s Fallacy is the Failure to Realize How Randomness Rules Our World
  5. Overcoming Personal Constraints is a Key to Success

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Biases, Books for Impact, Creativity, Luck, Risk, Thinking Tools

It Takes Luck as Much as Talent

April 24, 2023 By Nagesh Belludi

In The Frontiers of Management (1986), Peter Drucker writes about how Thomas J. Watson, Sr. emerged as a pioneer in the development of accounting and computing equipment:

Twice in the 1930s [Thomas J. Watson, Sr.] personally was on the verge of bankruptcy. What saved him and spurred IBM sales during the Depression were two New Deal laws: the Social Security Act in 1935 and the Wage-Hours Act of 1937–38. They mandated records of wages paid, hours worked, and overtime earned by employees, in a form in which the employer could not tamper with the records. Overnight they created markets for the tabulating machines and time clocks that Thomas Watson, Sr., had been trying for long years to sell with only moderate success.

Idea for Impact: It’s hard for people who pride themselves on their extraordinary skills to accept that they’re just as lucky as they’re smart.

Luck is primarily the result of identifying opportunities and taking appropriate action. Watson could capitalize on the newly created need for business machines because he had worked in the field for decades. And he gave this kind of luck much credit without feeling that doing so devalued his talent and hard work.

Wondering what to read next?

  1. Question Success More Than Failure
  2. If You’re Looking for Bad Luck, You’ll Soon Find It
  3. The Business of Popular Causes
  4. Choose Your Role Models Carefully
  5. Beware of Key-Person Dependency Risk

Filed Under: Business Stories, Mental Models Tagged With: Biases, Entrepreneurs, Humility, Luck, Wisdom

