
Right Attitudes

Ideas for Impact

Risk

Steering the Course: Leadership’s Flight with the Instrument Scan Mental Model

November 6, 2023 By Nagesh Belludi

Embarking on flight training comes with a nifty habit that instructors eagerly instill from the get-go: the art of instrument scanning.

Whispers from your instructor echo in your mind, urging you with the mantra, “Scan, scan, scan!”

Keep a Good Scan of Your Instruments, Never Be Stumped

A vital cautionary command follows closely, “Don’t stare!” You learn to effortlessly let your gaze flit from one instrument to another. The attitude indicator, heading indicator, airspeed indicator, and vertical speed indicator each hold a crucial piece of the intricate airborne puzzle.

There’s a natural instinct to fixate on a single instrument, yet doing so can lead pilots astray. Gazing at the altimeter may cause heading drift, while focusing solely on heading may compromise airspeed control.

Pilots are trained to maintain a cohesive scan of all instruments, constantly cross-checking the streams of data. By doing so, they can swiftly identify any inconsistencies, such as an attitude indicator showing a descent while the altimeter reads level flight.

With instrument scanning, pilots can promptly isolate the problematic instrument or data stream and, if necessary, devise alternative plans to obtain the necessary information and ensure the aircraft’s safe and steady flight.
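
If it helps to see the cross-check as a procedure, here is a minimal, hypothetical sketch in Python (the sensor names, values, and tolerance are invented for illustration, not drawn from any flight manual): readings that should agree are compared against their consensus, and any source that strays too far is flagged for investigation rather than blindly trusted.

    from statistics import median

    def cross_check(readings, tolerance):
        """Flag any reading that disagrees with the consensus of the others.

        readings:  dict mapping a source name to its value, where every
                   source is expected to report the same quantity
                   (e.g., altitude from three independent sensors).
        tolerance: how far a reading may stray from the median before it
                   is treated as suspect.
        """
        consensus = median(readings.values())
        suspects = {name: value for name, value in readings.items()
                    if abs(value - consensus) > tolerance}
        return consensus, suspects

    # Hypothetical example: three altitude sources, one drifting.
    altitude_feet = {"altimeter": 5010, "gps": 4995, "backup": 6200}
    consensus, suspects = cross_check(altitude_feet, tolerance=200)
    print(f"Consensus: {consensus} ft; investigate: {sorted(suspects)}")

The same habit, applied to a dashboard of business metrics, catches a misleading report before anyone acts on it.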

Just as Pilots Use Instruments in the Air, Leaders Scan Their Realm

The concept of an instrument scan mindset serves as a potent analogy for effectively managing critical information within the realm of business. Much like pilots, leaders must engage in ongoing monitoring, analysis, and cross-referencing of pertinent data. To achieve success, it’s imperative to proactively pay attention to emerging trends, maintain a steadfast focus on the larger picture, and cultivate a curious mindset.

It is of utmost importance to avoid fixating on a single metric to the detriment of other vital factors that could impact the business. Leaders should routinely revisit their goals, objectives, and key performance indicators (KPIs) and conduct a thorough analysis of data to discern trends, patterns, and areas of concern, all while embracing a proactive and inquisitive approach. They should be unafraid to ask tough questions, challenge assumptions, and maintain comprehensive situational awareness.

Sadly, in the world of business, this mindset is frequently overlooked. Reports are generated and actions taken without rigorous cross-checking or sense-checking of the underlying data. Stakeholders become fixated on a single “instrument” and, in doing so, fail to maintain a broader scan of the business landscape.

Don’t accept data at face value; a vigilant scan and broader situational awareness are paramount. Embrace the wisdom of instrument scanning to chart a course toward success, steering clear of perilous assumptions and gaining a comprehensive understanding of your business’s performance.

Leaders Must Employ Their ‘Instruments’ for Guided Insight

Within the symbolic framework of leadership, as in flying an aircraft, the concept of instrument scanning encompasses the continual practice of gathering and interpreting information. This process is vital for making well-informed decisions, safeguarding the welfare of the organization or team, and steering a precise path toward the envisioned goals.

Just as pilots depend on their instruments to keep their flights safe and on course, leaders depend on a disciplined instrument scan. It serves as the linchpin for steering an organization or team toward success and preserving its vitality and stability.

Wondering what to read next?

  1. Five Where Only One is Needed: How Airbus Avoids Single Points of Failure
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. The Waterline Principle: How Much Risk Can You Tolerate?
  4. This Hack Will Help You Think Opportunity Costs
  5. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress

Filed Under: Leading Teams, Mental Models, Project Management Tagged With: Aviation, Critical Thinking, Decision-Making, Discipline, Mental Models, Mindfulness, Performance Management, Problem Solving, Risk, Targets

Protect the Downside with Pre-mortems

November 2, 2023 By Nagesh Belludi

American self-help author Ryan Holiday’s The Obstacle Is the Way (2014) draws inspiration from Stoic philosophy to demonstrate how obstacles and challenges can be transformed into opportunities for personal growth and success. One recommended mindset is the pre-mortem: envisioning potential difficulties aligns with the Stoic principles of accepting what one cannot control and focusing on one’s responses to external events:

In a postmortem, doctors convene to examine the causes of a patient’s unexpected death so they can learn and improve for the next time a similar circumstance arises. Outside of the medical world, we call this a number of things—a debriefing, an exit interview, a wrap-up meeting, a review—but whatever it’s called, the idea is the same: We’re examining the project in hindsight, after it happened.

A pre-mortem is different. In it, we look to envision what could go wrong, what will go wrong, in advance, before we start. Far too many ambitious undertakings fail for preventable reasons. Far too many people don’t have a backup plan because they refuse to consider that something might not go exactly as they wish. Your plan and the way things turn out rarely resemble each other. What you think you deserve is also rarely what you’ll get. Yet we constantly deny this fact and are repeatedly shocked by the events of the world as they unfold.

Idea for Impact: By embracing anticipation, you equip yourself with the tools to fortify your defenses and, in some cases, sidestep challenges altogether. You stand ready with a safety net to catch you if you stumble. With anticipation, you can endure.

P.S. Many industries (engineering, manufacturing, and healthcare, to name a few) use a formal, structured, systematic approach called Failure Mode and Effects Analysis (FMEA) to identify and prioritize potential failures, their causes, and their consequences. As with a pre-mortem, the primary purpose of FMEA is to proactively assess and mitigate risks by understanding how a process or system might fail and the impact of those failures.
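
To make the FMEA idea concrete, here is a minimal sketch in Python; the failure modes and the 1-to-10 ratings are invented for illustration. Each failure mode is rated for severity, likelihood of occurrence, and difficulty of detection, and the product of the three ratings, the risk priority number (RPN), determines which risks to address first.

    # Hypothetical FMEA-style worksheet: each failure mode is rated 1-10 for
    # severity, occurrence, and detection (higher = worse / harder to detect).
    failure_modes = [
        {"mode": "Key supplier misses delivery", "severity": 8, "occurrence": 4, "detection": 3},
        {"mode": "Data pipeline drops records",  "severity": 6, "occurrence": 5, "detection": 7},
        {"mode": "Launch date slips a week",     "severity": 3, "occurrence": 6, "detection": 2},
    ]

    for fm in failure_modes:
        # Risk Priority Number: the standard FMEA product of the three ratings.
        fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

    # Mitigate the highest-RPN failure modes first.
    for fm in sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True):
        print(f'{fm["rpn"]:>4}  {fm["mode"]}')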

Wondering what to read next?

  1. More Data Isn’t Always Better
  2. Be Smart by Not Being Stupid
  3. How to Solve a Problem By Standing It on Its Head
  4. Smart Folks are Most Susceptible to Overanalyzing and Overthinking
  5. The Waterline Principle: How Much Risk Can You Tolerate?

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Critical Thinking, Decision-Making, Mental Models, Problem Solving, Risk, Thinking Tools, Wisdom

The Enron Scandal: A Lesson on Motivated Blindness

July 19, 2023 By Nagesh Belludi

The fallout from the Enron fiasco had far-reaching effects on the economy and the public’s trust in corporations. It serves as a powerful lesson in the dangers of motivated blindness—when individuals have a personal stake in unethical actions, they often look the other way or find ways to rationalize their behavior.

The folks at Arthur Andersen, serving as Enron’s external auditor, found themselves in a precarious situation. On the one hand, they were supposed to ensure financial integrity, but on the other hand, they acted as consultants, helping Enron structure financial transactions to deceive investors and manipulate earnings. Enron generously poured hefty fees their way, with auditing fees exceeding $25 million and consulting fees reaching $27 million in 2001. So, why would they want to put an end to this lucrative gravy train? To complicate matters further, many Andersen auditors were eagerly vying for coveted positions at Enron, just as their fortunate colleagues had done before them.

To combat motivated blindness, it’s crucial to reflect on our biases, hold ourselves accountable, and actively seek out diverse perspectives to gain a broader understanding of any given issue. Max Bazerman, a professor at Harvard Business School and author of The Power of Noticing: What the Best Leaders See (2014), asserts that individuals can overcome their inclination to overlook vital clues by fostering a “noticing mindset.” This involves consistently asking oneself and others, both within and outside the organization, the question: “Which critical threats and challenges might we be neglecting?”

Wondering what to read next?

  1. Power Inspires Hypocrisy
  2. Why Groups Cheat: Complicity and Collusion
  3. The Poolguard Phenomenon
  4. Power Corrupts, and Power Attracts the Corruptible
  5. The Ethics Test

Filed Under: Business Stories, Leadership, Mental Models, Sharpening Your Skills Tagged With: Discipline, Ethics, Getting Along, Integrity, Leadership, Motivation, Psychology, Risk

Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6

July 10, 2023 By Nagesh Belludi

Picture this: You’re parking your car when, suddenly, you catch sight of the bus you desperately need to catch pulling into the station. Acting on instinct, you swiftly navigate your car into a vacant spot, gather your bags, and launch yourself toward the bus stop, determined to avoid a tedious fifteen-minute wait for the next one. In the whirlwind of your frantic sprint, you hastily tuck your cherished cell phone into your back pocket, not noticing that it slips out during the adrenaline-fueled pursuit of catching the bus. Only five minutes later do you become aware of your phone’s absence, and the weight of its loss gradually descends upon you.

Isn’t it fascinating how our minds tend to close off under time pressure? This cognitive phenomenon is known as the “narrowing of the cognitive map.” It’s as if our attention becomes laser-focused, but that very focus can lead us into errors of judgment.

When we find ourselves in the clutches of tunnel vision, our thinking becomes constrained, and we unknowingly fall into the trap of limited perspective. Not only do we become so fixated on a specific course of action that we overlook crucial details in our environment, but we also become oblivious to the subtle signals whispering, “Something’s amiss.”

Inattentional blindness, indeed. It’s a common problem in high-stress situations, and it can have serious consequences, as in the following case study of the Singapore Airlines Flight 6 crash.

Speed Stress Causes Serious Breakdowns in the Reliability of Judgment

Flight 6’s tragic accident occurred on October 31, 2000, at Taipei’s Chiang Kai-shek International Airport. Various factors contributed to the crash, including severe weather conditions, limited visibility, inadequate airport markings, and insufficient actions by both the pilots and air traffic controllers.

During a scheduled stop in Taipei on its journey from Singapore to Los Angeles, Flight 6’s flight crew became aware of an approaching storm. They realized that if they delayed the takeoff, they would have to wait for the storm to pass, resulting in a lengthy 12-hour delay. This interruption would have entailed making overnight arrangements for the passengers, disrupting the crew’s schedule, and potentially impacting future flight schedules involving the aircraft and company personnel. Consequently, the crew made the decision to expedite the departure and take off before the typhoon made landfall on the island.

The Rushed Pilots Missed Clues That They Were Taking Off on a Closed Runway

Under immense time pressure, the flight crew became singularly focused on expediting their takeoff in rainy and windy conditions before the weather conditions deteriorated further. Despite being instructed to taxi to Runway 05 Left, they deviated from the assigned route and instead positioned themselves on Runway 05 Right, which was closed for takeoff due to ongoing pavement repairs.

Complicating matters, a section of Runway 05 Right was still being used as a taxiway during the construction period. The signage at the entrance of the runway did not adequately indicate the closure or the construction equipment positioned along the converted taxiway.

Moreover, the local air traffic controller failed to provide progressive taxi or ground movement instructions, which would have been appropriate considering the low visibility during the taxi. However, due to the crew’s heightened sense of urgency, they neglected to request step-by-step instructions for their taxi route.

Misleading Airport Markings Contributed to Pilots’ Mistaken Belief of Correct Runway Selection

In the midst of low visibility and feeling rushed, the pilots neglected crucial resources that could have guided them to the correct runway, such as runway and taxiway charts, signage, markings, and cockpit instruments. This lapse in judgment resulted in a loss of situational awareness, leading them to initiate takeoff from a runway closed for construction.

Approximately 3,300 feet down the runway, at around 11:17 PM that night, the Boeing 747 collided with concrete barriers and construction equipment, breaking apart and bursting into flames.

Tragically, 83 out of the 179 people on board lost their lives.

The crew’s loss of awareness was further compounded by the airport’s negligence in terms of maintenance and safety precautions. By failing to place mandatory construction warnings at the entrance of Runway 05 Right, they disregarded the potential risk of aircraft mistakenly attempting to take off from a partially closed runway.

The air traffic controllers also neglected to verify the aircraft’s position before granting takeoff clearances, despite the aircraft having turned onto Runway 05 Right. The airport lacked the necessary Airport Surface Detection Equipment, which could have been crucial in detecting and mitigating risks, especially given the heavy precipitation that could have hampered radar presentation at the time. In their defense, the pilots had assumed that the air traffic controllers could visually observe the aircraft, and the fact that takeoff clearance was issued just as the aircraft turned onto the taxiway gave them the impression that everything was in order.

Anxiety Leads to Attentional Tunneling and Narrowed Field of Focus

The tragedy of Singapore Airlines Flight 6 serves as a poignant case study highlighting the dangers of tunnel vision and its ability to hinder our perspective and decision-making.

Often, seemingly minor errors, when combined with time constraints and cognitive biases, can intertwine and escalate, leading to catastrophic outcomes. Even a highly advanced cockpit, a complex system with numerous safeguards, can be turned into a deadly trap by a chain of minor errors.

The human brain is naturally inclined to seek confirmation and convince itself that it completely understands the situation at hand. When faced with contradictory information, we tend to ignore it and focus solely on our preconceived notions. Anxiety further impairs our ability to perceive the entire situation, leaving us prone to impulsive actions rather than rational responses.

It is vital to be aware of the perils of tunnel vision. It can close our eyes to the broader context and limit our capacity to consider peripheral information. This narrowed perception can have severe consequences, emphasizing the importance of maintaining a broader perspective in decision-making.

Wondering what to read next?

  1. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’
  2. “Fly the Aircraft First”
  3. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  4. What Airline Disasters Teach About Cognitive Impairment and Decision-Making Under Stress
  5. The Boeing 737 MAX’s Achilles Heel

Filed Under: Business Stories, Effective Communication, Sharpening Your Skills Tagged With: Aviation, Biases, Conflict, Decision-Making, Mindfulness, Problem Solving, Risk, Stress, Worry

After Action Reviews: The Heartbeat of Every Learning Organization

June 15, 2023 By Nagesh Belludi

The After Action Review (AAR) is a formal group reflection process used by the military and other organizations to analyze critical decisions or moves.

At its core, the AAR seeks to answer four questions: What was planned, what actually happened, why did it happen, and how can we do better next time?

The focus isn’t on grading success or failure but on identifying weaknesses that need improvement and strengths that should be sustained. The knowledge gained from the AAR can then be shared with others who are planning, developing, implementing, and evaluating similar efforts.

Conducted in an open and honest climate, the AAR involves candid discussions of actual performance results compared to objectives. It requires input and perspectives from all stakeholders involved in the project or activity. The goal is to ensure everybody feels part of the solution, not the problem.

AARs are a powerful tool for continuous improvement, enabling organizational learning by reinforcing personal and organizational accountability and by continuously assessing performance successes and failures. They’re an excellent way to identify best practices (what should be spread) and errors (what shouldn’t be repeated).
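
For teams that want to capture AARs in a consistent, shareable form, here is a minimal sketch in Python; the field names and the sample entry are illustrative assumptions, not a prescribed template. It simply records the four core questions in a structure that can be filed, searched, and shared with other teams.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AfterActionReview:
        """One AAR record, organized around the four core questions."""
        activity: str
        what_was_planned: str
        what_actually_happened: str
        why_it_happened: List[str] = field(default_factory=list)   # root causes
        improvements: List[str] = field(default_factory=list)      # do better next time
        sustain: List[str] = field(default_factory=list)           # strengths to keep

    # Hypothetical example entry.
    aar = AfterActionReview(
        activity="Q3 product launch",
        what_was_planned="Ship to all regions on June 1",
        what_actually_happened="The EU launch slipped by two weeks",
        why_it_happened=["Localization review started late"],
        improvements=["Start localization in parallel with beta testing"],
        sustain=["Daily cross-team stand-up during launch week"],
    )
    print(f"{aar.activity}: {len(aar.improvements)} improvement(s) identified")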

The wisest people and businesses reflect ex post facto. As the saying goes, “He that will not reflect is a ruined man.”

Wondering what to read next?

  1. Making Tough Decisions with Scant Data
  2. Be Smart by Not Being Stupid
  3. The Solution to a Problem Often Depends on How You State It
  4. How to Solve a Problem By Standing It on Its Head
  5. Many Creative People Think They Can Invent Best Working Solo

Filed Under: MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Creativity, Critical Thinking, Decision-Making, Meetings, Problem Solving, Risk, Teams, Thought Process

Availability Heuristic: Our Preference for the Familiar

May 27, 2023 By Nagesh Belludi

The availability heuristic is a cognitive bias that can lead people to rely on readily available information or emotionally charged and inherently interesting examples when making decisions or judgments. Essentially, individuals tend to overestimate the probability of events that are easy to recall or that they’ve personally experienced, while underestimating the likelihood of less memorable or less frequent events.

In other words, the ease of retrieval of a misleading cue may make people rely on evidence not because it is dependable but because it is memorable or striking and thus psychologically available to them. They may do so even if the evidence is not logically sound or does not actually support their decision.

Doctors often depend on recalling their past dramatic cases and mistakenly apply them to the current situation. People may overestimate the crime rate in their community based on news coverage, even though crime rates may be relatively low. People may dismiss the reality of climate change if they’ve recently experienced a cold winter or heard of a cold snap in a particular region, even though global warming is a long-term trend. Individuals are more likely to purchase insurance after experiencing a natural disaster than before it occurs. In each of these scenarios, the vivid and emotional evidence feels more persuasive even though it is not necessarily the most accurate or reliable information.

The availability heuristic can also shape people’s perceptions of air travel safety and lead them to believe that flying is more dangerous than it really is. Airplane accidents are often sensationalized and highly publicized by the media, making them more memorable and more prominent in people’s minds. This can cause individuals to perceive the risk of flying as much higher than it actually is, leading them to avoid air travel even though it is statistically one of the safest forms of transportation. In reality, many less vivid and less memorable (i.e., psychologically unavailable) things are much more dangerous than air travel, such as falling down stairs, drowning, choking, and accidental poisoning.

Avoid falling prey to the availability heuristic and making serious misjudgments about the risks associated with different situations. Acknowledge that personal experiences and recent events may not accurately reflect the overall reality of the situation.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  3. Many Hard Leadership Lessons in the Boeing 737 MAX Debacle
  4. Be Smart by Not Being Stupid
  5. How to Guard Against Anything You May Inadvertently Overlook

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Decision-Making, Mental Models, Problem Solving, Psychology, Risk, Thinking Tools

Maximize Your Chance Possibilities & Get Lucky

April 27, 2023 By Nagesh Belludi

British psychologist Richard Wiseman’s The Luck Factor (2003) explores what makes some people lucky and others unlucky.

Being lucky is largely a mindset. Lucky people maximize their chances of creating and noticing lucky opportunities. They listen to their intuition when they get an opportunistic hunch.

The book’s core premise is that whether you’re generally lucky or unlucky depends on your attitude; an optimistic mindset is, indeed, a self-fulfilling prophecy. Lucky people expect good fortune; they expect good things to happen in their life. When they do have a run of bad luck, they adopt a resilient attitude and somehow turn it into good luck.

Idea for Impact: Lucky people aren’t lucky by sheer accident. To maximize your chances of getting lucky, create more opportunities and expect good fortune. Get out there more often, produce more work, and talk to more people. Be open to the world and ready for new opportunities.

Wondering what to read next?

  1. How to Guard Against Anything You May Inadvertently Overlook
  2. Question Success More Than Failure
  3. Gambler’s Fallacy is the Failure to Realize How Randomness Rules Our World
  4. The Boeing 737 MAX’s Achilles Heel
  5. Be Smart by Not Being Stupid

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Attitudes, Biases, Books for Impact, Creativity, Luck, Risk, Thinking Tools

How to … Stop That Inner Worrywart

February 22, 2023 By Nagesh Belludi

I’m one of those incessant worrywarts. Risk mitigation is a significant facet of my work. Thus, I worry about the prospect of non-optimal results, I worry about the unintended side effects of my decisions, and I worry about what people aren’t telling me. I even worry that I worry too much (now, that worry is entirely unfounded).

If, like many people, you’d like to worry less, you may find the following approaches helpful. Most of my over-worrying comes from thinking ahead, but after a reasonable effort to understand the risks and make plans to adapt flexibly to developing situations, I’ll just let up. I’ll self-talk as though I’m addressing a team: “Not everything is within our control. We’ll cross that bridge when we come to it. Let’s deal with it as it appears and course-correct.” Beyond that, I’ll get busy with something else that keeps me too occupied to fret about whatever worried me in the first place.

Wondering what to read next?

  1. Everything in Life Has an Opportunity Cost
  2. Dear Hoarder, Learn to Let Go
  3. Thinking Straight in the Age of Overload // Book Summary of Daniel Levitin’s ‘The Organized Mind’
  4. This Hack Will Help You Think Opportunity Costs
  5. Making Tough Decisions with Scant Data

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Balance, Clutter, Decision-Making, Perfectionism, Procrastination, Risk

How to … Plan in a Time of Uncertainty

January 25, 2023 By Nagesh Belludi

In periods of uncertainty and ambiguity, move away from annual plans and focus on the next three months. Reflect on the unpredictability of the future and stay on your toes by forging plans for unexpected scenarios so you won’t be caught flat-footed when that time comes.

Establish “trigger points” and “accelerate, maintain, or terminate” criteria in advance, and keep an eye on key indicators so you’ll know whether to wait and see or stay the course, and how to respond should one of your planned-for scenarios materialize.

Idea for Impact: When the horizon is much shorter, operate with agility and allocate your resources in real time.

Wondering what to read next?

  1. Making Tough Decisions with Scant Data
  2. Under Pressure, The Narrowing Cognitive Map: Lessons from the Tragedy of Singapore Airlines Flight 6
  3. This Hack Will Help You Think Opportunity Costs
  4. After Action Reviews: The Heartbeat of Every Learning Organization
  5. Be Smart by Not Being Stupid

Filed Under: Leadership, MBA in a Nutshell, Mental Models Tagged With: Adversity, Conflict, Decision-Making, Persuasion, Problem Solving, Risk

Be Smart by Not Being Stupid

December 12, 2022 By Nagesh Belludi

No superhuman ability is required to dodge the many foolish choices to which we’re prone. A few basic rules are all that’s needed to shield you, if not from all errors, at least from the silly ones.

Charlie Munger often emphasizes that minimizing mistakes may be one of the least appreciated tricks in successful investing. He has repeatedly credited much of Berkshire Hathaway’s success to consistently avoiding stupidity. “It is remarkable how much long-term advantage we have gotten by trying to be consistently not stupid instead of trying to be very intelligent.” And, “I think part of the popularity of Berkshire Hathaway is that we look like people who have found a trick. It’s not brilliance. It’s just avoiding stupidity.” They’ve avoided investing in situations they don’t understand or can’t bring relevant experience to bear on.

As a policy, avoiding stupidity in investing shouldn’t mean avoiding risk wholly; instead, it’s taking on risk only when there’s a fair chance that you’ll be adequately rewarded for assuming that risk.

Idea for Impact: Tune out stupidity. Becoming successful in life isn’t always about what you do but what you don’t do. In other words, improving decision quality is often more about decreasing your chances of failure than increasing your chances of success.

Wondering what to read next?

  1. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  2. Making Tough Decisions with Scant Data
  3. More Data Isn’t Always Better
  4. Protect the Downside with Pre-mortems
  5. How to Solve a Problem By Standing It on Its Head

Filed Under: MBA in a Nutshell, Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Problem Solving, Risk, Thinking Tools, Thought Process, Wisdom
