
Right Attitudes

Ideas for Impact

Biases

Let’s Hope She Gets Thrown in the Pokey

November 16, 2021 By Nagesh Belludi 1 Comment

The Elizabeth Holmes-Theranos criminal trial hasn’t been without its share of theatrics.

Yes, Holmes’s massive fraud is obvious. She entranced journalists, investors, politicians, and business partners into believing her fantasy science (read WSJ reporter John Carreyrou’s excellent chronicle, Bad Blood (2018; my summary)). She may even be responsible for negligent homicide if people died because of her company’s fake test results.

Then again, these sorts of cases generally hang on subtle distinctions between hyperbole and outright dishonesty and whether such deceit was deliberate.

Holmes’s lawyers will argue that she was merely an ambitious entrepreneur who failed to realize her vision, not a fraudster. They will make the case that she isn’t to blame if people took her puffery and exaggeration as factually accurate. At what point do wishfulness and enthusiasm cross from optimism into intentional fraud? That’ll be the critical question.

At any rate, the Theranos verdict is unlikely to deter others from the swagger, self-assurance, hustle, and the “fake it till you make it” ethos that is so endemic to start-up culture. Investors will never cease betting on people and ideas rather than the viability of their work.

Idea for Impact: Don’t be swayed by storytelling; it has a way of making people less objective observers. Assemble the facts, and ask yourself what truth they bear out. Never let yourself be sidetracked by what you wish to believe.

Wondering what to read next?

  1. A Real Lesson from the Downfall of Theranos: Silo Mentality
  2. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’
  3. When Work Becomes a Metric, Metrics Risk Becoming the Work: A Case Study of the Stakhanovite Movement
  4. The Wisdom of the Well-Timed Imperfection: The ‘Pratfall Effect’ and Authenticity
  5. Virtue Deferred: Marcial Maciel, The Catholic Church, and How Institutions Learn to Look Away

Filed Under: Business Stories, Mental Models Tagged With: Biases, Critical Thinking, Entrepreneurs, Ethics, Likeability, Psychology, Questioning, Risk

You Can’t Believe Those Scientific Studies You Read About in the Papers

November 11, 2021 By Nagesh Belludi Leave a Comment

Look at the filler articles in the well-being section of your preferred newspaper, and you’ll often stumble upon health advice built on nuance-free mentions of all sorts of scientific studies.

One week, drinking coffee is good for you; the next week, it’s harmful. Ditto video games. Swearing makes you look intelligent, but hold your flipping horses … the next day, swearing makes you seem too verbally limited to express your annoyances respectfully.

Gutter-press science reporting isn’t only less than scientific; it actually defeats the objectives of science.

In June 2014, the Proceedings of the National Academy of Sciences published an allegedly peer-reviewed paper titled “Female hurricanes are deadlier than male hurricanes.” The study deduced that hurricanes with feminine names generate more casualties, supposedly because tacit sexism makes communities take feminine-named storms less seriously. The work was discredited as soon as its methods were dissected. Nevertheless, the dubious paper had already made its way into media channels across the country on the credibility conferred by the influential National Academy of Sciences.

Positive results that make a sensational headline tend to get published readily—especially if they speak to the audience’s worldview. In truth, many of these are low-quality studies in which the variables are latent and the effects aren’t directly observable or quantifiable, especially in the social sciences. Sadly, with the push to produce ever more papers in academia, peer review doesn’t corroborate the quality of research nearly so much as it enforces a discipline’s norms.

Idea for Impact: Let’s be skeptical readers. Let’s be better readers.

Let’s subject every claim to the common-sense test: is the claim possible, plausible, probable, and likely? Not everything possible is plausible, not everything plausible is probable, and not everything probable is likely.

Being skeptical does not mean doubting the validity of everything, nor does it mean being cynical. Rather, to be skeptical is to judge the validity of a claim based on objective evidence and to understand the claim’s nuances. Yes, even extraordinary claims can be valid, but the more extraordinary the claim, the more remarkable the evidence it demands.

While we’re on the subject, have you heard about the research finding that you can make unsuspecting people believe almost anything merely by asserting that it’s been “shown by research”? Now then, that’s the only research worth believing. Very much so, yes, even without evidence!

Wondering what to read next?

  1. Question Success More Than Failure
  2. What the Rise of AI Demands: Teaching the Thinking That Thinks About Thinking
  3. The Data Never “Says”
  4. The Upsides of Slowing Down
  5. Of Course Mask Mandates Didn’t ‘Work’—At Least Not for Definitive Proof

Filed Under: Sharpening Your Skills Tagged With: Biases, Critical Thinking, Questioning, Thinking Tools

Don’t Get Stuck in Middle Management

September 21, 2021 By Nagesh Belludi Leave a Comment

A survey by the Association of Asian Americans in Investment Management (reported via The New York Times DealBook column) describes the nature of the discrimination and bias faced by Asian Americans:

Asian Americans and Pacific Islanders are often stereotyped as lacking leadership skills. At investment firms, they “fill middle management ranks, but their percentages plummet in senior management and C-suites.” Respondents said they were often tapped as technical experts and benefited from the perception that they are good workers. But their advancement stalled as they sought more senior roles that emphasize networking and communication skills.

Most professionals fail to realize that the competencies that made them successful in their early corporate roles are not necessarily the attributes that will help them excel in roles higher up the ladder. Those higher-level qualities include forming coalitions, managing relationships and alliances, determining where and when to shift one’s focus, and learning to appreciate different perspectives.

Work out what you need to get to the top and fight the perceptions

  • Evaluate where your development priorities should be. Find out how you can acquire the necessary skills and competencies, and go get them. Become more visible to management, and position yourself for a promotion.
  • Network wisely. Understanding who must be won over to your point of view is vital preparation for a promotion. Spend time cultivating meaningful relationships.
  • Ask for honest feedback—not just from your boss but also from well-respected peers, customers, mentors, and others. Confront problems quickly lest they metastasize.

Idea for Impact: In today’s world, your skills and promotability are your responsibility.

Wondering what to read next?

  1. How to … Be More Confident at Work
  2. Risk More, Risk Earlier
  3. The Career-Altering Question: Generalist or Specialist?
  4. Five Ways … You Could Elevate Good to Great
  5. Before Jumping Ship, Consider This

Filed Under: Career Development, Sharpening Your Skills Tagged With: Biases, Career Planning, Interpersonal, Leadership, Personal Growth, Skills for Success

Many Hard Leadership Lessons in the Boeing 737 MAX Debacle

August 24, 2021 By Nagesh Belludi Leave a Comment

The U.S. House committee’s report on Boeing’s 737 MAX disaster makes interesting reading on contemporary leadership, particularly the pressures of rapid product development.

The rush to market and a culture of contributory negligence and concealment conspired to put a not-yet-airworthy plane into passenger service, resulting in two fatal accidents and a long grounding.

Boeing’s design and development of the 737 MAX was marred by technical design failures, a lack of transparency with both regulators and customers, and efforts to downplay or disregard concerns about the operation of the aircraft.

Of particular importance are the “management failures,” “inherent conflicts of interest,” and “grossly insufficient oversight” at both Boeing and its regulator, the Federal Aviation Administration (FAA). Boeing failed to weigh the design limitations and the cost and schedule pressures against passenger safety. Leadership was fixated on fending off the runaway success of the Airbus A320neo program.

The company relied on too many untested technical assumptions—and it never gave itself the space and time to question them. Boeing’s “culture of concealment” and “unwillingness to share technical details” are the report’s most damning indictments. Employees spoke up but went unheard; indeed, their voices were suppressed.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. Availability Heuristic: Our Preference for the Familiar
  3. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  4. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  5. Lessons from the World’s Worst Aviation Disaster // Book Summary of ‘The Collision on Tenerife’

Filed Under: Leadership Tagged With: Aviation, Biases, Change Management, Decision-Making, Problem Solving, Risk, Thinking Tools

Lessons from David Dao Incident: Watch Out for the Availability Bias!

August 23, 2021 By Nagesh Belludi Leave a Comment

In the weeks and months after United Airlines’ David Dao incident and the ensuing customer service debacle, news of all kinds of disruptive airline incidents, cold-blooded managers, and inconsiderate airline staff showed up everywhere.

The United incident raised everyone’s awareness of airline mishaps. Expectedly, the media started drawing attention to all sorts of airline incidents—fights on airplanes, confusion at airports, seats taken from small children, insects in inflight meals, snakes on the plane—affecting every airline, large and small. In reality, such unpleasant incidents are rare; thousands of flights every day experience nothing of the sort.

Parenthetically, the underlying problem that led to the David Dao incident wasn’t unique to United; it could have happened at any airline. All airlines had similar policies regarding involuntary denied boarding and the prioritization of crew repositioning. Every other airline, I’m sure, felt lucky the incident didn’t happen on its watch.

In the aftermath of the incident, many people vowed to boycott United. Little by little, that negative consumer sentiment faded away while the backlash—and media coverage—over the incident diminished.

Availability bias occurs when we base decisions on whatever comes to mind easily, however incomplete.

The media coverage of the David Dao incident is an archetypal case of the Availability Bias (or Availability Heuristic) at work. Humans are inclined to judge how likely something is by how easily comparable, and recent, examples can be summoned up. Moreover, examples that carry a fierce emotional weight tend to come to mind quickest.

The availability heuristic warps our perception of real risks. If we’re assessing whether something is likely to happen and a similar event has occurred recently, we’re much more liable to expect it.

What we remember is shaped by many things: our beliefs, our emotions, and the intensity and frequency of exposure, particularly in mass media. When rare events occur, as with the David Dao incident, they become salient. If you’re in a car accident involving a Chevy, you’re likely to rate the odds of getting into another accident in a Chevy much higher than base rates would suggest.

If you are aware of the availability bias and begin to look for it, you will be surprised how often it shows up in all kinds of situations. As with many other biases, we can’t remove this natural tendency, but by staying alert to it, we can let our rational minds account for it and make better decisions.

Idea for Impact: Don’t be disproportionately swayed by what you remember. Don’t underestimate or overestimate a risk, and don’t focus on the wrong risks. Don’t overreact to recent events.

Wondering what to read next?

  1. Why Your Judgment Sucks
  2. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  3. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  4. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  5. Availability Heuristic: Our Preference for the Familiar

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Change Management, Critical Thinking, Decision-Making, Psychology, Thought Process

If You’re Looking for Bad Luck, You’ll Soon Find It

August 16, 2021 By Nagesh Belludi Leave a Comment

Consider a woman who complained that her neighborhood dry cleaner ruined her expensive slacks. “Last month, he spoiled my wool blazer. Last Christmas, he … . It always happens,” she grumbled.

This woman knew she was taking chances with this dry cleaner. She allowed it to happen.

Luck is sometimes the result of taking appropriate action. And bad luck is sometimes the result of tempting fate.

Say you’ve been planning for weeks for your next big trip, and you got an incredible deal on the day’s very last flight to your destination. On the day of departure, your late-night flight gets canceled. Sure, you’re a victim of bad luck—but you invited it. Think about it: odds are, you’re more likely to suffer a flight delay or cancellation later in the day because airlines schedule their rosters tightly to maximize aircraft and flight-crew utilization. Delays and disruptions from earlier in the day propagate onward to the late flights.
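A quick simulation makes the point concrete. Here’s a minimal sketch, with made-up numbers rather than real airline data, assuming one aircraft flies several legs a day and any delay carries over to every later leg:

```python
import random

def simulate_day(num_legs=6, fresh_delay_prob=0.10, trials=100_000):
    """Estimate how often each leg of an aircraft's day runs late,
    assuming any delay carries over to all subsequent legs."""
    delayed_counts = [0] * num_legs
    for _ in range(trials):
        running_late = False
        for leg in range(num_legs):
            # A leg runs late if it hits fresh trouble or inherits a delay.
            if running_late or random.random() < fresh_delay_prob:
                delayed_counts[leg] += 1
                running_late = True  # illustrative: the delay is never absorbed
    return [count / trials for count in delayed_counts]

for leg, prob in enumerate(simulate_day(), start=1):
    print(f"Leg {leg}: ~{prob:.0%} chance of running late")
```

With a 10 percent chance of fresh trouble per leg, the first flight runs late about 10 percent of the time and the sixth about 47 percent, purely from carryover. Real schedules build in buffers that absorb some of this, but the direction of the effect stands: book the last flight of the day, and you stack the odds against yourself.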

Often, bad luck has nothing to do with luck. “The fault,” as Shakespeare wrote, “is not in our stars, but in ourselves.”

Sometimes you can be your own worst enemy. Don’t sabotage yourself by tempting fate.

Idea for Impact: Bad choices beget bad luck

You have to be lucky to get lucky. You have no control over many outcomes in life, but you can always increase the odds of getting lucky by taking appropriate action. More importantly, you can minimize bad luck by not courting it.

Remember, a good mathematics student never buys a lottery ticket, and if he does, he never grumbles about not winning the jackpot!
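The arithmetic behind that quip is plain expected value. Here’s a minimal sketch with hypothetical prize numbers (a $2 ticket, a 1-in-300-million jackpot; the figures are invented for illustration, though the conclusion holds for real lotteries):

```python
def expected_value(prizes):
    """Expected winnings per ticket: the sum of payout times probability."""
    return sum(payout * prob for payout, prob in prizes)

# Hypothetical prize table: (payout in dollars, probability of winning it)
prizes = [
    (100_000_000, 1 / 300_000_000),  # jackpot
    (1_000_000, 1 / 12_000_000),     # second prize
    (4, 1 / 40),                     # small consolation prize
]

ticket_price = 2.00
print(f"Expected winnings: ${expected_value(prizes):.2f} "
      f"on a ${ticket_price:.2f} ticket")
```

On these numbers, each $2.00 ticket returns about $0.52 on average: a sure loss over time, which is precisely why the mathematics student shouldn’t grumble.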

Wondering what to read next?

  1. Be Smart by Not Being Stupid
  2. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  3. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  4. More Data Isn’t Always Better
  5. Protect the Downside with Pre-mortems

Filed Under: Mental Models Tagged With: Biases, Critical Thinking, Decision-Making, Luck, Risk, Wisdom

Why Your Judgment Sucks

April 5, 2021 By Nagesh Belludi Leave a Comment

Israeli-American psychologist Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011) describes the finer points of decision-making. It’s an engaging showcase of the mind’s innate biases and our unthinking approaches to judgment.

Human Beings are Intuitive Thinkers

Kahneman is a behavioral economics pioneer and the winner of the 2002 Nobel Memorial Prize in Economic Sciences. His lifelong collaboration with Amos Tversky (1937–96) molded our understanding of human error, risk, judgment, decision-making, happiness, and more. Tversky died in 1996, so he did not share in the Nobel.

Thinking, Fast and Slow explores what Kahneman calls the “mind’s machinery” as two coexisting modes of thought (“fast and slow,” as the title says). Kahneman splits cognition into two radically divergent modes, employing a two-tier model of the mind.

  • System One makes judgments instantly, intuitively, and automatically, as when a cricket batsman decides whether to cut or pull. A significant part of System One is the “evolved heuristics” that let us, for example, read a person’s expression in a microsecond from a block away. It can’t be switched off. System One’s thinking is fast and effortless, but it often jumps to wrong conclusions, relies on hunches and biases, and is prone to overconfidence.
  • System Two is slower, conscious, calculated, and deliberate, like long division. Its operations require attention. System Two is what we think of as “thinking”—slow, tiring, and essential. It’s what makes us human. Even if System Two believes it is on top of things, System One makes many of our decisions.

System One Isn’t All Flawed

In a world that often necessitates swift judgment and rapid decision-making (e.g., fight or flight), a person who relied solely on deliberative thinking (System Two) wouldn’t last long. Doctors and firefighters, for example, develop through training and repetition what’s called “expert intuition,” which helps them identify patterns and instinctively devise the right response to a complex emergency.

We humans are not simply rational agents; our thinking boils down to these two systems of processing. As we strive to make better decisions in our work and personal lives, it benefits us to slow down and engage the more deliberate System Two. Learn to doubt your fast way of thinking!

Human Intuition is Imperfect

Thinking, Fast and Slow is an eye-opener in various ways. It can be a frightening catalog of the biases, shortcuts, and cognitive illusions that corrupt our judgment—the endowment effect, priming, the halo effect, the anchoring effect, the conjunction fallacy, the narrative fallacy, and the rest. Such mental processes are not intrinsically flawed; they are heuristics—rules of thumb, stereotypes, shortcuts. They are strategies the mind embraces to find a path through a tsunami of data.

Kahneman teaches how to recognize situations that require slower, deliberative thinking. Kahneman asserts that the value of the book is to give people the vocabulary to spot biases and to criticize the decisions of others: “Ultimately, a richer language is essential to the skill of constructive criticism.”

Recommendation: Read Daniel Kahneman’s Thinking, Fast and Slow (2011). One of the most popular non-fiction books of the last decade, it’ll open your eyes to the quirky and error-prone ways in which you can be influenced without ever suspecting it.

The conceptions behind behavioral economics can make Thinking, Fast and Slow a laborious read, and many chapters are bogged down by hair-splitting details of Kahneman’s rigorous scientific work and by academic jargon. Still, it’s a commanding survey of the field, superbly written and, for the most part, intelligible to non-experts.

Complement with Rolf Dobelli’s accessible The Art of Thinking Clearly (2013).

Wondering what to read next?

  1. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  2. The Data Never “Says”
  3. Question the Now, Imagine the Next
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

April 1, 2021 By Nagesh Belludi Leave a Comment

Psychologists have argued that many cognitive biases are rooted in mental shortcuts. These are heuristics—rules of thumb, stereotypes, instincts—that help you make sense of the world. They aren’t intrinsically flawed, but they’re often quirky and error-prone. Your mental models can affect you in ways you don’t suspect.

David McRaney’s You Are Not So Smart (2011) offers a brief—if hurried—tour of 48 cognitive biases that can deceive you. Based on the author’s popular blog, the book is a satisfying aid to understanding people’s behavior—and your own—a little better.

There is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. … From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company.

Each chapter starts with a brief statement of a misconception, followed by the fact. A synopsis of a related behavioral study then shows how our brains produce the deception, and what the truth is. Some of the lesser-known preconceptions discussed:

  • Confabulation. You tend to create unreliable narratives to explain away your choices post hoc. These reassuring perceptions can make you think you’re more rational than you actually are.
  • Groupthink. People tend to fall in with the rest of the group to minimize conflict and foster group cohesiveness and social acceptance. No one wants to be the one person with a dissenting opinion.
  • Social Loafing. The expectation that others on a team will pick up your slack may induce you to put in less effort if you think you’ll get away with it. This can curb your performance even if you’re a conscientious, hardworking type. If you don’t feel your participation will be noticed, why bother putting in the effort?
  • Availability Heuristic. You’re likely to estimate the likelihood of an event based on your ability to recall immediate and easily accessed examples.
  • Fundamental Attribution Error. You tend to assign external reasons for your own behavior but internal motives to other people. For instance, if you’re late for a meeting, you’ll blame it on public transport. If someone else is running late for a meeting with you, you’ll blame it on her poor time-keeping.

Recommendation: Read David McRaney’s You Are Not So Smart. It’s an engaging, easy-to-read primer on how the mind works. Read it as a lead-up to Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011; summary forthcoming).

Idea for Impact: Once you learn to spot the cognitive biases we all grapple with, they’re easier to overcome.

Wondering what to read next?

  1. Why Your Judgment Sucks
  2. The Data Never “Says”
  3. Question the Now, Imagine the Next
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

Five Ways … You Could Avoid Being Wrong

March 20, 2021 By Nagesh Belludi Leave a Comment

  • Beware exaggeration of a kernel of truth. For instance, many of us indeed don’t realize our full intellectual potential, but that doesn’t give credence to the notion that most people use only 10% of their brainpower. Likewise, beware of overstatements of small differences. Sure, men and women tend to differ somewhat in their communication styles, but declaring that “men are from Mars” and “women are from Venus” takes a kernel of reality to an extreme, not to mention forcing psychology into stereotypes.
  • Don’t infer causation from correlation. Don’t be tempted to conclude that if two things co-occur statistically, they must be causally related. (Rabbi Harold Kushner once asserted that circumcision seems to increase men’s chances of winning a Nobel Prize.) Seek contending explanations; a lurking third variable often drives both, as the simulation sketch after this list illustrates.
  • Beware biased sampling and extrapolation. Inferences from a biased sample are not as trustworthy as conclusions from a truly random sample—e.g., don’t ask people coming out of Sunday Mass if they have a personal relationship with Jesus Christ and infer that Americans are turning to God. Don’t ascribe to the whole any attribute of the part.
  • Don’t let stress impair your problem-solving capabilities. As many airline disasters confirm (example, example, example), speed can narrow your cognitive map—small errors can quickly become linked up and amplified into disastrous outcomes. When you feel rushed, you’re likely to miss details. You’re not present enough in the moment to notice what’s important and make the most beneficial choices.
  • Beware argumentum ad nauseam. Don’t confuse a statement’s familiarity (such as urban legends) with its accuracy. The fact that you’ve heard a claim repeated over and over again (think of President Trump’s allegations of widespread voter fraud), sometimes with rearranged phrasing and substitute terms, doesn’t make it correct.

Bonus: Be suspicious of any claim that doesn’t come with counterarguments or disconfirming evidence.
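To make the correlation-versus-causation bullet concrete, here’s a minimal simulation with invented variables, not a real dataset: ice-cream sales and drowning incidents both track temperature, so they correlate strongly with each other even though neither causes the other.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

random.seed(42)
temps = [random.uniform(0, 35) for _ in range(365)]        # the lurking variable
ice_cream = [2.0 * t + random.gauss(0, 5) for t in temps]  # driven by temperature
drownings = [0.3 * t + random.gauss(0, 2) for t in temps]  # also driven by temperature

# Strongly correlated (r comes out around 0.8), yet neither causes the other.
print(f"ice cream vs. drownings: r = {pearson(ice_cream, drownings):.2f}")
```

Control for temperature and the apparent relationship evaporates; that’s the contending explanation to seek before crying causation.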

Wondering what to read next?

  1. Be Smart by Not Being Stupid
  2. How to … Escape the Overthinking Trap
  3. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  4. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  5. If You’re Looking for Bad Luck, You’ll Soon Find It

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Risk

The Data Never “Says”

March 1, 2021 By Nagesh Belludi Leave a Comment

Data doesn’t say anything. Indeed, data can’t say anything for itself about an issue any more than a saw can form furniture, or a saucepan can simmer a stew.

Data is inert and inanimate. Data doesn’t know why it was created. Data doesn’t have a mind of its own, and, therefore, it can’t infer anything.

Data is a necessary ingredient in judgment. It’s people who select and interpret data. People can turn it into insight or torture it to bring their agenda to bear. Data is therefore only as useful as its quality and the skills of the people wielding it.

Far more than we admit, subjectivity and intuition play a significant role in how we collect, choose, process, explain, interpret, and apply data. As entrepreneur Margaret Heffernan warns in Willful Blindness: Why We Ignore the Obvious at Our Peril (2012), “We mostly admit the information that makes us feel great about ourselves, while conveniently filtering whatever unsettles our fragile egos and most vital beliefs.”

In the hands of careless users, data can end up having the opposite of the effect its creators intended. All data is good or bad depending on how it’s employed in a compelling story and what end it’s serving—neither of which the data itself can control.

  • Don’t let data drive your conclusions. Let data inform your conclusions.
  • Don’t declare, “the data says” (as in, “the stock market thinks”). Data by itself cannot have a particular interpretation.
  • When you find data that seems to support the case you wish to make, don’t swoop on it without caution and suspicion. Data can be very deceptive when used carelessly.
  • Be familiar with the limitations of your data. Investigate if your data informs any other equally valid hypothesis that could propose an alternative conclusion.

Idea for Impact: Beware of the risk of invoking data in ways that end up undermining your message.

Wondering what to read next?

  1. What if Something Can’t Be Measured
  2. In Praise of Inner Voices: A Powerful Tool for Smarter Decisions
  3. Be Smart by Not Being Stupid
  4. Of Course Mask Mandates Didn’t ‘Work’—At Least Not for Definitive Proof
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Conversations, Conviction, Critical Thinking, Decision-Making, Persuasion, Problem Solving, Thinking Tools, Thought Process
